I don't want to start a war, but I don't understand why people always say this. Maybe I'm wrong, but applying the ideal gas law, it seems like it should take about 18 degrees F (or 10 degrees C) of temperature change to change the pressure by 1 psi. Here's my reasoning.
Say we start at 50 degrees F and a tire pressure of 30 psi. 50 degrees F is 10 degrees C, or 283.15 K. Applying the ideal gas law (PV = nRT), and pretending the total volume of air inside the tire doesn't change with a small change in temperature, we can simplify the formula to P = T * some constant. At our selected temp and pressure, this constant is 30/283.15. Now reduce the temperature by 10 degrees C (which is 18 degrees Fahrenheit). This gives P = 273.15 * 30 / 283.15, which is approximately 28.9, or very close to a 1 psi drop in pressure.
So as a rule of thumb, it seems to me it's much more accurate to say that pressure changes by 1 psi for every 10 degrees C change in temperature. If this analysis is wrong, please point out where I messed up. I've been using this rule myself for a while...