“Deflate-gate” and the Ideal Gas Law – Nothing to See Here?

I was being glib with this tweet, but as a still-recovering Chemistry major, I’m convinced that the temperature of the air inside the ball could have been a major factor in the “Deflate-gate” controversy. If you know me, you know that I’m by no means a Patriots fan; I just think that the media’s feeding frenzy, spurred on by social media, might be much ado about nothing.

Consider the following scenario, which makes several key assumptions:

The Patriots’ equipment managers inflated the balls with air that was at 80 degrees Fahrenheit, somewhere within the team’s facilities. Since the current conventional wisdom is that less pressure in the ball yields better offensive outcomes, the equipment managers are incentivized to keep the pressure as low as possible, and current rules set that minimum at 12.5 psi.

It isn’t crazy to think that the facility would have been kept warm on Sunday. It was raining and cold all day, and many of the people working at the facility were likely moving between the indoors and the field all day. Keeping the indoor space warm would have given the Patriots’ staff some relief from those miserable conditions.

The balls were either inflated with this warm, indoor air immediately before being checked by league officials, or they were inflated and stored inside in the warm environment before being checked. The fundamental assumption of this exercise is that at the time of the check, the air inside the balls was at 80 degrees Fahrenheit and at 12.5 psi.

Prior to the game, the balls were brought outside to be staged for use during the game. The outside temperature was in the mid-30s all day and rose to around 40 degrees Fahrenheit by game time. Assume that by game time the balls’ internal temperature had dropped to 40 degrees Fahrenheit (perhaps aided by more efficient heat transfer and/or evaporative cooling from the rain).

What does that mean for the pressure inside of the ball?

Bring on the Ideal Gas Law. The Ideal Gas Law strictly applies to a theoretical gas, not to ambient air, but for these purposes the difference isn’t enough to matter. Holding the volume and quantity of air constant, the law reduces to a direct proportionality between pressure and absolute temperature: as temperature decreases, pressure decreases with it.

Represented as a formula:

State 1 Pressure / State 1 Temperature = State 2 Pressure / State 2 Temperature

In order for the formula to work, the temperatures need to be measured on a ratio scale (one with a true zero point), which the Celsius and Fahrenheit scales are not. We need to convert the temperatures to the Kelvin scale, which starts at absolute zero: 80 degrees Fahrenheit is roughly 300 K, and 40 degrees Fahrenheit is roughly 278 K. Here is our calculation:

12.5 psi / 300 K = "Cold Ball Pressure" / 278 K
"Cold Ball Pressure" = 12.5 * (278/300) psi = 11.6 psi
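For anyone who wants to check the arithmetic, here’s a quick sketch of the calculation in Python. The function name, variable names, and the rounding to the 300 K and 278 K figures used above are my own choices:

```python
def fahrenheit_to_kelvin(temp_f):
    """Convert Fahrenheit to Kelvin: Fahrenheit -> Celsius, then add 273.15."""
    return (temp_f - 32) * 5 / 9 + 273.15

warm_k = fahrenheit_to_kelvin(80)  # ~299.8 K, rounded to 300 K above
cold_k = fahrenheit_to_kelvin(40)  # ~277.6 K, rounded to 278 K above

# Constant volume and quantity of air: P1 / T1 = P2 / T2,
# so P2 = P1 * (T2 / T1)
warm_pressure_psi = 12.5  # the legal minimum, measured in the warm room
cold_pressure_psi = warm_pressure_psi * (cold_k / warm_k)

print(round(cold_pressure_psi, 1))  # prints 11.6
```

Using the exact Kelvin conversions instead of the rounded 300 K and 278 K barely moves the answer; it still comes out to about 11.6 psi.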

…so a 40 degree Fahrenheit drop nets out to about a 7% drop in pressure, or roughly 0.9 psi. I don’t know whether that’s enough to produce a noticeable performance difference, but it sure seems like a plausible explanation for the observed effects.