Sunday, April 22, 2012

Ocean Warming Put Into Perspective

Here's a nice little calculation from the Levitus 2012 paper I mentioned earlier:
"We have estimated an increase of 24 x 1022 J representing a volume mean warming of 0.09°C of the 0-2000 m layer of the World Ocean. If this heat were instantly transferred to the lower 10 km of the global atmosphere it would result in a volume mean warming of this atmospheric layer by approximately 36°C (65°F)."
(This is because the heat capacity [= mass x specific heat = dQ/dT, where Q is heat and T is temperature] of the ocean is about 1,000 times that of the atmosphere.) Lest you worry, they add:
"This transfer of course will not happen; earth’s climate system simply does not work like this. But this computation does provide a perspective on the amount of heating that the earth system has undergone since 1955."

5 comments:

Piltdown said...

What sort of thermometers did they have in 1955 that they could measure the annual average temperature of a 2 km-thick slab of non-uniformly heated, turbulently mixed water to an accuracy of better than 0.09°C?

How did they determine that the measurement errors were perfectly uncorrelated?

Just curious.

Dano said...

ZOMG!!!!

The intrepid piltdown has uncovered the globul scam that is climatology!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!11one!!!!

Best,

D

Piltdown said...

Thanks, Dano. That's very informative.

Here's another question for you, since the last one was so easy. If you take the heat content difference of the oceans between summer and winter, and transfer all the heat to the atmosphere instantaneously, what temperature rise would result?

riverat said...

Piltdown,

When you average a large number of measurements it's reasonable to express the result to a higher precision than the original measurements. There is a statistical theory behind this, but the best example I know is from baseball: a batter either gets a hit or doesn't, a 1 or a 0, yet batting averages are commonly expressed to three decimal places.

So even if the thermometers in 1955 only measured accurately to 1 degree, they were good enough.
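(A quick Python simulation of riverat's point, with invented numbers of my own rather than anything from the ocean data: readings rounded to the nearest whole degree still pin down the mean far more precisely once you average enough of them, the error of the mean shrinking roughly as one over the square root of the number of readings.)

import numpy as np

rng = np.random.default_rng(42)
true_temp = 15.37   # hypothetical true temperature, deg C
sigma = 1.0         # spread of the individual measurement errors, deg C

for n in (10, 1_000, 100_000):
    # n independent readings, each rounded to the nearest whole degree
    readings = np.round(true_temp + rng.normal(0.0, sigma, size=n))
    error = readings.mean() - true_temp
    print(f"n = {n:>7,}  mean = {readings.mean():.3f}  error = {error:+.3f}")
# With independent errors, the error of the mean falls roughly as 1/sqrt(n).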

Piltdown said...

riverat,

Thanks. But the statistical theory you mentioned reduces the error of an average of statistically independent measurements only by a factor of the square root of the number of samples. To get a thousandfold reduction in error, you need a million measurements.

There is an amusing example along these lines: supposedly, if you measure the length of an object to the nearest millimetre often enough, you ought to be able to resolve individual atoms. Since atoms are about 10^-7 mm across, you would need roughly 10^14 measurements. What do you think? Is it possible in principle?
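(A quick Python sketch of the caveat running through this exchange, with invented numbers of my own, not the commenters': averaging drives down the independent, random part of the error as 1/sqrt(n), but a shared systematic bias, i.e. a correlated error, survives no matter how many readings you take.)

import numpy as np

rng = np.random.default_rng(0)
true_value = 20.0    # hypothetical true temperature, deg C
random_sigma = 1.0   # independent per-reading noise, deg C
shared_bias = 0.3    # hypothetical systematic (correlated) error, deg C

for n in (100, 10_000, 1_000_000):
    readings = true_value + shared_bias + rng.normal(0.0, random_sigma, size=n)
    error = readings.mean() - true_value
    print(f"n = {n:>9,}  error of the mean = {error:+.3f} C")
# The random part shrinks like 1/sqrt(n); the 0.3 C bias never averages away.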