I had heard that California set a temperature record for 2014, but I didn't realize by how much: 1.7°F above the earlier record of 1934:
Data from NOAA.
Tuesday, January 20, 2015
Visualizing 2014's Record Warmth and Its Heat Gain
Perhaps here's a way of capturing just how slight the margin of 2014's record warmth was.
<grain of salt>
<turn envelope over>
Assume the surface temperature stations are measuring the first 2 meters of the atmosphere (* but see below). The density of air at the surface is 1.2 kg/m³. So the total mass of the 2-meter sliver is ~10¹⁵ kg.
It has a heat capacity of about 1000 J/(kg·K), so it takes ~10¹⁶ Joules to create a 0.02°C temperature rise in this 2-meter sliver of the atmosphere, which is ~50 J/m² over the Earth's surface.
50 Joules isn't much. It's ~10⁻⁵ kilowatt-hours -- about how much electricity the average American uses (12,900 kWh/yr) in 30 milliseconds.
If that 0.02°C of warming happens over 4 years, and if I've not made any mistakes, the average heating rate is 0.4 microwatts per square meter. By contrast, the average surface heat flux from the Earth's interior heat is 86 milliwatts per square meter -- 200,000 times higher.
(But the average heat flux from the Earth's interior is essentially constant, so it doesn't create global warming or global cooling.)
(*) OK, in reality the sensors that measure temperature are very small -- this one (page 18) is about 5 mm x 4 mm. So it's only measuring a 0.5 cm-thick slice of the atmosphere. That would require only 0.1 J/m², or, over four years, about 1 nanowatt per square meter.
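Here's the envelope arithmetic in a few lines of Python, for anyone who wants to poke at it -- a sketch using round values (Earth's radius, air density, heat capacity), not anything measured:

```python
import math

# Back-of-envelope version of the numbers above; all constants are round values.
R_EARTH = 6.371e6                         # Earth's radius, m
AREA = 4 * math.pi * R_EARTH**2           # Earth's surface area, ~5.1e14 m^2
RHO_AIR = 1.2                             # surface air density, kg/m^3
CP_AIR = 1000.0                           # specific heat of air, J/(kg K)
DT = 0.02                                 # the record's margin, deg C

mass = RHO_AIR * 2 * AREA                 # mass of the 2-m sliver, ~1.2e15 kg
energy = mass * CP_AIR * DT               # ~2e16 J for a 0.02 C rise
per_m2 = energy / AREA                    # ~50 J/m^2

FOUR_YEARS = 4 * 365.25 * 86400           # seconds in four years
print(per_m2)                             # ~48 J/m^2
print(per_m2 / FOUR_YEARS)                # ~4e-7 W/m^2 = ~0.4 microwatts/m^2
print(per_m2 * (0.005 / 2) / FOUR_YEARS)  # 0.5-cm slice: ~1e-9 W/m^2 = ~1 nW/m^2
```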
So crowing over these types of records -- 0.02°C -- just opens up a world of "yeah, but"s. It would be more scientific to say, as Roger Pielke Sr tweeted today (and has emphasized for years):
Re:global warming, changes in ocean heat should be used, NOT global average sfc temp trend http://t.co/3GBVC6gsny pic.twitter.com/0qDbtwMwab
— Roger A. Pielke Sr (@RogerAPielkeSr) January 20, 2015
As he wrote in Physics Today in 2008:
How much heat can a 2-dimensional surface hold? Zero. And a 2-dimensional surface with a 5 mm-thick layer of air on top of it can't hold much more.
Of course, all NOAA and NASA (and all other) climate scientists know all of this. But try explaining this to the public -- or even to a teleconference full of journalists. And good luck explaining that you really haven't measured the temperature change of the Earth's surface, but the temperature change of your model of the Earth's surface. The public would never get it, and the usual suspects would certainly take it as an opening for all kinds of purposeful confusion and dissembling denialism.
Last word? It's the trend, stupid.
</turn envelope over>
</grain of salt>
Sunday, January 18, 2015
2014's Warmth Record All Due to Sun*
(*) see below
2014's record warmth was all due to the Sun*. The Sun's irradiance has increased since 2010, when the last surface temperature record was set.
LASP in Colorado (Laboratory for Atmospheric and Space Physics) gives a daily measurement of Total Solar Irradiance (TSI) -- the energy, per unit area, that the Sun emits. They give it at both the position of the Earth, and at 1 AU (astronomical unit -- the Earth's average distance from the Sun).
If you want to look for changes in TSI, it's best to use the 1 AU data, to avoid any complications with the Earth's orbital factors (though between 2010 and 2014 those shouldn't matter).
Downloading the data, I find the average TSI at 1 AU has been
where TOA = Top Of the Atmosphere. Annual TSI has increased by 0.57 W/m2 from 2010 to 2014. That's a fairly healthy increase -- 2014 was the top of the current solar cycle....
even in historical terms:
(Yes, some of the data are missing for each year, especially 2014. I could interpolate over the missing days, or some such blogger-like thing, but I'd be surprised if it mattered much. LASP's annual number will eventually appear here.)
So what's the effect of this increase in TSI over the last four years?
You can estimate the impact of TSI changes on the surface temperature; it's
∂T/∂L = T/(4L) ≈ 0.05°C/(W/m²)
where L is the Sun's TSI (luminosity). This is just a differentiation of the basic equation for equilibrium: energy coming in from the Sun = energy radiated out by the Earth:
(1 − A)L/4 = σT⁴ (A the albedo, σ the Stefan-Boltzmann constant)
where I'm stealing the equations from Nir Shaviv, who has a nice little discussion about all this. (BUT, be careful of his notation versus mine -- he calculates temperature change as a function of solar irradiance at the surface, not the top of the atmosphere.) Plugging in the number for the change in TSI, we get
ΔT = 0.03°C
from 2010 to 2014. Which more than explains the 0.02°C increase seen in the GISS data (though not quite the 0.04°C in the NOAA data).
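For the record, here's that estimate in Python -- a sketch, with T ≈ 288 K (global mean surface temperature) and L ≈ 1361 W/m² (TSI at 1 AU) as round values I'm supplying, not numbers from LASP's files:

```python
# Sensitivity of surface temperature to TSI; T and L are round assumed values.
T = 288.0               # global mean surface temperature, K
L = 1361.0              # total solar irradiance at 1 AU, W/m^2

dT_dL = T / (4 * L)     # from differentiating the equilibrium relation L ∝ T^4
delta_L = 0.57          # W/m^2, the 2010-to-2014 TSI increase found above

print(dT_dL)            # ~0.053 C per W/m^2
print(dT_dL * delta_L)  # ~0.03 C
```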
--
(*) Now for the asterisks.
OK, I'm being somewhat facetious, and the above should be taken with a grain of salt -- it's just a quick estimate -- but the point is, the observed increase since 2010 isn't obviously from greenhouse gases. And the point of that is the simplistic notion, which by now is probably appearing in kindergarten newsletters to parents and in fortune cookies baked just for the occasion, that 2014 was definitely warmer than 2010 definitely due to greenhouse gases definitely due to man and his definite burning of fossil fuels.
It's not a particular year that matters, it's the long-term trend. But that doesn't make for a great headline (unfortunately).
I mean, don't get me wrong -- I certainly think the long-term trend is troubling, and due to GHGs, though I don't know if this will all end up being catastrophic (whatever that means). I think super-hyping a particular year because it's a couple hundredths of a degree warmer misses the whole point and, as I wrote earlier, opens the door for the same tactics from the Joules/second of the world.
Obviously everyone -- the media, the public information community, some scientists -- is hawking this because it's dramatic and offers at least a clear moment to draw a great deal of attention to the issue of AGW, which long-term trends don't (unfortunately). It's a chance to stick it back to the deniers and idiotic Congresspeople; as Eli Rabett commented a while back
Patience is a virtue. The whole thing is weird. El Nino, melting of the Arctic, etc, are not things to wish for, but the horse shit is so deep you almost want these things to happen to shut the fools up and get people to pay attention. Sucks to be human. . . . Wait.
Much the same here -- it's tempting to make a big deal out of 0.02°C, because of all the horseshit, and because too few of the public (American at least) understand enough to make sense of anything else.
So I guess all I'm saying is that by now I think all the "2014 was the warmest year in recorded history" hype is very misplaced, and it's sad that mainstream journalists either think they have to write such things as the only way to inform the public, or that that's all the journalists themselves can understand, or because they themselves are activists. To not at least mention the relative probabilities for the hottest year, given in the NOAA/NASA press conference, is inexcusable.
I'll let Gavin Schmidt, now NASA GISS's director, have the last word:
@theresphysics indeed. I mean who expects journalists to be more than stenographers? /sarc @ProfMarkMaslin @ret_ward @DavidRoseUK
— Gavin Schmidt (@ClimateOfGavin) January 18, 2015
Friday, January 16, 2015
2014 -- the Longer Perspective
Here's an interesting comment from Judith Curry's blog:
In fact, the 1-year period ending in 2014 is the warmest ever, and the 5-year period 2010-2014, and the most recent 10 years, and 20 years, and 30 years and 40 years, and 50 years. (GISS data) Then I stopped calculating.
Probability 2014 is Warmest Year = 66%
NASA and NOAA just announced their December temperature anomalies. For NASA GISS, the 2014 anomaly was +0.68°C, which beats the old record of +0.66°C, set in 2010.
These numbers have uncertainties -- GISS tells me they're estimated to be ±0.05°C (which I think means 2-sigma). Then doing the same calculation I did for the Hadley Central England Temperature, the probability 2014 is warmer than 2010 is 66%.
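(That 66% is the same overlapping-Gaussians shortcut I used for HadCET -- see the post below. In Python, reading ±0.05°C as 2-sigma:)

```python
from math import erf, sqrt

# Midpoint method from the HadCET post below: integrate the 2014 Gaussian
# past the point where it crosses the 2010 Gaussian -- their midpoint.
sigma = 0.05 / 2                     # +-0.05 C read as 2-sigma, so sigma = 0.025 C
z = (0.68 - 0.66) / (2 * sigma)      # 2014's mean to the midpoint, in sigmas
print(0.5 * (1 + erf(z / sqrt(2))))  # ~0.66
```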
Update 8:17 am - Here's a chart from the NOAA/NASA press conference, with their probabilities, even lower than my estimate (those are "~"s in the chart, not negative signs):
A time series from NOAA:
Thursday, January 15, 2015
Elk in the Columbia
A recently taken picture of a herd of elk swimming across the Columbia River near its mouth at Astoria, Oregon. Via the Oregonian.
A Convenient Little Number
Here is a nice rule of thumb: the carbon-climate response function (CCR) is, from Matthews et al (2009), 1.5°C of average global surface warming per TtC (TtC = trillion tons of carbon emitted; t = "ton" = metric ton = 1,000 kg).
That is, each unit of carbon emitted causes (approximately) the same amount of warming, regardless of the carbon concentration of the atmosphere or how fast it's changing, and the cumulative carbon emitted determines the total temperature rise, regardless of exactly when it occurs (that is, assuming it occurs on century time scales and not tens of millennia, over which carbon is removed by natural processes).
So CDIAC says the total carbon emitted as of 2005, from burning fossil fuels, cement production and gas flaring, is 321 GtC, and cumulative carbon emissions from land use changes are 156 GtC through 2005, for a total of 477 GtC (= 0.477 TtC). So that would imply a warming of 0.71°C, which is pretty close.
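In Python, just to make the arithmetic explicit (using the CDIAC figures quoted above):

```python
# Warming implied by the CCR rule of thumb and CDIAC's cumulative emissions.
CCR = 1.5               # deg C per TtC, Matthews et al (2009)
fossil_fuels = 0.321    # TtC as of 2005: fossil fuels, cement, gas flaring
land_use = 0.156        # TtC as of 2005: land use changes

print(CCR * (fossil_fuels + land_use))  # ~0.7 C -- close to observed warming
```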
So...if you're an American who only understands English units -- and believe it or not, some people understand even less than that, like the QVC hosts who think the moon is a planet, or perhaps even a sun, maybe with people on it -- and if, like most nonscientists, you think in terms of CO2 and not carbon...then this rule of thumb has a nice and tidy translation:
1.5°C/TtC = 2/3 °F/Tst CO2
where "st" is a short ton (=2000 lbs).
And since the range for CCR is 1.0–2.1 °C per Tt C (5th to 95th percentiles), which I'll approximate as 1.0-2.0°C, the uncertainty of the CCR is
(1.5 ± 0.5)°C/TtC = (1.5 ± 33%)°C/TtC = (2/3 ± 2/9)°F/Tst CO2
That's about as tidy as it gets, even in a system of units opposed to tidiness and clarity.
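Here's the conversion spelled out in Python -- the physical constants are standard, the variable names are mine:

```python
# Convert the CCR from deg C per trillion metric tons of carbon
# to deg F per trillion short tons of CO2.
C_TO_F = 9 / 5                # deg F per deg C (for temperature differences)
CO2_PER_C = 44.0 / 12.0       # mass of CO2 per mass of carbon
KG_PER_TONNE = 1000.0
KG_PER_SHORT_TON = 907.18474  # 2000 lb

ccr = 1.5                     # deg C per TtC
st_co2_per_tonne_c = CO2_PER_C * KG_PER_TONNE / KG_PER_SHORT_TON  # ~4.04
print(ccr * C_TO_F / st_co2_per_tonne_c)  # ~0.667, i.e. 2/3 deg F/Tst CO2
```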
from Matthews et al, Nature (2009)
Wednesday, January 14, 2015
Throwing Stones At Every Dog
“You will never reach your destination if you stop and throw stones at every dog that barks.”
-- Winston Churchill
Tuesday, January 13, 2015
Oregon: 2nd Warmest Year
In Oregon, 2014 was the second-warmest year since 1895, losing only to that perennial champion, 1934.
(Data from NOAA.)
The trend since 1895 is +0.11°C/decade, and +0.12°C/decade over the last 30 years. But the most recent 10 years were 0.16°C cooler than the 10 years before.
(Yes, the plot is in Fahrenheit, but the trends are in Celsius. I've converted to the units the U.S. ought to be using. Frankly, our refusal to convert to metric is well past embarrassment and into...just plain dumb.... Can you imagine the turd Republicans would lay if Obama were to propose we switch to the metric system?)
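(For completeness, the conversion: temperature trends are differences, so they scale by 5/9 with no 32° offset:)

```python
# A trend in deg F per decade converts to deg C per decade by 5/9 alone --
# the 32-degree offset applies to absolute temperatures, not differences.
trend_f = 0.20          # deg F/decade, the rough Fahrenheit twin of the trend above
print(trend_f * 5 / 9)  # ~0.11 deg C/decade
```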
I don't know if the high temperature here means much. Oregon is composed of two very distinct regions -- the area west of the Cascade Mountains, where it rains a lot in the winter and where the blue people live, and east of the Cascades, which is rural, agricultural -- a desert much like New Mexico or parts of Australia. The red people live there. They are of a much different stripe (or we are), and are forever losing statewide elections to the blue smelly hippie communists from Portland. (About 1/3rd of the state's residents live in the Portland metro area.) If I were them, I would seriously consider breaking off and forming a separate state. (A few people are, but the movement isn't getting much, if any, support.) The United States is just too damn large; small is beautiful.
California has the same issues (climatologically and politically), and Washington state too -- too much separation, not enough coherence. Red and blue do not always make purple.
Anyway...last year's precipitation in Oregon was about average -- though again, averaged over two very different regions.
PS: Today I'm not worrying about things like this, or this. Today the state is in mourning.
Banned at WUWT? Add Your Name to the List....
The complaints about free speech at WUWT are ironic or laughable -- maybe both. And the very definition of hypocrisy. I've seen several people write that they've been banned there.... A delusional WUWT commenter thinks it never happens:
Anyone else? Add a comment to join the list.
The last thing someone like Anthony Watts is interested in is an open exchange of scientific ideas -- he'd be out of bu$ine$$ in a matter of days.
Thursday, January 08, 2015
Jevons Paradox, U.S. Version
I spent enough time calculating these graphs for something I'm working on for YCC that you'd better believe I'm going to post them here as well.
U.S. electricity usage per capita; hasn't changed much in 15 years:
Next: Primary energy usage is all energy used -- for electricity, manufacturing, transportation, agriculture, flushing the toilet, feeding your cats, and everything else. For the US as a whole it's now about 3.32 terawatts (trillion watts; annualized) -- that is, we use 3.32e12 Joules of energy every second, averaged over the course of a year (FYI: 1 kilowatt-hour = 3.6 megajoules), down from a peak of 3.39 TW in December 2007. (The graph below is per capita.) These numbers are decreasing only slowly -- about 0.2% per year, on average -- and then (for the most part) only in recessions, as in the first half of the 1980s, in the 2000s just as Bush Jr took office, and after the financial crisis in 2008:
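To put 3.32 TW on a human scale -- dividing by a round 2014-ish US population figure (my assumption, not an EIA number):

```python
# Per-capita primary power implied by the 3.32 TW figure above.
primary_power = 3.32e12    # W, annualized US primary energy use
population = 3.19e8        # assumed round 2014 US population

print(primary_power / population)  # ~10,400 W -- about 10 kW per person
```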
The next graph is primary energy intensity -- how much energy it takes to produce a dollar of GDP (goods and services). I deflated the GDP numbers with the GDP Implicit Price Deflator; basically the same as the Consumer Price Index. "2014-dollars" are as of Nov 2014.
Energy efficiency has given us a factor of 2.8 in energy productivity since 1973 (42 years). And we're spending most of that by consuming more energy (see graph 2, above) -- the essence of Jevons Paradox.
All the data are from US EIA and the FRED database from the Federal Reserve in St Louis. You can look them up yourself.
Republicans Have a Flux Correction Problem
This is ironic -- Congressional Republicans do not, of course, believe in climate change. In part, that means they don't believe in climate models.
The GOP recently voted to also start analyzing legislation by its projected economic effects -- so-called "dynamic scoring." Basically, you project the legislation's impact on the economy over so many years.
Here's the rub -- the models used for dynamic scoring aren't stable. They need the equivalent of "flux corrections" -- factors inserted to keep the model stable, instead of its variables running away to infinity. From Bloomberg Businessweek:
Flux corrections were a significant issue in the early days of climate modeling -- ad hoc, hand-inserted functions to keep the model from heading off to infinity -- think Manabe and Wetherald plus 20-25 years.
But flux corrections are no longer needed in most models: climate models are stable as time goes on. (That's a different issue than parametrizations, which are simplified functional representations of difficult-to-calculate-from-first-principles factors like clouds.)
Models for thee, but not for me. And the Republicans stumble blindly forward....
Tuesday, January 06, 2015
The Problems With Claiming 2014 Will be the "Warmest Year"
Problems with claims that 2014 will be the "warmest year":
- It isn't true -- Cowtan & Way's dataset definitely won't find 2014 to be the warmest year since 1850; it will probably be second, after 2010. 2014 will be lower by probably 0.03 C, if December's anomaly is the same as November's. December would have to be the warmest month ever (i.e. most anomalous), and then some, to beat 2010. That won't happen. And you can't say C&W is a superior dataset -- which it probably is, because it uses satellite data to infill large regions without temperature stations, instead of extrapolating over up to 1200 km -- and then ignore it when other datasets show this year as the warmest.
- Any claim about the "warmest year" must be about probabilities, not absolutes.
- Such claims look like what they are: spin. If you're going to make a big deal out of 2014 being the "warmest year" -- for NASA's GISTEMP, any record will probably only be by 0.01 C or so -- then you're going to have to accept that next year, when the year is 0.01 C or 0.03 C cooler, the word will be "cooling," which will be the spin used by the other side. Worse, you'll have to accept that label for all the years until another new record is set. If the record-setting years are 2006, then 2010, then 2014, ...., then 3 out of 4 years will be "cooling" and only one year "warming." So you lose 75% of the spin battle anyway.
- Claims that global warming "stopped 18 years ago" are stupid anyway, because that result only comes from one dataset (which may have problems) and ignores many others. It looks like what it is: desperate numerical wankery. Even Roy Spencer finds it "too funny." So don't be an equal but opposite wanker.
- Is this about science or not? It's the same with big snowstorms or a hurricane -- Joe Romm can't go on TV claiming every big snow event on the East Coast is a sure sign of global climate change, then make the same claim when it's very warm there, or very cold, or a hurricane happens, or one doesn't. Attribution is still very difficult, and usually comes long after the fact. The issue is trends, not snowforts. So stop talking about individual storms, or individual years, like they're trends, when they're not. (Note to self: that means you too, dingus.)
- Anyway, global warming is ocean warming, and the ocean -- at least the top half -- keeps warming year after year.
- If you think you have a better scientific case than the deniers, then act like it, and not like them. You can't be a "science communicator" without the science part.
Sunday, January 04, 2015
Antarctica Without Its Ice Caps
Interesting -- I didn't realize that Antarctica, under its ice, is an archipelago:
Via Amazing Maps on Twitter.
(And no, I'm not suggesting you should buy coastal property there -- though if you do, how about a finder's fee? -- or that the mapmakers of the world need to update their maps anytime soon. It's just interesting, OK?)
Saturday, January 03, 2015
What is the Probability HadCET Set a New Record?
Yesterday I posted about the Hadley Centre's Central England Temperature (CET), which set a record high of 10.93°C for 2014, breaking the old record of 10.87°C in 2006.
But did it really set a record? The difference between these two records, 0.06°C, is small -- especially since the monthly data are only given to the nearest 0.1°C. Could uncertainties in the measurements mean these numbers are essentially indistinguishable?
I think one way to get a handle on this (which certainly is not rigorous) is as follows:
In reality, the two highest values in the dataset, T1 = 10.93°C and T2 = 10.87°C, each carry an uncertainty. If someone else at the same location was doing the same measurements, they would probably get slightly different numbers. If lots of people were doing the same measurements, there'd be a range of numbers for T1 and T2.
Assuming the measurements were all independent, the differences between measurements of T1 would be close to normally distributed, and the same for T2.
Of course, all these measurements weren't measured simultaneously. All we have are the monthly numbers. Since they're cited to the nearest 0.1°C, we might take that as the uncertainty ΔT of each monthly temperature (or at least an upper bound on the uncertainty). Then the uncertainty of the annual average, again assuming normal distributions, is (with 12 measurements in a year) σ = ΔT/√12 = 0.029°C.
(Yes, the uncertainty of the average is less than the uncertainty of any measurement, assuming the measurements are random and fall into a normal distribution.)
The situation is in the figure to the right. Two overlapping bell curves peak at the two different temperatures T1 and T2, with each having the same width σ.
The green line denotes where the two functions intersect.
Then the probability that T1 is greater than T2 is the area under the red curve (T1) from the green line to infinity.
Since they each have the same width, you can calculate where the red and blue curves intersect from their Gaussian functions -- it's just their average, (T1 + T2)/2.
Then you can evaluate the area to the right of the green line using the error function.
When I do this with T1 = 10.93°C, T2 = 10.87°C, and σ = ΔT/√12 = 0.029, I find the probability that HadCET set a new record is
85%.
Which seems reasonably reasonable.
Again, this isn't rigorous, just a back of the envelope calculation someone might do before she went and studied how to do it exactly. Good enough for blog work.
Just as a check: if T1 = T2, this method gives the probability that T1 > T2 as 50%, which is what you expect. If σ were twice as large, the probability reduces to 70% -- and you'd expect it to be smaller. If σ = 1, prob = 51%, again reasonable. As T1 increases, with T2 and σ held the same, the probability becomes closer and closer to one, as expected.
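Here's the whole calculation in a few lines of Python, for anyone who wants to check it (math.erf is in the standard library):

```python
from math import erf, sqrt

def prob_new_record(T1, T2, sigma):
    # Equal-width Gaussians centered at T1 and T2 intersect at their
    # midpoint, (T1 + T2)/2. The probability that T1 > T2 is taken as
    # the area of the T1 curve from that midpoint out to infinity.
    z = (T1 - T2) / (2 * sigma)
    return 0.5 * (1 + erf(z / sqrt(2)))

sigma = 0.1 / sqrt(12)                           # ~0.029 C
print(prob_new_record(10.93, 10.87, sigma))      # ~0.85
print(prob_new_record(10.93, 10.93, sigma))      # 0.50 when T1 = T2
print(prob_new_record(10.93, 10.87, 2 * sigma))  # ~0.70 with sigma doubled
print(prob_new_record(10.93, 10.87, 1.0))        # ~0.51 with sigma = 1
```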
If this is off-the-wall wrong, or worse, not even wrong, let me know in the comments.
Update: as a commenter noted, Gavin Schmidt discussed something much like this on Twitter.
Friday, January 02, 2015
Is Time Running Out Fast Enough?
Nice headline & subhead in The Guardian, on Dana Nuccitelli's Dec 30th article, that get to the crux of the matter:
(The rest of the article is definitely worth reading, too.)
Warmest Year Ever for World's Longest Temperature Record
That's the Hadley Centre's "Central England Temperature (CET)," of course.
It was a fairly clean record high, too, beating the previous #1 (2006) by 0.06°C. (Third is now 2011.) (Reason for the correction from the 0.09°C I first posted: round-off error between my calculation of the annual average, rounded to two decimal places, and HadCET's annual average, truncated to two decimal places. I'll use HadCET's numbers.)
The yearly anomaly was +1.69°C above the record's complete average. The record is 356 years long, though with some issues in the early years.
What's more, the annual average was a record without any individual month setting a record (for its month). The lowest ranked month was April, the 8th warmest April in the record.
Here's a plot of the annual anomalies:
Overall the anomalies kinda balance around 0°C.... Maybe the highs of recent decades are just a rebound from the cold of 1690-1700.... and there is no net warming at all. Maybe. Or maybe not.