Not surprisingly, it finds a warm Roman period, but then the reconstruction shows a dip in temperatures during the Medieval Climate Anomaly (MCA) of 1000-1200 AD. Then mild temperatures, then a sharp dip into the Little Ice Age, and strong warming since 1850 and the Industrial Revolution.
Not too surprising, except for the MCA. How did that come about?
--
Well, in fact it did not come about. Actually, this data consists of nothing but random numbers that I generated, between 0 and 1, one per year, with their 10-year moving average plotted above.
How do random numbers give a realistic looking temperature time series?
It's called the Slutsky Effect, which I just learned about from an article John Fleck sent me a few weeks ago:
"The myth of Europe’s Little Ice Age"
Morgan Kelly, Cormac Ó Gráda, 28 March 2015
Eugen Slutsky (1880-1948) was a Russian mathematician who did important work in economics and in the mathematics of time series, all while trying to keep his head on his shoulders during the Russian Revolution and the killing that followed.
Slutsky showed that it's very easy to construct random time series that appear, once a moving average is calculated, to give results very reminiscent of economic business cycles. Here is a nice overview.
Or, here, a temperature time series.
The random data I generated, between 0 and 1, has an average of 0.490 and a standard deviation of 0.025.
Its trend from start to finish is -0.000003 -- essentially zero. Yet it shows what could easily be interpreted as meaningful intervals of warm and cold.
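The setup described above is easy to reproduce. Here is a minimal sketch (an illustration of the idea, not the exact code or seed used for the plots in this post): one uniform random number per "year," a 10-year moving average, and an ordinary least-squares trend of the raw series.

```python
# Sketch of the random "temperature reconstruction" described above.
# The seed and length are arbitrary choices, not the post's originals.
import random

random.seed(42)
years = 2000
data = [random.uniform(0.0, 1.0) for _ in range(years)]

# 10-year moving average, as in the post
window = 10
smoothed = [sum(data[i:i + window]) / window
            for i in range(years - window + 1)]

# Ordinary least-squares slope of the raw series vs. year index
n = len(data)
xbar = (n - 1) / 2
ybar = sum(data) / n
slope = (sum((i - xbar) * (y - ybar) for i, y in enumerate(data))
         / sum((i - xbar) ** 2 for i in range(n)))

print(f"mean = {ybar:.3f}, trend = {slope:.2e} per year")
# The trend is essentially zero, yet the smoothed series still wanders
# through decades-long "warm" and "cold" spells -- the Slutsky Effect.
```

Plot `smoothed` and you get exactly the kind of plausible-looking warm and cold intervals shown above.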
Kelly and Ó Gráda looked at European temperature reconstructions from 1300-2000, calculated 25-year moving averages, and found the following:
which looks meaningful, except it comes from this raw data, whose trend isn't statistically different from zero:
They've done more statistical analysis in a paper in the Annals of Applied Statistics, which I have not yet read but plan to.
So what to make of this?
I'm not sure. Patterns that look meaningful can be found in random data. A series of random events can combine to look like meaningful cycles in an economy, or a climate.
Is this all that modern global warming is, a time series analyzed so as to look meaningful? No, because its trend is statistically different from 0.
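That distinction can be made concrete with a simple significance test on the trend. The sketch below (a basic OLS t-test on the slope, my own illustration rather than Kelly and Ó Gráda's method) compares a pure-noise series with one that has a small real trend added:

```python
# Sketch: testing whether a series' trend is statistically different
# from zero, via the t-statistic of the OLS slope. Illustrative only.
import math
import random

def trend_t_stat(series):
    """Return (OLS slope vs. index, t-statistic of that slope)."""
    n = len(series)
    xbar = (n - 1) / 2
    ybar = sum(series) / n
    sxx = sum((i - xbar) ** 2 for i in range(n))
    slope = sum((i - xbar) * (y - ybar)
                for i, y in enumerate(series)) / sxx
    resid_var = sum((y - ybar - slope * (i - xbar)) ** 2
                    for i, y in enumerate(series)) / (n - 2)
    return slope, slope / math.sqrt(resid_var / sxx)

random.seed(0)
noise = [random.uniform(0.0, 1.0) for _ in range(2000)]
# Same noise plus a genuine upward trend (0.0005 per "year")
warming = [y + 0.0005 * i for i, y in enumerate(noise)]

_, t_noise = trend_t_stat(noise)
_, t_warm = trend_t_stat(warming)
print(f"t (random) = {t_noise:.1f}, t (trended) = {t_warm:.1f}")
```

For the pure-noise series |t| stays near zero, no matter how cyclical its moving average looks; for the trended series |t| is far beyond any conventional significance threshold.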
But other data? I think the lesson is that you need to be careful. Kelly and Ó Gráda conclude there was no Little Ice Age in a statistically meaningful sense. There were, to be sure, decades of worse-than-normal weather that affected crop production in some regions, and the people who depended on it.
But a widespread LIA? They say no.
More later.
5 comments:
Ah, so it isn't a temperature reconstruction!
Personally I have never believed in an LIA, because there does not appear to be much physical evidence for it. It always surprised me that some accept causes for the LIA, such as the Maunder Minimum, etc.
So what to make of this?
You really need statistical analysis.
This human tendency to see patterns much too easily might also be behind people who honestly feel that the hiatus is of importance, and do not just abuse it for stealth political reasons.
Missing a pattern could mean you are dead, seeing one too many is only a bit more work, which is no biggie in good times.
Someone with some experience would have noticed that the variations (y-axis) are very small and that that minor temperature increase people keep on talking about is missing in the last century.
David,
How are we confident that the temperature rise since 1850 is not the Slutsky effect?
Unknown: Extremely confident. In this case, the trend of 20th century temperatures is statistically different from zero, which would not be the case (I think) if it were the Slutsky effect.
Besides, there is an enormous amount of evidence, both indirect and (recently) direct, that CO2 causes warming of the magnitude we're seeing.
Post a Comment