They take the NCDC annual surface-temperature average for the contiguous U.S. from here. Here it is (in Fahrenheit):
Year | Annual mean (°F)
1991 | 53.90
1992 | 53.34
1993 | 52.00
1994 | 53.64
1995 | 53.45
1996 | 52.62
1997 | 53.02
1998 | 55.08
1999 | 54.67
2000 | 54.00
2001 | 54.41
2002 | 53.94
2003 | 54.02
2004 | 53.83
2005 | 54.36
2006 | 55.04
2007 | 54.38
2008 | 53.00
2009 | 53.07
2010 | 53.71
Now, the 10-yr change in the 10-yr moving average of these data is 0.40 °F (for 2010): the 2001-2010 mean is 0.40 °F warmer than the 1991-2000 mean. That's significant warming from one decade to the next.
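For anyone who wants to check the arithmetic, here is a minimal Python sketch (not from the post itself) that reproduces that decade-to-decade comparison from the table above:

    # NCDC contiguous-U.S. annual means from the table above, in °F
    temps_f = {
        1991: 53.90, 1992: 53.34, 1993: 52.00, 1994: 53.64, 1995: 53.45,
        1996: 52.62, 1997: 53.02, 1998: 55.08, 1999: 54.67, 2000: 54.00,
        2001: 54.41, 2002: 53.94, 2003: 54.02, 2004: 53.83, 2005: 54.36,
        2006: 55.04, 2007: 54.38, 2008: 53.00, 2009: 53.07, 2010: 53.71,
    }

    def ten_year_mean(end_year):
        """Mean of the ten annual values ending in end_year (one point of the 10-yr moving average)."""
        return sum(temps_f[y] for y in range(end_year - 9, end_year + 1)) / 10.0

    change = ten_year_mean(2010) - ten_year_mean(2000)
    print(f"10-yr change in the 10-yr moving average: {change:+.2f} °F")  # about +0.40 °F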
But if you cherry-pick a time interval that isn't climatologically significant and just look at the annual data for the last 10 years, you get a linear trend of -0.92 ± 0.65 °F/decade. That isn't statistically significant even at 1.5 sigma. And the interval ends in a strong La Niña. And USA48 is only 5.4% of the Earth's land area, and 1.6% of the Earth's total surface area. And quoting temperatures in °F makes the numbers look 1.8 times bigger than they would in °C.
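Again, easy to verify. A short sketch (my own, under the assumption of a plain ordinary-least-squares fit with the usual n-2 standard error) of the 2001-2010 trend:

    from math import sqrt

    years = list(range(2001, 2011))
    temps = [54.41, 53.94, 54.02, 53.83, 54.36, 55.04, 54.38, 53.00, 53.07, 53.71]

    n = len(years)
    x_mean = sum(years) / n
    y_mean = sum(temps) / n
    s_xx = sum((x - x_mean) ** 2 for x in years)
    s_xy = sum((x - x_mean) * (y - y_mean) for x, y in zip(years, temps))

    slope = s_xy / s_xx  # °F per year
    residual_ss = sum((y - (y_mean + slope * (x - x_mean))) ** 2
                      for x, y in zip(years, temps))
    slope_se = sqrt(residual_ss / (n - 2)) / sqrt(s_xx)  # 1-sigma error, °F per year

    print(f"trend: {slope * 10:+.2f} ± {slope_se * 10:.2f} °F/decade")
    # roughly -0.92 ± 0.65 °F/decade, i.e. only about 1.4 sigma from zero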
In other words, you have to look extremely hard and do some wild slicing and dicing to find something you can call "cooling." It's laughable to consider it a counterargument to AGW.
David, in his "graphs" the average temperature (black line) is often lower than the actual temperature (red line). I find it hilarious, and at the same time disappointing, that his fans fail to see it.
WUWT is the only place where a so-called climate discussion turns into a tomato-gardening thread.