Wednesday, July 06, 2016

UAH Lower Troposphere vs. GISS, Detrended

Here are the detrended data for UAH LT v6beta6 and GISS, monthly:

[Figure: detrended monthly anomalies, UAH LT v6beta6 vs. GISS]
The 1997-98 El Nino still stands out. So does a weird upward blip in UAH in January 2013. And, as Layzej conjectured, UAH LT does tend to have higher peaks than GISS during El Ninos (though not this year), and deeper drops during La Ninas.

Finally, since LT temperatures typically lag GISS by a few (?) months, I plotted the difference between UAH_LT_detrended_with_a_3-month_lag and GISS_detrended:

[Figure: UAH LT (detrended, lagged 3 months) minus GISS (detrended), monthly]
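For anyone who wants to reproduce these plots, here's a minimal sketch of the detrending and the 3-month-lag differencing, assuming the two monthly anomaly series are aligned NumPy arrays. The arrays below are synthetic stand-ins, not the actual UAH/GISS files; the fitted slope that gets removed is printed along the way.

```python
import numpy as np

def detrend(y):
    """Remove the least-squares linear trend from a monthly series."""
    t = np.arange(len(y)) / 12.0              # time in years
    slope, intercept = np.polyfit(t, y, 1)    # OLS linear fit
    print(f"removed trend: {slope * 10:.3f} C/decade")
    return y - (slope * t + intercept)

# Stand-in arrays -- substitute the actual UAH LT v6beta6 and GISS
# monthly anomalies, aligned to the same start month.
rng = np.random.default_rng(0)
n = 456                                       # e.g. Jan 1979 - Dec 2016
uah = rng.normal(0, 0.1, n) + np.linspace(0, 0.5, n)
giss = rng.normal(0, 0.1, n) + np.linspace(0, 0.6, n)

uah_d, giss_d = detrend(uah), detrend(giss)

# UAH lagged 3 months behind GISS: compare UAH at month t with GISS at t-3.
lag = 3
diff = uah_d[lag:] - giss_d[:-lag]
```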
7 comments:

Layzej said...

If you compare a detrended UAH 5.6 against a detrended UAH 6.x, it looks like the period 1993-2003 is about 0.1 C warmer in UAH 6.x. This further exaggerates the 1998 El Nino and makes a new record more difficult.
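One way to check this is to detrend both versions and average their difference over that window. A minimal sketch, with synthetic stand-ins for the two UAH series:

```python
import numpy as np

def detrend(y):
    t = np.arange(len(y)) / 12.0
    slope, intercept = np.polyfit(t, y, 1)
    return y - (slope * t + intercept)

# Stand-ins for the UAH v5.6 and v6.x monthly anomalies from Jan 1979;
# substitute the real series to reproduce the comparison.
rng = np.random.default_rng(0)
n = 456
v56 = rng.normal(0, 0.1, n) + np.linspace(0, 0.45, n)
v6x = v56.copy()
v6x[(1993 - 1979) * 12:(2004 - 1979) * 12] += 0.1   # synthetic 0.1 C offset

d56, d6x = detrend(v56), detrend(v6x)

# Mean difference of the detrended series over 1993-2003:
win = slice((1993 - 1979) * 12, (2004 - 1979) * 12)
print(f"v6.x minus v5.6, 1993-2003: {np.mean(d6x[win] - d56[win]):+.2f} C")
```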

Interesting point about the 3 month lag. It gave me some hope that we'd still see UAH rise relative to the surface station records (boosting my chances at winning my bet), but that doesn't seem to have happened. UAH is dropping in sync with the surface records.

David in Cal said...

David, how much de-trending was done? That is, can you say what amount of trend you removed from the data? Thanks.

JoeT said...

I compared the detrended RATPAC-A radiosonde data with the GISS dataset here. You can find the data for the 850-300 mb layer of the troposphere here.

You'll notice that, as with the UAH data, the surface and tropospheric data are similar for the most recent El Nino, unlike in 1997-1998, when the troposphere appeared to warm more than the surface.

This wasn't what I was expecting. My own suspicion was that the hot load calibration on the satellite data was degrading.

By the way, the trend for GISS over 1995-2016 was 0.17 C/decade, while for RATPAC it was 0.24 C/decade.

JoeT said...

Now that I'm at least partially convinced that the degradation of the hot load calibration is not the cause of the similarity between the surface and tropospheric warming during the recent El Nino, let's try something else.

In 1980 Manabe and Stouffer published their paper, Sensitivity of a global climate model to an increase of CO2 concentration in the atmosphere, in which they calculated how the decrease in albedo at northern latitudes leads to polar amplification. Here's a particularly pertinent quote:

"It is shown that the sensitivity of a global climate model to an increase of the CO2 content in the atmosphere has significant seasonal and latitudinal variations. For example, the CO2-induced warming of the surface air is particularly large in high latitudes owing mainly to the poleward retreat of highly reflective snow cover and sea ice. However, the warming over the Antarctic continent is significantly less than the warming over the Arctic Ocean partly because of the smallness of a snow albedo feedback mechanism over Antarctica."

Over at the GISS web site you can make maps of the temperature anomaly as a function of latitude for different intervals of time. Here is a comparison of the zonal anomaly for May 1997 - May 1998 with that for May 2015 - May 2016. Note that the scales are different but the reference periods are the same.

What you see in these plots is the huge increase in temperature anomaly in the northern latitudes during the latest El Nino. What this points to is that the polar amplification that Manabe and Stouffer predicted 35 years ago showed up in full force for the most recent El Nino.
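To put a number on that amplification, one could take the area-weighted mean anomaly poleward of 64N and divide by the area-weighted global mean. A minimal sketch, using a synthetic anomaly grid as a stand-in for real GISTEMP gridded data:

```python
import numpy as np

# Stand-in anomaly field on a 2-degree grid (lat x lon); substitute a
# real GISTEMP anomaly map to compute the actual amplification factor.
lats = np.arange(-89, 90, 2.0)
lons = np.arange(-179, 180, 2.0)
rng = np.random.default_rng(1)
anom = rng.normal(0.8, 0.3, (lats.size, lons.size))
anom[lats >= 64, :] += 2.0        # exaggerated Arctic warmth, for illustration

w = np.cos(np.radians(lats))      # area weight ~ cos(latitude)

def area_mean(field, mask):
    """Area-weighted mean of the zonal means over the selected latitudes."""
    return np.average(field[mask].mean(axis=1), weights=w[mask])

global_mean = area_mean(anom, np.ones_like(lats, bool))
arctic_mean = area_mean(anom, lats >= 64)
print(f"Arctic/global amplification: {arctic_mean / global_mean:.1f}x")
```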

David Appell said...

Joe, those are some good ideas.

Are you suggesting that polar amplification has increased enough between the 1997/98 El Nino and 2015/16 to be apparent in the lower tropospheric temperatures?

But wouldn't surface temperatures also have increased during that time due to polar amplification, so that they would have been higher than the LT temperatures?

JoeT said...

David, think of my last 2 posts as me thinking out loud. I actually have some experience measuring cyclotron emission from a hot plasma, so the hot load calibration seemed a natural thing to screw up on a satellite. Carl Mears talks about it here.

"This 2-point calibration system continuously compensates for variations in the radiometer gain and noise temperatures. This seemingly simple calibration methodology is fraught with subtle difficulties. The cold mirror is relatively trouble-free as long we note when the moon intrudes on the cold space view and remove moon-affected values. The hot absorber has been more problematic. The thermistors often do not adequately measure thermal gradients across the hot absorber. For example, a hot load correction is required for AMSR-E because of a design flaw in the AMSR-E hot load. The hot load acts as a blackbody emitter and its temperature is measured by precision thermistors. Unfortunately, during the course of an orbit, large thermal gradients develop within the hot load due to solar heating making it difficult to determine the average effective temperature from the thermistor readings. The thermistors themselves measure these gradients and may vary by up to 15 K. Several other radiometers have had similar, but smaller, issues."

The weird thing is that the trends in the RATPAC-A data and in the UAH/RSS data are very different, yet the detrended natural variation looks very similar. It's a curious thing.

As to the second point I was making: I was thinking that the Walker circulation heats the lower troposphere (after detrending) to roughly the same level in 1997/1998 and in 2015/2016, but this time the polar amplification really kicked in and boosted the surface temperature increase to something comparable to the increase in the troposphere.

As I usually say, this isn't my area so I could be wrong, but it's a thought.

(BTW, it's too bad Greg Johnson only showed up briefly. It's still not clear to me how one gets the net TOA imbalance when the individual radiances are 2 orders of magnitude higher.)
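On that last question, a back-of-the-envelope illustration of the difficulty: the net imbalance is a residual of order 1 W/m^2 left over from fluxes of order hundreds of W/m^2, so even small absolute calibration errors in the individual measurements swamp it. The flux values below are rounded, approximate global means; the 0.5% error figure is illustrative. This is presumably why estimates of the absolute imbalance lean on ocean heat content rather than on the satellite fluxes alone.

```python
# Approximate global-mean TOA energy fluxes (W/m^2), rounded.
incoming_solar = 340.0
reflected_sw = 100.0
outgoing_lw = 239.0

net_imbalance = incoming_solar - reflected_sw - outgoing_lw
print(f"net TOA imbalance: {net_imbalance:.1f} W/m^2")

# Even an optimistic 0.5% absolute calibration error on each flux,
# combined in quadrature (assuming independent errors), is far larger
# than the signal itself:
err = 0.005 * (incoming_solar**2 + reflected_sw**2 + outgoing_lw**2) ** 0.5
print(f"combined 0.5% calibration error: +/- {err:.1f} W/m^2")
```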