Microwave Sounding Unit temperature measurements


Microwave Sounding Unit temperature measurements refers to temperature measurement using the Microwave Sounding Unit instrument and is one of several methods of measuring the temperature of Earth's atmosphere from satellites. Microwave measurements have been obtained from the troposphere since 1979, when MSU instruments were first carried on NOAA weather satellites, starting with TIROS-N. By comparison, the usable balloon (radiosonde) record begins in 1958 but has less geographic coverage and is less uniform.
Microwave brightness measurements do not directly measure temperature. They measure radiances in various wavelength bands, which must then be mathematically inverted to obtain indirect inferences of temperature. The resulting temperature profiles depend on details of the methods that are used to obtain temperatures from radiances. As a result, different groups that have analyzed the satellite data have obtained different temperature trends. Among these groups are Remote Sensing Systems and the University of Alabama in Huntsville. The satellite series is not fully homogeneous – the record is constructed from a series of satellites with similar but not identical instrumentation. The sensors deteriorate over time, and corrections are necessary for satellite drift in orbit. Particularly large differences between reconstructed temperature series occur at the few times when there is little temporal overlap between successive satellites, making intercalibration difficult.

Creation of the satellite temperature record

From 1979 to 2005 the Microwave Sounding Units (MSUs), and since 1998 the Advanced Microwave Sounding Units (AMSUs), on NOAA polar-orbiting satellites have measured the intensity of upwelling microwave radiation from atmospheric oxygen. The intensity is proportional to the temperature of broad vertical layers of the atmosphere, as demonstrated by theory and by direct comparisons with atmospheric temperatures from radiosonde profiles.
Different frequencies sample a different weighted range of the atmosphere, depending on how strongly the microwaves are absorbed along the path through the atmosphere. To derive the temperature profile at lower altitudes and remove the stratospheric influence, researchers have developed synthetic products by subtracting signals measured at different altitudes and view angles, such as "2LT", which has its weighting maximum at about 650 hPa. However, this process amplifies noise, increases inter-satellite calibration biases and enhances surface contamination.
Records have been created by merging data from nine different MSUs and from the AMSU instruments, each with peculiarities that must be calculated and removed because they can have substantial impacts on the resulting trend.
The process of constructing a temperature record from a radiance record is difficult, and some of the required corrections are as large as the trend itself.

Analysis technique

Upwelling radiance is measured at different frequencies; these different frequency bands sample a different weighted range of the atmosphere. Since the atmosphere is partially but not completely opaque, the brightness measured is an average across a band of atmosphere, depending on the penetration depth of the microwaves.
The brightness temperature measured by the satellite is given by:

$$ T_b = W_s T_s + \int_0^{\infty} T(z)\, W(z)\, dz $$

where $W_s$ is the surface weight, $T_s$ and $T(z)$ are the temperatures at the surface and at the atmospheric level $z$, and $W(z)$ is the atmospheric weighting function.
Both the surface and atmospheric weights depend on the surface emissivity $\epsilon_s$, the absorption coefficient $\alpha(z)$ and the Earth incidence angle $\theta$; the surface weight is the product of $\epsilon_s$ and an attenuation factor:

$$ W_s = \epsilon_s\, e^{-\tau(0,\infty)\sec\theta} $$

where the $\sec\theta$ term accounts for the dependence of optical path length on the viewing angle, and $\tau$ is the optical depth:

$$ \tau(z_1, z_2) = \int_{z_1}^{z_2} \alpha(z)\, dz $$

The atmospheric weighting function can be written as:

$$ W(z) = \alpha(z)\sec\theta \left[ e^{-\tau(z,\infty)\sec\theta} + (1-\epsilon_s)\, e^{-\left(\tau(0,\infty)+\tau(0,z)\right)\sec\theta} \right] $$

The first term in this equation represents the radiation emitted upward from level $z$ and attenuated along the path to the top of the atmosphere; the second term represents the radiation emitted downward from level $z$ to the surface and reflected back by the surface to the top of the atmosphere. The exact form of $\alpha(z)$ depends on the temperature, water vapor and liquid water content of the atmosphere.
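The expressions above can be evaluated numerically. The sketch below is purely illustrative: the emissivity, absorption profile and temperature profile are invented rather than real MSU channel parameters, and it simply shows how a brightness temperature arises as a weighted average of surface and atmospheric temperatures.

```python
import numpy as np

# Illustrative values only; not real MSU channel parameters.
eps_s = 0.9                                   # assumed surface emissivity
sec_theta = 1.0                               # sec(theta) for a nadir view (theta = 0)

# Simple model atmosphere on a height grid (km)
z = np.linspace(0.0, 30.0, 301)
dz = z[1] - z[0]
T_s = 288.0                                   # surface temperature (K)
T = T_s - 6.5 * np.minimum(z, 11.0)           # crude lapse rate, isothermal above 11 km
alpha = 0.15 * np.exp(-z / 8.0)               # assumed absorption coefficient (1/km)

# Optical depths: tau(z, inf) and tau(0, z)
tau_to_top = np.cumsum(alpha[::-1])[::-1] * dz    # tau(z, infinity)
tau_total = tau_to_top[0]                         # tau(0, infinity)
tau_from_sfc = tau_total - tau_to_top             # tau(0, z)

# Surface weight: W_s = eps_s * exp(-tau(0, inf) * sec(theta))
W_s = eps_s * np.exp(-tau_total * sec_theta)

# Atmospheric weighting function: upwelling term + surface-reflected downwelling term
W = alpha * sec_theta * (
    np.exp(-tau_to_top * sec_theta)
    + (1.0 - eps_s) * np.exp(-(tau_total + tau_from_sfc) * sec_theta)
)

# Brightness temperature: T_b = W_s * T_s + integral of T(z) * W(z) dz
T_b = W_s * T_s + np.trapz(T * W, z)
print(f"surface weight = {W_s:.3f}, T_b = {T_b:.1f} K")
print(f"weighting function peaks near z = {z[np.argmax(W)]:.1f} km")
```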

Channels

MSU channel 1 is not used to monitor atmospheric temperature because it is too sensitive to emission from the surface; it is also heavily contaminated by water vapor and liquid water in the lowermost troposphere.
Channel 2 or TMT is broadly representative of the troposphere, albeit with a significant overlap with the lower stratosphere; the weighting function has its maximum at 350 hPa and half-power at about 40 and 800 hPa.
Figure 3 shows the atmospheric levels sampled by different channels of the satellite measurements, where TLS, TTS, and TTT denote three of these products. Note that the lowest measurement, TTT, includes brightness from both atmospheric and surface emission. TMT and TLT represent, respectively, the mid-troposphere channel and the computed lower-troposphere temperature, derived as discussed below.
The T4 or TLS channel is representative of the temperature of the lower stratosphere, with a peak weighting function at around 17 km above the Earth's surface.
; Calculation of lower troposphere temperature
In an attempt to derive data for lower altitudes and remove the stratospheric influence, several researchers have developed synthetic products that subtract the higher-altitude values from the lowest-altitude measurement. Such a data-analysis technique depends on modeling the effect of altitude on temperature. However, this process amplifies noise, increases inter-satellite calibration biases and enhances surface contamination. Spencer and Christy developed the synthetic "2LT" product by subtracting signals measured at different view angles; it has its weighting maximum at about 650 hPa. The 2LT product has gone through numerous versions as various corrections have been applied. Another such methodology was developed by Fu and Johanson; their TTT channel is a linear combination of the TMT and TLS channels: TTT = 1.156 × TMT − 0.153 × TLS for the global average and TTT = 1.12 × TMT − 0.11 × TLS at tropical latitudes.
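For illustration, the Fu and Johanson combination can be applied directly to TMT and TLS values (or to their linear trends, since the combination is linear). Only the coefficients come from the text above; the input numbers below are placeholders of the same order as the trends reported later in this article, not official dataset values.

```python
def ttt_global(tmt, tls):
    """Fu and Johanson TTT combination, global-average coefficients."""
    return 1.156 * tmt - 0.153 * tls

def ttt_tropics(tmt, tls):
    """Fu and Johanson TTT combination, tropical coefficients."""
    return 1.12 * tmt - 0.11 * tls

# Placeholder decadal trends (°C/decade), for illustration only
tmt_trend, tls_trend = 0.14, -0.26
print(f"global TTT trend  : {ttt_global(tmt_trend, tls_trend):+.3f} °C/decade")
print(f"tropical TTT trend: {ttt_tropics(tmt_trend, tls_trend):+.3f} °C/decade")
```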

Measurement corrections

; Diurnal sampling
All the MSU instruments, and to a lesser extent the AMSUs, drift slowly away from their sun-synchronous equatorial crossing time, changing the local time observed by the instrument; as a result, the natural diurnal cycle may be aliased into the long-term trend.
The diurnal sampling correction is on the order of a few hundredths of a °C per decade for TLT and TMT.
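The toy simulation below illustrates why this correction matters. All numbers (diurnal cycle amplitude, drift rate, observation times) are invented; it only shows how a slow drift in local observing time can alias part of the diurnal cycle into a spurious long-term trend even when the underlying climate trend is zero.

```python
import numpy as np

# Invented parameters for illustration only
years = np.arange(0.0, 10.0, 1.0 / 12.0)     # monthly samples over 10 years
diurnal_amplitude = 1.0                      # K, local diurnal cycle amplitude
crossing_time = 13.5 + 0.2 * years           # local crossing time drifting 0.2 h per year

# Observed temperature = flat climate signal + the point of the diurnal cycle
# sampled at the (drifting) local observation time
observed = diurnal_amplitude * np.cos(2.0 * np.pi * (crossing_time - 14.0) / 24.0)

# A straight-line fit now shows a non-zero trend caused purely by the drift
spurious_trend = np.polyfit(years, observed, 1)[0] * 10.0   # K per decade
print(f"spurious trend from observing-time drift: {spurious_trend:+.3f} K/decade")
```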
; Orbit decay
All polar-orbiting satellites lose altitude after launch; the orbital decay is stronger during periods of elevated solar activity, when enhanced ultraviolet radiation warms the upper atmosphere and increases frictional drag on the spacecraft.
The orbital decay changes the instrument view angle relative to the surface and thus the observed microwave emissivity. Furthermore, because the long-term time series is constructed by sequentially merging the inter-calibrated satellite data, the error accumulates over time; the required correction is on the order of 0.1 °C per decade for TLT.
; Calibration changes
Once every Earth scan, the MSU instruments view deep space and an on-board warm target to make calibration measurements. However, as the spacecraft drifts through the diurnal cycle, the calibration target temperature may change because of varying solar shadowing effects; the correction is on the order of 0.1 °C per decade for TLT and TMT.
One widely reported satellite temperature record is that developed by Roy Spencer and John Christy at the University of Alabama in Huntsville. The record comes from a succession of different satellites, and problems with inter-calibration between the satellites are important, especially for NOAA-9, which accounts for most of the difference between the RSS and UAH analyses. NOAA-11 played a significant role in a 2005 study by Mears et al. that identified an error in the diurnal correction; this error led to the 40% jump in Spencer and Christy's trend from version 5.1 to 5.2.

Trends

As discussed above, the record has been created by merging data from nine different MSUs and from the AMSUs, each with peculiarities that must be calculated and removed because they can have substantial impacts on the resulting trend. Problems with inter-calibration between successive satellites are important, especially NOAA-9, which accounts for most of the difference between the various analyses, while NOAA-11 was central to the 2005 identification of the diurnal-correction error noted above. There are ongoing efforts to resolve these differences between the satellite temperature datasets.

Comparison with surface trends

To compare the MSU retrievals to the trend from the surface temperature record it is most appropriate to derive trends for the part of the atmosphere nearest the surface, i.e., the lower troposphere. As discussed earlier, the lowest of the temperature retrievals, TLT, is not a direct measurement, but a value calculated by subtracting higher altitude brightness temperature from the lower measurements. The trends found from the UAH and the RSS groups, shown in the table below, are calculated by slightly different methods, and result in different values for the trends.
Using the T2 or TMT channel, Mears et al. of Remote Sensing Systems find a trend of +0.140 °C/decade. Spencer and Christy of the University of Alabama in Huntsville find a smaller trend of +0.08 °C/decade.
In comparing these measurements to surface temperature records, it is important to note that the values retrieved by the MSU for the lower troposphere are a weighted average of temperatures over multiple altitudes, not a surface temperature. The results are thus not precisely comparable to surface temperature records.

Trends from the record

Global trends (°C/decade) from the different analyses:

Channel | Start | End date | RSS v4.0 | UAH v6.0 | STAR v3.0 | UW UAH | UW RSS
TLT     | 1979  | 2017-05  | 0.184    | 0.12     | –         | –      | –
TTT     | 1979  | 2017-01  | 0.180    | 0.13     | 0.14      | –      | –
TMT     | 1979  | 2017-01  | 0.140    | 0.08     | 0.129     | –      | –
TLS     | 1979  | 2017-01  | −0.260   | −0.31    | −0.263    | –      | –

Another satellite temperature analysis is provided by the NOAA/NESDIS Center for Satellite Applications and Research (STAR), which uses simultaneous nadir overpasses to remove satellite intercalibration biases, yielding more accurate temperature trends. The STAR-NOAA analysis finds a 1979–2016 trend of +0.129 °C/decade for the TMT channel.
Using an alternative adjustment to remove the stratospheric contamination, 1979–2011 trends of +0.14 °C/decade (when applied to the RSS data set) and +0.11 °C/decade (when applied to the UAH data set) have been found.
A University of Washington analysis finds 1979–2012 trends of +0.13 °C/decade when applied to the RSS data set and +0.10 °C/decade when applied to the UAH data set.

Combined surface and satellite data

In 2013, Cowtan and Way suggested that global temperature averages based on surface temperature data had a possible source of bias due to incomplete global coverage if the unsampled regions are not uniformly distributed over the planet's surface. They addressed this problem by combining the surface temperature measurements with satellite data to fill in the coverage. Over the time period 1979-2016, combining the HadCRUT4 surface data with UAH satellite coverage, they show a global surface-warming trend of 0.188 °C/decade.
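A minimal sketch of the general idea follows, under the simplifying assumption that grid cells lacking surface observations are filled directly with satellite lower-troposphere anomalies; Cowtan and Way's published method uses kriging and a more careful hybrid blending rather than direct substitution, and the numbers below are synthetic.

```python
import numpy as np

# Toy 5° x 5° anomaly grids (°C); NaN marks cells with no surface observations
rng = np.random.default_rng(0)
surface = rng.normal(0.4, 0.3, size=(36, 72))
surface[:4, :] = np.nan                            # e.g. poorly sampled polar latitudes
satellite = rng.normal(0.35, 0.3, size=(36, 72))   # lower-troposphere anomalies, full coverage

# Hybrid field: use surface data where available, satellite data elsewhere
hybrid = np.where(np.isnan(surface), satellite, surface)

# Area weighting by the cosine of latitude for each grid row
lats = np.deg2rad(np.linspace(-87.5, 87.5, 36))
weights = np.cos(lats)[:, None] * np.ones((36, 72))

def weighted_mean(field, weights):
    mask = ~np.isnan(field)
    return np.sum(field[mask] * weights[mask]) / np.sum(weights[mask])

print(f"surface-only global mean: {weighted_mean(surface, weights):+.3f} °C")
print(f"hybrid global mean:       {weighted_mean(hybrid, weights):+.3f} °C")
```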

History of satellite temperature data interpretation

The early disagreement between the surface temperature record and the satellite records was a subject of research and debate. A lack of warming then seen in the UAH retrieval trends for 1978–1998 was noted by Christy and Spencer and commented on in a 2000 report by the National Research Council and in the 2001 IPCC Third Assessment Report.
Christy et al. claimed that tropical temperature trends from radiosondes match most closely with their v5.2 UAH dataset. Furthermore, they asserted that a discrepancy between RSS and radiosonde trends began in 1992, when the NOAA-12 satellite was launched.
In 1998 the UAH data showed a cooling of 0.05 K per decade. Wentz and Schabel at RSS showed in their 1998 paper that this was an artifact of the orbital decay of the NOAA satellites. Once the orbital changes had been allowed for, the data showed a 0.07 K per decade increase in temperature at this level of the atmosphere.
Another important critique of the early satellite record was its shortness—adding a few years on to the record or picking a particular time frame could change the trends considerably.
Through early 2005, even though they began with the same data, each of the major research groups had interpreted it with different results. Most notably, Mears et al. at RSS found +0.193 °C/decade for the lower troposphere up to July 2005, compared to +0.123 °C/decade found by UAH for the same period.
There were ongoing efforts to resolve these differences. Much of the disparity in early results was resolved by the three papers in Science, 11 August 2005, which pointed out errors in the UAH 5.1 record and the radiosonde record in the tropics.
An alternative adjustment to remove the stratospheric contamination was introduced by Fu et al. After the correction, the vertical weighting function is nearly the same as that of the T2 channel in the troposphere.
Another re-analysis, by Vinnikov et al. in 2006, found +0.20 °C per decade.
Analysis over a longer time period has resolved some, but not all, of the discrepancy in the data. The IPCC Fifth Assessment Report stated: "based on multiple independent analyses of measurements from radiosondes and satellite sensors it is virtually certain that globally the troposphere has warmed and the stratosphere has cooled since the mid-20th century. Despite unanimous agreement on the sign of the trends, substantial disagreement exists among available estimates as to the rate of temperature changes, particularly outside the NH extratropical troposphere, which has been well sampled by radiosondes", and concluded "Although there have been substantial methodological debates about the calculation of trends and their uncertainty, a 95% confidence interval of around ±0.1 °C per decade has been obtained consistently for both LT and MT."

Corrections to UAH data trends

As well as the correction by Wentz and Schabel, doubts had been raised as early as 2000 about the UAH analysis by the work of Prabhakara et al., which minimised errors due to satellite drift. They found a trend of 0.13 °C/decade, in reasonable agreement with surface trends.
Since the earliest release of results in the 1990s, a number of adjustments to the algorithm computing the UAH TLT dataset have been made. A table of the corrections can be found in the UAH satellite temperature dataset article.

Recent trend summary

To compare to the trend from the surface temperature record it is most appropriate to derive trends for the part of the atmosphere nearest the surface, i.e., the lower troposphere. Doing this, through December 2019:
For some time the only available satellite record was the UAH version, which showed a global cooling trend for its first decade. Since then, a longer record and a number of corrections to the processing have revised this picture, with both UAH and RSS measurements showing a warming trend.
A detailed analysis produced in 2005 by dozens of scientists as part of the US Climate Change Science Program identified and corrected errors in a variety of temperature observations, including the satellite data. Their report stated:
The 2007 IPCC Fourth Assessment Report states:

Tropical Troposphere

Climate models predict that as the surface warms, so should the global troposphere. Globally, the troposphere is predicted to warm about 1.2 times more than the surface; in the tropics, the troposphere should warm about 1.5 times more than the surface. However, the 2005 CCSP report noted that fingerprinting techniques applied to the data showed that "Volcanic and human-caused fingerprints were not consistently identifiable in observed patterns of lapse rate change". In particular, a possible inconsistency was noted in the tropics, the area in which tropospheric amplification should be most clearly seen. They stated:
The most recent climate model simulations give a range of results for changes in global average temperature. Some models show more warming in the troposphere than at the surface, while a slightly smaller number of simulations show the opposite behavior. There is no fundamental inconsistency among these model results and observations at the global scale, with the trends now being similar.
Globally, most climate models used by the IPCC in preparation of the Fourth Assessment Report in 2007 show slightly greater warming at the TLT level than at the surface for 1979–1999. While the GISS surface trend is +0.161 °C/decade for 1979–2012, the lower-troposphere trends calculated from satellite data by UAH and RSS are +0.130 °C/decade and +0.206 °C/decade, respectively.
The lower troposphere trend derived from UAH satellites is currently lower than both the GISS and Hadley Centre surface station network trends, while the RSS trend is similar. However, if the expected trend in the lower troposphere is indeed higher than the surface, then given the surface data, the troposphere trend would be around 0.194 °C/decade, making the UAH and RSS trends 66% and 81% of the expected value respectively.
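The expected value quoted above follows from applying the global amplification factor of about 1.2 to the GISS surface trend; the small difference from 0.194 reflects rounding of the surface trend:

$$ \text{expected TLT trend} \approx 1.2 \times 0.161\ ^{\circ}\mathrm{C/decade} \approx 0.193\ ^{\circ}\mathrm{C/decade} $$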

Reconciliation with climate models

While the satellite data now show global warming, there is still some difference between what climate models predict and what the satellite data show for warming of the lower troposphere, with the climate models predicting slightly more warming than what the satellites measure.
Both the UAH dataset and the RSS dataset have shown an overall warming trend since 1998, although the UAH retrieval shows slightly less warming than the RSS. In June 2017, RSS released v4 which significantly increased the trend seen in their data, increasing the difference between RSS and UAH trends.
Atmospheric measurements taken by a different satellite measurement technique, the Atmospheric Infrared Sounder on the Aqua satellite, show close agreement with surface data.