Will increasing carbon dioxide cause warming that is so small that it can be safely ignored (low climate sensitivity)?
Or will it cause a global warming Armageddon (high climate sensitivity)?
The answer depends upon the net radiative feedback: the rate at which the Earth loses extra radiant energy with warming. Climate sensitivity is mostly determined by changes in clouds and water vapor in response to the small, direct warming influence from (for instance) increasing carbon dioxide concentrations.
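The link between the net feedback and climate sensitivity can be sketched with the standard linear energy-balance relation: equilibrium warming equals the radiative forcing divided by the net feedback parameter. The numbers below are illustrative placeholders, not values from the paper.

```python
# Sketch of how the net feedback parameter maps to climate sensitivity,
# under the standard linear energy-balance assumption. Values here are
# illustrative only.

F_2X = 3.7  # approximate radiative forcing from doubled CO2, W/m^2

def equilibrium_warming(forcing, lambda_net):
    """Equilibrium warming (K) = forcing / net feedback parameter.

    lambda_net is in W/m^2/K: the rate at which the Earth sheds extra
    radiant energy per degree of surface warming.
    """
    return forcing / lambda_net

# A weak net feedback (small lambda) means high climate sensitivity...
print(equilibrium_warming(F_2X, 1.0))  # 3.7 K per CO2 doubling
# ...while a strong net feedback means low sensitivity.
print(equilibrium_warming(F_2X, 6.0))  # ~0.6 K per CO2 doubling
```

The same forcing thus produces either alarming or negligible warming depending entirely on the size of the net feedback, which is why estimating it from observations matters so much.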
The net radiative feedback can be estimated from global, satellite-based measurements of natural climate variations in (1) Earth's radiation budget, and (2) tropospheric temperatures.
These feedback estimates have been limited mainly by the availability of the first measurement: the best-calibrated radiation budget data come from the NASA CERES instruments, with data now available for 9.5 years from the Terra satellite, and 7 years from the Aqua satellite. Both datasets now extend through September of 2009.
But, as we show in our new paper (in press) in the Journal of Geophysical Research, these feedbacks cannot be estimated through simple linear regression on satellite data, which will almost always result in an underestimate of the net feedback, and thus an overestimate of climate sensitivity.
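The regression bias can be seen with synthetic data: when internal radiative forcing (e.g. random cloud variations) both appears in the measured flux and drives part of the temperature change, regressing flux on temperature understates the true feedback. Everything here, including the "true" feedback value, is made up for demonstration.

```python
# Synthetic illustration of the regression bias: internal radiative
# forcing contaminates the flux-vs-temperature relationship, dragging
# the regression slope below the true feedback. All numbers are
# illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(0)
n_months = 120
true_lambda = 4.0  # assumed true net feedback, W/m^2/K

forcing = rng.normal(0.0, 1.0, n_months)  # internal radiative forcing
# Temperature partly responds to that forcing, plus other variability:
temp = 0.2 * forcing + rng.normal(0.0, 0.2, n_months)
# Measured net loss of radiant energy = feedback response minus forcing:
net_flux = true_lambda * temp - forcing

est_lambda = np.polyfit(temp, net_flux, 1)[0]
print(est_lambda)  # well below true_lambda -> sensitivity overestimated
```

Because the forcing term is correlated with temperature, the fitted slope mixes feedback and forcing, biasing the feedback estimate low and the inferred sensitivity high.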
Without going into the detailed justification, we have found that the most robust method for feedback estimation is to compute the month-to-month slopes (shown as the line segments in the above graph), and sort them from the largest 1-month temperature changes to the smallest (ignoring the distinction between warming and cooling).
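A minimal sketch of that slope-sorting idea, on synthetic monthly anomalies: compute the month-to-month flux-versus-temperature slopes and order them by the magnitude of the 1-month temperature change, largest first. The variable names and data are ours; the paper's actual procedure may differ in detail.

```python
# Sketch of the slope-sorting method described above, using synthetic
# monthly anomalies (not real CERES data).
import numpy as np

rng = np.random.default_rng(1)
temp = rng.normal(0.0, 0.3, 24)               # monthly temperature anomalies, K
flux = 4.0 * temp + rng.normal(0.0, 0.5, 24)  # monthly net flux anomalies, W/m^2

d_temp = np.diff(temp)   # 1-month temperature changes
d_flux = np.diff(flux)   # 1-month flux changes
slopes = d_flux / d_temp  # month-to-month slopes, W/m^2 per K

# Sort by |dT|, largest first -- warming and cooling treated alike:
order = np.argsort(-np.abs(d_temp))
sorted_slopes = slopes[order]
print(sorted_slopes[:5])  # slopes for the biggest temperature swings
```

The intuition is that large month-to-month temperature swings are dominated by the feedback response, so slopes computed from them are less contaminated by internal radiative forcing than an all-data regression.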
These results suggest that the sensitivity of the real climate system is less than that exhibited by ANY of the IPCC climate models. This will end up being a serious problem for global warming predictions.