Comments

marmoe
Jul. 11th, 2012 10:00 pm (UTC)
It's impressive, but it's not a good metric: the number of records set depends on how the thermometers are distributed. If you weight by area instead, you still get something like three times as many heat records, I think.

James Hansen, Makiko Sato and Reto Ruedy of NASA have put an interesting manuscript onto the arXiv server (keep in mind, this is not peer reviewed). They have taken a closer look at temperature anomalies (deviations from the local average for a given day) over time. From these you obtain not only the average temperature but also the probability distribution of temperatures about their "expected" local value for a decade. This usually gives you a Gaussian curve whose center value and width are properties of the location. You can then normalize each local distribution so that its center is at zero and its sigma (a measure of the width) is one for a chosen base decade. If the local climate changes, the center and the width of the curve change with it. What they have done is compute a global average of the normalized local data.

The results are interesting and worrying. The distributions have shifted by about one sigma toward higher temperatures and have broadened. Both effects strongly increase the probability of high-temperature records and decrease the probability of low-temperature records. Of course this is not an explanation of the observed phenomenon, but a more detailed description of what is happening. Have a look at their figure 1 to see what I mean.
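The normalization step described above can be sketched in a few lines of Python. This is a toy illustration, not the authors' actual pipeline: the two synthetic "decades" of daily anomalies, and the amount of shift and broadening, are made-up assumptions chosen only to show how standardizing by a base decade's mean and sigma makes the warming and widening visible, and how sharply it inflates the odds of extreme hot readings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical daily temperature anomalies (deg C) at one station:
# a base decade, and a later decade that is warmer and more variable.
base_decade = rng.normal(loc=0.0, scale=2.0, size=3650)
later_decade = rng.normal(loc=2.0, scale=2.6, size=3650)

# Normalize BOTH decades by the base decade's center and sigma,
# so the base distribution has mean ~0 and width ~1 by construction.
mu, sigma = base_decade.mean(), base_decade.std()
z_base = (base_decade - mu) / sigma
z_later = (later_decade - mu) / sigma

# The later decade's normalized distribution is shifted toward
# higher temperatures (mean near +1 sigma) and broadened (std > 1).
print(z_later.mean(), z_later.std())

# A shift plus a broadening makes formerly rare "3-sigma hot" days
# far more frequent, while very cold days become rarer.
hot_frac_base = (z_base > 3.0).mean()
hot_frac_later = (z_later > 3.0).mean()
print(hot_frac_base, hot_frac_later)
```

In the real analysis the normalization is done per location before averaging globally, which is what lets stations with very different local variability be pooled into the single distribution shown in their figure 1.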

P.S.:
Welcome to the Rest of Our Lives
http://youtu.be/b0NrS2L6KcE

This should take you to the interview with Kevin Trenberth (NCAR) from the video above.
http://video.pbs.org/video/2252502406

Edited at 2012-07-11 11:38 pm (UTC)
