
So Where Is The Climate Science Money Actually Going If Not To Temperature Measurement?

You are likely aware that the US, and many other countries, are spending billions and billions of dollars on climate research.  After drug development, it has probably become the single most lucrative field in academia.

Let me ask a question.  If you were concerned (as you should be) about lead in soil and drinking water and how it might or might not be getting into the bloodstream of children, what would you spend money on?  Sure, better treatments and new technologies for filtering and cleaning up lead.  But wouldn't the number one investment be in more and better measurement of environmental and human lead concentrations, and how they might be changing over time?

So you would suppose that anyone worried about a global rise in temperatures would invest in better and more complete measurement of those temperatures.  Hah!  You would be wrong.

There are three main global surface temperature histories: the combined CRU-Hadley record (HadCRUT), the NASA GISS record (GISTEMP), and the NOAA record.  All three global averages depend on the same underlying land data archive, the Global Historical Climatology Network (GHCN).  Because of this shared reliance on GHCN, its quality deficiencies constrain the quality of every product derived from it.

The number of weather stations providing data to GHCN plunged in 1990 and again in 2005. The sample size has fallen by over 75% from its peak in the early 1970s, and is now smaller than at any time since 1919.
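The raw GHCN-Monthly files are public, so the station-count claim is easy to check for yourself.  Below is a minimal Python sketch, assuming the GHCN-M v4 unadjusted fixed-width .dat layout (station ID, year, element, then twelve value-plus-flag groups); the file name is a placeholder, and the column offsets should be verified against the README that ships with the data.  It simply counts how many distinct stations report at least one monthly mean temperature in each year:

```python
# Count the distinct GHCN-M stations reporting TAVG in each year.
# Assumed GHCN-M v4 unadjusted .dat fixed-width layout:
#   cols 1-11 station ID, 12-15 year, 16-19 element,
#   then 12 repeating groups of (5-char value, 3 flag chars); -9999 = missing.
# The file name below is a placeholder -- use whatever you downloaded from NOAA.

from collections import defaultdict

DAT_FILE = "ghcnm.tavg.v4.qcu.dat"  # placeholder path

stations_by_year = defaultdict(set)

with open(DAT_FILE) as f:
    for line in f:
        station_id = line[0:11]
        year = int(line[11:15])
        element = line[15:19]
        if element != "TAVG":
            continue
        # A station "reports" in a year if at least one month is not missing.
        has_data = any(
            line[19 + 8 * m : 24 + 8 * m].strip() != "-9999" for m in range(12)
        )
        if has_data:
            stations_by_year[year].add(station_id)

for year in sorted(stations_by_year):
    print(year, len(stations_by_year[year]))
```

If the archive looks the way the quote above describes it, the sharp drop after 1990 should show up directly in those yearly counts.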

Well, perhaps they have focused on culling a large, poor-quality network down to fewer, higher-quality locations?  If they have, there is little or no record of it.  To outsiders, it looks like stations just keep turning off.  And in fact, by certain metrics, the quality of the network is falling:

The collapse in sample size has increased the relative fraction of data coming from airports to about 50 percent (up from about 30 percent in the 1970s).  It has also reduced the average latitude of source data and disproportionately removed high-altitude monitoring sites.
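The airport share can be estimated the same way.  The sketch below extends the previous one, joining the per-year station sets against the GHCN-M v4 inventory (.inv) file and flagging airports with a crude name heuristic; the keyword test, the assumed position of the name field, and the file name are all my own assumptions rather than anything GHCN defines, so treat the output as illustrative only:

```python
# Rough estimate of the airport share of reporting GHCN-M stations, by year.
# Builds on stations_by_year from the previous sketch.  The inventory file
# name, the assumption that the station name occupies the tail of each .inv
# record, and the keyword test are illustrative guesses -- check them against
# the inventory README before trusting the numbers.

INV_FILE = "ghcnm.tavg.v4.qcu.inv"  # placeholder path


def looks_like_airport(name: str) -> bool:
    # Crude heuristic: airport stations commonly carry names like
    # "PHOENIX SKY HARBOR INTL AP" or "... AIRPORT" in the inventory.
    name = name.strip().upper()
    return "AIRPORT" in name or name.endswith(" AP") or name.endswith(" AFB")


airport_ids = set()
with open(INV_FILE) as f:
    for line in f:
        station_id = line[0:11]
        name = line[38:]  # assumed start of the station-name field
        if looks_like_airport(name):
            airport_ids.add(station_id)

for year in sorted(stations_by_year):
    reporting = stations_by_year[year]
    share = 100.0 * len(reporting & airport_ids) / len(reporting)
    print(f"{year}: {share:.1f}% of reporting stations look like airports")
```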

Airports, located in the middle of urban centers by and large, are terrible temperature measurement points, subject to a variety of biases such as the urban heat island effect.  In an old school project, my son and I measured a difference of more than 10 degrees Fahrenheit between the Phoenix airport and the outlying countryside.  The folks who compile the measurements claim they have corrected for these biases, but many of us have reason to doubt that (consider this example, where an obviously biased station still showed up in the corrected data as the #1 warming site in the country).

I understand why we have spent 30 years correcting screwed-up, biased stations: we need stations with long histories, and these are what we have (though many long-lived stations have been allowed to expire).  But why haven't we been building a new, better-sited network in the meantime?

Ironically, there has been one major investment effort to improve temperature measurement, and that is satellite measurement.  We now use satellites for official measures of cloud cover, sea ice extent, and sea level, but the global warming establishment has largely ignored satellite measurement of temperature.  James Hansen (Al Gore's mentor, and often called the father of global warming), for example, strongly defended 100+ year-old surface measurement technology over satellites.  Oddly, Hansen was for years the head of NASA's Goddard Institute for Space Studies (GISS), so one wonders why he resisted space technology in this one particular area.  Cynics among us would argue that it is because satellites give the "wrong" answer, showing a slower warming rate than the heavily hand-adjusted surface records.