Steve McIntyre Comments on Historical Temperature Adjustments

Steve McIntyre, the statistician who called into question much of the methodology behind Mann's "hockey stick" chart, has some observations on adjustments to US temperature records I discussed here and here.

Eli Rabett and Tamino have both advocated faith-based climate
science in respect to USHCN and GISS adjustments. They say that the
climate "professionals" know what they're doing; yes, there are
problems with siting and many sites do not meet even minimal compliance
standards, but, just as Mann's "professional" software was able to
extract a climate signal from the North American tree ring data, so
Hansen's software is able to "fix" the defects in the surface sites.
"Faith-based" because they do not believe that Hansen has any
obligation to provide anything other than a cursory description of his
software or, for that matter, the software itself. But if they are
working with data that includes known bad data, then critical
examination of the adjustment software becomes integral to the
integrity of the record - as there is obviously little integrity in
much of the raw data.

While acolytes may call these guys "professionals", the process of
data adjustment is really a matter of statistics and even accounting.
In these fields, Hansen and Mann are not "professionals" - Mann
admitted this to the NAS panel explaining that he was "not a
statistician". As someone who has read their works closely, I do not
regard any of these people as "professional". Much of their reluctance
to provide source code for their methodology arises, in my opinion,
because the methods are essentially trivial and they derive a certain
satisfaction out of making things appear more complicated than they
are, a little like the Wizard of Oz. And like the Wizard of Oz, they
are not necessarily bad men, just not very good wizards.

He goes on to investigate a specific example the "professionals" cite
as a success story, demonstrating that they appear to have a Y2K error
in their algorithm. This is difficult to do because, like Mann, the government scientists maintaining a government temperature database, compiled from government sites paid for with taxpayer funds, refuse to release their methodology or algorithms for inspection.

In the case cited, the "professionals" also make adjustments that imply the site has
experienced decreasing urbanization over the last 100 years, something I am not
sure one can say about any site in the US except perhaps a few
Colorado ghost towns.  The "experts" also fail to take the basic step of actually analyzing the site itself, which, if visited, would reveal recently installed air conditioning units venting hot air onto the temperature instrument.

A rebuttal, arguing that poor siting of temperature instruments is OK and does not affect the results, is here.  I find rebuttals of this sort really distressing.  I studied physics for a while before switching to engineering, and really small procedural mistakes in measurement could easily invalidate one's results.  I find it amazing that climate scientists seek to excuse massive mistakes in measurement.  I'm sorry, but in no other branch of science are results considered "settled" when the experimental noise is greater than the signal.  I would really, really, just for once, love to see an anthropogenic global warming promoter say, "Well, I don't think the siting will change the results, but you are right, we really need to go back and take another pass at correcting historical temperatures based on more detailed analysis of the individual sites."

3 Comments

  1. dearieme:

    I used to be a Global Warming sceptic. Now I'm a Global Warming cynic.

  2. Hank:

    Science
    Published: Aug. 2, 2007 at 2:08 PM

    Math used in new climate change assessment
    SEATTLE, Aug. 2 (UPI) -- A team of U.S. scientists has used mathematics to assess the effect of natural solar variation on climate change.

    Charles Camp and Ka Kit Tung of the University of Washington's department of applied mathematics said that to accurately assess effects from human sources on the planet's climate, scientists must first be able to quantify the contribution of natural variation in solar irradiance to temperature changes.

    Camp and Tung said that while the existence of a long-term trend in solar output is controversial, its periodic change within an 11-year cycle has been measured by satellites.

    To assess how that oscillatory forcing affects climate on Earth, Camp and Tung compared the Earth's surface temperature measurements between years of solar maximum and years of solar minimum.

    They determined that times of high solar activity are on average 0.2 degrees Celsius warmer than times of low solar activity, and that there is a polar amplification of the warming.

    That finding is believed to be the first to document a statistically significant, globally coherent temperature response to the solar cycle, the authors note.

    The research is reported in the current issue of the journal Geophysical Research Letters.

  3. dearieme:

    But if the surface temperature measurements are junk, Hank, then no theory/math model is worth anything, since there're no good data to compare it with.