This is Chapter 7 of an ongoing series. Other parts of the series are here:
- Introduction
- Greenhouse Gas Theory
- Feedbacks
- A) Actual Temperature Data; B) Problems with the Surface Temperature Record
- Attribution of Past Warming: A) Arguments for it being Man-Made; B) Natural Attribution
- Climate Models vs. Actual Temperatures
- Are We Already Seeing Climate Change? (this article)
- The Lukewarmer Middle Ground
- A Low-Cost Insurance Policy
Note: This is by far the longest chapter, and could have been 10x longer without a lot of aggressive editing. I have chosen not to break it into two pieces. Sorry for the length. TL;DR: The vast majority of claims of current climate impacts from CO2 are grossly exaggerated or even wholly unsupported by the actual data. The average quality of published studies in this area is very low compared to other parts of climate science.
Having discussed the theory and reality of man-made warming, we move in this chapter to what is often called "climate change" -- is man-made warming already causing adverse changes in the climate?
This is a peculiarly frustrating topic for a number of reasons.
First, everyone who discusses climate change automatically assumes the changes will be for the worse. But are they? The Medieval Warm Period, likely warmer than today, was a period of agricultural plenty and demographic expansion (at least in Europe) -- it was only the end of the warm period that brought catastrophe, in the form of famine and disease. As the world warms, are longer growing seasons in the colder parts of the northern hemisphere really so bad, and why is it that no one ever mentions such positive offsets?
The second frustrating issue is that folks increasingly talk about climate change as if it were a direct result of CO2, e.g. CO2 is somehow directly worsening hurricanes. This is in part just media sloppiness, but it has also been an explicit strategy, re-branding global warming as climate change during the last 20 years when global temperatures were mostly flat. So it is important to make this point: There is absolutely no mechanism that has been suggested by anyone wherein CO2 can cause climate change except through the intermediate step of warming. CO2 causes warming, which then potentially leads to changes in weather. If CO2 is only causing incremental warming, then it likely is only causing incremental changes to other aspects of the climate. (I will note as an aside that man certainly has changed the climate through mechanisms other than CO2, but we will not discuss these. A great example is land use. Al Gore claimed the snows of Kilimanjaro are melting because of global warming, but in fact it is far more likely they are receding due to precipitation changes as a result of deforestation of Kilimanjaro's slopes.)
Finally, and perhaps most frustrating of all, handling claims of various purported man-made changes to the climate has become an endless game of "whack-a-mole". It is almost impossible to keep up with the myriad claims of things that are changing (always for the worse) due to CO2. One reason that has been suggested for this endless proliferation of dire predictions is that if one wants to study the mating habits of the ocelot, one may have trouble getting funding, but funding is available in large quantities if you re-brand your study as the effect of climate change on the mating habits of the ocelot. It is the rare unusual weather event or natural phenomenon (Zika virus!) that is not blamed by someone, somewhere, on man-made climate change.
As a result, this section could be near-infinitely long. To avoid that, and to avoid a quickly tedious series of charts labelled "hurricanes not up", "tornadoes not up", etc., I want to focus more on the systematic errors that lead to the false impression that we are seeing man-made climate changes all around us.
We will start with publication bias, which I would define as mistaking a trend in the reporting of a type of event for a trend in the underlying events themselves. Consider a classic example from outside climate, the "summer of the shark".
The media hysteria began in early July, 2001, when a young boy was bitten by a shark on a beach in Florida. Subsequent attacks received breathless media coverage, up to and including near-nightly footage from TV helicopters of swimming sharks. Until the 9/11 attacks, sharks were the third biggest story of the year as measured by the time dedicated to it on the three major broadcast networks’ news shows.
Through this coverage, Americans were left with a strong impression that something unusual was happening — that an unprecedented number of shark attacks were occurring in that year, and the media dedicated endless coverage to speculation by various “experts” as to the cause of this sharp increase in attacks.
Except there was one problem — there was no sharp increase in attacks. In the year 2001, five people died in 76 shark attacks. However, just a year earlier, 12 people had died in 85 attacks. The data showed that 2001 actually was a down year for shark attacks. The increased media coverage of shark attacks was mistaken for an increase in shark attacks themselves.
Hopefully the parallel with climate reporting is obvious. Whereas a heat wave in Moscow was likely local news only 30 years ago, now it is an international story that is tied, in every broadcast, to climate change. Every single tail-of-the-distribution weather event from around the world is breathlessly reported, leaving the impression among viewers that more such events are occurring, even when there is in fact no such trend. Further, since weather events can drive media ratings, there is an incentive to make them seem scarier:
When I grew up, winter storms were never named. It was just more snow in Buffalo, or wherever. Now, though, we get "Winter Storm Saturn: East Coast Beast." Is the weather really getting scarier, or just the reporting?
The second systematic error is not limited to climate, and is so common I actually have a category on my blog called "trend that is not a trend". There is a certain chutzpah involved in claiming a trend when it actually does not exist in the data, but such claims occur all the time. In climate, a frequent variation on this failure is the claiming of a trend from a single data point -- specifically, a tail-of-the-distribution weather event will be put forward as "proof" that climate is changing, i.e. that there is somehow a trend for the worse in the Earth's climate.
The classic example was probably just after Hurricane Katrina. In a speech in September of 2005 in San Francisco, Al Gore told his Sierra Club audience that not only was Katrina undoubtedly caused by man-made global warming, but that it was the harbinger of a catastrophic onslaught of future such hurricanes. In fact, though, there is no upward trend in hurricane activity. 2005 was a high but not unprecedented year for hurricanes. And Katrina was soon followed by a long and historic lull in North American hurricane activity.
Counting hurricane landfalls is a poor way to look at hurricanes. A better way is to look at the total energy of hurricanes and cyclones globally. And as you can see, the numbers are cyclical (as every long-time hurricane observer could have told Mr. Gore) but without any trend:
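As an aside, the "total energy" measure in the chart above is conventionally reported as accumulated cyclone energy (ACE): the sum of the squares of each storm's maximum sustained winds (in knots), sampled every six hours while the storm is at tropical-storm strength or above, scaled by 10^-4. The sketch below illustrates that bookkeeping with entirely made-up wind histories; it is not the code behind the chart.

```python
# Minimal sketch of the accumulated cyclone energy (ACE) calculation.
# ACE sums the square of a storm's maximum sustained wind (in knots),
# sampled every six hours while the storm is at least tropical-storm
# strength (>= 35 knots), scaled by 1e-4. Wind values below are
# invented purely for illustration.

def storm_ace(six_hourly_winds_kt):
    """Return the ACE contribution of one storm from its 6-hourly max winds."""
    return 1e-4 * sum(v ** 2 for v in six_hourly_winds_kt if v >= 35)

# Hypothetical season: three storms with made-up 6-hourly wind histories
season = [
    [30, 40, 55, 70, 85, 75, 50, 30],   # a moderate hurricane
    [35, 45, 50, 45, 35],               # a short-lived tropical storm
    [40, 65, 95, 120, 110, 80, 45],     # a major hurricane
]

season_total = sum(storm_ace(s) for s in season)
print(f"Season ACE: {season_total:.1f} (units of 10^4 kt^2)")
```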
In fact, the death rates from severe weather have been dropping throughout the last century, at the same time CO2 levels have been rising:
Of course, it is likely that increasing wealth and better technology are responsible for much of this mitigation, rather than changes in underlying weather patterns, but this is still relevant to the debate -- many proposed CO2 abatement plans would have the effect of slowing growth in the developing world, leaving those countries more vulnerable to weather events. I have argued for years that the best way to fight weather deaths is to make the world rich, not to worry about one hurricane more or less.
Droughts are another event where the media quickly finds someone to blame the event on man-made climate change and declare that this one event is proof of a trend. Bill McKibben tweeted about drought and corn yields many times in 2012, for example:
It turns out that based on US government data, the 2012 drought was certainly severe but no worse than several other droughts of the last 50 years (negative numbers represent drought).
There is no upward trend at all (in fact a slightly downward trend that likely is not statistically significant) in dry weather in the US.
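To make "not statistically significant" concrete: the usual check is to fit an ordinary least-squares line to the annual index and look at the p-value on the slope. The sketch below uses synthetic placeholder numbers rather than the actual government drought index, purely to show the test:

```python
# Sketch: test whether an annual drought index shows a statistically
# significant trend. The series below is synthetic noise standing in
# for a real index such as the Palmer Drought Severity Index.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
years = np.arange(1965, 2015)
index = rng.normal(loc=0.0, scale=2.0, size=years.size)  # placeholder data

result = stats.linregress(years, index)
print(f"slope = {result.slope:+.3f} per year, p-value = {result.pvalue:.2f}")

# A conventional reading: if the p-value is above 0.05, the slope is
# not statistically distinguishable from zero -- i.e. "no trend".
if result.pvalue > 0.05:
    print("No statistically significant trend in this series.")
```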
McKibben blamed bad corn yields in 2012 on man-made global warming, and again implied that one year's data point was indicative of a trend.
US corn yields indeed were down in 2012, but still higher than they had been in any year before 1995.
It is worth noting the strong upward trend in corn yields from 1940 to today, at the same time the world has supposedly experienced unprecedented man-made warming. I might also point out the years shown in yellow, when crops were grown prior to the heavy mechanization of farming made possible by the fossil fuel economy. Bill McKibben hates fossil fuels, and believes they should be entirely eliminated. If so, he also must "own" the corn yields in yellow. CO2-driven warming has not inhibited corn yields, but having McKibben return us to a pre-modern economy certainly would.
Anyway, as you might expect, corn yields after 2012 returned right back to trend and continued to hit new records. 2012 did not represent a new trend; it was simply one bad year.
I think most folks would absolutely swear, from media coverage, that the US is seeing more new high temperatures set and an upward trend in heat waves. But it turns out neither is the case.
Obviously, one has to be careful with this analysis. Many temperature stations in the US Historical Climate Network have only been there for 20 or 30 years, so their all-time high at that station for any given day is, by definition, going to be in the last 20 or 30 years. But if one looks only at temperature stations with many years of data, as done above, we can see there has been no particular uptick in high temperature records, and in fact a disproportionate number of our all-time local records were set in the 1930's.
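The filtering step described above -- only counting records at stations with long histories -- is simple to illustrate. The station names, readings, and 80-year cutoff below are all invented for the sketch; the point is just that short-record stations must be excluded before tallying which decade each station's all-time high was set in:

```python
# Sketch: when counting all-time record highs by decade, first drop
# stations whose records are too short, otherwise every reading at a
# young station is trivially an "all-time" record for that station.
# Data below is invented for illustration.
from collections import Counter

# station -> list of (year, annual_max_temp_F); made-up values
stations = {
    "STN_A": [(y, 95 + (y % 7)) for y in range(1900, 2016)],   # long record
    "STN_B": [(y, 98 + (y % 5)) for y in range(1995, 2016)],   # short record
}

MIN_YEARS = 80  # require a long history before a "record" means anything
records_by_decade = Counter()

for name, series in stations.items():
    if len(series) < MIN_YEARS:
        continue  # skip short-record stations like STN_B
    record_year, _ = max(series, key=lambda yr_temp: yr_temp[1])
    records_by_decade[10 * (record_year // 10)] += 1

print(records_by_decade)
```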
While there has been a small uptick in heat waves over the last 10-20 years, it is trivial compared to the heat of the 1930's.
Looking at it a different way, there is no upward trend in 100 degree (Fahrenheit) days...
Or even 110 degree days. Again, the 1930's were hot, long before man-made CO2 could possibly have made them so.
Why, one might ask, don't higher average global temperatures translate into more day-time high temperature records? Well, we actually gave the answer back in Chapter 4A, but as a reminder, much of the warming we have seen has occurred at night, raising the nighttime lows without as much effect on daytime highs. So we are seeing more record-high nighttime Tmin readings than we have in much of the last century, without seeing more record daytime Tmax temperatures:
We could go on all day with examples of claiming a trend from a single data point. Watch for it yourself. But for now let's turn to a third category of error.
We can measure things much more carefully and accurately than we could in the past. This is a good thing, except when we are trying to compare the past to the present. In a previous chapter, we showed a count of sunspots, and databases of sunspot counts go all the way back into the early 18th century. Were telescopes in 1716 able to see all the sunspots we can see in 2016? Or might an upward trend in sunspot counts be biased by our better ability today to detect small ones?
A great example of this comes, again, from Al Gore's movie in which Gore claimed that tornadoes were increasing and man-made global warming was the cause. He was working with this data:
This certainly looks scary. Tornadoes have increased by a factor of 5 or 6! But if you look at the NOAA web site, right under this chart, there is a big warning that says to beware of this data. With Doppler radar and storm chasers and all kinds of other new measurement technologies, we can detect smaller tornadoes that were not counted in the 1950's. The NOAA is careful to explain that this chart is biased by changes in measurement technology. If one looks only at larger tornadoes we were unlikely to miss in the 1950's, there is no upward trend, and in fact there may be a slightly declining trend.
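A common way to remove this detection bias (and the approach implied by looking "only at larger tornadoes") is to filter the database down to tornadoes of EF1 strength or greater, which would have been recorded in the 1950's about as reliably as today. A minimal sketch of that filtering, with an invented record format:

```python
# Sketch: remove the detection-bias problem by counting only tornadoes
# strong enough (EF1+) that they would have been recorded in the 1950s
# about as reliably as today. Records below are invented for illustration.

# (year, EF rating) -- a stand-in for a real tornado database
tornado_records = [
    (1955, 0), (1955, 2), (1956, 1), (1957, 3),
    (2013, 0), (2013, 0), (2013, 1), (2014, 4),
]

def count_by_year(records, min_rating=1):
    counts = {}
    for year, rating in records:
        if rating >= min_rating:  # drop the weak (EF0) tornadoes
            counts[year] = counts.get(year, 0) + 1
    return counts

print(count_by_year(tornado_records))                  # EF1+ only
print(count_by_year(tornado_records, min_rating=0))    # raw counts, biased
```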
That, of course, does not stop nearly every person in the media from blaming global warming whenever there is an above-average tornado year.
Behind nearly every media story about "abnormal" weather or that the climate is somehow "broken" is an explicit assumption that we know what "normal" is. Do we?
We have been keeping systematic weather records for perhaps 150 years, and have really been observing the climate in detail for perhaps 30 years. Many of our best tools are space-based and obviously only have 20-30 years of data at most. Almost no one thinks we have been able to observe climate in depth through many of its natural cycles, so how do we know exactly what is normal? Which year do we point to and say, "that was the normal year, that was the benchmark"?
One good example of this is glaciers. Over the last 30 years, most (but not all) major glaciers around the world have retreated, leading to numerous stories blaming this retreat on man-made warming. But one problem with blaming the retreat of the last 50 years on man is that glaciers were also retreating the 50 years before that, and the 50 years before that:
In fact, glaciers have been retreating around the world since the end of the Little Ice Age (I like to date it to 1812, with visions of Napoleon's army freezing in Russia, but that is of course arbitrary).
A while ago President Obama stood in front of an Alaskan glacier and blamed its retreat on man. But at least one Alaskan glacier in the area has been mapped for centuries, and has been retreating for centuries:
As you can see, measured by distance, most of the retreat actually occurred before 1900. If one wants to blame the modern retreat of these glaciers on man, one is left with the uncomfortable argument that natural forces drove the retreat until about 1950, at which point the natural forces stopped just in time for man-made effects to take over.
Melting ice is often linked to sea level rise, though interestingly net ice melting contributes little to IPCC forecasts of sea level rise, due to expected offsets with ice building in Antarctica -- most forecast sea level rise comes from the thermal expansion of water in the oceans. And of course, the melting Arctic sea ice that makes the news so often contributes nothing to sea level rise (for the same reason a glass of ice water does not overflow when the ice melts).
But the story for rising sea levels is the same as with glacier retreats -- the seas have been rising for much longer than man has been burning fossil fuels in earnest, going back to about the same 1812 start point:
There is some debate about manual corrections added to more recent data (that should sound familiar to those reading this whole series) but recent sea level rise seems to be no more than 3 mm per year. At most, recent warming has added perhaps 1 mm a year to the natural trend, or about 4 inches a century.
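The "4 inches a century" figure is straightforward to check. A back-of-the-envelope sketch, taking the 3 mm per year total and the 1 mm per year warming-attributed portion from the text as given:

```python
# Back-of-the-envelope check of the sea level figures quoted above.
MM_PER_INCH = 25.4

total_rate_mm_per_yr = 3.0    # recent total rise, per the text
attributed_mm_per_yr = 1.0    # portion attributed to recent warming, per the text

century_total_in = total_rate_mm_per_yr * 100 / MM_PER_INCH
century_attributed_in = attributed_mm_per_yr * 100 / MM_PER_INCH

print(f"Total rise over a century:   {century_total_in:.1f} inches")
print(f"Warming-attributed portion:  {century_attributed_in:.1f} inches")
# ~3.9 inches, i.e. "about 4 inches a century"
```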
Our last failure mode is again one I see much more widely than just in climate. Whether the realm is economics or climate or human behavior, the media loves to claim that incredibly complex, multi-variable systems are in fact driven by a single variable, and -- who'd have thunk it -- that single variable happens to fit with their personal pet theory.
With all the vast complexity of the climate, are we really to believe that every unusual weather event is caused by a 0.013 percentage point change (270 ppm to 400 ppm) in the concentration of one atmospheric gas?
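For clarity, the 0.013 figure is just a unit conversion from parts per million to a share of the whole atmosphere:

```python
# The change in atmospheric CO2 concentration, expressed as a share of
# the whole atmosphere rather than in parts per million.
ppm_then, ppm_now = 270, 400

pct_then = ppm_then / 1_000_000 * 100   # 0.027% of the atmosphere
pct_now = ppm_now / 1_000_000 * 100     # 0.040% of the atmosphere

print(f"{pct_now - pct_then:.3f} percentage points")   # 0.013
```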
Let me illustrate this in another way. The NOAA not only publishes a temperature anomaly (which we have mostly been using in all of our charts) but they take a shot at coming up with an average temperature for the US. The following chart uses their data for the monthly average of Tmax (the daily high at all locations), Tmin (the daily low for all locations) and Tavg (generally the average of Tmin and Tmax).
Note that even the average temperatures vary across a range of 40F through the seasons and years. If one includes the daily high and low, the temperatures vary over a range of nearly 70F. And note that this is the average for all the US over a month. If we were to look at the range of daily temperatures across the breadth of locations, we would see numbers that varied from well into the negative numbers to over 110F.
The point of all this is that temperatures vary naturally a lot. Now look at the dotted black line. That is the long-term trend in the average, trending slightly up (since we know that average temperatures have risen over the last century). The slope of that line, around 1F per century for the US, is virtually indistinguishable from flat at this scale. It is tiny, tiny, tiny compared to the natural variation of the averages.
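To put rough numbers on "tiny, tiny, tiny", here is a quick magnitude comparison using the approximate figures quoted above (1F per century against 40F and 70F of routine variation); these are the text's round numbers, not the underlying NOAA series:

```python
# Rough magnitude comparison: century-scale trend vs. natural variation,
# using the approximate figures quoted in the text.
trend_F_per_century = 1.0    # long-term US trend in the average
seasonal_swing_F = 40.0      # range of monthly average temps across the year
daily_extremes_F = 70.0      # range including average daily highs and lows

print(f"Trend over a full century: {trend_F_per_century:.1f} F")
print(f"Trend as share of seasonal swing:  {trend_F_per_century / seasonal_swing_F:.1%}")
print(f"Trend as share of high/low range:  {trend_F_per_century / daily_extremes_F:.1%}")
# ~2.5% and ~1.4% -- the trend is small compared to routine variation
```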
The point of this is not that small increases in the average don't matter, but that it is irrational to blame every tail-of-the-distribution temperature event on man-made warming, since no matter how large we decide that number has been, it is trivial compared to the natural variation we see in temperatures.
OK, I know that was long, but this section was actually pretty aggressively edited even to get it this short. For God's sake, we didn't even mention polar bears (the animals that have already survived through several ice-free inter-glacial periods but will supposedly die if we melt too much ice today). But it's time to start driving towards a conclusion, which we will do in our next chapter.
Chapter 8, summarizing the lukewarmer middle ground, is here.