Posts tagged ‘fuel’

Knowledge and Certainty "Laundering" Via Computer Models

Today I want to come back to a topic I have not covered for a while, which is what I call knowledge or certainty "laundering" via computer models.  I will explain this term more in a moment, but I use it to describe the use of computer models (by scientists and economists but with strong media/government/activist collusion) to magically convert an imperfect understanding of a complex process into apparently certain results and predictions to two-decimal place precision.

The initial impetus to revisit this topic was reading "Chameleons: The Misuse of Theoretical Models in Finance and Economics" by Paul Pfleiderer of Stanford University (which I found referenced in a paper by Anat R. Admati on dangers in the banking system).  I will excerpt this paper in a moment, and though he is talking more generically about theoretical models (whether embodied in code or not), I think a lot of his paper is relevant to this topic.

Before we dig into it, let's look at the other impetus for this post, which was my seeing this chart in the "Southwest" section of the recent Fourth National Climate Assessment.

The labelling of the chart actually understates the heroic feat the authors achieved, as their conclusion requires modeling wildfire both with and without anthropogenic climate change.  This means that first they had to model the counterfactual of what the climate would have been like without the 30ppm (0.003% of the atmosphere) of CO2 added over the period.  Then, they had to model the counterfactual of what the wildfire burn acreage would have been under that counterfactual climate vs. what actually occurred.   All while teasing out the effects of climate change from other variables like forest management and fuel reduction policy (which -- oddly enough -- despite substantial changes in this period apparently goes entirely unmentioned in the underlying study and does not seem to be a variable in their model).  And they do all this for every year back to the mid-1980s.

Don't get me wrong -- this is a perfectly reasonable analysis to attempt, even if I believe they did it poorly and am skeptical you can get good results in any case (and even given the obvious fact that the conclusions are absolutely not testable in any way).  But any critique I might have is a normal part of the scientific process.  I critique, then if folks think it is valid they redo the analysis fixing the critique, and the findings might hold or be changed.  The problem comes further down the food chain:

  1. When the media, and in this case the US government, use this analysis completely uncritically and without any error bars to pretend at a certainty -- in this case that half of the recent wildfire damage is due to climate change -- that simply does not exist.
  2. And when anything that supports the general theory that man-made climate change is catastrophic immediately becomes -- without challenge or further analysis -- part of the "consensus" and therefore immune from criticism.

I like to compare climate models to economic models, because economics is the one other major field of study where I think the underlying system is nearly as complex as the climate.  Readers know I accept that man is causing some warming via CO2 -- I am a lukewarmer who has proposed a carbon tax.  However, as an engineer whose undergraduate work focused on the dynamics of complex systems, I go nuts with anti-scientific statements like "CO2 is the control knob for the Earth's climate."  It is simply absurd to say that an entire complex system like climate is controlled by a single variable, particularly one that is 0.04% of the atmosphere.  If a sugar farmer looking for a higher tariff told you that sugar production was the single control knob for the US economy, you would call BS on them in a second (sugar being just 0.015% by dollars of a tremendously complex economy).

But in fact, economists play at these same sorts of counterfactuals.  I wrote about economic analysis of the effects of the stimulus way back in 2010.  It is very similar to the wildfire analysis above in that it posits a counter-factual and then asserts the difference between the modeled counterfactual and reality is due to one variable.

Last week the Council of Economic Advisors (CEA) released its congressionally commissioned study on the effects of the 2009 stimulus. The panel concluded that the stimulus had created as many as 3.6 million jobs, an odd result given the economy as a whole actually lost something like 1.5 million jobs in the same period. To reach its conclusions, the panel ran a series of complex macroeconomic models to estimate economic growth assuming the stimulus had not been passed. Their results showed employment falling by over 5 million jobs in this hypothetical scenario, an eyebrow-raising result that is impossible to verify with actual observations.

Most of us are familiar with using computer models to predict the future, but this use of complex models to write history is relatively new. Researchers have begun to use computer models for this sort of retrospective analysis because they struggle to isolate the effect of a single variable (like stimulus spending) in their observational data. Unless we are willing to, say, give stimulus to South Dakota but not North Dakota, controlled experiments are difficult in the macro-economic realm.

But the efficacy of conducting experiments within computer models, rather than with real-world observation, is open to debate. After all, anyone can mine data and tweak coefficients to create a model that accurately depicts history. One is reminded of algorithms based on skirt lengths that correlated with stock market performance, or on Washington Redskins victories that predicted past presidential election results.

But the real test of such models is to accurately predict future events, and the same complex economic models that are being used to demonstrate the supposed potency of the stimulus program perform miserably on this critical test. We only have to remember that the Obama administration originally used these same models barely a year ago to predict that unemployment would remain under 8% with the stimulus, when in reality it peaked over 10%. As it turns out, the experts' hugely imperfect understanding of our complex economy is not improved merely by coding it into a computer model. Garbage in, garbage out.
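The skirt-length point can be made concrete with a toy simulation.  The sketch below (all numbers invented purely for illustration) fits a "model" with many free coefficients to pure noise: it reproduces "history" essentially perfectly yet has no predictive power at all -- which is exactly why in-sample fit proves nothing.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration: with enough free coefficients, pure noise
# "explains" history perfectly, yet predicts nothing out of sample.
n_train, n_test, n_features = 30, 30, 29

X = rng.normal(size=(n_train + n_test, n_features))   # e.g. skirt lengths
y = rng.normal(size=n_train + n_test)                 # e.g. stock returns

def r2(y_true, y_pred):
    """Coefficient of determination (fraction of variance explained)."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1 - ss_res / ss_tot

# Fit least squares (with intercept) on the "historical" period only
A = np.column_stack([np.ones(n_train), X[:n_train]])
coef, *_ = np.linalg.lstsq(A, y[:n_train], rcond=None)

fit_in = A @ coef
A_test = np.column_stack([np.ones(n_test), X[n_train:]])
fit_out = A_test @ coef

print(f"in-sample R^2:     {r2(y[:n_train], fit_in):.3f}")   # essentially 1.0
print(f"out-of-sample R^2: {r2(y[n_train:], fit_out):.3f}")  # far worse
```

With 30 coefficients fit to 30 data points, the model interpolates history exactly; the out-of-sample fit collapses because there was never any real relationship to find.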

Thus we get to my concept I call knowledge laundering or certainty laundering.  I described what I mean by this back in the blogging dinosaur days (note this is from 2007 so my thoughts on climate have likely evolved since then).

Remember what I said earlier: The models produce the result that there will be a lot of anthropogenic global warming in the future because they are programmed to reach this result. In the media, the models are used as a sort of scientific money laundering scheme. In money laundering, cash from illegal origins (such as smuggling narcotics) is fed into a business that then repays the money back to the criminal as a salary or consulting fee or some other type of seemingly legitimate transaction. The money he gets back is exactly the same money, but instead of just appearing out of nowhere, it now has a paper-trail and appears more legitimate. The money has been laundered.

In the same way, assumptions of dubious quality or certainty that presuppose AGW beyond the bounds of anything we have seen historically are plugged into the models, and, shazam, the models say that there will be a lot of anthropogenic global warming. These dubious assumptions, which are pulled out of thin air, are laundered by being passed through these complex black boxes we call climate models, and suddenly the results are somehow scientific proof of AGW. The quality hasn't changed, but the paper trail looks better, at least in the press. The assumptions begin as guesses of dubious quality and come out laundered as "settled science."

Back in 2011, I highlighted a climate study that virtually admitted to this laundering via model by saying:

These questions cannot be answered using observations alone, as the available time series are too short and the data not accurate enough. We therefore used climate model output generated in the ESSENCE project, a collaboration of KNMI and Utrecht University that generated 17 simulations of the climate with the ECHAM5/MPI-OM model to sample the natural variability of the climate system. When compared to the available observations, the model describes the ocean temperature rise and variability well.

I wrote in response:

[Note the first and last sentences of this paragraph]  First, that there is not sufficiently extensive and accurate observational data to test a hypothesis. BUT, then we will create a model, and this model is validated against this same observational data. Then the model is used to draw all kinds of conclusions about the problem being studied.

This is the clearest, simplest example of certainty laundering I have ever seen. If there is not sufficient data to draw conclusions about how a system operates, then how can there be enough data to validate a computer model which, in code, just embodies a series of hypotheses about how a system operates?

A model is no different than a hypothesis embodied in code. If I have a hypothesis that the average width of neckties in this year’s Armani collection drives stock market prices, creating a computer program that predicts stock market prices falling as ties get thinner does nothing to increase my certainty of this hypothesis (though it may be enough to get me media attention). The model is merely a software implementation of my original hypothesis. In fact, the model likely has to embody even more unproven assumptions than my hypothesis, because in addition to assuming a causal relationship, it also has to be programmed with specific values for this correlation.

This brings me to the paper by Paul Pfleiderer of Stanford University.  I don't want to overstate the congruence between his paper and my thoughts on this, but it is the first work I have seen to discuss this kind of certainty laundering (there may be a ton of literature on this but if so I am not familiar with it).  His abstract begins:

In this essay I discuss how theoretical models in finance and economics are used in ways that make them “chameleons” and how chameleons devalue the intellectual currency and muddy policy debates. A model becomes a chameleon when it is built on assumptions with dubious connections to the real world but nevertheless has conclusions that are uncritically (or not critically enough) applied to understanding our economy.

The paper is long and nuanced but let me try to summarize his thinking:

My reason for introducing the notion of theoretical cherry picking is to emphasize that since a given result can almost always be supported by a theoretical model, the existence of a theoretical model that leads to a given result in and of itself tells us nothing definitive about the real world. Though this is obvious when stated baldly like this, in practice various claims are often given credence — certainly more than they deserve — simply because there are theoretical models in the literature that “back up” these claims. In other words, the results of theoretical models are given an ontological status they do not deserve. In my view this occurs because models and specifically their assumptions are not always subjected to the critical evaluation necessary to see whether and how they apply to the real world...

As discussed above one can develop theoretical models supporting all kinds of results, but many of these models will be based on dubious assumptions. This means that when we take a bookshelf model off of the bookshelf and consider applying it to the real world, we need to pass it through a filter, asking straightforward questions about the reasonableness of the assumptions and whether the model ignores or fails to capture forces that we know or have good reason to believe are important.

I know we see a lot of this in climate:

A chameleon model asserts that it has implications for policy, but when challenged about the reasonableness of its assumptions and its connection with the real world, it changes its color and retreats to being simply a theoretical (bookshelf) model that has diplomatic immunity when it comes to questioning its assumptions....

Chameleons arise and are often nurtured by the following dynamic. First a bookshelf model is constructed that involves terms and elements that seem to have some relation to the real world and assumptions that are not so unrealistic that they would be dismissed out of hand. The intention of the author, let’s call him or her “Q,” in developing the model may be to say something about the real world or the goal may simply be to explore the implications of making a certain set of assumptions. Once Q’s model and results become known, references are made to it, with statements such as “Q shows that X.” This should be taken as short-hand way of saying “Q shows that under a certain set of assumptions it follows (deductively) that X,” but some people start taking X as a plausible statement about the real world. If someone skeptical about X challenges the assumptions made by Q, some will say that a model shouldn’t be judged by the realism of its assumptions, since all models have assumptions that are unrealistic. Another rejoinder made by those supporting X as something plausibly applying to the real world might be that the truth or falsity of X is an empirical matter and until the appropriate empirical tests or analyses have been conducted and have rejected X, X must be taken seriously. In other words, X is innocent until proven guilty. Now these statements may not be made in quite the stark manner that I have made them here, but the underlying notion still prevails that because there is a model for X, because questioning the assumptions behind X is not appropriate, and because the testable implications of the model supporting X have not been empirically rejected, we must take X seriously. Q’s model (with X as a result) becomes a chameleon that avoids the real world filters.

Check it out if you are interested.  I seldom trust a computer model I did not build and I NEVER trust a model I did build (because I know the flaws and assumptions and plug variables all too well).

By the way, the mention of plug variables reminds me of one of the most interesting studies I have seen on climate modeling, by Kiehl in 2007.  It was so damning that I haven't seen anyone do it since (at least get published doing it).  I wrote about it in 2011 at Forbes:

My skepticism was increased when several skeptics pointed out a problem that should have been obvious. The ten or twelve IPCC climate models all had very different climate sensitivities -- how, if they have different climate sensitivities, do they all nearly exactly model past temperatures? If each embodies a correct model of the climate, and each has a different climate sensitivity, only one (at most) should replicate observed data. But they all do. It is like someone saying she has ten clocks all showing a different time but asserting that all are correct (or worse, as the IPCC does, claiming that the average must be the right time).

The answer to this paradox came in a 2007 study by climate modeler Jeffrey Kiehl. To understand his findings, we need to understand a bit of background on aerosols. Aerosols are man-made pollutants, mainly combustion products, that are thought to have the effect of cooling the Earth's climate.

What Kiehl demonstrated was that these aerosols are likely the answer to my old question about how models with high sensitivities are able to accurately model historic temperatures. When simulating history, scientists add aerosols to their high-sensitivity models in sufficient quantities to cool them to match historic temperatures. Then, since such aerosols are much easier to eliminate as combustion products than is CO2, they assume these aerosols go away in the future, allowing their models to produce enormous amounts of future warming.

Specifically, when he looked at the climate models used by the IPCC, Kiehl found they all used very different assumptions for aerosol cooling and, most significantly, he found that each of these varying assumptions was exactly what was required to combine with that model's unique sensitivity assumptions to reproduce historical temperatures. In my terminology, aerosol cooling was the plug variable.
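Kiehl's finding can be illustrated with a toy equilibrium energy-balance calculation.  The sketch below uses round, assumed numbers -- roughly 0.8 C of observed warming, 3.7 W/m^2 of forcing per CO2 doubling, and 2.6 W/m^2 of greenhouse-gas forcing to date -- purely to show the mechanism: the higher a model's sensitivity, the more aerosol cooling it must assume to still match history.

```python
# Toy energy-balance illustration of the plug-variable mechanism.
# All inputs are round illustrative assumptions, not values taken
# from any actual climate model.
DT_OBS = 0.8   # assumed observed warming, C
F_2X   = 3.7   # assumed forcing per CO2 doubling, W/m^2
F_GHG  = 2.6   # assumed greenhouse-gas forcing to date, W/m^2

def required_aerosol_forcing(sensitivity_c_per_doubling):
    """Aerosol forcing a model of the given sensitivity must assume
    to reproduce the observed warming (equilibrium toy model)."""
    lam = sensitivity_c_per_doubling / F_2X    # C per W/m^2
    f_net = DT_OBS / lam                       # net forcing implied by history
    return f_net - F_GHG                       # the "plug"

for s in (1.5, 3.0, 4.5):
    print(f"sensitivity {s} C -> aerosol plug {required_aerosol_forcing(s):+.2f} W/m^2")
```

In this toy version, every sensitivity can be made to "match history" -- each just needs its own bespoke amount of aerosol cooling, which is exactly the pattern Kiehl reported.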

When I was active doing computer models for markets and economics, we used the term "plug variable."  Now, I think "goal-seeking" is the hip word, but it is all the same phenomenon.

Postscript, An example with the partisans reversed:  It strikes me that in our tribalized political culture my having criticized models by a) climate alarmists and b) the Obama Administration might cause the point to be lost on the more defensive members of the Left side of the political spectrum.  So let's discuss a hypothetical with the parties reversed.  Let's say that a group of economists working for the Trump Administration came out and said that half of the 4% economic growth we were experiencing (or whatever the exact number was) was due to actions taken by the Trump Administration and the Republican Congress.  I can assure you they would have a sophisticated computer model that would spit out this result -- there would be a counterfactual model of "with Hillary" that had 2% growth compared to the actual 4% under Trump.

Would you believe this?  After all, it's science.  There is a model.  Made by experts ("top men" as they say in Raiders of the Lost Ark).  So would you buy it?  NO!  I sure would not.  No way.  For the same reasons that we shouldn't uncritically buy into any of the other model results discussed -- they are building counterfactuals of a complex process we do not fully understand and which cannot be tested or verified in any way.  Just because someone has embodied their imperfect understanding, or worse their pre-existing pet answer, into code does not make it science.  But I guarantee you have nodded your head or even quoted the results from models that likely were not a bit better than the imaginary Trump model above.

Looking At Causes of Recent Wildfires and Resultant Property Damage, It's Hard To Point The Finger Solely or Even Mostly at CO2

Today I want to talk a bit about trends in wildfires in the US.  And as regular readers know, I have a real pet peeve about declaring a trend without actual, you know, trend data.  The media may be willing to jump from "most devastating fire in California history" to a trend just based on this one data point, but I am not going to play.

It turns out, though, that we don't have to play the one data point extrapolation game because there actually does seem to be a trend in wildfire.  Here is the national chart:

You might be saying:  Hey Coyote, you are cherry picking -- I have seen this same data but with a huge hump in the early part of the century.  Here is the chart you saw:

(source for both)  The problem with this chart is a huge definitional change in the data between the first and second half of the century.  In short, the early half of the century included controlled burns and other purposeful manmade actions (mostly in the southeast) and the latter half does not.  I described this here -- skeptics who use this chart are in serious danger of committing the same sloppy data errors we accuse warmists of (confession:  I made this mistake for a number of years).

To complete our proof there is indeed a trend in wildfire and not just in wildfire news stories, here is the chart for California, though I cannot vet the source.  I will say it's not a slam-dunk trend, but I will take it on faith, at least for now, that the recent years would be high and make the trend more obvious.

OK, so there seems to be a wildfire trend in the West.  I will focus on California because that has been the area in the news.  Let's consider 4 possible causes:

  1. Temperature.  The state of California has seen a 0.02C per decade, or 0.2C per century increase in temperatures.  This is a very tiny increase and well below the increase thought to have occurred in other parts of the world.  The rise has been faster over the last 10 years or so but it is unclear if this is a long-term trend or a near-term weather effect (e.g. tied to the PDO)
  2. Precipitation.   Total precipitation has decreased ever so slightly over the last 100 years.  A half inch per century is about a 2% reduction
  3. Forest management.  The amount of wood harvested, and thus fuel removed, from forests has dropped by 80% since the 1950s
  4. Urbanization.  This does not necessarily increase fire acreage but it does substantially increase the probability a given fire will impinge on man-made structures.  Also, given the enormous almost exponential increase in total CA real estate value, the likely cost of fires of the same size and intensity has risen dramatically.  Much of the developed area affected by fires the last several years has been in the red and purple parts of the map that were developed most recently.  Fifty years ago they would have just burned trees (source).  More CA urbanization trends here.

So, what is causing the large fires?  Well, probably a lot of things.  I am a big believer that changes in outputs from complex systems can have complex causes (which is why I think the whole meme that "CO2 is the Earth's thermostat" is an embarrassing joke).  But given that over the last 50 years temperatures have risen by a fraction of a degree, precipitation has dropped by a fraction of an inch, but fuel removal has dropped by 80% and urbanization has skyrocketed, it is really hard for me to pin all or even most of the blame on manmade CO2.

Postscript:  One other point -- California is less than 0.1% of the total land area of the Earth.  I have a hard time extrapolating global climate trends from a few big events in 1/1000th of the world.

Postscript #2:  I missed this, that hotbed of climate denial called Mother Jones had an article a year ago blaming California fires on forest management policy, specifically preventing lots of little fires leading to one big fire.

A Trip Down Blog Memory Lane: 1800 Words on Why Steel Can Fail Without Melting

I was randomly browsing my blog history when I encountered a post from over 11 years ago when it was necessary to spend 1800+ words explaining why steel could still fail in the Twin Towers even when it did not actually melt.

Of late, Rosie [O'Donnell] has joined the "truthers," using her show to flog the notion that the WTC was brought down in a government-planned controlled demolition....

Rosie, as others have, made a point of observing that jet fuel does not burn hot enough to melt steel, and therefore the fire in the main towers could not have caused the structure to yield and collapse.  This is absurd.  It is a kindergartener's level of science.  It is ignorant of a reality that anyone who has had even one course in structural engineering or metallurgy will understand.  The argument made that "other buildings have burned and not collapsed" is only marginally more sophisticated, sort of equivalent to saying that seeing an iceberg melt proves global warming.  ...

Here is the reality that most 19-year-old engineering students understand:  Steel loses its strength rapidly with temperature, losing nearly all of its structural strength by 1000 degrees F, well below its melting point but also well below the temperature of burning jet fuel.

And on and on from there.  Seriously, I know it's hard to believe this was even necessary, but it was a serious charge by some of our intellectual betters in the entertainment industry.  Actually, it brings me a certain comfort in encountering this again -- maybe our public discourse is not really getting substantially stupider.  Maybe it has always been that way.
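For the curious, the underlying numbers are published.  The sketch below uses the effective yield-strength reduction factors for carbon steel from Eurocode 3 (EN 1993-1-2, Table 3.1), with the linear interpolation that standard prescribes; the tabulated values are somewhat less dramatic than the round figures in the quoted post, but they make the same point: well below steel's roughly 1500 C melting point, most of its load-bearing strength is gone.

```python
# Effective yield-strength reduction factors for structural carbon
# steel at elevated temperature, from Eurocode 3 (EN 1993-1-2,
# Table 3.1), with linear interpolation between tabulated points.
POINTS_C = [20, 100, 200, 300, 400, 500, 600, 700, 800, 900, 1000, 1100, 1200]
K_Y      = [1.0, 1.0, 1.0, 1.0, 1.0, 0.78, 0.47, 0.23, 0.11, 0.06, 0.04, 0.02, 0.0]

def yield_fraction(temp_c):
    """Fraction of room-temperature yield strength retained at temp_c."""
    if temp_c <= POINTS_C[0]:
        return K_Y[0]
    if temp_c >= POINTS_C[-1]:
        return K_Y[-1]
    for t0, k0, t1, k1 in zip(POINTS_C, K_Y, POINTS_C[1:], K_Y[1:]):
        if t0 <= temp_c <= t1:
            return k0 + (k1 - k0) * (temp_c - t0) / (t1 - t0)

def c_to_f(temp_c):
    return temp_c * 9 / 5 + 32

for c in (400, 600, 800):
    print(f"{c} C ({c_to_f(c):.0f} F): {yield_fraction(c):.0%} of yield strength")
```

At 600 C (about 1100 F, well within the range of a large hydrocarbon fire) steel retains under half its yield strength, and by 800 C barely a tenth -- no melting required.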

Look, I am not mocking you if you don't know the material properties of steel and how they change with temperature.   Odds are, in your jobs, you do not need to know anything about it.  What bothers me are the people who know nothing about these topics but speak with such certainty.  In some ways it seems to go past Dunning-Kruger: people making these absolute pronouncements not only don't know anything about the topic, but many have actively avoided ever finding themselves in a classroom where the topic (or more accurately the mathematical and scientific foundations of the topic) might have been discussed.

It's not like I am totally immune to this.   Here are a few topics that I may have blogged about a few times years and years ago but now I won't touch because I know I don't understand them:

  • Central banking and monetary policy
  • Almost anything having to do with chemistry, including ocean acidification (or more accurately, reduced ocean alkalinity).  I even had an A in Organic Chemistry but it did not stick at all.
  • Literary criticism, except to say what I liked and I didn't like
  • Anything about certain performance-based crafts, like singing and acting, except to say which performances I did and did not enjoy
  • Ice hockey, horse racing, and soccer (which doesn't mean I don't enjoy watching them)
  • 80% of what Tyler Cowen writes about
  • Anything about music post-1985
  • Anything on cooking or food
  • Absolutely anything on wine

To the last point, I got invited to a wine tasting the other day.  Everyone was saying they tasted chicory or boysenberry or a hint of diatomaceous earth or whatever, and I tasted ... wine.  Honestly I felt like a blind person sitting in on a discussion of the color wheel.  But I resist the temptation to scream that it is all just the emperor's new clothes -- I am sure the people around me can honestly taste differences that I can't.  I know I can taste differences in bourbon they cannot taste.  Good vodkas, on the other hand, are a different matter.  Some day I am going to do a blind vodka tasting for my vodka-snob friends and see if they really can taste the difference.

Postscript:  I used to love the show Connections by James Burke.  He would start with something like the Defenestration of Prague and show a series of connections between it and, say, the invention of the telephone.  Perhaps you can see why I found it entertaining since I began a post about the structural strength of steel at different temperatures and ended it with whether good vodkas really taste different.

There are a lot of James Burke TV episodes on YouTube and I recommend them all.  Connections is recommended of course, but I actually think his best series was season 1 of The Day the Universe Changed.  I believe this is episode 1.

A Reminder: Why the US Rail System Is At Least as Good As the European System if You Care About Energy Use

In an article about the French railroad SNCF, Randal O'Toole makes a point I have screamed to the world for years:

Meanwhile, French trains carry less than 11 percent of freight, as more than 86 percent of freight is transported on highways. Those numbers are in sharp contrast to the U.S., where at least a third of freight goes by rail and less than 40 percent goes by truck (and I suspect a bad model has erroneously exaggerated the role of trucks).

American railroads are a model of capitalism, one of the least-subsidized forms of transportation in the world. They are profitable and do far more for the national economy than Europe’s socialized railroads, which mainly serve narrow elites.

Most of the intellectual elites and nearly all the global warming alarmists deride the US for not having the supposedly superior rail system that France and Germany have.  They are blinded by the vision of admittedly beautiful high speed trains, and have frittered away billions of dollars trying to pursue various high speed rail visions in the US.

I know that the supposedly pro-science global warming alarmists sometimes are not actually very focused on science, but this is pretty simple to think about.

First, consider the last time you were on a passenger train.  Add up the weight of all the folks in your car.  Do you think they weighed more or less than the car itself?  Unless you were packed into a subway train with Japanese sumo wrestlers, the answer is that the weight of the car dwarfs that of the passengers it is carrying.    The average Amtrak passenger car apparently weighs about 65 tons (my guess is a high speed rail car weighs more).  The capacity of a coach is 70-80 passengers, which at an average adult weight of 140 pounds yields a maximum passenger weight per car of 5.6 tons.  This means that just 8% of the fuel in a passenger train is being used to move people -- the rest goes into moving the train itself.

Now consider a freight train.  The typical car weighs 25-30 tons empty and can carry between 70 and 120 tons of cargo.  This means that 70-80% of the fuel in a freight train is being used to move the cargo.
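The arithmetic in the two paragraphs above can be checked in a few lines, using the post's own rough figures (which are estimates, not measured data):

```python
# Back-of-the-envelope payload fractions from the figures in the text:
# a 65-ton coach carrying 70-80 passengers at 140 lb each, and a
# 25-30 ton freight car carrying 70-120 tons of cargo.
TON_LB = 2000

def payload_fraction(payload_tons, tare_tons):
    """Share of total moving weight that is actual payload."""
    return payload_tons / (payload_tons + tare_tons)

# Passenger coach: 80 people x 140 lb = 5.6 tons of payload
passengers = 80 * 140 / TON_LB
print(f"passenger coach: {payload_fraction(passengers, 65):.0%}")  # 8%

# Freight car, light and heavy cases
print(f"freight (light): {payload_fraction(70, 25):.0%}")   # 74%
print(f"freight (heavy): {payload_fraction(120, 30):.0%}")  # 80%
```

Even under these generous assumptions (a full coach, light passengers), the passenger train spends roughly nine-tenths of its effort hauling itself, while the freight train spends most of its effort hauling cargo.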

Now you have to take me on faith on one statement -- it is really hard, in fact close to impossible, to optimize a rail system for both passengers and freight.  In the extreme of high speed rail, passenger trains require separate dedicated tracks.  Most rail systems, even when they serve both sorts of traffic, generally prioritize one or the other.  So, if you wanted to save energy and had to pick, which would you choose -- focusing on freight or focusing on passengers?  Oh and by the way, if you want to make it more personal, throw in a consideration of which you would rather have next to you on crowded roads, another car or another freight truck?

This is why the supposedly-green folks' denigrating of US rail is so crazy to me.  The US rail system makes at least as much sense as the European system, even before you consider that it was mostly privately funded and runs without the subsidies that are necessary to keep European rail running.  Yes, as an American tourist travelling in Europe, the European rail system is great.  Agreed.  I use it every time I go there.  I have to assume that this elite tourist experience must be part of why folks ignore the basic science here.

My original article on all this years ago was in Forbes here.

Postscript #1:  One could argue that what matters is not the weight ratios of freight vs. passenger rail but how those compare to the road alternatives.  I would have to think this through, but it gets way more complicated because you have to start worrying about average occupancy and such since that also differs.  At full capacity of, say, 4 people, the typical 4,000-pound car (US; the rest of the world is less) would have a passenger weight of around 12% of the total, higher than for the passenger train.   But average occupancies could change the comparison and I don't have the time to work it through.  But for a full analysis we would have to take a lot of other things into account.  For example, trains are a poor fit with customer travel time preferences for longer US distances, even for higher speed options.  In the same way freight pencils out worse for rail in Europe because the last mile transport problems become a bigger percentage in a shorter haul.  I am confident though that for the US, the freight-dominant system is the right solution and it amazes me how hard it is to get anyone to recognize this.

Postscript #2:  Thinking about the SNCF, I actually did a consulting project there 20+ years ago.  I remember two things.  First, they had 25% more freight car repair people than they had freight cars.  This led me to make the tongue-in-cheek suggestion that they could give every one of these folks their own tool bag, assign them their own car to ride around on, and still cut a fifth of their staff.  I have never, ever, ever seen bloated staffing like I did at SNCF.  My other memory was lunches with executives that took place in palatial dining rooms with waiters in white gloves.  We ate for like 3 hours and drank a case of wine, and all I could think about doing after lunch was taking a nap.

Postscript #3:  This is really going to be a random aside, but if you want to bring science to the table, monorails are the dumbest things ever.  The whole advantage of rail is the friction reduction of a metal flanged wheel rolling on a metal rail.  Most monorails (and people movers) are just tires on a concrete beam (e.g. this is how the Disney monorails work).  This is no more efficient than a bus, and actually less so, because the train jacks up the vehicle-to-passenger weight ratio over a bus.  Because of certain geometry issues, monorails also have limited capacity.  Disney has been struggling with this for years at the Magic Kingdom in Florida, and their ferry boats seem to move a lot more passengers than the adjacent monorails.  Monorails do look awesome, though, and their tracks are airier and more attractive than traditional elevated rail tracks.

The Electric Vehicle Mileage Fraud, Updated: Tesla Model 3 Energy Costs Higher than A Prius, Despite Crazy-High eMPG Rating

Nearly 8 years ago (can it be so long?) I wrote a series of articles about what I called the electric vehicle mileage fraud at the EPA.  Rather than adopt sensible rules for giving electric vehicles an equivalent mpg rating, the agency used a horrible, unscientific methodology that inflated the metric by a factor of three (in part by ignoring the second law of thermodynamics).  All the details are still online here.  I am not omniscient, so I don't know people's true motivations, but one is suspicious that the Obama administration wanted to promote electric vehicles and put its thumb on the scale of this metric (especially since the DOE in the Clinton administration had already crafted a much better methodology).  To be fair, smart people screw this up all the time -- even Eric Schmidt screwed it up.

Take for example the Tesla Model 3, which has been awarded an eye-popping eMPG of between 120 and 131.  Multiplying these figures by .365 (as described in my linked article) gets us the true comparative figure of 44 to 48.  This means that in terms of total energy consumption in the system, the Tesla is likely better than most gasoline-powered vehicles sold but less energy efficient than the top hybrids (the Prius is listed at 53-58 mpg).  At the end of the day, electric cars feel cheaper to fuel in part because they are efficient, but perhaps more because there is no little dial with rotating dollar numbers on the electric cable one attaches to charge them (also, there are still places where one can skim electricity for charging without paying).
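As a quick sketch of the conversion (the 0.365 factor is the one derived in my linked article, which accounts for power plant conversion and transmission losses):

```python
# Convert the EPA's eMPG rating into a figure comparable with a
# gasoline car's MPG, using the 0.365 factor from the linked article.
def true_mpg(epa_empg, factor=0.365):
    return epa_empg * factor

for empg in (120, 131):
    print(f"EPA eMPG {empg} -> ~{true_mpg(empg):.0f} comparable MPG")
# 120 -> ~44, 131 -> ~48
```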

Basically, I have been a voice in the wilderness on this, but I just saw this note on the Tesla Model 3 and its operating costs from Anton Wahlman, writing at Seeking Alpha:

there are attractive and spacious hatchbacks yielding at least 55 MPG for under $25,000, without taxpayer funding needed. Just to be conservative and give the opposite side of the argument the benefit of the doubt, I’ll refer to these as 50 MPG cars, even though they perform a little better. Rounding down is sufficient for this exercise, as you will see below....

To find out [the price to charge a Tesla], you can go to Tesla’s Supercharger price list, which is available online: Supercharging.

As you can see in the table above, the average is close to the $0.24 per kWh mark. So how far does that $0.24 take you?

The Tesla Model 3 is rated at 26 kWh per 100 miles according to the U.S. Department of Energy: 2018 Tesla Model 3 Long Range.

In other words, almost four miles per kWh. It’s close enough that we can round it up to four miles, just to give Tesla some margin in its favor. That squares with the general rule of thumb in the EV world: A smaller energy-efficient EV will yield around 4 miles per kWh, whereas a larger EV will yield around 3 miles per kWh.

That means that at $0.24 per kWh, the Tesla Model 3 costs $0.06 per mile to drive.

How does that compare to the gasoline cars? At 50 MPG and today’s nationwide average gasoline price of $2.65, that’s $0.05 per mile. In other words, it’s cheaper to drive the gasoline car than the Tesla Model 3.

This result, that the Tesla is slightly more expensive to fuel than the top hybrids, is exactly what we would expect IF the EPA used the correct methodology for its eMPG.  However, if you depended on the EPA's current eMPG ratings, it would come as an enormous shock to you.
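Wahlman's arithmetic is easy to verify:

```python
# Reproducing Wahlman's cost-per-mile comparison.
# Tesla Model 3: DOE rating of 26 kWh per 100 miles, Supercharger
# price taken as $0.24/kWh; gasoline car: 50 MPG at $2.65/gallon.
tesla_cost_per_mile = 0.24 * 26 / 100   # dollars per mile
gas_cost_per_mile = 2.65 / 50           # dollars per mile

print(f"Tesla Model 3: ${tesla_cost_per_mile:.3f}/mile")  # $0.062
print(f"50 MPG car:    ${gas_cost_per_mile:.3f}/mile")    # $0.053
```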

Electric vehicles have other issues, the main one being limited range combined with long refueling times.  But there are some reasons to make the switch even if they are not more efficient.

  1. They are really fun to drive.  Quiet and incredibly zippy.
  2. From a macro perspective, they are the easiest approach to shifting transportation to other fuels.  It may be easier to deploy natural gas to cars via electricity, and EV's are certainly the only way to deploy wind or solar to transportation.


Elon Musk Made the Kessel Run in Less Than Twelve Parsecs

I had to laugh at the stories the other day on the battery backup system Elon Musk and Tesla made for the Australian Power grid:

Tesla has completed its 100 megawatt Powerpack battery backup system in South Australia within 100 days (easily), as Elon Musk had promised. That means the company essentially won the "bet," and won't be on the hook for the entire cost of the project, estimated at $50 million. More importantly, it means that some 30,000 homes in South Australia will have a power backup in case there's no breeze at the Hornsdale Wind Farm located about two hours from Adelaide.

A megawatt is a measure of energy production or transmission rate.  As such, it is a perfectly appropriate way to size the capacity of a power plant that is assumed to have a continuous supply of fuel.  However, it is an extremely odd way to size a battery.  A battery has a fixed energy storage capacity, which is generally measured in watt-hours (or some conversion thereof).  For example, a 10 Wh battery could provide 10 watts for an hour before running out, or 5 watts for 2 hours, etc.  It is not clear if this is just a typo and they really mean 100 MWh, or if 100 megawatts is the peak discharge rate and they are being silent on exactly how long it can be sustained (ie how long can those 30,000 homes be powered?).  I checked the first 10 sources in a Google search, and not a single media outlet that routinely chastises climate skeptics for being anti-science seems to have questioned the oddball and nearly meaningless 100 MW figure.
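To illustrate why the units matter, here is a sketch assuming the reporters actually meant 100 MWh of storage and an average household draw of about 1.2 kW.  Both numbers are my assumptions, not figures from the article:

```python
# Watts measure rate; watt-hours measure stored energy.  ASSUMING
# the reporters meant 100 MWh (the real figure may differ) and an
# average household draw of 1.2 kW (also my assumption):
storage_mwh = 100
homes = 30_000
avg_home_kw = 1.2

total_draw_mw = homes * avg_home_kw / 1000   # 36 MW aggregate demand
hours = storage_mwh / total_draw_mw
print(f"Backup duration: ~{hours:.1f} hours")  # ~2.8 hours
```

That duration figure is exactly the kind of number the press accounts never supply.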

I was going to compare the numbers on energy storage here and show that you could actually generate electricity from gas, not just store it, for well less than this.  But it is sort of hard to make the calculation when they don't get the units right.

By the way, if this is required to make wind power work, will we start seeing wind advocates building in $50 million batteries when they present their economics?  Any bets?

Here Are the Two Problems With EV's

There are two problems with electric vehicles.  Neither is unsolvable in the long term, but neither is likely to be solved in the next 5 years.

  1.  Energy Density.  15 gallons of gasoline weighs 90 pounds and takes up 2 cubic feet.  This will carry a 40 mpg car 600 miles.  The Tesla Model S 85 kWh battery pack weighs 1200 pounds and will carry the car 265 miles (from this article the cells themselves occupy about 4 cubic feet if packed perfectly, but in this video the whole pack looks much larger).  We can see that even with what Musk claims is twice the energy density of other batteries, the Tesla gets 0.22 miles per pound of fuel/battery while the regular car gets 6.7.  That is a difference in energy density of 30x.  Some of this is compensated for by heavy and bulky things the electric car does not need (e.g. a coolant system), but it is still a major problem in car design.
  2. Charge Time.  In my mind this is perhaps the single barrier that could, if solved, make electric cars ubiquitous.  People complain about electric car range, but really EV range is not that much shorter than the range of traditional cars on a tank of gas.  The problem is that it is MUCH faster to refill a tank of gas than it is to fully charge a battery.  Traditionally it takes all night to charge an electric car, but 2 minutes at the pump to "charge" a gasoline engine.  The fastest current charging claim is Tesla's, which claims that the supercharger sites it has built on many US interstate routes will charge 170 miles of range in 30 minutes, or 5.7 miles per minute.  A traditional car (the same one used in point 1) can add 600 miles of range in 2 minutes, or 300 miles per minute -- 52 times faster than the electric car.  This is the real reason EV range is an issue for folks.
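The two ratios above can be worked out directly from the figures in the list:

```python
# Energy density proxy: miles of range per pound of fuel or battery.
gas_mi_per_lb = 600 / 90        # 40 mpg car, 15 gal of gas = 90 lb -> ~6.7
tesla_mi_per_lb = 265 / 1200    # Model S, 1200 lb 85 kWh pack -> ~0.22
print(f"Density ratio: ~{gas_mi_per_lb / tesla_mi_per_lb:.0f}x")   # ~30x

# Refueling rate: miles of range added per minute.
gas_mi_per_min = 600 / 2        # 2-minute fill-up -> 300 mi/min
tesla_mi_per_min = 170 / 30     # Supercharger claim -> ~5.7 mi/min
print(f"Refuel-rate ratio: ~{gas_mi_per_min / tesla_mi_per_min:.0f}x")
# ~53x (52x if you use the rounded 5.7 mi/min figure)
```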

Interestingly, Fisker (which failed in its first foray into electric cars) claims to have a solid state battery technology that gets at both these issues, particularly #2:

“Fisker’s solid-state batteries will feature three-dimensional electrodes with 2.5 times the energy density of lithium-ion batteries. Fisker claims that this technology will enable ranges of more than 500 miles on a single charge and charging times as low as one minute—faster than filling up a gas tank.”

Forget all the other issues.  If they can really deliver on the last part, we will all be driving electric vehicles in 20 years.  However, having seen versions of this same article for literally 30 years about someone or other's promised breakthrough in battery technology that never really lived up to the hype, I will wait and see.

Why Is It So Hard To Get Even Smart People To Think Clearly on Electric Vehicle Efficiency?

A lot of people on Twitter get freaked out when they see football players kneeling for the national anthem, or detect obscure micro-aggressions in some online statement.  When I venture onto Twitter, which I am still not sure is good for my mental health, I get freaked out by this:

My initial response on Twitter was "Of course they are if you leave out the efficiency of converting fuel to electricity".  I will explain this response more in this post.

It would be impossible to say that Eric Schmidt is not a smart guy or lacks technical training.  I'd like to think that he would quickly understand his error and agree that he would have said it better had he had more than 280 characters.  But soooo many people make this mistake, including the folks who write the electric vehicle MPGe standards for the government, that it is worth explaining why Mr. Schmidt's statement, as written, is silly.

Let's first look at what the terms here mean.

  • When we say that electric motors are 97% efficient, we mean that the actual physical work produced per unit of time is 97% of the electrical power used by the motor, which equals the current flowing to the motor times its voltage.
  • When we say that the internal combustion engine is 45% efficient, we mean that the physical work we get out of the engine is 45% of the heat liberated from burning its fuel.

By the way, both these efficiency numbers are the top end of current technology running at an ideal speed and percentage load.  In real life, efficiencies of both are going to be much lower.  Of the two numbers, the efficiency number for internal combustion is probably the more generous -- for non-diesel engines in most cars I would be surprised if the actual efficiency were much higher than half this figure.  Even average electric motors will still be in the 80's.

Here is the problem with what he tweeted

The problem with Schmidt's statement on its face is that he is comparing apples and oranges -- he has left out the efficiency in actually producing the electricity.  And for the vast, vast majority of the country, the marginal fuel -- the fuel providing the electricity for the next increment of load -- is going to be natural gas or coal.  His numbers leave out that conversion step, so let's add it in.

Actual power plants, depending on their age and use, have a wide range of efficiency numbers.  For example, a big combined cycle plant is more efficient than a simple gas turbine, but a gas turbine is useful because it can be started and stopped really quickly to react to changes in load.  Schmidt used leading-edge efficiency numbers, so I will do the same.  For a coal plant the best numbers are in the high forties.  For a gas plant, this can reach into the 50's (this site says 60%, but that is the highest I have ever seen).  We will take 50% as a reasonable number for a very, very efficient power plant.  Power plants, by the way, since they tend to run constantly at ideal speeds and loads, can get much closer to their ideal efficiency in real life than can, say, internal combustion engines.

After the electricity is produced, we have to take into account line and transformer losses (and in the case of electric cars the battery charging losses).  This obviously varies a lot but I have always used a figure of 10% losses so a 90% efficiency number.

Taking these numbers, let's convert the 97% efficiency number for electric motors to an efficiency number all the way back to the fuel so it is apples to apples with internal combustion.  We take 97% times 90% transmission efficiency times 50% electricity production efficiency equals 43.6%.  This is actually less than his 45% figure.  By his own numbers, the electric motor is worse, though I think in reality with realistic efficiency numbers rather than best-possible numbers the electric motor would look better.   The hard step where one is really fighting the laws of thermodynamics is the conversion of heat to work or electricity.  So it is amazing that a tiny power plant in your car can even be in the ballpark of giant optimized multi-stage power plants.
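The chain of conversions can be laid out explicitly:

```python
# Apples-to-apples efficiency for the electric drivetrain, traced
# all the way back to the fuel burned at the power plant.
motor_eff = 0.97   # best-case electric motor
grid_eff = 0.90    # ~10% line/transformer/charging losses
plant_eff = 0.50   # very efficient gas plant

overall = motor_eff * grid_eff * plant_eff
print(f"Fuel-to-wheels efficiency: {overall:.2%}")  # 43.65%
```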

Here is why electric motor efficiency is almost irrelevant to getting rid of fossil fuels

Very efficient electric motors are necessary for moving to a non-fossil fuel economy, but not because of small increments in efficiency.  The reason is that large parts of our energy-using technology, mostly vehicles, run directly on a liquid fuel, and the distribution system for that fuel is already in place.  Replacing this liquid fuel distribution system with something else would be really expensive.  But there does exist one other energy distribution system that has already been built out -- the one for electricity.  So having efficient electric motors allows the use of non-gasoline energy sources, if those sources can be turned into electricity.  For example, there are real advantages to running vehicles on CNG, but there is no distribution system for it, so its use has been limited to large fleets (like city buses) that can build their own fueling stations.  But electric cars can use electricity from natural gas, as well as from solar and wind, which have no distribution method other than electricity.

The problem with all this is that most of the barriers to using electricity in more applications are not related to motor efficiency.  For vehicles, the problem is energy storage density.  Many different approaches to powering automobiles were tried in the early days, including electric and steam powered cars.  The main reason, I think, that gasoline won out was energy storage density.  15 gallons of gasoline weighs 90 pounds and takes up 2 cubic feet.  This will carry a 40 mpg car 600 miles.  The Tesla Model S 85 kWh battery pack weighs 1200 pounds and will carry the car 265 miles (from this article the cells themselves occupy about 4 cubic feet if packed perfectly, but in this video the whole pack looks much larger).  We can see that even with what Musk claims is twice the energy density of other batteries, the Tesla gets 0.22 miles per pound of fuel/battery while the regular car gets 6.7.  At more than an order of magnitude, that is simply an enormous difference, and it explains the continued existence of internal combustion engines much better than electric motor inefficiencies do.

And here is why electric vehicle equivalent MPG standards are still screwed up

I don't really have the energy to write about this again, but because these issues are so closely related I will quote myself from the past.  Suffice it to say that after years of development, the EPA made nearly the exact same mistake as Mr. Schmidt's tweet, and this despite the fact that the agency had already developed an accurate methodology and then abandoned it for a flawed methodology that produced inflated numbers for electric vehicles.  There is more than one way for the government to subsidize electric vehicles!

The Fisker Karma electric car, developed mainly with your tax money so that a bunch of rich VC's wouldn't have to risk any real money, has rolled out with a nominal EPA MPGe of 52 in all electric mode (we will ignore the gasoline engine for this analysis).

Not bad?  Unfortunately, it's a sham.  This figure is calculated using the grossly flawed EPA process that substantially underestimates the amount of fossil fuels required to power the electric car, as I showed in great depth in an earlier Forbes.com article.  In short, the EPA methodology leaves out, among other things, the conversion efficiency in generating the electricity from fossil fuels in the first place [by assuming perfect conversion of the potential energy in the fuel to electricity, the EPA is actually breaking the 2nd law of thermodynamics].

In the Clinton administration, the Department of Energy (DOE) created a far superior well to wheels MPGe metric that honestly compares the typical fossil fuel use of an electric vs. gasoline car, using real-world power plant efficiencies and fuel mixes to figure out how much fuel is used to produce the electricity that goes into the electric car.

As I calculated in my earlier Forbes article, one needs to multiply the EPA MPGe by .365 to get a number that truly compares fossil fuel use of an electric car with a traditional gasoline engine car on an apples to apples basis.  In the case of the Fisker Karma, we get a true MPGe of 19.  This makes it worse than even the city rating of a Ford Explorer SUV.

The Insanity of Base Load Wind Power

I have talked a lot about how wind power has almost no effect on fossil fuel use because the unpredictability of wind requires a lot of fossil-fueled plants to keep burning fuel on hot standby in case the wind dies.  Matt Ridley comes at wind from a different angle, discussing what it would take for wind to actually have any meaningful impact on world electricity production.

Even put together, wind and photovoltaic solar are supplying less than 1 per cent of global energy demand. From the International Energy Agency’s 2016 Key Renewables Trends, we can see that wind provided 0.46 per cent of global energy consumption in 2014, and solar and tide combined provided 0.35 per cent. Remember this is total energy, not just electricity, which is less than a fifth of all final energy, the rest being the solid, gaseous, and liquid fuels that do the heavy lifting for heat, transport and industry....

Meanwhile, world energy demand has been growing at about 2 per cent a year for nearly 40 years. Between 2013 and 2014, again using International Energy Agency data, it grew by just under 2,000 terawatt-hours.

If wind turbines were to supply all of that growth but no more, how many would need to be built each year? The answer is nearly 350,000, since a two-megawatt turbine can produce about 0.005 terawatt-hours per annum. That’s one-and-a-half times as many as have been built in the world since governments started pouring consumer funds into this so-called industry in the early 2000s.

At a density of, very roughly, 50 acres per megawatt, typical for wind farms, that many turbines would require a land area greater than the British Isles, including Ireland. Every year. If we kept this up for 50 years, we would have covered every square mile of a land area the size of Russia with wind farms. Remember, this would be just to fulfil the new demand for energy, not to displace the vast existing supply of energy from fossil fuels, which currently supply 80 per cent of global energy needs.
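Ridley's per-turbine figure is easy to sanity-check: 0.005 TWh per year from a 2 MW machine implies a capacity factor of roughly 29%, which is plausible for onshore wind.  A quick check (the 1,750 TWh growth number is my round figure, chosen to be consistent with his "just under 2,000 TWh" wording and his 350,000 count):

```python
# Sanity-checking Ridley's turbine arithmetic.  The 2 MW size and
# 0.005 TWh/year output are his figures; the 1,750 TWh annual demand
# growth is my round number consistent with his 350,000 count.
turbine_mw = 2
twh_per_turbine_yr = 0.005
annual_growth_twh = 1_750

# Implied capacity factor: actual annual output vs. running flat out
# for all 8,760 hours of the year.
max_twh = turbine_mw * 8760 / 1e6          # 0.01752 TWh at 100% output
capacity_factor = twh_per_turbine_yr / max_twh
print(f"Implied capacity factor: {capacity_factor:.0%}")      # ~29%

turbines_per_year = annual_growth_twh / twh_per_turbine_yr
print(f"Turbines needed per year: {turbines_per_year:,.0f}")  # 350,000
```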

How do renewables advocates trumpet the high renewables numbers they often report?  By lumping in other things and hoping the reader is tricked into thinking the total is wind and solar.

Their trick is to hide behind the statement that close to 14 per cent of the world’s energy is renewable, with the implication that this is wind and solar. In fact the vast majority — three quarters — is biomass (mainly wood), and a very large part of that is ‘traditional biomass’; sticks and logs and dung burned by the poor in their homes to cook with. Those people need that energy, but they pay a big price in health problems caused by smoke inhalation.

People who talk about sustainability often miss the single best metric we have of the net scarcity of resources that goes into any product:  price.  I am always amazed when people point at a much, much higher priced version of some product and claim that it is more sustainable.  How can this possibly be?  Assuming the profit margins are relatively similar, the higher priced product has to be using more and scarcer resources.  How is that more sustainable?  (I will perhaps grant the exception that certain emissions are not properly priced into some products.)

To this end, wind power is much more expensive than, say, power from modern natural gas generation plants, even if one factors in a $30 a ton or so cost of CO2 emissions.  This has to make us suspicious that maybe it is not really more "sustainable".

Wind turbines, apart from the fibreglass blades, are made mostly of steel, with concrete bases. They need about 200 times as much material per unit of capacity as a modern combined cycle gas turbine. Steel is made with coal, not just to provide the heat for smelting ore, but to supply the carbon in the alloy. Cement is also often made using coal. The machinery of ‘clean’ renewables is the output of the fossil fuel economy, and largely the coal economy.

Commercial Airline Pilots, African Style

This was a letter from a pilot in answer to the question of why European airports ban flights from certain African airlines:

One African airline I know of has the procedure that every landing must be a smooth one. That sounds okay to most people, but hopefully not to the pilots among us. They mandated that landing smoothly was more important than landing in the touchdown zone. They didn’t go around, they would simply touch down in the middle of the runway and slam on the brakes, close their eyes and pray that they didn’t go off the end of the runway. But the passengers never knew any of this, just feeling a smooth touchdown, so that was the most important factor for them.

The same airline would work out the maximum load they were able to safely lift in terms of passengers and freight, complete the load sheet and other paperwork to fit with this maximum, then do the real calculations in-flight, commonly landing more than five metric tonnes over the maximum landing weight.

The same airline made their pilots work 10 days on, one day off. They were not allowed to call in sick and any breach of this would necessitate armed men being sent to the pilots place of accommodation to physically force them onto the aircraft. I believe I know of two First Officers who got away with it — one because he had been arrested for murder, the other because he had been kidnapped. So I guess they weren’t totally unreasonable.

The SOP at my airline is to descend at 700 feet per minute for a three-degree approach and flare at about 20 feet. The SOP for this airline was to descend at 1500 feet per minute — thrust is idle so it saved them fuel — and flare at about 100 feet, floating down the runway to land dangerously, but smoothly, far far down the runway. Maintenance procedures were ignored and reports doctored. Licenses and check rides were given and passed based on bribes.

Eek.
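As an aside, the pilot's 700 feet per minute figure for a three-degree glidepath checks out, assuming a typical airliner approach groundspeed of about 135 knots (my number; the letter does not give one):

```python
import math

# Descent rate needed to hold a 3-degree glidepath, assuming a
# ~135 knot groundspeed (a typical approach speed; my assumption).
groundspeed_kt = 135
glidepath_deg = 3

ft_per_min_forward = groundspeed_kt * 6076 / 60   # knots -> feet per minute
descent_fpm = ft_per_min_forward * math.tan(math.radians(glidepath_deg))
print(f"Required descent rate: ~{descent_fpm:.0f} fpm")  # ~716 fpm
```

A 1500 fpm descent at the same speed works out to more than a six-degree path, which is why the other airline had to flare high and float.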

By the way, if you are a frequent traveler interested in airline and credit card frequent flyer programs and benefits, the linked site is a good one.

Why Wind Power Does Not Greatly Reduce Fossil Fuel Use

The problem with wind power is that electric utilities have to be prepared at any time for its power production to just stop on short notice.  So they must keep fossil fuel plants on hot standby, meaning they are basically burning fuel but not producing any power.  Storage technologies and the use of relatively fast-start plants like gas turbines mitigate this problem a bit but do not come close to eliminating it.  This is why wind power simply as a source contributing to the grid makes very little sense.  Here is Kent Hawkins of Master Resource going into a lot more depth:

How do electricity systems accommodate the nature of wind and solar? They do this by having redundant capacity almost equalling the renewable capacities as shown in Figures 5 and 6 for two jurisdictions that have heavily invested in wind and solar – Germany and Ontario, Canada.


Figure 5 – Duplicate capacity requirements for Germany in 2015.

Source: See note 4, sub point a.



Figure 6 – Duplicate capacity requirements for Ontario, Canada, in 2018

Source: Ontario Power Authority[5]

In both figures, the left-hand columns are peak demand requirements and include all the dispatchable capacity that is required to reliably meet demand and provide operating reserve. In the right-hand columns, if you look very carefully, you can see the capacity credit for wind by the slight reduction in “Peak Demand + Op Reserve.” In summary, when wind and solar are added, the other generation plants are not displaced, and, relative to requirements, wind and solar are virtually all duplicate capacity.

Wind might make more sense in niche applications where it is coupled into some kind of production process that can run intermittently and have its product stored.  I think T Boone Pickens suggested having wind produce hydrogen from water, for example, and then store the hydrogen as fuel.  This makes more sense because the total power output of a wind plant over a year can be predicted with far more certainty than the power output at any given minute of a day.  This is one reason why the #1 historic use of windpower outside of transportation has been to pump water -- because the point is to fill the tank once a week or drain the field over a month's time and not to make absolutely sure the field is draining at 10:52 am.  The intermittent power is stored in the form of water that has been moved from one place to another.

When Government Picks Winners, It Mostly Chooses Losers

In an article for Cato mocking the Obama Administration for creating energy technology forecasts that run to the year 2300, Pat Michaels wrote:

Consider the case of domestic natural gas. In 2001, everyone knew that we were running out. A person who opined that we actually would soon be able to exploit hundreds of years’ worth, simply by smashing rocks underlying vast areas of the country, would have been laughed out of polite company.

Energy statists on the Left today are trying to get rid of coal-fired electricity generation in this country (due to climate concerns).  But one thing that few people remember is that a significant reason we have so much coal-fired electricity generation in this country is that energy statists on the Left in the 1970's mandated it.  I kid you not:

The Powerplant and Industrial Fuel Use Act (FUA) was passed in 1978 in response to concerns over national energy security. The 1973 oil crisis and the natural gas curtailments of the mid 1970s contributed to concerns about U.S. supplies of oil and natural gas. The FUA restricted construction of power plants using oil or natural gas as a primary fuel and encouraged the use of coal, nuclear energy and other alternative fuels. It also restricted the industrial use of oil and natural gas in large boilers.

As a further irony, and absolutely typical of government regulation, this regulation banning oil and gas fired plants because oil and gas seemed to be running out was really trying to fix a problem caused by another regulation.   The government had caps on oil and gas prices through the 1970's that artificially reduced supplies.  Once these price regulations were removed, we suddenly had an oil and gas glut in the 1980's and the FUA was eliminated in 1987.  Watching regulators chase their tails in energy policy over the last 40 years would be comical if the effects of their repeated mistakes were not so dire.

The Left Justifies New Taxes Based on Reducing (Presumed) Negative Externalities, But Actually Just Wants The Money

Here is the Wikipedia definition of  a Pigovian tax:

A Pigovian tax (also spelled Pigouvian tax) is a tax levied on any market activity that generates negative externalities (costs not internalized in the market price). The tax is intended to correct an inefficient market outcome, and does so by being set equal to the social cost of the negative externalities. In the presence of negative externalities, the social cost of a market activity is not covered by the private cost of the activity. In such a case, the market outcome is not efficient and may lead to over-consumption of the product.[1] An often-cited example of such an externality is environmental pollution.

The Left often tries to justify new taxes based on their being Pigovian taxes.  The classic example is a carbon tax -- it is claimed there is a social cost to carbon-based fuel combustion (e.g. CO2 production and resulting global warming) that is not taken into account by market prices.  By adding the tax, these other costs can be taken into account, likely raising the price of these fuels and thus both reducing their use and providing a higher price umbrella for alternatives.

For years, I accepted these arguments at face value.  I might argue with them (for example, I think that the Left has tended to spot 10 of the last 2 true negative externalities), but I accepted that they really believed in the logic of the Pigovian tax.  I am now becoming convinced that I was wrong, that the Left's support of Pigovian taxes is frequently a front, a way of putting a more palatable face on what is really a naked grab for more taxpayer money by public officials.  To support this emerging hypothesis, I cite two examples.

 1.  Proposed Carbon Tax in Washington State

This last November, a carbon tax was placed on the ballot in Washington State.  In many ways, it partially mirrored my own proposal (here) by making the tax revenue neutral, ie the new carbon tax was offset by a reduction in other regressive taxes, particularly other consumption taxes.  If the Left and environmental groups truly embraced the Pigovian logic of a carbon tax, they should have jumped at supporting this initiative.  I discuss what happened in depth here but Vox has a good summary:

The measure, called Initiative 732, isn’t just any carbon tax, either. It’s a big one. It would be the first carbon tax in the US, the biggest in North America, and one of the most ambitious in the world.

And yet the left opposes it. The Democratic Party, community-of-color groups, organized labor, big liberal donors, and even most big environmental groups have come out against it.

Why on Earth would the left oppose the first and biggest carbon tax in the country? How has the climate community in Washington ended up in what one participant calls a "train wreck"? (Others have described it in more, er, colorful terms.)....

the alliance’s core objection to I-732 is that it is revenue-neutral — it surrenders all that precious revenue, which is so hard to come by in Washington. That, more than anything else, explains why alliance groups are not supporting it.

Opponents say they wanted to use the revenue for climate-related investments, but even if true, there are two things wrong with this.  First, it shows ignorance of the economic theory of the Pigovian tax -- the whole point is that by raising the price of carbon-based fuels, markets will find the most efficient way to reduce fuel use, which is far more efficient than reducing CO2 production through government cronyist winner-picking "investments".  The second problem is that such promises of funds dedication never last.  The tobacco settlement was supposedly all to go to health care and tobacco-related education, but there is not a single state where even a double-digit percentage went to these things (the American Lung Association estimates just 2% of the funds go to the original purpose).  In New York, the entire tobacco settlement stream was securitized and used to plug a single year's general budget hole.  You can be assured the same thing would happen with carbon tax revenue.

2.  Soda Tax in Philadelphia

Last year, Philadelphia passed a large soda tax.  The justification for such a tax is that such drinks cause obesity and other health issues.  Either for people's own good or to reduce the future burden on government health care programs, the whole point of such a tax is to reduce soda consumption.  Or so it was justified.

But now that the tax has taken effect, the city government that passed it seems shocked and surprised that soda consumption is way down.  You would think they would be declaring victory ... that is, if the point was ever to reduce soda consumption and not just to raise some extra revenue.  Via Reason:

For now, Kenney and other city officials seem unfazed—dismissive, even—of the problems caused by the new tax. A city spokesman told Philly.com that no one knows whether low sales figures and predicted job losses are anything more than "fear-mongering to prevent this from happening in other cities."

Kenney put an even finer point on it.

"I didn't think it was possible for the soda industry to be any greedier," Kenney said in an emailed statement to Philly.com reporter Julia Terruso. "They are so committed to stopping this tax from spreading to other cities, that they are not only passing the tax they should be paying onto their customer, they are actually willing to threaten working men and women's jobs rather than marginally reduce their seven figure bonuses."

It's not the first time Kenney has tried to ignore basic economics when it comes to the soda tax. A few weeks ago, he blamed grocery stores and restaurants for "price gouging" when they increased prices for sugary drinks to make consumers pay for the cost of the tax (the tax is technically applied on the transaction between distributors and retailers, but, like all other taxes, it gets passed along).

It's clear that this tax, justified as a Pigovian tax, is really no such thing.  City officials seem to be honestly surprised that consumption is down as the result of a Pigovian tax whose whole purpose is to... reduce consumption.  And if they really did not expect the tax to be passed on to consumers, then how did they think it would work?  In fact, city officials are actually worried that reductions in soda consumption are going to cause the tax to yield less money than they expected, creating a hole in their budgets.

*    *    *

Going forward, I plan to apply an order of magnitude more skepticism to any future calls for Pigovian taxes.  I think the first thing I will ask of each new suggestion is "do you still support this tax if I were to make it revenue neutral, say by offsetting it with reductions in other regressive taxes?"

Why Scams Work

The WSJ has an interesting article about why get rich quick schemes that should be so easy to demolish, particularly with Google at our fingertips, seem to attract so many people.

The article reminded me of a piece I published years ago over at my climate site.  It was about a company called "Hydroinfra" in Sweden.  I want to reprint the article as I still find the subject immensely entertaining.  In particular, I really, really encourage you to look at the comments section of the article linked toward the bottom and see the back-and-forth with reader "michael".  In the face of overwhelming skepticism from pretty much every other reader, Michael desperately wants to believe -- so much so that he and a few others start heaping derision on, and imputing sinister motives to (interspersed with spurious appeals to authority), those who are trying to patiently explain the science.  One can see this same desperate behavior from those who have bought into every famous pyramid scheme ever.

I got an email today from some random Gmail account asking me to write about HydroInfra.  OK.  The email begins: “HydroInfra Technologies (HIT) is a Stockholm based clean tech company that has developed an innovative approach to neutralizing carbon fuel emissions from power plants and other polluting industries that burn fossil fuels.”

Does it eliminate CO2?  NOx?  Particulates?  SOx?  I actually was at the bottom of my inbox for once so I went to the site.  I went to this applications page.  Apparently, it eliminates the “toxic cocktail” of pollutants that include all the ones I mentioned plus mercury and heavy metals.  Wow!  That is some stuff.

Their key product is a process for making something they call “HydroAtomic Nano Gas” or HNG.  It sounds like their PR guys got Michael Crichton and JJ Abrams drunk in a brainstorming session for pseudo-scientific names.

But hold on, this is the best part:

Splitting water (H20) is a known science. But the energy costs to perform splitting outweigh the energy created from hydrogen when the Hydrogen is split from the water molecule H2O.

This is where mainstream science usually closes the book on the subject.

We took a different approach by postulating that we could split water in an energy efficient way to extract a high yield of Hydrogen at very low cost.

A specific low energy pulse is put into water. The water molecules line up in a certain structure and are split from the Hydrogen molecules.

The result is HNG.

HNG is packed with ‘Exotic Hydrogen’

Exotic Hydrogen is a recent scientific discovery.

HNG carries an abundance of Exotic Hydrogen and Oxygen.

On a Molecular level, HNG is a specific ratio mix of Hydrogen and Oxygen.

The unique qualities of HNG show that the placement of its’ charged electrons turns HNG into an abundant source of exotic Hydrogen.

HNG displays some very different properties from normal hydrogen.

Some basic facts:

  • HNG instantly neutralizes carbon fuel pollution emissions
  • HNG can be pressurized up to 2 bars.
  • HNG combusts at a rate of 9000 meters per second while normal Hydrogen combusts at a rate 600 meters per second.
  • Oxygen values actually increase when HNG is inserted into a diesel flame.
  • HNG acts like a vortex on fossil fuel emissions causing the flame to be pulled into the center thus concentrating the heat and combustion properties.
  • HNG is stored in canisters, arrayed around the emission outlet channels. HNG is injected into the outlets to safely & effectively clean up the burning of fossil fuels.
  • The pollution emissions are neutralized instantly & safely with no residual toxic cocktail or chemicals to manage after the HNG burning process is initiated.

Exotic Hydrogen!  I love it.  This is probably a component of the “red matter” in the Abrams Star Trek reboot.  Honestly, someone please tell me this is a joke, a honeypot for mindless environmental activist drones.  What are the chemical reactions going on here?  If CO2 is captured, what form does it take?  How does a mixture of Hydrogen and Oxygen molecules in whatever state they are in do anything with heavy metals?  None of this is on the website.  On their “validation” page, they have big labels like “Horiba” that look like organizations that have somehow put their imprimatur on the study.  In fact, they are just names of analytical equipment makers.  It’s like putting “IBM” in big print on your climate study because you ran your model on an IBM computer.

SCAM!  Honestly, when you see an article written to attract investment that sounds sort of impressive to laymen but makes absolutely no sense to anyone who knows the smallest amount of Chemistry or Physics, it is an investment scam.

But they seem to get a lot of positive press.  In my search of Google, everything in the first ten pages or so are just uncritical republication of their press releases in environmental and business blogs.   You actually have to go into the comments sections of these articles to find anyone willing to observe this is all total BS.   If you want to totally understand why the global warming debate gets nowhere, watch commenter Michael at this link desperately try to hold onto his faith in HydroInfra while people who actually know things try to explain why this makes no sense.

Years later, doing a Google search, I still seem to be the only person in the first 10 pages of Google results that wrote a skeptical article.  Seriously, I figured out this was all bullsh*t from about 60 seconds of studying their web site -- is this really what happens in tech journalism?  I got the same press release in my box that they did.  I (and many of the tech site commenters) figured this out quickly, why didn't any actual journalists?

 

Demand Curve? What Demand Curve?

Today's little slice of economic ignorance comes from tech site Engadget, a frequent contributor of such morsels.  Apparently California is considering new penalties on auto makers for not selling enough electric cars, penalties which by their structure will be fed right into the pocket of Tesla, already a gaping maw of government subsidy consumption:

Assemblywoman Autumn Burke tells the Associated Press that she's introducing a bill requiring that car manufacturers sell at least 15 percent zero-emissions vehicles within a decade. Companies operating in the state already have to hit yearly emissions targets and get credits for sales, but this would require that they embrace electric or hydrogen fuel cell cars in a big way -- not just one or two novelty models. And if they don't sell enough eco-friendly cars, they'd have to either pay a fine to the state or pay rivals that meet the targets. Yes, they might inadvertently help the competition.

If the bill becomes law, it could light a fire under car makers that have so far been slow to adopt emissions-free tech. Only 3 percent of all California car sales are either electric or plug-in hybrids.

The underlying assumption, both by Ms. Burke as well as the article's author, seems to be that lack of electric car sales is entirely a supply-side problem -- low sales are because auto makers don't make enough of them.  While I have no doubt that there would be incrementally more sales if auto makers had a larger variety of models with different combinations of features, all of this seems to ignore the demand side.  Automakers, who are constantly locked in a death struggle over tiny increments of market share, and who already pay penalties for not selling as many electric cars as politicians would wish them to, have every incentive to sell as many as they can.  The issue strikes me as one of demand rather than supply - given current technology limits and costs, and despite large financial incentives from the government in the form of tax subsidies, most buyers have eschewed electric vehicles to date.  Neither Ms. Burke nor the author even pretend that this law will change this demand situation.

Which is why critics rightly argue that this is just another way to funnel other people's money into Elon Musk's pocket, without his actually having to sell any more cars.  Tesla already depends on payments from other auto makers for electric vehicle indulgences for much of its revenue, and this can only go up under this kind of law.

Why Wind and Solar Are Not Currently the Answer on Emissions Reductions

I have made this point forever, but it always bears repeating -- the variability of wind and solar requires hot fossil fuel backups, which leads to little reduction in total fossil fuel generation capacity (so that wind and solar investments are entirely duplicative) and less-than-expected reductions in actual emissions.

I don't think wind will ever be viable, except perhaps in a few unique offshore locations.  Solar is potentially viable with a 10x or so reduction in panel costs and a 10-100x reduction in battery/energy storage costs.  I honestly think that day will come, but we are not there.

From the Unbroken Window comes this slide from an interesting presentation at the Ontario Society of Professional Engineers, essentially making the same points I and others have been trying to make for years.

Ontario-Engineers

I made the point about nuclear in my climate legislative proposal here.

The Electric Vehicle Mileage Fraud Update: Singapore Figures It Out

Long-time readers know that while I have no particular problem with electric cars, I do think that the EPA uses fraudulent standards for evaluating the equivalent fuel economy, or MPGe, of electric vehicles.  In short, the current Obama standard ignores the previous Clinton-era methodology and creates a crazy new standard that assumes fossil fuels are burned with perfect efficiency when making electricity.  Most of my readers (but perhaps few Obama voters) will understand this assumption to be absurd.  The result, as discussed here in Forbes, is that the current MPGe numbers for electric vehicles are overstated by roughly a factor of three (specifically, you need to multiply them by 36.5% to get the correct equivalent amount of fossil fuels that must be burned in the power plant to power the electric car).  When this correction is made, cars like the Nissan Leaf are good (but not as good as a Prius) and cars like the old Fisker Karma get worse mileage than an SUV.

As I wrote in the article on the Karma,

...electric vehicle makers want to pretend that the electricity to charge the car comes from magic sparkle ponies sprinkling pixie dust rather than burning fossil fuels. Take this quote, for example:

a Karma driver with a 40-mile commute who starts each day with a full battery charge will only need to visit the gas station about every 1,000 miles and would use just 9 gallons of gasoline per month.

This is true as far as it goes, but glosses over the fact that someone is still pouring fossil fuels into a tank somewhere to make that electricity.  This seems more a car to hide the fact that fossil fuels are being burned than one designed to actually reduce fossil fuel use.  Given the marketing pitch here that relies on the unseen vs. the seen, maybe we should rename it the Fisker Bastiat.

Well, congrats to Singapore.  They seem to have figured out what the US hasn't:

In the United States, motorists who buy a new Tesla Model S are eligible for an array of federal and local tax breaks because the all-electric sedan is considered a zero-emissions car. The story is different in Singapore, however, where the nation’s first Model S owner just found out his car is subject to heavy taxes because it’s lumped in the same category as some of the dirtiest new cars on the market.

Joe Nguyen explains he spent seven months trying to import a Model S that he bought in Hong Kong to his home in Singapore. The government’s Carbon Emissions-based Vehicle Scheme (CEVS) rewards motorists who import a used eco-friendly car with a roughly $11,000 tax break, but Nguyen was slapped with an $11,000 fine based on the conclusion that the S uses too much electricity.

“I don’t get it, there are no emissions. Then they send out the results from VICOM, stating that the car was consuming 444 watt hours per kilometer. These are not specs that I have seen on Tesla’s website, or anywhere else for that matter,” explained Nguyen in an interview with Channel NewsAsia.

A spokesperson for Singapore’s Land Transport Authority (LTA) said the fine is fair and completely justified.

“As for all electric vehicles, a grid emission factor of 0.5 g CO2/Wh was also applied to the electric energy consumption. This is to account for CO2 emissions during the electricity generation process, even if there are no tail-pipe emissions,” wrote the spokesperson in a statement. The LTA added that it had never tested a Model S before it received Nguyen’s car.

That means that, under Singaporean regulations, the Model S falls in the same emissions category as cars with an internal combustion engine that emits between 216 and 230 grams of CO2 per kilometer. In other words, it’s about as eco-friendly as high-performance, gasoline-burning models like the Audi RS 7, the Mercedes-AMG GT S, and the Porsche Cayenne S.

Actually, the US DOE does in fact publish electricity usage per mile driven.  They list numbers in the range of 38 kWh per 100 miles for the Model S, which would be about 238 watt-hours per kilometer, so such numbers exist, though Singapore thinks the car is less efficient than does Obama's DOE.  By my calculation the true MPGe (if the DOE's electric efficiency numbers are trustworthy) of the car should be around 32, which is good for a large performance car (and well better than the competitive cars cited) but probably not lofty enough to deserve a subsidy.  Singapore's calculation that the Model S is as dirty as these cars on a CO2 emissions basis may still be correct, even if the car is more efficient than their test showed, if most of Singapore's electricity is produced by coal.
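For anyone who wants to check the arithmetic, here it is sketched in Python.  The 33.7 kWh-per-gallon figure is the EPA's standard gasoline energy equivalence; the 36.5% power-plant correction and the DOE and Singapore figures come from the discussion above:

```python
KWH_PER_GALLON = 33.7      # EPA's gasoline energy-equivalence convention
PLANT_FACTOR   = 0.365     # wall-socket-to-power-plant-fuel correction
KM_PER_MILE    = 1.609

usage_kwh_per_100mi = 38.0  # DOE figure for the Model S

# DOE figure converted to Singapore's units (watt-hours per km)
wh_per_km = usage_kwh_per_100mi * 1000 / (100 * KM_PER_MILE)
print(round(wh_per_km))     # 236 -- close to the ~238 above

# EPA-style MPGe, then corrected for fuel burned at the power plant
mpge_epa  = 100 / usage_kwh_per_100mi * KWH_PER_GALLON
mpge_true = mpge_epa * PLANT_FACTOR
print(round(mpge_epa))      # 89
print(round(mpge_true))     # 32

# Singapore's CO2 math: 0.5 g/Wh grid factor times its 444 Wh/km test
print(444 * 0.5)            # 222.0 g/km -- inside the 216-230 band
```

Note that Singapore's 222 g/km lands squarely inside the 216-230 g/km band its regulators cited, so at least their arithmetic is internally consistent even if their tested efficiency number differs from the DOE's.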

The Fight Against Global Warming, in One Picture

Using a helicopter and a large tank of heated water to deice a windmill so it can continue to reduce fossil fuel use and global warming.  (source)

de-icing-wind-turbine

 

Great Moments in US Energy Policy: In the 1970's, The US Government Mandated Coal Use For New Power Plants

What does government energy policy have in common with government food advice?  Every 30-40 years the Federal government reverses itself 180 degrees and declares all the stuff that they said was bad before is now good today.

Case in point:  Coal-fired electrical generation.  Coal is pretty much the bête noire of environmentalists today, so much so that Obama actually pledged to kill the coal industry when he was running for office.  The combination of new regulation and the rapid expansion of cheap natural gas supplies has done much to kill coal use (as illustrated by this bankruptcy today).

But many people may not realize that the rise of coal burning in power plants in the US was not just driven by economics -- it was mandated by government policy:

Federal policies moved in coal's favor in the 1970s. With the Middle East oil crisis, policymakers began to adopt policies to try and shift the nation toward greater coal consumption, which was a domestic energy resource. The Energy Supply and Environmental Coordination Act of 1974 directed the Federal Energy Administration to prohibit the use of oil or natural gas by electric utilities that could use coal, and it authorized the FEA to require that new electric power plants be able to use coal. The Energy Policy and Conservation Act of 1975 extended those powers for two years and authorized $750 million in loan guarantees for new underground low-sulfur mines. Further pro-coal mandates were passed in the late-1970s.

I was aware of the regulations at the time as I was working in an oil refinery in the early 80's and it affected us a couple of ways.  First, it killed demand for low-sulphur heavy fuel oil.  And second, it sidelined several co-generation projects that made a ton of sense (generating electricity and steam from wasted or low-value portions of the oil barrel) but ran afoul of these coal mandates.

The Wrong Way to Sell Wind and Solar

A reader sent me this article on renewables by Tom Randall at Bloomberg.  I would like to spend more time thinking about it, but here are a few thoughts. [Ed:  sorry, totally forgot the link. duh.]

First, I would be thrilled if things like wind and solar can actually become cheaper, without government subsidies, than current fossil fuels.  I have high hopes for solar and am skeptical about wind, but leave that aside.

Second, I think he is selling renewables the wrong way, and is in fact trumpeting as a good thing something that really is not so good.  His argument is that the decline in capacity factors for natural gas and coal plants is a sign of the success of renewables.  The whole situation is complex, and a real analysis would require looking at the entire power system as a whole (which neither of us is doing).  But my worry is that all the author has done is demonstrate an unaccounted-for cost of renewables: the reduction in efficiency of coal and natural gas plants without actually being able to replace them.

Here is his key chart.  It purports to show the total US capacity factor of each energy mode, with capacity factor defined as the total electricity output of the plant divided by what the electricity output could be if the plant ran full-out 24/7/365.
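For concreteness, here is that definition in code, with made-up numbers rather than ones from the chart:

```python
HOURS_PER_YEAR = 8760  # 24 hours x 365 days

def capacity_factor(mwh_generated, nameplate_mw):
    """Actual annual output divided by output if run full-out 24/7/365."""
    return mwh_generated / (nameplate_mw * HOURS_PER_YEAR)

# A hypothetical 500 MW gas plant that generated 2.2 million MWh in a year:
print(round(capacity_factor(2_200_000, 500), 2))  # 0.5
```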

capacity factors

First, there is a problem with this chart in terms of its data selection -- one has to be careful looking at intra-year variations in capacity factor because they vary a lot seasonally, both due to weather and to changes in relative fuel prices.  Also, one has to be hugely suspicious when someone claims a long-term trend but only shows 18 months of data.  The EIA provides some of the same data for several years prior to the author's window.  You can see it is pretty volatile.

eia1

I won't dwell on the matter of data selection, because it is not the main point I want to make, but the author's chart looks suspiciously like cherry-picking endpoints.

The point I do want to make is that reducing the capacity utilization, and thus efficiency, is a COST, not a benefit as he makes it out to be.  Things would be different if renewables replaced a lot of fossil fuel capacity at the peak utilization of the day (the total capacity of a power system has to be sized to the peak daily demand).  But the peak demand in most Western countries occurs late in the day, long after solar has stopped producing.  Germany, which relies the most on solar, has studied this and found its peak electricity demand is around 6 PM, a time when solar provides essentially nothing.  Wind is a slightly different problem, because of its hour-to-hour unpredictability, but suffice it to say that it can't be counted on in advance on any particular day to provide power at the peak.

This means that one STILL has to have the exact same fossil fuel plant capacity as one did without renewables.  Yes, it runs less during the day and burns less fuel, but it still must be built and exist and be staffed and in many cases it still must be burning some fuel (even if producing zero electricity) to be hot and ready to go.

The author is arguing for a virtuous circle where reductions in the capacity factors of fossil fuel plants from renewables increase the total cost per kWh of electricity from fossil fuels (because the capital cost is amortized over fewer kilowatt-hours).  This is technically true, but it is not the way power companies have to look at it.  Power companies have got to build capacity to the peak.  With current technologies, that means fossil fuel capacity has to be built to the peak regardless of capacity factor.  If these plants have to be built anyway to cover for renewables when they disappear during the day, then the capital costs are irrelevant at the margin.  And the marginal cost of operations and producing power from these plants, since they have to continue to exist, is around $30-$40 a MWh, waaaay under renewables still.
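To put some illustrative (entirely made-up) numbers on the amortization point -- average cost per MWh climbs as capacity factor falls, but the marginal cost of running the already-built plant never moves:

```python
# Spreading a plant's fixed annual cost over fewer MWh raises the
# AVERAGE cost per MWh as capacity factor falls, while the MARGINAL
# cost stays put. All figures below are invented for illustration.

HOURS_PER_YEAR = 8760

def avg_cost_per_mwh(annual_fixed_cost, marginal_cost, nameplate_mw, cap_factor):
    mwh_generated = nameplate_mw * HOURS_PER_YEAR * cap_factor
    return annual_fixed_cost / mwh_generated + marginal_cost

FIXED    = 100_000_000  # $/yr capital + staffing for a 500 MW plant (assumed)
MARGINAL = 35           # $/MWh fuel + variable O&M (the $30-40 range above)

for cf in (0.7, 0.5, 0.3):
    print(cf, round(avg_cost_per_mwh(FIXED, MARGINAL, 500, cf)))
# 0.7 68
# 0.5 81
# 0.3 111
```

The average cost per MWh rises from about $68 to $111 as the capacity factor falls from 0.7 to 0.3, but since the plant must exist either way to cover the peak, only the $35 marginal figure matters at the margin.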

In essence, the author is saying: hurray for renewables!  We still have to have all the old fossil fuel plants, but they run less efficiently now AND we have paid billions of dollars to duplicate their function with wind and solar plants.  We get to pay twice for every unit of electricity capacity.

Environmentalists are big on arguing that negative externalities need to be priced and added to the cost of things that generate them -- thus the logic for a carbon tax.  But doesn't that mean we should tax wind and solar, rather than subsidize them, to charge them for the inefficiently-run fossil fuel plants we have to keep around to fill in when renewables inevitably fail us at the peak time of the day?

By the way, speaking of subsidies, the author with a totally straight face argues that renewables are now cheaper than fossil fuels with this chart:

solar costs

 

He also says, "Wind power, including U.S. subsidies, became the cheapest electricity in the U.S. for the first time last year."

I hate to break it to the author, but a Ferrari would be cheaper than a Ford Taurus if the government subsidized it enough -- that means nothing economically other than the fact that the government is authoritarian enough to make it happen.  All his chart shows is that solar is more expensive than coal and gas in every state.

And what the hell are those units on the left?  Does Bloomberg not know how to annotate charts?  Since 6 cents per kWh is a reasonable electricity cost, my guess is that this is dollars per MWh, but it is irritating to have to guess.

OK, I am Calling the Market Top

As readers will know, I am frustrated that the Feds continue to fuel a huge financial asset bubble.  While I was wrong, so far, that the Feds would create consumer and industrial price inflation from their massive money printing operation, they have created an enormous price inflation in financial assets.   Every week they pour more newly printed dollars into the hands of financial asset holders, and corporations have joined in the fun by taking advantage of low borrowing rates to buy back record amounts of their own shares.   With both the Fed and publicly-traded corporations taking so many financial assets off the market at the same time investors have new cash to invest, someone has to create some new assets to buy.

Enter:  The $500 million spec home.  I kid you not.

LOL, I am betting the neighbors are not happy


As an upside, I suppose they are creating a future tourist attraction.  Many of the great Gold Coast and Newport mansions of the late 19th century were too expensive for later generations to operate and ended up in the hands of non-profits and governments.

Nestle: Private Company Getting Blamed for Government Incompetence

The story begins with a discovery that the permit under which Nestle's Arrowhead Water has been collecting water in the San Bernardino National Forest expired in 1988.  LOL, oops.  Environmental and other Leftish sites are calling for Nestle's head and somehow blaming Nestle for this.

As a permittee with the US Forest Service (USFS) in California and across the country, I can guess with pretty high confidence exactly what happened here.  For years I was head of a trade group of recreation concessionaires (think lodges and guides and such) who do business in the USFS under permit, most of them located in California.  For years, the biggest problem we have had with the USFS in California is that the agency is years and years behind on nearly all its permit renewals.  There are literally hundreds of expired permits in the USFS in California alone.

For reasons that probably go to bureaucratic incentives, despite the Forest Service's huge budget, they are loath to allocate resources to renewing these permits -- they want to fill their organization with biologists and archaeologists and arborists, not contracts people.  Making the situation worse, Forest Service and other Federal rules have burdened the permit renewal process with so many legal requirements that each one, even if trivial in size and impact, is absurdly time-consuming to complete.

This is not a new situation -- it has obtained for years.  Almost five years ago I met personally with the Chief of the Forest Service in DC and begged for more resources to be assigned to permit renewals, but to no avail.   I did the same in a meeting barely a month ago with the head of the USFS's Region 5 (basically California).   All of us permittees have been vociferously complaining about this for years.

When you look at these situations, then, what you will see is not some evil private business trying to get over on the public, but a business that is literally screaming in frustration, year in and year out, begging the US Forest Service to address its permit renewal.   Generally, local Forest Service staff will give the company verbal assurances that they should keep operating, so they do, continuing to pay their fees and operate within the guidelines of the old, expired contract.

I would be willing to bet a fair amount of money that this is exactly what happened to Nestle.

By the way, the usual groups seem to be piling on Nestle about bottled water from the Sacramento tap water system.  A couple of comments:

  • Environmentalists seem to obsessively hate bottled water, but ignore what a trivial, trivial percentage of total water use is bottled.
  • Critics are accusing Nestle of making obscene profits on Sacramento tap water.  But if they really think the spread between tap water and bottled water is too large, isn't the real issue that Sacramento is under-pricing its tap water?  After all, Nestle is paying what everyone else in the town is paying for water.
  • Environmentalists have a misguided fetish for local foods, often ignoring that transportation costs and energy are a tiny percentage of most food production costs  (a percentage small enough to be dwarfed by differential productivity of soils and climates).  But here, all they can possibly accomplish is to chase Nestle's bottling plant out of California and then have the water trucked back into the state.  This might be a net gain depending on the differential value of California water vs. fuel, but we can't know that because California water pricing is so screwed up.

OK, I Relent: I Will Support A Carbon Tax If Y'all Will Stop the Torrent of Stupid

President Obama is preparing to unleash a Colorado-River-sized torrent of stupid.  He wants to spend tens of billions of dollars on goofy green energy projects that will have an indiscernible effect on world temperatures but a very robust effect on some crony bottom lines.  Here is one example:

As part of President Obama’s plans to combat climate change, the White House announced a program on Friday for the U.S. Department of Energy to train 75,000 people to work in the solar power industry by 2020, many of whom will be part of a military veterans jobs initiative called Solar Ready Vets.

Seriously, are the training costs of workers really a substantial portion of a solar installation?

Andrea Luecke, president and executive director of the Solar Foundation, which publishes the annual National Solar Jobs Census, said that Obama’s announcement will not likely increase the size of the solar industry’s workforce but will instead ensure that the industry will be able to find highly skilled workers to fill jobs.

“We’re experiencing difficulty finding more skilled and qualified workers to install and do design work required,” she said, adding that the industry’s workforce has a “skills gap” as well-trained electricians and other workers go back to other construction jobs as the economy gains momentum.

I will translate that trade-group speak for you:  We like to pay our workers less than similarly-skilled construction workers so we lose a lot of skilled workers to higher paying construction companies. This program will not add any net employment to the economy but will help us keep wages lower by increasing the supply of qualified workers.

I can't help but think of Henry Ford, who famously raised the wages of his employees substantially.  The fake story is that he did this so all his workers could buy his product.   The real reason he did this was that he had horrendous labor turnover problems.  Like the solar industry, he was training folks who then left for higher paying jobs.  So he had to raise his wages to retain trained people.  How history would have changed if Ford had instead been able to call Obama and ask him to have the taxpayer pay to supply him with new, trained workers so he wouldn't have to raise his wage rates!

Seriously, did a bunch of technocrats get together, study the whole solar industry, and come to the conclusion that solar installation skills were the keystone problem holding back the whole industry?  Of course not.  The solar industry will sink or swim based on panel costs and efficiencies.   What happened is someone said, "well, the public always seems to like job training programs.  Those poll well."  And then they called the solar crony association or whatever it is called, and they said, "sure, we would love to have taxpayers pay some of our training costs.  Thanks, we will be very supportive."  And then someone said, "well, won't the Republicans pitch a fit over this?"  And then someone had the brilliant idea of making it a veterans program -- "Republicans love soldiers, that will help defuse their opposition."  And an expensive crony giveaway was born.

About 5 years ago I said I would be willing to accept a carbon tax whose proceeds were used to reduce various labor tax rates (e.g. social security).  Substituting an energy consumption tax for a tax on labor was probably at least neutral and maybe even a net positive.

Now, I want to come back to that idea.  I don't believe any more than I did then that CO2-driven global warming will be catastrophic.  In fact, I am more confident than ever that while CO2-induced warming is a reality, the sensitivity of temperatures to CO2 levels is relatively low.  But please, I am willing to fully support a carbon tax that offsets some other existing tax if only we will stop all this stupid crony useless green energy stuff.  At least with a carbon tax, the markets will reduce fossil fuel use in the most efficient ways possible.  As opposed to programs like this one that will reduce fossil fuel use not at all but will cost a lot of money.

On Funding and Bias in Climate

I really, really did not want to have to write yet another post on this.  99+% of all climate funding goes to alarmists rather than skeptics.   Greenpeace laments Exxon's donations to skeptics of a million dollars or so and wants to drive out all such funding, while Greenpeace and Tides and the US Government are giving literally billions to alarmists.  Despite this staggering imbalance, the only stories you ever see are about the dangers and bias introduced by the measly 1% skeptics get.  I guess that 1% is spent pretty well, because it sure seems to have people running in circles declaring the sky is falling.

One would think that at some point the world would wake up and realize that criticizing the funding sources behind an individual does not actually rebut that individual's arguments.

Potential biases introduced by funding sources (or some other influence) are a pointer -- they are an indication there might be a problem warranting deeper examination of the evidence introduced and the methodology of collecting that evidence.  Such potential biases are not themselves evidence, and do nothing to rebut an argument.  A reasonable way to use such biases in an argument would be something like:

I want to begin by noting that Joe may have had a predisposition to his stated conclusion even before he started because of [funding source, political view, whatever].  This means we need to very carefully look at how he got to his conclusion.  And I intend to show you that he made several important errors that should undermine our acceptance of his conclusions.  They are....

Unfortunately, nowadays people like the New York Times and our own Arizona Representative Raul Grijalva seem to feel like the job is done after the first sentence.  They have decided that the best way to refute recent scientific work by a number of climate scientists is to try to show that some of their funding comes from fossil fuel companies.

Beyond the strange implicit assumption that fossil fuel funding would automatically "disprove" a research paper, there is also an assumption that oil company funding is "unclean" while government or non-profit environmental group funding is "clean".  Remember the last time you saw a news story about a climate alarmist's funding?  Yeah, neither do I.

There is no justifiable reason for this asymmetry.  Funding does not potentially introduce bias because it is sourced from for-profit or non-profit entities.  In fact, the motivation of the funding source is virtually irrelevant.  The only relevant questions related to bias are:

  1. Did the funding source demand a certain conclusion at the outset of the study as the price of the funding -- or --
  2. Is there a reasonable expectation that the source would deny future funding if the conclusions of the study don't go their way?

My sense is that #1 is rare and that the real issue is #2.

But #2 is ubiquitous.  Sure, if a group of skeptical scientists suddenly started writing papers about 8 degree warming predictions, Chevron is going to be less likely to fund their future research.  But on the flip side if Michael Mann suddenly started saying that future warming will only be a modest 1-2 degrees, do you think that he would continue to get funding from Greenpeace, the Tides Foundation, the WWF, or even from an Obama-run Federal agency?  No way.   There is absolutely no less bias introduced by Chevron funding than from Greenpeace funding, because in each case there can be a reasonable fear by the researcher that future funding would be denied by that source if the "right" answer was not reached.

Postscript & Disclosure of Biases:  I have never received any outside funding for this blog or my climate work.  However, if Chevron were to send me a check for a million dollars, I would probably cash it.  I do own individual shares of ExxonMobil stock as well as shares of the Vanguard S&P500 index fund, which includes equities of a number of energy companies.  I also am a frequent purchaser of gasoline and electricity, as well as a number of other products and services whose prices are tied to energy prices (e.g. air transportation).  As a consumer, I would rather not see the prices of these products rise.  I buy a lot of food, whose price might be improved by longer growing seasons.  My camping company tends to benefit from rising gasoline prices, because rising prices cause people to stay closer to home and camp at the type of places we operate.  It is hard to predict how regional climates will change if overall global temperatures rise, but since many of my campgrounds are summer escapes at high altitude, they would probably benefit somewhat from rising temperatures.  I own a home in Arizona whose value would probably be lessened if the world warmed 2-3 degrees, because that would make winters in the northeast and midwest more bearable and thus hurt Arizona as a location for a winter second home.  Global warming may reduce the life of my dog, as we are less likely to walk her when it is over 100 degrees out, which makes her less healthy.  I own land in Hawaii that might be more valuable if sea level rise puts it 6-8 inches closer to the ocean.  I am planning a vacation to see the tulips bloom in Holland, and changes in climate could shift the blooming date and thus cause me to miss the best colors.  Fifteen years from now my daughter would like a June wedding, and changes to climate might cause it to rain that day.  
My daughter also owns 5 shares of Walt Disney and their earnings might be helped by global warming as nostalgia for cooler weather could greatly increase DVD sales of "Frozen".

Competition via Influencing Government

I have mentioned a number of times my chicken-or-the-egg arguments with Progressives over the solution to cronyism.  Is the problem that government power exists to influence markets, and as long as it exists people will bid to control it?  Or is it possible to wield massive make-or-break government power over industry rationally, with only the rank immorality and corrupt speech of corporations standing in the way?  The former argues for a reduction in government power, the latter for more regulation of corporations and their ability to participate in the political process.

I believe this is an example in favor of the "power is inherently corrupting" argument.  No corporation lobbied for NOx rules on diesel engines -- they all fought them tooth and nail.  But now that these regulations exist, engine makers are all trying to use them to gut their competition:

In 1991, the EPA ignored complaints from several makers of non-road engines that rivals were cheating, in order to save fuel, on emissions rules for oxides of nitrogen (NOx). Then environmental groups took up the same complaint, whereupon the agency demanded face-saving consent decrees with numerous engine makers, including two Volvo affiliates.

In essence, the engine makers apologized by agreeing in 1999 to accelerate by a single year compliance with a new emissions standard scheduled to take effect in 2006.

Meanwhile, with another NOx standard looming in 2010, Navistar sued the EPA claiming rival engine-makers were seeking to meet the rule with a defective technology. In turn, Navistar’s competitors sued claiming the EPA was unfairly favoring a defective technology pursued by Navistar (these are only the barest highlights of what became a truck-makers’ legal holy war).

While all this was going on, a Navistar joint-venture partner, Caterpillar, complained that 7,262 Volvo stationary engines made in Sweden before 2006 had violated the 1999 consent decree. Now let’s credit Caterpillar with a certain paperwork ingenuity: The Volvo engines were not imported to the U.S. and were made by a Volvo affiliate that wasn’t a party to the consent decree. EPA itself happily certified the engines under its then-current NOx standard, only changing its mind four years later, prodded by a competitor with a clear interest in damaging Volvo’s business.

To complete the parody, a federal district court would later agree that the 1999 consent terms “do not clearly apply” to the engines in question, but upheld an EPA penalty anyway because Volvo otherwise might enjoy a “competitive advantage” against engines to which the consent decree applied.

As a side note, this is from the "oops, nevermind" Emily Litella School of Regulation:

Let it be said that the EPA’s NOx regulation must have done some good for the American people, though how much good is hard to know. The EPA relies on dubious extrapolations to estimate the benefits to public health. What’s more, the agency appears to have stopped publishing estimates of NOx pollution after 2005. Maybe that’s because the EPA’s focus has shifted to climate change, and its NOx regulations actually increase greenhouse emissions by increasing fuel burn.