A new study in the journal Nature uses a computer model to tell us not only how much oil, natural gas, and coal can be used over the next forty years, but also from which countries this energy can be harvested, in order to limit global warming by the year 2100 to 2 degrees Celsius. Beyond showcasing the utter hubris involved in today’s climate change policy debate, the study blows two holes in the standard case for a carbon tax: First, according to the latest IPCC report, pursuing the 2°C target will probably impose greater economic costs than the environmental benefits it delivers. Second, proponents of a carbon tax have been lecturing us for years that it is a “market solution” in contrast to top-down central planning. That’s why it’s so revealing to see people grab hold of the new Nature study to justify bans on the Keystone Pipeline and on development of unconventional oil and gas resources.

Summarizing the New Study

To give context to the present post, let me quote from Nature’s “Editor’s summary” of the new study:

If global warming is to be limited in this century to the much-publicized 2°C rise compared to pre-industrial levels, fossil fuel use and the associated release of greenhouse gases will need to be severely limited. This raises questions regarding the specific quantities and locations of oil, gas and coal that can be safely exploited. Christophe McGlade and Paul Ekins use an integrated assessment model to explore the implications of the 2°C warming limit for different regions’ fossil fuel production. They find that, globally, a third of oil reserves, half of gas reserves and over 80% of current coal reserves should remain unused during the next 40 years in order to meet the 2°C target and that the development of resources in the Arctic and any increase in unconventional oil production are incompatible with efforts to limit climate change.

To repeat myself, the most obvious reaction to such declarations is astonishment at the hubris involved. In particular, the “integrated assessment models” that these researchers and others use are in their infancy.

Back in 2013, MIT professor (and supporter of a carbon tax) Robert S. Pindyck published a paper in the prestigious Journal of Economic Literature that was absolutely scathing in its review of such computer models. The title of his paper was, “Climate Change Policy: What Do the Models Tell Us?” Here is the abstract of his paper:

Very little. A plethora of integrated assessment models (IAMs) have been constructed and used to estimate the social cost of carbon (SCC) and evaluate alternative abatement policies. These models have crucial flaws that make them close to useless as tools for policy analysis…[T]he models can tell us nothing about…the possibility of a catastrophic climate outcome. IAM-based analyses of climate policy create a perception of knowledge and precision, but that perception is illusory and misleading.

In light of such flaws in the computer models, it is astounding that we are seriously considering their projections of the percentage of acceptable fossil fuel development broken down by geographical location. For example, an E&E article by Christa Marshall on the new study reports that “about 95 percent of the coal reserves from Russia, Australia and the United States need to remain unburned,” and that “[a]bout 40 percent of Middle Eastern oil reserves should stay untapped, while the United States could use more than 90 percent of its oil, according to the paper.”

The study’s projections of which reserves of natural gas, coal, and oil should be used are based (in part) on estimates of the economics of various projects. But this approach is just as flawed as using IAMs for climate policy, once again giving the false appearance of knowledge and precision. After all, not too long ago nearly everyone thought extracting natural gas and oil from shale rock was prohibitively expensive. Ten years ago, everyone thought the U.S. needed to quickly build liquefied natural gas import terminals; now companies are building liquefied natural gas export terminals. Very few people correctly predicted even five years ago that a boom in U.S. oil production would help dramatically cut the price of oil. Indeed, very few people predicted a year ago that the price of oil today would be $50 a barrel rather than $100. Projecting project economics 35 years into the future has little basis in reality, yet such projections are a crucial input to the new study.

But beyond the hubris, things are even worse for the authors: The entire premise of the study—namely, that humanity needs to limit global warming by the year 2100 to 2°C—is itself not supported by the latest IPCC report.

Don’t Kid Yourself: Interventionists Don’t Care About the “Consensus Science”

In a previous IER post, I methodically drew on each of the three main components of the latest IPCC report to show that the “much-publicized” (the term used by the editor at Nature) goal of 2°C in total warming was not an empirical conclusion flowing from the very climate and economic models that the IPCC surveyed. On the contrary, this goal is a focal point on which environmentalists and other supporters of government intervention in energy markets have converged, but it is largely arbitrary. Here is what I wrote in the introduction to my previous post:

“Before I dive into the details, let me outline my overall strategy: I am going to show that the IPCC’s own projections show that the likely economic costs of aggressive government action to slow emissions will be higher than the likely reduction in climate change damages that this policy action will yield. In other words, I will show that the IPCC’s middle-of-the-road estimates (for their best guess as to what will actually happen) show that the costs of government action to mitigate climate change are higher than the benefits.”

To be sure, there are computer models that show catastrophic results if the climate passes a “tipping point,” and this is the rationale for adopting a hard ceiling on temperature increases. But if you focus on the most likely outcomes and weigh the costs and benefits of various mitigation policies, then the IPCC report itself leads you to conclude that the 2°C target is not worth the damage it would inflict on the global economy. This is the inconvenient truth of what “the consensus” says when we use the peer-reviewed literature to determine what the climate goal ought to be.

So Much for a Decentralized “Market Solution”

Another lesson from the media flurry over the new Nature study is that proponents of a carbon tax should stop telling us that it is a “market solution” that will allow officials to avert the threat of climate change while respecting the limits of central planners’ knowledge. On the contrary, the most vocal environmentalists have no plans to make concessions on “green” subsidies, the Keystone Pipeline, EPA power plant rules, fuel economy standards, Renewable Portfolio Standards, or other top-down prohibitions and mandates. The fact that so many “green” websites are trumpeting the new Nature study shows that they are quite confident in their ability to micromanage the global energy industry and to declare entire fuel sources off-limits, telling us where we can and cannot have energy development.

Conclusion

The new Nature study, estimating what percentage of various energy sources from various regions can safely be used, epitomizes all that is wrong with today’s climate change policy debate. The study assumes that current trends will continue, allowing it to make projections decades into the future that will almost certainly turn out to be false. Beyond the obvious hubris of the project, the article’s premise is misguided according to the very IPCC “consensus” that proponents of government intervention have been telling us should be the arbiter of disputes. Furthermore, the glee with which the results have been embraced shows that it is foolish for anyone to believe that a carbon tax would be a “market solution” that humbly keeps allocation decisions out of the hands of political officials.
