Welcome to Energy Townhall, the place where IER's experts converge to provide timely analysis and commentary on all things energy. Join the conversation.

Oil and Gas Production on Federal Lands Still a Disappointment

Posted April 24, 2014

In fiscal year 2010, 36 percent of our nation’s oil production took place on federal lands. Due to Obama Administration policies, by 2013, only 23 percent of our nation’s oil production took place on federal lands. Production on non-Federal lands, in contrast, is skyrocketing as hydraulic fracturing and horizontal drilling have increased production dramatically. Oil production on non-Federal lands increased by 21 percent in fiscal year 2013 from fiscal year 2012 levels—an increase of almost one million barrels per day.[i]

Likewise, natural gas production on Federal lands has been steadily declining, while natural gas production on non-Federal lands has been steadily increasing. In fiscal year 2013, natural gas production on non-Federal lands increased by 3 percent—654 billion cubic feet—and natural gas production on Federal lands declined by 9 percent—392 billion cubic feet.

Data from Fiscal Year 2009

Looking at data farther back provides an even more dramatic indication that Obama Administration policies are hurting oil and gas production on Federal lands, while oil and gas production on non-Federal lands is skyrocketing. According to a report by the non-partisan Congressional Research Service, oil production on federal lands fell 6 percent between fiscal years 2009 and 2013, while over the same period oil production on state and private lands increased by 61 percent. As a result of these large increases, crude oil production on state and private lands grew by 2.1 million barrels per day over the four-year period, more than Algeria, Libya, Qatar, or Norway each produced in 2012.


Likewise, natural gas production on federal lands decreased by 28 percent between fiscal year 2009 and fiscal year 2013, while natural gas production on non-federal lands increased by 33 percent over the same time period.


Source: CRS

Obama Administration Policies

The Obama Administration has increased the time it takes to obtain a permit to drill on Federal lands, driving oil and gas companies to private and state lands on which to make their investments. After a company has obtained a lease, it must also obtain approval of an Application for Permit to Drill (APD) for each oil and gas well. Although the Energy Policy Act of 2005 provided a new, improved timeline for review, it took an average of 307 days to approve or deny an APD in 2011, 89 days (41 percent) longer than the 218-day average in 2006. In its budget justifications, the Bureau of Land Management indicated that overall processing times per APD have increased because of the complexity of the process, a process the Bureau itself controls.

Since fiscal year 2011, the Bureau of Land Management has improved the time it takes to process a permit. But the average processing time during the Obama Administration is still 18 percent longer than it was during the five preceding fiscal years. (See chart below.)


In contrast, the process on private and state lands is relatively quick. State agencies process drilling permits on private lands, with some states approving permits within 10 business days. On private lands, states allow some surface-management issues to be negotiated between the oil producer and the individual land/mineral owner. To see the success of hydraulic fracturing and shale oil development on non-Federal lands, one only needs to look at North Dakota, the second-largest oil-producing state in the nation behind Texas. North Dakota has an unemployment rate of just 2.6 percent, compared to 6.7 percent for the nation. Most of the land in North Dakota is not controlled by the Federal government.[ii]

Besides the enormously long permit-processing times, the number of leases on federal lands is down dramatically under the Obama Administration, as the chart below shows. The average number of onshore leases the Bureau of Land Management issued during the Obama Administration is less than half the average issued by the Clinton Administration and over a third less than the average issued by the Bush Administration. In fiscal year 2013, 2,278 fewer leases were issued than in fiscal year 2006 (1,468 leases in FY 2013 compared to 3,746 in FY 2006).


There are numerous benefits to opening Federal lands to oil and gas development. An IER study shows that opening federal lands and waters to exploration and production would increase federal tax revenue by $24 billion annually over the next seven years and $86 billion annually thereafter, while state and local governments would receive $10.3 billion in annual tax revenue over the next seven years and $35.5 billion annually thereafter. Over 37 years, cumulative tax revenues would reach $2.7 trillion for the federal government and $1.1 trillion for state and local governments.

Further, the economy would benefit by $127 billion annually for the next seven years, and $450 billion annually in the long run. Most impressively, opening federal lands could produce a cumulative increase in economic activity of up to $14.4 trillion over 37 years. And the ripple effect of that boom would be 552,000 jobs gained annually over the next seven years, with annual wage increases of up to $32 billion, and 1.9 million jobs annually in the long run, with annual wage increases of $115 billion.[iii]


Oil and natural gas production on private and state lands is skyrocketing, while production on Federal lands is in decline and has been throughout the Obama Administration years. The Obama Administration is offering less land for lease and taking longer to process permits to drill. As a result of the red tape, oil and natural gas producers are turning to private and state lands to make their investments. As Rep. Ed Whitfield (R-Ky.), chairman of the energy and power subcommittee, said: “While President Obama has been anxious to take credit for increased oil and gas production, the only areas he is responsible for is on federal lands — the only areas where oil and gas production is actually decreasing.”[iv]

[i] Congressional Research Service, U.S. Crude Oil and Natural Gas Production in Federal and Non-Federal Areas, April 10, 2014.

[ii] Forbes, Obama Stymies Oil and Natural Gas Production on Federal Lands, April 17, 2014.

[iii] Institute for Energy Research, Economic Effects of Immediately Opening Federal Lands to Oil and Gas Leasing: A Response to the Congressional Budget Office, February 2013.

[iv] The Hill, Oil, gas production drops on federal property, April 16, 2014.


AWEA’s Bold Push for More Wind Welfare

Posted April 23, 2014

The American Wind Energy Association (AWEA) is making an all-out effort to convince Congress to renew the wind production tax credit (PTC), the wind industry’s lucrative subsidy that expired at the end of 2013. AWEA is desperate to revive the PTC and, unfortunately, its most recent lobbying push relies heavily on misinformation and half-truths in order to divert attention away from the PTC’s many critics.

To set the record straight, this article addresses some of AWEA’s flawed arguments and glaring omissions. The PTC, while incredibly valuable to owners of wind power facilities, hurts U.S. taxpayers and undermines the economic efficiency and physical reliability of the U.S. power grid.


AWEA is a well-funded and well-organized industry association with 40 years of experience influencing public policy and an annual budget of more than $30 million. Perhaps due to AWEA’s skilled lobbying efforts, four different administrations and countless lawmakers have sided with AWEA and provided the wind industry a direct hand-out from American taxpayers.

Initially signed into law by George H. W. Bush as part of the Energy Policy Act of 1992, the PTC has expired and been renewed multiple times. Each renewal lasted only a short period, designed to extend the industry’s coveted subsidy for just one or two more years. Most recently, the PTC was extended through the 2013 calendar year as part of the “fiscal cliff” legislation passed in early 2013. A PTC extension for 2014 recently passed the Senate Finance Committee after being added to a tax extenders package by one of the wind industry’s most enthusiastic supporters, Senator Chuck Grassley. The Joint Committee on Taxation projects that a one-year extension of the PTC will cost American taxpayers over $6 billion.

The Institute for Energy Research (IER) has consistently argued against the PTC and highlighted its negative effects, which range from threatening grid reliability to redistributing federal tax dollars to a minority of U.S. states.

AWEA and Exelon Spar Over the PTC

As part of AWEA’s push to renew the PTC, it recently published a 28-page report that attempted to show that the PTC does not distort electricity markets and does not harm nuclear plant owner-operators. The policy report comes as a direct response to Exelon Corporation, the owner of the largest fleet of nuclear plants in the U.S. The issue at the center of the policy debate is “negative pricing.”

What is Negative Pricing?

Unlike the stable and predictable price of electricity at the retail level, market prices for wholesale electricity can fluctuate widely throughout the day—usually referred to as on-peak and off-peak prices—and across seasons. For example, wholesale prices tend to range between $30 and $50 per megawatt-hour but can drop into the negative or spike well above $500 per megawatt-hour. When the price becomes negative, electric generators are actually paying the grid to take their electricity. Several factors influence wholesale prices, namely supply, demand, and transmission constraints. Fundamentally, negative wholesale prices send a distress signal to markets that the supply and demand balance on the grid is economically unsustainable and suppliers need to reduce their output.

Why do sellers not drop out of the market when negative pricing occurs? As the Energy Information Administration (EIA) notes, “negative prices generally occur more often in markets with large amounts of nuclear, hydro, and/or wind generation.” That is because each of these technologies has an incentive to continue operating even when its facilities are temporarily paying the grid to take their power.

Nuclear plants are designed to run at full output and not “ramp” up and down, making them very reliable but inflexible. In times of very low demand, nuclear plants will sometimes take negative prices rather than go through the long and expensive process of lowering their output. Similarly, hydroelectric plants sometimes take negative prices in power markets because they are forced to run in order to comply with environmental requirements that force them to release water, regardless of whether the electricity is needed.

Unlike nuclear or hydro producers, the wind industry actually profits from negative prices because the PTC is such a large subsidy. Wind producers receive PTC payments per unit of power produced (even when the power has no value whatsoever to the grid), so they flood the grid with uneconomic power and ignore the distress signal sent by negative prices. Specifically, wind producers are paid the equivalent of $35 per megawatt-hour in PTC subsidies, so a wind producer taking the PTC can still profit while paying the grid to take its electricity. Wind’s inflexibility in the face of negative prices is therefore a policy problem with a policy solution (let the PTC expire), not a matter of physics or environmental restrictions.
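The arithmetic behind this incentive can be sketched in a few lines. The $35-per-megawatt-hour figure is the pre-tax-equivalent PTC value cited above; the specific wholesale price points are illustrative, not data from any particular market:

```python
# Sketch of the incentive described above: a wind producer earning a
# per-MWh subsidy can profit even when wholesale prices turn negative.

PTC_SUBSIDY = 35.0  # $/MWh, pre-tax equivalent of the PTC cited above


def net_revenue_per_mwh(wholesale_price):
    """Net revenue per MWh for a PTC-eligible wind producer."""
    return wholesale_price + PTC_SUBSIDY


# At a wholesale price of -$20/MWh, the producer pays the grid $20
# per MWh but still nets $15/MWh after the subsidy:
print(net_revenue_per_mwh(-20.0))   # 15.0

# The producer only starts losing money below -$35/MWh:
print(net_revenue_per_mwh(-40.0))   # -5.0
```

This is why a wind plant can rationally keep generating through a price signal that tells every unsubsidized generator to shut down.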

The threat to baseload generation from negative prices is very real. Already, Dominion closed its Kewaunee Nuclear Plant in Wisconsin 20 years ahead of schedule and Entergy plans to retire its Vermont Yankee Nuclear Plant at the end of this year. Both companies cited economic considerations as the reason for closing the plants. While it is true that low-cost natural gas is partially responsible, it is also clear that artificially low prices caused by the PTC during off-peak hours played a role. In fact, the Department of Energy’s assistant secretary for nuclear energy referred to this emerging pattern of nuclear plants shutting down early as “a trend we are clearly very, very concerned about.”

Exelon’s Argument

Exelon argues that the PTC wreaks havoc on baseload or “around-the-clock” generation such as nuclear power by encouraging negative prices in wholesale electricity markets. In contrast to baseload units, electricity production from wind peaks at night and in the early morning when electricity demand is low, which contributes to a situation of over-supply. A 2012 study commissioned by Exelon maintains that PTC-related negative prices harm baseload power and grid reliability because they “directly conflict with the performance and operational needs of the electric system.” Essentially, if the PTC is extended, it will induce more negative pricing events during off-peak hours, and make more baseload units uneconomic. In other words, the PTC perpetuates a system of predatory negative prices that attack reliable (and far less subsidized) baseload producers.

The power grid reliability implications are straightforward. The PTC is making reliable generation uneconomic, while subsidizing unreliable wind power. Without reliable generation up and running, many regions will struggle to meet seasonal peak demand in winter or summer. For most of the country, the highest peaks occur in the summer months. The following chart from a study on the intermittency of wind power illustrates just how little wind contributes to those summer peaks.

[Chart: ERCOT Wind Power]

On these arguments against the PTC, IER is not alone—energy experts across the board agree with Exelon. The Congressional Research Service (CRS) acknowledged the problem of negative pricing, noting in 2012 that “[n]egative power prices associated with wind power might generally occur at night when wind is producing at high levels. Large amounts of wind power generation can potentially contribute to transmission congestion and result in negatively priced wholesale power in certain locations.” The EIA also specifically lists the PTC as a cause of negative prices.

The same CRS report from 2012 outlined the reliability issues associated with wind, predicting that “should wind power continue to experience growth, it is uncertain whether current [regional transmission organization] market designs would function to ensure availability of the types of generation that would be necessary to both maintain resource adequacy and manage the variable and intermittent nature of wind power.”

Last December, the New York Times published an article about how wind and nuclear power “are trying to kill each other off” and noted the “cannibal behavior” of wind in power markets.

Focusing on Texas, which is the U.S. market hit hardest by wind power, Public Utility Commission Chairman Donna Nelson testified in 2012 that “[t]he market distortions caused by renewable energy incentives are one of the primary causes I believe of our current resource adequacy issue… [T]his distortion makes it difficult for other generation types to recover their cost and discourages investment in new generation.” And as the non-partisan Center for Strategic and International Studies wrote in May of 2013, “[a] growing number of analytical reports…point to the negative impact of renewable energy mandates and subsidies (direct and indirect) on the competitiveness of nuclear power.”

In fact, some environmentalists are troubled by wind power’s parasitic effect on nuclear power. James Hansen’s observation relating to a similar policy—renewable portfolio standards—actually underscores Exelon’s argument regarding the PTC:

The asymmetry finally hit me over the head when a renewable energy advocate told me that the main purpose of renewable portfolio standards (RPS) was to “kill nuclear”. I had naively thought that the purpose was simply to kick-start renewables. Instead, I was told, because utilities were required to accept intermittent renewable energies, nuclear power would become less economic, because it works best if it runs flat out.

In short, the predatory pricing enabled by the PTC is real, it is harmful to reliable generation, and it hits nuclear generation the hardest. AWEA cannot shrug off the harmful effects of the PTC or pretend they do not exist. As an Exelon executive said recently, “[w]e can work with AWEA on a clean energy future but we can’t deny the truth.”

AWEA’s Fuzzy Math

AWEA’s policy report, titled “The facts about wind energy’s impacts on electricity markets: Cutting through Exelon’s claims about ‘negative prices’ and ‘market distortion,’” attempts to turn the negative pricing arguments on their head by narrowly focusing on the wind industry’s side of the story. Specifically, AWEA flatly misrepresents the effect of the PTC on wholesale markets by omitting important information and making bogus comparisons.

AWEA claims the impact of wind on wholesale markets is “entirely market-driven” and “widely seen as beneficial.” The first claim is patently false and the second is very misleading.

No one at AWEA can claim with a straight face that the growth in the wind industry is “entirely market-driven.” AWEA spends millions of dollars a year lobbying for renewable energy mandates in the states and for the PTC and other support at the federal level. If wind were truly “market-driven,” there would be no need for AWEA’s massive lobbying effort for mandates and subsidies. The mandates and subsidies AWEA supports are the exact opposite of “market-driven.”

AWEA knows better than any other organization just how much government support the wind industry receives—support that simply does not exist for baseload generation and should not exist for any power generation source whatsoever. Because of AWEA’s lobbying efforts to mandate the use of their product, 29 states and the District of Columbia mandate certain levels of renewable energy generation (these laws are commonly called Renewable Portfolio Standards or RPSs). Because the vast majority of the power being used to satisfy these requirements comes from wind plants, the wind industry currently enjoys a government-mandated market share. This alone is enough to discredit AWEA’s comment about Exelon obscuring the “real story of wind energy successfully competing against more expensive forms of energy in the market.” AWEA knows the wind industry is winning on government support, not the free market.

State-level mandates aside, AWEA attempts to downplay the role of the PTC specifically in undermining baseload generation. It is vitally important to realize that negative prices are not the only indicator of market distortion. AWEA draws a false dichotomy in its report between the “real economic savings” from wind and the “exceedingly rare” negative prices that cause market distortions. Here, AWEA downplays the possibility that market distortion can exist without negative prices. But just as the PTC subsidy causes negative prices at the extreme, it regularly causes artificially low power prices in off-peak hours that can be just as damaging to baseload generation.

AWEA then makes the stretch that, because the negative pricing problem was less rampant in 2013 than it was in 2012, market distortions from the PTC no longer exist or are “extremely rare.” This argument is fatally flawed as demonstrated by the following analogy. Consider if a thief said, “I didn’t do anything wrong in 2013. I only stole half as often as I did in 2012.” Such a statement would be silly because theft is theft. The same is true of harmful market distortions.

Just because there were fewer hours in 2013 with negative prices, it does not follow that the PTC is any less of a problem. Even in a world where prices were never to fall below zero, market distortion caused by the PTC could still render baseload units uneconomic. For example, reliable power plants would still close if prices were consistently at or very near zero. As discussed above, this is what we are seeing in practice, AWEA’s distractions notwithstanding.

Also, the 2012 data are so bad that 2013 was bound to be a less damaging year—in fact, one of Exelon’s plants took negative prices for 8.3 percent of all hours in 2012. The fact that this statistic fell to 4.3 percent in 2013 is little consolation. Essentially, we can debate the extent to which the PTC continues to cause negative prices, but to recast the PTC as incapable of distorting power markets is disingenuous on AWEA’s part.

Finally and perhaps most disturbingly, AWEA’s report fails to capture any long-term effects of the PTC. For example, in several places the AWEA report talks about wind power “replacing the most expensive and polluting sources of energy.” In practice, wind cannot do this because wind is unreliable. Wind cannot replace the most expensive source of electricity generation because those generation sources only run at peak times. The wind does not blow when AWEA wants it to and millions of dollars spent on lobbying cannot change that simple fact of the physical world.

Furthermore, as James Hansen and others have observed, heavily subsidized wind power is actually displacing zero-emission nuclear power rather than the “most polluting” sources AWEA references. If the goal of the PTC was to wipe out America’s nuclear fleet, then it is succeeding. But if the goal was to support zero-emission generation, then it has backfired miserably. The PTC has wasted billions of taxpayer dollars to replace nuclear, a clean technology that works, with one that only sounds good and is fundamentally unreliable.


The wind production tax credit distorts power markets by allowing wind producers to profit from artificially low prices. Such market distortion undermines the reliability of America’s power grid in the long run by forcing reliable baseload power plants to close—including nuclear plants, which in turn defeats any environmental purpose for keeping the PTC. AWEA’s recent study is a desperate attempt to obscure the very real and worrisome long-term effects of the PTC by relying on misleading data. The PTC has rightly received scrutiny from energy experts across the political spectrum, and it deserves a more comprehensive analysis than AWEA provides in its report.

IER Economist Travis Fisher authored this post.


Reasons to Repeal the Renewable Fuel Standard

Posted April 21, 2014

The Renewable Fuel Standard (RFS) was first put in place by the Energy Policy Act of 2005 and then was more than quadrupled by the Energy Independence and Security Act two years later—both bills signed by President George W. Bush. The RFS requires increasing amounts of biofuels to be blended with transportation fuel such as gasoline. Congress and the Bush Administration created the RFS schedule based on what they thought future demand for transportation fuels would be when they wrote the legislation. But they were dreadfully wrong.

Supporters of the bills in Congress and the Bush Administration thought that transportation fuel consumption would increase year after year, but after 2007 oil consumption plateaued. The requirement to blend ever-increasing amounts of ethanol has now left the nation with a “blend wall,” running up against the 10 percent mark of ethanol blended into gasoline. In other words, the law requires more ethanol to be produced than gasoline vehicles can consume in a 10 percent blend.

This problem has prompted the Environmental Protection Agency (EPA) to propose cutting back the requirement for this year, but it has not finalized the amount. And, of course, the biofuel producers who have benefitted from a mandated market for their products are lobbying EPA to retain the originally enacted number.

But the blend wall problem and Washington’s failure to accurately predict energy supply and demand are not the only reasons to wish the RFS would disappear. Ethanol was also billed as a way to reduce greenhouse gas emissions. Today, however, studies show that greenhouse gas emissions from the entire fuel cycle of corn-based ethanol, the predominant biofuel produced in this country, are not much different from those of petroleum-based transportation fuels. And the other primary justification for the RFS—reducing our use of foreign oil—is going away as the hydraulic fracturing revolution leads to more and more oil production, at least on those lands not controlled by the federal government. The United States is now awash in oil due to shale oil production, and the boom created by hydraulic fracturing and horizontal drilling means that some are now urging that we lift the oil export ban because oil supplies are climbing so rapidly.

The biofuel mandates have pushed food prices up: corn prices are rising from the competition between ethanol producers and food and livestock producers for the corn crop. Since ethanol contains less energy per gallon than petroleum fuels, automobile fuel economy declines. Gasoline prices are up. And owners of small motors (boats, lawn mowers, and the like) have paid for repairs and/or fuel additives to deal with the corrosive properties of ethanol.

On the other hand, the RFS has brought wealth to the American heartland where corn farms and ethanol producers abound.

EPA Proposed Ruling

In November, the EPA issued a proposal to lower the RFS requirement for 2014 from the initial congressional mandate of 18.15 billion gallons of ethanol and biodiesel for blending into gasoline to 15.21 billion gallons—a reduction of 16 percent. In 2013, the RFS requirement was 16.55 billion gallons.[i] Of that total, 13.8 billion gallons was corn-based ethanol and 2.75 billion gallons was advanced biofuel that is not made from corn.[ii] The proposed ruling would reduce the advanced biofuel requirement to 2.2 billion gallons.[iii] Even though it is now the second quarter of 2014, the mandate has not been finalized and is undergoing interagency review at the White House Office of Management and Budget.

The proposed lower RFS level was due to slack U.S. demand for transportation fuel and the approach of the blend wall, the point at which the mandate will require the use of more ethanol than can be blended into the fuel supply at 10 percent per gallon. Oil refiners are reluctant to blend more than 10 percent ethanol into gasoline because of possible harm to older vehicles and because automakers will not warranty vehicles using blends above 10 percent ethanol. This is despite the fact that EPA has authorized the use of gasoline blends with up to 15 percent ethanol content for cars built since the 2001 model year. Those cars represent about two-thirds of vehicles currently on the road.[iv] Notably, even though EPA says it is safe to use ethanol at higher levels, the agency has not stood behind its assurance by offering to pay for any damage the higher mix of ethanol might do to engines.

Another issue stemming from the 2007 legislation is that it also required set levels of cellulosic biofuel—biofuel produced from cellulosic biomass. In 2007, cellulosic ethanol was not economic to produce, and it still is not today. When the Renewable Fuel Standard passed in 2007, lawmakers mandated cellulosic biofuel production of 1.75 billion gallons in 2014. Thus, to comply with the RFS, 437,500,000 gallons of cellulosic ethanol must be produced each quarter. But in the first quarter of 2014, only 72,111 gallons of cellulosic ethanol were produced[1] — a mere 0.016 percent of the amount mandated.
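A quick back-of-the-envelope check of the arithmetic above, using only the figures already cited (the 1.75-billion-gallon annual mandate and the 72,111 gallons actually produced):

```python
# The 1.75-billion-gallon annual cellulosic mandate implies 437.5 million
# gallons per quarter; first-quarter 2014 production was a tiny fraction.

annual_mandate = 1_750_000_000          # gallons, 2014 statutory level
quarterly_mandate = annual_mandate / 4
produced_q1_2014 = 72_111               # gallons actually produced

share = produced_q1_2014 / quarterly_mandate
print(quarterly_mandate)                # 437500000.0
print(f"{share:.3%}")                   # 0.016%
```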

Because only minimal amounts of cellulosic ethanol have been produced, EPA, in its proposal, lowered the cellulosic mandate to 17 million gallons (the current rate of production would yield less than 300,000 gallons of cellulosic ethanol). Obviously, requiring refiners to buy 17 million gallons of cellulosic ethanol when only 300,000 gallons are likely to be available is unrealistic.

KiOR, one of the industry’s leading companies, which makes cellulosic biofuels from woody biomass and non-food feedstocks, announced at the start of this year that it was idling its flagship plant in Columbus, Mississippi, due to lack of funds. KiOR generated net losses of $347.5 million, $96.4 million, and $64.1 million for the years 2013, 2012, and 2011, respectively, and total operating losses of $525.5 million from its inception through December 31, 2013.[v] It has recently obtained $25 million to operate through August.[vi] But it is unlikely to meet EPA’s expected production level of 9 million gallons of the 17 million gallon requirement.[vii]

If refiners do not or cannot meet their renewable fuel requirements, they must purchase Renewable Identification Numbers (RINs), which are renewable fuel credits sold by companies that have used more ethanol than required. The price of RINs increased to over $1 a gallon last year, from about seven cents at the start of 2013, adding to the cost of gasoline.[viii] RIN prices have since declined, but they can easily escalate again if the required levels of biofuels are not available to be blended into petroleum fuels.
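The compliance cost mechanism works roughly as follows. The shortfall volume here is hypothetical, chosen only for illustration; the two RIN prices reflect the roughly seven-cent and one-dollar levels cited above:

```python
# A refiner short of its blending obligation buys one RIN per gallon
# of shortfall, so its compliance cost scales with the RIN price.

def rin_compliance_cost(shortfall_gallons, rin_price):
    """Cost of covering a blending shortfall with purchased RINs."""
    return shortfall_gallons * rin_price


# Hypothetical 10-million-gallon shortfall at early-2013 (~$0.07) versus
# peak-2013 (>$1) RIN prices:
low = rin_compliance_cost(10_000_000, 0.07)
high = rin_compliance_cost(10_000_000, 1.00)
print(f"${low:,.0f} vs ${high:,.0f}")   # $700,000 vs $10,000,000
```

The fourteen-fold swing in cost for the same shortfall is what made 2013's RIN price spike so painful for refiners.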

Biofuels Can Hurt Rather Than Help the Environment

Growing biofuel crops on a large scale requires converting agricultural land used for food crops, clearing forested land, or draining wetlands to free up land for production. Destroying forests can offset the reductions in carbon emissions from using biofuels. Besides driving deforestation, biofuel production requires water, which stresses existing water supplies.[ix]

In a recent report, the United Nations warns that growing crops to make biofuel harms the environment and drives up food prices, and also indicated that biofuels, rather than combating the effects of global warming, could make them worse. In the summary for policymakers, the UN states: “Increasing bioenergy crop cultivation poses risks to ecosystems and biodiversity.” The report from the Intergovernmental Panel on Climate Change notes: “If production [of biofuels] is not carefully managed, biofuel feedstocks can displace land for food cropping or natural, unmanaged ecosystems.” Referring in part to deforestation, it says any benefit of biofuel production on carbon emissions “may be offset partly or entirely for decades or centuries by emissions from the resulting indirect land-use changes”. On biofuel production from corn, it adds: “Resulting increases in demand for corn contribute to higher corn prices and may indirectly increase incidence of malnutrition in vulnerable populations.”[x]

Corn Usage and Food Prices

About 40 percent of the corn crop in the United States, about 5 billion bushels, is used for ethanol production.[xi] The increased demand has raised total corn acreage planted from 78 million acres in 2006 to 97 million acres in 2012.[xii] A study by the National Council of Chain Restaurants found that the RFS increased food costs for chain restaurants by $3.2 billion a year because it raised the costs of poultry, beef, pork, and other agricultural products. Per restaurant, the RFS increased food costs by $18,000 a year, and by as much as $35,000 annually if the restaurant sells more beef than average.[xiii]

GAO Report on Petroleum Refining

A recent report by the Government Accountability Office (GAO) found that the delay in issuing revised RFS levels puts financial strain on the refining industry.[xiv] The GAO recommends that the EPA adopt a plan for the timely release of the standards, providing refiners with the amount of biofuels they must blend into gasoline. According to the GAO, the EPA has missed its deadline every year since 2009, i.e., in almost all years of the Obama Administration. According to the report, the rising biofuel content required each year under the RFS means higher costs for refiners, and delays in the release of the revised levels lead to uncertainty that could hinder refining-industry investment.[xv]

Long-Term Demand for Biofuels

Demand for transportation fuels is expected to remain slack due to the new Corporate Average Fuel Economy (CAFE) standards that the Obama Administration has put into effect and the driving patterns of the younger generation, among other policies and trends. A study by Bloomberg New Energy Finance, for example, expects gasoline demand in California to drop by 9 percent or 13 percent by 2020, depending on the scenario, due to federal CAFE standards, the RFS, and California’s Low Carbon Fuel Standard.[xvi] Nationwide, gasoline demand dropped by 10 percent between 2005 and 2013. Cars are also getting much more expensive, which may be reducing the availability of personal transportation for a growing number of Americans.

If the goal is to increase the long-term demand for ethanol, two options exist: raising the blending limits, as EPA has proposed, or establishing a market for vehicles fueled by higher ethanol blends. Raising the blending limits is a non-starter because of automobile warranty restrictions and the risks to small engines. Similarly, expanding the market through other fuel types, such as E85 (a blend of 85 percent ethanol and 15 percent gasoline), requires consumer interest, expansion of the fueling infrastructure, and fuel favorability over its competition (i.e., electricity and hydrogen).[xvii]


Whether sourced from corn or cellulosic materials, domestic demand for ethanol will decrease without a mandate or an expanding market for higher-blend fuels. But keeping the RFS mandate means lower automobile efficiency, higher fuel and food prices, more water and land usage, deforestation, draining of wetlands, and no net benefit for greenhouse gas emissions. Once again, it seems that when Washington passes an energy law, the only thing certain is that it will become a law of unintended consequences.


[1] Technically, these are Renewable Identification Numbers, not gallons. But the number of RINs is higher than the number of gallons of ethanol.

[i] The Hill, Oil lobby optimistic on repealing renewable fuel standard, March 20, 2014,

[ii] Greenwire, EPA delays, ethanol credits take center stage in RFS court battle, April 7, 2014,

[iii] The Advocate, Louisiana’s piney hills may hold key to growth of green fuels industry in state, March 22, 2014,

[iv] Global Post, U.S. EPA head defends proposed cuts in biofuel target for 2014, March 27, 2014,

[vii] Platts, Cellulosic biofuels’ woes could spell trouble for the RFS, March 21, 2014,

[viii] Wall Street Journal, U.S. Ethanol Mandate Puts Squeeze on Oil Refiners, March 10, 2013,

[ix] National Geographic, Milestone IPCC Climate Report Shifts on Biofuels, April 1, 2014,

[x] Telegraph, Biofuels do more harm than good, UN warns, March 23, 2014,

[xi] AgriPulse, Vilsack urges EPA caution in revising RFS, February 28, 2014,

[xii] Platts, Upping the Ante: US ethanol’s leadership sets out its stall after 2013’s setbacks, February 27, 2014,

[xiii] The Hill, Rep. Welch: Ethanol mandate ‘killing’ farmers, April 10, 2014,

[xiv] Government Accountability Office, Petroleum Refining: Industry’s Outlook Depends on Market Changes and Key Environmental Regulations, March 2014,

[xv] San Antonio Business Journal, Delays in the Renewable Fuel Standards hurts refiners like Valero, feds say, April 15, 2014,

[xvi] Ethanol Producer, Biofuels, other factors to lower fuel demand in Calif., March 21, 2014,

[xvii] The Motley Fool, The Future of Ethanol Demand, February 27, 2014,


Are Climate Change Mitigation Policies a Form of Insurance?

Posted April 17, 2014 | folder icon Print this page

In a previous post, I walked through one of the take-away messages from the latest IPCC report: Using middle-of-the-pack projections, the likely damages from climate change are actually less than what reputable studies estimate as the costs of government action to curb carbon dioxide emissions, such as a carbon tax or cap-and-trade scheme. In other words, even stipulating the entire IPCC framework and numbers, one can make a strong case that “on average” the various proposals to tax and regulate emissions would be a cure worse than the disease: They would cost more in terms of forfeited economic growth than they would save in terms of reduced climate change damages. Far from being a slam-dunk case as it is often portrayed in the media, aggressive government action to slow emissions does not pass a standard cost/benefit test.

Precisely because advocates cannot justify their desired proposals on conventional grounds, they have shifted their rhetoric. Climate change mitigation policies are now routinely described as a form of “insurance,” which admittedly may end up costing more than they’re worth, but will give peace of mind by preventing truly catastrophic outcomes. In the present post I’ll explain the shortcomings of this analogy, and show that it doesn’t justify the aggressive policies that many advocate in spite of their collapsing case for alarmism.

Insurance in the Market

Before examining climate change, let’s first get our bearings on regular insurance. For example, suppose there are many houses in a certain neighborhood that each have a total value of $500,000. Every year, there is a 1 in 1,000 probability that a given house will burn down. Thus the “expected” loss from fire is $500 per house per year. If a company offered fire insurance policies that provided full indemnification, the “actuarially fair premium” would be $500.

However, in practice the insurance company would charge more than that, let’s say $600 (for a 20 percent markup). This margin covers overhead expenses as well as providing a return on capital to the investors who started the company. Yet even though there is a sense in which buying such a fire insurance policy is a “bad deal” from the consumer’s point of view—because it costs $600 per year to eliminate an “expected loss” of only $500 per year—economists can easily justify this outcome. Consumers are “risk averse,” and it is completely rational to pay a little bit more than the actuarially fair premium in order to have the peace of mind of not worrying about a catastrophic loss from fire.
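The insurance arithmetic above reduces to a one-line expected-value calculation. A minimal sketch, using the hypothetical numbers from the example:

```python
# Hypothetical numbers from the fire-insurance example above.
house_value = 500_000   # total value of each house, in dollars
p_fire = 1 / 1_000      # annual probability that a given house burns down

# The "actuarially fair" premium equals the expected annual loss.
fair_premium = p_fire * house_value            # $500 per house per year

# Real insurers add a markup for overhead and a return on capital.
markup = 0.20
charged_premium = fair_premium * (1 + markup)  # $600 per house per year

print(fair_premium, charged_premium)
```

A risk-averse homeowner rationally pays the extra $100 per year to eliminate the small chance of a catastrophic $500,000 loss.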

“Insurance” in the Context of Climate Change Policy

Such is the way economists handle everyday examples of insurance. Many economists want to apply this logic to the climate change policy debate. For example, Chris Field described the recently released IPCC Working Group II report (for which he was a co-chair) as one of managing risks:

The [IPCC Fifth Assessment Working Group II] report itself is scientifically bold.  It frames managing climate change as a challenge in managing risks…

[One theme of the report] is the importance of considering the full range of possible outcomes, including not only high-probability outcomes.  It also considers outcomes with much lower probabilities but much, much larger consequences.  [Bold added.]

Field’s emphasis on paying attention to very low-probability events that carry large consequences should underscore the analogy with conventional insurance. (After all, your house probably won’t burn down, but if it does, it will be catastrophic.) In a recent debate on climate change issues, when discussing the uncertainty surrounding the climate system’s sensitivity to increased carbon dioxide concentrations, MIT’s Kerry Emanuel argued:

To say it’s between 2.5 and 9 degrees for a doubling or more of CO2, Fahrenheit, it’s to confess that we don’t know…[If it’s the] near end [of that range], it doesn’t morph, it’s 2.5 degrees—we don’t have to worry very much, I would argue. And I don’t think many of my colleagues would suggest we do. If it’s in the middle range, there will be problems. Probably we’ll adapt to them. If it’s up at the higher end, that could be catastrophic. And the question for me is: Do we do nothing to avoid, even a small risk of catastrophe for our grandchildren? [Bold added.]

As Emanuel’s remarks indicate, the current argument for government action to curb greenhouse gas emissions isn’t to fret about the likely damages, but instead to focus on theoretically possible scenarios that would be catastrophic. This shows the connection to the layperson’s understanding of insurance. (Later in the debate, Emanuel went on to explicitly liken climate change mitigation policies to a form of insurance.)

Notice that with this approach, the deck is now stacked heavily in favor of interventionist policies, because the critic can no longer merely point out the cost of such actions and the unlikelihood of the risks they are allegedly addressing. As Emanuel admits in the quotation above, for the low to medium ranges of impacts, climate change won’t be a big deal; our grandchildren will adapt with little difficulty. Nonetheless, Emanuel still thinks it is perfectly reasonable to take steps just in case the really bad outcomes occur. This rhetorical framing may resonate with many people because—to repeat—people in real life take out insurance policies against such things all the time (risk of a heart attack, house burning down, severe car accident, etc.).

Yet even though the rhetorical framing sounds plausible, it’s actually quite misleading. There are many crucial respects in which government efforts to artificially curb greenhouse gas emissions are nothing like private insurance.

Problems With the Climate Change Insurance Analogy

First of all, the type of insurance people like doesn’t have to be foisted upon them: they buy, say, homeowner’s fire insurance because the benefits of coverage are worth the price of the premium. In contrast, when it comes to forcing people to buy insurance—such as what is happening under the Affordable Care Act (aka “ObamaCare”)—then millions of people are extremely unhappy. Indeed, the Supreme Court had to declare that the “individual mandate” is really just a tax, because the federal government isn’t allowed to force people to buy insurance that they don’t want. On this ground, then, it’s hard to justify climate change policy as insurance, since it forces everybody to buy it even if they don’t agree the benefits outweigh the costs.

Another problem with the insurance analogy is that the catastrophic risks from climate change are all theoretical. In contrast, we have solid data on houses burning down, car crashes, and heart attacks. Insurance actuaries can pore over those numbers and come up with reasonable estimates for the likelihood of various events, for given pools of people. There is nothing at all like this in the climate policy debate. If the 21st century unfolds in the same way as the 20th century did, then we will experience modest warming and a mixture of mild benefits and mild harms from the change. There would be no case for aggressive government action at all. It is only by using computer models in which the temperature deviates from historical trends that the alarmist camp can build its case.

Part of the problem here is that the alleged catastrophic scenarios from climate change won’t occur until decades or even centuries into the future. So rather than likening government policies to taking out fire insurance on your house, a better analogy would be paying today for a fire insurance policy on your grandchild’s apartment on the moon colony of 2100. Nobody in his right mind would embark on such a path, even if a computer simulation showed that “locking in the rates” today would be much cheaper than deferring the decision so your grandkid had to deal with it.

The Unlikelihood of Climate Catastrophe

Above we’ve shown the weaknesses in framing government climate change mitigation policies as a form of insurance. But let’s stipulate the analogy, and look at just how overpriced this “insurance” would be.

For our purposes, we just need ballpark estimates of the type of damages that would occur in the “catastrophic” scenarios. Here’s a good example from the 2010 Economic Report of the President:

[W]hile the projected loss for the first 3˚C is 1.5 percent, the loss at 6˚C is five times higher. And the estimated loss associated with an increase of 9˚C is about 20 percent [of consumption’s share of GDP]…Overall, it is evident that policy based on the most likely outcomes may not adequately protect society because such estimates fail to reflect the harms at higher temperatures. [2010 Economic Report of the President, p. 242, bold added.]

As the bolded portion indicates, even back in 2010 the rhetoric was shifting to show that government action was needed, not so much to deal with what would probably happen, but rather to avoid the unlikely scenarios of what might happen.

So if the real danger zones kick in with warming of 6 degrees Celsius and higher, we have to ask how likely is such an outcome, and how soon might it occur? To answer that, let’s consult the following chart from the IPCC’s AR5 Working Group I report, which came out last fall:


SOURCE: Figure 12-40, IPCC AR5, Working Group I

In the figure above, the IPCC has provided projections of the mean and “envelopes” of warming (in a 66% confidence interval depicted by the gray bands) for four Representative Concentration Pathways (RCPs). For three of them (which may include assumptions of strong government measures to reduce emissions), humanity never comes close to the danger levels of warming.

But for the sake of argument, let’s look at the most pessimistic scenario, RCP 8.5. Further, let’s look at the earliest point at which we might reach 9 degrees Celsius of warming, meaning where the gray band first reaches the 9˚C mark. Eyeballing the chart above, this happens around the year 2145. In other words, even if we consider only the worst-case emission and climate sensitivity scenario reported by the IPCC in its summary chart from its September 2013 update, the suite of IPCC computer models still assigns only a 17% probability that the earth will experience 9 degrees Celsius of warming before the year 2145.[1]

In light of the above, let’s pick round numbers and say that a very unlikely outcome—give it a probability of 1 in 500—involves 9˚C of warming by the year 2100.[2] This is the climate change analog of the house burning down. So, does it make sense to “buy fire insurance” against this extremely unlikely, but awful, outcome?

No, it doesn’t. Remember, the damage estimate for this amount of warming is around 20 percent of global consumption. (The White House report that gave this figure was relying on the IPCC’s AR4 literature, but we’re just giving a ballpark analysis here.) On the other hand, the latest IPCC Working Group III report estimates that aggressive government policies to limit climate change would cost 3.4 percent of consumption by the year 2050.

Now ask yourself: Suppose someone from an insurance company came to you in the year 2050 and said, “We’ve run computer models many thousands of times using all kinds of different assumptions. In the worst-case scenario, a very small fraction of the computer runs—about 1 in 500—has you losing 20% of your income in the year 2100. In order to insure you against this extremely unlikely outcome that will occur in half a century, we want to charge you 3.4% of your income this year.”[3]

Would you want to take that deal? Of course not. The premium is way too high in light of the very low probability and the relative modesty of the “catastrophe.” When someone’s house burns down, that’s a much bigger hit than 20% of annual income. And yet, the premiums for fire insurance are quite reasonable; they’re nowhere near 3.4% of income for most households. Moreover, the threat of your house burning down is immediate: It could happen tomorrow, not just fifty years from now. That’s why people have no problem buying fire insurance for their homes. Yet the situation and numbers aren’t anywhere close to analogous when it comes to climate change policies.
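The same expected-value arithmetic can be applied to the hypothetical climate “insurance” quote above. A back-of-the-envelope sketch, using the ballpark figures from the text (the 1-in-500 probability, 20 percent loss, and 3.4 percent premium are illustrative round numbers, not precise estimates):

```python
# Ballpark, illustrative figures from the discussion above.
p_catastrophe = 1 / 500   # stipulated probability of 9 degrees C warming by 2100
loss_share = 0.20         # catastrophic damages: ~20% of consumption
premium_share = 0.034     # mitigation cost: ~3.4% of consumption by 2050

# Expected loss = probability x severity.
expected_loss = p_catastrophe * loss_share   # 0.04% of income

# How much more than the actuarially fair price does this "policy" cost?
markup_factor = premium_share / expected_loss

print(f"expected loss {expected_loss:.2%} vs premium {premium_share:.1%} "
      f"({markup_factor:.0f}x the fair price)")
```

By contrast, the fire-insurance example earlier carried only a 20 percent markup over the actuarially fair premium.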


Recognizing that they can no longer make their case on the basis of down-the-middle projections, those favoring massive government intervention in the name of fighting climate change have resorted to focusing on very unlikely but devastating scenarios. In this context, they have likened their preferred government policies to a form of insurance.

However, this analogy fails for several reasons. First, insurance in the marketplace is voluntary; when the government forces people to buy it—as with ObamaCare—then there is indeed a public outcry. Second, actual insurance in the marketplace is based on extensive actuarial data; we have no such understanding with climate change, but instead the outcomes against which we are “insuring” live inside computer projections.

Finally, even taking the insurance analogy head-on, the numbers don’t work. Nobody would take out an insurance policy on the terms of likely payouts and expense of premium that climate change policy offers.

[1] If the gray band covers the temperature range with 66% confidence, that means the remaining probability of 34% is divided in half (i.e. 17%) and allocated to the left and right sides of the gray band.

[2] To be clear, there is no way to precisely derive such a figure from the IPCC’s graph. We are merely picking a ballpark figure (which is actually generous) based on the information that the graph does contain.

[3] Strictly speaking the climate change literature often deals with percentage reductions in consumption (not income), but it would sound odd for an insurance company to talk that way. The basic point is the same.

Robert Murphy

Another DOE Flop: Smith Electric Vehicles Closes Kansas Plant

Posted April 16, 2014 | folder icon Print this page

“You’re doing more than just building new vehicles. You are helping to fight our way through a vicious recession and you are building the economy of America’s future.” President Obama, 2010.

The U.S. Department of Energy (DOE) provided $30 million to Missouri-based Smith Electric Vehicles, and President Obama used the spending as an opportunity to campaign in Missouri. Hindsight now shows that it was irresponsible for the Obama administration to give Smith Electric Vehicles $30 million of American taxpayers’ money, because the company is closing its U.S. operations due to a “tight cash flow situation.”[i] This is not the first electric vehicle company funded by DOE from the 2009 Stimulus package that has gone down the tubes. Fisker Automotive, a California-based electric car maker whose vehicles were assembled in Finland, was provided with a $529 million DOE loan, but was cut off at $193 million when it failed to reach milestones in delivering its Karma model, an electric vehicle with a showroom cost of over $100,000.


Smith Electric Vehicles

Smith Electric Vehicles is shuttering its Kansas City plant, but will continue to operate in Europe and Asia, where the company has additional production facilities. The DOE stimulus grant of $32 million was supposed to support the construction of 510 electric vehicles for municipal public transportation[ii]—plug-in delivery trucks, which are currently used by customers such as Frito-Lay, The Coca-Cola Company, Staples, Kansas City Power & Light Company, and the U.S. military. Using $29,150,672 in taxpayer funds, Smith Electric Vehicles had 439 vehicles in service at the end of 2013.

While the White House claimed that the project would create “more than 220 direct and indirect jobs,” the company had created the hourly equivalent of 70.35 jobs. Thus, DOE spent an average of over $414,000 for each job created. Last year, the company reported that it employed 131 people in the United States. Smith Electric Vehicles has been struggling for some time, reporting annual losses in the millions after receiving DOE funds. In 2009, Smith Electric lost $17.5 million; in 2010, $30.3 million; in 2011, $52.5 million; and as of June 30, 2012, $27.3 million.[iii] The company intended to have an initial public offering in 2012 and to open assembly plants in New York and Chicago, but those plans were scrapped.[iv]
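The per-job figure cited above follows directly from the reported numbers. A quick back-of-the-envelope check:

```python
# Figures reported in the text above.
funds_used = 29_150_672   # taxpayer funds drawn by Smith Electric Vehicles
jobs_created = 70.35      # hourly-equivalent jobs created

cost_per_job = funds_used / jobs_created
print(f"${cost_per_job:,.0f} per job")  # a bit over $414,000
```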

Purchasers of these plug-in trucks can benefit from a number of different Federal subsidy programs to defer some of the cost, such as[v]:

  • The Alternative Fuel Infrastructure Tax Credit (up to 30 percent of the vehicle’s cost);
  • Qualified Plug-In Electric Drive Motor Vehicle Tax Credit (between $2,500 and $7,500 per truck);
  • EPA Diesel Emissions Reduction Act Grant (up to 25 percent of the total cost of a vehicle);
  • Clean Cities Grant (up to 50 percent of the total cost of the vehicle); and
  • Congestion, Mitigation and Air Quality Funds.

Further, individual states have their own subsidy and rebate programs that also apply.

DOE’s Advanced Technology Vehicle Manufacturing Program

Recently, Energy Secretary Ernest Moniz announced his intention to restart the Department’s electric vehicle subsidy program, the Advanced Technology Vehicle Manufacturing (ATVM) program that has wasted millions of taxpayer dollars on a number of failed companies, including Smith Electric Vehicles. The program has about $16 billion in authorized funds that have not yet been disbursed. Funds from the program have previously been distributed to electric vehicle manufacturer Fisker Automotive and battery manufacturer A123 Systems, both of which went bankrupt after receiving the DOE funds.

However, Secretary Moniz believes that the electric vehicle market has improved since those failures. The financial troubles of Smith Electric Vehicles indicate just the opposite. The Government Accountability Office (GAO) recommended that the program be scrapped, urging Congress to rescind “all or part of the remaining credit subsidy appropriations to the [ATVM] loan program, unless the [DOE] can demonstrate sufficient demand for new ATVM loans and viable applications.”[vi]

Despite GAO’s report to the contrary, Congressional Democrats are offering legislation to continue subsidizing the electric vehicle market. For example, a bill introduced by Rep. Zoe Lofgren would authorize the Treasury Department to issue up to $50 billion in federal bonds to finance, among other projects, $2,500 government vouchers for Americans to purchase plug-in hybrid electric cars.[vii] This would be very beneficial to Tesla Motors, which is headquartered in her hometown of Palo Alto, and whose purchasers already receive a $7,500 reduction in their tax obligation from the federal government.


It is now 2014, and President Obama’s 2009 Stimulus program is still plaguing the country. Although it has been clearly demonstrated that numerous projects benefiting from it have failed or otherwise not met their goals,[viii] Federal officials want to continue propping up, with Federal and state government handouts, what the market is having trouble supporting. Smith Electric Vehicles’ closing of its Kansas plant in hopes of retooling production outside the United States is evidence that the $414,000-per-job Federal giveaway is not working.

[i] Free Beacon, Taxpayer-Backed Electric Car Company Closes U.S. Factory, April 9, 2014,

[iii] Kansas City Business Journal, Smith Electric Vehicles prices initial public offering at $76 million, September 7, 2012,

[iv] Kansas City Business, Smith Electric Vehicles has suspended production, April 4, 2014,

[v] Heartland, Bottomless Subsidies Needed to Keep DOE Electric Truck Project Alive, July 10, 2013,

[vi] Government Accountability Office, Government Efficiency and Effectiveness, April 2014,

[vii] U.S. House of Representatives Press releases, Reps. Lofgren and Matsui Introduce Legislation to Increase Investment in Clean Energy Technologies, April 8, 2014,

The UK and U.S. Northeast Face a Pending Energy Shortage

Posted April 15, 2014 | folder icon Print this page

The United Kingdom and the U.S. Northeast may have something in common going into next winter—a possible energy shortage. Both regions are closing existing power plants—coal-fired and nuclear—in favor of renewable and natural gas generating technologies. In the United States, the Independent System Operator in New England has warned that generating capacity is extremely tight given the coming closure of the Vermont Yankee nuclear power plant and several coal-fired power plants in Massachusetts. Likewise, in the UK, 8,200 megawatts of coal-fired power plants have been shuttered, with an additional 13,000 megawatts at risk over the next 5 years. The UK’s energy regulator is worried that the capacity margin over peak demand next year will be just 2 percent—an alarmingly low cushion for those charged with keeping the lights on.

The U.S. Northeast Power Struggle

Both nuclear and coal-fired power plants are retiring prematurely in New England due to onerous regulations and competition from low-cost natural gas-fired generating plants. The Vermont Yankee nuclear power plant (604 megawatts), which supplies 4 percent of New England’s power and three-fourths of Vermont’s electricity, is expected to retire at the end of this year, concurrent with the end of its fuel cycle. Entergy, the plant’s owner, cites a number of financial factors for the retirement, including increased costs to comply with new federal and regional regulations and competition from natural gas power plants. However, Vermont Yankee has been opposed by state political figures for some time, and many have cheered its closure after years of criticizing its operation. U.S. nuclear power plants also face negative power prices caused by wind energy, which receives a 10-year subsidy for qualified units through the federal Production Tax Credit (PTC). Because Vermont Yankee is operated as a merchant generator, its costs cannot be recovered through regulated cost-of-service rates.[i]

New England expects more than 1,369 megawatts of coal-fired generating capacity to be retired between 2013 and 2016. Dominion Energy Resources is planning to retire the nearly 750-megawatt Salem Harbor coal- and petroleum-fired power plant in Massachusetts this year due to the Northeast states’ antagonism toward coal (i.e., the Regional Greenhouse Gas Initiative), the cost of complying with new environmental regulations, and declining profits for coal-fired units in New England.[ii] To keep operating its coal-fired power plants, the company would need to spend millions of dollars on environmental equipment to comply with EPA regulations. In southeastern Massachusetts, the Brayton Point power plant, the largest coal-fired power plant in New England, is expected to be shut down in 2017 due to EPA’s onerous regulations.

Reliability experts note that the New England grid is entering risky territory. The region currently gets 52 percent of its electricity from natural gas. There is enough natural gas pipeline capacity during non-winter months to supply New England utilities. But this past winter, the lack of pipeline infrastructure forced the region to rely on nuclear, coal, and petroleum to meet demand during the extreme cold. The spot price of natural gas rose so high that it was less expensive to generate electricity from petroleum. At a recent hearing, Senator Lisa Murkowski noted, “… 89 percent of the coal electricity capacity that is due to go offline was utilized as that back-up to meet demand this winter.”[iii]

With the early retirements of nuclear and coal-fired power plants cutting back on supply diversity, the New England grid is becoming dangerously reliant on natural gas for its generating capacity. The Independent System Operator New England recommended against the closure of the 1,500-megawatt Brayton Point facility because the plant is needed to ensure reliability.[iv]

NE Energy Infrastructure

Source: Energy Information Administration,

After the colder-than-average winter, natural gas stockpiles are low. According to the Energy Information Administration, on average over the past five years, natural gas stockpiles totaled 3.832 trillion cubic feet by the end of October going into the winter heating season. This past winter, natural gas inventories dropped by 2.92 trillion cubic feet between the end of October and March 21, making it the fastest pace of withdrawals for any U.S. heating season since 1995. The extreme cold weather pushed stockpiles to their lowest level in 11 years. That large withdrawal means that about 3 trillion cubic feet of natural gas will need to go into storage during the warm-weather months to cover expected winter demand. By the end of October, it is expected that stockpiles may increase to close to 3.5 trillion cubic feet, about 300 billion cubic feet less than the high achieved over the past 5 years, putting even more stress on having adequate supplies for next winter.[v]

The Northeast already has the highest electricity prices in the country (outside of Alaska and Hawaii). Residential electricity rates in New England are currently 45 percent higher than the U.S. average. Phasing out these power plants prematurely will only increase electricity rates in New England.

The UK Power Struggle 

The United Kingdom’s electricity consumption is roughly 1/12th of that in the United States, but policies there are leading to growing concerns about energy price and availability. The United Kingdom may encounter power shortages next winter because electric utilities are shuttering coal-fired power plants to comply with Europe’s carbon-emissions rules and have stopped their investment in new generating capacity. Over the past 15 months, 8.2 gigawatts of coal-fired power plants were shuttered and 13 gigawatts are at risk of closing by 2019. According to the UK’s energy regulator, the amount of electricity available over peak demand may drop below 2 percent next year, the lowest level in Western Europe.[vi]

Beginning in January 2016, the European Union will require electric utilities to add further emission reduction equipment to plants or close them, either by 2023 or once they have run for 17,500 hours. Only one UK electricity producer has chosen to install the required technology, because the equipment is expensive, costing over 100 million pounds ($167 million) per gigawatt of capacity. Because the UK has built only one coal-fired power plant since the early 1970s, most of the existing coal-fired plants are expected to be shuttered.

According to data from the Office of Gas and Electricity Markets in London, the capacity margin–the amount of excess supply above peak demand–may drop below 2 percent in 2015. Under normal weather conditions, the margin could drop below 4 percent during the winter months from over 6 percent now, but lower than average temperatures increase electricity demand and would thereby lower the capacity margin further.

The United Kingdom was the first nation to introduce a carbon price floor on fossil fuel combustion, which operates in addition to the European Union’s regional carbon trading system. As a result, UK utilities pay the most among European countries for the right to emit carbon dioxide from burning fossil fuels. To ensure reliability and to remove uncertainty for utilities, the UK government froze the tax from 2016 through 2020 so that electric generating operators could make investment decisions regarding their coal-fired power plants.

Another area of uncertainty for UK utilities is how a proposed market for providing backup electricity will work. According to the Department for Energy and Climate Change, electricity producers will be able to bid in an auction to take place this December to provide backup power for 2018. The program, called a capacity market, is expected to ensure sufficient capacity and security of supply. The Department estimates that the UK power industry needs around 110 billion pounds ($184 billion) of investment over the next 10 years.

According to Deutsche Bank AG, UK power prices, already among the highest in Western Europe, are expected to increase by 39 percent in the next five years. The UK generates 12 percent of its electricity from renewable energy today and plans to get 15 percent from renewables by the end of the decade. UK electricity consumers will pay an additional 120 pounds a year (about $200) to fund the move toward greener power generation, on top of their current average electricity bill of 1,420 pounds ($2,376).


The UK and the U.S. Northeast have something in common in their quest for lower greenhouse gas emissions: a possible energy shortage and an unreliable electricity grid, expected as soon as this coming winter. Government policies and regulations that shutter coal-fired power plants in favor of renewable energy and natural gas-fired technology are the major cause of the concern. Those same policies are also closing nuclear units in the United States, bringing supply-diversity issues to the forefront. Electric grid operators in both areas worry that generating capacity next winter may be too little to meet demand, particularly if frigid weather should hit.

Senator Joe Manchin stated at a recent Senate hearing, “Keep in mind that coal will provide about 30% of our power for at least the next three decades. As you are doing that, think about the fact that nearly 20 percent of the coal fleet is being retired. Add the fact that EPA’s proposed New Source Performance Standard rule will effectively ban the construction of any new coal plants, and you see that our reliability crisis is getting much worse.”

[ii] Dominion News, Dominion Sets Schedule to Close Salem Harbor Station, May 11, 2011,

[iv] Forbes, More Coal Plant Retirements in New England? Perhaps Not So Fast, January 6, 2014,

[v] Bloomberg, Record Natural Gas Need Keeps Bulls Betting on Advances: Energy, March 31, 2014,

[vi] Bloomberg, Green Rules Shutting Power Plants Threaten U.K. Shortage: Energy, March 19, 2014,