Enviroshop – About Magazine

Why Are the Networks Ignoring a Major Cause of Stronger Storms?

By Eric Pooley

The two-fisted gut punch of Harvey and Irma devastated Caribbean islands, swamped major American cities, blacked out power for millions, and exposed who-knows-how-many people to a toxic soup of polluted floodwaters. But one thing these immensely powerful storms could not do was move the television networks to talk about how these storms got to be so strong.

The Sunday morning news shows, which still help determine the narrative for the Capital, failed to mention the clear connection between these more powerful storms and climate change. The hurricanes were covered, of course, but the scientifically established link between our warming climate and their increased destructive power was raised on only one of the four* major talk shows (CNN’s State of the Union with Jake Tapper), according to the non-profit group Media Matters.

More broadly, the study found that two broadcast networks, ABC and NBC, failed to air a “single segment on their morning, evening, or Sunday news shows” on the link between climate change and the storms.

The reality is that warmer waters fuel big hurricanes, warmer air holds more water, and rising sea levels surge higher and farther. In short, climate change puts storms on steroids, a point NASA drove home as Irma approached Florida with this tweet:

.@NASAEarth data shows hot water ahead for #HurricaneIrma. Warm oceans are a key ingredient fueling hurricanes: https://t.co/PeoCi4vfZh pic.twitter.com/DGcLY2r0H4

— NASA (@NASA) September 7, 2017

Without serious coverage of this connection, we are left with only political propaganda from the White House and its allies. President Trump and EPA Administrator Scott Pruitt have repeatedly denied or downplayed the facts of climate science, even though every major American scientific organization has recognized this reality.

These attempts to deny the science are, not surprisingly, backed up by voices like Rush Limbaugh, who claimed last week that the discussion of stronger hurricanes was based on a “desire to advance this climate change agenda” – and then promptly evacuated his Florida studio.

Pruitt is trying to bury the views of the scientific community on climate change generally. The latest climate assessment by government scientists sheds light on the topic of climate change and hurricanes. But Pruitt is sitting on the report because there is apparently never a time he wants people thinking about climate change.

According to the “final draft” of the report, which was provided to the New York Times by authors worried about Pruitt’s political interference, it is “likely” that hurricanes’ maximum wind speeds and rainfall rates will increase. Pruitt has said that he is going to review the report, and it hasn’t been seen since.

The failure to inform the public about the link between more climate pollution and stronger storms – along with more wildfires, droughts, increasing flows of refugees, and other climate costs – means we are more likely to continue down the path toward a more dangerous future. Already, we are paying billions to clean up and rebuild after these storms; Citigroup has estimated that the total bill for unchecked climate change will be more than $40 trillion.

The networks have a lot on their plate covering Washington these days. There’s no shortage of misinformation to correct, and many serious stories to cover. But it’s hard to think of many that are a bigger threat to public health and well-being than the continued rampage of climate change. And just as with any other big story, the causes – not just immediately visible impacts – must be part of the reporting.

*Meet the Press was pre-empted.


Investor sees methane management as self-help for oil & gas companies

By Sean Wright

Environmental Defense Fund Q&A with Tim Goodman, Hermes Investment Management

Tim Goodman, Director of Engagement at Hermes Investment Management

When burned, natural gas produces half as much carbon as coal, so it is often touted as a “bridge” fuel to a cleaner energy future. But the carbon advantage of natural gas may be lost if too much of it escapes across its value chain.

Natural gas is mostly methane, which, unburned, is a highly potent greenhouse gas accounting for roughly a quarter of today’s global warming. Worldwide, oil and gas companies leak and vent an estimated $30 billion worth of methane into the atmosphere each year.

EDF’s Sean Wright sat down with Tim Goodman, Director of Engagement at London-based Hermes Investment Management. Goodman, who views methane management as practical self-help for the industry to pursue, engages with oil and gas companies on strategies to manage their methane emissions. This is the first of a two-part conversation with Hermes, a global investment firm whose stewardship service, Hermes EOS, advises $330.4 billion in assets.

Wright: Do you think the oil and gas industry is changing its overall attitude towards climate after the historic Paris agreement and recent successful shareholder resolutions? If so, how do you see that change manifesting itself?

Goodman: I think climate change is obviously an existential question for the industry. The really big question is can it actually change in response to Paris? The industry is beginning to respond as a result of Paris and shareholder proposals and other stakeholder pressure. You’re seeing some of the majors increasing their gas exposure at the expense of oil. You’re seeing a number of international oil companies reducing or ending their exposure to particularly high carbon or high risk assets, such as the Canadian oil sands or the Arctic. The oil and gas industry is also starting to place a greater focus on methane management and its own emissions.

Wright: What about investors – what do you think is driving the continued momentum around methane and climate as we see larger and more mainstream funds tackling these issues?

Goodman: Let’s talk about climate for the moment – the roles of both investors and companies in the run up to the Paris agreement and during the negotiations were crucial. The investors made it absolutely clear that they wanted to see a successful Paris agreement. Addressing climate change is good for business and good for their portfolios. And we saw this with the Exxon vote – the two-degree scenario proposal that mainstream asset managers voted for. We believe that this happened because of the underlying pressure asset managers were getting from their own clients, who have a long-term perspective and see climate change as a risk to their funds.

Specifically on methane, it’s practical self-help for the industry to embark on methane management. It’s an obvious practical measure for investors to engage upon. If you can reduce your contribution to greenhouse gases, save money, and gain revenue by being more efficient and safe, why wouldn’t you do that? It’s an easy entree into engaging with the oil and gas industry. Whereas the existential question, what’s your business going to look like 20 years from now, is a more difficult question perhaps both for the industry and the companies themselves.

Wright: You pretty much just explained why Hermes prioritized methane – is that correct?

Goodman: Yes. But the science is a big part of it. Methane is a far more potent greenhouse gas than carbon – the more that we can minimize its effects, the greater the window the world has to transition to a low carbon economy. Methane’s effects don’t last as long as carbon, but if we don’t tackle methane, we aren’t taking meaningful action to move to a low-carbon economy.

Wright: What do you see as the risks of unmanaged methane emissions?

Goodman: There is an economic risk and benefit for companies. Most of the measures to manage methane are relatively low-cost and can very easily be implemented for new projects. If you’re not doing them, for example, and you’re fracking shale, you’re at a competitive disadvantage to your peers. The cost-benefits perhaps are more difficult, but still there, in existing infrastructure. But particularly among the oil majors, their relationship with their host governments, local communities, and other stakeholders is vital. It’s important for companies to demonstrate good corporate citizenship. If you’re a laggard on methane, you’re more likely to be considered as an irresponsible partner both commercially and also in your local community. So I think oil and gas companies risk massive reputational and legal risks if they’re not managing methane effectively, notwithstanding the economic benefits.


Wright: What do you typically hear from operators in your conversations about methane management? Do you hear different things from operators in different parts of the world?

Goodman: Methane management is one of a number of important issues that we’re engaging with the industry on, including other pollution, health and safety, human rights, corruption and climate change. What we’re hearing on methane does vary. It’s fair to say in some emerging markets methane management is not often discussed by investors with those companies. But when we do address this topic in these markets, the companies show interest and want to know why it’s important to us, what they should be doing, how they should be disclosing, etc. So we’re often having positive and interesting conversations in these markets.

In the developed markets, there’s a difference. And I think there’s a distinction between Europe and North America. The EU companies, particularly the majors, are realizing it’s an important issue and are talking about it and disclosing at least some data. In private dialogue with North American companies, it is clear methane is often an important issue for them, but their disclosure is less convincing. It does vary around the world, but you also have this interesting phenomenon, where some companies seem to be doing a good job in private dialogue, but the disclosure lags behind what they are actually doing. We also see companies attempting to present their efforts in a better light than perhaps they deserve. It’s a complex mixture, which is why engagement is so important because we are able to view the reality on the ground through private dialogue.

For more information on EDF’s investor resources on methane mitigation, please see our recent report, An Investor’s Guide to Methane, or subscribe to our newsletter.


Scott Pruitt’s relentless distortions of climate science and law

By Ben Levitan

This summer was anything but quiet for climate policy.

In June, President Trump announced that the U.S. would withdraw from the Paris climate agreement.

In July, the U.S. Court of Appeals for the District of Columbia Circuit blocked Environmental Protection Agency (EPA) Administrator Scott Pruitt’s attempt to suspend protections from climate-destabilizing oil and gas pollution, calling the move “unauthorized” and “unreasonable.”

In August, two judges of the same court reminded EPA of its “affirmative statutory obligation to regulate greenhouse gases,” citing longstanding Supreme Court precedent.

Now, the devastation caused by Hurricane Harvey and the record strength of Hurricane Irma are showing us what’s at stake, as sea level rises and extreme weather becomes more frequent.

Meanwhile, Administrator Pruitt has continued his pattern of deeply misleading statements about climate change and EPA’s responsibility to protect public health and the environment.

Pruitt uses these statements in an attempt to justify rolling back vital public health and environmental safeguards. In just his first four months in office, he took action against more than 30 health and environmental protections, including the Clean Power Plan — our first and only national limit on carbon pollution from existing power plants.

As America’s proven, life-saving environmental protections come under attack, here are four facts about climate law and science to help cut through Pruitt’s distortions.

1. EPA has an affirmative statutory obligation to regulate climate pollution

Administrator Pruitt frequently questions EPA’s ability and authority to regulate climate pollutants under the Clean Air Act. But contrary to Pruitt’s claims, the Supreme Court has repeatedly ruled that the Clean Air Act covers climate pollution.

  • In Massachusetts v. EPA, the Court held that climate pollutants “without a doubt” and “unambiguous[ly]” meet the definition of “air pollutant” under the Clean Air Act.
  • In its subsequent American Electric Power v. Connecticut (AEP) opinion, the Supreme Court found that section 111 of the Clean Air Act — the section under which EPA issued the Clean Power Plan — “speaks directly” to the regulation of climate pollution from existing power plants. (Even opponents of climate protections conceded that point during oral argument.)
  • The Court again recognized EPA’s authority to regulate climate pollution in a third decision, Utility Air Regulatory Group v. EPA (UARG).

Former EPA administrators serving in both Republican and Democratic administrations have recognized that “Congress has already made the policy decision to regulate” air pollutants that EPA determines — based on scientific factors — endanger the public health or welfare.

That’s why we now enjoy protections from air pollutants like cancer-causing benzene, brain-damaging lead, and lung-impairing particulates. We may not have had those protections if former EPA Administrators had shared Pruitt’s myopic view of the agency’s responsibility under the Clean Air Act.

As the Supreme Court stated in Massachusetts v. EPA, Congress:

underst[oo]d that without regulatory flexibility, changing circumstances and scientific developments would soon render the Clean Air Act obsolete. The broad language … reflects an intentional effort to confer the flexibility necessary to forestall such obsolescence.

In issuing the Clean Power Plan and other climate protections, EPA scrupulously fulfilled the mandate with which Congress entrusted it. The Clean Power Plan also reflected the Supreme Court’s finding in AEP that climate pollution from existing power plants was covered by section 111.

Administrator Pruitt has seriously misconstrued judicial rulings that conflict with his policy goals.

For example, he claimed that the Supreme Court’s UARG decision “said the authority the previous administration was trying to say that they had in regulating carbon dioxide wasn’t there.”

Pruitt overlooks the fact that the UARG opinion upheld the vast majority of what EPA had done, including the requirement that sources subject to certain permitting obligations under the Clean Air Act utilize “best available control technology” for climate pollution. The Supreme Court only took issue with EPA’s potential regulation of a subset of sources constituting a small percentage of total emissions, which did not implicate EPA’s fundamental obligation to regulate climate pollution.

2. EPA’s obligation to regulate climate pollution is based on scientific factors, not the Administrator’s policy preferences

Administrator Pruitt’s most dangerous Supreme Court misinterpretation might be his twist on Massachusetts v. EPA, a landmark decision that set the foundation for many of the climate protections that followed.

In Pruitt’s reading, when it comes to climate pollution, the Supreme Court held only that EPA “must make a decision whether [to] regulate or not.”

But the Supreme Court actually held that EPA was required to determine — again, based on scientific factors — whether climate pollution endangers public health or welfare.

In 2009, EPA concluded that climate pollution indeed poses a clear danger to public health and welfare, based on an exhaustive review of an expansive array of published studies and surveys of peer-reviewed literature prepared by the U.S. government’s Global Change Research Program, the National Academy of Sciences, and the Intergovernmental Panel on Climate Change.

The D.C. Circuit upheld this Endangerment Finding against a barrage of legal attacks, finding that it was based on “substantial scientific evidence.”

After issuing the Endangerment Finding, EPA was statutorily obligated to follow the Clean Air Act’s process for regulating the dangerous pollution.

Administrator Pruitt’s position more closely resembles the losing argument in Massachusetts v. EPA. The George W. Bush Administration had justified its decision not to regulate climate pollution based on factors completely unrelated to public health or welfare. But the Supreme Court brushed aside EPA’s “laundry list of reasons not to regulate” and ruled that the agency was not free to — in Pruitt’s words — “make a decision” not to regulate. Rather, EPA must conduct a science-based evaluation of the risks that climate pollution poses to public health and welfare, and if the science supports an Endangerment Finding, regulation must follow.

3. The scientific evidence of climate change is overwhelming

Climate change is happening now. As climate pollution continues to accumulate in the atmosphere, it will bring melting sea ice and glaciers, rising sea levels, and more extreme weather including heat waves, floods, and droughts.

Administrator Pruitt attempts to minimize this threat by focusing on uncertainty. In Pruitt’s parlance, we still have more to learn about “the precision of measurement” when it comes to the effects of climate pollution. But the fact that there are still productive areas for research doesn’t mean we should disregard the vast amount that we already know.

As the American Meteorological Society recently told a different Trump Administration official:

"[S]kepticism and debate are always welcome," but "[s]kepticism that fails to account for evidence is no virtue."

In Massachusetts v. EPA, the Supreme Court held that EPA cannot decline to regulate climate pollution due to:

some residual uncertainty … The statutory question is whether sufficient information exists to make an endangerment finding.

EPA answered that question in its 2009 Endangerment Finding, and since then, the overwhelming scientific evidence for human-caused climate change has continued to grow.

In the final draft of the U.S. Global Change Research Program’s latest Climate Science Special Report — which is currently under review by political officials in the Trump Administration — climate scientists determined that, in the last few years:

stronger evidence has emerged for continuing, rapid, human-caused warming of the global atmosphere and ocean.

The year 2016 marked the third consecutive year of record-high global surface temperatures, and 2017 marked the third consecutive year of record-low winter Arctic sea ice. Meanwhile, the rate of sea level rise is increasing.

In contrast to the extensive scientific research demonstrating the role of climate pollution in destabilizing our climate, Administrator Pruitt has proposed a (possibly televised) “red team/blue team” exercise in which opposing teams of government-selected experts debate climate science.

Christine Todd Whitman, who served as EPA Administrator under President George W. Bush, characterized the red team/blue team exercise as “a shameful attempt to confuse the public into accepting the false premise that there is no need to regulate fossil fuels.”

Pruitt has acknowledged that he is “not a scientist” but nonetheless suggested that his red team/blue team exercise would represent “what science is all about.” Anticipating that some scientists might be reluctant to participate, he taunted:

If you’re going to win and if you’re so certain about it, come and do your deal.

But for most scientists, their “deal” is a careful process of observation, experimentation, and peer review — even when it doesn’t fit between commercial breaks.

However Pruitt manages his red team/blue team exercise, it can’t alter the conclusions of the massive body of climate research developed by thousands of scientists over decades of conscientious inquiry.

4. The American public supports policies to address climate change

One argument that Administrator Pruitt advanced for his red team/blue team exercise is that “the American people would be very interested in consuming that.”

Actually, Americans in every state have already shown an appetite for addressing climate change.

A recent survey found that large majorities of Americans support regulating greenhouse gases as a pollutant, setting strict carbon dioxide limits on existing coal-fired power plants, and requiring utilities to produce 20 percent of their electricity from renewable sources.

In fact, each of those policies garnered majority support in every Congressional district in America.

A majority of Americans opposed the decision to withdraw from the Paris climate agreement, as did the CEOs of many prominent businesses.

And the Clean Power Plan was supported in court by a broad and diverse coalition of 18 states, 60 cities, public health experts, leading business innovators (including Google, Apple, Amazon, and Microsoft), leading legal and technical experts, major consumer protection and low-income ratepayer organizations (including Consumers Union and Public Citizen), faith groups, more than 200 current and former members of Congress, and many others. (You can read their legal briefs on EDF’s website.)

Administrator Pruitt’s legal and scientific distortions show no sign of abating, and neither does his destructive rollback of public health and environmental protections. But his efforts have been rife with legal deficiencies. As EDF President Fred Krupp recently wrote, Pruitt “may have finally met his match: the law.”

Shortly after the D.C. Circuit blocked Pruitt from suspending protections from oil and gas pollution, and in the face of legal challenges from EDF and many others, Pruitt withdrew his unlawful delay of another Clean Air Act protection – the implementation of a national health-based smog standard.

EDF will continue to demand that Pruitt fulfill his solemn responsibility to protect the health of our communities and families under our nation’s bipartisan and time-tested environmental laws.


This speaks volumes: Industry rushes in to defend EPA’s new TSCA regulations

By Richard Denison

Richard Denison, Ph.D., is a Lead Senior Scientist.

Environmental Defense Fund has made no secret of our view that many elements of the final framework rules issued by the Trump EPA in July to implement recent reforms to the Toxic Substances Control Act (TSCA) are contrary to law and fail to reflect the best available science.  The rules EPA had proposed in January were heavily rewritten by a Trump political appointee, Dr. Nancy Beck, who until her arrival at the agency at the end of April was a senior official at the chemical industry’s main trade association, the American Chemistry Council (ACC).

In our view, the final rules largely destroyed the careful balance that characterized the efforts to reform TSCA and the final product of that effort, the Lautenberg Act.  In many respects, the final rules governing how EPA will identify and prioritize chemicals and evaluate their risks now mirror the demands of the chemical industry, reflected in comments they had submitted earlier – some of which Beck herself had co-authored.

These are among the reasons EDF as well as other NGOs and health and labor groups have had no choice but to file legal challenges to these rules.

Lest you have any doubt that the final rules are heavily skewed in industry’s direction, a development in these legal cases just yesterday should dispel it.  A broad coalition of industry groups – including Dr. Beck’s previous employer ACC – has filed motions to intervene in these cases in order to defend EPA’s rules (see here and here).  Parties to the motion constitute a remarkable list:

  • American Chemistry Council
  • American Coatings Association
  • American Coke and Coal Chemicals Institute
  • American Fuel and Petrochemical Manufacturers
  • American Forest and Paper Association
  • American Petroleum Institute
  • Battery Council International
  • Chamber of Commerce of the United States of America
  • EPS (Expanded Polystyrene) Industry Alliance
  • IPC – Association Connecting Electronics Industries
  • National Association of Chemical Distributors
  • National Mining Association
  • Polyurethane Manufacturers Association
  • Silver Nanotechnology Working Group
  • Society of Chemical Manufacturers and Affiliates (SOCMA)
  • Styrene Information and Resource Center
  • Utility Solid Waste Advocacy Group

Yesterday was the deadline for parties seeking to intervene in the cases to have done so.  Among those that had issued a “call to arms” to industry to intervene to defend EPA’s rules were leading Washington, DC industry law firms that represent these trade groups and their members.  For example, Wiley-Rein issued this client alert five days after our lawsuits were filed (emphasis added):

Also on August 11th the Natural Resources Defense Council, Safer Chemicals Healthy Families Coalition, the Environmental Defense Fund and other environmental advocacy organizations filed lawsuits challenging the EPA’s final Prioritization and Risk Evaluation Rules. While the petitions are light on details, they generally allege that EPA abused its discretion when issuing the final rules. The specific issues the petitioners have with the final rules are not yet clear, but these groups have publicly expressed concern with EPA’s interpretation of how it will review the conditions under which a chemical is known or reasonably foreseen to be used. Therefore, companies that make, import, process or use a chemical that is being evaluated by EPA now or in the future need to consider getting involved and supporting the rule [sic] as it now stands.

Step back for a minute and consider the unusual nature of this development:  When was the last time such a heavy-hitters list of industry groups rushed in to support EPA regulations?

More evidence of the topsy-turvy world we’re living in under the most anti-environmental and anti-regulatory administration in modern history.

Despite its professed support just over a year ago for balance and compromise in TSCA reform, the industry has shifted  in this new political climate to short-term, opportunistic thinking.  But that isn’t going to solve the problem that brought the industry to the TSCA negotiating table:  The lack of confidence in the safety of its enterprise, a problem that can only be expected to grow as regulations are rolled back and the public learns more about the millions of pounds of chemicals released into the environment from industrial facilities in the wake of hurricanes.


New report: Yes, we can have both clean air and reliable electricity

By Rama Zakaria

A new report by M.J. Bradley & Associates – based on an extensive review of data, literature, and case studies – shows that coal-fired power plants are retiring primarily due to low natural gas prices, and that the ongoing trend towards a cleaner energy resource mix is happening without compromising the reliability of our electric grid.

The report follows a highly-publicized order by Secretary of Energy Rick Perry for a review of the nation’s electricity markets and reliability. Perry wanted to determine whether clean air safeguards and policies encouraging clean energy are causing premature retirements of coal-fired power plants and threatening grid reliability.

The Department of Energy (DOE) just released that long-anticipated review — a baseload study that actually confirms that cheap natural gas has been the major driver behind coal retirements.

Now the M.J. Bradley report affirms that finding, and offers even more evidence to support it and demonstrate that electric reliability remains strong.

The M.J. Bradley report confirms conclusions by multiple studies which demonstrate that, of the three main factors responsible for the majority of the decline in coal generation, the increased competition from cheap natural gas has been by far the major contributor – accounting for 49 percent of the decline.

The two other factors are reduced demand for electricity – accounting for 26 percent – and increased growth in renewable energy – accounting for only 18 percent.

Several case studies featured in the M.J. Bradley report offer further proof that coal retirements are driven by economic factors – specifically low natural gas prices:

For example, PSEG President and COO Bill Levis – referring to the shutdown of Hudson Generating Station – said, “the sustained low prices of natural gas have put economic pressure on these plants for some time.” PSEG Senior Director of Operations Bill Thompson also pointed to economic reasons, not environmental regulations, as the basis for the decision to retire the plant.

Florida Power & Light (FPL) cited economics and customer savings as the primary reasons for its plans to shut down three coal units. According to FPL, the retirements of Cedar Bay and Indiantown are expected to save its customers an estimated $199 million. FPL President and CEO Eric Silagy said the decision to retire the plants is part of a “forward-looking strategy of smart investments that improve the efficiency of our system, reduce our fuel consumption, prevent emissions and cut costs for our customers.” Retirement of FPL’s St. John River Power Park would add another $183 million in customer savings.

According to the M.J. Bradley report, the overall decline in U.S. coal generation is primarily due to reduced utilization of coal-fired power plants, rather than retirements of those facilities.

Most recently retired facilities were older, smaller units that were inefficient and relatively expensive to operate. On average, coal units that announced plans to retire between 2010 and 2015 were 57 years old – well past their original expected life span of 40 years.

Meanwhile, existing coal plant utilization has declined from 73 percent capacity factor in 2008 to 53 percent in 2016. At the same time, the utilization of cheaper natural gas combined-cycle plants has increased from 40 percent capacity factor to 56 percent.

As a result, M.J. Bradley estimates that less than twenty percent of the overall decline in coal generation over the past six years can be attributed to coal plant retirements, with reduced utilization of the remaining fleet accounting for the rest of the decline.
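To make the utilization point concrete, here is a minimal, illustrative Python sketch of how a falling capacity factor alone reduces output from a fixed fleet. The fleet size below is a made-up placeholder; only the 73 percent and 53 percent capacity factors come from the figures cited above, and nothing here is drawn from the M.J. Bradley report itself.

```python
# Illustrative sketch only: how reduced utilization of a fixed coal fleet cuts
# generation, independent of any retirements. The fleet capacity is a made-up
# placeholder; only the capacity factors (73% in 2008, 53% in 2016) come from
# the report figures quoted above.

HOURS_PER_YEAR = 8760

def annual_generation_mwh(capacity_mw: float, capacity_factor: float) -> float:
    """Annual energy output (MWh) for a fleet at a given average capacity factor."""
    return capacity_mw * capacity_factor * HOURS_PER_YEAR

fleet_mw = 100_000  # hypothetical surviving coal fleet, in MW (placeholder)

gen_2008 = annual_generation_mwh(fleet_mw, 0.73)
gen_2016 = annual_generation_mwh(fleet_mw, 0.53)
lost = gen_2008 - gen_2016

print(f"Generation lost to reduced utilization alone: {lost / 1e6:.0f} million MWh")
print(f"Relative decline from utilization alone: {lost / gen_2008:.0%}")
```

The point of the sketch is simply that a 20-percentage-point drop in capacity factor shaves more than a quarter off the fleet’s output even if not a single plant retires, which is consistent with the report’s attribution of most of the decline to reduced utilization.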

Implications of coal retirements for electric grid reliability

As coal plants retire and are replaced by newer, cleaner resources, there have been concerns about potential impacts on the reliability of our electric grid. (Those concerns were also the topic of DOE’s baseload study.)

M.J. Bradley examined the implications of coal retirements and the evolving resource mix, looking at extensive existing research including their own reliability report released earlier this year.

These studies conclude that electric reliability remains strong.

These studies also found that flexible approaches to grid management, and new technologies such as electric storage, are providing additional tools to support and ensure grid reliability.

In order to understand that conclusion, consider two factors that are used to assess reliability:

  • Resource adequacy, which considers the availability of resources to meet future demand, and is assessed using metrics such as reserve margins
  • Operational reliability, which considers the ability of grid operators to run the system in real-time in a secure way to balance supply and demand – and is defined in terms of Essential Reliability Services, such as frequency and voltage support and ramping capability.

As many studies have already indicated, “baseload” is an outdated term used historically to describe the way resources were being used on the grid – not to describe the above factors that are needed to maintain grid reliability.

Here is what M.J. Bradley’s report and other assessments tell us about the implications of the evolving resource mix for grid reliability:

There are no signs of deteriorating reliability on the grid today, and studies indicate continued growth in clean resources is fully compatible with continued reliability

In its 2017 State of Reliability report, the North American Electric Reliability Corporation (NERC) found that over the past five years the trends in planning reserve margins were stable while other reliability metrics were either improving, stable, or inconclusive.

NERC’s report also found that bulk power system resiliency to severe weather continues to improve.

According to a report by grid operator PJM, which has recently experienced both significant coal retirements and new deployment of clean energy resources:

[T]he expected near-term resource portfolio is among the highest-performing portfolios and is well equipped to provide the generator reliability attributes.

DOE’s own baseload study acknowledges that electric reliability remains strong.  A wide range of literature further indicates that high renewable penetration futures are possible without compromising grid reliability.

Cleaner resources and new technologies being brought online help strengthen reliability

Studies show that technologies being added to the system have, in combination, most if not all of the reliability attributes provided by retiring coal-fired generation and other resources exiting the system.

In fact, the evolving resource mix that includes retirement of aging capacity and addition of new gas-fired and renewable capacity can increase system reliability from a number of perspectives. For instance, available data indicates that forced and planned outage rates for renewable and natural gas technologies can be less than half of those for coal.

Studies also highlight the valuable reliability services that emerging new technologies, such as electric storage, can provide. Renewable resources and emerging technologies also help hedge against fuel supply and price volatility, contributing to resource diversity and increased resilience.

Clean energy resources have demonstrated their ability to support reliable electric service at times of severe stress on the grid.

In the 2014 polar vortex, for example, frozen coal stockpiles led to coal generation outages – so wind and demand response resources were increasingly relied upon to help maintain reliability.

And just last year, close to 100 megawatts of electric storage was successfully deployed in less than six months to address reliability concerns stemming from the Aliso Canyon natural gas storage leak in California.

Regulators and grid operators can leverage the reliability attributes of clean resources and new technologies through improved market design

A 2016 report by DOE found that cleaner resources and emerging new technologies are creating options and opportunities, providing a new toolbox for maintaining reliability in the modern power system.

The Federal Energy Regulatory Commission (FERC) has long recognized the valuable grid services that emerging new technologies could provide – from its order on demand response to its order on frequency regulation compensation, FERC recognized the value of fast and accurate response resources in cost-effectively meeting grid reliability needs. More recently, FERC’s ancillary service reforms recognize that, with advances in technologies, variable energy resources such as wind are increasingly capable of providing reliability services such as reactive power.

Grid operators are also recognizing the valuable contributions of cleaner resources and emerging new technologies, as well as the importance of flexibility to a modern, nimble, dynamic and robust grid. For instance, both the California Independent System Operator and the Midcontinent Independent System Operator (MISO) have created ramp products, and MISO also has a dispatchable intermittent resource program.

It will be increasingly important for regulators, system planners, and grid operators to continue assessing grid reliability needs, and leveraging the capabilities of new technologies and technological advancements, in the future. It is also important to continue market design and system operation and coordination efforts to support the emerging needs of a modern 21st century electric grid.

The facts show clearly that we shouldn’t accept fearmongering that threatens our clean air safeguards. Instead, working together, America can have clean, healthy air and affordable, reliable electricity.


EPA Safeguards and the Arkema Chemical Plant Disaster – Information You Should Know

By Elena Craft, PhD

Hurricane Harvey over the Gulf of Mexico. Photo: U.S. Department of the Interior

(This post was co-authored by EDF’s Peter Zalzal)

Like many Americans, we’ve been closely following the story about the Arkema chemical plant that was flooded when Hurricane Harvey hit Texas. The resulting explosions there have added a horrifying new dimension to the tragic events in the greater Houston area.

Here’s more information that you might want to know.

The Arkema chemical facility in Crosby, Texas has had previous health and safety violations and has been the subject of enforcement actions.

The Arkema Crosby chemical facility has been the subject of at least two enforcement actions by the Texas Commission on Environmental Quality.

  • In 2006, the facility was subject to penalties because of a fire due to inappropriately stored organic peroxides. The fire led to the discharge of 3,200 pounds of volatile organic compounds along with other harmful pollutants.
  • In 2011, the facility was subject to penalties for failure to maintain proper temperatures of the thermal oxidizer.

Gina McCarthy, EPA Administrator under President Obama, strengthened the standards governing preparedness for chemical releases during emergency situations.

In January of 2017, EPA Administrator Gina McCarthy strengthened key provisions of the Accident Release Prevention / Risk Management Program. Those provisions are designed to help prevent and mitigate chemical accidents. The changes included more protective accident prevention program requirements, emergency response enhancements, and enhanced public transparency and availability of information.

Some of these key improvements, which are jointly known as the “Chemical Disaster Rule,” are summarized below (the final rule is at 82 Fed. Reg. 4594.) These protections were slated to take legal effect on March 14, 2017, and the rule required phased-in compliance with its provisions over the next several years. The rule requirements differ depending on whether the facility is classified as Program 1, 2, or 3, with more rigorous and focused requirements applying to Program 3 facilities due to the types of processes at the facility. The Arkema Crosby plant is a Program 3 facility.

  • Accident Prevention Program Improvement
    • Root Cause analysis: The final rule requires Program 2 or 3 facilities to conduct a “root cause analysis” as part of an incident investigation of a “catastrophic release.” The analysis is meant to look beyond immediate causes to help prevent future disasters by uncovering underlying causes in an incident investigation.
    • Third Party Audit: The rule requires Program 2 or 3 facilities to conduct independent third party audits, or to assemble an audit team led by an independent third party auditor, to perform a compliance audit after a reportable accident. Previously, facilities were allowed to perform self-audits. The revision “is intended to reduce the risk of future accidents by requiring an objective auditing process to determine whether the owner or operator of the facility is effectively complying with the accident prevention procedures and practices.” (82 Fed. Reg. at 4,595)
    • Safer Technology Alternatives Analysis: For Program 3 facilities, the rule requires a Safer Technology Alternatives Analysis, including an evaluation of the practicability of any inherently safer technology identified.
  • Emergency Response Enhancements
    • The final rule requires all covered facilities to coordinate with local emergency response agencies at least once per year to determine how the facility is addressed in the community emergency response plan, and to ensure that local response organizations are aware of the regulated substances at the facility, their quantities, the risks presented by covered processes, and the resources and capabilities at the facility to respond to an accidental release of a regulated substance. (82 Fed. Reg. at 4,595)
    • The rule also requires Program 2 or 3 facilities to conduct notification exercises to ensure that emergency contact information is accurate and complete, and that certain facilities conduct field or tabletop exercises. From the final rule: “Improved coordination with emergency response personnel will better prepare responders to respond effectively to an incident and take steps to notify the community of appropriate actions, such as shelter in place.” (82 Fed. Reg. at 4,595)
  • Enhanced Availability of Information
    • “The rule requires all facilities to provide certain basic information to the public, upon request. The owner or operator of the facility shall provide ongoing notification of availability of information elements on a company website, social media platforms, or through some other publicly accessible means.” (82 Fed. Reg. at 4,596)

Arkema and its industry trade organization, the American Chemistry Council, filed comments objecting to several of these key improvements.

Arkema filed adverse comments on the proposed improvements to the Chemical Disaster Rule, and also endorsed comments filed by the American Chemistry Council (Arkema is a member company of ACC).

Arkema objected to the third-party audit procedure, objected to the safer technology alternatives analysis as burdensome, and expressed concerns about the requirements to share certain information with emergency responders and the public.

Scott Pruitt immediately obliged and suspended the Chemical Disaster Rule improvements.

One of the immediate actions taken by Trump Administration EPA head Scott Pruitt was to suspend these key improvements to the Risk Management Program.

On February 28, 2017, an industry coalition including the American Chemistry Council, the American Petroleum Institute, the U.S. Chamber of Commerce, and the Utility Air Regulatory Group asked EPA to reconsider the Chemical Disaster Rule.

Administrator Pruitt quickly obliged by convening a reconsideration proceeding on March 13, 2017 and suspending the Rule for 90 days on March 16, 2017. Both of these initial actions to halt the rule took place without any public process, a pattern that has continued in many of Pruitt’s actions as EPA Administrator.

Subsequently, on June 14, 2017, Pruitt issued a rule suspending the requirements until February of 2019. Pruitt’s decision to suspend these protections disrupted the implementation of the rule.

Administrator Pruitt’s suspension is now being challenged in the U.S. Court of Appeals for the D.C. Circuit, with a preliminary decision yesterday denying the petitioners’ motion for a stay but granting expedited briefing on the merits. Air Alliance Houston is one of the organizations challenging Pruitt’s damaging actions. 

A closer look at the Arkema facility in Crosby, Texas.

The Arkema facility in Crosby, Texas is a Program 3 facility and is required to submit a Risk Management Plan under the Chemical Disaster Rule.

EPA’s Envirofacts webpage for the facility notes that the last plan was submitted in June 2014, pursuant to the less stringent requirements that were then in place.

EPA does not post facilities’ Risk Management Plans online, but they are available for review in federal reading rooms. On August 31, 2017, EDF examined the 2014 Risk Management Plan for the Arkema facility. According to Arkema’s documents on file:

  • The Arkema facility manufactures liquid organic peroxides, which are primarily used in the production of plastic resins, polystyrene, polyethylene, polypropylene, PVC, and fiberglass.
  • There are two substances on site that are present at or above the minimum threshold quantities for a Risk Management Plan – 85,256 pounds of 2-methylpropene (a flammable substance), and 66,260 pounds of sulfur dioxide (a toxic substance). Both are present in levels that make the facility subject to Program 3 requirements.
  • The site conducted a process hazard analysis on October 31, 2013 and indicated that any errors identified would be corrected by October 30, 2015. The 2013 hazard analysis identified concerns including equipment failure; loss of cooling, heating, or electricity; floods (the site is in a flood plain); hurricanes; and other major failures such as a power failure or power surge.

There have now been explosions reported at the Arkema facility and 15 police officers were taken to the hospital after inhaling fumes from the chemical plant. Because of limited air monitors operating in the region, we do not know the pollutants or their concentrations in the surrounding air.

EPA Administrator Scott Pruitt has led an unprecedented rollback of public health and environmental safeguards for our communities and families.

This is one of many damaging actions by EPA Administrator Pruitt to roll back fundamental safeguards under our health and environmental laws. Pruitt’s actions imperil our communities and families, and increase risks across our nation.

The explosion at the Crosby chemical facility is a terrible tragedy. It is incumbent on those who manufacture and use these dangerous chemicals — and it is the solemn duty of policymakers entrusted with protecting the public – to carry out their responsibilities under our nation’s public health and environmental laws to protect all Americans.

EDF is urging EPA Administrator Pruitt to immediately reinstate the critical Chemical Disaster Rule safeguards that he has suspended, and we are asking all Americans to join us. Please contact EPA and tell them you support these protections.


Pruitt six months in: “taking a meat ax to the protections of public health and environment and then hiding it”

By Martha Roberts

In Scott Pruitt’s six-month tenure as President Trump’s EPA Administrator, his administration has firmly established a reputation for secrecy and for glossing over conflicts of interest.  

This pattern of making decisions behind closed doors and stocking EPA with industry representatives is problematic for many reasons, but most importantly because so many of those decisions are putting our health at risk.

Former EPA Administrator Bill Ruckelshaus — appointed by Presidents Nixon and Reagan —described Pruitt’s tenure thus far:

[I]t appears that what is happening now is taking a meat ax to the protections of public health and environment and then hiding it.

Pruitt’s troubling pattern of behavior has even caught the interest of the EPA’s Inspector General, who recently opened an investigation into Pruitt’s repeated travel to Oklahoma at taxpayers’ expense. And one of Pruitt’s handpicked appointees, Albert Kelly, was just penalized by a federal banking agency for “unsound practices” in his previous position as a bank CEO.

Weakening safeguards across the board

As we’ve documented, Pruitt has a troubling record of attacking public safeguards without providing any opportunity for public input – including protections against toxic wastewater, oil and gas pollution, climate pollution, and safety risks at major chemical facilities.

Pruitt took aim at limits on smog that would prevent 230,000 childhood asthma attacks every year. He tried to unilaterally delay these standards without any public input on his decision, until eventually he backed down in the face of legal and public backlash.

Pruitt also suspended enforcement of existing standards for pollution from oil and gas facilities without any public input. Pruitt’s announcement did not even mention the harmful health impacts of halting implementation of pollution controls for 18,000 wells across the country. Earlier this month a federal appeals court overwhelmingly rejected Pruitt’s move as illegal, following a panel decision that deemed his actions “unlawful,” “arbitrary,” and “capricious.”

Undermining enforcement that holds polluters accountable 

A recent analysis of EPA’s enforcement program showed that penalties against polluters have dropped by a remarkable 60 percent since the Inauguration. Not holding companies responsible for their pollution has tangible impacts in the form of more pollution, more illness, and more avoidable, early deaths.

The Trump Administration’s proposed budget calls for a 40 percent cut to EPA’s enforcement office, which would further hamper EPA’s ability to hold polluters accountable. Meanwhile, EPA overall would face a 30 percent cut, which also puts public health at risk.

Pruitt sometimes tries to mask his focus on rolling back important EPA initiatives. For example, he claims to be concentrating on cleaning up contaminated land through EPA’s Superfund program, yet the Trump Administration’s budget proposal would cut Superfund by more than 30 percent.

Pervasive conflicts of interest

In Pruitt’s former role as Oklahoma Attorney General, he was exposed for cutting and pasting industry requests and sending them to EPA on his official stationery. He shamelessly responded by calling his conduct “representative government in my view.”

At EPA, Pruitt and his most senior advisors are now driving vital decisions about public health notwithstanding clear, severe conflicts of interest.

As just one example, Dr. Nancy Beck, the senior political appointee in EPA’s toxic chemicals office, recently left her prior position at the chemical industry’s main trade association. In her current role at EPA, she has a key role in implementing the new reforms to the Toxic Substances Control Act passed last year. In this capacity, Dr. Beck is making decisions that directly affect the financial interests of companies she represented in her previous position, on issues for which she advocated on the chemical industry’s behalf as recently as earlier this year. The unsurprising result? Important protections are being weakened or reversed.

Pruitt’s lax approach to ethics may also extend to his travel schedule. Pruitt’s travel records show that he traveled repeatedly to Oklahoma at taxpayer expense, straining EPA’s limited resources. (Some sources have speculated that Pruitt’s extensive travel may be a run-up to a future Pruitt campaign for political office in Oklahoma.) As we mentioned at the beginning of this post, EPA’s Inspector General has now opened an investigation into the matter.

Pruitt’s appointment of Albert Kelly is another example of how he seems to tolerate behavior that other administrations would find unacceptable. Pruitt appointed the former banking CEO to lead a task force on Superfund cleanup sites. As we mentioned earlier, just this week Kelly was sanctioned by the FDIC, which issued a lifetime bar against his participation in any future banking-related activities and noted violations that involved Kelly’s “willful or continuing disregard for the safety or soundness of the bank” where he was CEO. Nonetheless, Pruitt continues to entrust Kelly with the responsibility for leading efforts to reform management of the billion-dollar hazardous waste clean-up program.

Pruitt’s pattern of secrecy

This summer Pruitt won the Golden Padlock Award, given by the group Investigative Reporters and Editors to recognize the most secretive U.S. agency or individual.

Robert Cribb, chair of the Golden Padlock committee, noted:

Judges were impressed with the breadth and scope of Pruitt’s information suppression techniques around vital matters of public interest.

Pruitt has overseen the elimination of important climate science resources that EPA previously made publicly available on its website. EDF recently received more than 1,900 items from EPA in response to a Freedom of Information Act request for climate-related information and data deleted from, or modified on, EPA websites.

Even the basics of how Pruitt spends his business hours, and with whom he spends them, are hidden from the public. Contravening a bipartisan EPA transparency practice, Pruitt no longer makes senior management calendars — including his own — available to the public. The website comparison below highlights this sudden change:

[Screenshot comparison: EPA’s website on January 19, 2017, and the same page today]

The start of Scott Pruitt’s term as EPA Administrator has been marked by continuous attacks on our public health safeguards and government transparency. Perhaps it’s not a surprise that Pruitt is keeping Americans in the dark about his actions, because the more we learn, the more we see reasons to be outraged. The American public deserves better from the senior leader in charge of protecting our health and welfare from dangerous pollution.


New Pew/RWJF report rigorously evaluates options and recommends 10 policies

By Tom Neltner

Tom Neltner, J.D., Chemicals Policy Director

For the past two years, the issue of lead – in paint, water, dust, soil, food, toys, and kids’ blood – has been extensively covered in the news. The crises in Flint and East Chicago have laid bare the vulnerability of communities across the U.S. The evidence is now clear that there is no safe level of lead in children’s blood: studies show that blood lead levels below those once considered acceptable affect brain development. What used to be tolerable is no longer acceptable, and we must be vigilant to prevent young children’s exposure to lead.

We have already made substantial progress as a nation. From 1999 to 2014, mean blood lead levels in young children dropped 56%, and the share of children with levels over 5 micrograms of lead per deciliter of blood dropped 86%. This change was due to smart policies, effective regulations, funding, and vigilance from federal, state and local agencies as well as private and non-profit organizations. Despite this headway, lead exposure continues to be a significant problem, preventing our communities from thriving and holding back future generations from achieving their full potential.

Last year, several organizations developed comprehensive plans[1] to eliminate lead exposure. Each added value to the discussion. Today, a new report from the Health Impact Project, a collaboration of The Pew Charitable Trusts and Robert Wood Johnson Foundation (RWJF), provides a rigorous analysis of the costs of lead and the impact of various policy solutions to help protect children from the harms of lead exposure. My colleague, Ananya Roy, and I served as advisors on the project.

The Pew/RWJF report found that no single source of lead exposure predominates and that a comprehensive response is needed to continue to make progress on protecting children from lead. The report estimates that if the blood lead levels of babies born in 2018 were kept at zero micrograms per deciliter, the benefits would amount to $84 billion, excluding the cost of intervening. It also makes five key findings:

  1. Removing leaded drinking water service lines from the homes of children born in 2018 would protect more than 350,000 children and yield $2.7 billion in future benefits, or about $1.33 per dollar invested.
  2. Eradicating lead paint hazards from older homes of children from low-income families would provide $3.5 billion in future benefits, or approximately $1.39 per dollar invested, and protect more than 311,000 children.
  3. Ensuring that contractors comply with the Environmental Protection Agency’s rule that requires lead-safe renovation, repair, and painting practices would protect about 211,000 children born in 2018 and provide future benefits of $4.5 billion, or about $3.10 per dollar spent.
  4. Eliminating lead from airplane fuel would protect more than 226,000 children born in 2018 who live near airports, generate $262 million in future benefits, and remove roughly 450 tons of lead from the environment every year.
  5. Providing targeted, evidence-based academic and behavioral interventions to the roughly 1.8 million children with a history of lead exposure could increase their lifetime family incomes and likelihood of graduating from high school and college and decrease their potential for teen parenthood and criminal conviction.

Collectively, the federal, state and local governments would receive an estimated $3.2 billion in benefits from the first three actions through education savings and increased revenues. The Pew/RWJF Report describes 10 policies to provide a comprehensive strategic response to reduce harm from lead. Each policy recommendation includes more details on the federal, state and local actions that need to be undertaken.
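As a rough, illustrative check on the benefit-per-dollar figures in findings 1 through 3 above, the short Python sketch below back-calculates the implied investment and net benefit for each action. The benefit totals and ratios come from the text; the division is ours, not a figure published in the Pew/RWJF report.

```python
# Illustrative only: back-calculating the implied investment behind the
# benefit-per-dollar ratios quoted in findings 1-3 above. The benefit totals
# and ratios come from the text; the implied costs are simple division, not
# figures from the Pew/RWJF report itself.

findings = {
    "Replace lead service lines":     {"benefits_usd": 2.7e9, "benefit_per_dollar": 1.33},
    "Remove lead paint hazards":      {"benefits_usd": 3.5e9, "benefit_per_dollar": 1.39},
    "Enforce renovation/repair rule": {"benefits_usd": 4.5e9, "benefit_per_dollar": 3.10},
}

for name, f in findings.items():
    implied_cost = f["benefits_usd"] / f["benefit_per_dollar"]
    net_benefit = f["benefits_usd"] - implied_cost
    print(f"{name}: ~${implied_cost / 1e9:.1f}B invested, "
          f"~${net_benefit / 1e9:.1f}B net benefit")
```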

Priority Sources

  1. Reduce lead in drinking water in homes built before 1986 and other places children frequent.
  2. Remove lead paint hazards from low-income housing built before 1960 and other places children spend time.
  3. Increase enforcement of the federal renovation, repair, and painting rule.

Additional Sources

  1. Reduce lead in food and consumer products.
  2. Reduce air lead emissions.
  3. Clean up contaminated soil.

Poisoning Response

  1. Improve blood lead testing among children at high risk of exposure and find and remediate the sources of their exposure.
  2. Ensure access to developmental and neuropsychological assessments and appropriate high-quality programs for children with elevated blood lead levels.

Data and Research

  1. Improve public access to local data.
  2. Fill gaps in research to better target state and local prevention and response efforts.

This report comes just in time for the federal government to update its 2000 strategy to eliminate childhood lead poisoning. In May 2017, the U.S. Environmental Protection Agency (EPA) reported that the President’s Task Force on Environmental Health Risks and Safety Risks to Children, co-chaired by EPA and the Department of Health and Human Services, is “developing an updated federal strategy to address lead risks to children from a variety of sources.” This effort was launched in November 2016 with an inventory of key federal programs to reduce childhood lead exposure. Past progress shows that sound policies can have an impact, and the Pew/RWJF report shows there is much more that can be done. We hope the task force will take its recommendations to heart as it moves forward with the updated strategy.

 

[1] See Coalition of 49 Health, Environmental & Children’s Organizations, Call for National Strategy to End Lead Poisoning and Lead Exposure (October 2016); Green and Healthy Homes Initiative’s Strategic Plan to End Childhood Lead Poisoning (October 2016); National Safe and Healthy Housing Coalition’s Find It, Fix It, Fund It Campaign (December 2016); and National Lead Summit’s Playbook to End Lead Poisoning in 5 Years (March 2017).


California’s new methane leakage requirements for gas utilities are already delivering benefits

By Renee McVay

EDF Schneider fellow Scott Roycroft co-authored this post

California’s gas utilities have had their share of problems in recent years – so improvements in environmental impacts, operations, and safety are important to track.

In 2014, the California legislature passed a law requiring utility companies to publicly disclose data on gas leaks and emissions while working to actually cut those emissions. Now, three years later, utility reporting has been standardized, an emissions trend has emerged, and the results are significant.

Graphic 1: A depiction of the volume of methane emissions from California utilities between 2015 and 2016. Emissions from the Aliso Canyon blowout are shown as a separate category.

According to the emissions data, from 2015 to 2016 both Pacific Gas and Electric Company (PG&E) and Southern California Gas Company (SoCalGas) reduced their total annual emissions of leaked and vented gas. PG&E led the way with an 11% reduction in natural gas methane emissions, while SoCalGas, the nation’s largest gas utility, reduced its emissions by 3%. Together, the reductions from these two utilities are equal to nearly 700,000 metric tons of CO2e on a 20-year basis.
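
A quick sketch, not from the original post, of how a methane reduction maps onto that CO2e figure, assuming the commonly cited 20-year global warming potential for methane of roughly 84:

    # Converting the cited CO2e reduction back into metric tons of methane.
    # Assumption: a 20-year global warming potential (GWP20) for methane of ~84 (IPCC AR5).
    GWP20_METHANE = 84
    co2e_reduction_t = 700_000                       # metric tons CO2e, as cited above
    implied_ch4_t = co2e_reduction_t / GWP20_METHANE
    print(f"~{implied_ch4_t:,.0f} metric tons of methane avoided")  # -> roughly 8,300 t CH4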

Even with these reductions, though, there is still much room for improvement. Excluding emissions from the Aliso Canyon blowout, natural gas utilities across California emitted about 6.2 billion cubic feet of methane, enough to supply natural gas to over 165,000 California homes for a year. This is gas that customers pay for but that is never delivered, also referred to as Lost and Unaccounted for Gas. The top three sources of emissions are customer meter leaks, distribution pipeline leaks, and distribution station leaks. Together, these three sources account for 73% of total statewide utility emissions, and each has available solutions to reduce its pollution.
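
As a sanity check on the homes comparison (again, not part of the original post), dividing the statewide leakage volume by the number of homes gives the annual per-home gas use implied by that statement:

    # Implied annual per-home gas use behind the "over 165,000 homes" comparison.
    statewide_leakage_cf = 6.2e9       # cubic feet of methane emitted, excluding Aliso Canyon
    homes_served = 165_000
    cf_per_home = statewide_leakage_cf / homes_served
    print(f"~{cf_per_home:,.0f} cubic feet per home per year")      # -> roughly 37,600 cf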

Graphic 2: A depiction of the source categories of methane emissions from California utilities in 2016. Note: emissions from the Aliso Canyon blowout are not included in this analysis.

A red flag on Grade 3 pipeline leaks

About a quarter of the state’s natural gas emissions from utility systems come from distribution pipelines – and more than half of pipeline leaks are what companies classify as “Grade 3” leaks. The reason so many of these leaks linger: because Grade 3 leaks are technically non-hazardous, companies have not, until now, been required to repair them, regardless of size. The reported leaks break down by age as follows:

Leak age (years)    Total leaks
Less than 5              18,333
5-10                      2,279
10-20                       468
20+                          51

 

Looking at discovery dates for Grade 3 leaks shows just how long companies often take to repair them; according to the data, some of these leaks were discovered in the late 1980s and still have not been repaired. Although some utilities have signaled they are committed to repairing these older leaks, real and sustained action is needed to ensure continued abatement of all Grade 3 leaks – action that is now required under the state’s new leakage abatement program.

Leak information can be correlated to pipeline materials

Public utility companies keep detailed records of the leaks they discover, including date of discovery, geographic location, pipeline material, and pipeline pressure. With the new reporting by utilities, the public has access to information on how leakage correlates with pipe material (for example, older cast iron pipes are more leak-prone than newer plastic pipes) and other characteristics.

New technologies find a greater number of gas leaks

The data also reveals that some technologies are more effective than others at finding gas leaks. PG&E uses advanced leak detection technology to locate a large number of Grade 3 leaks, whereas other utilities do not. This partially explains why PG&E is registering more leaks on its system today than in prior years, and it is also likely part of why PG&E may be seeing larger emission reductions than other utilities. According to recent analyses, leak discovery is expected to increase in coming years as these technologies are more widely adopted.

What can be learned from California’s leak data

Requiring companies to report gas leaks has been instrumental in increasing transparency and offers valuable insight into the tools and practices that can deliver the biggest emission reductions. It also helps utility customers and consumer advocates learn more about the gas that customers pay for but that is emitted into the atmosphere.

As a result of this data, in June 2017 the California Public Utilities Commission (CPUC) began requiring companies to implement 26 best practices for reducing emissions, including targeting and scheduling Grade 3 leaks for repair. Once these practices are fully implemented, utilities are expected to reduce methane emissions 40% by 2030.

It’s clear that better leak reporting is a critical part of reducing natural gas emissions. By requiring companies to disclose leak data, California is once again demonstrating what climate change leadership looks like and setting a powerful example that other states can follow.

 


Torrential rains and violent storm surge: Why hurricane impacts are getting worse

By Scott Weaver

Image credit: Wikimedia

(This post originally appeared on EDF Voices)

As Hurricane Harvey barreled toward the coast of Texas last week with increasing intensity, forecasters were issuing dire warnings about life-threatening storm surge and torrential rain in addition to the dangerous winds that hurricanes bring.

It was no coincidence. As our climate warms, we’re experiencing ever-more devastating storm surges and record rainfalls during hurricane season – which is also why these storms are becoming more destructive and costly.

Evaporation means storms carry more water

Harvey, which formed quickly in an abnormally warm Gulf of Mexico, is dumping historic amounts of rain – 30-plus inches in the Houston area so far – with more expected, leading to catastrophic flooding in America’s fourth largest city.

So why do hurricanes bring more rain in a warmer climate? Evaporation intensifies as temperatures rise, increasing the amount of water vapor that storms pull into their systems as they travel across warm oceans. That makes for higher rainfall and more flooding when they hit land.
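
A rough quantification of this mechanism, which is not in the original post, comes from the Clausius-Clapeyron relation: the atmosphere’s capacity to hold water vapor grows by roughly 6–7% for each degree Celsius of warming. A minimal sketch, using standard approximate values for the latent heat of vaporization and the gas constant of water vapor:

    # Clausius-Clapeyron scaling of saturation vapor pressure with temperature.
    import math

    L_VAP = 2.5e6    # J/kg, latent heat of vaporization of water (approximate)
    R_V = 461.0      # J/(kg K), specific gas constant for water vapor

    def saturation_ratio(t0_c, t1_c):
        """Ratio of saturation vapor pressure at t1_c relative to t0_c."""
        t0, t1 = t0_c + 273.15, t1_c + 273.15
        return math.exp((L_VAP / R_V) * (1.0 / t0 - 1.0 / t1))

    increase_pct = (saturation_ratio(25.0, 26.0) - 1.0) * 100
    print(f"~{increase_pct:.1f}% more water-vapor capacity per +1 degree C")  # -> about 6-7%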

Unfortunately for Texas, Harvey has stalled out as a tropical storm, now drenching parts of Texas and Louisiana.

Sea level rise makes storm surges worse

Storm surge occurs when waters rise above their normal levels and are pushed inland by wind.

With Katrina, which hit land as a Category 3 hurricane, it was the storm surge that caused the levees to fail, leading to widespread destruction in the New Orleans area. Storm surge was also responsible for an extra $2 billion in damage to New York City after Sandy hit that area in 2012, according to a RAND report.

This worsening phenomenon is due, in large part, to sea level rise, which is driven by human-caused global warming as warmer ocean water expands and land ice melts. The average global sea level has already risen by more than half a foot since the Industrial Revolution.

Storm-related flooding is on the rise

The devastating flooding we’re seeing in Houston is unusual because of its scale, but heavy rains and bad flooding are becoming the new normal in parts of our country as temperatures rise. Intense single-day rain events that cause flooding are on the rise.

Historical weather data going back to 1910 shows that in the contiguous 48 states, nine of the top 10 years for extreme one-day rain events have occurred since 1990.

We don’t yet know what kind of damage Harvey or future hurricanes will cause. But they should serve as a reminder that today, more than ever, we need to be guided by science as we prepare for, and act on, climate change.
