Pruitt six months in: “taking a meat ax to the protections of public health and environment and then hiding it”

By Martha Roberts

In Scott Pruitt’s six-month tenure as President Trump’s EPA Administrator, he has firmly established a reputation for secrecy and for glossing over conflicts of interest.

This pattern of making decisions behind closed doors and stocking EPA with industry representatives is problematic for many reasons, but most importantly because so many of those decisions are putting our health at risk.

Former EPA Administrator Bill Ruckelshaus — appointed by Presidents Nixon and Reagan — described Pruitt’s tenure thus far:

[I]t appears that what is happening now is taking a meat ax to the protections of public health and environment and then hiding it.

Pruitt’s troubling pattern of behavior has even caught the interest of the EPA’s Inspector General, who recently opened an investigation into Pruitt’s repeated travel to Oklahoma at taxpayers’ expense. And one of Pruitt’s handpicked appointees, Albert Kelly, was just penalized by a federal banking agency for “unsound practices” in his previous position as a bank CEO.

Weakening safeguards across the board

As we’ve documented, Pruitt has a troubling record of attacking public safeguards without providing any opportunity for public input – including protections against toxic wastewater, oil and gas pollution, climate pollution, and safety risks at major chemical facilities.

Pruitt took aim at limits on smog that would prevent 230,000 childhood asthma attacks every year. He tried to unilaterally delay these standards without any public input on his decision, until eventually he backed down in the face of legal and public backlash.

Pruitt also suspended enforcement of existing standards for pollution from oil and gas facilities without any public input. His announcement did not even mention the harmful health impacts of halting pollution controls for 18,000 wells across the country. Earlier this month a federal appeals court overwhelmingly rejected the move, issuing a panel decision that deemed Pruitt’s actions “unlawful,” “arbitrary,” and “capricious.”

Undermining enforcement that holds polluters accountable 

A recent analysis of EPA’s enforcement program showed that penalties against polluters have dropped by a remarkable 60 percent since the Inauguration. Not holding companies responsible for their pollution has tangible impacts in the form of more pollution, more illness, and more avoidable, early deaths.

The Trump Administration’s proposed budget calls for a 40 percent cut to EPA’s enforcement office, which would further hamper EPA’s ability to hold polluters accountable. Meanwhile, EPA overall would face a 30 percent cut, which also puts public health at risk.

Pruitt sometimes tries to mask his focus on rolling back important EPA initiatives. For example, he claims to be concentrating on cleaning up contaminated land through EPA’s Superfund program, yet the Trump Administration’s budget proposal would cut Superfund by more than 30 percent.

Pervasive conflicts of interest

In Pruitt’s former role as Oklahoma Attorney General, he was exposed for cutting and pasting industry requests and sending them to EPA on his official stationery. He shamelessly responded by calling his conduct “representative government in my view.”

At EPA, Pruitt and his most senior advisors are now driving vital decisions about public health notwithstanding clear, severe conflicts of interest.

As just one example, Dr. Nancy Beck, the senior political appointee in EPA’s toxic chemicals office, recently arrived from her prior position at the chemicals industry’s main trade association. At EPA, she plays a key role in implementing the reforms to the Toxic Substances Control Act passed last year. In this capacity, Dr. Beck is making decisions that directly affect the financial interests of companies she represented in her previous position, on issues where she advocated for the chemical industry as recently as this year. The unsurprising result? Important protections are being weakened or reversed.

Pruitt’s lax approach to ethics may also extend to his travel schedule. His travel records show repeated trips to Oklahoma at taxpayer expense, straining EPA’s limited resources. (Some sources have speculated that the extensive travel may be a run-up to a future Pruitt campaign for political office in Oklahoma.) As mentioned at the beginning of this post, EPA’s Inspector General has now opened an investigation into the matter.

Pruitt’s appointment of Albert Kelly is another example of how he seems to tolerate behavior that other administrations would find unacceptable. Pruitt appointed the former banking CEO to lead a task force on Superfund cleanup sites. As we mentioned earlier, just this week Kelly was sanctioned by the FDIC, which issued a lifetime bar against his participation in any future banking-related activities and noted violations that involved Kelly’s “willful or continuing disregard for the safety or soundness of the bank” where he was CEO. Nonetheless, Pruitt continues to entrust Kelly with the responsibility for leading efforts to reform management of the billion-dollar hazardous waste clean-up program.

Pruitt’s pattern of secrecy

This summer Pruitt won the Golden Padlock Award, given by Investigative Reporters and Editors to recognize the most secretive U.S. agency or individual.

Robert Cribb, chair of the Golden Padlock committee, noted:

Judges were impressed with the breadth and scope of Pruitt’s information suppression techniques around vital matters of public interest.

Pruitt has overseen the elimination of important climate science resources that EPA previously made publicly available on its website. EDF recently received more than 1,900 items from EPA in response to a Freedom of Information Act request for climate-related information and data deleted from, or modified on, EPA websites.

Even the basics of how Pruitt spends his business hours, and with whom he spends them, are hidden from the public. Contravening a bipartisan EPA transparency practice, Pruitt no longer makes senior management calendars — including his own — available to the public. The website comparison below highlights this sudden change:

EPA’s website on January 19, 2017

And the same page today

The start of Scott Pruitt’s term as EPA Administrator has been marked by continuous attacks on our public health safeguards and government transparency. Perhaps it’s not a surprise that Pruitt is keeping Americans in the dark about his actions, because the more we learn, the more we see reasons to be outraged. The American public deserves better from the senior leader in charge of protecting our health and welfare from dangerous pollution.


New Pew/RWJF report rigorously evaluates options and recommends 10 policies

By Tom Neltner

Tom Neltner, J.D., Chemicals Policy Director

For the past two years, the issue of lead – in paint, water, dust, soil, food, toys, and kids’ blood – has been extensively covered in the news. The crises in Flint and East Chicago have laid bare the vulnerability of communities across the U.S. The evidence is now clear that there is no safe level of lead in children’s blood: studies show that lead impairs brain development even at blood levels below those once considered acceptable. What used to be tolerable is no longer acceptable, and we must be vigilant to prevent young children’s exposure to lead.

We have already made substantial progress as a nation. From 1999 to 2014, mean blood lead levels in young children dropped 56%, and the share of children with levels over 5 micrograms of lead per deciliter of blood dropped 86%. This change was due to smart policies, effective regulations, funding, and vigilance from federal, state, and local agencies as well as private and non-profit organizations. Despite this headway, lead exposure continues to be a significant problem, preventing our communities from thriving and holding back future generations from achieving their full potential.

Last year, several organizations developed comprehensive plans[1] to eliminate lead exposure. Each added value to the discussion. Today, a new report from the Health Impact Project, a collaboration of The Pew Charitable Trusts and Robert Wood Johnson Foundation (RWJF), provides a rigorous analysis of the costs of lead and the impact of various policy solutions to help protect children from the harms of lead exposure. My colleague, Ananya Roy, and I served as advisors on the project.

The Pew/RWJF report found that no single source of lead exposure predominates and that a comprehensive response is needed to continue making progress on protecting children from lead. The report estimates that if blood lead levels for babies born in 2018 were kept to zero micrograms per deciliter, the benefits would amount to $84 billion, excluding the cost of intervening. It makes five key findings:

  1. Removing leaded drinking water service lines from the homes of children born in 2018 would protect more than 350,000 children and yield $2.7 billion in future benefits, or about $1.33 per dollar invested.
  2. Eradicating lead paint hazards from older homes of children from low-income families would provide $3.5 billion in future benefits, or approximately $1.39 per dollar invested, and protect more than 311,000 children.
  3. Ensuring that contractors comply with the Environmental Protection Agency’s rule that requires lead-safe renovation, repair, and painting practices would protect about 211,000 children born in 2018 and provide future benefits of $4.5 billion, or about $3.10 per dollar spent.
  4. Eliminating lead from airplane fuel would protect more than 226,000 children born in 2018 who live near airports, generate $262 million in future benefits, and remove roughly 450 tons of lead from the environment every year.
  5. Providing targeted, evidence-based academic and behavioral interventions to the roughly 1.8 million children with a history of lead exposure could increase their lifetime family incomes and likelihood of graduating from high school and college and decrease their potential for teen parenthood and criminal conviction.
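Those benefit-per-dollar figures imply the scale of the underlying investments. A quick back-of-envelope sketch, assuming each ratio is simply future benefits divided by upfront cost (an interpretation of the report’s numbers, not a figure taken from it):

```python
# Back-of-envelope: infer the implied upfront cost behind each finding,
# assuming "benefits per dollar invested" = total benefits / total cost.
findings = {
    "lead service line removal":  (2.7e9, 1.33),  # (future benefits $, ratio)
    "lead paint hazard removal":  (3.5e9, 1.39),
    "renovation rule compliance": (4.5e9, 3.10),
}

for name, (benefits, ratio) in findings.items():
    implied_cost = benefits / ratio
    print(f"{name}: implied investment ≈ ${implied_cost / 1e9:.1f}B")
```

By this reading, the implied investments are roughly $2.0 billion, $2.5 billion, and $1.5 billion, respectively.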

Collectively, the federal, state and local governments would receive an estimated $3.2 billion in benefits from the first three actions through education savings and increased revenues. The Pew/RWJF Report describes 10 policies to provide a comprehensive strategic response to reduce harm from lead. Each policy recommendation includes more details on the federal, state and local actions that need to be undertaken.

Priority Sources

  1. Reduce lead in drinking water in homes built before 1986 and other places children frequent.
  2. Remove lead paint hazards from low-income housing built before 1960 and other places children spend time.
  3. Increase enforcement of the federal renovation, repair, and painting rule.

Additional Sources

  1. Reduce lead in food and consumer products.
  2. Reduce air lead emissions.
  3. Clean up contaminated soil.

Poisoning Response

  1. Improve blood lead testing among children at high risk of exposure and find and remediate the sources of their exposure.
  2. Ensure access to developmental and neuropsychological assessments and appropriate high-quality programs for children with elevated blood lead levels.

Data and Research

  1. Improve public access to local data.
  2. Fill gaps in research to better target state and local prevention and response efforts.

This report comes just in time for the federal government to update its 2000 strategy to eliminate childhood lead poisoning. In May 2017, the U.S. Environmental Protection Agency (EPA) reported that the President’s Task Force on Environmental Health Risks and Safety Risks to Children, co-chaired by EPA and the Department of Health and Human Services, is “developing an updated federal strategy to address lead risks to children from a variety of sources.” This effort was launched in November 2016 with an inventory of key federal programs to reduce childhood lead exposure. Past progress shows that sound policies can have an impact, and the Pew/RWJF report shows there is much more that can be done. We hope the task force will take its recommendations to heart as it moves forward with the updated strategy.


[1] See Coalition of 49 Health, Environmental & Children’s Organizations, Call for National Strategy to End Lead Poisoning and Lead Exposure (October 2016), Green and Health Homes Initiative’s Strategic Plan to End Childhood Lead Poisoning (October 2016), National Safe and Healthy Housing Coalition’s Find It, Fix It, Fund It Campaign (December 2016), and National Lead Summit’s Playbook to End Lead Poisoning in 5 Years (March 2017).


California’s new methane leakage requirements for gas utilities are already delivering benefits

By Renee McVay

EDF Schneider fellow Scott Roycroft co-authored this post

California’s gas utilities have had their share of problems in recent years – so improvements in environmental impacts, operations, and safety are important to track.

In 2014, the California legislature passed a law to require utility companies to publicly disclose data on gas leaks and emissions while working to actually cut those emissions.  Now, three years later, utility reporting has been standardized, an emissions trend has emerged, and the results are significant.

Graphic 1: A depiction of the volume of methane emissions from California utilities between 2015 and 2016. Emissions from the Aliso Canyon blowout are shown as a separate category.

According to the emissions data, from 2015 to 2016 both Pacific Gas and Electric Company (PG&E) and Southern California Gas Company (SoCalGas) reduced their total annual emissions of leaked and vented gas. PG&E led with an 11% reduction in methane emissions, while SoCalGas — the nation’s largest gas utility — cut its emissions by 3%. Together, the reductions from these two utilities equal nearly 700,000 metric tons of CO2e on a 20-year basis.
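For a sense of scale, the CO2e figure can be converted back into tons of methane. A rough sketch, assuming a 20-year global warming potential (GWP20) of 84 for methane, the IPCC AR5 value; the exact factor used in the underlying analysis is not stated in the post:

```python
# Convert the reported CO2-equivalent reduction back to tons of methane.
# GWP20 = 84 is the IPCC AR5 value and an assumption here; the post does
# not state which factor the underlying analysis used.
GWP20_CH4 = 84
co2e_reduction_t = 700_000  # metric tons CO2e, 20-year basis

methane_reduction_t = co2e_reduction_t / GWP20_CH4
print(f"≈ {methane_reduction_t:,.0f} metric tons of methane avoided")
```

That works out to roughly 8,300 metric tons of methane kept out of the atmosphere.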

Even with these reductions, there is still much room for improvement. Excluding emissions from the Aliso Canyon blowout, natural gas utilities across California emitted about 6.2 billion cubic feet of methane — enough to supply natural gas to over 165,000 California homes for a year. This is gas that customers pay for but never receive – also referred to as Lost and Unaccounted for Gas. The top three sources of emissions are customer meter leaks, distribution pipeline leaks, and distribution station leaks. Together, these three sources account for 73% of total statewide utility emissions – and each has solutions to reduce its pollution.
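The homes-served comparison is easy to reproduce. A sketch, assuming average residential gas use of about 37,500 cubic feet per home per year; that per-home figure is the one implied by the post’s own numbers, not an official statistic:

```python
# Reproduce the "over 165,000 homes" comparison from the leaked volume.
# The per-home usage figure is an assumption implied by the post's own
# numbers; actual residential gas use varies widely.
leaked_cf = 6.2e9              # cubic feet of gas emitted statewide (2016)
cf_per_home_per_year = 37_500  # assumed average California household use

homes_served = leaked_cf / cf_per_home_per_year
print(f"≈ {homes_served:,.0f} homes supplied for a year")
```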

Graphic 2: A depiction of the source categories of methane emissions from California utilities in 2016. Note: emissions from the Aliso Canyon blowout are not included in this analysis.

A red flag on Grade 3 pipeline leaks

About a quarter of the state’s natural gas emissions from utility systems come from distribution pipelines – and more than half of pipeline leaks are what companies classify as “Grade 3” leaks. The reason: because Grade 3 leaks are technically non-hazardous, companies have until now not been required to repair them, regardless of size.

Leak age (in years):  Less than 5   5-10    10-20   20+
Total leaks:          18,333        2,279   468     51


Looking at discovery dates for Grade 3 leaks shows just how long companies often take to repair these leaks; according to the data, some were discovered in the late 1980s and still have not been repaired. Although some utilities have signaled commitments to repairing these older leaks, real and sustained action is needed to ensure continued abatement of all Grade 3 leaks – action which is required in the state’s new leakage abatement program.

Leak information can be correlated to pipeline materials

Public utility companies keep detailed records of leaks they discover, including date of discovery, geographic location, pipeline material, and pipeline pressure. With the new reporting by utilities, the public has access to information on how leakage correlates to pipe material (for example, older cast iron pipes are more leak prone than newer plastic pipes) and other qualities.

New technologies find greater number of gas leaks

The data also reveals that some technologies are far more effective than others at finding gas leaks. PG&E uses advanced leak detection technology to locate a large number of Grade 3 leaks, whereas other utilities do not. This partially explains why PG&E is registering more leaks on its system today than in prior years, and is likely also part of why it is achieving larger emission reductions than others. According to recent analyses, leak discovery is expected to increase in coming years as these technologies are more widely adopted.

What can be learned from California’s leak data

Requiring companies to report gas leaks has been instrumental in increasing transparency and sheds valuable insights on the tools and practices that can deliver the biggest emission reductions. And it helps utility customers and consumer advocates learn more about the gas that customers pay for but is emitted into the atmosphere.

As a result of this data, in June 2017 the California Public Utilities Commission (CPUC) began requiring companies to implement 26 best practices for reducing emissions – including targeting and scheduling Grade 3 leaks for repair. Once these practices are fully implemented, utilities are expected to reduce methane emissions by 40% by the year 2030.

It’s clear that better leak reporting is a critical part of reducing natural gas emissions. By requiring companies to disclose leak data California is once again demonstrating what climate change leadership looks like and setting a powerful example that other states can follow.



Torrential rains and violent storm surge: Why hurricane impacts are getting worse

By Scott Weaver


(This post originally appeared on EDF Voices)

As Hurricane Harvey barreled toward the coast of Texas last week with increasing intensity, forecasters were issuing dire warnings about life-threatening storm surge and torrential rain in addition to the dangerous winds that hurricanes bring.

It was no coincidence. As our climate warms, we’re experiencing ever-more devastating storm surges and record rainfalls during hurricane season – which is also why these storms are becoming more destructive and costly.

Evaporation means storms carry more water

Harvey, which formed quickly in an abnormally warm Gulf of Mexico, is dumping historic amounts of rain – 30-plus inches in the Houston area so far – with more expected, leading to catastrophic flooding in America’s fourth largest city.

So why do hurricanes bring more rain in a warmer climate? Evaporation intensifies as temperatures rise, increasing the amount of water vapor that storms pull into their systems as they travel across warm oceans. That makes for higher rainfall and more flooding when they hit land.
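The physics behind this is the Clausius–Clapeyron relation: the atmosphere’s capacity to hold water vapor grows roughly 6–7% for each degree Celsius of warming. A sketch using the Magnus approximation for saturation vapor pressure (the coefficients are one common parameterization, not values from this post):

```python
import math

def saturation_vapor_pressure(temp_c):
    """Saturation vapor pressure (hPa) via the Magnus approximation
    (Alduchov-Eskridge coefficients; one common parameterization)."""
    return 6.1094 * math.exp(17.625 * temp_c / (temp_c + 243.04))

# Warming a sea surface from 26 C to 27 C raises the atmosphere's
# moisture ceiling by about 6%.
increase = saturation_vapor_pressure(27.0) / saturation_vapor_pressure(26.0) - 1
print(f"≈ {increase:.1%} more water vapor per degree C of warming")
```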

Unfortunately for Texas, Harvey has stalled out as a tropical storm, now drenching parts of Texas and Louisiana.

Sea level rise makes storm surges worse

Storm surge occurs when waters rise above their normal levels and are pushed inland by wind.

With Katrina, which made landfall as a Category 3 hurricane, it was the storm surge that caused the levees to fail, devastating the New Orleans area. Storm surge was also responsible for an extra $2 billion in damage to New York City after Sandy hit the area in 2012, according to a Rand report.

This worsening phenomenon is due, in large part, to sea level rise, which is driven by human-caused global warming as warmer ocean water expands and land ice melts. The average global sea level has already risen by more than half a foot since the Industrial Revolution.

Storm-related flooding is on the rise

The devastating flooding we’re seeing in Houston is unusual because of its scale, but heavy rains and bad flooding are becoming the new normal in parts of our country as temperatures rise. Intense single-day rain events that cause flooding are on the rise.

Historic weather data measured since 1910 shows that in the contiguous 48 states, nine of the top 10 years for extreme one-day rain events have occurred since 1990.

We don’t yet know what kind of damage Harvey or future hurricanes will cause. But they should serve as a reminder that today, more than ever before, we need to be guided by science to help us prepare for, and act, on climate change.


New study reveals gaps in the methods used to assess chemicals in oilfield wastewater

By Cloelle Danforth

A new study led by researchers with Colorado School of Mines exposes limitations with the current methods used to detect chemicals in oilfield wastewater and offers solutions to help regulators make better decisions for managing this waste stream.

Oilfield wastewater is extremely salty and can contain multiple combinations of many potentially harmful chemicals (approximately 1600 on a national basis). However, most standard or approved analytical methods available to regulators were designed to work with fresh water. Because oil and gas wastewater is so salty—sometimes 10 times saltier than seawater or more—chemists often have to dilute wastewater samples to manage the high salt content.

This means they may also be diluting chemicals of concern to concentrations too low to detect, even though they may be present at risky levels. For example, benzene is a chemical associated with petroleum hydrocarbons and a known carcinogen. It also has a drinking water standard of 5 parts per billion – that’s 5 cents in 10 million dollars. It really doesn’t take much dilution of a sample to lose that level of precision.
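The dilution problem can be made concrete with hypothetical numbers; the detection limit and dilution factor below are illustrative assumptions, not values from the study:

```python
# Illustration of how diluting a salty sample raises the effective
# detection limit. All numbers are hypothetical, chosen to show the
# mechanism, not taken from the study.
benzene_standard_ppb = 5.0  # EPA drinking water standard for benzene
detection_limit_ppb = 1.0   # hypothetical lab detection limit, fresh water
dilution_factor = 10        # hypothetical dilution to manage salinity

effective_limit_ppb = detection_limit_ppb * dilution_factor
# Anything below 10 ppb in the original sample is now invisible --
# including benzene at twice the 5 ppb drinking water standard.
print(effective_limit_ppb > benzene_standard_ppb)  # prints True
```

In other words, a modest tenfold dilution can push the method’s floor above the health standard itself.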

More concerning is that more than 75% of the chemicals associated with unconventional oil and gas production don’t have standard analytical methods. So, not only are the available analytical tools frequently inappropriate for saltwater samples, there are no tests at all for most of the chemicals that could be there.

Why it matters

In the United States oil and gas production generates nearly 900 billion gallons of wastewater per year. That’s the same amount as 30% of the water that went over the Hoover Dam last year.

Aerial view of wastewater pit at a drilling site

Historically, most of this wastewater has been pumped into underground wells for permanent disposal. But concerns about this practice leading to earthquakes, coupled with heightened demands for water and a desire to cut costs, has companies looking at new ways to manage this massive waste stream, from crop irrigation to discharge into surface waters.

(Learn more: Scientists Question Risks of Using Oilfield Wastewater on Crops)

Before this wastewater is reused in ways that could affect our water or food supplies, the right tools are needed to identify, measure, and treat the chemicals it may contain. This is a basic requirement for making sound decisions about protecting our health and our environment.

Advancing science to improve policy

Before new technologies and analytical methods developed in research laboratories are standardized and used in a regulatory context, they must undergo a rigorous and time-consuming validation process to assure they are robust, accurate, and precise.

Getting the job done right requires having the right tools. For this review, researchers combed through scores of research methods, evaluated which techniques are appropriate for oil and gas wastewater, discussed the challenges associated with current methods, and offered potential solutions for detecting chemicals. In other words, the study offers a starting point for making better decisions about cleaning or reusing wastewater.

Today, there is a lot that isn’t known – and can’t be known – about what’s in this wastewater. As a result, it’s nearly impossible to conclude that releasing it into our ecosystem won’t threaten human health and the environment.

Fortunately, this review represents an important step toward identifying potential approaches to understanding the chemistry of this complex waste, and could lead to better treatment and disposal practices that will ultimately help keep our soil and water clean.


New utility settlement will unlock millions in clean energy funding for Ohio

By Dick Munson

Enhancing EV infrastructure is one of the many ways AEP’s new settlement advances clean energy.

AEP, one of Ohio’s largest utilities, just reached an exciting new milestone that takes the state further down the path to a clean energy economy.

The utility has reached a settlement that will unlock millions in funding, lower pollution, avoid unnecessary electricity bill increases, and provide customers with more clean energy options.

New benefits

In AEP’s recent electric security plan case (a process that sets the generation rates charged to customers through 2024), the utility, Environmental Defense Fund (EDF), the Ohio Environmental Council (OEC), and others reached a settlement that includes the following:

  • Less pollution through more renewables: AEP will build or enter into a power purchase agreement for 900 MW of low-carbon solar and wind projects – enough to power almost 2,000 homes for a year. These new projects will allow the state to rely less on polluting coal.
  • No unnecessary fixed cost increases: The fixed monthly charge for residential customers will remain at $5.00/month, rather than a whopping $18.40/month that AEP sought. Higher fixed rates mean customers typically pay more regardless of how much they cut their energy use, effectively discouraging them from investing in energy-saving resources like efficiency and residential solar.
  • Strengthening grid reliability and resiliency: AEP will spend $10.5 million for one or more microgrid projects, localized power grids that have the ability to disconnect from the main, centralized grid.
  • Enhancing electric vehicle (EV) infrastructure: AEP will spend $10.5 million for an EV charging station rebate program.
  • Opening the door to future clean energy investments: AEP will implement a more streamlined way to recover costs from projects related to PowerForward, the state’s grid modernization effort, and the Smart City program, which involves $60 million in grants to the City of Columbus for smart transportation systems. This new cost recovery mechanism will lower AEP’s risk for investing in smart grid measures.

These ambitious developments will significantly enhance and diversify Ohio’s clean energy economy.


A history of bailouts

Last year, AEP and FirstEnergy, another utility giant in Ohio, sought enormous bailouts for their unprofitable, old coal and nuclear plants. After the Federal Energy Regulatory Commission (FERC) blocked these pleas, each utility was forced back to the drawing board – and each came back with a very different approach.

While FirstEnergy continued its crusade for customer-funded bailouts, AEP began to re-think its strategy and reached a settlement with the Public Utilities Commission of Ohio (PUCO) for its customer rates through May of 2018. Although the settlement had some promising clean energy components like increased renewable energy, it was still an unnecessary handout.

Specifically, the deal would have forced Ohioans to pay for AEP’s share of two uneconomic coal plants, which are part of the Ohio Valley Electric Corporation (OVEC). These are two old coal plants in Ohio and Indiana, built in the 1950s to supply electricity for a uranium enrichment plant that has since closed. EDF, among others, did not agree to the settlement.

Groups are contesting that deal, and specifically the OVEC bailout piece, at the Ohio Supreme Court, arguing that the provision is unfair because AEP has already been compensated for its share of the OVEC plants (through “transition charges” that AEP recovered during the 10 years following deregulation). The outcome is expected within the next year.

Under the new rate case settlement, AEP will continue to get money for its share of the uneconomic OVEC power plants through 2024. If the Ohio Supreme Court rejects the OVEC bailout, the decision will render the related portion of the new settlement moot and various parties will get back together to determine the path forward. EDF and the OEC would not be at the table because we opposed the original OVEC coal bailout, and we do not support the extension in the new settlement.

Regardless of the impending court decision, AEP’s new settlement exemplifies how utilities can and should include clean energy provisions when setting their overall electricity prices for customers. As the U.S. continues its transition to a cleaner, smarter energy system, more utilities should take this approach.


Californians benefit from continuous pollution monitoring at oil and gas sites

By Tim O’Connor

Sophia Brewer, Oil and Gas Intern, contributed to this article.

Since the 1892 discovery of oil in California, the oil and gas industry has been a major economic engine and energy supplier for the state. But dollars and barrels don’t tell the story of the potential impact of drilling activity on the lives of the people in Los Angeles and the Central Valley who live right next to these operations.

While some production sites may be meeting stringent operational and environmental standards, others may not – there simply isn’t data to discern which is which – and that is where monitoring comes in.


The world is experiencing the largest technology and innovation boom in history. Computers, space-age science, the internet, and cloud-based information platforms are making the collection and analysis of data easier than ever before. There’s no reason the gathering and evaluating of pollution data should be any different.

Oil and gas companies like Statoil, Shell, PG&E, and SoCalGas; academic institutions like Stanford and Colorado State University; government agencies like NASA and the South Coast AQMD; and tech companies like Picarro, Quanta3, Sensit, United Electric Controls, Entanglement Technologies, and many more have demonstrated that it’s now possible to install precise continuous monitors at or around oil and gas facilities to capture real-time data on air pollution at all hours of the day. In some cases, such as natural gas storage sites, monitoring systems are being tested and installed right now under new regulatory requirements. In other cases, companies and communities are installing continuous monitors to provide valuable information about on-site operations issues or to inform neighbors of potential health risks.

A major series of studies is also underway by the state of California, stemming in part from AB 1496’s requirement to evaluate potential methane “hot spots” in the state, and from a 2016 Air Resources Board study showing that nearly half of the leaks at California oil and gas sites have detectable levels of cancer-causing compounds. From these efforts, it is clear that more pollution monitoring is needed to protect the public.

Pollution monitoring has economic, environmental and public health value

Perhaps the clearest example of the value of continuous monitoring is the Aliso Canyon gas storage field, operated by SoCalGas. The well blowout there in October 2015 had cost SoCalGas and its insurance providers over $832 million as of June 30, 2017, with additional costs expected well into the future. Pollution monitors may not prevent massive leaks, but they can act as an early warning system, alerting facility operators to smaller leaks before they grow into bigger problems.

Continuous monitors can also empower the public and quell concerns over uncontrolled and unknown emissions. In the wake of the Aliso Canyon incident, for example, the California Air Resources Board (CARB) required the installation of continuous monitors at all natural gas storage sites in California. Because these facilities hold vast amounts of natural gas mixed with crude oil residues and pollutants like benzene, the new CARB rule is a testament to the commercial availability of continuous monitors and the value they provide to regulators and the public.

For another example of the value continuous monitors can deliver, look no further than the 1.3 million Californians who live within a half mile of the state's 93,000 active oil and gas facilities. Recent studies suggest a possible correlation between living in close proximity to oil and gas production sites and health problems such as asthma and cancer, so the public depends on strong regulations, transparent information, and equitable enforcement to minimize the risk of pollution from these sites at all hours of the day.

Well site near a residential neighborhood in Southern California

Continuous monitoring in California can reduce unequal community burdens

In California, nearly 69% of residents living near oil and gas sites are people of color, meaning emissions disproportionately burden these communities. Government agencies and oil and gas operators may not intend these disparate impacts, but they are nonetheless part of the landscape, and local residents often have neither the time, energy, nor money to stand up and fix these inequalities.

Continuous monitoring for air pollution will foster better transparency among corporations, government, residents, and customers. With improved transparency and the resources to evaluate the data, particularly where people live and work immediately adjacent to oil and gas sites, monitoring can encourage the highest levels of corporate responsibility, more precise and informed engagement by neighbors, and more effective government oversight. By investing in low-cost, high-precision continuous monitors, oil and gas companies can reduce pollution in the neighborhoods where reductions are needed most.

Looking forward

While many envision a fossil-fuel-free world, the elimination of oil production isn't likely to happen any time soon. What is on the horizon, though, is technology and data analytics that can better document pollution from the industry and push it toward consistent environmental responsibility. As a matter of capturing the low-hanging fruit, high-producing sites located next to homes and businesses in disadvantaged communities (like many in the Los Angeles Basin) are a natural fit for early rollout of monitors and data evaluation resources that document, and ultimately reduce, pollution.


Secretary Perry continues to ignore the evidence on grid reliability, even his own

By Michael Panfil

Late Wednesday night, the U.S. Department of Energy (DOE) released its so-called “study” on grid reliability.

Secretary Perry commissioned the report in this April memo, asking the DOE to investigate whether our electric grid’s reliability is threatened by the “erosion of critical baseload resources,” meaning coal and nuclear power plants. Perry took the unusual step of providing his own, pre-study conclusion, claiming that “baseload power is necessary to a well-functioning electric grid.”

His own report disagrees. It’s largely a backward-looking report that sometimes argues with itself, but comes, albeit grudgingly, to the same conclusion as every other recent study: the electric grid continues to operate reliably as uneconomic coal diminishes. Moreover, coal is declining because it can’t compete, and other resources are ensuring reliability at more affordable rates.

Perry seems undeterred by the evidence, however, and the report's accompanying cover letter and recommendations appear ready to double down on his pro-coal agenda. Here are three ways he tries to twist the facts in favor of dirty coal – a move that ignores more efficient, affordable, and innovative solutions and comes at a cost to Americans.

The misdirection spin

Perry’s letter accompanying the report included this nugget:

“It is apparent that in today’s competitive markets certain regulations and subsidies are having a large impact on the functioning of markets, and thereby challenging our power generation mix.”

Although Secretary Perry continues to blame “regulations and subsidies” for “challenging” the power generation mix (despite the mix becoming more, not less, diverse as more renewables come online), he would be well advised to read his own report if he’s looking for the real driver of coal retirements. The report he commissioned clearly states:

“The biggest contributor to coal and nuclear plant retirements has been the advantaged economics of natural gas-fired generation.”


This is hardly new information; extensive study has found that “decreases in natural gas prices have had a much larger impact on the profitability of conventional generators than the growth of renewable energy.” Coal is simply too costly to compete. And low natural gas prices, in addition to flat demand for electricity, are making energy more affordable.

The reliability spin

On reliability, again Perry’s letter takes one stance:

“The industry has experienced massive change in recent years, and government has failed to keep pace.”

And the report states the opposite:

Grid operators “are working hard to integrate growing levels of [renewable energy] through extensive study, deliberative planning, and careful operations and adjustments.”

Although Perry appears unaware of the thorough work his own study references, grid operators are required by law to ensure reliable electricity at affordable rates. And indeed, government has kept pace. As the DOE report noted, the North American Electric Reliability Corporation's most recent annual State of Reliability analysis concluded that the electric grid was reliable in 2016. And 2015. And 2014. And 2013. And although the DOE report neglected to mention it, this same State of Reliability analysis found that reliability has been increasing.

Certainly, more can and should be done. As the DOE report mentions, increasing the use of fast, flexible resources supports a healthy grid. Unfortunately for Perry, coal can't provide what's needed, as the DOE report notes:

“For a power plant to make money today, it must be able to ramp up and down to coincide with the variable levels of renewable generation coming online. That makes combined cycle natural gas plants profitable…but coal plants have relatively high and fixed operating costs and are relatively inflexible.”

Coal, simply put, is too slow and old to respond nimbly.

The resiliency spin

Perry also attempts to pivot from focusing on reliability to resiliency, a lesser defined term:

“Customers should know that a resilient electric grid does come with a price.”

Like everything worth having, resiliency comes at a price, but that price should be cost-effective. Coal, however, is part of the problem, not the solution, in achieving affordable, resilient, and reliable electricity. Not only do coal-fired power plants unexpectedly break down more than any other resource, they have also performed poorly during extreme weather events, as his report notes:

During extreme weather events in 2014 “many coal plants could not operate due to conveyor belts and coal piles freezing.”


“Forced outages,” meaning instances when a power station is unavailable to produce power due to an unexpected breakdown, are higher for coal than for any other resource – almost twice as high, in fact, as the next highest. Coal also needs twice as much scheduled maintenance, referred to as “planned outages,” as any other resource.

A terrible solution in search of a problem

When Secretary Perry originally requested the DOE report, he already knew the answer he wanted on grid reliability. Unfortunately for him, the final report – despite his best efforts – only further illustrates why his pre-study, pro-coal conclusion is wrong. The report's recommendations and his letter double down on coal despite the evidence, no matter the cost to the American public; no matter the cost to human health and the environment; and no matter the cost to the well-being of the electric grid itself.

Now that the report is finally here, coal companies likely will continue complaining and seeking help for their uneconomic power plants. Meanwhile, America’s grid will continue to embrace new, innovative technology that builds a cleaner, reliable, affordable, and resilient energy system.


Shell becomes latest oil and gas company to test smart methane sensors

By Aileen Nowlan

This week, the oil and gas giant Shell took a positive step toward addressing methane emissions. The company announced a new technology trial at a wellsite in Alberta, Canada, where it is piloting a specially designed laser to continuously monitor emissions of methane, a powerful pollutant known to leak from oil and gas equipment.

The move by Shell is a glimpse into the future and demonstrates growing market interest in smart, sensor-based methane detection technology. Shell’s project joins a similar field test already underway in Texas, operated by the Norwegian producer Statoil, and a California utility pilot run by Pacific Gas and Electric Company.

Each of these deployments is promising, but the ultimate test will be broad-scale adoption of innovations that generate actual methane reductions.

For industry, there is an incentive to move ahead. An estimated $30 billion of natural gas (which is largely methane) is wasted every year due to leaks and flaring from oil and gas operations worldwide. In addition, roughly 25 percent of global warming is driven by methane. Oil and gas methane emissions also contain chemicals that adversely affect public health.

For these reasons, methane is a problem that has caught the attention of regulators, investors and consumers alike. Advancing new technologies to enable the oil and gas industry to tackle this challenge more efficiently is key, even as companies use established tools to manage emissions now.

Collaborations Spark Methane Innovation

When you bring the right people to the table, innovative solutions will follow. Behind the Shell, Statoil and PG&E demonstration projects is a collaborative initiative, the Methane Detectors Challenge, begun by the Environmental Defense Fund four years ago. The project united eight oil and gas companies, R&D experts, and technology innovators in an effort to accelerate the development of next-generation methane detectors.

The formation of this project was motivated by a key insight: new technology to manage emissions needs to be created and deployed faster than ever. The Methane Detectors Challenge offers a unique resource to innovators – access to real facilities and collaboration with potential customers – which is essential to help entrepreneurs understand the market, demonstrate demand, and ultimately achieve economies of scale.

Both the Statoil and Shell pilots are using a solar-powered laser, created by Colorado-based Quanta3. The technology uses the Internet to provide real-time data analytics to wellsite managers via mobile devices or web portals.

Continuous Visibility, Faster Response

The oil and gas industry has a lot to gain from smart methane sensors that can prevent the loss of valuable product and reduce pollution.

Imagine a future where continuous leak detection systems allow operators to digitally monitor methane emissions occurring across thousands of sites. It’s a game-changer on the horizon. The burgeoning field of continuous methane monitoring offers a range of possibilities – including technologies capable of identifying emission spikes in real-time, allowing operators to cut mitigation time from months to days. Over time, smart sensors on wells may even help predict and prevent leaks and malfunctions before they occur.

Smart Methane Sensors Triggering New Market

The methane-sensing laser deployed by Shell and Statoil is one of many technologies in the emerging methane mitigation industry. In North America alone, more than 130 companies provide low-cost methane management technologies and services to oil and gas customers – a number likely to expand as innovation continues, pollution requirements tighten, and producers increasingly appreciate the urgency of dealing with methane to maintain their social license to operate.

Smart automation technologies are already being used across the oil and gas industry to improve operating and field efficiencies. Continuous methane detection technology is the next logical step, one with the potential to provide significant economic, environmental, and societal benefits.

The Shell pilot is a milestone to celebrate and we recognize the company for its early leadership. Now, we need governments and industry to show the determination needed to meet the methane challenge head-on. Sustained leadership is a prerequisite. But the keys to solving this problem are smart policies that incentivize ongoing innovation, and clear methane reduction goals—supported by technologies like continuous monitoring.

Image source: Shell/Ian Jackson
