We already know which grid fixes can keep lights on during bad storms. Here are 3.

By Ronny Sandoval

After a record-breaking hurricane season and catastrophic wildfires in California, the vulnerabilities of our electric system – and the urgent need to upgrade it – have never been clearer.

It took more than 10 days of around-the-clock work to restore electricity to 350,000 customers after fires struck California wine country last month. Returning service to all 4.4 million power customers in Florida after Hurricane Irma took almost as long – and 70 percent of Puerto Ricans still lack power six weeks after Hurricane Maria.

Such crippling outages contribute to $250 billion in economic losses globally every year.

But there are solutions available on the market today that can reduce the impact of these outages. By investing in technologies that modernize our electric grid, and with careful planning, we can also create a cleaner and more efficient electricity system overall.

Here are three cost-effective investments in this “modern grid” that could keep the lights on for more people during future storms.

1. Distributed energy systems improve resiliency

A modern grid makes producing, moving, and using electricity easier, more efficient, and cleaner.

Microgrids, for example, can go a long way to improve the resiliency of our electric system. These specialized energy systems can serve a defined area, regardless of whether the main power grid is active, through the use of batteries and localized generation such as solar, wind and other renewable sources.

A solar microgrid system under construction. Photo source: U.S. Army Corps of Engineers

We saw the value of microgrids first-hand after Hurricane Sandy five years ago. During that disaster, microgrid systems kept the power flowing in spots stretching from Maryland to Manhattan – including at Princeton University’s sprawling campus in New Jersey when much of the Garden State was blacked out.

Today, there are about 160 microgrids operating in the United States, and the capacity of such systems is expected to double between 2016 and 2020.

These systems can be complex, however, and some of the rules governing the energy provided by such sites have yet to catch up with advances in technology or society’s needs. That helps explain why they have not been adopted at the scale required to meet our demand for resiliency.

More can be done at the state and local level to revise these outdated rules and provide education and technical support to customers and facilities that could most benefit from microgrids.

2. New technologies detect outages faster

The electric system has been around for a very long time, but has yet to fully harness many of the same digital tools that have modernized internet-based communications.

Sensors, controls and other advanced technologies can be used to remotely detect and manage outages with better accuracy and results. What’s more, many of these same solutions can reduce energy waste by making the delivery of electricity to homes and businesses more efficient.

After a few initial pilots in 2013, Duke Energy expanded some of these energy-saving applications to cover nearly all of its electrical circuits in Ohio. The utility has reliably reduced its voltage levels by 2 percent, cutting waste and saving customers money while reducing pollution.
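For readers curious about the arithmetic behind that 2 percent figure, here is a minimal sketch of how voltage-reduction savings are commonly estimated, using an assumed conservation voltage reduction (CVR) factor and a hypothetical feeder load rather than Duke Energy’s actual data.

```python
# Minimal sketch of how conservation voltage reduction (CVR) savings are
# commonly estimated. The CVR factor and the feeder load below are assumed
# illustrative values, not utility data.

def cvr_energy_savings_mwh(annual_energy_mwh: float,
                           voltage_reduction_pct: float,
                           cvr_factor: float = 0.8) -> float:
    """Estimate annual energy saved by lowering distribution voltage.

    cvr_factor is the percent change in energy use per percent change in
    voltage; values around 0.5-1.0 are typical for mixed feeders.
    """
    savings_pct = voltage_reduction_pct * cvr_factor
    return annual_energy_mwh * savings_pct / 100.0

# Example: a hypothetical feeder delivering 50,000 MWh/year, voltage lowered 2%
print(f"{cvr_energy_savings_mwh(50_000, 2.0):,.0f} MWh saved per year")  # ~800
```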

We think more utilities could stand to gain from such tools.

3. A redesigned and measured grid is stronger

To build a more resilient grid, utilities must work together with their customers to identify conditions that make their systems vulnerable and to develop practicable solutions.

By working with local leaders, consumer advocates, environmental groups and other stakeholders, utilities can design a system that meets an area’s unique needs and bounces back quickly after disruptions.

That can mean burying power lines underground and reaching an agreement with power customers and regulators on how to pay for the associated capital costs – or replacing wooden poles with steel or other materials where burying lines isn’t the right solution.

But utilities must also measure their own success. Without knowing how grid upgrades have prevented or limited outages, they’ll have difficulty knowing which investments have the biggest impact.

Our society’s reliance on electricity has grown over the last several decades, but so have the sources of potential outages. A more robust planning process and upgrades that fully account for these changing system conditions will help us establish the sustainable and resilient electric system we need.

Photo source: Flickr/Oran Viriyincy


Superstorms: America’s new normal?

By Ellen Shenette

This year, the Atlantic basin had eight consecutive storms develop—the first time in 124 years. The storms—and by storms I mean big storms—have had catastrophic effects on families, communities and the economy at large. Millions of people were left powerless, access to clean drinking water was compromised and homes were destroyed. It will take decades for the country to recover from this devastation, and hurricane season is only halfway over.

And as the intensity of these storms increases, so do their price tags. Together, hurricanes Harvey, Irma and Maria, which hit the U.S. earlier this fall, are estimated to have caused $150-$200 billion in combined damage. This is an enormous blow to the economy and to taxpayers’ wallets.

To those of us on the east coast, this sounds awfully similar to the destruction caused by Hurricane Sandy, which hit New York City and New Jersey hard five years ago this fall. That’s why it’s important to ask: could the devastation have been avoided, or at least reduced?

Cities are building back, stronger

This year’s storms showed the vulnerability of centralized electric grids, and the need for a modernized system. When Hurricane Maria hit Puerto Rico, 100 percent of the territory’s power supply was cut off to its 1.57 million customers and 80 percent of the transmission and distribution system was destroyed. The rebuilding and repairs will go on for months.


Fortunately, changes to our current electrical system can prevent some of this. Microgrids, localized power grids that can operate independently from the main, centralized grid, are designed to provide power when the traditional grid is not functional. The ability to act autonomously strengthens the power system’s reliability and resilience, and protects critical infrastructure, such as hospitals, water treatment facilities and police stations, in the event of extreme weather like we saw this year.

They can also be “clean” by adding renewables like solar and wind that reduce reliance on fossil fuels and diesel generators—which may be compromised in a major storm event. And, since microgrids are not transmitting electricity over long distances, they don’t require an extensive network of transmission lines, allowing them to get up and running soon after a storm hits.

In the aftermath of Hurricane Sandy, the City of Hoboken announced it would use recovery funds to design a microgrid that would increase its sustainability and resiliency. With the help of EDF Climate Corps, the city created a Microgrids Toolkit. The toolkit includes a centralized dashboard for monitoring energy use, a timeline for implementation and a scorecard for tallying up the potential benefits. It’s a customizable tool that can be used to scale projects across cities, creating a more resilient coastline.

The toolkit is currently being used to study the potential for additional microgrids in other municipalities across the state.

Spotlight: NYC taking action

Now New York is jumping on the microgrid train. During Sandy, nearly 2 million customers lost power and $19 billion in damages were incurred. Some of the most important utility infrastructure on the city’s waterfront was destroyed, exposing the scale of the city’s vulnerability. The New York Governor’s Office of Storm Recovery, in charge of coordinating statewide recovery efforts for Superstorm Sandy and other major weather events, has vowed to not only build back, but build back stronger.

The agency, in coordination with NYSERDA, is working on a community Microgrids Program that would mitigate future power outages. EDF Climate Corps fellow Ben Bovarnick was hired to develop up to five microgrids in municipalities to demonstrate the feasibility of, and best practices for, publicly financed projects. The program had started to identify optimal projects, but sought new ideas for integrating advanced energy technologies and renewable energy into the projects.

Bovarnick identified opportunities for energy storage to reduce electricity demand and provide cost savings as well as improved system stability. Battery storage will also help with peak shaving and backup electricity, which has the potential to improve project value and reduce the municipality’s annual electricity expenditures.

Prepare, not react

The truth is, these superstorms are likely to continue, and their severity may increase. That means preparedness is key. States and municipalities, especially on the coast, must work together to create a comprehensive framework to tackle resiliency. Our efforts and finances should be invested in developing solutions that prevent extreme devastation, as opposed to cleaning up the aftermath.

Sandy was our wakeup call. But although post-Sandy protection projects were introduced, there’s still a long way to go. Fortunately, the technology is out there to make resiliency a reality.


Follow Ellen on Twitter, @ellenshenette



Little follow-up when FDA finds high levels of perchlorate in food

By Tom Neltner

Tom Neltner, J.D., is Chemicals Policy Director and Maricel Maffini, Ph.D., is a consultant

FDA’s apparent lack of follow-up when faced with jaw-dropping levels of a toxic chemical in food is disturbing.

For more than 40 years, the Food and Drug Administration (FDA) has conducted the Total Diet Study (TDS) to monitor levels of approximately 800 pesticides, metals, and other contaminants, as well as nutrients in food. The TDS’s purposes are to “track trends in the average American diet and inform the development of interventions to reduce or minimize risks, when needed.” By combining levels of chemicals in food with food consumption surveys, the TDS data serve a critical role in estimating consumers’ exposure to chemicals.
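As a rough illustration of that exposure math (not FDA’s actual methodology), the sketch below multiplies an assumed perchlorate concentration in each food by an assumed daily intake, sums across foods, and divides by body weight; every number, including the reference dose, is a placeholder.

```python
# Hypothetical sketch of a TDS-style dietary exposure estimate; all foods,
# concentrations, intake amounts, and the reference dose are placeholder
# values for illustration, not FDA measurements.

FOODS = [
    # (food, perchlorate concentration in ppb (µg per kg of food), daily intake in grams)
    ("rice cereal", 120.0, 30.0),
    ("carrots",      70.0, 50.0),
    ("whole milk",    5.0, 400.0),
]

BODY_WEIGHT_KG = 9.0       # assumed infant body weight
REFERENCE_DOSE = 0.7       # µg per kg body weight per day (illustrative value)

def estimated_daily_intake(foods, body_weight_kg):
    """Sum concentration x consumption across foods, per kg of body weight."""
    total_ug = sum(ppb * grams / 1000.0 for _, ppb, grams in foods)  # µg/kg x kg of food
    return total_ug / body_weight_kg

edi = estimated_daily_intake(FOODS, BODY_WEIGHT_KG)
print(f"Estimated intake: {edi:.2f} µg/kg-day "
      f"({edi / REFERENCE_DOSE:.0%} of the reference dose)")
```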

From 2004 to 2012 (except for 2007), FDA collected and tested about 280 food types for perchlorate, a chemical known to disrupt thyroid hormone production. This information is very important, because for the many pregnant women and children with low iodine intake, even transient exposure to high levels of perchlorate can impair brain development.

The agency published updates on food contamination and consumers’ exposure to perchlorate in 2008 (covering years 2004-2006) and in 2016 (covering 2008-2012). On its Perchlorate Questions and Answers webpage, FDA says it found “no overall change in perchlorate levels across foods” in samples collected between 2008 and 2012 compared to those collected between 2005 and 2006. It also notes that average levels were higher in some foods and lower in others between the time periods, and suggests that a larger sample size or variances in the region or season when the samples were collected may account for the differences.

FDA’s Q&A webpage masks the most disturbing part of the story

FDA’s attempt at providing consumers with information about the presence of a toxic chemical in food and what it means for their health falls short. By focusing on the similar average level of perchlorate across foods, FDA masks the disturbing fact that children’s perchlorate intake rose between 2004-2006 and 2008-2012: by 35% for infants, 23% for toddlers and 12% for children between 2 and 6. The agency’s webpage notes the exposures in 2008-2012 but fails to mention the increase reported by its own scientists.

FDA’s webpage also fails to mention that the perchlorate Reference Dose (RfD), the amount of exposure unlikely to result in an adverse health effect, was set more than a decade ago and does not reflect the latest research. In 2013, the Environmental Protection Agency (EPA)’s Science Advisory Board indicated that the RfD may not sufficiently protect pregnant women and young children. As a result, EPA, with FDA’s support, is in the final stages of updating that number. FDA’s estimated average exposures to perchlorate by infants and toddlers were very close to the RfD, and many children may well exceed the outdated RfD.

FDA failed to investigate extraordinarily high levels of perchlorate

In the 2016 update, we noticed that some foods, such as bologna, salami and rice cereal, had very high levels, far greater than in 2004-2006. In May 2017, FDA provided more data that enabled us to learn what region and year the samples were collected.

We knew from FDA’s Compliance Program Guidance Manual that staff are asked to follow up on any unusual analytical findings in the TDS samples (Figure 1). So in June 2017, we filed a Freedom of Information Act (FOIA) request seeking details for 47 composite samples (18 baby foods and 29 other foods) that, in our view, had unusually high levels of perchlorate. We asked for any information regarding:

  • Test results for composite and individual samples (Figure 2);
  • Product details on the individual foods; and
  • Any follow-up testing or investigations.

FDA’s response to the FOIA raises concerns. First, the agency did not have criteria for following up on perchlorate. Specifically, staff requested guidance “on cut-off value of perchlorate in the composite that would trigger analysis of individual foods,” but we didn’t find any in the documents provided to us. Without this guidance, some staff listed composite samples with at least 20 ppb as “noteworthy;” others used 15 ppb as the trigger for re-testing and confirmation. All 47 samples we asked FDA about were above these levels. Yet, there was no retesting of the composite for 27 of them.

Second, staff confirmed the results for 20 composite samples, which should have triggered tests on the three individual samples that made up each composite.

But FDA only provided information for four individual foods:

  • Baby food rice cereal (173 ppb in composite in summer of 2008): Newark sample had 252 ppb; New York City had 112 ppb; and Philadelphia had 3 ppb.
  • Baby food carrots (74 ppb in composite in winter of 2011): Denver sample had 163 ppb; Los Angeles sample had 111 ppb; and Seattle sample had 0 ppb.
  • Baby food barley cereal (67 ppb in composite in spring of 2008): Dallas had 182 ppb; and Tampa and Baltimore had 1 ppb.
  • Baby food oatmeal with fruit (42 ppb in composite in summer of 2008): New York City had 82 ppb; Newark had 3 ppb; and Philadelphia had 0 ppb.

Even more disturbing is that, despite confirming the results in the composite samples, FDA could find no records that the individual samples behind the highest composite levels (1,557 ppb for bologna, 1,090 and 686 ppb for collard greens, and 686 ppb for salami lunchmeat) were ever tested.

Similarly, the agency could not find any records of the TDS coordinator investigating the possible cause of the high levels of contamination, or any communication between the coordinator and others in the Center for Food Safety and Applied Nutrition, which oversees the TDS. Also unavailable were the receipts essential to identifying the brand and lot of the samples with unusually high levels of perchlorate; therefore, EDF was unable to follow up.

Conclusion

The agency’s apparent lack of follow-up when faced with jaw-dropping levels of a toxic chemical in food is disturbing. While testing the products is a critical first step, FDA needs to investigate the reasons for the high levels so that it can protect the food supply. Identifying the cause and crafting interventions to reduce or minimize risks, as stated in the TDS purpose statement, is especially important for toxics like perchlorate, where even short exposures during critical life stages may cause lasting harm to a child’s developing brain. But for this to happen, FDA needs to acknowledge the evidence on sources of perchlorate in food, such as its use in packaging or from degraded bleach, and clearly explain what the implications are for the health of children and pregnant women, especially those with low iodine intake.

Update on related perchlorate issue: FDA has not yet responded to EDF’s and eight other public interest organizations’ objection to the agency’s May 4, 2017 decision to reject a petition to ban perchlorate from uses in contact with food. The objection and request for evidentiary hearing were filed on June 4, 2017.


Everything you need to know about climate tipping points

By Casey Ivanovich

(This post was co-authored by EDF Climate Scientist Ilissa Ocko)

Imagine cutting down a tree. Initially, you chop and chop … but not much seems to change. Then suddenly, one stroke of the hatchet frees the trunk from its base and the once distant leaves come crashing down.

It’s an apt metaphor for one of the most alarming aspects of climate change – the existence of “tipping elements.”

These elements are components of the climate that may pass a critical threshold, or “tipping point,” after which a tiny change can completely alter the state of the system. Moving past tipping points may incite catastrophes ranging from widespread drought to overwhelming sea level rise.

Which elements’ critical thresholds should we worry about passing thanks to human-induced climate change?

You can see the answer on this graphic – and find more information below.

The most immediate and most worrisome threats

  • Disappearance of Arctic Summer Sea Ice – As the Arctic warms, sea ice melts and exposes dark ocean waters that reflect sunlight much less efficiently. This decreased reflectivity causes a reinforcement of Arctic warming, meaning that the transition to a sea-ice free state can occur on the rapid scale of a few decades. Some scientists have suggested that we have already passed this tipping point, predicting that Arctic summers will be ice-free before mid-century.
  • Melting of the Greenland Ice Sheet – The Arctic warming feedback described above may one day render Greenland ice-free. Research predicts that the tipping point for complete melt can occur at a global temperature rise of less than two degrees Celsius – a threshold that may be surpassed by the end of this century. While the full transition to an ice-free Greenland will take at least a few hundred years, its impacts include global sea level rise of up to 20 feet.
  • Disintegration of the West Antarctic Ice Sheet – The bottom of this ice sheet lies beneath sea level, allowing warming ocean waters to slowly eat away at the ice. There is evidence that this tipping point has already been surpassed – possibly as early as 2014. Like the Greenland Ice Sheet, full collapse would require multiple centuries, but it could result in sea level rise of up to 16 feet.
  • Collapse of Coral Reefs – Healthy corals maintain a symbiotic relationship with the algae that provide their primary food source. As oceans warm and become more acidic, these algae are expelled from the corals in an often fatal process called coral bleaching. Research predicts that most of our remaining coral systems will collapse even before a global temperature rise of two degrees Celsius.

Tipping points in the distant future

  • Disruption of Ocean Circulation Patterns – The Thermohaline Circulation is driven by heavy saltwater sinking in the North Atlantic, but this water is becoming fresher and lighter as glaciers melt in a warming climate. The change in water density may prevent sinking and result in a permanent shutdown of the circulation. Research suggests that weakening of the Thermohaline Circulation is already in progress, but that an abrupt shutdown is unlikely to occur in this century. Some models suggest that these changes may prompt a secondary tipping element in which the subpolar gyre currently located in the Labrador Sea shuts off. Such a change would dramatically increase sea level, especially on the eastern coast of the United States.
  • Release of Marine Methane Hydrates – Large reservoirs of methane located on the ocean floor are stable thanks to their current high-pressure, low-temperature environment. Warming ocean temperatures threaten the stability of these greenhouse gas reservoirs, but the necessary heat transfer would require at least a thousand years to reach sufficient depth, and may be further delayed by developing sea level rise.
  • Ocean Anoxia – If enough phosphorus is released into the oceans – from sources including fertilizers and warming-induced weathering, the breakdown of rocks – regions of the ocean could become depleted of oxygen. However, this process could require thousands of years to develop.

Potentially disastrous elements, but with considerable uncertainty

  • Dieback of the Amazon Rainforest – Deforestation, lengthening of the dry season, and increased summer temperatures each place stress on rainfall in the Amazon. Should predictions that at least half of the Amazon Rainforest convert to savannah and grasslands materialize, a considerable loss in biodiversity could result. However, the dieback of the Amazon Rainforest ultimately depends on regional land-use management, and on how El Niño will influence future precipitation patterns.
  • Dieback of Boreal Forests – Increased water and heat stress could also lead to a decrease in boreal forest cover by up to half of its current size. Dieback of boreal forests would involve a gradual conversion to open woodlands or grasslands, but complex interactions between tree physiology, permafrost melt, and forest fires render the likelihood of dieback uncertain.
  • Weakening of the Marine Carbon Pump – One mechanism through which oceanic carbon sequestration takes place is the marine carbon pump, which describes organisms’ consumption of carbon dioxide through biological processes such as photosynthesis or shell building. As ocean temperatures rise, acidification progresses, and oxygen continues to be depleted, these natural systems could be threatened and render the carbon sequestration process less efficient. More research is necessary in order to quantify the timescale and magnitude of these effects.

Tipping elements complicated by competing factors

  • Greening of the Sahara/Sahel – As sea surface temperatures rise in the Northern Hemisphere, rainfall is projected to increase over the Sahara and Sahel. This increased rainfall would serve to expand grassland cover in the region, but is balanced by the cooling effect of human-emitted aerosols in the atmosphere.
  • Chaotic Indian Summer Monsoon – The fate of the Indian Summer Monsoon similarly depends upon a balance of greenhouse gas warming and aerosol cooling, which strengthen and weaken the monsoon, respectively. On the timescale of a year, there is potential for the monsoon to adopt dramatic active and weak phases, the latter resulting in extensive drought.

More research necessary to establish as tipping elements

  • Collapse of Deep Antarctic Ocean Circulation – As in the case of the Thermohaline Circulation, freshening of surface waters in the Southern Ocean due to ice melt may slowly alter deep water convection patterns. However, the gradual warming of the deep ocean encourages this convection to continue.
  • Appearance of Arctic Ozone Hole – Unique clouds that form only in extremely cold conditions currently hover over Antarctica, serving as a surface for certain chemical reactions and facilitating the existence of the ozone hole. As climate change continues to cool the stratosphere, these “ice clouds” could begin forming in the Arctic and allow the development of an Arctic ozone hole within a year.
  • Aridification of Southwest North America – As global temperatures rise, consequential changes in humidity prompt the expansion of subtropical dry zones and reductions in regional runoff. Models predict that Southwest North America will be particularly affected, as moisture shifts away from the southwest and into the upper Great Plains.
  • Slowdown of the Jet Stream – A narrow, fast-moving air current called a jet stream flows across the mid-latitudes of the northern hemisphere. This current separates cold Arctic air from the warmer air of the south and consequently influences weather through its formation of high and low pressure systems. A slowing of the jet stream has been observed over recent years. Should the slowing intensify, weather patterns could persist over several weeks, with the potential to develop into extended extreme weather conditions.
  • Melting of the Himalayan Glaciers – Several warming feedbacks render the Himalayan glaciers vulnerable to dramatic melt within this century, though limitations on data availability complicate further study. Dust accumulation on the mountainous glaciers and the continual melt of snow and ice within the region both prompt a decrease in sunlight reflectivity and amplify regional warming.

Gradual, continuous changes

  • More Permanent El Niño State – 90 percent of the extra heat trapped on Earth’s surface by greenhouse gases is absorbed by the oceans. Though still under debate, the most likely consequence of this oceanic heat uptake is a gradual transition to more intense and permanent El Niño/Southern Oscillation (ENSO) conditions, with implications including extensive drought throughout Southeast Asia and beyond.
  • Permafrost Melting – As global temperatures rise and the high latitudes experience amplified warming, melting permafrost gradually releases carbon dioxide and methane into the atmosphere and creates a feedback for even more warming.
  • Tundra Transition to Boreal Forest – Much like the conversion of the Amazon Rainforest and boreal forests to other biomes, tundra environments may transition into forests as temperatures increase. However, this process is more long-term and continuous.

With a range of critical thresholds on the horizon, each tipping element demonstrates the potential implications of allowing climate change to progress unchecked.

As tipping points loom ever closer, the urgency for emissions mitigation escalates in hopes of sustaining the Earth as we know it.


Large gas buyers set environmental performance indicators for how gas is produced

By Mark Brownstein

Co-authored by Beth Trask

Utilities that deliver gas to homes and businesses, and/or generate electricity from gas, are important stakeholders along the natural gas supply chain. They are the face of natural gas to their customers, and thus they need to know that the gas they sell is being produced in the most responsible and transparent way possible: one in which the impacts to air, water, and communities are minimized.

This week, some of the nation’s largest gas buyers joined forces in a new voluntary coalition, the Natural Gas Supply Collaborative (NGSC). Together, they released a set of 14 performance indicators, spanning air, water, chemicals and community/worker safety, that they’d like to see natural gas companies report on publicly on an annual basis.

Developed in consultation with environmental NGOs, including EDF, and with input from a handful of gas company representatives, these indicators are a positive step toward a more transparent gas supply chain in which buyers and sellers can have an informed dialogue about how gas is being produced.

We encourage more large gas buyers to join the coalition and get involved in this conversation.

Customers are watching

There are nine participants in the NGSC, including Austin Energy, NRG Energy, and Pacific Gas and Electric Company. Combined, these nine companies deliver enough natural gas to meet the needs of more than 36 million households and business customers. They supply enough electricity from natural gas to power over 17 million U.S. homes annually.

Thus these large gas buyers and others like them have influence in the marketplace. And when they enter into contracts with oil and gas companies, they have an obligation to discuss how air and water pollution is being minimized and how the well-being of communities and workers is being protected.

As with all supply chain management work, these conversations begin with transparency—the foundation for the NGSC’s 14 indicators.

Better reporting on methane

About one-quarter of the warming we experience today is caused by methane from human activities, and the oil and gas industry is one of the largest human-caused methane sources on the planet.

Recent data from the International Energy Agency shows that at least 75% of global oil and gas methane emissions can be cut cost-effectively with existing technology.

The NGSC performance indicators underscore the importance of methane emission reduction to a producer’s social license to operate. They suggest that companies disclose their methane emissions, in total and by intensity. Suggested practices include developing leak detection and repair (LDAR) practices and schedules; setting methane reduction goals; reducing flaring and venting; and participating in the field testing of new technologies designed to detect leaks.
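To make “in total and by intensity” concrete, here is a minimal sketch of the two figures a producer might report; the production and emission numbers are illustrative assumptions, not NGSC or company data.

```python
# Hypothetical sketch of the two disclosure figures mentioned above: total
# methane emissions and an emissions intensity. All numbers are illustrative
# assumptions, not NGSC or producer data.

def methane_intensity(ch4_emitted_tonnes: float, gas_produced_tonnes: float) -> float:
    """Intensity expressed as emitted methane per unit of gas produced (one common convention)."""
    return ch4_emitted_tonnes / gas_produced_tonnes

gas_produced = 2_000_000   # tonnes of natural gas brought to market in a year (assumed)
ch4_emitted = 16_000       # tonnes of methane leaked or vented over the same year (assumed)

print(f"Total emissions: {ch4_emitted:,} tonnes of methane")
print(f"Intensity: {methane_intensity(ch4_emitted, gas_produced):.2%} of gas produced")
```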

Leading practices on wastewater management

The report also shines a spotlight on aspects of production relating to water, chemical use, community, and safety that would benefit from enhanced transparency and action. Of particular note are a number of performance measures related to water and wastewater management. Wastewater, or “produced water,” can pose serious risks to the water and land resources surrounding operations if spills and leaks occur, and even when wastewater is treated onsite and intentionally released to nearby waterways or fields without proper controls. This is due not only to the high salinity of wastewater, but also to the hundreds of potentially toxic pollutants that may be present.

NGSC calls for gas companies to report on the number and volume of wastewater spills each year and to disclose their strategies for managing and ultimately disposing of wastewater. Significantly, the report notes that the emerging practice of reusing wastewater in novel ways—for example selling wastewater to farmers to irrigate crops—is a concern. Before embarking on wastewater reuse, producers should participate in research to better understand the environmental and public health risks.


Pruitt takes steps to remove science from decisions affecting the health of American families

By Sarah Vogel

Today EPA Administrator Scott Pruitt announced additions to the Agency’s Scientific Advisory Board (SAB) and the Clean Air Scientific Advisory Committee (CASAC). Taken in conjunction with the drastic policy shift also announced today, Pruitt is set to fundamentally undercut the role of science in driving EPA decisions that directly affect the health and safety of American families and communities.

The new policy would exclude any scientist receiving an EPA grant from serving on any of the agency’s advisory panels. This creates a profound hypocrisy: under the policy, scientists who take money from ExxonMobil or even Russia (funding from other governments wouldn’t be disqualifying) would be regarded as trusted to offer impartial advice. Meanwhile, those who have grants from the US environmental agency, whose research program was praised by the National Academy of Sciences in a report just this past summer, cannot serve.

In Pruitt’s Alice-in-Wonderland world, the EPA advisory panels intended to ensure the agency is making use of the best and latest science should be populated overwhelmingly by industry-affiliated scientists, at the expense of independent academic scientists.

Along with the policy, Pruitt’s new appointments to the SAB and CASAC (see below) include longtime fossil fuel and chemical industry advocates, who have consistently played down or outright dismissed concerns about the risks of pollution or toxic chemical exposures based on discredited and outrageous scientific claims. Although the SAB is supposed to “provide independent advice and peer review on the scientific and technical aspects of environmental issues to the EPA’s Administrator,” these additions cannot be relied upon to faithfully uphold the Board’s mission.

Meanwhile, Pruitt also took the unprecedented step of not renewing any appointments for members whose terms expire this year. This allows Pruitt to reshape the panel in his own image more quickly.

All told, the goal is as clear as it is concerning: to create a rubber-stamp set of scientific advisers that can distort the science while still lending an aura of credibility to Pruitt’s destructive actions at the Agency.

The real losers are not the researchers, but rather American families who depend on having an agency that actually works to protect their health.

Meet some of Mr. Pruitt’s new science advisers

  • Michael Honeycutt (Named to chair the SAB) – Texas official with a long record of downplaying health concerns about pollutants and toxic chemicals ranging from ozone to benzene. Honeycutt argued against stronger ozone standards by noting most people spend their days indoors. He also claimed that “some studies even suggest that PM [particulate matter] makes you live longer.”

  • Dr. Tony Cox (Named to chair the CASAC) – Denver-based consultant with a long track record of conducting research that disputes the public health benefits of reducing air pollution. Cox has stated that there is “no evidence that reductions in air pollution levels have caused any reductions in mortality rates.”

  • An appointee with a record of disputing the benefits of clean air and air pollution limits, who said that “Modern air … is a little too clean for optimum health.”

  • A professor at NC State affiliated with the climate-denying Heartland Institute, who claims that the “evidence is overwhelming” that if temperatures do increase, it will be “better for humans.”

  • A former Secretary of the North Carolina Department of Environmental Quality (NCDEQ), who questions the well-established scientific consensus on climate change and had a controversial tenure at the agency, notably over health advisories to well owners whose water might have been contaminated by coal ash.

  • Smith, a Managing Director of NERA Economic Consulting and co-head of its environmental practice, who argued, in work funded by the fossil fuel industry trade group the American Petroleum Institute, that EPA data on lung response to ozone is imprecise, an argument roundly debunked by policy experts and independent fact-checkers.
