Protecting the Endangered Species Act Protects People and Nature
The Senate Committee on Environment and Public Works held a hearing at 10 a.m. on Wednesday, February 15, 2017 to[…]
By a narrow vote of 52 to 46, the Senate confirmed Oklahoma Attorney General Scott Pruitt to lead the Environmental Protection[…]
Together, we worked to put in place first-ever limits on the dangerous climate pollution that the oil & gas industry is leaking, venting, and flaring on our public and tribal lands. Now, all of that is at risk.
By Tom Neltner
Tom Neltner, J.D., is Chemicals Policy Director
On January 19, the Environmental Protection Agency (EPA) released a major new draft report proposing three different approaches to setting health-based benchmarks for lead in drinking water. We applauded EPA’s action and explored the implications for drinking water in a previous blog. One of the agency’s approaches provides useful, and surprising, insights into where the lead that undermines the health of our children comes from. Knowing the sources enables regulators and stakeholders to set science-based priorities to reduce exposures and the estimated $50 billion that lead costs society each year.
The EPA draft report is available for public comments until March 6, 2017, and it is undergoing external peer-review by experts in the field in support of the agency’s planned revisions to its Lead and Copper Rule (LCR) for drinking water. Following this public peer-review process, EPA expects to evaluate and determine what specific role or roles a health-based value may play in the revised LCR. With the understanding that some of the content may change, here are my takeaways from the draft:
For a visual look at the data, we extracted two charts from the draft EPA report (page 81) that show the relative contribution of the four sources of lead for infants (0-6 month-olds) and toddlers (1 to <2 year-olds) considered by the agency. The charts represent national exposure distributions and not specific geographical areas or age of housing.
There is no safe level of lead in children’s blood. Even low levels are likely to impair brain development, contributing to learning and behavioral problems and lower IQs. Based on data from a decade ago, lead costs society a collective 23 million IQ points and $50 billion every year. EPA’s draft analysis provides useful information to help set priorities to continue and accelerate progress in reducing childhood lead exposure.
Based on the analysis by EPA’s scientists, we conclude that reducing lead contamination in food needs to be a greater priority for FDA and with food manufacturers. Lead in paint remains a challenge, but studies show that every dollar invested yields $17 in return to society. And, as the Flint experience has shown, lead service lines can present a significant source of lead exposure.
The U.S. Army Corps of Engineers has announced that it will grant the easement required to complete construction of the[…]
A new economic study released by Environmental Defense Fund and WestWater Research shows that Alternative Transfer Methods (ATMs) are cost[…]
By Scott Weaver
As a climate scientist who is trained to base his conclusions strictly on scientific evidence and not politics, I find it particularly troubling that Scott Pruitt, President Trump’s pick to head the U.S. Environmental Protection Agency (EPA), is misrepresenting the scientific data that shows the earth’s atmosphere is warming.
Pruitt hopes to run the agency responsible for protecting the lives and health of Americans from environmental threats, and that includes reducing greenhouse gas emissions that are warming the planet. And as the Supreme Court has ruled, EPA has the authority to address greenhouse gases.
However, in his testimony before the Senate Environment and Public Works Committee on January 18, and then in follow-up written answers to Senators, Pruitt made several misleading, or flat-out inaccurate, statements.
In his attempt at subterfuge, Pruitt leaned on false and misleading climate-skeptic myths that have been debunked time and time again.
For instance, consider this one question and answer:
Written question from Sen. Jeff Merkley: Are you aware that each of the past three decades has been warmer than the one before, and warmer than all the previous decades since record keeping began in the 1880s? This trend is based on actual temperature measurements. Do you believe that there is uncertainty in this warming trend that has been directly measured? If so, please explain.
Written answer from Scott Pruitt: I am aware of a diverse range of conclusions regarding global temperatures, including that over the past two decades satellite data indicates there has been a leveling off of warming, which some scientists refer to as the “hiatus.” I am also aware that the discrepancy between land-based temperature stations and satellite temperature stations can be attributed to expansive urbanization within in our country where artificial substances such as asphalt can interfere with the accuracy of land-based temperature stations and that the agencies charged with keeping the data do not accurately account for this type of interference. I am also aware that ‘warmest year ever’ claims from NASA and NOAA are based on minimal temperature differences that fall within the margin of error. Finally, I am aware that temperatures have been changing for millions of years that predate the relatively short modern record keeping efforts that began in 1880. (Questions for the Record, page 145)
In response to the scientific evidence that the last three decades have each been warmer than the one before, Mr. Pruitt offered the negligent claim that both the satellite data and surface-based observations show there has been no warming over the last two decades – the so-called global warming hiatus.
Science does not agree with this assessment.
The idea of a hiatus and a potential discrepancy between satellite and surface-based data have been under intense objective scrutiny by the scientific community for some time – and the results are in:
Next, in the same answer, in what can only be described as countering his own misguided narrative, Pruitt attempted to blame the increasing temperature trend – which he had just claimed did not exist, via the hiatus argument – on an unfounded discrepancy between satellite-based and urban land-based data. He claimed that increasing urbanization was causing a fictitious rise in global temperature – an impact long shown to be minimal at best, especially given the vast geographic expanse of the world relative to the much smaller change in the geographic extent of cities.
Pruitt went on to quibble with the fact that 2016 was the warmest year ever recorded, by overemphasizing the role of negligible differences in how various scientific agencies around the world calculate the globally averaged temperature.
Actually, the diversity of approaches is a scientific strength, because it provides a balanced view of the data – much like seeking a second opinion on a medical diagnosis. It’s vital to note that despite these trivial differences in methodology, the three long-running analyses by NASA, NOAA, and the UK Met Office all showed 2014 to 2016 to be the three consecutive warmest years on record. This fact is indisputable.
Pruitt concluded his misdirection by pointing out his awareness that temperatures have been changing for millions of years, predating the relatively short modern record. Mr. Pruitt is indeed correct that the rapid warming in recent decades is quite alarming in the context of the much slower, longer-term natural changes – although I don’t think that was what he was trying to say.
Pruitt seemed unaware of the latest scientific evidence on the various topics he chose to explore during his testimony. That indicates an ignorance of science coupled with a lack of preparation, which adds up to being unfit to lead a science-based government agency.
By EDF Blogs
By Jonathan Camuzeaux, manager, Economics & Policy Analysis
Risky Business Report Finds Transitioning to a Clean Energy Economy is both Technologically and Economically Feasible
In the first Risky Business report, a bipartisan group of experts focused on the economic impacts of climate change at the country, state, and regional levels and made the case that, in spite of all we do understand about the science and dangers of climate change, the uncertainty of what we don’t know could present an even more devastating future for the planet and our economy.
The latest report from the Risky Business Project, co-chaired by Michael R. Bloomberg, Henry M. Paulson, Jr., and Thomas F. Steyer, examines how best to tackle the risks posed by climate change and transition to a clean energy economy by 2050, without relying on unprecedented spending or unimagined technology. The report focuses on one pathway that will allow us to reduce carbon emissions by 80 percent by 2050 through the following three shifts:
1. Electrify the economy, replacing fossil fuels in the heating and cooling of buildings, in vehicles, and in other sectors. Under the report’s scenario, this would require electricity’s share of total energy use to more than double, from 23 to 51 percent.
2. Use a mix of low- to zero-carbon fuels to generate electricity. Declining costs for renewable technologies help make this both technologically and economically feasible.
3. Become more energy efficient by lowering the intensity of energy used per unit of GDP by about two-thirds.
New Investments Will Yield Cost Savings
Of course, there would be costs associated with achieving the dramatic emissions reductions, but the authors argue that these costs are warranted. The report concludes that substantial upfront capital investments would be offset by lower long-term fuel spending. And even though costs would grow from $220 billion per year in 2020 to $360 billion per year in 2050, they are still likely far less than the costs of unmitigated climate change or the projected spending on fossil fuels. They’re also comparable in scale to recent investments that transformed the American economy. Take the computer and software industry, which saw investments more than double from $33 billion in 1980 to $73 billion in 1985. And those outlays continued to grow exponentially—annual investments topped $400 billion in 2015. All told, the United States has invested $6 trillion in computers and software over the last 20 years.
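As a rough sanity check on the comparison above (using only the dollar figures cited in the report and treating each as a compound annual growth problem), the implied growth rates can be computed directly:

```python
# Back-of-envelope check on the investment figures cited above.
# All dollar amounts are in billions; growth is modeled as a
# compound annual growth rate (CAGR).

def cagr(start, end, years):
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

# Computer/software investment: $33B (1980) -> $73B (1985)
computer_growth = cagr(33, 73, 5)

# Projected clean-energy investment: $220B/yr (2020) -> $360B/yr (2050)
clean_energy_growth = cagr(220, 360, 30)

print(f"Computer investment growth, 1980-85: {computer_growth:.1%}/yr")
print(f"Clean-energy investment growth, 2020-50: {clean_energy_growth:.1%}/yr")
```

The check suggests the projected clean-energy ramp-up (under 2 percent a year) is far gentler than the roughly 17 percent annual growth the computer build-out actually sustained in the early 1980s, which supports the report’s claim that the investments are comparable in scale to transitions the economy has absorbed before.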
The US Could Lead the Next Tech Revolution by Investing in Clean Energy
This shift would also likely boost manufacturing and construction in the United States, and stimulate innovation and new markets. Finally, fewer dollars would go overseas to foreign oil producers, and instead stay in the U.S. economy.
The Impact on American Jobs
The authors also foresee an impact on the U.S. job market. On the plus side, they predict the transition would require as many as 800,000 new construction, operation, and maintenance jobs by 2050 – retrofitting homes with more efficient heating and cooling systems and building, operating, and maintaining power plants. However, these gains would be offset by a nearly 34 percent loss of jobs in the coal mining and oil and gas sectors by 2050, concentrated mainly in the Southern and Mountain states. As we continue to grow a cleaner-energy economy, it will be essential to help workers transition from high-carbon to clean jobs and provide them with the training and education to do so.
A Call for Political and Private Sector Leadership
Such a radical shift won’t be easy, and both business and policymakers will need to lead the transition to ensure its success. First and foremost, the report asserts that the U.S. government will need to create the right incentives. This will be especially important if fossil fuel prices drop, which could result in increased consumption. Lawmakers would also need to wean industry and individuals off subsidies that make high-carbon and high-risk activities cheap and easy, while removing regulatory and financial barriers to clean-energy projects. They will also need to help those Americans negatively impacted by the transition, as well as those who are most vulnerable and least resilient to physical and economic climate impacts.
Businesses also need to step up to the plate by auditing their supply chains for high-carbon activities, building internal capacity to address the impacts of climate change on their businesses, and putting internal prices on carbon to help reduce risks.
To be sure, this kind of transformation and innovation isn’t easy, but the United States has sparked technological revolutions before that have helped transform our economy—from automobiles to air travel to computer software, and doing so has required collaboration between industry and policymakers.
We are at a critical point in time—we can either accelerate our current path and invest in a clean energy future or succumb to rhetoric that forces us backwards. If we choose to electrify our economy, reduce our reliance on dirty fuels and become more energy efficient, we will not only be at the forefront of the next technological revolution, but we’ll also help lead the world in ensuring a better future for our planet.
This post originally appeared on our Market Forces blog.
A new report unveiled in January by the National Academy of Sciences, Engineering and Medicine outlines steps to build on[…]
With legislation flying fast and furious through the Capitol – much of it using new or unusual legal mechanisms – lawmakers today must be doubly mindful of unintended consequences. Case in point: Actions rushed through the House and Senate under an obscure law called the Congressional Review Act (CRA), the details of which can cause deeper, more lasting impact than the simple name implies.
The CRA dates to the 1990s. It says that any rule finalized by a federal agency can be subject to an expedited congressional repeal for 60 legislative days after the agency sends up a copy of the final rule and a report detailing the reasons for its promulgation. Within that window, either chamber can introduce a joint resolution of disapproval – which, if passed by both houses of Congress and signed by the president, effectively voids the rule.
The law sounds simple enough. But it leaves a lot of room for error or mischief.
The CRA enjoys fast-track privileges, allowing a CRA bill to go straight to the floor without committee hearings. A so-called resolution of disapproval can be brought up at any time, with little or no notice. In the Senate, passage requires only a simple majority (51 votes). The measures are not subject to filibuster.
Until January, only one such resolution had ever been passed and signed into law, and the CRA has never been tested in court. But now there are at least 10 different CRA actions moving through the House and Senate. It’s worth a close look at what those measures would really do.
Hands Tied
If the President signs the joint resolution, the agency rule is voided. What’s more, that agency is forever barred from issuing any rule that is “substantially the same” as the one voted down. And therein lies the most serious problem.
Because this vague provision has never been clarified by the courts, agencies will almost certainly hesitate to undertake a new rule on the same topic, no matter how serious and well founded the action might be or regardless of new information (this is exactly what happened to the Occupational Safety and Health Administration, the only agency so far to have a rule disapproved through the CRA).
In short, CRA disapproval is a drastic and extreme legislative move that shouldn’t be undertaken lightly by either political party.
CRA Attack on BLM Waste Rule Defies Logic
Take for example the CRA resolution passed last week by the House of Representatives, which would roll back a Bureau of Land Management rule requiring oil and gas companies operating on millions of acres of federal and tribal land to take cost-effective, common-sense steps to reduce nearly 110 billion cubic feet of taxpayer-owned natural gas they currently waste each year through leaks, venting, or simply burning it off (called flaring).
That gas is worth an estimated $330 million annually — more than $1.5 billion since 2013 — and is enough to supply every home in a city the size of Chicago for a year. Besides squandering a valuable energy resource, the waste generates air pollution affecting the health of millions of Americans.
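A quick back-of-envelope check ties these figures together (the roughly 4.5-year span from 2013 to early 2017 is my assumption, not a number from the source):

```python
# Back-of-envelope check on the wasted-gas figures cited above.
wasted_gas_bcf = 110   # billion cubic feet of gas wasted per year
annual_value = 330e6   # estimated value of that gas, dollars per year

# Implied price per thousand cubic feet (Mcf), the standard trading unit.
price_per_mcf = annual_value / (wasted_gas_bcf * 1e9) * 1000

# Cumulative value since 2013 at that rate (assumed ~4.5 years through early 2017).
cumulative_billion = annual_value * 4.5 / 1e9

print(f"Implied gas price: ${price_per_mcf:.2f}/Mcf")
print(f"~4.5 years of waste: ${cumulative_billion:.2f} billion")
```

The implied price of about $3 per Mcf is in line with typical benchmark natural gas prices of the period, and the cumulative figure lands right at the “more than $1.5 billion since 2013” cited above, so the numbers are internally consistent.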
BLM’s much-needed and long-overdue standards to address this problem took years to craft, and reflect input received in over 300,000 public comments. Industry lobbyists have glibly suggested that a resolution of disapproval means the rule could somehow be sent back to the agency for a redo. But this is not how the CRA works. Lawmakers in the Senate, where the resolution now heads, need to understand this critical difference before they vote.
Waking Up to the Problem
The good news is lawmakers in both houses appear to be increasingly aware of the issues involved with the CRA. A resolution to roll back the BLM rule passed the House on a vote of 221-191, with a record 11 Republicans voting no (and three Democrats voting yes).
The bill could hit the Senate floor at any time. Before they vote, Senators should step back and understand that the CRA resolution offers up an axe in place of the scalpel that many are seeking, and weigh their decision accordingly. We need our lawmakers to stand up for their constituents – American taxpayers – to promote their interests over the needs of the oil and gas lobby.
Image source: Flickr/k3nna