Science Policy For All

Because science policy affects everyone.

Posts Tagged ‘pollution’

Scientists urge a change to the regulation of per- and poly-fluoroalkyl substances


By Paige Bommarito, PhD

PFAS Foam in Van Ettan Lake in Oscoda by Department of Environment, Great Lakes, and Energy on Flickr

Scientists are making continued calls to change the way the United States regulates per- and poly-fluoroalkyl substances (PFAS). This effort is highlighted in a recently published article and subsequent commentary in Environmental Science and Technology Letters. Researchers, including Dr. Linda Birnbaum, the former director of the National Institute of Environmental Health Sciences, have stated that there is sufficient evidence to begin managing PFAS as a chemical class, rather than as individual chemicals. This is a departure from conventional chemical regulation, which relies on rigorous examination of one substance at a time to determine its impact on both the environment and human health. This process is slow and scientists fear that further delay will have consequences for public health.

In 1946, the chemical company DuPont first introduced PFAS to consumer products in the form of non-stick cookware coated in Teflon®. Since then, over 8000 different PFAS have been created, making them the largest known class of environmental pollutants. Not only are PFAS used to create non-stick coatings, they’re also common in water-proofing materials and stain repellants. They can be found in a wide array of consumer products, from cosmetics to fire-fighting foams. Despite their diverse uses in consumer products, all are characterized by the presence of a perfluoroalkyl moiety, or a carbon chain with multiple fluorine atoms bound to it.

Scientists are concerned about PFAS because they are highly persistent. The carbon-fluorine bond that characterizes these compounds is one of the strongest chemical bonds known, making them very stable. This trait has earned PFAS the nickname “Forever Chemicals.” However, it also means that once PFAS are released into the environment, they are unlikely to ever leave. Additionally, PFAS are highly mobile and move through the environment with ease. As a result, PFAS have been detected in some of the most remote places on Earth, including Antarctica.

This extreme chemical stability means that PFAS accumulate in the environment and become part of the food chain. In a limited study of drinking water samples in the United States, researchers at the United States Environmental Protection Agency estimated that 6 million Americans consumed drinking water that exceeded health advisories for several PFAS, including perfluorooctanesulfonic acid (PFOS) and perfluorooctanoic acid (PFOA), which have historically been used to manufacture products such as Scotchgard™ and Teflon®. More recently, the Environmental Working Group, which has compiled tap water testing information in its Tap Water Database, estimates that at least 110 million Americans rely on tap water that has detectable amounts of PFAS.

The broad use of PFAS in consumer products and their persistence in the environment also mean that humans are ubiquitously exposed to PFAS. Studies in the National Health and Nutrition Examination Survey, a representative sample of the US population, reported that over 98% of the population had detectable levels of multiple PFAS in their blood. These numbers are concerning because even low amounts of PFAS exposure have been linked to adverse health outcomes, including several kinds of cancers, reproductive disorders, impaired fetal growth and development, and immunosuppression. Notably, PFAS exposure has been shown to impair human response to vaccines and may even increase susceptibility to infectious diseases, such as severe COVID-19 illness.

Yet, progress towards understanding these chemicals has been hindered, in part, by the chemical companies that manufacture them. Because many companies either patent or claim their PFAS as “trade secrets,” scientists and other members of the public are often shielded from information about these substances. Without access to the chemicals themselves, scientists cannot study them. This lag in the ability to examine both the prevalence and toxicity of many PFAS has created a substantial disconnect between the reality of human exposure and the exposures being studied by environmental and public health scientists. In fact, out of the many thousands of PFAS used in manufacturing, only about 10 are regularly studied in humans, due in large part to the inability of scientists to access the remaining substances.

Nevertheless, the evidence against many PFAS is strong and public awareness is growing. Many countries, including the United States, no longer manufacture PFOS or PFOA. Companies, including 3M and Chemours, have been phasing these so-called “legacy” PFAS out of their products over the past few decades. Several clothing companies, including Levi’s and H&M, have also recently announced they would be removing PFAS from their supply chains. Even McDonald’s has announced an initiative to remove all PFAS from its food packaging by the year 2025. While this is good news, when some PFAS are removed from products, others are often created to take their place. The hope is that newly synthesized PFAS will be both less persistent and less toxic, but this is not always the case. When these novel replacement chemicals have toxicities similar to the original compounds, this is known as “regrettable substitution.” For example, there has been recent controversy in several North Carolina communities where significant contamination with GenX, a chemical developed to replace PFOA in fluoropolymer manufacturing, has been identified. Studies in the community to investigate potential health effects are ongoing.

Despite the fact that scientists have examined less than 1% of the PFAS in use, their toxicity and persistence combined with the sheer size of the chemical class have prompted calls for regulatory agencies to begin managing PFAS as a group, rather than individually. This request has been met with pushback. Several scientists affiliated with Honeywell International, a company that manufactures fluorinated chemicals, responded with criticism. They insist that PFAS comprise many different chemicals with distinct physical and chemical properties that require individual examination rather than blanket regulation. However, the original authors responded to these critiques by arguing that, despite any differences between the individual chemicals, PFAS are all characterized by their persistence in the environment, and that alone is enough to warrant tighter regulation of the entire chemical class.

In February, as part of its PFAS Action Plan, the Environmental Protection Agency announced that it will begin regulating PFOS and PFOA in drinking water under the Safe Drinking Water Act. Earlier this year, the Food and Drug Administration also released a statement declaring its intent to closely examine emerging research on the use of PFAS in cosmetics, which is expected to be followed by legislation that would strengthen the agency’s ability to regulate cosmetic product ingredients. The 2021 omnibus appropriations bill included $300 million dedicated to the regulation and clean-up of PFAS. While it is unclear whether regulatory agencies will change their approach towards regulating PFAS, both lawmakers and government agencies have taken initiative to regulate at least some of these substances more closely.

Science Policy Around the Web April 6, 2021


By Dorothy Butler, PhD

Image by Hans Braxmeier from Pixabay

U.S. lawmakers target plastic pollution, producers in new legislation

Plastic is an everyday convenience, but plastic waste has a big impact on the environment. The US leads the world in plastic waste, and projections suggest that global plastic production will triple by 2050. To help combat the generation of plastic waste, Congress has introduced a bill that targets plastic pollution and plastic producers: the Break Free From Plastic Pollution Act. This legislation is sponsored by US Senator Jeff Merkley (D-OR) and Representative Alan Lowenthal (D-CA). It builds on a bill introduced last year, the Break Free From Plastic Pollution Act of 2020. The new bill includes elements of environmental justice and proper plastic disposal. Specific actions include reducing single-use plastics by banning certain products that are not recyclable; requiring plastic producers to design, manage, and fund waste and recycling programs; and prohibiting plastic waste from being shipped to developing countries.

There are many who support the legislation, including over 400 environmental advocacy groups and dozens of Democrats. Additionally, the political commentator and comedian John Oliver spoke about the act during a segment on the plastic pollution crisis. While many groups and individuals laud the bill and what it aims to accomplish, several industry groups criticize portions of it. The National Waste & Recycling Association says the bill’s provisions would be economically burdensome and disruptive. The American Chemistry Council said the bill would restrict innovation and cost American jobs. Even though many groups oppose the specifics of the bill, almost all acknowledge that plastic waste is a problem; they simply cannot agree on how to deal with it.

(Valerie Volcovici, Reuters)

Science Policy Around the Web September 29th, 2020


By Kellsye Fabian, PhD

Image by: Pexels from Pixabay

California to phase out sales of new gas-powered cars by 2035

On September 23, Governor Gavin Newsom issued an executive order requiring sales of all new passenger vehicles to be zero-emission by 2035. According to Newsom, this will aggressively lessen the state’s reliance on fossil fuels while promoting economic growth and job creation.

Following the order, the California Air Resources Board will develop regulations to mandate that every new passenger car and truck sold in-state is electric or otherwise zero-emission by 2035. The board will also develop regulations to ensure that medium- and heavy-duty vehicles are zero-emission by 2045. The order requires state agencies, in coordination with the private sector, to accelerate the deployment of affordable fueling and charging stations. The order does not prevent Californians from owning and selling gasoline-powered cars or buying them from outside the state.

The transportation sector is the largest contributor of greenhouse gas emissions in the state, responsible for more than 50% of the state’s carbon emissions, 80% of smog-forming pollution, and 95% of toxic diesel emissions. Banning the sale of gasoline-powered vehicles by 2035 would achieve a more than 35% reduction in greenhouse gas emissions and an 80% improvement in oxides of nitrogen emissions from cars statewide.
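The figures above combine in a simple way: a statewide reduction equals the sector’s share of emissions multiplied by the cut achieved within that sector. The sketch below illustrates this arithmetic with the article’s round numbers; the implied ~70% within-sector cut is an inference under the assumption that the 35% figure refers to total statewide emissions, not a figure from the order itself.

```python
# Back-of-the-envelope: statewide reduction = sector share * cut within sector.
# The 50% share is quoted in the article; the ~70% sector cut is an inference.
transport_share = 0.50      # transportation's share of statewide GHG emissions
statewide_reduction = 0.35  # statewide reduction attributed to the phase-out

implied_sector_cut = statewide_reduction / transport_share
print(round(implied_sector_cut, 2))  # -> 0.7, i.e. roughly a 70% sector-wide cut
```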

Automakers and the fossil fuel industry were critical of Newsom’s plan. According to John Bozella, who heads a group that represents automakers, mandates and bans do not make a successful market and more has to be done to increase consumer demand for zero-emission vehicles. Chet Thompson, a lobbyist for fossil fuel refineries, asserted that the governor does not have the authority to limit car buyers’ choices.

Kassie Siegel, the director of the Center for Biological Diversity’s Climate Law Institute, praised the phaseout of gas-powered cars but said that Newsom had not gone far enough to curtail oil production. The executive order also calls to end the issuance of new hydraulic fracturing (also known as fracking) permits by 2024, but left it to the state legislature to enact a ban. Furthermore, under Newsom, the state approved drilling permits for more than 1,400 new oil and gas wells in the first half of 2020. 

(Dino Grandoni, Faiz Siddiqui, Brady Dennis, The Washington Post)

Written by sciencepolicyforall

September 29, 2020 at 9:27 am

Science Policy Around the Web April 14th, 2020


By Andrew Wright, BSc

Image by Angelo Esslinger from Pixabay

United States wants to end most payouts for leading vaccination-related injury

The National Vaccine Injury Compensation Program (VICP) was established by Congress in 1986 to shield vaccine companies from personal injury lawsuits in an effort to maintain a viable vaccine supply. Under this law, an individual who has suffered an injury from an approved vaccine can use a special court process that involves a Health and Human Services (HHS) medical review board and covers the individual’s private attorney fees. During the Obama administration, the program’s guidelines for shoulder injuries were changed so that individuals no longer need to prove their injury was caused by the vaccine, only that there was a clear change in their shoulder health within 48 hours of the vaccine being administered. Since that rule change, shoulder injuries have accounted for 54% of VICP claims.

In response to this increase in claims, the Trump administration is recommending that shoulder injury claims be made ineligible for the VICP and instead be pursued in civil court. The rationale rests on the premise that shoulder injuries from vaccine administration result from poor needle placement, rather than from an inflammatory reaction to the vaccine itself. According to an HHS proposal, moving the burden to a standard civil court might promote better training in proper injection technique and reduce the number of vaccine-related shoulder injuries. The proposal also argues that the surge of VICP claims is providing ammunition to anti-vaccination groups.

Critics have noted that the HHS published research in 2010 that demonstrates antigenic material is what causes severe shoulder inflammation and long-term injury when it is injected into the shoulder joint. This was corroborated by studies done by the National Academy of Medicine. In an open letter to HHS secretary Alex Azar, Uma Srikumaran, a shoulder surgeon from Johns Hopkins, makes the distinction, “while needle injection is necessary to deliver the antigen into these structures, it is not sufficient to cause [shoulder injury] alone.” He also argues that limiting VICP could increase malpractice insurance rates and ultimately make vaccine administration more costly.

The stakes are particularly high given the severity of the current coronavirus pandemic. If a vaccine becomes available, rapid and efficient administration to the global population will be necessary for it to have an effective impact.

(Meredith Wadman, Science)

‘A huge step forward.’ Mutant enzyme could vastly improve recycling of plastic bottles

The United States falls behind many countries in how much it recycles, and those numbers are often bolstered by the enormous export of plastic to other nations. “The US is the only developed nation whose waste generation outstrips its ability to recycle, underscoring a shortage of political will and investment in infrastructure,” according to an analysis by the global consulting firm Verisk Maplecroft. At the same time, the US is home to two of the world’s largest plastic polluters, Coca-Cola and PepsiCo, and has worked to block global action against plastic pollution at the United Nations. Against this startling backdrop of inaction and counteraction, a new strategy to tackle overwhelming loads of plastic pollution seems necessary.

Under the current recycling process, only about 30% of soda bottle plastic is recycled, and the final product often loses much of its strength and color. Recently, the enzyme leaf-branch compost cutinase (LCC) was engineered through a partnership between the company Carbios and the University of Toulouse. According to research published in Nature, this engineered enzyme can break down 90% of PET, the plastic most commonly used in bottles, within 10 hours of exposure at high heat. Moreover, the resulting recycled product is just as strong as the input material and can be customized to match specific colors. While the economic viability of large-scale LCC reactors is still being explored, this method could provide an avenue for establishing plastic recycling infrastructure where it is lacking.

(Robert F. Service, Science)

Written by sciencepolicyforall

April 14, 2020 at 9:41 am

Science Policy Around the Web March 10th, 2020


By Andrew Beaven

Image by Robert Balog from Pixabay 

Newly Discovered Deep-Sea Creature Named After Plastic Found in Its Guts

A new shrimp-like animal was discovered 20,000 feet below the ocean’s surface in the Mariana Trench. The animal, first described in the journal Zootaxa, was named Eurythenes plasticus because shards of polyethylene terephthalate (PET) were found in its digestive tract. Although it is not well-known by its chemical name, PET is extensively used in the U.S. and abroad. In fact, more than one-half of the world’s synthetic fiber is PET (known as “polyester” when used as a fabric). PET is also the clear, lightweight plastic that is used for nearly all single-serving and 2-liter bottles sold in the U.S. It is also used to package salad dressings, oils, shampoo, household cleaners, etc. Although PET is easily recyclable, given its prevalence, it should come as no surprise that the world is very literally littered with PET.

Of course, the global problem is not only PET, but a wide range of other polymers that are absolutely ubiquitous. Because of their polymeric structure, these plastics are highly resistant to physical and biological degradation. Therefore, much like physical/chemical erosion producing sand, physical/chemical degradation ultimately produces “microplastic” shards that have now been found pretty much everywhere.

What is being done to help? President Obama signed the Microbead-Free Waters Act of 2015, which banned plastic microbeads in cosmetic and personal care products. Eight states have fully banned single-use plastic bags, and some individual communities (e.g., Montgomery County, MD, and Washington, D.C.) have implemented bag fees. One sign of improvement is that the total amount of plastic disposed of in the U.S. has remained fairly stable since about 2010; however, the amount of plastic landfilled has also remained stable.

(David Bressan, Forbes)

Dry California winter prompts wildfire and drought concerns

The last great Californian drought officially ended in 2016, but scientists are concerned about 2020. The U.S. Geological Survey defines a drought as “a period of drier-than-normal conditions that results in water-related problems.” An arresting statistic of 2020 is that there was zero rain in downtown San Francisco throughout the month of February, something that had not happened since 1864. San Francisco was not alone in the state; many other Northern California sites reported zero rain for the first time on record. The northern Sierra, which provides large portions of California with runoff, also recorded its lowest precipitation on record for the month. In the southwest, January and February combined to be “one of the driest first two months of any calendar year on record.” All of this is particularly concerning because February is typically the wettest month of the year. Looking into the future, it is predicted that as the Earth warms, Californian rain patterns will become even more drastic – ranging from crushing drought to severe downpours. Compounding the problem, severe drought destabilizes the ground, meaning that downpours can lead to catastrophic landslides.

What is being done to help? President Trump signed an order in February to re-engineer the state’s water plans, sending more water from Northern California to farmers and more heavily populated regions in the south. The original water system was designed to protect fish in the north. Trump said that the original rules were based on “outdated scientific research and biological opinions,” and that the situation would be different if California were actually suffering from drought. Fortunately, current weather models predict some potential for a wet March, April, and May. Additionally, most reservoirs are at or near average capacity because 2017 and 2019 were particularly wet years.

 (Diana Leonard, Washington Post)

Written by sciencepolicyforall

March 10, 2020 at 4:45 pm

Science Policy Around the Web January 16th, 2020


By Andrew H. Beaven, PhD

Facts & Figures 2020 Reports Largest One-year Drop in Cancer Mortality

On January 11, 1964, the U.S. Surgeon General reported that cigarette smoking is a cause of lung cancer and laryngeal cancer in men, a probable cause of lung cancer in women, and the most important cause of chronic bronchitis. This led to the Federal Cigarette Labeling and Advertising Act of 1965 and the Public Health Act of 1969 that required warnings on cigarette packages, banned cigarette advertising in broadcasting media, and called for an annual report on the health consequences of smoking. 

Fifty-six years later, lung cancer is still the leading cause of cancer mortality in the U.S. – accounting for almost one-quarter of all cancer deaths. However, with an ever-increasing understanding of how to treat cancer and declining smoking rates in America, the American Cancer Society announced a 2.2% drop in the American cancer death rate between 2016 and 2017, the largest single-year drop in cancer mortality (statistics are reported in the American Cancer Society’s peer-reviewed journal, CA: A Cancer Journal for Clinicians). This substantial mortality rate decrease is primarily attributed to a decrease in lung cancer deaths. Coincidentally, the report aligns with recent legislation raising the age to buy tobacco products from 18 to 21 years old. This legislation was included in the federal year-end legislative package, passed by both houses of Congress, and signed into law on December 20, 2019 by President Donald Trump. The goal of the legislation is to keep tobacco out of teenagers’ hands, with the hope that if teens do not start using tobacco early, they never will.

(Stacy Simon, American Cancer Society)

NASA, NOAA Analyses Reveal 2019 Second Warmest Year on Record

New, independent analyses by the U.S. federal agencies NASA and NOAA demonstrate Earth’s continuing warming. Global surface temperatures in 2019 were the second hottest since modern recordkeeping began in 1880. These results, posted online January 15, continue the concerning trend – the past 5 years have been the warmest of the last 140 years (the hottest year was 2016). NASA and NOAA report temperature on a relative scale based on the mean temperature between 1951–1980. The 2019 anomaly was 1.8 °F (0.98 °C) warmer than the 1951–1980 mean. The report makes special note that average global warming does not imply that all areas experience the same warming. For example, NOAA reported that the contiguous 48 U.S. states experienced the 34th warmest year on record, simply giving it a “warmer than average” classification. Meanwhile, Alaska experienced its warmest year on record.
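One detail worth noting about the paired figures above: anomalies are temperature *differences* from a baseline, so converting them between Celsius and Fahrenheit uses only the 9/5 scale factor; the familiar +32 offset applies to absolute temperatures, not to differences. A minimal sketch of that arithmetic, using the anomaly quoted above:

```python
def anomaly_c_to_f(anomaly_c: float) -> float:
    """Convert a temperature anomaly (a difference) from deg C to deg F.

    No +32 offset: that term cancels when subtracting two temperatures.
    """
    return anomaly_c * 9 / 5

# The reported 2019 anomaly relative to the 1951-1980 baseline mean:
print(round(anomaly_c_to_f(0.98), 2))  # -> 1.76, reported as ~1.8 deg F
```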

To account for biases, scientists adjust for the varied spacing of temperature stations, urban heat island effects, data-poor regions, changing weather station locations, and changing measurement practices. Through continuing modeling and statistical analyses, scientists continue to conclude that this rapid uptick in temperature is caused by increased greenhouse gas emissions from human activities.

(Steve Cole, Peter Jacobs, Katherine Brown, NASA)

Written by sciencepolicyforall

January 16, 2020 at 9:38 am

Science Policy Around the Web January 14th, 2020


By Thomas Dannenhoffer-Lafage, PhD

Image by Pexels from Pixabay 

The FDA Announces Two More Antacid Recalls Due to Cancer Risk

The FDA has recently announced voluntary recalls of two prescription forms of ranitidine produced by the generic drug companies Appco Pharma and Northwind Pharmaceuticals. The recall was announced because the drug may contain unsafe levels of N-Nitrosodimethylamine (NDMA), a carcinogen. The FDA had announced in September that it discovered the drug contained NDMA but did not advise consumers to discontinue use of the drug. Ranitidine – commonly known as Zantac – is prescribed to 15 million Americans and is taken by millions more in over-the-counter versions. The drug was recently removed from the shelves of several retailers as a precaution. Zantac was once the best-selling drug in the world. 

The discovery of NDMA in ranitidine occurred when Valisure, an online pharmacy that chemically tests the drugs it sells, analyzed a ranitidine syrup. When the syrup tested positive for NDMA, Valisure tested other products containing ranitidine and found similarly high amounts of the carcinogen. The findings were then reported to the FDA. According to Valisure’s CEO, the presence of NDMA in ranitidine could be due to the inherent chemical instability of the drug.

The FDA did not recall the drug at that time, citing the extreme conditions of Valisure’s tests and claiming that less extreme conditions resulted in much smaller amounts of NDMA. Valisure also claimed that NDMA was found in high amounts in tests meant to simulate gastric fluid. However, when the FDA performed a similar test, it found no formation of NDMA. This may be due to the lack of sodium nitrite in the FDA’s tests. The FDA acknowledged this issue by warning consumers who wish to continue taking ranitidine to avoid foods containing high amounts of sodium nitrite, such as processed meats. The FDA has also noted that the levels of NDMA found in ranitidine were comparable to what might be found in smoked or grilled meats.

Several lawsuits have been filed asserting that Zantac has caused cases of cancer. However, experts point out that the likelihood of any individual getting cancer from taking the heartburn medicine is low. 

(Michele Cohen Marill, WIRED)

EPA Aims to Reduce Truck Pollution, and Avert Tougher State Controls

The Trump administration has announced a proposed rule change to tighten limits on pollution from trucks. Initiated by EPA head Andrew Wheeler, the new rule would limit emissions of nitrogen dioxide, which has been linked to asthma and lung disease. The change is predicted to curb nitrogen dioxide pollution more than current regulations do, but will likely fall short of what is necessary to significantly prevent respiratory illness.

The administration seems to be following the lead of the trucking industry, which lobbied for a new national regulation that would override states’ ability to implement their own rules, especially California’s. The EPA’s current rule on nitrogen dioxide pollution from heavy-duty highway trucks, enacted in 2001, required trucks to cut emissions by 95 percent over 10 years. This resulted in a 40-percent drop in nitrogen dioxide emissions across the nation. Although no law requires the EPA rule to be updated, the Obama administration’s EPA had examined further cuts. The cuts, petitioned for by public health organizations, aimed to reduce emissions by another 90 percent by about 2025. California had begun the legal process to make the proposed cuts a reality, but Trump revoked California’s legal authority to set tighter standards on tailpipe emissions.

This revocation has led the EPA to move forward on the new rule, which would only reduce emissions by 25 to 50 percent. The trucking industry has pointed out that the current administration has gone to great lengths to understand how EPA regulations affect them, something that was not standard practice under previous administrations. However, representatives from the American Lung Association have lamented that the current administration is not taking as much advice from major health and environmental groups as previous administrations did.

(Coral Davenport, New York Times)

Written by sciencepolicyforall

January 14, 2020 at 10:30 am

Science Policy Around the Web October 8th, 2019


By Mary Weston, PhD

Image by Andreas Lischka from Pixabay 

A single tea bag can leak billions of pieces of microplastic into your brew

A recently published study from McGill University shows that plastic teabags release billions of plastic micro- and nanoparticles into your tea. Researchers steeped plastic tea bags in 95°C (203°F) water for 5 minutes, finding that a single bag released approximately 11.6 billion microplastic and 3.1 billion nanoplastic particles. This concentration of plastic particles is thousands of times higher than that reported for any other food or drink item.

Although tea bags contain food-grade, FDA-approved plastics, researchers know little about how plastics degrade or leach toxic substances when heated above 40°C (104°F). Based on these new results, the study’s authors conclude that more research is needed to determine both how microparticles are released into our foods and what impact those substances have on human health.

To gain insight into the effects of plastic particle exposure, researchers grew water fleas, a common environmental toxicology model organism, in the brewed solution, discovering that they survived but showed both behavioral and developmental abnormalities. While the plastic particle exposure levels these fleas experienced are far greater than what humans would encounter, the finding raises the question of what happens to humans with chronic low-dose exposure over time.

Microplastics are being detected everywhere, from the deepest parts of the ocean to regularly consumed bottled water, and their effects on human health are not yet known. One study suggests humans consume 5 grams of plastic a week, approximately the weight of a credit card. However, in their first review of microplastics in tap and bottled water, the WHO asserts that microplastics “don’t appear to pose a health risk at current levels,” but also states that knowledge is limited and more research is needed to determine their impact on human health.

(Rob Picheta, CNN)

Written by sciencepolicyforall

October 8, 2019 at 3:53 pm

Science Policy Around the Web August 30th, 2019


By Andrew Wright, BSc

Image by Steve Buissinne from Pixabay

EPA’s controversial ‘secret science’ plan still lacks key details, advisers say

In early 2018, under its previous administrator Scott Pruitt, the U.S. Environmental Protection Agency (EPA) first proposed rules to restrict the use of scientific findings whose data and methodologies are not public or cannot be replicated. Following the late-2017 removal of all sitting Science Advisory Board (SAB) members who receive EPA grants (roughly half of the board), there was concern that environmental experts were being sidelined from EPA decision-making, a concern the proposed rule seemed to confirm. While making data public and replicable has merits, the SAB has warned that the proposed rule would make it impossible to use the most accurate information, as many environmental studies are long-term assessments of human exposure to toxins that cannot be ethically or efficiently replicated. Now, under administrator Andrew Wheeler, it is still unclear how the proposed rule will be implemented.

A central concern is how to maintain privacy over personally identifiable information (PII) to comply with existing privacy laws and concerns (such as the Health Insurance Portability and Accountability Act, or HIPAA). One proffered strategy is to try a tiered approach based on the model currently used by the National Institutes of Health, whereby the more sensitive the PII is, the more restricted its access will be.

As the SAB has decided to engage in a consultation on the proposed rule, individual members will have their comments written up in a report to be sent to Wheeler but will not have to reach a consensus for the proposed rule to move forward.

(Sean Reilly, Science; reprinted from E&E News)

Brazilian Amazon deforestation surges to break August records

While the recent spate of fires in the Amazon rainforest has captured international attention, regular deforestation via cutting and clearing has also been rapidly increasing. In August alone, 430 square miles, a region the size of Hong Kong, was cut down. This comes after July’s loss of 870 square miles, a 275% jump from the same month the previous year. At the current rate of deforestation, Brazil is on track to lose more than 3,800 square miles of rainforest this year, an area roughly one and a half times the size of Delaware.
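The area comparisons above can be sanity-checked with a few lines of arithmetic. Note that the Hong Kong and Delaware land areas used below (~427 and ~2,489 square miles) are outside reference values, not figures from the article:

```python
# Rough arithmetic behind the area comparisons in the article.
august_loss = 430        # sq mi deforested in August (from the article)
projected_loss = 3_800   # sq mi projected for the year (from the article)
hong_kong_area = 427     # sq mi (assumed reference value)
delaware_area = 2_489    # sq mi (assumed reference value)

print(round(august_loss / hong_kong_area, 2))    # ~1.0: "the size of Hong Kong"
print(round(projected_loss / delaware_area, 2))  # ~1.5: "one and a half times ... Delaware"
```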

“The August data from Deter is hardly surprising,” said Claudio Angelo of Climate Observatory, referencing the Deter-B satellite that was put into place in 2015 to monitor Brazil’s rainforests. According to him and other representatives from non-governmental organizations, the Bolsonaro government is delivering on its promises to support local industries such as mining, ranching, farming, and logging rather than enforcing environmental protections. 

While this deforestation data is separate from data on forest fires, felled trees are often left to sit and dry before they are lit aflame, leading forest engineers to portend that the fires are going to get worse in the coming months.

Since the Amazon rainforest generates its own weather patterns, studies have demonstrated the possibility that after 40% deforestation, the biome may irreversibly convert to savannah. This could impact global weather patterns, affecting Brazilian weather most severely. However, recent estimates place that tipping point closer to 20–25% due to the synergistic effects of climate change. According to the World Wildlife Fund, approximately 17% of the rainforest has been lost in the past 50 years, putting uncontrollable forest conversion much closer than previously assumed.

(Jonathan Watts, The Guardian)

Written by sciencepolicyforall

August 30, 2019 at 11:08 am

Science Policy Around the Web June 14th, 2019


By Andrew Wright, BSc

Image by David Mark from Pixabay 

The Pentagon emits more greenhouse gases than Portugal, study finds 

A recent study published by Brown University quantified the Pentagon’s total greenhouse gas emissions from 2001 to 2017 using estimates from the Department of Energy and fuel consumption data. The results demonstrated that during the period studied, the Pentagon’s emissions were “in any one year…greater than many smaller countries‘ greenhouse gas emissions”. In 2017 alone, the Pentagon emitted 59 million metric tons of CO2, ranking it higher than Sweden (42 million), Portugal (55 million), or North Korea (58 million). The Pentagon’s emissions come largely from jet fuel use (~55%) and diesel use (~14%), while the rest of its energy consumption goes to powering and heating military facilities.
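The country-by-country comparison above amounts to ranking a handful of annual totals. A minimal sketch using the 2017 figures quoted in this paragraph (in millions of metric tons of CO2, treating the Pentagon as though it were a country):

```python
# 2017 emissions in millions of metric tons of CO2, per the figures above.
emissions_mmt = {
    "Pentagon": 59,
    "North Korea": 58,
    "Portugal": 55,
    "Sweden": 42,
}

# Rank emitters from largest to smallest.
ranked = sorted(emissions_mmt, key=emissions_mmt.get, reverse=True)
print(ranked)  # the Pentagon out-emits each of the three countries listed
```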

Were it considered a standalone country, the Pentagon would be the 55th largest contributor of CO2 emissions, according to the study’s author Neta Crawford. In a separate article, she noted that “…the Department of Defense is the U.S. government’s largest fossil fuel consumer, accounting for between 77% and 80% of all federal government energy consumption since 2001″. While the Pentagon has put measures in place to reduce its emissions in recent years, its own threat assessment warns that fully two-thirds of military installations in the U.S. are or will be at risk due to climate change, so further efforts may be needed.

(Sebastien Malo, Reuters)

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

June 14, 2019 at 3:58 pm