Science Policy For All

Because science policy affects everyone.

Posts Tagged ‘environment’

Science Policy Around the Web November 19th, 2019


By: Andrew Wright, BSc

Source: Pixabay

EPA’s ‘secret science’ plan is back, and critics say it’s worse

The Environmental Protection Agency (EPA) has been exploring new rules on the incorporation of scientific data into its rulemaking process. The so-called “secret science” rules were originally proposed in 2018 under the EPA’s previous administrator, Scott Pruitt, and have since been revised by its new administrator, Andrew Wheeler, in response to harsh criticism from scientific, environmental, and patient groups. Rather than addressing these criticisms to mollify the proposal’s detractors, the draft of the newly proposed rule, which was leaked to the New York Times, appears to drastically broaden the scope of data that cannot be used.

According to the 2018 proposed rule, all raw data would have to be made available for studies that assessed a “dose-response” relationship, a bedrock of toxicity research. This could be difficult, if not impossible, when considering patient privacy protection laws and proprietary information requirements that would prevent the dissemination of that data. In the new draft rule, this set of constraints is imposed on all scientific studies used to guide agency procedures, instead of just dose-response studies. The draft also seeks comment on whether these restrictions should be imposed retroactively. According to the draft rule, if the underlying data were not made available, the EPA would be able to “place less weight” or “entirely disregard” those studies.  

While the draft does provide room for a tiered data-sharing approach, such as those implemented at the National Institutes of Health and the Food and Drug Administration, and allows political appointees to grant exemptions, critics worry that these new requirements will effectively remove science from the EPA’s decision-making process. Thus far, the EPA’s scientific advisory board has not been afforded the opportunity to weigh in.

(David Malakoff, Science)

‘Insect apocalypse’ poses risk to all life on Earth, conservationists warn

A recent study looking at insect populations in the UK suggests that up to half of all insects have been lost since 1970 and that 40% of all known insect species are facing extinction. Due to the complexity of ecological systems that rely on insect biodiversity to function properly, this level of insect loss could lead to “catastrophic collapse” on a global scale. 

This study demonstrates a severity of insect decline similar to that seen in other regions around the world. In Puerto Rico, for example, insect biomass has declined 10- to 60-fold, a loss that has damaged the rainforest’s food web. In Germany, 75% of flying insects have vanished in the past 27 years.

Solutions to what is now considered Earth’s sixth mass extinction event are becoming increasingly complex, as different forms of anthropogenic damage to the global ecosystem begin to interact. However, conservationists suggest that insect numbers could recover rapidly through a combination of pesticide reduction and changes to land management.

(Damian Carrington, The Guardian)

Written by sciencepolicyforall

November 19, 2019 at 11:59 am

Science Policy Around the Web October 8th, 2019


By: Mary Weston, PhD

Image by Andreas Lischka from Pixabay 

A single tea bag can leak billions of pieces of microplastic into your brew

A recently published study from McGill University shows that plastic teabags release billions of plastic micro- and nanoparticles into your tea. Researchers steeped plastic tea bags in 95°C (203°F) water for 5 minutes, finding that a single bag released approximately 11.6 billion microplastic and 3.1 billion nanoplastic particles. This concentration of plastic particles is thousands of times higher than that reported for any other food or drink.

Although tea bags contain food-grade, FDA-approved plastics, researchers know little about how plastics degrade or leach toxic substances when heated above 40°C (104°F). Based on these new results, the study’s authors conclude that more research is needed both to determine how microparticles are released into our foods and to assess the impact those substances have on human health.

To gain insight into the effects of plastic particle exposure, researchers grew water fleas, a common environmental toxicology model organism, in the brewed solution, discovering that they survived but showed both behavioral and developmental abnormalities. While the exposure levels these fleas experienced are far greater than what humans would encounter, the finding raises the question of what chronic low-dose exposure does to humans over time.

Microplastics are being detected everywhere, from the deepest parts of the ocean to regularly consumed bottled water, and their effects on human health have yet to be determined. One study suggests humans consume about 5 grams of plastic a week, approximately the weight of a credit card. However, in its first review of microplastics in tap and bottled water, the WHO asserts that microplastics “don’t appear to pose a health risk at current levels,” but also states that knowledge is limited and more research is needed to determine their impact on human health.

(Rob Picheta, CNN)

Written by sciencepolicyforall

October 8, 2019 at 3:53 pm

Science Policy Around the Web September 20th, 2019


By: Allison Cross, PhD

Image from Flickr

Hunt for Cause of Vaping Illness Suggests Multiple Mechanisms of Damage

A vaping-related respiratory illness has affected nearly 500 individuals across three dozen states and has been linked to 6 deaths since the first case was reported back in April. Experts, however, are still uncertain about what is causing the nationwide outbreak, or even exactly what the condition is.

A report earlier this month from the FDA suggested they may have identified the source of the problem: vitamin E acetate, a common contaminant in vaping products. However, more recent information indicates that no single contaminant was identified in all product samples tested from sick individuals. To date, the only thing the nearly 500 individuals who have fallen ill have in common is that they recently vaped in the US or its territories.

On September 16th, the U.S. Centers for Disease Control and Prevention activated its Emergency Operations Center (EOC) to enhance operations and provide additional support to CDC staff working to identify the cause of the disease. The CDC advises those concerned about the outbreak to refrain from using e-cigarettes or vaping products.

E-cigarettes and other vaping products have recently come under scrutiny from those concerned about the rising popularity of vaping among adolescents. Many have been pushing for a ban on flavored e-cigarettes, as these products are believed to deliberately target youth. The recent outbreak has led to renewed calls for a total ban on these and other vaping products. In response to the outbreak, regulators in New York approved a ban on the sale of flavored e-cigarettes on Tuesday the 17th, and Michigan followed suit on Wednesday. The health and human services secretary, Alex M. Azar II, also announced that the FDA is outlining a plan for removing flavored e-cigarettes and nicotine pods from the market, though finalizing this ban will take several weeks.

(Emily Willingham, Scientific American)

Trump’s decision to block California vehicle emissions rules could have a wide impact

California has long struggled to reduce smog in its cities, and for almost four decades, under the federal Clean Air Act, it has been granted special permission by the EPA to set its own air pollution standards. This may soon change, however, as President Trump announced that the administration plans to revoke California’s authority to set its own automotive emissions standards. The Trump administration instead aims to set a single national standard for automotive emissions, and many are concerned that the proposed national standard is more lenient.

Although California is only 1 of 50 states, the implications of revoking its authority to set its own emission standard are far-reaching. The Clean Air Act currently allows other states to adopt the standards set by California and, as of today, thirteen other states and Washington DC abide by California’s stricter standards.

The plan currently proposed by the Trump administration aims to freeze fuel-efficiency standards for all vehicles after 2020.  Experts estimate that this new standard would increase average greenhouse gas emissions from new vehicles by 20% in 2025 compared to the level projected under the current rules.  

California leaders have pledged to challenge the decision by the Trump administration in court. It is likely that other states and environmental groups will join in support of California, and it is possible that the lawsuit will make its way all the way to the Supreme Court.

(Jeff Tollefson, Nature)

Written by sciencepolicyforall

September 20, 2019 at 5:44 pm

Science Policy Around the Web August 30th, 2019


By Andrew Wright, BSc

Image by Steve Buissinne from Pixabay

EPA’s controversial ‘secret science’ plan still lacks key details, advisers say

In early 2018, under its previous administrator Scott Pruitt, the U.S. Environmental Protection Agency (EPA) first proposed rules to restrict the use of scientific findings whose data and methodologies are not public or cannot be replicated. Following the removal in late 2017 of all sitting Science Advisory Board (SAB) members who receive EPA grants (roughly half of the board), there was concern that environmental experts were being sidelined from EPA decision-making, a concern the proposed rule seemed to confirm. While making data public and replicable has merits, the SAB has countered that the proposed rule would make it impossible to use the most accurate information, as many environmental studies are long-term assessments of human exposure to toxins that cannot be ethically or efficiently replicated. Now, under administrator Andrew Wheeler, how the proposed rule will be implemented is still unclear.

A central concern is how to maintain privacy over personally identifiable information (PII) to comply with existing privacy laws and concerns (such as the Health Insurance Portability and Accountability Act, or HIPAA). One proffered strategy is a tiered approach based on the model currently used by the National Institutes of Health, whereby the more sensitive the PII, the more restricted its access.

As the SAB has decided to engage in a consultation of the proposed rule, individual members will have their comments written up in a report to be sent to Wheeler but will not have to come to a consensus for the proposed rule to move forward.  

(Sean Reilly, Science; reprinted from E&E News)

 Brazilian Amazon deforestation surges to break August records 

While the recent spate of fires in the Amazon rainforest has captured international attention, regular deforestation via cutting and clearing has also been rapidly increasing. In August alone, 430 square miles, an area the size of Hong Kong, was cut down. This comes after July’s loss of 870 square miles, a 275% jump from the previous year. At the current rate of deforestation, Brazil is on track to lose more than 3,800 square miles of rainforest this year, an area roughly one and a half times the size of Delaware.
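As a rough numerical check, the figures above can be worked through in a few lines. The square-mile totals come from the article; the unit conversion and the reading of “a 275% jump” as 3.75 times the prior July’s total are my own assumptions:

```python
# Back-of-the-envelope check of the deforestation figures quoted above.
SQ_MI_TO_SQ_KM = 2.58999

august_loss = 430          # sq mi cut down in August 2019
july_loss = 870            # sq mi cut down in July 2019
annual_projection = 3800   # sq mi projected for all of 2019

# If July 2019 was a 275% jump over July 2018, last July saw:
july_2018 = july_loss / 3.75

print(f"August loss: {august_loss * SQ_MI_TO_SQ_KM:,.0f} sq km "
      f"(Hong Kong is ~1,106 sq km)")
print(f"Implied July 2018 loss: ~{july_2018:.0f} sq mi")
print(f"Projected 2019 total: {annual_projection * SQ_MI_TO_SQ_KM:,.0f} sq km")
```

The August figure converts to roughly 1,100 square kilometers, which is consistent with the Hong Kong comparison in the article.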

“The August data from Deter is hardly surprising,” said Claudio Angelo of Climate Observatory, referencing the Deter-B satellite that was put into place in 2015 to monitor Brazil’s rainforests. According to him and other representatives from non-governmental organizations, the Bolsonaro government is delivering on its promises to support local industries such as mining, ranching, farming, and logging rather than enforcing environmental protections. 

While this deforestation data is separate from data on forest fires, felled trees are often left to sit and dry before they are set aflame, leading forest engineers to predict that the fires will get worse in the coming months.

Since the Amazon rainforest generates its own weather patterns, studies have demonstrated the possibility that beyond 40% deforestation, the biome may irreversibly convert to savannah. This could impact global weather patterns, affecting Brazilian weather most severely. However, recent estimates place that tipping point closer to 20-25% due to the synergistic effects of climate change. According to the World Wildlife Fund, approximately 17% of the rainforest has been lost in the past 50 years, putting uncontrollable forest conversion much closer than previously assumed.

(Jonathan Watts, The Guardian)

Written by sciencepolicyforall

August 30, 2019 at 11:08 am

Science Policy Around the Web – July 12th, 2019


By Mohor Sengupta, Ph.D.

Source: Maxpixel

CDC made a synthetic Ebola virus to test treatments. It worked

During the 2014-2016 Ebola outbreak in Guinea, West Africa, infectious samples containing the virus were shared by the local government with international scientific communities. Using these materials, Dr. Gary Kobinger and his team developed and tested the efficacy of a monoclonal antibody treatment at the Canadian National Laboratory. The same treatment, ZMapp, and other therapies are currently being deployed in the most recent Ebola outbreak, the second largest so far. Beginning in 2018 in the Democratic Republic of Congo (DRC), this outbreak is still ongoing. Unfortunately, the Centers for Disease Control and Prevention (CDC) did not have any viral samples this time, meaning they were unable to test the efficacy of ZMapp and other drugs against the recent viral strain.

Scientists at the CDC, led by Dr. Laura McMullan, constructed an artificial virus from the sequence of the current strain shared by DRC’s National Biomedical Research Institute (INRB). The group used the sequence data to perform reverse genetics and generate the authentic Ebola virus that’s currently infecting scores of people in Ituri and North Kivu provinces of DRC. 

“It takes a lot of resources and a lot of money and a lot of energy to make a cloned virus by reverse genetics. And it would be so much easier if somebody had just sent the isolate,” said Dr. Thomas Geisbert, who was not involved in the work.

The CDC group established the efficacy of current treatments (the drug remdesivir and the monoclonal antibody cocktail ZMapp) against the new viral strain by using the artificial virus for all of the tests. Their work was published Tuesday in The Lancet.

For all four Ebola outbreaks that the DRC has seen, healthcare authorities have not shared viral specimens with foreign Ebola researchers. Instead, the whole genome sequence was provided each time. Using the whole-genome sequence data, the Lancet paper noted that at least two Ebola strains in the DRC have independently crossed into the human population.

The DRC’s reasons for not sharing viral samples are not known, but the practice is a roadblock to rapid and efficient treatment in affected regions. McMullan said that shipping samples across such large distances is often a logistical challenge, requiring permission from several authorities and coordination among many people.

 (Helen Branswell, STAT)

Plastic Has A Big Carbon Footprint — But That Isn’t The Whole Story

We are all too familiar with ghastly images of dead whales with plastic-filled stomachs. These images are compounded by pictures of how much waste is generated, such as a picture of a twenty-story high mound of plastic trash in a developing country that appeared in a recent news article. While there is worldwide concern about how to eliminate use of plastics, there is very little discussion about the environmental impact of the materials that will replace plastic. 

Plastic has a high carbon footprint. In a recent report, the Center for International Environmental Law (CIEL) broke down the greenhouse gas emissions of each step of a plastic product’s life, from the beginning of production until it is incinerated as waste. Manufactured from oil and natural gas, plastic adds to its carbon footprint from the cradle, when gases and oils leak into the environment. Delivery of raw materials to production sites adds further to the burden. Plastic is among the most energy-intensive materials to produce, consuming large amounts of energy and water. Finally, when plastics are incinerated, greenhouse gases end up in the environment.

But what about the materials that commonly substitute for plastic, such as paper, compostable plastic, canvas, or glass? What is their carbon footprint during production? Research by several independent groups has revealed that these materials leave an even larger carbon footprint during their production. Data have shown that polyethylene plastic bags not only used less fuel and energy throughout production, they also emitted fewer global-warming gases and left less solid waste than paper bags or compostable plastic bags. Because polyethylene bags are more durable than other materials, their use is also more energy efficient than that of paper bags.

Research done on behalf of the American Chemistry Council has shown that replacing plastic would eventually do more harm to the environment than continuing to use it. Finally, consumer habits count: if people don’t reuse plastics, their advantages over paper cease to exist. Of course, the problems of permanent waste and global health consequences cannot be overlooked. The solution might lie in using plastics more wisely and reusing them as much as possible.

(Christopher Joyce, NPR)

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

July 12, 2019 at 3:18 pm

Posted in Linkposts


How human health depends on biodiversity


By: Lynda Truong

Image by V Perez from Pixabay 

By many measures, the Earth is facing its sixth mass extinction. The fifth mass extinction, a result of a meteorite approximately 10 km in diameter, wiped out the dinosaurs and an estimated 40-75% of species on Earth. This time around, the natural disaster that is threatening life on Earth is us.

In May, the United Nations released a preliminary report on the drastic risk to biodiversity (not to be confused with the recent report on the drastic consequences of climate change). The assessment, which was compiled by the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES), draws on information from 15,000 scientific and government sources with contributions from 145 global experts. It projects that one million species face a risk of extinction. Scientists have estimated that the historical baseline rate of extinction is one extinction per million species per year, and more recent studies suggest rates as low as 0.1 per million species per year. At these baseline rates, it would take one to ten million years to see the same magnitude of extinction the planet currently faces. This accelerated rate of extinction can be linked to a variety of man-made causes, including changes in land and sea use, direct exploitation of organisms, climate change, pollution, and the introduction of invasive species.
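The one-to-ten-million-year comparison can be reproduced with a short back-of-the-envelope calculation, under one assumption of mine: that the background rate (0.1 to 1 extinctions per million species per year, or E/MSY) is applied to the one million at-risk species themselves:

```python
# Reproducing the "one to ten million years" comparison from the paragraph
# above, assuming the background extinction rate applies to the pool of
# one million at-risk species.
at_risk_species = 1_000_000

for rate in (1.0, 0.1):  # background rate in E/MSY
    extinctions_per_year = at_risk_species * rate / 1_000_000
    years_needed = at_risk_species / extinctions_per_year
    print(f"At {rate} E/MSY: ~{years_needed:,.0f} years to lose 1,000,000 species")
```

At 1 E/MSY the loss takes about a million years; at 0.1 E/MSY, about ten million, matching the range cited in the assessment.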

For some, that may not seem important. If humans are not on the endangered species list, why should it matter? As the IPBES Global Assessment indicates, however, healthy ecosystems provide a variety of services, including improving air quality, purifying drinking water, and mitigating floods and erosion. The vast canopies of rainforests worldwide sequester 2.6 billion tons of carbon dioxide a year. Plants and soil microbes found in wetlands can remove toxins from water, including explosive chemicals such as nitroglycerin and trinitrotoluene (TNT). Mangrove forests serve as an important buffer against ocean storm surges for those on land. Nature is a powerful resource, and declines in biodiversity have broad implications for global development and health.

The importance of biodiversity to global health is immediately apparent in middle- and low-income countries, which rely heavily on natural remedies and seasonal harvests for health and nutrition. The loss of entire species of plants can eliminate valuable sources of traditional medicine for indigenous communities. Genetically diverse crops are more resilient to pests and disease, ensuring a stable food supply and bolstering food security. Beyond this, ecosystem disturbances also have complex implications for infectious diseases, which are often endemic to developing nations.

However, these effects are also seen in first-world countries. A well-cited example of the impact of biodiversity loss on infectious disease involves Lyme disease, which is endemic to parts of the United States. The white-footed mouse is a common carrier of Lyme disease, and in areas with high densities of these mice, ticks are likely to feed on the mice and subsequently transmit the disease to humans. The presence of other mammals that the tick can feed on, however, dilutes the disease reservoir, lowering the likelihood of an outbreak (commonly referred to as the “dilution effect”). While biodiversity has complicated effects on the spread of infectious diseases, drastic changes to ecosystems often provide a breeding ground for disease vectors and lead to increases in transmission.
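A toy calculation can make the dilution effect concrete. This is a deliberate simplification of my own, not the model used in the research on Lyme disease: assume a tick bites one host at random, and only white-footed mice are competent reservoirs (the 0.9 competence figure is illustrative, not measured):

```python
# Toy illustration of the "dilution effect": adding non-competent hosts
# dilutes the disease reservoir and lowers the chance a tick picks up Lyme.
def infection_chance(mice, other_mammals, mouse_competence=0.9):
    """Chance that a random tick bite acquires Lyme from a competent host."""
    total_hosts = mice + other_mammals
    return (mice / total_hosts) * mouse_competence

# Same mouse population, different levels of host diversity:
low_diversity = infection_chance(mice=100, other_mammals=10)
high_diversity = infection_chance(mice=100, other_mammals=100)
print(f"Low diversity:  {low_diversity:.2f}")   # most bites land on mice
print(f"High diversity: {high_diversity:.2f}")  # reservoir is diluted
```

Even though the number of infected mice is unchanged, the more diverse community roughly halves the chance that any given tick bite transmits the pathogen.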

In addition to the direct effects that declines in biodiversity have on global health, an often-neglected aspect of biodiversity’s importance is its role as a resource for biomedical science. The IPBES assessment reports that 70% of cancer drugs are natural products or inspired by natural sources such as traditional medicines. This merely scratches the surface of nature’s influence on modern biomedical research.

Much like the communities that rely on natural products as medicine, many drug compounds produced by pharmaceutical companies are derived from nature. Morphine has been one of the most revolutionary drug compounds in history, effectively treating both acute and chronic pain. The compound was originally isolated from the opium poppy, and its chemical structure has since been modified to reduce negative effects and improve potency. While the current opioid crisis in the United States has highlighted the importance of moderate use, morphine and its analogues are some of the most useful and reliable pain relievers in modern medicine. Similarly, aspirin has been regarded as a wonder drug for its analgesic, anti-inflammatory, and cardioprotective effects. Aspirin is a chemical analogue of salicylic acid, a compound originally isolated from willow tree bark. 

Beyond general pain relief, many naturally derived drugs have also been useful for disease treatment. Quinine, the first effective antimalarial drug, was extracted from the bark of cinchona trees, and quinine and its analogues are still used to treat malaria today. Penicillin, serendipitously discovered in a fungus, has been useful for treating bacterial infections and informing modern antibiotic development. These medicines and many more have been crucial to the advancement of human health, yet could have just as easily been lost to extinction.

On a more fundamental level, scientific research has benefited from many proteins isolated from nature. Thermophilic polymerases, isolated from a bacterium residing in hot springs, are now an essential component of the polymerase chain reaction (PCR), a common laboratory technique that amplifies segments of DNA. This method is critical in molecular biology labs for basic research and in forensic labs for criminal investigations. Fluorescent proteins, which have been isolated from jellyfish and sea anemones, revolutionized the field of molecular biology by allowing scientists to visualize dynamic cellular components in real time. More recently, CRISPR/Cas systems were discovered in bacteria and have been developed into a gene-editing tool capable of easily and precisely modifying genetic sequences. These basic tools have vastly improved the scope of biomedical research, and all of them would have been close to impossible to develop without their natural sources.

In addition to medicines and tools, nature has often informed biomedical research. Denning bears are commonly studied for potential solutions to osteoporosis and renal disease. Their ability to enter a reduced metabolic state where they do not eat, drink, or defecate for months at a time provides valuable insight into how these biological processes may be adapted to benefit human disease and physiology. Even more interestingly, there are a few species of frogs that become nearly frozen solid in winter, and thaw fully recovered in spring. In this frozen state, much of the water in their body turns to ice, their heart stops beating, and they stop breathing. When temperatures rise, they thaw from the inside out and continue life as per usual. Crazy cryonics and immortality aside, the freeze/thaw cycles could inform improved preservation for organ transplants.

Nature is a much better experimentalist than any human, having had billions of years to refine its experiments through the process of evolution and natural selection. Depleting these living resources, which provide invaluable benefits to human health and ecosystems, lacks foresight and is dangerously reckless. The techno-optimist approach of ceaseless development in the blind belief that whatever problem humanity encounters can be solved with research and innovation neglects to account for the dependency of research and innovation on nature. Most biomedical scientists, most physicians, and much of the general public have probably devoted a minimal amount of consideration to the importance of biodiversity. But for the one million species currently at risk, and for the hundreds of million more yet to be discovered, it’s worth a thought.

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

June 7, 2019 at 9:51 am

Science Policy Around the Web – April 2, 2019


By: Patrice J. Persad, Ph.D.

Image by Jason Gillman from Pixabay

Worrisome nonstick chemicals are common in U.S. drinking water, federal study suggests

What lurks in our drinking water, and the full range of its effects on organismal health, may be more of a mystery than what resides in the deep recesses of our oceans. In a recent investigation conducted by the United States Geological Survey and the Environmental Protection Agency (EPA), drinking water samples were analyzed for man-made per- and polyfluoroalkyl substances (PFAS). PFAS, which put the “proof” in waterproof items, are substances of concern, or, more aptly, contaminants of emerging concern (CECs), given their potential carcinogenicity and persistence in ecosystems. Perfluorooctanoic acid (PFOA), a PFAS discontinued in domestic production, was found in one sample at a concentration over 70 nanograms per liter (ng/l), and a trio of other PFAS surpassed this concentration as well. Federal agencies have yet to issue a standard level. However, the Centers for Disease Control and Prevention (CDC) attests that the existing cut-off of 70 ng/l is not sufficiently low, or conservative, with respect to human health.

The Environmental Working Group (EWG) suspects that over 100 million individuals in the U.S. drink water with PFAS. Citizens currently advocate for authorities to test drinking water samples and disclose PFAS concentrations. Without set standards, accountability for future detriments to health is up in the air. Only through discussion among the public, policy makers, the research community, and parties formerly or currently producing PFAS can we set safeguards to protect our water supply and our well-being.

(Natasha Gilbert, Science)


To Protect Imperiled Salmon, Fish Advocates Want To Shoot Some Gulls

In recreating the fundamental question “Who stole the cookies from the cookie jar?”, nature’s version spins off as “Who stole the juvenile salmon from Miller Island?” In this spiraling whodunit, an unexpected avian culprit surfaces: the gull. According to avian predation coordinator Blaine Parker, surveys revealed that a fifth of imperiled juvenile salmon were whisked away by gulls near channels flowing out of dams. Gulls also spirited these juvenile fish away from other avian predators, such as Caspian terns. Parker maintains that not every gull is a culprit in the species’ decline; gulls can assist with the population control of other birds that feast on the juveniles. He therefore supports killing only the individual gulls disturbing juvenile salmon, an approach known as lethal management.

Although there is precedent for sacrificing avian species for the security of juvenile salmon, several entities denounce lethal management of wayward gulls affecting the young fish’s survival rates. The Audubon Society of Portland points out that the Army Corps of Engineers’ modifications to dams for warding off gulls and other airborne predators are slipshod and ineffective, if not nonexistent. The U.S. Army Corps, despite this criticism, avows that killing specific gulls is only a final resort. From Parker’s and these organizations’ opposing viewpoints, a new mystery migrates to the surface: will killing avian predators populating dams and waterways have a significant impact on the endangered salmon’s survival? Research collaboration on ecological impacts may be a way to tell, or to reassess the futures of both juvenile salmon and gulls.

(Courtney Flatt, Northwest Public Broadcasting/National Public Radio)



Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

April 3, 2019 at 10:32 am