Science Policy For All

Because science policy affects everyone.


Science Policy Around the Web – July 12th, 2019


By Mohor Sengupta, Ph.D.

Source: Maxpixel

CDC made a synthetic Ebola virus to test treatments. It worked

During the 2014-2016 Ebola outbreak in Guinea, West Africa, infectious samples containing the virus were shared by the local government with international scientific communities. Using these materials, Dr. Gary Kobinger and his team developed and tested the efficacy of the monoclonal antibody treatment ZMapp at the Canadian National Laboratory. ZMapp and other therapies are currently being deployed in the most recent Ebola outbreak, the second largest so far. Beginning in 2018 in the Democratic Republic of Congo (DRC), this outbreak is still ongoing. Unfortunately, the Centers for Disease Control and Prevention (CDC) did not receive any viral samples this time, meaning they were unable to test the efficacy of ZMapp and other drugs against the recent viral strain.

Scientists at the CDC, led by Dr. Laura McMullan, constructed an artificial virus from the sequence of the current strain shared by DRC’s National Biomedical Research Institute (INRB). The group used the sequence data to perform reverse genetics and generate the authentic Ebola virus that’s currently infecting scores of people in Ituri and North Kivu provinces of DRC. 

“It takes a lot of resources and a lot of money and a lot of energy to make a cloned virus by reverse genetics. And it would be so much easier if somebody had just sent the isolate,” said Dr. Thomas Geisbert, who was not involved in the work.

The CDC group established the efficacy of current treatments (the drug remdesivir and the antibody treatment ZMapp) against the strain by using their artificial virus for all of the tests. Their work was published Tuesday in the journal Lancet.

For each of the last four Ebola outbreaks in the DRC, healthcare authorities have not shared viral specimens with foreign Ebola researchers; instead, the whole genome sequence was provided each time. Based on whole genome sequence data, the Lancet paper noted that at least two Ebola strains in the DRC have independently crossed into the human population.

The DRC's reasons for not sharing viral samples are not known, but the practice is a roadblock to rapid and efficient treatment in affected regions. McMullan said that shipping samples across such large distances is often a logistical challenge, requiring permission from several authorities and coordination among many people.

 (Helen Branswell, STAT)

Plastic Has A Big Carbon Footprint — But That Isn’t The Whole Story

We are all too familiar with ghastly images of dead whales with plastic-filled stomachs. These images are compounded by pictures of how much waste is generated, such as a recent news article's photograph of a twenty-story-high mound of plastic trash in a developing country. While there is worldwide concern about how to eliminate the use of plastics, there is very little discussion of the environmental impact of the materials that would replace them.

Plastic has a high carbon footprint. In a recent report, the Center for International Environmental Law (CIEL) broke down the greenhouse gas contributions of each stage of a plastic product's life, from the beginning of production until it is incinerated as waste. Manufactured from oil and natural gas, plastic adds to the carbon footprint from its cradle, when gases and oils leak into the environment during extraction. Delivery of raw materials to production sites adds further to the burden. Plastic is also among the most energy-intensive materials to produce, taking a heavy toll on water and electricity. Finally, when plastics are incinerated, greenhouse gases end up in the environment.

But what about the materials that commonly substitute for plastic, such as paper, compostable plastic, canvas or glass? What is the carbon footprint of their production? Research by several independent groups has revealed that these materials leave an even larger carbon footprint during production. Data have shown that polyethylene plastic bags not only use less fuel and energy throughout production; they also emit fewer global-warming gases and leave less solid waste than paper bags or compostable plastic bags. Because polyethylene bags are also more durable than these alternatives, using them is more energy friendly than using paper bags.

Research done on behalf of the American Chemistry Council has shown that replacing plastic would eventually do more harm to the environment than continuing to use it. Finally, consumer habits count: if people don't reuse plastics, their advantages over paper cease to exist. Of course, the problems of permanent waste and global health consequences cannot be overlooked. The solution might lie in using plastics more wisely and reusing them as much as possible.

(Christopher Joyce, NPR)

Have an interesting science policy link? Share it in the comments!


Written by sciencepolicyforall

July 12, 2019 at 3:18 pm

Posted in Linkposts


How human health depends on biodiversity


By: Lynda Truong

Image by V Perez from Pixabay 

By many measures, the Earth is facing its sixth mass extinction. The fifth, caused by an asteroid approximately 10 km in diameter, wiped out the dinosaurs and an estimated 40-75% of species on Earth. This time around, the natural disaster threatening life on Earth is us.

In May, the United Nations released a preliminary report on the drastic risk to biodiversity (not to be confused with the recent report on the drastic consequences of climate change). The assessment, compiled by the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES), draws on information from 15,000 scientific and government sources, with contributions from 145 global experts. It projects that one million species face risk of extinction. Scientists have estimated that the historical baseline rate of extinction is one per million species per year, and more recent studies suggest rates as low as 0.1 per million species per year. At these baseline rates, it would take one to ten million years to see extinction of the magnitude the planet currently faces. This accelerated rate of extinction can be linked to a variety of man-made causes, including changes in land and sea use, direct exploitation of organisms, climate change, pollution, and the introduction of invasive species.
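That one-to-ten-million-year time scale can be checked with back-of-the-envelope arithmetic. Reading the comparison as the time for a pool of species to turn over once at background rates is my own framing, not the report's stated method:

```latex
% Background rate: r extinctions per million species per year.
% A pool of S species then loses about rS/10^6 species per year,
% so losing E of them takes roughly
\[
  T \;\approx\; \frac{E}{\,rS/10^{6}\,}
    \;=\; \frac{E}{S}\cdot\frac{10^{6}}{r}\ \text{years}.
\]
% Taking E = S (the whole pool turning over once) and r between
% 0.1 and 1 gives T between one and ten million years.
```

With the article's two quoted background rates, this reproduces the stated one-to-ten-million-year range.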

For some, that may not seem important: if humans are not on the endangered species list, why should it matter? As the IPBES Global Assessment indicates, however, healthy ecosystems provide a variety of services, including improving air quality, purifying drinking water, and mitigating floods and erosion. The vast canopies of rainforests worldwide sequester 2.6 billion tons of carbon dioxide a year. Plants and soil microbes found in wetlands can remove toxins from water, including explosive chemicals such as nitroglycerin and trinitrotoluene (TNT). Mangrove forests serve as an important buffer against ocean storm surges for those on land. Nature is a powerful resource, and declines in biodiversity have broad implications for global development and health.

The importance of biodiversity to global health is immediately apparent in middle- and low-income countries, which rely heavily on natural remedies and seasonal harvests for health and nutrition. The loss of entire species of plants can eliminate valuable sources of traditional medicine for indigenous communities. Genetically diverse crops are more resilient to pests and disease, ensuring a stable food supply and bolstering food security. Beyond this, ecosystem disturbances also have complex implications for infectious diseases, which are often endemic to developing nations.

However, these effects are also seen in wealthier countries. A well-cited example of the impact of biodiversity loss on infectious disease involves Lyme disease, which is endemic to parts of the United States. The white-footed mouse is a common carrier of Lyme disease, and in areas with high densities of these mice, ticks are likely to feed on the mice and subsequently transmit the disease to humans. However, the presence of other mammals that ticks can feed on dilutes the disease reservoir, lowering the likelihood of an outbreak (commonly referred to as the “dilution effect”). While biodiversity has complicated effects on the spread of infectious diseases, drastic changes to ecosystems often provide a breeding ground for disease vectors and lead to increases in transmission.
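The logic of the dilution effect can be sketched with a deliberately simple simulation. This is a toy model, not an epidemiological one: the uniform host choice, the host counts, and the 50% mouse infection prevalence are all illustrative assumptions rather than figures from the studies above.

```python
import random

def infected_tick_fraction(n_mice, n_other, mouse_prevalence=0.5,
                           n_ticks=100_000, seed=42):
    """Fraction of ticks that acquire Lyme in a toy host community.

    Each tick feeds once, on a host drawn uniformly at random. Only
    white-footed mice are competent reservoirs here: a tick becomes
    infected only if it happens to draw an infected mouse."""
    rng = random.Random(seed)
    hosts = ["mouse"] * n_mice + ["other"] * n_other
    infected = sum(
        1 for _ in range(n_ticks)
        if rng.choice(hosts) == "mouse" and rng.random() < mouse_prevalence
    )
    return infected / n_ticks

# Mouse-dominated community vs. the same mice diluted by other hosts.
print(infected_tick_fraction(n_mice=80, n_other=20))    # ~0.80 * 0.5 = ~0.40
print(infected_tick_fraction(n_mice=80, n_other=180))   # ~0.31 * 0.5 = ~0.15
```

Adding incompetent hosts leaves the number of infected mice unchanged but makes any given tick less likely to bite one, which is the dilution effect in miniature.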

In addition to the direct effects that declines in biodiversity have on global health, an often-neglected aspect of biodiversity's importance for health is its role as a resource for biomedical science. The IPBES assessment reports that 70% of cancer drugs are natural products or inspired by natural sources such as traditional medicines. This merely scratches the surface of the influence of nature on modern biomedical research.

Much like the communities that rely on natural products as medicine, many drug compounds produced by pharmaceutical companies are derived from nature. Morphine has been one of the most revolutionary drug compounds in history, effectively treating both acute and chronic pain. The compound was originally isolated from the opium poppy, and its chemical structure has since been modified to reduce negative effects and improve potency. While the current opioid crisis in the United States has highlighted the importance of moderate use, morphine and its analogues are some of the most useful and reliable pain relievers in modern medicine. Similarly, aspirin has been regarded as a wonder drug for its analgesic, anti-inflammatory, and cardioprotective effects. Aspirin is a chemical analogue of salicylic acid, a compound originally isolated from willow tree bark. 

Beyond general pain relief, many naturally derived drugs have also been useful for disease treatment. Quinine, the first effective antimalarial drug, was extracted from the bark of cinchona trees, and quinine and its analogues are still used to treat malaria today. Penicillin, serendipitously discovered in a fungus, has been useful for treating bacterial infections and informing modern antibiotic development. These medicines and many more have been crucial to the advancement of human health, yet could have just as easily been lost to extinction.

On a more fundamental level, scientific research has benefited from many proteins isolated from nature. Thermophilic polymerases, isolated from a bacterium residing in hot springs, are now an essential component of the polymerase chain reaction (PCR) – a common laboratory technique that amplifies segments of DNA. This method is critical in molecular biology labs for basic research and in forensic labs for criminal investigations. Fluorescent proteins, isolated from jellyfish and sea anemones, revolutionized the field of molecular biology by allowing scientists to visualize dynamic cellular components in real time. More recently, CRISPR/Cas systems were discovered in bacteria and have been developed into a gene editing tool capable of easily and precisely modifying genetic sequences. These basic tools have vastly improved the scope of biomedical research, and all of them would have been close to impossible to develop without their natural sources.
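What makes those heat-stable polymerases so valuable is that PCR's amplification is geometric: each thermal cycle can double every template present. A minimal sketch of that arithmetic (the function name and the per-cycle efficiency parameter are illustrative, not a standard formula from any particular protocol):

```python
def pcr_copies(initial_copies, cycles, efficiency=1.0):
    """Expected copy number after `cycles` rounds of PCR.

    Each cycle copies a fraction `efficiency` of the templates present,
    so the population grows by a factor of (1 + efficiency) per cycle."""
    return initial_copies * (1 + efficiency) ** cycles

# One template molecule, 30 ideal cycles: 2**30, over a billion copies.
print(int(pcr_copies(1, 30)))            # 1073741824
# At a more realistic 90% per-cycle efficiency, growth is still geometric.
print(f"{pcr_copies(1, 30, efficiency=0.9):.2e}")
```

This exponential growth is why a trace of DNA from a crime scene or a single bacterial colony can be amplified into enough material to sequence or analyze.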

In addition to medicines and tools, nature has often informed biomedical research. Denning bears are commonly studied for potential solutions to osteoporosis and renal disease. Their ability to enter a reduced metabolic state in which they do not eat, drink, or defecate for months at a time provides valuable insight into how these biological processes might be adapted to benefit human disease and physiology. Even more interestingly, a few species of frogs become nearly frozen solid in winter and thaw, fully recovered, in spring. In this frozen state, much of the water in their bodies turns to ice, their hearts stop beating, and they stop breathing. When temperatures rise, they thaw from the inside out and continue life as usual. Crazy cryonics and immortality aside, these freeze/thaw cycles could inform improved preservation of organs for transplant.

Nature is a much better experimentalist than any human, having had billions of years to refine its experiments through evolution and natural selection. Depleting these living resources, which provide invaluable benefits to human health and ecosystems, lacks foresight and is dangerously reckless. The techno-optimist approach of ceaseless development, in the blind belief that whatever problem humanity encounters can be solved with research and innovation, neglects to account for the dependency of research and innovation on nature. Most biomedical scientists, most physicians, and much of the general public have probably devoted little consideration to the importance of biodiversity. But for the one million species currently at risk, and for the hundreds of millions more yet to be discovered, it's worth a thought.

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

June 7, 2019 at 9:51 am

Science Policy Around the Web – April 2, 2019


By: Patrice J. Persad, Ph.D.

Image by Jason Gillman from Pixabay

Worrisome nonstick chemicals are common in U.S. drinking water, federal study suggests

What lurks in our drinking water—and all its effects on organismal health—may be more of a mystery than what resides in the deep recesses of our oceans. In a recent investigation conducted by the United States Geological Survey and the Environmental Protection Agency (EPA), drinking water samples tainted with man-made per- and polyfluoroalkyl substances (PFAS) were analyzed. PFAS, which put the “proof” in waterproof items, are substances of concern, or, more aptly, contaminants of emerging concern (CECs), given their potential carcinogenicity and permanence in ecosystems. Perfluorooctanoic acid (PFOA), a PFAS no longer produced domestically, was found in one sample at a concentration over 70 nanograms per liter (ng/l), and a trio of other PFAS surpassed this concentration level as well. Federal agencies have yet to issue an enforceable standard. Moreover, the Centers for Disease Control and Prevention (CDC) maintains that the existing cut-off of 70 ng/l is not sufficiently low, or conservative, with respect to human health.

The Environmental Working Group (EWG) suspects that over 100 million individuals in the U.S. drink water containing PFAS. Citizens are currently advocating for authorities to test drinking water samples and disclose PFAS concentrations. Without set standards, accountability for future detriments to health is up in the air. Only through discussion among the public, policy makers, the research community, and parties formerly or currently producing PFAS can we set safeguards to protect our water supply and our well-being.

(Natasha Gilbert, Science)


To Protect Imperiled Salmon, Fish Advocates Want To Shoot Some Gulls

In recreating the fundamental question “Who stole the cookies from the cookie jar?”, nature's version spins off as “Who stole the juvenile salmon from Miller Island?” In this spiraling whodunit, an unexpected avian culprit surfaces: the gull. According to avian predation coordinator Blaine Parker, surveys revealed that a fifth of imperiled juvenile salmon were whisked away by gulls near channels flowing out of dams. Gulls also spirited these juvenile fish away from other avian predators, such as Caspian terns. Parker maintains that not every gull is a perpetrator of the species' decline; gulls can assist with population control of other birds that feast on the juveniles. He therefore supports killing only the individual gulls preying on juvenile salmon—lethal management.

Although there is precedent for sacrificing avian species for the security of juvenile salmon, several entities denounce lethal management of wayward gulls affecting the young fish's survival rates. The Audubon Society of Portland points out that the Army Corps of Engineers' modifications to dams for warding off gulls and other airborne predators are slipshod and ineffective, if not nonexistent. The Army Corps, despite this criticism, avows that killing specific gulls is only a last resort. From Parker's and these organizations' opposing viewpoints, a new mystery migrates to the surface: will killing avian predators populating dams and waterways have a significant impact on the endangered salmon's survival? Collaborative research on ecological impacts may be the way to tell, and to reassess the futures of both juvenile salmon and gulls.

(Courtney Flatt, Northwest Public Broadcasting/National Public Radio)



Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

April 3, 2019 at 10:32 am

Science Policy Around the Web – March 18, 2019


By: Allison Cross, Ph.D.

Source: Pixabay

Scientists track damage from controversial deep-sea mining method

The extraction of rare and valuable metals and minerals from the deep sea is highly attractive to mining companies.  Scientists, however, have long raised concerns about potential harmful effects of these activities on marine ecosystems.  Next month, the mining company Global Sea Mineral Resources is scheduled to harvest precious metals and minerals on the seafloor in the remote Pacific Ocean for eight days with a team of scientists working alongside them.  The scientists will be using deep-sea cameras and sensors to monitor sediment plumes created by the mining activity.  

Scientists are concerned that sediment plumes created during deep-sea mining could extend tens or hundreds of meters above the seafloor and “bury, smother and toxify” the marine communities in these regions. The research expedition scheduled for next month is intended to help scientists understand the potential impact of deep-sea mining and inform the development of an international code of conduct for deep-sea commercial mining.

The code of conduct for deep sea commercial mining will be created by the International Seabed Authority (ISA), an organization founded in 1994 to organize, regulate and control all mining activity in international waters.  The ISA is planning to finalize the code by 2020, allowing companies that have been granted licenses to extract minerals from the deep sea to begin full scale mining in the Pacific Ocean.  

Though the experiment scheduled for next month will provide key insight into how long it takes for sediment plumes to resettle, and how far they can travel, it is simply too short to gauge the potential long-term effects of mining activities. Craig Smith, an oceanographer at the University of Hawaii at Manoa in Honolulu, cautions: “We will not really understand the actual scale of mining impacts until the effects of sediment plumes from full-scale mining are studied for years”.

(Olive Heffernan, Nature Briefing)

U.S. blocks U.N. Resolution on Geoengineering

Last week, during the fourth session of the UN Environment Assembly (UNEA) in Nairobi, the United States, Saudi Arabia, and Brazil joined together to block a resolution aimed at studying the potential risks of geoengineering.  “Geoengineering”, also referred to as climate engineering or climate intervention, aims to mitigate effects of global warming using techniques like solar radiation management and carbon dioxide removal.

Geoengineering technologies are not yet operational, and while proponents believe these techniques could help curb the impact of climate change, opponents worry about their potential risks to both people and nature. Notably, one proposed method of solar radiation management involves using aerosols to reflect a portion of inbound sunlight back out to space. Research in this area is still in its infancy, and some worry that infusing the atmosphere with aerosols could lead to undesired side effects, like severe weather.

The proposal raised at the UNEA meeting last week, backed by Switzerland and nine other nations, aimed to direct the U.N. Environment Programme to study the implications of geoengineering and compile a report by next year on current scientific research in this area. 

While there is some consensus that the issues around geoengineering technologies need to be explored, countries disagree on who should oversee these efforts. It has been reported that the United States prefers questions about geoengineering to be handled by the Intergovernmental Panel on Climate Change (IPCC), rather than by UNEA. The IPCC is reported to be assessing geoengineering as part of its next report, set to be published in 2021 or 2022.

(Jean Chemnick, Scientific American)

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

March 19, 2019 at 7:45 pm

Intellectual property theft and its effects on US-China trade relations


By: Neetu Gulati, Ph.D.

Source: Wikimedia

China and the US are currently in the midst of a trade war that, if not resolved by March 1, 2019, will lead to another increase in US tariffs. The trade war, which started over US accusations that China stole intellectual property from American companies, has already affected both countries' economies and could have global effects. The US has evidence that information including biomedical research breakthroughs, technological advances, and food product formulations has been stolen. In response to these illicit trade practices, the US imposed tariffs on Chinese imports, marking the beginning of the trade war.

So how did we get here? 2019 marks forty years of diplomatic relations between the United States and China, which officially began on January 1, 1979. Since relations began, the two countries have benefited from ongoing trade, and China has become the largest goods trading partner with the US. Bilateral economic relations have increased from $33 billion in 1992 to over $772 billion in goods and services in 2017.  Despite strong economic ties, relations between the two countries have come under strain in recent years. The US State Department has identified concerns over military conflict in the South China Sea, counter-intelligence and security issues, and the trade deficit, among other issues. These issues came to a head in April 2018 when President Donald J. Trump issued a statement that China had stolen America’s intellectual property and engaged in illegal trade practices. In response, the US imposed additional tariffs on approximately $50 billion worth of Chinese imports. China then countered with tariffs on US imports, and thus a trade war between the two countries began.

To understand how intellectual property, or IP, fits into the trade war, it is important to first understand what it is. According to the World Intellectual Property Organization, IP “refers to creations of the mind, such as inventions; literary and artistic works; designs; and symbols, names and images used in commerce.” More simply, IP is something created or invented through human intellect, but not necessarily a tangible product. These products often have important scientific implications, as the umbrella of IP can cover genetically engineered crops, newly developed technologies and software, and new therapeutics, just to name a few. IP is legally protected through means such as patents, trademarks, and copyright, which allow people to gain recognition and financial benefits from their creations. These protections are country-specific, and the US Patent and Trademark Office gives guidance about protecting IP overseas, including in China. The process of transferring IP from the creator to another entity, often for distribution purposes, is known as technology transfer. This process is at the heart of the accusation of theft of American IP.

According to a seven-month long investigation done by the United States Trade Representative (USTR), China’s unreasonable technology transfer policies meant they did not live up to the commitments made when joining the World Trade Organization. The report found that Chinese laws require foreign companies to create joint ventures with domestic Chinese companies in order to sell goods within the country. The investigation by USTR found that “China’s regulatory authorities do not allow U.S. companies to make their own decisions about technology transfer and the assignment or licensing of intellectual property rights.  Instead, they continue to require or pressure foreign companies to transfer technology as a condition for securing investment or other approvals.” By pushing for technology transfer, these laws opened up American companies to theft of their IP. Stolen IP has included things like software code for a wind turbine, genetically modified corn seeds, the idea behind a robot named Tappy, and even the formulation for the chemical that makes Oreo filling white.

Beyond stealing information for goods entering China, it is also possible that Chinese workers in the United States may be stealing IP and sending it back to their home country. For example, a Chinese scientist known as ‘China’s Elon Musk’ was accused by his former research advisor of stealing research done at Duke University and replicating it in China for his own gain. A former assistant director of counterintelligence at the FBI suspects that the Chinese scientist was sent by the Chinese government intentionally to steal IP. This was not an isolated incident, either. According to a report from an advisory committee to the National Institutes of Health (NIH), research institutions in the US may have fallen victim to a small number of foreign researchers associated with China’s “Talents Recruitment Program,” which the National Intelligence Council identified as an effort “to facilitate the legal and illicit transfer of US technology, intellectual property and know-how.” This comes mere months after the NIH announced that it had identified undisclosed financial conflicts between US researchers and foreign governments. Without giving details of specific countries, NIH Director Francis Collins reported to a Senate Committee hearing that “the robustness of the biomedical research enterprise is under constant threat.” Nevertheless, these threats should not hinder the research enterprise. During a hearing in April 2018, House Science Committee Chair Lamar Smith remarked, “on the one hand, we must maintain the open and collaborative nature of academic research and development. On the other, we must protect our research and development from actors who seek to do us harm.”

The balance between research collaboration and theft is delicate. Information sharing is increasingly necessary as scientific pursuits become more interdisciplinary in nature, and can lead to more productivity in research. However, voluntary collaboration is different from unwilling or coerced transfer of ideas. The ability of US scientists and entrepreneurs to innovate and create new IP is an important driver of the American economy, and further allows for the ability to research new scientific pursuits. Not only does IP theft undermine the incentive and ability for Americans to innovate, it has had drastic negative effects on the American economy, with annual losses estimated to be between $225 billion and $600 billion according to a report put out by the IP Commission. These losses directly affect those who own and/or license IP, as well as those who are associated with these companies or individuals. This can then lead to downsizing or cutting jobs, further harming American science and technology industries. It is for this reason that the US responded so strongly against the evidence of IP theft.

In response to the accusations from the US, Chinese President Xi Jinping promised to resolve the “reasonable concerns” of the US regarding IP practices. The Chinese government announced punishments that could restrict Chinese companies from state funding support over IP theft, and at the G20 Summit in December 2018 the presidents of the two nations agreed to a 90-day truce on tariff escalation, which will end March 1, 2019.

The two countries are currently working on a trade deal to end the escalating tariffs, which would lessen tensions between the world’s two largest economies. The US wants China to commit to buying more American goods and services, and to agree to end the practice of requiring American companies to give technology transfers in order to do business in China. Without hashing out details, China has agreed to increase imports of U.S. agriculture, energy, industrial products and services. Delegations from the two countries will meet again in mid-February in China to continue negotiating. Trump was optimistic that the two nations would be able to make a deal before the deadline, saying, “I believe that a lot of the biggest points are going to be agreed to by me and him.”  

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

February 7, 2019 at 9:39 pm

Saving the Chesapeake Bay – Home to 18 million people


By: Hsiao Yu Fang, Ph.D.

Source: Flickr

The Chesapeake Bay is the largest U.S. estuary, where freshwater from rivers and streams flows into the ocean and mixes with seawater, making it a rich environment with abundant wildlife. Every year, the Bay produces 500 million pounds of seafood. The entire Chesapeake Bay watershed, which includes six states (New York, Pennsylvania, Maryland, Virginia, West Virginia, and Delaware) and the District of Columbia, is home to 3,600 species of plants and animals and more than 18 million people. Importantly, the actions of these 18 million people directly affect the health of the Bay. To quote the movie Finding Nemo, “All drains lead to the ocean.” Due to water-borne nutrient pollution from human-produced waste and runoff from cities and farms, the Bay has been on the country’s “impaired waters” list for decades. Thankfully, recent policy measures to regulate the environmental impact of human activity on the Bay have shown promising results that, with further effort, could fully restore the Bay's health.

At one point, the condition of the Chesapeake Bay seemed almost irreversible. Years of population growth and pollution led to a significant decline in animal species, affecting commercial and recreational fishing as well as tourism. Scientists have shown that about one-third of the nitrogen in the Chesapeake comes from air pollution. Pollution emitted by power plants and vehicles is carried over long distances by weather systems and eventually deposits into the Bay’s waters. Because air pollution can travel thousands of miles, the region over which air pollutants can affect the Bay, known as the airshed, is about nine times as large as the Bay watershed. Excess nitrogen and phosphorus pollution in the Chesapeake causes a biological chain reaction that results in “dead zones” – areas with minimal amounts of oxygen. This phenomenon worsens in the summer, when heat and pollution fuel algae blooms, blocking sunlight and depleting life-sustaining oxygen underwater. Aquatic life including fish, crabs, and oysters suffocates in the areas of the Bay affected by dead zones. The Bay used to yield tens of millions of bushels of oysters; today the annual catch has fallen to less than one percent of historic levels.

There have been several attempts over the years to restore the Bay. The Clean Water Act of 1972 reduced industrial pollution flowing into the Bay, though it fell short of its promise of transforming the Bay into “fishable, swimmable” waters. In 1984, the six states within the Bay watershed embarked on another cleanup plan, which again failed to produce lasting improvements. In 2010, the Chesapeake Clean Water Blueprint was established, the largest water cleanup plan ever managed by the US government. Using powers granted by the Clean Water Act, the Environmental Protection Agency (EPA) issued new limits on the nitrogen, phosphorus, and sediment feeding into the Bay, and the six Bay states and the District of Columbia subsequently announced formal plans to meet those limits by 2025. What makes the Blueprint unique compared to previous failed attempts is that it imposes penalties on states that fail to act: each state is required to meet two-year incremental milestones of pollution reduction. Ideally, once the Blueprint fully achieves its goals, the Bay will no longer be on the impaired waters list.

Almost a decade has passed since restoration efforts under the Chesapeake Clean Water Blueprint began, and the Bay already shows the potential to become a transformative environmental success story. Today, the Bay appears more resilient and better able to absorb excess pollution loads. Recent studies have shown that the Bay is beginning to replenish oxygen in its waters, repairing what were once underwater dead zones. The Habitat Indicator Scores in the Chesapeake Bay Foundation’s (CBF) 2018 State of the Bay Report show that the Bay’s resilience, quantified as the growth of underwater grasses and resource lands, is slowly increasing from its 2016 level, despite the record-breaking summer storms of 2018.

While progress has been made in restoring the Bay, more is needed. Bipartisan support from the federal government, along with federal-state collaboration, is essential to the Bay’s further recovery. The Bay’s overall health remains fragile, and additional improvement is not assured. In fact, CBF’s 2018 State of the Bay Report, released this month, showed a decline in the Bay’s health for the first time in a decade, attributable to extreme storm-related weather in 2018 that carried high concentrations of nitrogen, phosphorus, and debris into the Bay.

The Chesapeake Bay’s health has vital impacts on people’s health, jobs, and access to clean drinking water. The forests in the Bay watershed supply safe, filtered drinking water to 75 percent of the watershed’s residents, nearly 13 million people. If more action is not taken now, the cost of inaction will be far more dire and expensive than current restoration efforts. The Chesapeake Clean Water Blueprint might be the best and last chance to restore the Bay. Simple individual actions, like conserving water and energy in our daily activities, volunteering in stream and river cleanups, and contacting local representatives to advocate for protecting the Bay, can also go a long way toward the Bay’s well-being. “Treasure the Chesapeake” is not just a slogan on a license plate: these words underlie a great environmental recovery project, as well as a potential model for water pollution cleanup projects around the world.

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

January 23, 2019 at 3:54 pm

Science Policy Around the Web – October 26, 2018

leave a comment »

By: Mohor Sengupta, Ph.D.


Source: Pixabay

Environmental Problems

A 14-year-long oil spill in the Gulf of Mexico verges on becoming one of the worst in U.S. history

In 2004, Hurricane Ivan leveled an oil production platform in the Gulf of Mexico owned by Taylor Energy. The storm’s force destroyed the colossal platform, which had drilled into several oil wells, leaving a huge mound of muck covering the broken steel structure and leaking oil. To date, efforts to seal off the leak have not been successful.

Taylor Energy at first denied that there was any leak and then underreported its extent. According to current estimates, about 700 barrels of oil are leaking per day, with each barrel holding 42 gallons. The company kept this information secret for many years, and few people are aware of the actual scale of the spill. The Taylor Energy spill in fact pre-dates the Deepwater Horizon oil spill (also called the BP leak), so far the largest marine oil spill in history at 168 million gallons. While BP has coughed up $66 billion for fines, legal settlements, and cleanup, Taylor Energy is a much smaller operation and too cash-strapped to afford cleanup on such a scale.

In these actions Taylor Energy flouted both the Oil Pollution Act of 1990, which mandates that spills be reported to the U.S. Coast Guard National Response Center (NRC), and the Clean Water Act of 1972, which created a framework for regulating water pollutants. Environmentalists took Taylor Energy to court, and Taylor Energy and the NRC were jointly found accountable for presenting false numbers and data. An assessment submitted to Taylor Energy in 2009 by Waldemar S. Nelson and Company, a private firm, discussed the risks of eating fish from the affected area. A recent, independent analysis for the Justice Department showed that the NRC’s original estimate of 1 to 55 barrels of leakage per day was inaccurate: after several spillage tests, Oscar Garcia-Pineda, the author of the analysis, concluded that his results did not tally with those reported by the NRC and that the actual rate of spillage was 48 to ~1,700 barrels per day.

These disturbing findings have arrived at a delicate time for environmental protection policy. Earlier this year, the Trump administration proposed a wide expansion of leases to the oil and gas industry, which would open all offshore areas of the continental shelf, including those along the Atlantic coast, to drilling. Oil and gas representatives are lobbying for the proposal, citing billions of dollars in annual economic growth, increased jobs, and lower heating costs. However, multiple governors representing states across the four planning areas, from Maine to the Florida Keys, oppose it.

Reports show that, on average, there are 20 uncontrolled releases of oil per 1,000 wells under state or federal jurisdiction. In Louisiana alone, approximately 330,000 gallons of oil are spilled from offshore and onshore rigging platforms. With changing climate patterns, Atlantic hurricanes are predicted to become more intense in the future, and given the government’s plans to extend rigging along the Atlantic coast, a bleak prospect looms ahead.

(Darryl Fears, Washington Post)

Health Supplements

The Problem with Probiotics

A healthy balance of the natural flora of the gut, also called the gut microbiome, is essential for a healthy digestive system. Antibiotics have been shown to disrupt the gut microbiome, resulting in diseases such as diarrhea and infections with Clostridium difficile. As an antidote, it has become common practice to take “good bacteria”, or probiotics, while on antibiotic treatment. These probiotics are essentially a mixture of supposedly healthy gut microbes meant to replace those disrupted by the antibiotic.

Although people commonly take probiotics, this class of product is not regulated by the FDA, and there are rising concerns about the manufacturing standards and quality of these over-the-counter health supplements. Most recently, Dr. Pieter A. Cohen cautioned against overlooking the harmful effects of widely marketed probiotics in an article published in “JAMA Internal Medicine”.

There have been so many studies discussing the benefits of probiotics that the journal “Nutrition” recently published a systematic review of systematic reviews. In a nutshell, studies of probiotic efficacy have produced only very limited positive results, and only pure microbial strains were used as supplements in those studies. Moreover, there is no evidence that probiotics are beneficial in treating conditions such as Crohn’s disease, chronic diarrhea, ulcerative colitis, or liver disease, all related in some way to the gut microbiome.

Safety assessment studies have found probiotics contaminated with unwanted microbial strains, and without FDA oversight of manufacturing, production often does not follow a well-defined pipeline. It is not known what health hazards these contaminants might cause, warns Dr. Cohen, and they can be lethal: in one notorious case, the death of an infant was attributed to a contaminated dietary supplement.

Unfortunately, none of these events has deterred Americans from using probiotics. Almost four million adults in the United States, or 1.6 percent, used probiotics in 2012, and the global market for probiotics is steadily growing. Given this, it is of great importance that dietary supplements receive the rigorous assessment and quality control checks that prescription drugs undergo, and that consumers be made more aware of possible adulteration in probiotics.

(Aaron E. Carroll, New York Times)

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

October 26, 2018 at 12:36 pm