Science Policy For All

Because science policy affects everyone.

Posts Tagged ‘EPA’

Science Policy Around the Web – August 1, 2019


By Andrew Wright BSc

Image by Steve Buissinne from Pixabay 

Major U.S. cities are leaking methane at twice the rate previously believed

While natural gas emits less carbon dioxide (CO2) than other fossil fuels when burned, natural gas that escapes unburned into the atmosphere as methane (CH4) acts as a greenhouse gas 20-80 times more potent than CO2. Some of this impact is supposed to be mitigated by the relatively low amount of leaked methane, roughly 370,000 tons across the six major urban areas studied, according to a 2016 report from the EPA. However, a new study in the journal Geophysical Research Letters analyzed those same metropolitan centers and found that the EPA estimate captures less than half of the methane actually released. By taking simultaneous measurements of ethane, which appears in the natural gas supplied to homes and businesses but not in emissions from natural sources or landfills, researchers were able to delineate the sources of the leakage.

From their analysis, the total estimate for the six cities studied was 890,000 tons of CH4, 84% of which came from natural gas leaks. While the authors of the study are unsure why the EPA estimates are so low, they suggest it could be because the EPA only estimates leaks in the distribution system, rather than endpoint leaks in homes and businesses. While these results cannot be reliably extrapolated to newer cities, whose infrastructure may be more resilient to leakage, they could prompt further study to build a clearer picture of national methane release.
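As a quick back-of-envelope check, the figures reported in this post can be combined directly (using only the numbers given above; variable names are illustrative):

```python
# Figures reported for the six urban areas studied.
epa_estimate_tons = 370_000   # 2016 EPA estimate
measured_tons = 890_000       # new Geophysical Research Letters estimate
leak_fraction = 0.84          # share attributed to natural gas leaks

leaked_tons = measured_tons * leak_fraction
ratio = epa_estimate_tons / measured_tons  # ~0.42, i.e. less than half

print(f"Leaked CH4: ~{leaked_tons:,.0f} tons")
print(f"EPA estimate covers ~{ratio:.0%} of the measured total")
```

The ratio of roughly 42% is what justifies saying the EPA figure captures less than half of the measured release.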

(Sid Perkins, Science)

 

Japan approves first human-animal embryo experiments

On March 1st the Japanese science ministry lifted its ban on growing human cells in animal embryos and transferring those embryos to animal uteri. While human-animal hybrid embryos have been made before, functional offspring have never been allowed to develop. The first researcher to take advantage of the new regulatory scheme is Hiromitsu Nakauchi, director of the Center for Stem Cell Biology and Regenerative Medicine at the Institute of Medical Science at the University of Tokyo and a faculty member at Stanford University. His long-term goal is to grow human organs in animals such as pigs, from which the functional organs could be extracted and transplanted into human patients. His intent is to start with an early embryonic mouse model, then a rat model, and finally a pig model with embryos that develop for up to 70 days.

This measured approach stands in stark contrast to the recent controversy over CRISPR-edited babies in China, but it has still been met with a certain level of ethical skepticism. Bioethicists are particularly concerned that the human cells injected into animal embryos, induced pluripotent stem (iPS) cells, may deviate from their intended target (in this case the pancreas) and affect the host animal’s cognition. According to Nakauchi, the experimental design, which involves eliminating the gene for the target organ and injecting human iPS cells to compensate, ensures that the human cells should contribute only to a specific part of the animal.

While Nakauchi’s group used this method to successfully grow a mouse-derived pancreas in a rat, they have had limited luck introducing human iPS cells into sheep embryos. Given the evolutionary distance among mice, rats, pigs, and humans, it may be difficult to produce more satisfactory results. To address this, Nakauchi has suggested trying genetic editing techniques as well as iPS cells at various developmental stages.

(David Cyranoski, Nature)

 

 


Written by sciencepolicyforall

August 1, 2019 at 12:23 pm

Science Policy Around the Web – April 2, 2019


By: Patrice J. Persad Ph.D.

Image by Jason Gillman from Pixabay

Worrisome nonstick chemicals are common in U.S. drinking water, federal study suggests

What lurks in our drinking water, and all its effects on organismal health, may be more of a mystery than what resides in the deep recesses of our oceans. In a recent investigation conducted by the United States Geological Survey and the Environmental Protection Agency (EPA), drinking water samples were analyzed for manmade per- and polyfluoroalkyl substances (PFAS). PFAS, which put the “proof” in waterproof items, are substances of concern, or, more aptly, contaminants of emerging concern (CECs), given their potential carcinogenicity and persistence in ecosystems. Perfluorooctanoic acid (PFOA), a PFAS no longer produced domestically, was found in one sample at a concentration over 70 nanograms per liter (ng/l), and a trio of other PFAS surpassed this concentration as well. Federal agencies have yet to issue an enforceable standard. Moreover, the Centers for Disease Control and Prevention (CDC) holds that the existing cut-off of 70 ng/l is unacceptable in that it is not sufficiently low, or conservative, with respect to human health.

The Environmental Working Group (EWG) suspects that over 100 million individuals in the U.S. drink water containing PFAS. Citizens are currently advocating for authorities to test drinking water samples and disclose PFAS concentrations. Without set standards, accountability for future detriments to health is up in the air. Only through discussion among the public, policy makers, the research community, and parties formerly or currently producing PFAS can we set safeguards to protect our water supply and our well-being.

(Natasha Gilbert, Science)


To Protect Imperiled Salmon, Fish Advocates Want To Shoot Some Gulls

In recreating the fundamental question “Who stole the cookies from the cookie jar?”, nature’s version spins off as “Who stole the juvenile salmon from Miller Island?” In this spiraling whodunit, an unexpected avian culprit surfaces: the gull. According to avian predation coordinator Blaine Parker, surveys revealed that a fifth of imperiled juvenile salmon were whisked away by gulls near channels flowing out of dams. Gulls also spirited away these juvenile fish from other avian predators, such as Caspian terns. Parker maintains that not every gull is a perpetrator of the species’ decline; gulls can assist with the population control of other birds that feast on the juveniles. Therefore, he supports killing only the individual gulls preying on juvenile salmon: lethal management.

Although there is precedent for sacrificing avian species for the security of juvenile salmon, several entities denounce lethal management of the wayward gulls affecting the young fish’s survival rates. The Audubon Society of Portland points out that the Army Corps of Engineers’ modifications to dams for warding off gulls and other airborne predators are slipshod and ineffective, if not nonexistent. The Corps, despite this criticism, avows that killing specific gulls is only a last resort. From Parker’s and these organizations’ opposing viewpoints, a new mystery migrates to the surface: will killing avian predators populating dams and waterways have a significant impact on the endangered salmon’s survival? Collaborative research on ecological impacts may be a way to tell, or to reassess the futures of both juvenile salmon and gulls.

(Courtney Flatt, Northwest Public Broadcasting/National Public Radio)



Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

April 3, 2019 at 10:32 am

Science Policy Around the Web – March 26, 2019


By: Neetu M. Gulati Ph.D.

Image by Dimitris Vetsikas from Pixabay

Sunscreen ban aimed at protecting coral reefs sparks debate – among scientists

Corals around the world have begun “bleaching”: turning white and expelling the algae that live within them. After a 2015 study found that oxybenzone can harm corals, environmentalists have worked to bar the sale of sunscreens containing the chemical. Last year, Hawaii became the first US state to ban the sale of sunscreens containing oxybenzone, as well as another harmful chemical, octinoxate, which are found in up to 75% of sunscreens on the US market. The ban will go into effect in 2021, and Florida and California are considering similar laws. However, while some are fighting to limit the use of these toxic chemicals, others say the major issue is not sunscreen but climate change.

Evidence indicates that harmful chemicals and warming oceans due to climate change are both damaging corals and leading to bleaching. Scientists agree that the major contributing factor is climate change and the chemicals play a lesser role. Nevertheless, they disagree about what should be done. C. Mark Eakin, an oceanographer and the coordinator for NOAA’s Coral Reef Watch program, commented “if we don’t deal with climate change, it won’t matter what we do about sunscreens.” Furthermore, some people believe there is not enough clear evidence explaining how damaging these chemicals can be. While many scientists share this viewpoint, others think that every step towards saving the corals matters. Some lawmakers agree with this philosophy; Teri Johnston, the mayor of Key West, Florida, said of banning the harmful chemicals, “if it’s something we can do to minimize damage to reefs, it’s one small step we’re going to take.” The city of Key West banned the sale of sunscreens containing oxybenzone and octinoxate last month, an act that will go into effect in 2021.

Damage to coral reefs is a complicated issue, with multiple stressors likely to be involved: not only climate change and sunscreens, but also pollution and other harmful chemicals. While many are worried about protecting the reefs, there is also concern as to how these bans will affect human health. In response to the Hawaii ban, the Skin Cancer Foundation put out a statement which said, “by removing access to a significant number of products, this ban will give people another excuse to skip sun protection, putting them at greater risk for skin cancer.” 

One possible solution is to expand the number of ingredients permitted in sunscreen, to allow for other protective chemicals that are less harmful to the environment. The FDA has not expanded its list of approved ingredients in approximately 20 years. Europe, by comparison, allows more chemicals, on the theory that any single chemical will have a less harmful environmental impact when a greater diversity of ingredients is permitted. Toward this end, the FDA recently proposed new regulations to improve American sunscreens.

(Rebecca Beitsch, Washington Post)

In a first, U.S. private sector employs nearly as many Ph.D.s as schools do 

The career landscape for burgeoning PhDs has changed drastically in the last 20 years; while the number of PhDs awarded has increased, especially in the life and health sciences, the proportion of PhDs employed in tenured and tenure-track positions has declined. This contrasts with the assumption of some current faculty members that tenure-track positions are the standard path for PhDs and other career paths are “alternative.” According to the Survey of Doctorate Recipients from the US National Science Foundation (NSF), in 2017, for the first time, private-sector employment of PhDs (42%) was nearly equivalent to employment by educational institutions (43%). This is in stark contrast to 1997, when educational institutions employed 11% more PhDs than the private sector. While the survey takes into consideration all PhDs under the age of 76 who are employed full-time in the US, newer PhDs are expected to be less likely to secure tenure-track positions.

As career trajectories change, some universities are using new information about PhD outcomes to improve programming for current graduate and prospective students. According to the Coalition for Next Generation Life Science, ten academic institutions have released data online about the career outcomes of their PhD graduates, with more institutions planning to release similar data by the end of next year. The data indicate that the traditional model of training, which treats graduate school like an apprenticeship for becoming faculty, is outdated. Skills that transfer beyond educational institutions may be necessary to successfully train the next generation of PhDs.

(Katie Langin, Science)



Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

March 26, 2019 at 5:00 pm

Science Policy Around the Web – May 11, 2018


By: Mohor Sengupta, PhD


source: Max Pixel

Drug prices

Why Can’t Medicare Patients Use Drugmakers’ Discount Coupons?

With high drug prices, affordability of specialized medicines is a matter of concern for many individuals, especially those on life-saving brand-name drugs.

Manufacturers of brand-name medicines provide discount coupons to people with private health insurance. Such discounts are denied to people with federal healthcare plans such as Medicare or Medicaid. For example, for one patient on Repatha (a cholesterol-reducing drug), the co-payment is $618 per month under the Medicare drug plan but only $5 for patients with commercial insurance plans. This discrepancy amounts to a “double standard” because, arguably, the discount is denied to the people who need it most: the retired population compulsorily enrolled in federal healthcare programs.

Drug manufacturers have an incentive to offer discounts on branded medicines, as the discounts increase the likelihood of purchase and result in greater access to and demand for the products. While these coupons are immensely beneficial for life-threatening conditions for which generic drugs are not available, a 2013 analysis showed that a lower-cost generic alternative or FDA-approved therapeutic equivalent was available for 62% of 374 brand-name drugs.

The federal government has argued that, with the discount coupons, patients might overlook or be discouraged from buying cheaper variants of a brand-name drug. Even if a patient chooses a brand-name drug with a discount coupon over a cheaper alternative, their health insurance plan still has to pay for the drug, and that amount may be more than Medicare or Medicaid is willing to pay. This is the reasoning behind the federal anti-kickback statute, which prohibits drug manufacturers from providing “payment of remuneration (discounts) for any product or service for which payment may be made by a federal health care program”.

One important question is why drug makers sell brand-name drugs at a much higher price when generic, cheaper options are available. In the present scenario, insurance companies should judge whether they are willing to cover brand-name drugs for which generic alternatives exist. Doctors often prescribe brand-name drugs without considering their long-term affordability for patients. It is the responsibility of doctors and insurance providers alike to determine the best possible drug option for each patient.

Taking in both sides of the picture, discounts should be applied on a case-by-case basis. They should be enforced for specialized drugs that lack generic alternatives and that are usually used for severe or life-threatening conditions. Currently, for people with such conditions who are on federal healthcare plans, affordability is a major challenge.

(Michelle Andrews, NPR)

 

EPA standards

EPA’s ‘secret science’ rule could undermine agency’s ‘war on lead’

Last month the Environmental Protection Agency (EPA) administrator, Scott Pruitt, issued a “science transparency rule” under which studies that are not “publicly available in a manner sufficient for independent validation” cannot be used in crafting a regulation. This rule is at loggerheads with Pruitt’s “war on lead” because a majority of studies on lead toxicity are observational and old, and cannot be validated without consciously exposing study subjects to lead.

Lead is a potent neurotoxin with long-term effects on central nervous system development, and it is especially harmful to children. There are several studies showing lead toxicity, but many do not meet the inclusion standards set by the EPA’s new science transparency rule. Computer models developed to assess lead toxicity, which played an important role in the EPA’s past regulations on lead, have amalgamated all of these studies, including the ones that cannot be validated. If the science transparency rule is retroactive, it would mean trashing these models: an entire computer model can be rendered invalid if just one of its component studies fails the transparency criteria.

Critics say that the transparency measure will be counter-effective as far as lead regulations are concerned. “They could end up saying, ‘We don’t have to eliminate exposure because we don’t have evidence that lead is bad’”, says former EPA staffer Ronnie Levin. Another hurdle is the proposed data sharing requirement. Lead based studies tend to be epidemiological and authors might be unwilling to share confidential participant data.

Bruce Lanphear of Simon Fraser University in Canada is skeptical of the EPA’s intentions because the agency has not imposed similar transparency measures on chemical companies such as pesticide producers.

Finally, this rule could set different standards for lead safety levels at different federal agencies. Currently the Centers for Disease Control and Prevention (CDC) and the Department of Housing and Urban Development (HUD) use 5 micrograms of lead per deciliter of blood as the reference level. The EPA rule could lead to a new reference level, creating discrepancies for those complying with agencies across the U.S. government.

(Ariel Wittenberg, E&E News)

 

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

May 11, 2018 at 10:24 pm

Science Policy Around the Web – May 8, 2018


By: Saurav Seshadri, PhD


source: pixabay

Environment

EPA Cites ‘Replication Crisis’ in Justifying Open Science Proposal

The U.S. Environmental Protection Agency (EPA) may soon be using far less scientific evidence to inform its policy positions. EPA administrator Scott Pruitt recently announced that, in an effort to promote reproducibility and open access to information, the EPA will no longer consider studies whose underlying data or models are not publicly available. However, such studies often represent the ‘best available’ data, which the EPA is legally obliged to consider, and they form the basis of, among others, policies limiting particulate matter in the air. Several studies that support the health and economic benefits of lower particulate limits do so by using detailed medical information whose disclosure would compromise patient confidentiality. The so-called HONEST (Honest and Open New EPA Science Treatment) Act, put forth by House Republicans, aims to suppress such ‘secret science’; its detractors say that it is a poorly disguised gift to industry interests, conveniently timed to take effect just before a scheduled review of pollution limits.

Opposition to the policy has been building steadily. A letter signed by 63 House Democrats, asking for an extension to the open comment period for the policy, has so far been unsuccessful. A separate letter, signed by almost a thousand scientists, and comments from several professional associations have also been ignored – perhaps unsurprisingly, given Pruitt’s parallel effort to bar relevant scientists from EPA advisory boards. The scientist behind the article calling attention to the ‘reproducibility crisis’ cited by Pruitt has also spoken out, writing that simply ‘ignoring science that has not yet attained’ rigorous reproducibility standards would be ‘a nightmare’.

Perhaps the most effective response has come from scientists who are outpacing the bureaucracy.  In a pair of papers published last year, a biostatistics and public health group at Harvard used air quality data, Medicare records, and other public sources to reiterate the health risks posed by air pollution.  Such studies could not be excluded by the new EPA policy and may influence regulators to keep particulate limits low.  Another potential roadblock to implementing changes could be the controversy surrounding Pruitt himself.  The administrator has been the target of several federal probes, following a series of scandals regarding his use of government funds for purposes such as a 24-hour security detail, soundproof office, and first class travel.  Bipartisan calls for his resignation have made his future at the EPA, and the quick implementation of a Republican agenda there, uncertain.

(Mitch Ambrose, American Institute of Physics)

Science funding

NIH’s neuroscience institute will limit grants to well-funded labs

With a budget of $2.1 billion, the National Institute of Neurological Disorders and Stroke (NINDS) is the fifth largest institute at NIH.  Yet each year many investigators are constrained by a lack of funds, while some large labs have accumulated so many grants that their principal investigator can only spend a few weeks per year on a given project.  To address this disparity, NINDS recently announced a plan to revamp implementation of an existing NIH policy, in which grant applications from well-funded labs must go through an additional review by a special council. While the current secondary review rarely rejects such applications, NINDS’ policy takes two steps to make the process more stringent: first, it increases the number of labs that would undergo review, to include labs that would cross the $1 million threshold with the current grant; second, it sets higher standards for review, requiring applications from such labs to score in the top 7% of all proposals to be successful.
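The two-step rule can be summarized in a short sketch (a hypothetical illustration of the policy logic as described above, not NIH code; the function names and the convention that lower percentile numbers are better scores are assumptions):

```python
SUPPORT_THRESHOLD_USD = 1_000_000  # annual direct-cost threshold per the policy
TOP_PERCENTILE = 7.0               # applications must score in the top 7%

def needs_special_review(current_funding_usd: float, new_grant_usd: float) -> bool:
    """A lab that would cross the $1M threshold with the new grant is reviewed."""
    return current_funding_usd + new_grant_usd >= SUPPORT_THRESHOLD_USD

def passes_special_review(percentile_score: float) -> bool:
    """Reviewed applications succeed only with a top-7% score (lower is better)."""
    return percentile_score <= TOP_PERCENTILE
```

So a lab with $900,000 in funding applying for a $200,000 grant would trigger the review, while a lab staying under $1M total would not.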

Responses to the idea have been tentative, despite widespread support for its objective.  One potential cause for concern is its perceived similarity to the Grant Support Index (GSI), a previous NIH initiative with a similar goal (i.e., reallocating resources to sustain less-established but deserving researchers). The GSI sought to achieve this goal by placing a cap on the number of grants that a lab could receive, using a point system. However, this caused an uproar among scientists, who, among other issues, saw it as punishing or handicapping labs for being productive – it was quickly revised to create the Next Generation Researchers Initiative, a fund earmarked for early and mid-stage investigators, for which each institute is responsible for finding money.  The new policy appears to be a step towards meeting this obligation, and not, NINDS insists, a return to the GSI.

The impact of the new policy will probably be clearer after NINDS’ next round of grant reviews takes place, in January 2019.  So far, only the National Institute of General Medical Sciences (NIGMS) has a comparable policy, which has been in place since 2016.  The success of these approaches may well shape future cohorts of NIH-funded scientists – cutoffs and uncertainty are not unique to neuroscience, and other institutes are likely to be paying close attention.

(Jocelyn Kaiser, Science)

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

May 8, 2018 at 6:11 pm

Science Policy Around the Web – January 30, 2018


By: Kelly Tomins, BSc


By RedCoat (Own work) [CC-BY-SA-2.5], via Wikimedia Commons

Cloning

Yes, They’ve Cloned Monkeys in China. That Doesn’t Mean You’re Next.

Primates were cloned for the first time with the births of two monkeys, Zhong Zhong and Hua Hua, at the Chinese Academy of Sciences in Shanghai. Despite being born from two separate mothers weeks apart, the two monkeys share the exact same DNA. They were cloned from cells of a single fetus, using a method called Somatic Cell Nuclear Transfer (SCNT), the same method used to clone over 20 other animal species, beginning with the now-famous sheep, Dolly.

The recently published study has excited scientists around the world, demonstrating the potential expanded use of primates in biomedical research. The impact of cloned monkeys could be tremendous, providing scientists a model more like humans for understanding genetic disorders. Gene editing of the monkey embryos was also possible, indicating scientists could alter genes suspected to cause certain genetic disorders. These monkeys could then be used as a model to understand disease pathology and test innovative treatments, eliminating the differences that can arise from even the smallest natural genetic variation between individuals of the same species.

Despite the excitement over the first cloning of a primate, there is much work to be done before this technique could broadly impact research. The efficiency of the procedure was limited, with only 2 live births resulting from 149 early embryos created by the lab. In addition, the lab could only produce clones from fetal cells; it is still not possible to clone a primate from cells taken after birth. Moreover, the future of primate research is uncertain in the United States. Research on the sociality, intelligence, and DNA similarity of primates to humans has raised ethical concerns about their use in research. The US has banned the use of chimpanzees in research, and the NIH is currently in the process of retiring all of its chimps to sanctuaries. There are also concerns about the proper treatment of many primates in research studies: the FDA recently ended a nicotine study, and had to create a new council to oversee animal research, after four squirrel monkeys died under suspicious circumstances. With further optimization, it will be fascinating to see whether this primate cloning method will expand the otherwise waning use of primates in research in the United States.

The successful cloning of a primate has additionally heightened ethical concerns over the possibility of cloning humans. In addition to the many safety concerns, several bioethicists agree that human cloning would demean human identity and should not be attempted. Either way, Dr. Shoukhrat Mitalipov, director of the Center for Embryonic Cell and Gene Therapy at Oregon Health & Science University, stated that the methods used in this paper would likely not work on humans anyway.

(Gina Kolata, New York Times)

Air Pollution

EPA ends clean air policy opposed by fossil fuel interests

The EPA is ending the “once-in, always-in” policy, which regulated how emissions standards differ among various sources of hazardous pollutants. The policy concerns section 112 of the Clean Air Act, which governs the regulation of sources of hazardous air pollutants such as benzene, hexane, and DDE. “Major sources” are defined as those with the potential to emit 10 tons per year of a single pollutant or 25 tons per year of a combination of air pollutants; “area sources” are stationary sources of air pollutants that are not major sources. Under the policy, once a source was classified as a major source, it was permanently subject to stricter pollutant control standards, even if its emissions fell below the threshold. The policy was intended to ensure that reductions in emissions continue over time.
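The section 112 thresholds above lend themselves to a simple illustration (a hypothetical sketch of the classification logic only, not an official EPA tool; the function name and input format are assumptions):

```python
MAJOR_SINGLE_TONS = 10    # potential to emit, any single pollutant, tons/year
MAJOR_COMBINED_TONS = 25  # potential to emit, all pollutants combined, tons/year

def classify_source(emissions_tons_per_year: dict) -> str:
    """Classify a stationary source under the section 112 thresholds:
    'major source' if any single pollutant reaches 10 tons/year or the
    combined total reaches 25 tons/year; otherwise 'area source'."""
    single_max = max(emissions_tons_per_year.values())
    combined = sum(emissions_tons_per_year.values())
    if single_max >= MAJOR_SINGLE_TONS or combined >= MAJOR_COMBINED_TONS:
        return "major source"
    return "area source"
```

Under “once-in, always-in”, a source classified as major stayed major even if later emissions fell below these thresholds; the policy change lets such a source be reclassified as an area source.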

The change in policy means that major sources of pollution that dip below the emissions threshold can be reclassified as area sources, and thus held to lower air safety standards. Fossil fuel companies have petitioned for this change for years, and the recent reversal is being lauded by Republicans and by states with high gas and coal production. The EPA news release states that the outdated policy disincentivized companies from voluntarily reducing emissions, since they would be held to major-source standards regardless of the amount they emitted. Bill Wehrum, a former lawyer representing fossil fuel companies and current Assistant Administrator of EPA’s Office of Air and Radiation, stated that reversing the policy “will reduce regulatory burden for industries and the states”. In contrast, environmentalists believe the change will drastically increase the amount of pollution plants expel, because standards soften once emissions drop below the threshold: as long as sources remain just below the major-source threshold, there will be no incentive or regulation pushing them to lower pollutant emissions further.

(Michael Biesecker, Associated Press)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

January 30, 2018 at 3:30 pm

Science Policy Around the Web – December 8, 2017


By: Roger Mullins, Ph.D.


source: pixabay

Chemical Safety

Chlorpyrifos Makes California List of Most Dangerous Chemicals

Last Wednesday, the California Office of Environmental Health Hazard Assessment (OEHHA) passed a vote to add the organophosphorus pesticide Chlorpyrifos to Proposition 65, an extensive list of over 900 chemicals known to cause cancer, birth defects, or reproductive harm. While Chlorpyrifos was previously considered for inclusion on this list in 2008, updated scientific information gave the OEHHA cause for reassessment.

This new data included further information on the neurodevelopmental toxicity of Chlorpyrifos in humans and wildlife. Of particular concern to this board was its harmful effect on fetal brain development. Central to this decision was the extensive review of scientific evidence provided in the 2014 and 2016 EPA Human Health Risk Assessments, as well as new and additional findings not previously reviewed in these assessments.

On a national level, the findings of earlier EPA risk assessments resulted in a ban on homeowner use as far back as 2000. The 2014 and 2016 reports further cemented the evidence for pervasive neurodevelopmental toxicity and also highlighted the danger of dietary exposure from residues in drinking water and crops. An all-out ban on chlorpyrifos, revoking all pesticide tolerances and cancelling its registrations, was proposed in 2015, but this was ruled out by the current Environmental Protection Agency (EPA) in 2017. The pesticide remains under registration review by the EPA, which re-evaluates its decisions on a 15-year cycle.

Inclusion on California’s Proposition 65 list does not amount to a ban within the state, though products containing chlorpyrifos will have to be labeled as such starting in late 2018. This state-level action stands in contrast to federal decisions and offers a revealing lesson in the complexity of national responses to scientific evidence.

(Sammy Caiola, Capital Public Radio)

Gene Drives

US Military Agency Invests $100m in Genetic Extinction Technologies

Gene drives, an emerging and powerful gene-editing technology, have drawn considerable attention and controversy over their proposed use in disease vector control. The method involves releasing genetically modified animals into a wild population, with the aim of breeding in genes designed to reduce the species’ ability to spread disease. The introduced genes are preferentially inherited, resulting in their eventual dominance in the population. For example, a gene could be designed and introduced to provide resistance to a particular parasite or to reduce fertility. The technique is proposed for controlling mosquito-borne diseases such as malaria and the Zika virus, as well as for halting the spread of invasive species.
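The preferential-inheritance dynamic described above can be sketched with a toy population-genetics model (a deliberate simplification assuming random mating and no fitness cost; the function name and parameter values are illustrative, not taken from any gene drive study):

```python
def drive_frequency(generations: int, release_freq: float = 0.01,
                    conversion: float = 0.9) -> list[float]:
    """Toy model of super-Mendelian inheritance. In heterozygotes, the drive
    allele converts the wild-type copy with probability `conversion`, so it is
    transmitted at rate (1 + conversion) / 2 instead of the Mendelian 1/2."""
    freqs = [release_freq]
    p = release_freq
    for _ in range(generations):
        het = 2 * p * (1 - p)                   # heterozygotes (Hardy-Weinberg)
        p = p * p + het * (1 + conversion) / 2  # drive allele frequency next gen
        freqs.append(p)
    return freqs
```

With `conversion = 0` the update reduces to ordinary Mendelian inheritance and the frequency stays put; with high conversion, even a 1% release climbs toward fixation within a couple dozen generations, which is exactly why the irreversibility concerns below carry weight.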

Controversy over the technique, however, also hinges on its strengths. The primary concerns are the likelihood of modified animals crossing international borders, downstream effects on dependent species, and the possibility of irreversible harm to the ecosystem if the technique is misapplied. Appropriately, much of this concern comes from fellow scientists. In light of this, scientists and policy-makers alike have been proactive about addressing the safety and ethical issues, coming up with a set of specific guidelines to advance quality science for the common good. These entail efforts to promote stewardship, safety, and good governance; to demonstrate transparency and accountability; to engage thoughtfully with affected communities, stakeholders, and publics; and to foster opportunities to strengthen capacity and education. Consensus on these issues is intended to help the promising field move forward in the face of growing public scrutiny.

Recently, a trove of emails from US scientists working on gene drive technology was acquired under the Freedom of Information Act and disseminated to the media. Some of these emails revealed the Bill and Melinda Gates Foundation’s engagement with a public relations company to influence the UN moratorium on the use of this technology. The Foundation has long been a financial supporter of the Target Malaria research consortium, which seeks to develop gene drives for the eradication of malaria. The concern surrounding the release of these emails realizes a common fear of scientists whose research is likely to fall under the public eye: ironically, even attempts to recruit expertise in portraying one’s research favorably may be seen as damning.

This will inevitably be true of any powerful emerging technique. As science’s ability to address problems advances, there will be obstacles to implementing new technologies and to addressing concerns from the communities they may affect. Some of these concerns will be valid and cause for moratorium and introspection; some will be more attributable to sensationalism. Understanding and navigating these differences will be an ever-present concern for policy-minded scientists.

(Arthur Neslen, The Guardian)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

December 8, 2017 at 1:35 pm