Science Policy For All

Because science policy affects everyone.

Archive for January 2018

Science Policy Around the Web – January 30, 2018


By: Kelly Tomins, BSc


By RedCoat (Own work) [CC-BY-SA-2.5], via Wikimedia Commons


Yes, They’ve Cloned Monkeys in China. That Doesn’t Mean You’re Next.

Primates were cloned for the first time with the births of two monkeys, Zhong Zhong and Hua Hua, at the Chinese Academy of Sciences in Shanghai. Despite being born from two separate surrogate mothers weeks apart, the two monkeys share the exact same DNA. They were cloned from the cells of a single fetus using a method called somatic cell nuclear transfer (SCNT), the same method used to clone over 20 other animal species, beginning with the famous sheep, Dolly.

The recently published study has excited scientists around the world by demonstrating the potential for expanded use of primates in biomedical research. The impact of cloned monkeys could be tremendous, providing scientists with a model more like humans for understanding genetic disorders. Gene editing of the monkey embryos was also possible, indicating that scientists could alter genes suspected of causing certain genetic disorders. These monkeys could then be used as a model to understand disease pathology and test innovative treatments, eliminating the differences that can arise from even the smallest natural genetic variation between individuals of the same species.

Despite the excitement over the first cloning of a primate, there is much work to be done before this technique could broadly impact research. The efficiency of the procedure was limited, with only two live births resulting from the 149 early embryos created by the lab. In addition, the lab could only produce clones from fetal cells; it is still not possible to clone a primate from cells taken after birth. The future of primate research is also uncertain in the United States. Research on the sociality, intelligence, and DNA similarity of primates to humans has raised ethical concerns regarding their use in research. The US has banned the use of chimpanzees in research, and the NIH is currently in the process of retiring all of its chimps to sanctuaries. There are also concerns regarding the proper treatment of many primates in research studies. The FDA recently ended a nicotine study and had to create a new council to oversee animal research after four squirrel monkeys died under suspicious circumstances. With further optimization, it will be fascinating to see whether this primate cloning method expands the otherwise waning use of primates in research in the United States.

The successful cloning of a primate has also heightened ethical concerns over the possibility of cloning humans. Beyond the many safety concerns, several bioethicists agree that human cloning would demean a human's identity and should not be attempted. Either way, Dr. Shoukhrat Mitalipov, director of the Center for Embryonic Cell and Gene Therapy at Oregon Health & Science University, stated that the methods used in this paper would likely not work on humans anyway.

(Gina Kolata, New York Times)

Air Pollution

EPA ends clean air policy opposed by fossil fuel interests

The EPA is ending the "once in, always in" policy, which determined how emissions standards differ between various sources of hazardous pollutants. The policy stems from Section 112 of the Clean Air Act, which governs the regulation of sources of hazardous air pollutants such as benzene, hexane, and DDE. "Major sources" of pollutants are defined as those that have the potential to emit 10 tons per year of a single pollutant or 25 tons per year of a combination of air pollutants. "Area sources" are stationary sources of air pollutants that are not major sources. Under the policy, once a source was classified as a major source, it was permanently subject to the stricter pollutant control standards, even if its emissions later fell below the threshold. The policy was intended to ensure that reductions in emissions continue over time.
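The threshold rule above can be sketched in a few lines; the cutoffs are the Clean Air Act figures quoted above, while the function and variable names are purely illustrative:

```python
# Illustrative sketch of the Section 112 source-classification thresholds:
# 10 tons/year of any single hazardous air pollutant, or 25 tons/year of
# any combination, makes a "major source". Names here are hypothetical.

MAJOR_SINGLE_TPY = 10.0    # tons per year, any one pollutant
MAJOR_COMBINED_TPY = 25.0  # tons per year, all pollutants combined

def classify_source(emissions_tpy):
    """Classify a stationary source from its per-pollutant annual emissions."""
    if max(emissions_tpy.values()) >= MAJOR_SINGLE_TPY:
        return "major source"
    if sum(emissions_tpy.values()) >= MAJOR_COMBINED_TPY:
        return "major source"
    return "area source"

print(classify_source({"benzene": 12.0}))    # single pollutant over 10 tpy
print(classify_source({"benzene": 9.0, "hexane": 9.0, "toluene": 9.0}))  # combined 27 tpy
print(classify_source({"benzene": 2.0}))     # below both thresholds
```

Under "once in, always in", the major-source label, once earned, was permanent; the policy change means a source would effectively be re-run through a check like this whenever its emissions change.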

The change in policy means that major sources of pollution that dip below the emissions threshold can be reclassified as area sources, and thus be held to lower air safety standards. Fossil fuel companies have petitioned for this change for years, and the recent policy change is being lauded by Republicans and by states with high gas and coal production. The EPA news release states that the outdated policy disincentivized companies from voluntarily reducing emissions, since they would be held to major source standards regardless of the amount they emitted. Bill Wehrum, a former lawyer representing fossil fuel companies and current Assistant Administrator of the EPA's Office of Air and Radiation, stated that reversing this policy "will reduce regulatory burden for industries and the states". In contrast, environmentalists believe the change will drastically increase the amount of pollution plants expel, because standards soften once emissions drop below the threshold. As long as sources remain just below the major source threshold, there will be no incentive or regulation pushing them to lower pollutant emissions further.

(Michael Biesecker, Associated Press)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

January 30, 2018 at 3:30 pm

Science Policy Around the Web – January 26, 2018


By: Jennifer Patterson-West, Ph.D.


source: pixabay

Drug Pricing

At $850,000, price for new childhood blindness gene therapy four times too high, analysis says

Advances in medicine have greatly shifted the prognosis of many diseases. Although heart disease is still the leading cause of death in the United States, deaths due to cardiovascular disease have declined by over 50% since the 1950s. This sharp decline correlates with the introduction of the first beta blocker in 1964, which could effectively lower high blood pressure. Diseases that were once considered a death sentence, such as childhood leukemia, AIDS, and hepatitis C, are now treatable.

However, these advances come at a cost. In 2014, the Tufts Center for the Study of Drug Development (CSDD) released a report estimating the cost of developing a new drug at $2.5 billion. With the high cost of drug development, it is not surprising that drug prices are soaring in all sectors. Pharmaceutical price hikes have sparked significant controversy due to the financial stress they put on patients. Last year, Mylan Pharmaceuticals increased the cost of an EpiPen by 500%, fueling the debate over instituting government control of drug pricing in the U.S. The pricing of a new gene therapy for blindness at $850,000 has once again sparked this debate.

Advocates of price control point to Europe, where drug pricing is already bureaucratically controlled. In Europe and elsewhere, many drugs are available at a fraction of the U.S. cost. Prices in these countries have been significantly reduced by the implementation of external price referencing (EPR). In this system, drugs are categorized into classes based on therapeutic effect, and a reference price is set for each drug class. The system follows the logic that manufacturers are under considerable pressure to minimize cost when good alternatives exist.
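As a rough illustration of the referencing idea described above, a regulator might set the reimbursement ceiling for a drug class from prices observed in a basket of reference countries. The averaging rule, prices, and names below are all hypothetical; real EPR schemes variously use the average, the lowest, or a weighted mix of basket prices:

```python
# Toy sketch of external price referencing (EPR): the reimbursement
# ceiling for a therapeutic class is derived from prices in a basket
# of reference countries. All figures are invented for illustration.

def reference_price(basket_prices):
    """Set the ceiling as the mean of the basket-country prices."""
    return sum(basket_prices) / len(basket_prices)

# Per-unit prices for one hypothetical therapeutic class in three
# reference countries (arbitrary currency):
basket = [4.0, 5.0, 6.0]
print(reference_price(basket))  # 5.0
```

A manufacturer pricing above the class ceiling risks losing reimbursement to the cheaper alternatives in the same class, which is the pressure the paragraph above describes.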

Opponents of price control point to a widening gap in new-drug innovation between the United States and countries such as Japan and Germany, where drug price controls have been in place for decades. It has been proposed that price control inadvertently limits the funds pharmaceutical companies can direct to research and development. Pharmaceutical development carries high risk for investors; without the incentive of high profits, it is speculated that investors would move to other sectors of industry, such as high technology. Even so, investment in research and development by the global pharmaceutical industry is second only to that of the tech industry.

(Andrew Joseph, STAT News)


When A Tattoo Means Life Or Death. Literally

In the United States, an advance directive or living will is the standard way that individuals inform medical professionals of the treatments they would like to receive if they are dying or permanently unconscious. These forms can provide instructions regarding the use of machines to maintain life, feeding tubes, and intravenous fluids, "do not resuscitate" (DNR) orders, as well as organ and tissue donation preferences. A DNR order means that doctors will not intervene with CPR or advanced cardiac life support (ACLS) if a patient stops breathing or their heart stops. Nevertheless, under this advisement, patients will continue to receive palliative care to mitigate pain and other physical symptoms.

These legal documents give patients control over their health care when serious illness may impair their judgement or overwhelm their ability to make a decision. However, if a patient arrives at a hospital unconscious or alone, these directives may never reach the medical staff. For this reason, some individuals have taken to wearing their medical preferences on their body, predominantly in the form of a bracelet or necklace.

A recent case study reported an instance in which a 70-year-old man arrived at the University of Miami hospital unconscious and without an advance directive on file. The patient had a tattoo across his chest stating "do not resuscitate", accompanied by his signature. The tattoo gave doctors pause about how to proceed when the patient stopped breathing. Their initial reaction was to disregard the tattoo and provide the necessary care, since the alternative was an irreversible path and the legal standing of the tattoo's directive was uncertain. After reviewing the patient's case, Dr. Kenneth Goodman, an expert in medical ethics, advised the medical staff to honor the tattoo. Fortunately, this decision was subsequently supported by a DNR on file with the Florida Department of Health.

In contrast, a separate case study reported a patient with a DNR tattoo who specified that it did not reflect his current preference for end-of-life care. These two cases left doctors uncertain about how to respond in future cases and emphasized the need for a central database of advance directives that is accessible to medical professionals.

Some states, including New York, Oregon, Utah, West Virginia, and California, have established electronic databases for advance directives. Residents of other states are advised to file their living will with a private registry, such as the U.S. Living Will Registry. In all cases, the usefulness of these registries is limited by whether a hospital or doctor actually uses the service. With decentralized registries, it is inefficient for medical staff to search for a patient's living will, especially when a rapid response is necessary. For this reason, many hospitals rely on an internal system containing patient records. Standard practices for obtaining documentation or recognizing physical identification of DNR status need to be carefully worked out within the American Medical Association so that medical staff can rapidly respond to a patient's medical needs and wishes.

(Rebecca Hersher, NPR)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

January 26, 2018 at 5:23 pm

Science Policy Around the Web – January 19, 2018


By: Allison Dennis B.S.


source: pixabay

Emergency Unpreparedness

IV bag shortage has hospitals scrambling to treat flu

While other hospital activities put a predictable strain on medical supplies, the sudden onset of a particularly bad flu season has left hospitals strapped for a basic medical staple: IV bags. Intravenous (IV) therapy delivers liquids directly into the vein and is made possible by prepackaged sterile bags of fluids, typically saline (salt water), sometimes with added sugar, electrolytes, and vitamins, that match what naturally exists in the body. By matching the natural composition of blood, these fluids help the body rapidly recover from dehydration and can efficiently deliver drugs. Severe dehydration is a common side effect of flu, as one of the body's first lines of defense is to develop a fever, a process that expends a lot of water and oxygen. Additional symptoms may leave flu sufferers unwilling or unable to drink the water they need. For patients ill enough to seek treatment at the hospital, IV therapy is often required to rapidly rehydrate their bodies and can simultaneously be used to deliver antivirals.

IV bags have been continuously in short supply in the US since 2014. The shortage seems to stem from the complexity of safely manufacturing saline, a 10-day process that reportedly requires 29 steps, and from insatiable demand: an estimated 740 IV bags are used each minute in the US. Production of IV drugs and saline is more tightly regulated by the FDA than that of other drugs because they are injected directly into the bloodstream, where even the smallest contamination can result in a widespread blood infection.

In recent months, the shortage has been heightened by the coalescing of two closely monitored seasons, flu and hurricane. Half of the IV bags used in the US are manufactured in Puerto Rico, which was devastated by hurricane damage early this fall. IV bag producers are slowly returning to their pre-storm levels of production, but ongoing power outages are continuing to cause disruption. To try to alleviate this burden, the FDA has granted additional companies permission to begin manufacturing and selling the bags that are in short supply. To help hospitals struggling to meet the constant demand for IV bags, the FDA is temporarily permitting hospitals to import sterile saline from overseas.

In some cases, care providers are able to substitute pills for drugs usually administered intravenously. In others, providers may choose to administer drugs through an IV push, injecting them directly into the vein, a method that can be both painful and time consuming. But when it comes to treating the severe dehydration that can result when the body battles the flu, intravenous rehydration is often the only appropriate treatment.

(Linda Johnson, Associated Press)


After years of avoidance, Department of Energy joins quest to develop quantum computers

Quantum computing promises to revolutionize the way we solve complex problems through computation. While the hardware needed to make this a reality exists, software developers and thinkers are struggling to catch up. Conventional computers use bits, either 0 or 1, to create logic in a language the computer can understand. Quantum computers would expand this language to capture the ability of subatomic particles to exist in more than one state at a time. Instead of bits, these computers would use qubits, or quantum bits, allowing more information to be stored without using more energy.
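The bit-versus-qubit distinction can be made concrete with a toy calculation: an n-bit classical register occupies exactly one of its 2^n states at any moment, whereas describing the state of an n-qubit register takes 2^n complex amplitudes at once. The sketch below is purely illustrative and simulates nothing about real hardware:

```python
# Toy comparison of classical bits and qubits. An n-bit register can be
# in 2**n states, but only one at a time; an n-qubit register's state is
# a vector of 2**n complex amplitudes. Names are illustrative.
import math

def classical_states(n_bits):
    return 2 ** n_bits  # possible states, occupied one at a time

def qubit_amplitudes(n_qubits):
    return 2 ** n_qubits  # amplitudes needed to describe the joint state

# A single qubit in an equal superposition of |0> and |1>:
amp = 1 / math.sqrt(2)
state = [amp, amp]             # amplitudes for |0> and |1>
prob_0 = abs(state[0]) ** 2    # probability of measuring |0>

print(classical_states(3), qubit_amplitudes(3))
print(round(prob_0, 2))
```

The exponential growth of that amplitude vector is exactly why simulating quantum systems strains conventional computers, and why a machine that manipulates the amplitudes natively is attractive.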

But to think of quantum computing as just a more powerful version of conventional computing is off base. The types of problems these computers will solve will be fundamentally different. Using the properties of quantum interference, computer scientists hope to develop algorithms in which incorrect solutions or redundant information cancel each other out. These properties would allow quantum computers to perform incredibly complicated calculations while still delivering an interpretable result. Such computers may prove an asset for modeling quantum processes themselves, a task conventional computers struggle with. On the to-do list are calculating molecular energies, modeling catalysis by enzymes, and designing novel materials at the atomic level.
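The cancellation idea can be shown with a contrived calculation: outcomes in a quantum computation accumulate amplitudes rather than probabilities, so computational paths with opposite signs destructively interfere while aligned paths reinforce. The numbers below are invented for illustration; real algorithms engineer such cancellations through sequences of unitary operations:

```python
# Toy illustration of quantum interference. Each outcome's probability
# is the squared magnitude of the SUM of the amplitudes of all paths
# leading to it, so opposite-sign paths cancel.

paths_to_wrong_answer = [0.5, -0.5]  # opposite signs: destructive interference
paths_to_right_answer = [0.5, 0.5]   # same sign: constructive interference

p_wrong = abs(sum(paths_to_wrong_answer)) ** 2
p_right = abs(sum(paths_to_right_answer)) ** 2

print(p_wrong, p_right)  # 0.0 1.0
```

A well-designed quantum algorithm arranges the computation so that, as here, essentially all probability ends up on the correct answer, which is what makes the final measurement interpretable.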

Over time, programming languages evolved to let developers write code without constantly needing to know how computers would physically implement it. However, learning how to use quantum hardware to perform what will be new types of computation is forcing physicists, computer scientists, and researchers to start from the beginning again. To foster collaboration, the Department of Energy has set up quantum computing testbeds, places where hardware designers and scientists can work together to shape the computational revolution to come.

(Adrian Cho, Science)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

January 19, 2018 at 7:11 pm

FDA stem cell therapy crackdown: a stem-free clinic


By: Belinda Hauser, Ph.D.


source: pixabay

Stem cells are the building blocks of life: they do not kill or cure anything, but they promote regeneration. A stem cell is classically defined as an undifferentiated cell capable of giving rise to more stem cells or of differentiating into other cell types. Stem cells have given scientists insight into how cells function and dysfunction during development. Moreover, stem cell research has led to promising treatment possibilities; it is believed that stem cells have the potential to repair or replace damage caused by age, injury, or disease. However, stem cell therapies have been controversial, first over the practice of isolating and culturing stem cells derived from human embryos, and later over induced pluripotent stem cells generated from previously differentiated cell types. This controversy is entrenched in both political and ethical debates, broadly affecting the regulation of cord blood harvesting, human cloning, and clinical trials.

Today, common stem cell therapies include blood and bone marrow transplants. The Food and Drug Administration (FDA) has approved only hematopoietic progenitor cells, derived from umbilical cord blood, for use in the United States. Harvesting of cord blood is considered safe for mother and baby, since the blood is collected after birth. Stem cells collected from the blood of the cut cord are used to treat a variety of diseases, including blood cancers such as leukemias and lymphomas, and blood diseases of the immune system. Given the scarcity of approved options, patients desperately seeking therapy may turn to treatments that are illegal and potentially harmful. The FDA goes to great lengths to evaluate the potential risks of new and current products through both animal and human studies in order to ensure the safety of biological products: to determine the effectiveness and safety of a new investigative product, well-controlled human studies must be designed and executed. This attention is applied to all clinical trials and is well documented. For example, the federal government requires all clinical trials to be registered, and it is standard protocol for the National Institutes of Health (NIH) to list all clinical trials being conducted via ClinicalTrials.gov. This promotes awareness and gives consumers an opportunity to be well informed about all trials being conducted.

Underlying the FDA's goal of developing and licensing stem cell therapies for patients and preventing consumer exploitation is its concern for consumer safety and education. In March 2017, the FDA provided materials to clarify the benefits and risks of stem cell therapies. It warned that unproven stem cell treatments, when injected, present risks of migration of the implanted cells (i.e., metastasis), excessive proliferation (i.e., tumor growth), contamination, stem cell failure, and reactions at the injection site. Therefore, new investigative products must go through a rigorous protocol to determine their effectiveness and safety in well-controlled human studies.

In August 2017, the FDA cracked down on unscrupulous stem cell clinics, announcing increased enforcement of regulations and oversight of stem cell clinics across the country. For example, the FDA seized five vials of live smallpox virus vaccine from the California Stem Cell Treatment Centers in Rancho Mirage and Beverly Hills, California. A Florida clinic, U.S. Stem Cell Clinic of Sunrise, Florida, caught the attention of the FDA after stem cell treatments it delivered to women with macular degeneration, an eye disease, caused permanent damage. Staff members used stem cells from fat isolated from each patient's stomach and then injected the cells into their eyes. A common practice in clinical trials is to pay human subjects to participate in studies; here, by contrast, patients were required to pay $5,000 to receive the unproven stem cell injections. Permitting patients to pay for participation is a topic of ethical debate even for the most scrupulously designed trials. The FDA issued a warning letter to U.S. Stem Cell Clinic for marketing products without FDA approval and condemned its exploitation of consumers. An inspection by FDA investigators found evidence of significant deviations from good manufacturing practices in at least 256 lots of stem cell products produced by the clinic, which also attempted to impede the investigation by refusing FDA investigators access to its employees. Ultimately, the clinic was cited for failure to establish appropriate written procedures to prevent contamination, risking infection of human subjects. U.S. Stem Cell Clinic is required to comply with and correct the failures stated in the warning letter; if it fails to address the outlined issues, the FDA may take actions including seizure, injunction, and/or prosecution. Moreover, U.S. Stem Cell Clinic administered the product both intravenously and directly into the spinal cord of patients in hopes of treating a number of serious diseases (Parkinson's disease, amyotrophic lateral sclerosis (ALS), heart disease, pulmonary fibrosis, and chronic obstructive pulmonary disease (COPD)), all without FDA review or approval. In fact, the FDA has not approved any biological products manufactured by U.S. Stem Cell Clinic for any use.

Overall, the challenge of regulation and compliance continues to loom over all stem cell clinics in the U.S.; however, the FDA is dedicated to continuous enforcement while educating and protecting U.S. consumers. Stem cells are the building blocks of life: manipulated properly, they have the potential to treat disease without posing unacceptable risk. Safely figuring out how will take time.

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

January 17, 2018 at 11:43 am

Science Policy Around the Web – January 12, 2018


By: Maryam Zaringhalam, Ph.D.


source: pixabay

Emergency Preparedness/ Public Health

CDC plans session on ‘preparing for the unthinkable’: a nuclear detonation

On January 4, the Centers for Disease Control and Prevention (CDC) announced a grand rounds — or training session — entitled "Public Health Response to a Nuclear Detonation." The event's web page notes: "While a nuclear detonation is unlikely, it would have devastating results and there would be limited time to take critical protection steps. Despite the fear surrounding such an event, planning and preparation can lessen deaths and illness." While the announcement followed a Twitter exchange between North Korean leader Kim Jong Un and President Donald Trump, CDC officials maintain that planning for the grand rounds had been underway for months. The exchange also triggered an uptick in sales of potassium iodide, a drug that can be taken to prevent the absorption of radioactive iodine by the thyroid. The CDC last offered a training focused on radiological and nuclear disaster preparedness in 2010, so this session will be an opportunity to share what public health programs at the federal, state, and local levels have done to prepare in the intervening years. The session will be webcast live on January 16 and is geared towards research and public health professionals, but can also be viewed by interested members of the public.

(Helen Branswell, STAT News)

Public Health

Hospitals In States That Expanded Medicaid Less Likely To Close

The Affordable Care Act included a provision for expanding Medicaid programs to cover all people with household incomes up to 138 percent of the federal poverty level. In 2012, the Supreme Court left individual states to decide whether or not they would opt into Medicaid expansion. For the 31 states and the District of Columbia that chose to expand Medicaid, the federal government pledged to cover 100 percent of the cost for newly eligible enrollees in the first few years, with a provision that the share would eventually decrease to 90 percent. According to a new study, hospitals in Medicaid-expansion states were six times less likely to close than hospitals in the 19 states that did not expand.
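The expansion provision boils down to a simple income test. In the sketch below, the federal poverty level figure is the commonly cited 2017 guideline for a one-person household in the contiguous states, used only as an illustrative placeholder; the actual FPL varies by year, household size, and state:

```python
# Illustrative check of the Medicaid-expansion eligibility rule described
# above: household income up to 138% of the federal poverty level (FPL).
# The FPL value below is a placeholder for a one-person household.

FPL_ONE_PERSON = 12_060  # 2017 guideline, 48 contiguous states (illustrative)

def expansion_eligible(annual_income, fpl):
    """True if income falls at or below 138% of the applicable FPL."""
    return annual_income <= 1.38 * fpl

print(expansion_eligible(16_000, FPL_ONE_PERSON))  # under the 138% line
print(expansion_eligible(20_000, FPL_ONE_PERSON))  # over the line
```

It is this expanded eligible population, newly insured and largely federally funded, that the study credits with reducing uncompensated hospital care in expansion states.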

Researchers at the University of Colorado tracked hospital closures and financial performance in the period between 2008 (four years before the ACA went into effect) and 2016. They attributed the favorable performance of hospitals in expansion states to the increased population of lower income people with insurance. As a result, Medicaid payments to hospitals increased and less hospital care went uncompensated because an uninsured patient could not afford to pay their bills.

The effect was most pronounced in rural hospitals, which often struggle to stay open. Hospital closure in rural communities can have large economic consequences. Mark Holmes, director of the Rural Health Research Program at UNC, told STAT: “Hospitals are usually the largest, or the second-largest, employer in a community… Losing an employer of 150 people with good jobs is like losing a manufacturing plant.”

While ACA repeal failed last year, Republican members of Congress are still pressing to roll back Medicaid expansion. Some lawmakers have suggested a block grant system, which would cap state spending on Medicaid at a set cost and allow states to spend the money however they would like. On Thursday, January 11, 2018, the Center for Medicare & Medicaid Services (CMS) issued a letter to state Medicaid directors that opens the door for states to require adults to be actively employed as a condition for coverage. The findings of this report, published Monday in the journal Health Affairs, therefore remain salient in the ongoing debate around health care.

(John Daley, NPR)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

January 12, 2018 at 5:39 pm

Science Policy Around the Web – January 5, 2018


By: Emily Petrus, PhD


source: pixabay


Nature’s 10: Ten people who mattered this year

As we kick off a new year of discovery amidst an unpredictable social and political landscape, it helps to reflect on those who made big changes in science in 2017. Nature put together a list of the top 10 "people who mattered" this year, which runs the gamut from lawyers and patients to, of course, scientists.

The academic scientists on the list hailed from all corners of the globe and a variety of fields:

  • US geneticist David Liu (Broad Institute and Harvard University, Cambridge, MA, USA) was recognized for pioneering work in gene editing. CRISPR-Cas9 is one of the hottest ways to accurately and easily edit genes, and Liu's team created new enzymes that can rewrite the genetic code, converting A-T base pairs to G-C pairs; a team in China used this technique to correct a blood disorder in human embryos.
  • Marica Branchesi (Gran Sasso Science Institute, L'Aquila, Italy) played a pivotal role in coordinating the astronomers and physicists engaged in gravitational-wave research. On August 17, 2017, 70 teams of scientists from all over the world shared their equipment and data to watch two neutron stars collide in a galaxy far away. Without Branchesi's efforts this important event would have been inadequately monitored, leaving some questions unanswered.
  • Pan Jianwei (University of Science and Technology of China, Hefei, China) is a physicist developing quantum teleportation, which can be used to create encryption keys. Pan's group beamed these keys from a satellite to Beijing and Vienna, enabling groups to videochat with complete security – the photons become distorted if hackers try to intercept the signal. This technology lays the groundwork for a worldwide quantum internet.
  • Victor Cruz-Atienza (National Autonomous University of Mexico, Mexico City, Mexico) studies Earth's seismic activity. In 2016 he published a paper simulating how different soil structures are affected by earthquakes, using Mexico City's ancient lake basin as an example. His calculations were validated after the 7.1-magnitude earthquake in September 2017. Cruz-Atienza's goal is to raise awareness of earthquake threats and help countries prepare for them before they hit.

Other members on the list brought unique skills to the table to help scientists continue their work.

  • Khaled Toukan (Chairman of the Jordan Atomic Energy Commission, Acting Director of the Synchrotron-light for Experimental Science and Applications in the Middle East) paved the way for physicists to obtain and share valuable equipment in a turbulent region of the world. His skilled diplomatic interactions steered the project to completion through 20 years of funding upsets and political upheaval.
  • Lassina Zerbo (Comprehensive Nuclear-Test-Ban Treaty Organization, Vienna, Austria) is dedicated to reducing nuclear conflict. 2017 was rife with nuclear threats, as hostile barbs were routinely traded between the US president and the North Korean leader. Zerbo coordinates a worldwide monitoring system that collects data on Earth's hydroacoustic, infrasound, seismic, and radionuclide activity. This is helpful for monitoring who is conducting nuclear tests, but also for tsunami detection and studying whale migration.
  • Jennifer Byrne (Children's Hospital at Westmead, Sydney, Australia) is a cancer geneticist and flawed-paper detective. Scientists must publish frequently, and the number of dubious scam journals has increased in recent years. Both factors contribute to flawed and fraudulent literature, which muddies the waters in a field built on trust that what is published is true. Byrne and a computer scientist have developed software, Seek & Blastn, that journals could use to detect misconduct prior to publication.

Rounding out the list is Emily Whitehead, the inspirational poster child for novel cancer therapies, who was recognized for her role in getting CAR-T cell therapy approved by the FDA. The dubious distinction of finding creative ways to dismantle the EPA from the inside went to Scott Pruitt.

(Heidi Ledford, Davide Castelvecchi, Elie Dolgin, Sara Reardon, Elizabeth Gibney, Nicky Phillips, Alexandra Witze, Nature)


U.S. lifts research moratorium on enhancing germs’ danger / NIH lifts 3-year ban on funding risky virus studies

The US is a great place to do research for many scientists, and the outlook is even brighter in 2018 for a select group of virus researchers. Studying how viruses work is an important undertaking: it lets us prepare for pandemic outbreaks, develop vaccines, and sometimes use viruses to deliver DNA for gene therapy. However, in 2011 researchers in the Netherlands and at the University of Wisconsin–Madison described studies in which they made the H5N1 bird flu easier to transmit between ferrets. This type of study is called "gain of function" and typically involves making viruses more deadly or more transmissible. If this sounds like the zombie apocalypse to you, it did to officials at the Department of Health and Human Services (HHS) too; they paused funding for this type of research in 2014. On December 19, 2017 the pause was lifted after a lengthy process of putting new policies in place.

The pause allowed the National Science Advisory Board for Biosecurity and HHS to craft clear new rules and regulations that all grant proposals must pass before work involving enhanced potential pandemic pathogens (i.e., deadly viruses) is permitted. Grants that make it through the peer review process then undergo a secondary review by a panel that determines whether the project's benefits outweigh its risks and makes recommendations for funding and/or requests modifications. In addition, such dangerous research will only be permitted in facilities properly equipped to handle the biosafety concerns.

Biomedical research moves at a fast pace, so most proposals submitted before the freeze are now obsolete, requiring researchers to submit fresh proposals following the new guidelines. This may sound tedious, and most researchers may not relish new hoops to jump through; however, nobody wants a deadly pathogen released due to limited oversight. Even the CDC managed to send live anthrax, bird flu, and botulinum toxin (which causes botulism) to other labs five times over the course of a decade. The pause in funding came at the request of epidemiologists and other scientists who felt there weren't enough regulations, from either a safety or an ethical standpoint, to support funding gain-of-function viral research. If researchers can show that their projects pose limited risk and will produce valuable public health knowledge, their research can resume once more.

(Jocelyn Kaiser, Science) (Lenny Bernstein, The Washington Post)


Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

January 12, 2018 at 1:27 pm