Science Policy For All

Because science policy affects everyone.

Archive for November 2018

Science Policy Around the Web – November 30, 2018


By: Allison Cross, PhD


Source: Maxpixel

 

Fire, drought, flood: Climate challenges laid bare in US government report

On Black Friday, amid news of deadly wildfires in California, the federal government released the Fourth National Climate Assessment (NCA4). This report, detailing global warming and climate change, is mandated every four years under the Global Change Research Act of 1990. It was released by the U.S. Global Change Research Program, which includes scientists and policy experts from 13 federal agencies. The report affirms that the effects of climate change are already being felt across the country, and that current global efforts to combat it, as well as regional efforts to adapt to it, fall short of the levels needed to avert substantial damage to the US economy, environment, and public health.

The NCA4 describes the latest climate-change science and examines how global warming is likely to differentially affect regions across the country and the economy. The report explains how higher temperatures and drier conditions will result in more large fires across the west coast. The southwest and midwest can expect persistent droughts to continue, while the east coast will suffer from increased flood risks. These conditions will disrupt agricultural productivity. The US National Oceanic and Atmospheric Administration estimated that the 2017 economic loss in the US from major storms, floods and droughts was $290 billion, and the NCA4 warns that storms are expected to become even more powerful as global warming continues.

The report states that if current trends in global greenhouse-gas emissions continue, some US economic sectors can expect to experience annual losses of hundreds of billions of dollars by the end of the century.

Though the report was intended to inform policy-makers, it makes no specific policy recommendations to address the issues it outlines. With President Trump having withdrawn the US from the Paris climate agreement after taking office, and having repeatedly blamed the deadly California wildfires on poor forest management, many scientists are concerned that the government will not take action to address the report’s grave findings.

This report from the U.S. Global Change Research Program comes after, and is in agreement with, a report released in October by the Intergovernmental Panel on Climate Change (IPCC) stating that major and costly changes would be needed to avert the disastrous effects of climate change.

 

(Jeff Tollefson, Nature Briefing)

 

FDA plans overhaul of decades-old medical device system

Just 24 hours after a global investigation into medical device safety was published, the Food and Drug Administration announced that it will be overhauling the medical device approval process.  The FDA says the changes were planned before the news stories broke, pointing to the Medical Device Safety Action Plan: Protecting Patients, Promoting Public Health it issued back in April. The investigation that made headlines was led by the International Consortium of Investigative Journalists and found that, over a 10-year period, the FDA received reports of more than 1.7 million injuries and close to 83,000 deaths suspected of being linked to medical devices.

Under current regulations, most medical devices can undergo an expedited approval process if the manufacturer can show the device is similar to one already on the market.  This approval process, implemented by the FDA in 1976, is known as the 510(k) clearance process.  It means that extensive clinical safety and efficacy testing is required for only a handful of new devices. It has been reported that under this clearance process almost 20% of products are approved based on similarity to devices that are more than 10 years old.  Critics of the expedited clearance process point out that it has allowed defective devices to be cleared, including hip replacements that failed prematurely and surgical mesh linked to pain and bleeding.

The FDA has said that under the modernized 510(k) clearance process, medical devices that come to market “should either account for advances in technology or demonstrate that they meet more modern safety and performance criteria.”  The proposed changes include pushing companies to compare their devices to more up-to-date technology.  The FDA also plans to pursue actions that would allow it to retire outdated base products when safer, more effective technology emerges.

The FDA has set a deadline of early 2019 to finalize its guidance on establishing an alternative accelerated pathway for medical device approval, but the reforms being proposed may take years to implement.

 

(Matthew Perrone, Associated Press, Stat News)

 

 

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

November 30, 2018 at 3:46 pm

Publications and Patents: Laying the foundation for future innovation and economic growth


By: Xavier Bofill de Ros, Ph.D.


Source: Pixabay

 

Many hours at the laboratory bench made me wonder: What is the real impact of our science? How do the thousands of publications appearing in scientific journals every month, and the funds poured into research, benefit our society? We all know the history: Fleming’s research on mold resulted in the discovery of penicillin, which has saved millions of lives since, and GPS systems rely heavily on basic trigonometry. These examples embody the power of science as a driver of technological progress and motivate public policies to support scientific research. For example, NIH receives $37 billion annually to fund intramural and extramural biomedical research[1]. Some of this investment generates intellectual property, returning private money to the system through license agreements. For instance, the NIH Technology Transfer Office had an income of $138 million from royalties in 2015[2]. However, many critics are quick to point out that basic research rarely pays off in practical R&D.

To understand where we are, we need to know where we come from. A big part of the current legislation governing the intellectual property derived from publicly-funded research stems from the Patent and Trademark Law Amendments Act, also known as the Bayh–Dole Act, passed in 1980. This act established that universities, small businesses and non-profit institutions are entitled to ownership of inventions made with federally-funded research, in preference to the government. Prior to the act, the government had accumulated ownership of large numbers of patents derived from the $75 billion per year of funding disbursed through different agencies; however, fewer than 5% of those patents were licensed[3]. In exchange for this new source of revenue, institutions receiving public money are required, among other obligations, to educate the research community about patenting procedures and to protect the government’s interests in funded inventions. Despite criticism that it forces consumers to “pay twice” for patented products, the economic impact of the Bayh-Dole Act has been substantial. Recent reports suggest that academic licenses to industry contributed between $148 billion and $591 billion to US gross domestic product (GDP)[4].

Besides economic performance, another approach to assessing the impact of scientific publications on intellectual property comes from bibliometric analysis of the prior art cited in issued patents. A recent study from the Kellogg School of Management analyzed the content of 4.8 million patents and 32 million research articles to find out how research is connected to inventions[5]. By analyzing the prior art references of patents, and the references of those references, the authors found that 80% of research articles could be linked to a future patent. This connection is often indirect: direct citations of research articles in patents account for only about 10%, but the figure quickly accumulates to 42% and 74% when second-degree and third-degree citations are included. This indicates that the vast majority of the publication corpus ends up in the pool of knowledge from which inventions arise. The analysis of the distance between research articles and patents also revealed differences between fields of research. Areas such as “Computer science”, “Nanotechnology” and “Biochemistry and Molecular Biology” show a more immediate impact on patents compared to less readily applicable fields. The authors also went on to ask which institutions yield research articles with the greatest impact on patents, comparing publications from universities, government laboratories and publicly traded firms. Consistent with previous studies, firms’ scientific production is the most directly linked to patent production. However, university and government publications follow at a very close distance, despite generally pursuing longer-term research goals.
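To make the “degrees of citation” idea concrete, here is a minimal Python sketch; it is a toy illustration rather than the study’s actual method, and the citation graph and document names are invented. It shows how the share of papers linked to patents grows as second- and third-degree references are counted.

```python
from collections import deque

# Toy citation graph: each key cites the documents in its list.
# All identifiers are hypothetical, for illustration only.
citations = {
    "patent_A": ["paper_1", "paper_2"],
    "paper_1": ["paper_3"],
    "paper_2": ["paper_3", "paper_4"],
    "paper_3": ["paper_5"],
    "paper_4": [],
    "paper_5": [],
}

def citation_distances(graph, sources):
    """Breadth-first search from patents through reference lists.

    Returns the minimum citation 'degree' at which each document is
    reached: 1 = cited directly by a patent, 2 = cited by a reference
    of a patent, and so on.
    """
    dist = {s: 0 for s in sources}
    queue = deque(sources)
    while queue:
        node = queue.popleft()
        for ref in graph.get(node, []):
            if ref not in dist:          # first (shortest) path wins
                dist[ref] = dist[node] + 1
                queue.append(ref)
    return dist

dist = citation_distances(citations, ["patent_A"])
papers = [n for n in citations if n.startswith("paper")]
for k in (1, 2, 3):
    linked = sum(1 for p in papers if 0 < dist.get(p, float("inf")) <= k)
    print(f"within {k} degree(s): {linked / len(papers):.0%} of papers linked")
```

In this toy graph the linked share climbs from 40% at one degree to 100% at three degrees, mirroring (in miniature) the 10%, 42%, 74% pattern reported in the study.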

Other, less tangible contributions from academic research take place through open access to data, reagents and knowledge[6]. Examples include The Cancer Genome Atlas (TCGA), with genomic data from more than 11,000 patients; the Jackson Laboratory (JAX) collection and distribution of mouse strains modeling human diseases; and the Addgene repository, with a collection of more than 67,000 plasmids. Similarly, collaboration agreements like CRADAs (Cooperative Research and Development Agreements) allow industry to partner with academic labs[7]. Under such agreements, which can last years, researchers from academic labs and companies can engage in joint ventures, providing each other with resources, skills and funds. In these partnerships the ownership of any resulting intellectual property is negotiated up front, as are first-option rights for licensing. Such collaboration formulas have a positive impact on the market readiness of the technologies developed, and can directly shorten the path to market through the same industrial partner. There are also specific agreements that allow for joint clinical trials, particularly for rare diseases, or for the transfer of research materials.

Overall, this illustrates that public investment can generate innovation and economic growth through the right policy measures. Contrary to the belief that technological and scientific advances move independently, there is a well-connected flow of ideas between patented inventions and scientific articles. There are already good incentives for the research community to facilitate collaboration between academia and industry. However, there is still room for novel policies to further leverage what can be achieved through public investment in research.

[1]https://www.nih.gov/about-nih/what-we-do/budget

[2]https://www.ott.nih.gov/sites/default/files/documents/pdfs/AR2016.pdf

[3]GAO/RCED-98-126 Transferring Federal Technology. Page 3.

[4]The Economic Contribution of University/Nonprofit Inventions in the United States: 1996-2015. Biotechnology Innovation Organization and the Association of University Technology Managers

[5]Ahmadpoor M, Jones BF. “The dual frontier: Patented inventions and prior scientific advance”. Science. 2017 Aug 11;357(6351):583-587.

[6]Bubela T, FitzGerald GA, Gold ER. Recalibrating intellectual property rights to enhance translational research collaborations. Sci Transl Med. 2012 Feb 22;4(122).

[7]Ben-Menachem G, Ferguson SM, Balakrishnan K. Beyond Patents and Royalties: Perception and Reality of Doing Business with the NIH. J Biolaw Bus. 2006 Jan 1;24(1):17-20.

 

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

November 28, 2018 at 10:41 am

Science Policy Around the Web – November 27, 2018


By: Allison Dennis, B.S.


Source: Pixabay

 

California’s Wildfires Could Mean A Generation Of Lung Problems

The acute dangers of uncontrolled wildfires are undeniable, yet the chronic dangers remain poorly understood. Changes in air quality due to wildfire smoke may have long-term and widespread health effects that researchers are only beginning to decipher. In 2018, nature provided near-perfect conditions for a much-needed experiment. As the Mendocino Complex Fire raged, its smoke drifted over 200 miles and blanketed, for 10 days, the living space of an outdoor colony of primates bred for research. Five hundred infant rhesus macaques, a commonly used model of human disease, were exposed, allowing respiratory immunologist Lisa Miller to begin an experiment looking for long-term respiratory damage in a pediatric population. Her previous studies of a smaller group of monkeys exposed in 2008 revealed that monkeys born in wildfire conditions grew up to have reduced lung capacity and compromised immune systems. Ten years after that fire, monkeys who were infants at the time have a high incidence of idiopathic pulmonary fibrosis, a fatal human disease associated with environmental pollutants and cigarette smoking. By carefully recreating her impromptu 2008 experiment, Miller is hoping to gain deeper insights into the damage that can occur in developing lung tissue, insights that could lead to possible interventions.

Miller’s research already suggests that a brief exposure to smoke early in life can have a lifetime of consequences. Smoke inhalation is far more widespread than the immediate dangers of fire, and will need to be incorporated into disaster preparedness plans. Financial assistance may be needed to help families temporarily relocate after fires, not only because homes have burned but also because homes have been blanketed in smoke. Currently, websites like airnow.gov provide up-to-date measures of air quality and can be used to decide when to limit children’s time outdoors. Parents with other options may need to weigh the potential risks of raising a family in places where wildfires are an annual occurrence.
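As a concrete illustration of turning such an air-quality reading into a household decision, the short Python sketch below maps an AQI value to the standard US EPA category. The outdoor-activity advice strings are illustrative assumptions, not official public health guidance, and fetching the live reading from airnow.gov is left out.

```python
# Decision-rule sketch: map an EPA Air Quality Index (AQI) value to a
# category and a suggested action. Cut-offs follow the standard EPA AQI
# breakpoints; the advice strings are illustrative only.
AQI_CATEGORIES = [
    (50, "Good", "normal outdoor activity"),
    (100, "Moderate", "unusually sensitive children may shorten outdoor play"),
    (150, "Unhealthy for Sensitive Groups", "limit prolonged outdoor exertion for children"),
    (200, "Unhealthy", "keep outdoor activity short and light"),
    (300, "Very Unhealthy", "move activities indoors"),
    (float("inf"), "Hazardous", "keep children indoors"),
]

def outdoor_advice(aqi: float) -> str:
    """Return the AQI category and an illustrative activity suggestion."""
    if aqi < 0:
        raise ValueError("AQI cannot be negative")
    for upper_bound, category, advice in AQI_CATEGORIES:
        if aqi <= upper_bound:
            return f"AQI {aqi:.0f} ({category}): {advice}"

print(outdoor_advice(42))   # AQI 42 (Good): normal outdoor activity
print(outdoor_advice(168))  # AQI 168 (Unhealthy): keep outdoor activity short and light
```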

(Maggie Koerth-Baker, FiveThirtyEight)

 

Genome-edited baby claim provokes international outcry

 

The announcement of the first genome-edited babies has shocked ethicists, scientists, and spectators around the world. He Jiankui of the Southern University of Science and Technology of China in Shenzhen claims to have used CRISPR-Cas9 to alter the embryonic DNA of twin girls born in November 2018, disabling CCR5, a gene encoding a protein known to provide HIV access to human cells. The study has not yet been submitted for publication and will likely undergo extensive peer review for verification. He recruited couples looking to conceive in which the male partner had HIV. The risk of HIV transmission from father to offspring is very low, and washing sperm free of seminal fluid before fertilization is commonly used to further mitigate that small risk. However, He thought these would-be parents would especially value the benefit of conferring on their child lifetime protection against HIV through altered genetics.

Ever since CRISPR-Cas9 was first demonstrated as a gene-editing technique nearly ten years ago, ethicists have debated its potential application to alter human DNA. Because reproductive tissues develop from the edited zygotic cells, it is likely that the twins will pass these changes on to their offspring, along with any other possible off-target changes to the genome. While lacking the CCR5 gene has been shown to confer protection against HIV infection, it may also increase the risk of other viral infections. Many still feel that not enough is known about the long-term and generational effects of altering a person’s germ line with these proteins to justify the risks they could pose over an unborn person’s lifespan. To what extent the institutions that facilitated the experiment were consulted or informed of the true nature of the research remains unclear. The trial was only registered on November 8th, well after the work had begun. In the race to be first, however, He appears to accept that any harm to the children arising from the experiment is, in his words, “going to be my own responsibility.”

 

(David Cyranoski, Nature News)

 

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

November 27, 2018 at 10:24 am

Mapping the Human Exposome: Understanding the “E” after the “G”


By: Bryan Bassig, Ph.D.


Source: Pixabay

 

Current efforts to maximize our understanding of the known interplay between genetic and environmental factors in disease etiology have the potential to inform future research priorities, disease management, and prevention.

Defining the concept of the ‘exposome’

It is now commonly accepted that the etiology of most chronic diseases involves a combination of genetics and environmental exposures, and most likely the interaction of these factors (“G” x “E”). The breadth of environmental exposures implicated in chronic disease risk is substantial and includes personally modifiable factors, such as smoking and dietary choices, as well as exposures that likely require policy interventions on a more universal scale, such as reducing air pollution. Substantial investments to map and characterize the human genome have led to an explosion of population-based studies that seek to understand the specific genetic variants associated with a wide variety of disease phenotypes. This in turn has generated great enthusiasm for applying these identified variants to personalized disease risk management and treatment. Whereas the discussion of the role of genetics in population-based health has already begun to move from discovery to translation with ongoing personalized medicine initiatives, our understanding of how to comprehensively measure the totality of environmental factors (broadly defined as non-genetic factors) that shape disease risk at a population level has generally lagged behind.
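One common way to formalize that “G” x “E” interplay is a regression model with a gene-by-environment product term. The toy Python sketch below uses invented coefficients purely to illustrate how an interaction term lets the same exposure carry a different risk depending on genotype; it is not drawn from any particular study.

```python
import math

def disease_risk(g: int, e: float, b0=-3.0, bg=0.4, be=0.8, bge=1.2):
    """Toy logistic G x E model: logit(p) = b0 + bg*G + be*E + bge*G*E.

    g: carrier status for a risk variant (0 or 1)
    e: standardized environmental exposure (e.g. smoking intensity)
    All coefficients are made-up numbers chosen only for illustration.
    """
    logit = b0 + bg * g + be * e + bge * g * e
    return 1.0 / (1.0 + math.exp(-logit))

for g in (0, 1):
    for e in (0.0, 1.0):
        print(f"G={g}, E={e}: risk = {disease_risk(g, e):.2f}")
# With bge > 0, the jump in risk from E=0 to E=1 is larger for carriers
# (G=1) than for non-carriers: the interaction, not G or E alone.
```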

Given the interplay and contributions of both “G” and “E” in disease processes, research and financial investments in one component but not the other likely capture less of the interindividual variation that exists in disease etiology, treatment, and survival. Increasing recognition of this point over the last decade has spawned several research initiatives aimed at a greater understanding of environmental factors in disease etiology, including efforts to understand the human “exposome.” Investment in these initiatives from a scientific funding standpoint has the potential to significantly improve exposure science and may, in theory, inform population-based health research strategies.

The concept of the human exposome was first conceived by epidemiologist Dr. Christopher Wild, a former director of the International Agency for Research on Cancer, in 2005. The concept has since gained traction within the research community. The idea behind the exposome is to complement the advances that have been made in understanding the human genome by characterizing the full spectrum of environmental exposures that occur from conception to death, with an understanding that these exposures are both dynamic in nature and broad in scope. Indeed, a full “mapping” of the exposome as originally conceived by Dr. Wild, and subsequently by others, would include an ability to measure all internal factors (e.g. endogenous hormones and metabolites) as well as exogenous exposures that are either specific to the individual (e.g. smoking/alcohol, diet) or more universal in nature (e.g. built environment, climate). These exposures would be captured or measured at various “snapshots” throughout life, ideally corresponding to key time points of biological development such as in utero, childhood, and early adulthood. In contrast to traditional exposure assessment in population-based studies, which relies on questionnaires or targeted biological measurements of a limited number of chemicals selected a priori, innovative technologies that take an agnostic and more comprehensive approach to measuring internal biomarkers (e.g. “omics”) or lifestyle-related factors (e.g. using smart phones to log physical activity patterns) would be needed for full characterization. Ideally, this would represent the cumulative level of all exposures in the human body.

Implementation: Progress, Potential, and Challenges

Implementation of the exposome paradigm is still in its relative infancy, and current discussions are primarily focused on the scope of the initiative that is achievable within the parameters of scientific funding and infrastructure. For instance, in the absence of large prospective cohort studies that include collection of repeated samples or exposure information from people over multiple timepoints, application of the exposome paradigm is still possible but may be limited to fully characterizing the internal and external environment using samples or measurements taken at a single timepoint. While the current focus is on scientific implementation of this paradigm, the potential long-term translatable implications of exposome research can be imagined. From the standpoint of environmental regulation, agencies that conduct risk assessments of environmental exposures evaluate a series of questions, including the dose-response relationship of these exposures with biologic effects or disease risk, and whether certain factors like age at exposure influence susceptibility. Application of the exposome framework provides a mechanism to better characterize these questions as well as to evaluate multiple exposures or “mixtures” when making regulatory decisions. This potential, however, would need to be balanced against the existing regulatory framework and the need to develop guidelines for interpreting the expansive and complex datasets.

While any application of the exposome paradigm to public health or clinical utilization would be an incredibly daunting challenge, a 2012 study published in Cell described this theoretical potential. The case study presented findings from a multiomic analysis of a single individual over 14 months, in which distinct biologic changes and omic profiles were observed during periods when the individual was healthy relative to periods when he developed a viral illness and type 2 diabetes. The authors concluded that the observed profiles were a proof of principle that an integrative personal omics profile could potentially be used in the future for early diagnostics and monitoring of disease states. While the study did not integrate data on external environmental exposures, further incorporation of these factors into the omic framework may provide evidence of distinct signatures that differ according to exposure status.
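The core logic of that proof of principle can be reduced to comparing each new measurement against an individual's own healthy baseline and flagging large deviations. The deliberately simplified Python sketch below uses invented analytes and values, and is nothing like the study's actual multi-omic pipeline; it only illustrates the idea of personal-baseline monitoring.

```python
from statistics import mean, stdev

# Repeated healthy-state measurements for one (hypothetical) person.
healthy_baseline = {
    "glucose": [88, 92, 90, 86, 91, 89],
    "crp": [0.8, 1.1, 0.9, 1.0, 0.7, 0.9],
}

def flag_deviations(baseline, new_sample, z_cutoff=3.0):
    """Return each analyte's z-score against the personal baseline and
    whether it exceeds the deviation cutoff."""
    flags = {}
    for analyte, history in baseline.items():
        mu, sigma = mean(history), stdev(history)
        z = (new_sample[analyte] - mu) / sigma
        flags[analyte] = (round(z, 1), abs(z) >= z_cutoff)
    return flags

# A later timepoint, e.g. during a viral illness (values invented):
print(flag_deviations(healthy_baseline, {"glucose": 118, "crp": 4.2}))
# Both analytes are flagged as far outside this person's healthy range.
```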

Current efforts to advance the exposome field have been bolstered by several initiatives, including a 2012 report by the National Academies that described the future vision and strategy of exposure science in the 21st century. Exposome-related research is also a major goal of the 2018-2023 strategic plan offered by the National Institute of Environmental Health Sciences (NIEHS), and the agency has supported two exposome research initiatives. These include the HERCULES (Health and Exposome Research Center: Understanding Lifetime Exposures Center) research center at Emory University, which is on the front lines of developing new technologies for evaluating the exposome, and the Children’s Health Exposure Analysis Resource (CHEAR), which encourages the use of biological assays in NIH-funded studies of children’s health.

As the field of exposomics matures, several issues will undoubtedly arise that intersect scientific and policy-related considerations, as described by Dr. Wild and others involved in this field. These include, but are not limited to:

a) Cross-discipline education and training opportunities: The exposome paradigm encompasses multiple scientific disciplines, including laboratory sciences, bioinformatics, toxicology, and public health. Given the traditional model of graduate programs in science, which generally focus on distinct subfields, new educational and/or training programs that provide cross-disciplinary foundations will be critical in training the next generation of scientists in this field.

b) Data accessibility and reproducibility: Given its expansive nature and the inherent interindividual variation of non-genetic factors, full characterization of the exposome and associations between exposures and disease may require large international teams of researchers that have central access to the expansive, complex datasets that are generated. Unlike the human genome, the dynamic nature of the internal and external environment will require extensive reproduction and validation both within and across different populations.

c) Funding and defining value: Fully implementing the exposome paradigm from an epidemiological research perspective would likely require profound investments in study infrastructure and laboratory technology. The discontinuation of the National Children’s Study, which originally intended to enroll and follow 100,000 children from birth to 21 years of age in the United States, illustrates the challenges associated with conducting large longitudinal research projects. These demands would need to be balanced with justifying the added value and potential for future utility along the same lines as genomics. The comprehensive understanding of non-genetic determinants of disease risk from a research standpoint, however, is the natural precursor to any discussion of utility.

d) Communication of research findings: The field of genomics has matured to the point that consumers can now obtain personalized reports and risk profiles of their genome from companies like 23andMe and Ancestry.com. It is theoretically possible that this commercial model could be extended in the future to other types of biomarkers such as the metabolome, yet the dynamic nature and current lack of clarity regarding the disease relevance of most non-genetic biomarkers would create considerable challenges in interpreting and conveying the data.

Conclusions

The underlying principles of the exposome were originally conceived by Dr. Wild as a mechanism to identify non-genetic risk factors for chronic diseases in epidemiologic studies. While the growing number of exposome research initiatives is primarily focused on this scientific goal, challenges remain in implementation. It is likely too early to project what the future public health and/or clinical utility of this paradigm, if any, may be. Nevertheless, continued investments in this area of research are critical to understanding the “missing pieces” of disease etiology and, ideally, to informing preventive measures and/or disease management in the future.

 

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

November 21, 2018 at 9:55 pm

Science Policy Around the Web – November 20, 2018


By: Andrew Wright, B.S.


Source: Wikimedia

 

Habitat Loss Threatens All Our Futures, World Leaders Warned

Recent reports have suggested that humanity has only 12 years to avoid catastrophic environmental collapse due to 1.5°C of industrial warming. While solutions to the threat of runaway climate change have been given a new sense of urgency by these findings, there exists a commensurate and less often visited issue: rapid declines in global biological diversity. Driven primarily by agricultural land conversion of terrestrial and marine ecosystems (via forest clearing and river damming, respectively), vertebrate populations have declined by 60% on average since 1970, according to the World Wildlife Fund’s most recent Living Planet Report. While this decline appears strongest in South and Central America and in freshwater habitats, the report joins a compendium of literature suggesting holistic declines in biodiversity among birds, insects, fish, and terrestrial vertebrates as part of an ongoing anthropogenic mass extinction event.

To address part of this issue, the parties to the UN Convention on Biological Diversity (CBD) are currently meeting in Sharm El Sheikh, Egypt, to discuss progress on the Aichi biodiversity targets for 2020. These targets came out of the Convention on Biological Diversity, a multilateral treaty signed in 1992 focused on preserving biodiversity, the sustainable use of biological resources, and the equitable sharing of their benefits. The Aichi biodiversity targets specified that people would be aware of risks to biodiversity, and that biodiversity values would be adopted by public, private, and governmental entities by 2020. Given the rapidity, intensity, and ubiquity of the decline in species, most, if not all, of these targets will likely be missed. As such, the delegates from the 196 signatory nations will also work on creating new biodiversity targets to be finalized at the next CBD meeting in China.

Since a comprehensive solution seems necessary given the increasingly global nature of trade, the authors of the new targets hope to garner a greater degree of international attention and intend to make the case that government commitments to reversing or pausing biodiversity loss should receive the same weight as action on climate change.

(Jonathan Watts, The Guardian)

The Ethical Quandary of Human Infection Studies

The United States has greatly improved its ability to soundly regulate the ethics of clinical studies since the infamous malfeasance of the Tuskegee syphilis study. Most significantly, the National Research Act of 1974 established the Institutional Review Board system to regulate the use of human subjects in accordance with the principles of respect for persons, beneficence, and justice.

The National Research Act provided a substantial step forward, including a clear requirement for universal informed consent. However, the expansion of clinical studies into new international regions of extreme poverty, driven in part by the influx of private money from large charitable organizations, has come with novel ethical considerations. In these newly studied populations, where income, education, and literacy levels may be lower, emphasis is now being placed on how to recruit volunteers without implicitly taking advantage of their circumstances.

One area of concern is compensation levels. Although compensation in a malaria infection study in Kenya was tied to the local minimum wage, the number of volunteers recruited far surpassed expectations. This may have been because payment during the study was guaranteed and consistent, in contrast to local work.

Aware of the concern, two of the largest private medical research funding organizations, the Bill and Melinda Gates Foundation and the Wellcome Trust, have recently instituted ethical guidelines putatively reinforcing the principle of beneficence, placing special emphasis on maximizing benefits over risk. It is an open question whether these protections will be sufficient, but at the very least it is important that rules be put in place proactively rather than reactively.

 

(Linda Nordling, Undark/Scientific American)

 

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

November 20, 2018 at 11:58 am

Science Policy Around the Web – November 16, 2018


By: Ben Wolfson, Ph.D.


Source: Pixabay

 

F.D.A. Seeks Restriction on Teens’ Access to Flavored E-Cigarettes and a Ban on Menthol Cigarettes

Stricter regulation of E-cigarettes by the Food and Drug Administration (FDA) has been rumored since last week, when market-leading E-cigarette company Juul Labs stopped accepting orders for some of its most popular flavored products. In an announcement on Thursday of this week, the FDA said that while it would allow stores to continue to sell flavored E-cigarettes, sales would have to be restricted to areas inaccessible to minors.

In the same announcement, the FDA stated that it will also move to ban additional flavored tobacco products: menthol cigarettes and flavored cigars. Flavored tobacco products are disproportionately used by young people, with a recent study finding that over 80% of youth and young adult tobacco users report using flavored tobacco. The same study also reported that 75% of youth tobacco users said they would stop using tobacco if it were not flavored. These trends are exactly why the FDA has moved toward new regulation. While youth use of combustible tobacco products has dropped, people who try E-cigarettes are more likely to use combustible products in the future.

Exactly how the new regulations will be applied remains to be determined, and public health advocates have expressed disappointment that the FDA did not announce an outright ban. Age restrictions are already in place for tobacco products, and many underage individuals get tobacco from older people rather than from stores illegally selling to them.

For these same reasons, the ban on menthol cigarettes and flavored cigars is being lauded by advocates, particularly in the African-American community, where use of these products is especially high and where restrictions have been fought by the tobacco industry for years.

(Sheila Kaplan and Jan Hoffman, New York Times)

Offering free DNA sequencing, Nebula Genomics opens for business. But there’s an itsy-bitsy catch

As personal genome sequencing becomes more accessible and popular, so do the privacy concerns that come with it. While individuals may choose to get their genome sequenced for recreational purposes, the data generated are highly valuable and of great interest to researchers, companies, and law enforcement, an evolving paradigm recently delved into in more detail on this blog. Individuals who sequence their genomes have the opportunity to share their (anonymized) data with researchers; however, these systems remain one-sided and simplistic.

While AncestryDNA and 23andMe are currently the most popular companies for personal sequencing, a new genetics company run by famed geneticist and genome engineering/synthetic biology pioneer George Church recently announced plans to enter the market. Church’s company, Nebula Genomics, plans to offer genome sequencing at a range of prices. Those who wish to opt out of sharing any information will pay $99 for genome sequencing, though the information provided will be low resolution. If the customer opts in to sharing data, the test will be free and the accuracy of the data will be increased.

Regardless of whether they choose to answer questions about themselves, both free and paying customers will still be able to refuse to share data with researchers. While other companies have an “all-or-nothing” approach to data sharing, Nebula will allow customers to audit data requests on a case-by-case basis. Any data shared will remain anonymized. Church stated that individuals with especially unique genetic traits that a company wants to study could even receive payment for their data. This approach would give people back control of their data, and is a pushback against the current system in which companies control all data and the profits gathered from it.
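A minimal Python sketch of what such case-by-case consent could look like is shown below; the class and field names are hypothetical and are not drawn from Nebula Genomics’ actual system. The point is simply that nothing is shared until the customer approves each individual request.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DataRequest:
    requester: str          # e.g. an academic lab or a company (hypothetical)
    purpose: str
    approved: bool = False

@dataclass
class CustomerAccount:
    name: str
    requests: List[DataRequest] = field(default_factory=list)

    def receive_request(self, requester: str, purpose: str) -> DataRequest:
        req = DataRequest(requester, purpose)
        self.requests.append(req)      # recorded, but nothing shared yet
        return req

    def approve(self, req: DataRequest) -> None:
        req.approved = True            # customer grants this request only

customer = CustomerAccount("anon-1234")
r1 = customer.receive_request("Lab A", "rare-variant study")
r2 = customer.receive_request("Company B", "marketing analytics")
customer.approve(r1)                   # only the first request is shared
print([(r.requester, r.approved) for r in customer.requests])
# [('Lab A', True), ('Company B', False)]
```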

(Sharon Begley, Stat News)

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

November 16, 2018 at 4:33 pm

Insect Allies and the role of DARPA in scientific research


By: Ben Wolfson, Ph.D.


Source: Pixabay

 

Last month, a Pentagon research program called Insect Allies burst into the public conversation after a team of research scientists and legal scholars published an article in Science detailing their concerns about and critiques of the project. Insect Allies is run by the Defense Advanced Research Projects Agency (DARPA) and was announced in 2016 with the stated goal of “pursuing scalable, readily deployable, and generalizable countermeasures against potential natural and engineered threats to the food supply with the goals of preserving the U.S. crop system”. As its name indicates, the Insect Allies research program seeks to develop insects that carry gene-editing viruses, allowing for rapid genetic modification of plant food sources. The program exemplifies both the pros and cons of DARPA work. It leapfrogs current technological paradigms, promoting a next stage of synthetic biology. At the same time, however, it seeks to create a technology with problematic potential military applications. The tension between basic research and the development of military technologies has dogged DARPA since its inception. As theoretical and empirical knowledge in the fields of genetic modification and synthetic biology improves, it is imperative that novel technologies are developed with appropriate ethical and moral oversight and that scientists consider the ramifications of their work.

Origins and Changes of DARPA

Science and the military have long been interwoven, a relationship that was formalized in the U.S. over the past century. In 1947, President Truman created the Department of Defense, in part to fund scientific research. A decade later, President Eisenhower highlighted the importance of science in national defense with the creation of the Advanced Research Projects Agency (renamed DARPA in 1972). DARPA’s creation was a direct response to the launch of Sputnik by the Soviet Union, and the agency was given the mission of “preventing technological surprises like Sputnik, and developing innovative, high-risk research ideas that hold the potential for significant technological payoffs”.

In its early years, DARPA funded significant amounts of basic and foundational research that did not have immediate applications. However, in 1973 Congress passed the Mansfield Amendment, preventing the Defense Department from funding any research without “a direct and apparent relationship to a specific military function or operation”. The amendment was contentious at the time of its passing, with Presidential Science Advisor Lee DuBridge telling a congressional subcommittee that the amendment had negatively affected the quality of research projects, because it is not possible to prove the relevance of a project in advance and it is therefore wrong to prevent an agency from funding basic research it sees as valuable. Passage of the amendment fundamentally reshaped the U.S. research funding landscape, and projects accounting for upwards of 60% of DOD research funds were cancelled or moved to other agencies. In place of basic research, DARPA shifted to funding research with direct military applications. These projects have often fallen into the realm of “dual-use” technologies, having both civilian and military uses. Successful examples of this strategy include funding projects that evolved into the internet and the Global Positioning System (GPS). Current research spans projects with clear civilian applications, such as a multitude of projects on the next generation of medical technologies, to weapons research with purely military applications.

The Insect Allies program

Agriculture is one of the predominant industries in the U.S., making the country a net exporter and the world’s largest supplier of a variety of agricultural products, including beef, corn, wheat, poultry, and pork. The importance of American agriculture to both national security and the security of its global allies and trade partners is well recognized by national security officials, especially in the context of climate change and the potential for growing scarcity. The primary threats to agriculture are disease and weather-related events. While these can be mitigated through pesticides, clearing of crops, quarantine, and selective breeding, current strategies are both destructive and time-consuming.

The Insect Allies program has three focus areas: viral manipulation, insect vector optimization, and selective gene therapy in mature plants. Through the application and combination of these technologies, Insect Allies would function by genetically modifying already-growing plants through “horizontal environmental genetic alteration agents” (HEGAAs). Traditionally, genetic modification involves changing the genes of a parent organism and propagating its offspring. This process is essentially the same as the selective breeding practiced in agriculture for generations. While effective, it is a time-consuming practice, as you must breed successive generations of your population of interest.

HEGAAs completely revamp this process. Instead of creating a population of interest from scratch, HEGAAs allow scientists to modify an existing population. If you wanted to create a pesticide-resistant crop, the traditional strategy would be to insert the gene for pesticide resistance into one plant, collect its seeds, and use them to grow an entire field of pesticide-resistant plants. With HEGAA technology, farmers could make an already-grown field resistant by modifying each individual plant on a broad scale.

Criticism of the Insect Allies program

The authors of the Science article critique the Insect Allies program over a variety of issues, ranging from biological to ethical and moral dilemmas. The article takes issue both with the use of wide-scale genetic modification technologies and with the use of insects as vectors rather than already-existing technologies such as overhead spraying. Wide-scale genetic modification is a line that has yet to be crossed, and it currently lacks a regulatory path. While research into gene-modifying technology is ongoing and real-world tests are inevitable, such tests are a contentious issue that is currently being debated. Moreover, agricultural products modified by HEGAAs have no current path to the market. The combination of seemingly little thought within the program toward the regulation that would be necessary for the described application of the technology, together with the existence of simpler alternatives such as overhead spraying, leads the authors to suspect that Insect Allies is being developed for other ends. While a population of gene-modifying insects could be used to help U.S. crops survive weather changes or pesticides, it could also potentially be applied to the crops of other nations in war. Biological weapons were banned in 1972, and currently no nation has (publicly) developed them. While the technologies being developed by Insect Allies are described as “for peaceful means”, the stated goals are achievable through already-existing technologies. Furthermore, international competition with Insect Allies may accelerate crossing the line between peacetime and wartime technology.

Soon after publication of the Science article, Dr. Blake Bextine, program manager for Insect Allies, released a statement refuting many of these points. He stated that DARPA moved into agricultural work because it is an important aspect of both national and international security, and that the work falls under DARPA’s charter to develop fundamentally new technologies that leapfrog existing capabilities. Moreover, he affirmed that Insect Allies has no plan for open release, and that the development of regulatory systems had been planned since the start of the program.

What does the future hold?

The Science article’s authors note that they would be worried about Insect Allies whether it was under civilian or military purview, but it is impossible to ignore the implications of synthetic biology and genetic modification research for the military. DARPA’s strategy of pursuing high-risk, high-reward research is both effective and ingrained in the DNA of the organization; so, however, is the fact that DARPA is a defense organization.

When DARPA was founded (as ARPA), its purpose was to promote high-risk scientific research that would increase U.S. soft power internationally. After the Mansfield Amendment, these goals shifted from basic toward applied research, with a focus on defense-oriented work. An advantage of basic research is that it takes time to develop, allowing the findings, and their ramifications, to percolate throughout the global scientific community. The quintessential example of this is the regulation of recombinant DNA technologies. Soon after recombinant DNA technology was developed, the 1975 Asilomar Conference was held to establish voluntary guidelines that would ensure the safety of a game-changing scientific technology. As synthetic biology development has accelerated, the discussion around the regulation of synthetic biology and genetic modification technology has also begun, and is currently ongoing.

While it is impossible to argue with the massive benefits that civilian applications of DARPA-developed technologies have provided, synthetic biology and genetic modification technologies have the potential to enact immense changes globally. The environment in which a technology is developed and applied can greatly influence its use and the way it is viewed by the public for generations. The Insect Allies program states that it is focusing on developing insect-based HEGAA technologies as a means of pushing the development of gene-editing technologies to increase food security, in a transparent manner that promotes openly published research. It is critical that the Insect Allies program be held to this standard, and that regulation by the global scientific community be allowed to shape the direction and application of these potentially game-changing technologies.

 

 

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

November 15, 2018 at 11:22 am