Science Policy For All

Because science policy affects everyone.

Posts Tagged ‘genomics’

Science Policy Around the Web – February 22, 2019


By: Janani Prabhakar, Ph.D.

Source: Pixabay

China Uses DNA to Track Its People, With the Help of American Expertise

Recently, global attention has fallen on China’s treatment of its Uighur people, a predominantly Muslim ethnic group who live in the country’s far western region and have historically enjoyed a fair degree of autonomy from the Chinese government. In an attempt to take control of this ethnic group and region, the Chinese government has used intense surveillance, oppression, and detainment in “re-education” camps. In addition, the government has been collecting DNA samples from this ethnic group to build a comprehensive database of Chinese Uighurs. What is unknown is how the Chinese government intends to use this database in its oppression of Chinese Uighurs. To complicate matters, the collection of DNA samples has been bolstered by equipment and data from US-based companies and researchers. For example, Thermo Fisher supplies technology that supports DNA collection and analysis, and Yale geneticist Dr. Kenneth Kidd provided genetic material from global populations to Chinese researchers. In both cases, the US parties were unaware of how their contributions were used. This raises larger questions about culpability, adherence to scientific norms, and the role of scientific collaborations at a global scale. China has included these DNA samples in global databases, but it does not appear that proper consent was obtained for them. In some reports, individuals were summoned by police for mandatory health screenings and medical checkups, suggesting that the samples were obtained through coercion. It is unclear whether the inclusion of these samples in databases or collaborations reflects tacit acceptance of China’s surveillance of Uighurs.

Researchers including Dr. Kidd have hosted Chinese researchers in their labs, where the visitors learned techniques for analyzing DNA. These collaborations have resulted in publications that provide methods to distinguish between ethnic groups, in which Chinese researchers used DNA samples obtained from collaborators as a comparison group to Uighurs. Chinese officials state that such methods would be useful for identifying individuals at a crime scene, which, on the surface, is not an incorrect application of the data. However, in light of allegations that the Chinese government is using this technology to oppress certain minority groups, the role of scientific collaboration becomes murky. Beyond collaboration, there is the question of US corporate involvement. Thermo Fisher recently announced that it would stop selling products that have been integral to forensic DNA analyses in Xinjiang, where the campaign to suppress Uighurs has been most intense. This is striking given that 10 percent of Thermo Fisher’s $21.9 billion in revenue comes from China. While this is a big step, monitoring how technology and science are being used globally must remain a central focus given the large human rights implications.

(Sui-Lee Wee, New York Times)

The Energy 202: One of world’s biggest coal miners caps production amid climate concerns

In response to increasing public pressure, Glencore, one of the world’s largest mining companies, has announced that it will cap the amount of coal it mines. Evidence of private companies responding to this global pressure to reduce greenhouse gases has emerged around the world. Most recently, the Tennessee Valley Authority voted to shut down two aging coal-fired power plants. These actions reflect big shifts in an industry that has long disputed climate science, and they come even as President Trump has promised to bring back coal. The pressure on private companies came from more than 200 institutions worldwide that targeted the world’s largest emitters, prompting a wave of corporations like Glencore to make similar reduction pledges. These commitments are multi-pronged. Glencore has agreed to cap its annual coal production rather than increase it, a shift away from its original production trends. In addition, the company has considered tying executive pay to meeting these goals and ending partnerships with coal lobbies.

This move toward reduction also reflects a waning coal market and its diminishing economic promise for companies like Glencore. While the United States and other advanced economies have reduced their coal use, Asian countries like India and China have increased it, and Glencore exports most of its coal to these coal-using countries in Asia. As a result, it is actually in the economic interest of private coal companies to reduce production: restricting supply as demand wanes will raise coal prices and increase Glencore’s revenue. Importantly for climate change activists, an increase in coal prices would, over time, lead to a reduction in coal use. In this way, private companies can work in partnership with efforts to reduce greenhouse gases without compromising their own interests.

(Dino Grandoni, Washington Post)

Have an interesting science policy link? Share it in the comments!


Written by sciencepolicyforall

February 22, 2019 at 2:08 pm

Science Policy Around the Web – February 19, 2019


By: Mohor Sengupta, Ph.D.

Source: Maxpixel

New AI fake text generator may be too dangerous to release, say creators

Artificial intelligence is everywhere. In a Beijing competition last year, an AI identified patterns in brain MRI scans from hundreds of individuals and diagnosed their conditions far more accurately than human doctors. In modern journalism, AI is being trained by humans to scan financial reports and dish out an immediate news story with the pertinent facts and figures: Bloomberg uses the bot “Cyborg” to compete against its rival, Reuters, in financial journalism, while The Washington Post uses “Heliograf”, which covered the 2016 Olympics and the 2016 US elections. But AI can also create a completely fake human image by teaching itself from several real images, and it can produce a completely fake video by literally putting words in a person’s mouth. Now, researchers at the nonprofit research company OpenAI have realized, to their concern, that their new AI text generator, called GPT2, is so capable that it can produce connected, logical sentences from a random one-line input.

Take this example. When fed the opening line of George Orwell’s Nineteen Eighty-Four – “It was a bright cold day in April, and the clocks were striking thirteen” – GPT2 comes up with this: “I was in my car on my way to a new job in Seattle. I put the gas in, put the key in, and then I let it run. I just imagined what the day would be like. A hundred years from now. In 2045, I was a teacher in some school in a poor part of rural China. I started with Chinese history and history of science.” It recognizes the futuristic reference in the input. It also produced an interesting and obviously fake statement about UK Prime Minister Theresa May when fed an unfinished line from a Guardian article about Brexit. In short, GPT2 is quick and creative.

It is not difficult to imagine the mischief this kind of groundbreaking technology could create. Generating fake news is one use that comes to mind easily.

Take, for example, this false and incomplete phrase presented to GPT2 by a group of humans at MIT: “Russia has declared war on the United States after Donald Trump accidentally”.

OpenAI chillingly demonstrated how the model can hoodwink readers by completing that phrase into this: “Russia has declared war on the United States after Donald Trump accidentally fired a missile in the air. Russia said it had “identified the missile’s trajectory and will take necessary measures to ensure the security of the Russian population and the country’s strategic nuclear forces.” The White House said it was “extremely concerned by the Russian violation” of a treaty banning intermediate-range ballistic missiles. The US and Russia have had an uneasy relationship since 2014, when Moscow annexed Ukraine’s Crimea region and backed separatists in eastern Ukraine.”

Don’t panic: none of that is true!

GPT2 was trained on a dataset of about 10 million articles, selected by scouring the social news site Reddit for links with more than three votes. “We need to perform experimentation to find out what they can and can’t do,” said Jack Clark, policy director of OpenAI. “If you can’t anticipate all the abilities of a model, you have to prod it to see what it can do. There are many more people than us who are better at thinking what it can do maliciously.” For now, the company has decided to keep GPT2 behind closed doors until it better understands the bot’s full potential; only a simpler version will be made publicly available.
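For readers curious what “prompting” such a model looks like in practice, here is a minimal sketch using the smaller, publicly released GPT-2 model via the Hugging Face transformers library; the library and model name are assumptions of this example, not OpenAI’s own tooling.

```python
# A minimal sketch of prompting the publicly released small GPT-2 model via
# the Hugging Face "transformers" library (an assumption of this example;
# OpenAI's own setup differed). Requires: pip install transformers torch
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = ("It was a bright cold day in April, and the clocks were "
          "striking thirteen.")
result = generator(prompt, max_length=80, do_sample=True)

# The output is the prompt followed by the model's invented continuation.
print(result[0]["generated_text"])
```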

While there are skeptics of AI, there are also critics of the skeptics, who argue that humans can produce fake news just as well as an AI. But take a moment to consider the number, variety, creativity, and logical or factual supports that can be woven together in seconds by a bot like GPT2. How could any human outsmart a program that has learned from 10 million articles about current and past politics, wars, famous fictional characters, sports, celebrities, nature, and anything else under the sun, and beyond?

A technology’s downsides cannot keep it confined behind closed doors; we have learned that from the past. AI is becoming increasingly necessary in many areas, for example, identifying medical conditions from MRI scans, as mentioned above. Nature Medicine recently published an article in which an AI accurately diagnosed common childhood diseases by analyzing digital and electronic health records. But like any revolutionary technology, AI can be misused for malicious purposes, and it will be, sooner or later. The idea here is to prepare the world for what is about to become mainstream in a few years. Clark calls this an “escalator from hell”.

(Alex Hern, The Guardian)

Searching Tardigrades for Lifesaving Secrets

In the series “Cosmos: A Spacetime Odyssey”, Neil deGrasse Tyson talked about five cataclysmic events, separated by eons, that wiped out almost all life on Earth. Each time this happened, a minuscule proportion of existing life escaped and adapted to the new environment. Among these survivors is an organism that endured ALL five purges. The size of a grain of sand, it dwells in moisture and is called a water bear, a moss piglet, or, formally, a tardigrade.

It is not entirely clear how these organisms survive extreme environmental conditions, but research on their physiology suggests the involvement of certain proteins that protect their cells from dehydration. In the event of an environmental emergency, the tardigrade can desiccate itself within minutes into a firm, curled-up ball called a “tun”, only to resume normal functions within minutes of sensing moisture. This remarkable feat is called anhydrobiosis in scientific jargon. A group of scientists from Japan has found certain heat-soluble proteins that help the animal tide over the anhydrobiosis period. Separately, another group of scientists from the USA and Italy has found that intrinsically disordered proteins (IDPs) specific to tardigrades (TDPs) confer protection against desiccation.

A team of three scientists at Harvard Medical School is now using computational biology and machine learning to design TDPs tailored to slow down metabolism in human cells. “It really started out as a wacky, high-risk idea,” said Pamela Silver, who spearheads the project. In 2008, Dr. Silver came across a grant challenge posted by the US military seeking novel solutions to stabilize hemorrhaging personnel in war zones. Together with her colleague Roger Chang, a bioinformatician and machine-learning expert, and computational biologist Debora Marks, she set out to engineer novel TDPs that would lower metabolism in eukaryotic cells during a state of shock. This is akin to “slowing biological time”, only to return to normal pace once the critical period has passed. Although it is still unclear how these proteins might work, Dr. Chang has suggested that they might form a biological glass that physically immobilizes everything inside the cell during the stress period.

The group was recently awarded a five-year cooperative agreement by the Defense Advanced Research Projects Agency (DARPA) to pursue the idea. If it works, the engineered TDPs could be a milestone in modern medicine, revolutionizing not only trauma management but also addressing several pressing issues, such as the long-term transportation of protein-based drugs and to-be-transplanted organs, and keeping egg cells viable without freezing.

(Steph Yin, New York Times)

Embryo ‘Adoption’ Is Growing, but It’s Getting Tangled in the Abortion Debate

In vitro fertilization (IVF) has changed the way people understand reproduction. It has made conceiving a reality for many women who are not able to conceive naturally for medical or social reasons. During the IVF process, sperm from a male donor are used to fertilize an egg cell from a female donor (generally the would-be biological parents). Usually, more than one embryo results from this process, but only one is transferred to the recipient’s uterus. Commonly, when a biological child is desired, donor and recipient are the same woman. In a relatively new trend, couples other than the donors are “adopting” these extra embryos. While this may seem like an uncomplicated way of giving new life and a great option for many women, it is fraught with problems arising from religious and social issues.

Monica Broecker is a woman in her mid-forties who found out, after repeated miscarriages, that she couldn’t conceive. She decided to adopt an embryo and approached the National Embryo Donation Center, based in Knoxville, Tennessee, which is the largest embryo donation clinic in the US. Despite being financially stable, Ms. Broecker was turned down by the agency because she is a single woman.

Some of the better-known embryo adoption agencies are funded by a Department of Health and Human Services (HHS) grant, the Embryo Adoption Awareness Program, which has had $1 million in annual funding since 2003. To date, all but two of the grant recipients have been affiliated with anti-abortion or Christian organizations.

Ms. Tyson, who works for Snowflakes Embryo Adoption, one of the recipient agencies of the HHS grant, says that her clients are mostly Christian. In almost all embryo donations, the donor family selects the recipient, and Ms. Tyson has had a difficult time finding donors for single women, LGBTQ people, atheists, or people from diverse religious backgrounds. She usually refers such recipients to Embryo Donation International, an agency that is not affiliated with any religious organization. Unsurprisingly, Embryo Donation International doesn’t receive the HHS grant.

On average, embryo donation costs much less than a single round of IVF. Increasingly, many women wanting to become mothers see this approach as a perfect synergy: it meets their own needs while giving life to an embryo that would otherwise remain frozen indefinitely. Matching with a donor is the biggest problem many of them experience. Although all grant recipients deny that they consider religion, sexual orientation, or marital status when screening clients, the statistics show a different picture. The government should look more carefully into its grant applicants and take measures to diminish bias in granting the awards.

(Caroline Lester, New York Times)

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

February 19, 2019 at 4:19 pm

Mapping the Human Exposome: Understanding the “E” after the “G”


By: Bryan Bassig, Ph.D.


Source: Pixabay

 

Current efforts to deepen our understanding of the interplay between genetic and environmental factors in disease etiology have the potential to inform future research priorities, disease management, and prevention.

Defining the concept of the ‘exposome’

It is now commonly accepted that the etiology of most chronic diseases reflects a combination of genetics and environmental exposures, and most likely the interaction of these factors (“G” x “E”). The breadth of environmental exposures implicated in chronic disease risk is substantial, spanning personally modifiable factors such as smoking and dietary choices as well as exposures that likely require policy interventions on a more universal scale, such as reducing air pollution. Substantial investments to map and characterize the human genome have led to an explosion of population-based studies that seek to identify the specific genetic variants associated with a wide variety of disease phenotypes, which in turn has generated great enthusiasm for applying these variants to personalized disease risk management and treatment. Whereas the discussion of genetics in population-based health has already begun to move from discovery to translation through ongoing personalized medicine initiatives, our understanding of how to comprehensively measure the totality of environmental factors (broadly defined as non-genetic factors) that shape disease risk at the population level has generally lagged behind.

Given the interplay and contributions of both “G” and “E” in disease processes, research and financial investments in one component but not the other likely capture less of the interindividual variation that exists in disease etiology, treatment, and survival. Increasing recognition of this point over the last decade has spurred several research initiatives aimed at a greater understanding of environmental factors in disease etiology, including efforts to understand the human “exposome.” Investment in these initiatives has the potential to significantly improve exposure science and may, in turn, inform population-based health research strategies.

The concept of the human exposome was first proposed in 2005 by epidemiologist Dr. Christopher Wild, a former director of the International Agency for Research on Cancer, and has since gained traction within the research community. The idea is to complement the advances made in understanding the human genome by characterizing the full spectrum of environmental exposures that occur from conception to death, with the understanding that these exposures are both dynamic in nature and broad in scope. Indeed, a full “mapping” of the exposome as originally conceived by Dr. Wild, and subsequently elaborated by others, would include the ability to measure all internal factors (e.g. endogenous hormones and metabolites) as well as exogenous exposures that are either specific to the individual (e.g. smoking/alcohol, diet) or more universal in nature (e.g. built environment, climate). These exposures would be captured or measured at various “snapshots” throughout life, ideally corresponding to key periods of biological development such as in utero, childhood, and early adulthood. In contrast to traditional exposure assessment in population-based studies, which relies on questionnaires or targeted biological measurements of a limited number of chemicals selected a priori, full characterization would require innovative technologies that take an agnostic and more comprehensive approach to measuring internal biomarkers (e.g. “omics”) or lifestyle-related factors (e.g. using smartphones to log physical activity patterns). Ideally, this would represent the cumulative level of all exposures experienced by the human body.
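To make the “snapshot” idea concrete, here is a toy sketch of how repeated exposome measurements might be organized per subject; the field names are entirely hypothetical, not drawn from any actual exposome study.

```python
# A toy sketch (hypothetical field names) of how repeated exposome
# "snapshots" could be organized: each subject carries internal and
# external measurements captured at key developmental timepoints.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ExposomeSnapshot:
    life_stage: str                       # e.g. "in utero", "childhood"
    internal: Dict[str, float]            # endogenous markers (omics)
    external_individual: Dict[str, str]   # e.g. smoking, diet
    external_universal: Dict[str, float]  # e.g. air pollution level

@dataclass
class Subject:
    subject_id: str
    snapshots: List[ExposomeSnapshot] = field(default_factory=list)

subject = Subject("S001")
subject.snapshots.append(ExposomeSnapshot(
    life_stage="early adulthood",
    internal={"cortisol_ng_ml": 12.3},
    external_individual={"smoking": "never"},
    external_universal={"pm2_5_ug_m3": 8.9},
))
```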

Implementation: Progress, Potential, and Challenges

Implementation of the exposome paradigm is still in its relative infancy, and current discussions are primarily focused on the scope of the initiative that is achievable within the parameters of scientific funding and infrastructure. For instance, in the absence of large prospective cohort studies that collect repeated samples or exposure information from people over multiple timepoints, application of the exposome paradigm is still possible but may be limited to characterizing the internal and external environment using samples or measurements taken at a single timepoint. While the current focus is on scientific implementation of this paradigm, the potential long-term translatable implications of exposome research can be imagined. From the standpoint of environmental regulation, agencies that conduct risk assessments of environmental exposures evaluate a series of questions, including the dose-response relationship of these exposures with biologic effects or disease risk, and whether certain factors like age at exposure influence susceptibility. The exposome framework provides a mechanism to potentially better characterize these questions, as well as to evaluate multiple exposures or “mixtures”, when making regulatory decisions. This potential, however, would need to be balanced against the existing regulatory framework and the need to develop guidelines for interpreting the expansive and complex datasets that would result.

While any application of the exposome paradigm to public health or clinical practice would be an incredibly daunting challenge, a 2012 study published in Cell described this theoretical potential. The case study presented findings from a multi-omic analysis of a single individual over 14 months, in which distinct biologic changes and omic profiles were observed during periods when the individual was healthy relative to periods when he developed viral illness and type 2 diabetes. The authors concluded that the observed profiles were a proof of principle that an integrative personal omics profile could potentially be used in the future for early diagnostics and monitoring of disease states. While the study did not integrate data on external environmental exposures, further incorporation of these factors into the omic framework may reveal distinct signatures that differ according to exposure status.

Current efforts to advance the exposome field have been bolstered by several initiatives, including a 2012 National Academies report that described the future vision and strategy of exposure science in the 21st century. Exposome-related research is also a major goal of the 2018-2023 strategic plan of the National Institute of Environmental Health Sciences (NIEHS), and the agency has supported two exciting exposome research initiatives: the HERCULES (Health and Exposome Research Center: Understanding Lifetime Exposures) center at Emory University, which is on the front lines of developing new technologies for evaluating the exposome, and the Children’s Health Exposure Analysis Resource (CHEAR), which encourages the use of biological assays in NIH-funded studies of children’s health.

As the field of exposomics matures, there will undoubtedly be several issues that arise that intersect both scientific and policy-related considerations as described by Dr. Wild and others involved in this field. These include but are not limited to:

a) Cross-discipline education and training opportunities: The exposome paradigm encompasses multiple scientific disciplines, including laboratory sciences, bioinformatics, toxicology, and public health. Given the traditional model of graduate programs in science, which generally focus on distinct subfields, new educational and/or training programs that provide cross-disciplinary foundations will be critical for training the next generation of scientists in this field.

b) Data accessibility and reproducibility: Given its expansive nature and the inherent interindividual variation of non-genetic factors, full characterization of the exposome and of associations between exposures and disease may require large international teams of researchers with central access to the expansive, complex datasets that are generated. Unlike the human genome, the dynamic nature of the internal and external environment will require extensive replication and validation both within and across different populations.

c) Funding and defining value: Fully implementing the exposome paradigm from an epidemiological research perspective would likely require profound investments in study infrastructure and laboratory technology. The discontinuation of the National Children’s Study, which originally intended to enroll and follow 100,000 children from birth to 21 years of age in the United States, illustrates the challenges associated with conducting large longitudinal research projects. These demands would need to be balanced by justifying the added value and potential for future utility along the same lines as genomics. A comprehensive understanding of non-genetic determinants of disease risk from a research standpoint, however, is the natural precursor to any discussion of utility.

d) Communication of research findings: The field of genomics has matured to the point that consumers can now obtain personalized reports and risk profiles of their genome from companies like 23andMe and Ancestry.com. It is theoretically possible that this commercial model could be extended in the future to other types of biomarkers such as the metabolome, yet the dynamic nature and current lack of clarity regarding the disease relevance of most non-genetic biomarkers would create considerable challenges in interpreting and conveying the data.

Conclusions

The underlying principles of the exposome were originally conceived by Dr. Wild as a mechanism to identify non-genetic risk factors for chronic diseases in epidemiologic studies. While the increasing number of exposome research initiatives remains primarily focused on this scientific goal, challenges remain in implementation, and it is likely too early to project what the future public health and/or clinical utility of this paradigm, if any, may be. Nevertheless, continued investment in this area of research is critical to understanding the “missing pieces” of disease etiology and, ideally, to informing preventive measures and/or disease management in the future.

 

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

November 21, 2018 at 9:55 pm

Science Policy Around the Web – November 16, 2018


By: Ben Wolfson, Ph.D.


Source: Pixabay

 

F.D.A. Seeks Restriction on Teens’ Access to Flavored E-Cigarettes and a Ban on Menthol Cigarettes

Stricter regulation of e-cigarettes by the Food and Drug Administration (FDA) had been rumored since last week, when market-leading e-cigarette company Juul Labs stopped accepting orders for some of its most popular flavored products. In an announcement on Thursday of this week, the FDA said that while it would allow stores to continue to sell flavored e-cigarettes, the products would have to be restricted to areas that are inaccessible to minors.

In the same announcement, the FDA stated that it will move to ban two additional flavored tobacco products: menthol cigarettes and flavored cigars. Flavored tobacco products are disproportionately used by young people, with a recent study finding that over 80% of youth and young adult tobacco users report using flavored tobacco. The same study also reported that 75% of youth tobacco users said they would stop using tobacco if it were not flavored. These trends are exactly why the FDA has moved for new regulation: while youth use of combustible tobacco products has dropped, people who try e-cigarettes are more likely to use combustible products in the future.

Exactly how the new regulations will be applied remains to be determined, and public health advocates have expressed disappointment that the FDA did not announce an outright ban. Age restrictions are already in place for tobacco products, and many underage individuals get tobacco from older people rather than from stores illegally selling to them.

For these same reasons, the ban on menthol cigarettes and flavored cigars is being lauded by advocates, especially in the African-American community, where use of these products is particularly high and where restrictions have been fought by the tobacco industry for years.

(Sheila Kaplan and Jan Hoffman, New York Times)

Offering free DNA sequencing, Nebula Genomics opens for business. But there’s an itsy-bitsy catch

As personal genome sequencing becomes more accessible and popular, so do the privacy concerns that come with it. While individuals may choose to get their genomes sequenced for recreational purposes, the data generated are highly valuable and of great interest to researchers, companies, and law enforcement, an evolving paradigm that was recently explored in more detail on this blog. Individuals who sequence their genomes have the opportunity to share their (anonymized) data with researchers; however, these systems remain one-sided and simplistic.

While AncestryDNA and 23andMe are currently the most popular companies for personal sequencing, a new genetics company run by famed geneticist and genome engineering/synthetic biology pioneer George Church recently announced plans to enter the market. Church’s company, Nebula Genomics, plans to offer genome sequencing at a range of prices. Those who wish to opt out of sharing any information will pay $99 for genome sequencing, but the information provided will be low resolution. If the customer opts in to sharing data, the test will be free and the accuracy of the data will be increased.

Regardless of whether they choose to answer questions about themselves, both free and paying customers will still be able to refuse to share data with researchers. While other companies have an “all-or-nothing” approach to data sharing, Nebula will allow customers to audit data requests on a case-by-case basis, and any data shared will remain anonymized. Church stated that individuals with especially rare genetic traits that a company wants to study could even receive payment for their data. This approach would give people back control of their data, and is a pushback against the current system in which companies control all data and the profits gathered from it.
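As a rough illustration of this case-by-case model, consider the toy sketch below; the class and field names are hypothetical and do not reflect Nebula’s actual system.

```python
# A toy sketch of a case-by-case data-request model; the names and fields
# here are hypothetical, not Nebula's real implementation.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DataRequest:
    requester: str        # e.g. a pharmaceutical company
    purpose: str
    payment_offer: float  # compensation for especially sought-after data

@dataclass
class CustomerAccount:
    opted_in: bool                                      # opting in makes sequencing free
    approved: List[DataRequest] = field(default_factory=list)

    def review(self, request: DataRequest, approve: bool) -> None:
        # Unlike an all-or-nothing model, every request is audited
        # individually by the customer before any data changes hands.
        if approve:
            self.approved.append(request)

account = CustomerAccount(opted_in=True)
account.review(DataRequest("PharmaCo", "rare-variant study", 50.0), approve=True)
```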

(Sharon Begley, Stat News)

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

November 16, 2018 at 4:33 pm

Insect Allies and the role of DARPA in scientific research


By: Ben Wolfson, Ph.D.


Source: Pixabay

 

Last month, a Pentagon research program called Insect Allies burst into the public conversation after a team of research scientists and legal scholars published an article in Science magazine detailing their concerns about and critiques of the project. Insect Allies is run by the Defense Advanced Research Projects Agency (DARPA) and was announced in 2016 with the stated goal of “pursuing scalable, readily deployable, and generalizable countermeasures against potential natural and engineered threats to the food supply with the goals of preserving the U.S. crop system”. As its name indicates, the Insect Allies research program seeks to develop insects that carry gene-editing viruses, allowing for rapid genetic modification of plant food sources. The program exemplifies both the pros and cons of DARPA work: the project leapfrogs current technological paradigms, promoting a next stage of synthetic biology, yet at the same time it seeks to create a technology with problematic potential military applications. The tension between basic research and the development of military technologies has dogged DARPA since its inception. As theoretical and empirical knowledge in the fields of genetic modification and synthetic biology improves, it is imperative that novel technologies be developed with appropriate ethical and moral oversight, and that scientists consider the ramifications of their work.

Origins and Changes of DARPA

Science and the military have long been interwoven, a relationship that was formalized in the U.S. in the past century. In 1947, President Truman created the Department of Defense, in part to fund scientific research. A decade later, President Eisenhower highlighted the importance of science in national defense with the creation of the Advanced Research Projects Agency (renamed DARPA in 1972). DARPA’s creation was a direct response to the launch of Sputnik by the Soviet Union, and the agency was given the mission of “preventing technological surprises like Sputnik, and developing innovative, high-risk research ideas that hold the potential for significant technological payoffs”.

In its early years, DARPA funded a significant amount of basic and foundational research without immediate applications. However, in 1973 Congress passed the Mansfield Amendment, preventing the Defense Department from funding any research without “a direct and apparent relationship to a specific military function or operation”. The amendment was contentious at the time of its passing, with Presidential Science Advisor Lee DuBridge telling a congressional subcommittee that it had negatively affected the quality of research projects: because it is not possible to prove the relevance of a project in advance, it is wrong to prevent an agency from funding basic research it sees as valuable. Passage of the amendment fundamentally reshaped the U.S. research funding landscape, and projects accounting for upwards of 60% of DOD research funds were cancelled or moved to other agencies. In place of basic research, DARPA shifted to funding research with direct military applications. These projects have often fallen into the realm of “dual-use” technologies, having both civilian and military uses; successful examples include funding the projects that evolved into the internet and the Global Positioning System (GPS). Current research spans from projects with clear civilian applications, such as a multitude of next-generation medical technologies, to weapons research with purely military potential.

The Insect Allies program

Agriculture is one of the predominant industries in the U.S., making the country a net exporter and the world’s largest supplier of a variety of agricultural products, including beef, corn, wheat, poultry, and pork. The importance of American agriculture to both national security and the security of global allies and trade partners is well recognized by national security officials, especially in the context of climate change and the potential for growing scarcity. The primary threats to agriculture are disease and weather-related events. While these can be mitigated through pesticides, clearing of crops, quarantine, and selective breeding, current strategies are both destructive and time-consuming.

The Insect Allies program has three focus areas: viral manipulation, insect vector optimization, and selective gene therapy in mature plants. Through the application and combination of these technologies, Insect Allies would genetically modify already-growing plants using “horizontal environmental genetic alteration agents” (HEGAAs). Traditionally, genetic modification involves changing the genes of a parent organism and propagating its offspring, a process essentially similar in outline to the selective breeding practiced in agriculture for generations. While this is effective, it is time-consuming, as successive generations of the population of interest must be bred.

HEGAAs completely revamp this process. Instead of creating a population of interest from scratch, HEGAAs allow scientists to modify an existing population. To create a pesticide-resistant crop, the traditional strategy would be to insert the gene for pesticide resistance into one plant, collect its seeds, and use them to grow an entire field of pesticide-resistant plants. With HEGAA technology, farmers could make an already-grown field resistant by modifying each individual plant on a broad scale.

Criticism of the Insect Allies program

The authors of the Science article critique the Insect Allies program on a variety of grounds, ranging from biological to ethical and moral dilemmas. The article takes issue both with the use of wide-scale genetic modification technologies and with the use of insects as vectors rather than already-existing technologies such as overhead spraying. Wide-scale genetic modification is a line that has yet to be crossed, and it currently lacks a regulatory path: while research into gene-modifying technology is ongoing and real-world tests are inevitable, those tests are a contentious issue still being debated, and agricultural products modified by HEGAAs have no current path to market. The combination of the program’s seemingly little thought toward the regulation its described applications would require, together with the existence of simpler delivery alternatives such as overhead spraying, leads the authors to suspect that Insect Allies is being developed for other means. While a population of gene-modifying insects could be used to help U.S. crops survive weather changes or pesticides, it could also potentially be turned against the crops of other nations in war. Biological weapons were banned in 1972, and currently no nation has (publicly) developed them. While the technologies being developed by Insect Allies are described as “for peaceful means”, the stated goals are achievable through already-existing technologies. Furthermore, international competition spurred by Insect Allies may accelerate crossing the line between peacetime and wartime technology.

Soon after publication of the Science article, Dr. Blake Bextine, program manager for Insect Allies, released a statement disputing many of these points. He stated that DARPA moved into agricultural work because it is an important aspect of both national and international security, and that the work falls under DARPA’s charter to develop fundamentally new technologies that leapfrog existing capabilities. Moreover, he affirmed that Insect Allies has no plan for open release, and that regulatory systems had been planned since the start of the program and would be developed.

What does the future hold

The Science article’s authors note that they would be worried about Insect Allies whether it was under civilian or military purview, but it is impossible to ignore the implications of synthetic biology and genetic modification research for the military. DARPA’s strategy of pursuing high-risk, high-reward research is both effective and ingrained in the DNA of the organization; so, however, is the fact that DARPA is a defense organization.

When DARPA was founded (as ARPA), its purpose was to promote high-risk scientific research that would increase U.S. soft power internationally. After the Mansfield Amendment, these goals shifted from basic toward applied research, and with them came a focus on defense-oriented work. An advantage of basic research is that it takes time to develop, allowing the findings, and their ramifications, to percolate through the global scientific community. The quintessential example is the regulation of recombinant DNA technologies: soon after the technology was developed, the 1975 Asilomar Conference was held to establish voluntary guidelines that would ensure the safety of a game-changing scientific technology. As synthetic biology development has accelerated, the discussion around regulating synthetic biology and genetic modification technology has likewise begun and is currently ongoing.

While it is impossible to argue with the massive benefits that civilian applications of DARPA-developed technologies have provided, synthetic biology and genetic modification technologies have the potential to enact immense changes globally. The environment in which a technology is developed and first applied can influence its use, and the way it is viewed by the public, for generations. The Insect Allies program states that it is developing insect-based HEGAA technologies as a means of advancing gene-editing technologies to increase food security in a transparent manner that promotes openly published research. It is critical that the program be held to this standard, and that regulation by the global scientific community be allowed to shape the direction and application of these potentially game-changing technologies.

 

 

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

November 15, 2018 at 11:22 am

Unlinking databases is not enough to unlink identity from genetic profiles


By: Allison Dennis, B.S.


Source: Pixabay

 

The efficacy of law enforcement is an issue of public safety. Advances in medicine are a matter of personal wellbeing. Knowing more about one’s unique genetic heritage is a point of curiosity. As all of these spheres delve further into DNA sequencing, the ubiquity of personal genetic information is increasingly becoming an issue of privacy. The emerging nature of DNA technology has left us with three major types of DNA databases, separated by their use: medical, forensic, and recreational. Each is governed by its own set of rules, set by federal law, state law, and user agreements, and under specific circumstances data can be intentionally shared for other uses. However, the technological limitations that kept these databases separated in the past may be eroding.

Medical

By aggregating and comparing the genomes of people with and without a specific disease through DNA databanks, researchers can discover small glitches in the DNA of affected patients. Identifying the genetic changes that disrupt the normal functions of the body allows researchers to begin designing therapeutics to correct deficiencies or developing genetic tests to diagnose specific diseases, possibly before symptoms appear. The potential of medical databases has prompted government-led initiatives such as All of Us to amass genetic information from a diverse group of 1 million Americans, which will be queried for medical insights. Already, the Cancer Genome Atlas, maintained by the US National Institutes of Health, contains deidentified genetic data from tumor and normal tissues from 11,000 patients and is openly searchable for research purposes. Foundation Medicine, a private company that provides doctors and patients with genomic profiles of tumor samples to inform treatment options, has stockpiled data from over 200,000 samples, which it shares through collaborative agreements and business partnerships with members of the oncology research community and pharmaceutical companies.

Medical DNA databanks, while masking a patient’s name, may link to an individual’s medical history. Because researchers often do not know which parts of the genome will reveal key clues, the genetic data contained in these databases are rich. Often researchers look at how the frequency of single nucleotide changes at hundreds of thousands of places in the genome differs between people affected and unaffected by a particular disease.
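As a concrete illustration of that kind of comparison, here is a minimal sketch testing whether allele counts at a single position differ between affected and unaffected groups. The counts are invented, and a real study would repeat this across hundreds of thousands of positions with correction for multiple testing.

```python
# A minimal sketch (invented counts) of comparing allele frequency at one
# genomic position between affected ("case") and unaffected ("control")
# groups; real studies repeat this genome-wide with multiple-testing
# correction.
from scipy.stats import chi2_contingency

# Allele counts at a single position (each person contributes two alleles).
#            allele A  allele G
cases =    [     640,      360]
controls = [     520,      480]

chi2, p_value, dof, _ = chi2_contingency([cases, controls])
print(f"chi-squared = {chi2:.1f}, p = {p_value:.2e}")
```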

The medical benefit of compiling and sharing genomic information is carefully balanced against privacy concerns by federal regulation. The Genetic Information Nondiscrimination Act of 2008 (GINA) prohibits employers and health insurers from requesting access to an individual’s or family’s genetic information. The Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule mandates that health-care providers not disclose an individual’s genetic information. The NIH Genomic Data Sharing Policy limits access to individual-level genetic information held in NIH databases, including the Cancer Genome Atlas, to approved scientific researchers. Despite these safeguards, genetic information contained within medical databases can be identified and provided to law enforcement following a court order in extreme cases.

Forensic

Forensic DNA databases contain searchable genomic profiles for the critical task of identification by law enforcement and military experts. U.S. federal law allows law enforcement officers to collect and store DNA profiles from anyone they arrest, including those detained by immigration enforcement. Since 1998, the Federal Bureau of Investigation has hosted the national Combined DNA Index System (CODIS), which currently contains 16.8 million offender and arrestee profiles. Unlike medical databases, which can contain a wealth of information, CODIS profiles are limited to a set of 20 places in the genome where the number of times a small sequence of DNA is repeated varies between individuals. The unique combination of these 20 lengths places the probability of two unrelated people sharing a profile at roughly 1 in 1 septillion, and the loci were intentionally selected to avoid any medically relevant parts of the genome.
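The “1 in 1 septillion” figure follows from simple multiplication across independent loci. A back-of-the-envelope sketch, using an illustrative (not official) per-locus figure:

```python
# A back-of-the-envelope sketch of the CODIS random-match arithmetic: with
# independent loci, per-locus match probabilities multiply. The per-locus
# figure below is illustrative, not an official statistic.
per_locus_match_prob = 0.065  # chance two random people share a genotype
loci = 20                     # number of CODIS core loci

random_match_prob = per_locus_match_prob ** loci
print(f"about 1 in {1 / random_match_prob:.1e}")
# -> about 1 in 5.6e+23, the same order as the figure cited above
```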

The creation of CODIS was authorized by Congress through the DNA Identification Act of 1994, which mandated privacy protection standards. As a safeguard, database profiles are associated with specimen identification numbers rather than any personal information. The system can only be accessed in physically secure spaces and is restricted to use by criminal justice agencies specifically for the purpose of law enforcement. Only after a query has found a match, and the candidate match has been confirmed by an independent laboratory, is the identity of the suspected perpetrator revealed, and even then only to the agencies involved in the case. The Scientific Working Group on DNA Analysis Methods (SWGDAM) continues to recommend revisions to these standards on security and confidentiality issues. Despite housing a relatively unrevealing type of genetic information, CODIS goes above and beyond the privacy protections provided by many recreational and medical databases.

Recreational

Individuals are increasingly turning to direct-to-consumer genetic testing, driven by curiosity to discover their genetic heritage and to gain insight into their genetic traits. These tests contain a wealth of information drawn from single nucleotide changes across more than 500,000 parts of the genome. The most popular tests are offered by AncestryDNA and 23andMe, which manage data according to the industry’s Privacy Best Practices. These practices include removing names and demographic identifiers from genomic records, storing identifying information separately if retained, using encryption, limiting access, and requiring consent for third-party sharing. As the records are presumed to contain medically relevant information, all identified samples are governed by the same HIPAA and GINA regulations that govern medical tests. 23andMe has amassed a database of over 5 million genetic profiles; AncestryDNA has over 10 million, rivaling the size of forensic and medical databases.

Direct-to-consumer genetic testing companies often sell de-identified genetic data to pharmaceutical and diagnostic development companies for research purposes. Those that follow the industry’s Privacy Best Practices only do so for users who have consented to participate in research, and GINA expressly prohibits these companies from sharing an individual’s genetic information with potential employers or health insurers.

There are also limits in place to prevent law enforcement from abusing recreational genetic testing services. While there is the potential for someone to submit a sample that is not their own, the AncestryDNA service agreement stipulates that users only provide their own samples, and 23andMe expressly disallows “law enforcement officials to submit samples on behalf of a prisoner or someone in state custody.” Moreover, the tests have been specifically designed to make collection of a third party’s sample difficult; for instance, the 23andMe test requires an amount of saliva that takes about 30 minutes to produce, discouraging illicit collection.

While companies go to great lengths to protect the information contained in their databases, most will provide individuals with their own complete profiles upon request. The allure of mapping family connections has led millions of genealogical hobbyists to openly contribute their identified genomic data to searchable online databases. The most famous is GEDmatch, which currently contains about one million profiles. The platform allows users to upload their own genome to retrieve high-probability matches with other users’ profiles. A level of privacy is maintained by only sharing small pieces of the genome, allowing complete profiles to remain obscured. However, GEDmatch’s user agreement emphasizes that, rather than use encryption, it stores data in a format that “would be very difficult for a human to read” and allows volunteers access to the data. Additionally, GEDmatch specifically welcomes “DNA obtained and authorized by law enforcement” for inclusion in its database. The wealth of information publicly hosted on sites like GEDmatch has provided a unique opportunity for other types of DNA databanks to share information and blur the lines of privacy.
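To illustrate the matching step, consider the toy sketch below. Real services look for long shared segments inherited from a common ancestor; this simplified version just scores the fraction of matching genotypes at positions both profiles cover.

```python
# A toy sketch of relative matching on a GEDmatch-like platform; the data
# and scoring are simplified for illustration (real services compare long
# shared segments, not single positions).
def fraction_shared(profile_a: dict, profile_b: dict) -> float:
    shared_positions = profile_a.keys() & profile_b.keys()
    matches = sum(profile_a[p] == profile_b[p] for p in shared_positions)
    return matches / len(shared_positions)

uploaded = {"rs1": "AG", "rs2": "CC", "rs3": "TT", "rs4": "AG"}
database = {
    "user_123": {"rs1": "AG", "rs2": "CC", "rs3": "TT", "rs4": "AA"},
    "user_456": {"rs1": "GG", "rs2": "CT", "rs3": "AT", "rs4": "AA"},
}

for user, profile in database.items():
    print(f"{user}: {fraction_shared(uploaded, profile):.0%} of positions match")
```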

Database Cross-Linking

The use of GEDmatch by law enforcement marks an important sea change in genetic privacy. In the past, medical and recreational databases were only occasionally queried by law enforcement seeking specific profiles. However, in April 2018, in a desperate search for leads to solve a cold case, law enforcement officers used a nearly 40-year-old rape kit to develop a genetic profile. While previous searches over the decades had been limited to the FBI database and the perpetrator’s 20 CODIS loci, officials were now able to undertake a blind and expansive search by uploading the complete profile to the GEDmatch database, which ultimately led to a third cousin of the man who would be charged with 12 murders.

These types of searches have the power to exonerate as well as implicate, as a 100% match is undeniable. While such searches are only just starting to be used, for someone of European ancestry living in the United States the odds are as high as 60% that a genetic relative can be identified from a database similar to GEDmatch. A public opinion poll conducted shortly after April 2018 revealed that the majority of respondents approved of law enforcement searches of recreational databases, especially to identify perpetrators of violent crimes.

Scientists have already laid the theoretical groundwork that could allow law enforcement to link a suspect’s profile in a medical or recreational database to the limited 20 CODIS markers from a crime-scene sample. Portions of the genome in close physical proximity along a chromosome are more likely to be inherited together, allowing statistical predictions about which pieces are most likely to occur together. Although the two types of profiles do not contain the same markers, scientists can predict which marker profiles most likely came from the same individual.
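A toy numerical sketch of that prediction logic, with entirely invented conditional probabilities:

```python
# A toy illustration (invented probabilities) of the linkage idea: a SNP
# sitting near a CODIS STR locus is co-inherited with it, so a candidate's
# SNP genotype shifts the likelihood of the STR allele seen at a crime scene.
COND_PROB = {  # P(STR allele | nearby SNP genotype)
    "AA": {"short": 0.7, "long": 0.3},
    "AG": {"short": 0.5, "long": 0.5},
    "GG": {"short": 0.2, "long": 0.8},
}

observed_str = "long"  # STR allele from the crime-scene sample

# SNP genotypes near that locus, from two database profiles.
candidates = {"profile_1": "AA", "profile_2": "GG"}

for name, snp_genotype in candidates.items():
    likelihood = COND_PROB[snp_genotype][observed_str]
    print(f"{name}: P(observed STR | profile) = {likelihood:.1f}")

# Across all 20 loci these per-locus likelihoods multiply, letting analysts
# rank which profile most plausibly belongs to the same individual.
```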

While the use of these tactics might be supported for the purpose of identifying violent criminals, it also puts medical privacy at risk. Despite the de-identification of genomic profiles, scientists have demonstrated reasonable success in tracking down a person’s identity given a genetic profile, a genealogical database such as GEDmatch, and information on the internet.

As DNA databases develop in their depth of information and coverage of individuals, the ability to link records to individuals grows. A lack of compatibility will not be enough to keep medical genomic information sequestered from criminal profiles. Industry standards and user agreements must be discussed and strengthened to safeguard the genetically curious.

 

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

November 8, 2018 at 10:14 am

Science Policy Around the Web – March 06, 2017


By: Liu-Ya Tang, Ph.D.

Source: Pixabay

Technology and Health

Is That Smartphone Making Your Teenager’s Shyness Worse?

The development of new technologies, especially computers and smartphones, has greatly changed people’s lifestyles. People can telework without going to the office and shop online without wandering through stores. While this has brought convenience, it has also generated adverse effects: people tend to spend more time with their devices than with their peers. Parents of shy teenagers ask, “Is that smartphone making my teenager’s shyness worse?”

Professor Joe Moran, in his article in the Washington Post, says that the parents’ concern is reasonable. The Stanford Shyness Survey, started by Professor Philip Zimbardo in the 1970s, found that “the number of people who said they were shy had risen from 40 percent to 60 percent” in about 20 years. Zimbardo attributed this to new technology like email, cell phones, and even ATMs, and described such phenomena of non-communication as the arrival of “a new ice age”.

Contrary to Professor Zimbardo’s claims, other findings show that new technology simply provides a different way of socializing. As an example, teenagers often use texting to express their love without running into awkward situations; texting gives them time and space to digest and ponder a response. Further, Professor Moran notes that Professor Zimbardo’s claim was made before the rise of social networks; shy teenagers can now share their personal lives online even if they don’t talk in public. He also discusses the paradox of shyness: shyness is caused by “our strange capacity for self-attention”, while “we are also social animals that crave the support and approval of the tribe.” Therefore, new technologies are not making shyness worse; on the contrary, social networks and smartphones can help shy teenagers find new ways to express that contradiction. (Joe Moran, Washington Post)

Genomics

Biologists Propose to Sequence the DNA of All Life on Earth

You may think it impossible to sequence the DNA of all life on Earth, but at a meeting organized by the Smithsonian Initiative on Biodiversity Genomics and BGI, the Shenzhen, China-based sequencing powerhouse, researchers announced their intent to start the Earth BioGenome Project (EBP). The news was reported in Science. There are other ongoing big sequencing projects, such as the UK Biobank, which aims to sequence the genomes of 500,000 individuals.

The EBP will greatly help us “understand how life evolves”, says Oliver Ryder, a conservation biologist at the San Diego Zoo Institute for Conservation Research in California. Though the EBP researchers are still working out many details, they propose to carry out the project in three steps. First, they plan to sequence the genome of a member of each eukaryotic family (about 9,000 in all) in great detail to produce reference genomes. Second, they would sequence species from each of the 150,000 to 200,000 genera to a lesser degree. Finally, the sequencing would be expanded to the 1.5 million remaining known eukaryotic species at a lower resolution, which could be improved if needed. The EBP researchers suggest the eukaryotic work might be completed within a decade.
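Tallying the three proposed tiers gives a sense of the project’s scale; the sketch below simply sums the approximate counts quoted above, taking the upper genus estimate.

```python
# A small sketch tallying the EBP's proposed sequencing tiers, using the
# approximate counts quoted in the article (upper estimate for genera).
tiers = {
    "reference genomes (one per eukaryotic family)": 9_000,
    "representative species (one per genus)": 200_000,
    "remaining known eukaryotic species (lower resolution)": 1_500_000,
}
for tier, count in tiers.items():
    print(f"{tier}: ~{count:,}")
print(f"total genomes: ~{sum(tiers.values()):,}")  # roughly 1.7 million
```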

There are many challenges to starting this project. One significant challenge is sampling, which requires international efforts involving developing countries, particularly those with high biodiversity. The Global Genome Biodiversity Network could supply much of the DNA needed, as it is compiling lists and images of specimens at museums and other biorepositories around the world. Because not all DNA in museum specimens is good enough for high-quality genomes, getting samples from the wild would be the biggest challenge and the highest cost. The EBP researchers also need to develop standards to ensure high-quality genome sequences and to record associated information for each species sequenced. (Elizabeth Pennisi, ScienceInsider)

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

March 6, 2017 at 8:41 am