Science Policy For All

Because science policy affects everyone.


Science Policy Around the Web – July 17, 2018


By: Saurav Seshadri, PhD


source: pixabay

Animal cruelty

New digital chemical screening tool could help eliminate animal testing

An algorithm trained to predict chemical safety could spare over 2 million laboratory animals per year from being used in toxicological screening.  Researchers at the Johns Hopkins University Center for Alternatives to Animal Testing (CAAT) report that their model reproduced published findings at a rate of almost 90% – higher, on average, than replication studies done in actual animals.  The model pools information from various public databases (including PubChem and the European Chemicals Agency or ECHA) to extract ~800,000 structural properties from ~80,000 previously tested chemicals, which can be used to assign a chemical similarity profile to a new compound.  The model then uses a supervised learning approach, based on previous results, to determine what toxicological effects would likely be associated with that compound’s profile.
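
For readers curious about the mechanics, the read-across idea boils down to a nearest-neighbor prediction over chemical fingerprints: find the previously tested chemicals most similar to a new compound and borrow their known outcomes. The short sketch below illustrates only the concept; the fingerprint data are randomly generated placeholders and the off-the-shelf scikit-learn classifier stands in for, rather than reproduces, the CAAT model.

```python
# Minimal read-across sketch: predict a toxicological endpoint for a new
# compound from the labels of its most structurally similar neighbors.
# Hypothetical data throughout; this is NOT the CAAT model, just the idea.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Binary structural fingerprints for previously tested chemicals
# (rows = chemicals, columns = presence/absence of structural features).
n_known, n_features = 1000, 256            # the real database is far larger
fingerprints = rng.integers(0, 2, size=(n_known, n_features)).astype(bool)
is_toxic = rng.integers(0, 2, size=n_known)   # known endpoint labels (0/1)

# Jaccard distance is a common choice for comparing binary fingerprints.
model = KNeighborsClassifier(n_neighbors=10, metric="jaccard")
model.fit(fingerprints, is_toxic)

# Predict the endpoint for a new, untested compound from its neighbors.
new_compound = rng.integers(0, 2, size=(1, n_features)).astype(bool)
print("predicted toxic:", bool(model.predict(new_compound)[0]))
print("estimated probability of toxicity:", model.predict_proba(new_compound)[0, 1])
```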

The principle employed by the tool, i.e. predicting the toxicity of unknown compounds based on structural similarity to known compounds, is called read-across and is not new.  It was a core goal of REACH (Registration, Evaluation, Authorisation and Restriction of Chemicals), an effort by the ECHA in 2007 to gather toxicological information about all chemicals marketed in the EU.  However, limited data (only ~20,000 compounds) and legal hurdles (ECHA claimed that its data was proprietary and prevented an earlier version of the algorithm from being released) delayed the publication of a machine learning-empowered read-across tool until now.  As it stands, the prospective availability of the tool is unclear: the authors intend to commercialize it through a company called Underwriters Laboratories, but claim to have shared it with other academic and government laboratories.

Another key question is how the tool will be received by regulatory agencies, particularly the Environmental Protection Agency (EPA).  In 2016, Congress passed legislation mandating thorough and transparent evaluation of chemical safety and encouraging reduced animal testing.  To these ends, US agencies are already developing databases (such as ToxCast) and predictive modeling tools (for example, through a workshop organized by the Interagency Coordinating Committee on the Validation of Alternative Methods); whether this model can be integrated with those ongoing efforts remains to be seen.  While the authors point out that complex endpoints like cancer are still beyond the scope of the tool and will require in vivo testing, subjecting animals to tests for eye irritation and inhalation toxicity may soon, thankfully, be a thing of the past.

(Vanessa Zainzinger, Science)

 

Gene editing

Controversial CRISPR ‘gene drives’ tested in mammals for the first time

A potentially transformative application of the CRISPR gene editing technology has been given a dose of reality by a recent study.  A gene drive is an engineered DNA sequence encoding a mutated gene as well as the CRISPR machinery required to copy it to the animal’s other chromosome, thereby allowing the mutation to circumvent normal inheritance and spread exponentially in a population.  Researchers quickly recognized the potential of this approach to control problematic populations, such as malaria-carrying mosquitoes, and proof-of-concept studies in insects have been successful.  However, gene drives had never been proven effective in mammals, leaving their applicability to invasive rodent populations unclear.

Now, researchers at UCSD have shown that the approach can work in mice, but with possibly insurmountable caveats.  In order to achieve efficient copying of the mutated gene, the team specifically turned on the DNA-cutting enzyme Cas9 during meiosis, when dividing sperm and egg cells are biased towards using gene insertion to repair DNA breaks.  Using this approach, they were able to boost inheritance of a mutation that produces a white coat in normally dark mice, from 50% to 73% of offspring.  However, due to differences in the mechanisms of sperm and egg production between mice and insects, the effect was only seen in females; even in these, differences in the timing of Cas9 activity between different strains of mice led to inconsistent phenotypes in the offspring (i.e., grey coats).  Ultimately, the low efficiency observed precludes any realistic application to population control in the wild.
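
The arithmetic of inheritance bias helps explain both the promise and the disappointment. In a simple population-genetics sketch (illustrative only; it ignores fitness costs and the female-only copying reported in the study), a heterozygous carrier transmits the drive allele at rate t instead of the Mendelian 0.5, and the allele frequency updates each generation accordingly. Apart from the 50% and 73% transmission figures quoted above, the numbers are invented.

```python
# Deterministic allele-frequency model for an idealized gene drive with no
# fitness cost: heterozygotes transmit the drive allele at rate t instead of
# the Mendelian 0.5.  Starting frequency and generation count are arbitrary.

def spread(p0, t, generations):
    """Return the drive-allele frequency after each generation."""
    p, freqs = p0, []
    for _ in range(generations):
        # Random mating: DD parents (freq p^2) always pass the drive allele,
        # Dd parents (freq 2p(1-p)) pass it at rate t, dd parents never do.
        p = p * p + 2 * p * (1 - p) * t
        freqs.append(p)
    return freqs

for t in (0.50, 0.73, 0.99):
    final = spread(p0=0.01, t=t, generations=20)[-1]
    print(f"transmission {t:.0%}: drive allele at {final:.1%} after 20 generations")
```

Even this idealized curve shows how sensitive spread is to the transmission rate, and the real situation described above (copying only in females, inconsistent Cas9 timing) is considerably less favorable.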

This cautious result may be welcomed by opponents of gene drive technology: environmental activists, fearing the uncontrollable effects of an accidental release, had called for a moratorium on gene drive research at the UN Convention on Biodiversity in 2016.  However, this call was rejected, and groups such as GBIRd (Genetic Biocontrol of Invasive Rodents) are dedicated to finding responsible ways forward.  One example is to restrict use of modified mice to islands, which may be too populated for large-scale pesticide use but still geographically self-contained.  Another compromise, suggested by the authors, is to use the method to speed up production of polygenic disease model mice.  Overall, like CRISPR in general, it appears that population-level gene editing will need substantially more research before it can realize its dramatic potential.

(Ewen Callaway, Nature)

Have an interesting science policy link? Share it in the comments!



Science Policy Around the Web – February 16, 2018


By: Mohor Sengupta, PhD


source: Max Pixel

Antibiotic discovery

A potentially powerful new antibiotic is discovered in dirt

Antibiotics are locked in a constant battle with the pathogens they are meant to eliminate. Bacteria continually mutate their genetic material to acquire resistance to antimicrobial drugs, making multi-drug resistance a global concern, and misuse or overuse of antibiotics accelerates the process. To address the issue, a team of microbiologists at the Rockefeller University in New York has conducted a large screen of natural products produced by soil-dwelling bacteria. According to Dr. Sean Brady, who heads the group, only a small fraction of bacterial biodiversity can be cultured in the lab, and only a tiny fraction of the chemicals those bacteria produce is detectable. Identifying chemicals naturally produced by bacteria that have never been cultured in the lab therefore offers a promising new direction for antimicrobial therapies.

Dr. Brady’s group adopted a “culture-independent”, metagenomics approach to analyze chemicals secreted by unknown bacteria in soil samples. Their aim was not to identify the bacteria in the samples but to look for DNA signatures associated with calcium-dependent antibiotic activity, meaning that the chemical they were seeking would act against bacteria only in the presence of calcium. After identifying a gene potentially encoding a calcium-dependent antibiotic, the researchers cloned it and expressed it in a laboratory-grown bacterium (S. albus J1074). The gene products are a new class of antibiotics that have been named “malacidins”. Dr. Brady’s research has shown that malacidins act by interfering with bacterial cell wall formation and are effective against a range of superbugs, including methicillin-resistant Staphylococcus aureus (MRSA). Calcium-dependent antibiotics are believed to make it more difficult for target bacteria to evolve resistance. The work was published in Nature Microbiology on February 12.
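
Computationally, the heart of a culture-independent screen is a search of raw environmental DNA for sequence signatures tied to the molecule class of interest. The sketch below is a deliberately simplified stand-in: the regular-expression “marker” and the contig names are invented placeholders, not the actual calcium-dependent-antibiotic signature or pipeline used by the Brady lab.

```python
# Simplified sketch of a culture-independent screen: scan assembled
# environmental DNA sequences for a marker associated with a target gene
# family.  The marker below is a hypothetical placeholder.
import re

# Degenerate marker written as a regular expression over DNA bases,
# standing in for a conserved biosynthetic-gene motif.
MARKER = re.compile("GGC[ACGT]{3}GAC[ACGT]{6}TGGTTC")

def screen_contigs(contigs):
    """Return ids of environmental sequences containing the marker."""
    return [contig_id for contig_id, seq in contigs.items() if MARKER.search(seq)]

# Toy data standing in for assembled soil-metagenome contigs.
contigs = {
    "soil_contig_001": "ATGGCATAGACGGCTTTGACAAAGGGTGGTTCTAA",
    "soil_contig_002": "ATGAAACCCGGGTTTAAACCC",
}
print(screen_contigs(contigs))   # candidate sequences for cloning and expression
```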

Conventional methods of isolating new antibiotics from laboratory-cultured bacteria often turn up the same antibiotics over and over again, and such approaches have largely been abandoned in recent years. The novelty of Dr. Brady’s work lies in using natural sources such as soil, sewage, and water to isolate the genetic blueprints encoding antimicrobial chemicals, a task made easier by metagenomics and large-scale sequencing. Researchers elsewhere are also using this approach to identify new antibiotics from natural sources. With deaths from multi-drug-resistant infections rising, this type of research is critical to the rapid discovery of novel antimicrobial therapies. As Dr. Brady warns, getting a newly discovered drug to market will not be fast, but this is an ingenious route to discovering clinically useful antibiotics.

(Sarah Kaplan, The Washington Post)

Risk assessment

He Took a Drug to Prevent AIDS. Then He Couldn’t Get Disability Insurance

Pre-exposure prophylaxis (PrEP) is the practice of taking a drug to prevent HIV infection in people at high risk of contracting it. In 2012, the F.D.A. approved Truvada, a drug originally approved for HIV treatment a decade earlier, for prevention of HIV infection (PrEP). Since then PrEP has become increasingly popular, and as of 2017 an estimated 136,000 people in the United States were on it. Several studies have shown that Truvada is highly effective in preventing HIV infection. However, in the early days of Truvada use, some thought that individuals taking prophylaxis might overestimate its level of protection, leading them to engage in risky behavior they otherwise would have avoided. This belief persists today: several insurance companies across the United States regularly deny disability and life insurance to men on PrEP on the grounds that the treatment is indicative of an increased level of personal risk.

The repercussions of this policy were exemplified when Dr. Philip J. Cheng of Harvard’s Brigham and Women’s Hospital accidentally cut himself while preparing an HIV-positive patient for surgery. The responsible course in this situation is to immediately take steps to prevent infection, and Dr. Cheng did just that by starting PrEP. However, when he applied for disability insurance, he was denied coverage because he was taking Truvada. He could not get the insurance company to cover him even after offering to sign a waiver of benefits in the event that he became infected.

Disability insurance is typically sought by people whose livelihood depends on their income; for someone like Dr. Cheng, it would guarantee income for life in the event of a disability. Truvada has shown no serious adverse side effects to date; in fact, it is said to be safer than aspirin, whose long-term use can cause gastrointestinal bleeding. There is consensus among AIDS doctors across the USA that PrEP is necessary for individuals at high risk of contracting HIV. Dr. Robert M. Grant, whose group led the clinical trial that established the importance of PrEP, has likened denying insurance to PrEP users to denying coverage to people who wear car seat belts. Even more perplexing, life insurance companies regularly cover people with other conditions managed by routine medication, such as diabetes and heart disease, and do not deny coverage to former alcoholics now in recovery.

Bennett Klein, a lawyer with the Boston-based GLAD, an organization of legal advocates and defenders for the GLBTQ community, has asked several insurance companies why they deny coverage to men on PrEP. In interviews with various insurers, he and others have heard a range of answers, some of them ambiguous. The general understanding is that insurance companies are increasingly following this trend because they suspect potential high-risk behavior in PrEP users. The crux, however, is that regardless of risky sexual behavior, PrEP is highly protective. A prominent study published in The New England Journal of Medicine in 2010 showed that tenofovir, one of the chief components of Truvada, reduced the risk of HIV infection by 95 percent. The famous HPTN 052 clinical trial of 2011 also demonstrated the efficacy of antiretroviral drugs in preventing HIV transmission.

Because of such denials, several people, including Dr. Cheng, have stopped using PrEP. Given the clear benefits of the drug, it is critical that this trend be reversed. While some insurance companies do provide disability and life insurance to PrEP users, cases like Dr. Cheng’s discourage people and can lead them to abandon PrEP altogether.

(Donald G. McNeil Jr., The New York Times)

Have an interesting science policy link?  Share it in the comments!


Science Policy Around the Web – February 6, 2018


By: Liuya Tang, PhD


source: pixabay

Cancer treatment

Breast cancer treatments can raise risk of heart disease, American Heart Association warns

Common cancer treatments include surgery, chemotherapy and radiation therapy. Chemotherapy and radiation are typically applied before or after surgical removal of a tumor, or given to patients when surgery cannot be performed. They attack not only tumor cells but also normal cells, which increases the risk of other diseases. A recent report in the journal Circulation warned that breast cancer treatments can raise the risk of heart disease, noting that “breast cancer survivors who are 65 and older and were treated for their cancer are more likely to die of cardiovascular problems than breast cancer.” The possible cardiovascular consequences of breast cancer treatments may not be news to oncologists, but newer cancer treatments have complex side effects that may not be fully understood because they work differently from conventional therapies. For example, immunotherapy stimulates the patient’s immune system to attack tumors, but the surging immune response can sometimes overshoot its target and attack healthy tissues and organs.

Stopping cancer treatment because of side effects is not a good option, since saving one’s life from a dangerous cancer is critical. The question for this double-edged sword is how to blunt one edge while keeping the other sharp. That requires surgeons and oncologists to work together on a personalized treatment plan. As suggested by Dr. Deanna Attai, a breast surgeon at the University of California at Los Angeles, patients with less-aggressive tumors may be able to skip chemotherapy based on test results assessing the cancer’s risk of recurrence. In addition, adopting different ways to deliver chemotherapy drugs and developing more targeted radiation can reduce the risk of cardiac damage for breast cancer patients.

Monitoring the side effects of cancer treatment is not solely the doctor’s responsibility; patients also need to be aware of what treatments they are receiving and what side effects are possible. Mismanaging side effects can aggravate symptoms and lead to severe problems. Newly emerging immunotherapies present a particular challenge to the health care system because their side effects are not thoroughly understood. Doctors’ organizations and nonprofit groups are joining information campaigns to narrow the knowledge gap on immunotherapy, which will help patients better understand their cancer treatment and manage any side effects that occur.

(Laurie McGinley, The Washington Post)

 

Drug development

Racing to replace opioids, biopharma is betting on pain drugs with a checkered past

The opioid epidemic has become a significant problem in the US: 116 people died every day from opioid-related drug overdoses in 2016. To help address it, biopharma companies continue to develop alternative pain drugs. One such class, the NGF inhibitors, was put on hold by the FDA in 2010 because of severe side effects. NGF is short for nerve growth factor, a neuropeptide; when an injury occurs, NGF production increases, which helps the brain perceive pain. In theory, antibodies that bind NGF before it reaches its cell receptors could block its function and thereby treat chronic pain. However, NGF antibodies turned out to be unsuitable for a subset of patients with osteoarthritis, in whom treatment led to dramatic joint deterioration. To obtain the FDA’s approval to resume clinical trials, drug companies showed that NGF drugs would probably be safe for patients not at risk of joint deterioration and should not be taken with nonsteroidal anti-inflammatory drugs such as Advil; the clinical studies resumed in 2015. Will these drugs replace opioids, and will their benefits outweigh their risks? The results will be on the table this year, after drug companies finish their Phase 3 studies.

 

The severity of the opioid epidemic and the great need for non-addictive painkillers have kept drug companies optimistic about developing NGF drugs despite the side effects. However, there are opposing voices. The watchdog group Public Citizen argued that the side effects are obvious and that “further pursuit of testing in humans was an unreasonable course of action”. Criticism also comes from the business side: Leerink analyst Geoffrey Porges has warned that Regeneron’s NGF drug would carry “all of the liabilities” of the past and criticized the company for continuing to pour money into the project. Failure has already been seen in the development of fulranumab, one such NGF antibody. Even if NGF antibodies were approved by the FDA, doctors might hesitate to prescribe a medication with potentially dangerous outcomes for patients with certain conditions.

(Damian Garde, STAT News)

 

 

Have an interesting science policy link?  Share it in the comments!


Science Policy Around the Web – January 30, 2018


By: Kelly Tomins, BSc


By RedCoat (Own work) [CC-BY-SA-2.5], via Wikimedia Commons

Cloning

Yes, They’ve Cloned Monkeys in China. That Doesn’t Mean You’re Next.

Primates were cloned for the first time with the births of two monkeys, Zhong Zhong and Hua Hua, at the Chinese Academy of Sciences in Shanghai. Despite being born from two separate mothers weeks apart, the two monkeys share exactly the same DNA. They were cloned from cells of a single fetus using a method called somatic cell nuclear transfer (SCNT), the same method used to clone more than 20 other animal species, beginning with the now famous sheep, Dolly.

The recently published study has excited scientists around the world by demonstrating the potential for expanded use of primates in biomedical research. The impact of cloned monkeys could be tremendous, providing scientists with a model more like humans for understanding genetic disorders. Gene editing of the monkey embryos was also possible, meaning scientists could alter genes suspected to cause certain genetic disorders. These monkeys could then be used as models to study disease pathology and test innovative treatments, eliminating the differences that can arise from even the smallest natural genetic variation between individuals of the same species.

Despite the excitement over the first cloning of a primate, there is much work to be done before this technique could broadly impact research. The efficiency of the procedure was limited, with only two live births resulting from 149 early embryos created by the lab. In addition, the lab could only produce clones from fetal cells; it is still not possible to clone a primate from cells taken after birth. The future of primate research in the United States is also uncertain. Research on the sociality, intelligence, and DNA similarity of primates to humans has raised ethical concerns about their use in research. The US has banned the use of chimpanzees in research, and the NIH is currently in the process of retiring all of its chimps to sanctuaries. There are also concerns about the proper treatment of many primates in research studies: the FDA recently ended a nicotine study and created a new council to oversee animal research after four squirrel monkeys died under suspicious circumstances. With further optimization, it will be fascinating to see whether this primate cloning method expands the otherwise waning use of primates in research in the United States.

The successful cloning of a primate has also heightened ethical concerns about the possibility of cloning humans. Beyond the many safety concerns, several bioethicists agree that human cloning would demean a human’s identity and should not be attempted. In any case, Dr. Shoukhrat Mitalipov, director of the Center for Embryonic Cell and Gene Therapy at Oregon Health & Science University, stated that the methods used in this paper would likely not work in humans anyway.

(Gina Kolata, New York Times)

Air Pollution

EPA ends clean air policy opposed by fossil fuel interests

The EPA is ending the “once-in always-in” policy, which governed how emissions standards differ between various sources of hazardous pollutants. The policy stems from section 112 of the Clean Air Act, which regulates sources of hazardous air pollutants such as benzene, hexane, and DDE. “Major sources” of pollutants are defined as those that have the potential to emit 10 tons per year of a single pollutant or 25 tons per year of a combination of air pollutants; “area sources” are stationary sources of air pollutants that are not major sources. Under the policy, once a source was classified as a major source, it was permanently subject to the stricter pollutant control standards, even if its emissions later fell below the threshold. The policy was intended to ensure that reductions in emissions continue over time.
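
The major-source cutoff itself is a bright-line numerical test. A rough sketch of the classification logic (a simplification for illustration, not the regulatory text) looks like this:

```python
# Illustrative classification under the Clean Air Act section 112 thresholds:
# a "major source" has the potential to emit 10 tons/year or more of any
# single hazardous air pollutant, or 25 tons/year or more of a combination;
# other stationary sources are "area sources".  Simplified sketch only.

MAJOR_SINGLE_TONS = 10.0
MAJOR_COMBINED_TONS = 25.0

def classify_source(potential_emissions_tons_per_year):
    """potential_emissions_tons_per_year: dict mapping pollutant -> tons/year."""
    single_max = max(potential_emissions_tons_per_year.values(), default=0.0)
    combined = sum(potential_emissions_tons_per_year.values())
    if single_max >= MAJOR_SINGLE_TONS or combined >= MAJOR_COMBINED_TONS:
        return "major source"
    return "area source"

# Under "once-in always-in", a source that ever tripped this test stayed a
# major source; the 2018 change allows reclassification once emissions drop.
print(classify_source({"benzene": 12.0}))                # major source
print(classify_source({"benzene": 6.0, "hexane": 5.0}))  # area source
```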

The change means that major sources of pollution that dip below the emissions threshold can be reclassified as area sources and thus held to less stringent air safety standards. Fossil fuel companies have petitioned for this change for years, and the recent policy change is being lauded by Republicans and by states with high gas and coal production. The EPA news release states that the outdated policy disincentivized companies from voluntarily reducing emissions, since they would be held to major-source standards regardless of how much they emitted. Bill Wehrum, a former lawyer representing fossil fuel companies and current Assistant Administrator of EPA’s Office of Air and Radiation, stated that reversing the policy “will reduce regulatory burden for industries and the states”. In contrast, environmentalists believe the change will drastically increase the amount of pollution plants expel, since standards soften once emissions drop below a certain threshold. As long as sources remain just below the major-source threshold, there will be no incentive or regulation pushing them to lower pollutant emissions further.

(Michael Biesecker, Associated Press)

Have an interesting science policy link?  Share it in the comments!


Science Policy Around the Web – December 1, 2017


By: Kelly Tomins, BSc


source: pixabay

Fake Drugs

Health agency reveals scourge of fake drugs in developing world

The World Health Organization (WHO) released two concerning reports detailing the prevalence and impact of substandard and falsified medical products in low- and middle-income countries. Although globalization has increased the e-commerce of medicine, making life-saving treatments available to a broader population, it has also created a wider and more accessible market for dispensing fake and harmful medicines for profit. Until recently, however, there was no systematic method for tracking falsified medicines on a global scale. The WHO therefore created the Global Surveillance and Monitoring System for substandard and falsified medical products (GSMS). With this program, medicine regulatory authorities can enter information about fraudulent drug incidents into a centralized database, making it easier to understand global trends and to identify the source of harmful products. The WHO also conducted an extensive literature search of nine years’ worth of medicine quality studies to assess rates of fake medicines.

The dual analysis showed that falsified medicines are heavily prevalent, particularly in low- and middle-income countries. The WHO estimates that a staggering 10.5% of medicines in these countries are falsified or substandard, representing $30 billion in wasted resources. Low-income countries are the most vulnerable to this type of exploitation, given their higher incidence of infectious disease and their likelihood of purchasing cheaper alternatives to more reliable, tested medicines. These countries are also more likely to lack the regulatory framework and technical capabilities needed to ensure that medicines are dispensed safely. Reports of fake drugs were not limited to developing countries, however: the Americas and Europe each accounted for 21% of reported cases, highlighting that this is a global phenomenon.

Antimalarials and antibiotics are the two products most commonly reported as substandard or falsified, accounting for 19.6% and 16.9% of total reports, respectively. These figures are especially concerning given a recent report that the number of malaria infections increased over the past year, after a steady global decline from 2000 to 2015, and that deaths from the disease failed to fall for the first time in 15 years. By delivering a dose insufficient to eradicate the malaria parasite from the body, substandard or falsified antimalarials can foster the emergence of drug-resistant strains of malaria, like those recently found in several Asian countries. Overall, the WHO estimates that falsified products may be responsible for 5% of total deaths from malaria in sub-Saharan Africa.

Despite the clear need for action to ensure drug safety around the world, there are many challenges to making it possible. The drug manufacturing supply chain, from chemical synthesis to packaging, shipping, and dissemination, can span multiple countries with highly variable regulatory procedures and oversight. A strengthened international framework and stronger oversight are necessary to ensure that patients receive the drugs they think they are getting and to prevent hundreds of thousands of deaths each year.

(Barbara Casassus, Nature)

Biotechnology

AI-controlled brain implants for mood disorders tested in people

Mood disorders have traditionally been difficult to treat because of the often-unpredictable onset of symptoms and the high variability of drug responses between patients.  Lithium, a popular drug used to treat bipolar disorder, for example, can cause negative side effects such as fatigue and poor concentration, making patients more likely to stop treatment. New treatments being developed by the Chang lab at the University of California, San Francisco and by Omid Sani of the University of Southern California aim to provide real-time, personalized treatment for patients suffering from mood disorders such as depression and PTSD. The approach uses a brain implant that can monitor neural activity, detect abnormalities, and then deliver electrical pulses to a specific region of the brain when needed. These electrical pulses, known as deep brain stimulation (DBS), have already been used to treat other disorders such as Parkinson’s disease. Other groups have tried DBS for depression in the past, but patients showed no significant improvement; in those studies, however, the pulses were delivered constantly to a single region of the brain. What is unique about the new treatment is that pulses are given only when necessary, that is, when the implant detects that the brain is producing abnormal neural activity. The researchers have also found ways to map various emotions and behaviors to specific locations in the brain, and they hope to use that information to more finely tune a person’s treatment. In addition, the algorithms that detect changes in the brain can be tailored to each patient, providing an alternative to the one-size-fits-all pharmacological approaches currently used.
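
The closed-loop principle (record, detect, and stimulate only when a biomarker crosses a patient-specific threshold) can be sketched in a few lines. Everything in the sketch, the simulated signal, the “biomarker”, and the threshold, is invented for illustration and is not the actual device algorithm described in the article.

```python
# Schematic closed-loop deep brain stimulation (DBS): continuously compute a
# mood-related biomarker from recorded neural activity and trigger a
# stimulation pulse only when it crosses a patient-specific threshold.
# All signals, features, and thresholds here are invented for illustration.
import numpy as np

rng = np.random.default_rng(42)

def biomarker(window):
    """Toy 'abnormality' score: mean power of the recorded window."""
    return float(np.mean(window ** 2))

def closed_loop(recording, threshold, window_size=100):
    """Yield (window_index, stimulate?) decisions over a neural recording."""
    for i in range(0, len(recording) - window_size, window_size):
        score = biomarker(recording[i:i + window_size])
        yield i // window_size, score > threshold

# Simulated recording: baseline noise with a transient high-amplitude episode.
signal = rng.normal(0, 1.0, 2000)
signal[800:1000] *= 4.0                      # "abnormal" neural activity

for window_idx, stimulate in closed_loop(signal, threshold=4.0):
    if stimulate:
        print(f"window {window_idx}: biomarker above threshold -> stimulate")
```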

Despite the promise of this personalized treatment, it raises several ethical issues around privacy and autonomy. With such detailed maps of neural activity, the patient’s mind becomes practically an open book to their doctor, leaving them little control over which emotions they share or, more importantly, hide. The patient may also feel a loss of autonomy over their treatment, since the implant itself decides when the patient is displaying an unwanted mood or behavior. The algorithms could even change a patient’s personality for the worse by limiting the spectrum or intensity of emotions the patient can feel. Any manipulation of brain activity is ethically fraught, and although promising, this proposed treatment should undergo intense scrutiny in order to preserve patients’ autonomy.

(Sara Reardon, Nature)

Have an interesting science policy link?  Share it in the comments!


Science Policy Around the Web – September 19, 2017


By: Nivedita Sengupta, PhD


source: pixabay

Global Warming

The world may still hit its 1.5 °C global warming mark, if modelers have anything to say about it

Nature Geoscience published a paper on 18 September delivering what could be good news about climate change and global warming. The team of climate scientists behind the paper claims that it is still possible to limit global warming to 1.5 °C above pre-industrial levels, the aspirational goal of the 2015 Paris climate agreement. The paper argues that the global climate models used in a 2013 report from the Intergovernmental Panel on Climate Change (IPCC) overestimated the extent of warming that has already occurred. After adjusting for this in their model-based calculations, the authors conclude that the amount of carbon humanity can emit from 2015 onwards while keeping temperatures from rising above 1.5 °C is almost three times greater than the IPCC’s previous estimate. The conclusions have significant implications for global policymakers, suggesting that the carbon budget could be met with modest strengthening of the current Paris pledges up to 2030, followed by sharp cuts in carbon emissions thereafter.
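
The bookkeeping behind a carbon budget is simple in outline: the remaining allowable emissions are roughly the warming still “available” before the target, divided by the warming produced per tonne of CO2 emitted (the transient climate response to cumulative emissions, or TCRE). The numbers below are placeholders chosen only to show why a lower estimate of warming to date inflates the budget; they are not the paper’s values.

```python
# Back-of-the-envelope carbon budget:
#   remaining budget (GtCO2) ~= (target warming - warming to date) / TCRE
# All numbers are illustrative placeholders, not the paper's estimates.

TCRE = 0.00045   # assumed deg C of warming per GtCO2 (~0.45 C per 1000 GtCO2)
TARGET = 1.5     # Paris agreement aspiration, deg C above pre-industrial

def remaining_budget(warming_to_date):
    """Remaining emissions (GtCO2) consistent with the 1.5 C target."""
    return max(TARGET - warming_to_date, 0.0) / TCRE

# If models attribute 1.2 C of warming to date, the budget is small; if an
# observationally constrained estimate says 0.9 C, it is roughly twice as big.
for warming in (1.2, 0.9):
    print(f"warming to date {warming} C -> budget {remaining_budget(warming):,.0f} GtCO2")
```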

Some scientists are already challenging the paper’s conclusions, questioning the analysis’s reliance on a period of slower warming, the so-called climate hiatus, that began around the turn of the millennium and continued until 2014. They say that natural variability in the climate system produced lower temperatures during that period, so any estimate of the human contribution to warming based on it may be artificially low. Moreover, the oceans and the land were probably absorbing more carbon than normal during this period, and natural processes could eventually return some of that carbon to the atmosphere. Taking these factors into account reduces the predicted amount of carbon that can be released while keeping temperatures under the 1.5 °C limit. The authors counter that the hiatus should not significantly affect their conclusions; they argue that the multiple methodologies used to estimate the actual warming caused by greenhouse gases should keep the calculations accurate irrespective of any short-term climate variability.

Nonetheless, humanity’s rapid ascent toward the global warming threshold is muddied by modelling scenarios, leaving 1.5 °C a very small target for policymakers and scientists to try to hit by 2030. The fine details of carbon emissions matter when scientists are assessing the precise effects of different greenhouse gases on global warming. Even if the paper’s prediction proves accurate, huge efforts to curb greenhouse-gas emissions will still be necessary to limit warming. As the paper’s author, Dr. Millar, says, “We’re showing that it’s still possible. But the real question is whether we can create the policy action that would actually be required to realize these scenarios.”

(Jeff Tollefson, Nature News)

Debatable Technology

What’s in a face…prediction technology

A paper by genome-sequencing pioneer Craig Venter published on 5 September has drawn heavy criticism and stoked fears about genetic privacy. The paper, published in the Proceedings of the National Academy of Sciences (PNAS), claims to predict people’s physical traits from their DNA. Dr. Venter and his colleagues sequenced the whole genomes of 1,061 people of varying ages and ethnic backgrounds at Human Longevity, Inc. (HLI). Artificial intelligence was applied to each participant’s genetic data in combination with high-quality 3D photographs of the participant’s face. The analysis revealed single-nucleotide differences between participants that corresponded with facial features such as cheekbone height, as well as with height, weight, age, vocal characteristics and skin color. Using this approach, the team could correctly pick an individual out of a group of ten people randomly selected from HLI’s database 74% of the time. Such technology could be tremendously consequential for any agency handling human genome data: simply removing personal identifying information, as is routinely done in practice, would not eliminate the possibility that individuals could be identified from the data itself.

However, reviewers of the paper say that its claims are vastly overstated and that the ability to identify an individual from their genes is hugely overblown. According to the skeptics, knowing age, sex and ethnicity alone can eliminate most of the individuals in a randomly selected group of ten people drawn from a data set as small and demographically diverse as HLI’s. Computational biologist Yaniv Erlich of Columbia University in New York City provided evidence for this by looking at the age, sex and ethnicity data in HLI’s paper. According to his calculations, knowing only those three traits was sufficient to identify an individual out of a group of ten people in the HLI data set 75% of the time, with no information from the genome at all. He concluded that the paper does not demonstrate that individuals can be identified by their DNA, as it claims. HLI countered that it used multiple parameters to identify someone, of which a person’s face is just one.
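
Erlich’s baseline argument is easy to reproduce in spirit with a toy simulation: draw groups of ten people from a population described by a few age, sex and ethnicity categories and count how often those three traits alone single out the target. The category counts below are invented, so the exact percentage will differ from the 75% he computed from HLI’s real cohort; the point is that a purely demographic baseline can already be very high.

```python
# How often do age bracket, sex, and ethnicity alone single a person out of
# a random group of ten?  Category counts are invented for illustration and
# are not HLI's cohort statistics.
import random

random.seed(0)

AGE_BRACKETS = 12      # e.g. 5-year bins
SEXES = 2
ETHNICITIES = 6

def random_person():
    return (random.randrange(AGE_BRACKETS),
            random.randrange(SEXES),
            random.randrange(ETHNICITIES))

def unique_in_group(group_size=10):
    target = random_person()
    others = [random_person() for _ in range(group_size - 1)]
    return target not in others   # True if no one else shares all three traits

trials = 100_000
hits = sum(unique_in_group() for _ in range(trials))
print(f"traits alone identify the target in {hits / trials:.0%} of groups of ten")
```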

The review process the paper underwent is not standard for most journals. By submitting to PNAS as a member of the US National Academies of Sciences, Engineering, and Medicine, Venter was allowed to hand-pick the three reviewers who evaluated his paper. While the issues surrounding the paper are being hotly debated within the scientific community, some fear Venter’s stature will give the paper undue weight with policymakers, who may become overly concerned about DNA privacy and shape rules and regulations accordingly.

(Sara Reardon, Nature News)

Have an interesting science policy link?  Share it in the comments!


Science Policy Around the Web – March 06, 2017


By: Liu-Ya Tang, PhD

Source: pixabay

Technology and Health

Is That Smartphone Making Your Teenager’s Shyness Worse?

The development of new technologies, especially computers and smartphones, has greatly changed people’s lifestyles. People can telework without going to offices, and shop online without wandering in stores. While this has brought about convenience, it has also generated many adverse effects. People tend to spend more time with their devices than with their peers. Parents of shy teenagers ask, “Is that smartphone making my teenager’s shyness worse?”

Professor Joe Moran, in his article in the Washington Post, says that the parents’ concern is reasonable. The Stanford Shyness Survey, started by Professor Philip Zimbardo in the 1970s, found that “the number of people who said they were shy had risen from 40 percent to 60 percent” in about 20 years. Zimbardo attributed this to new technology like email, cell phones and even ATMs, and went so far as to describe this retreat from face-to-face communication as the arrival of “a new ice age”.

Contrary to Professor Zimbardo’s claims, other findings suggest that new technology simply provides a different way of socializing. Teenagers, for example, often use texting to express their affection without running into awkward situations; texting gives them time and space to digest and ponder a response. Further, Professor Moran notes that Professor Zimbardo’s claim was made before the rise of social networks; shy teenagers can now share their personal lives online even if they don’t speak up in public. He also discusses the paradox of shyness: shyness is caused by “our strange capacity for self-attention”, while “we are also social animals that crave the support and approval of the tribe.” New technologies, then, are not making shyness worse; on the contrary, social networks and smartphones can help shy teenagers find new ways to express that contradiction. (Joe Moran, Washington Post)

Genomics

Biologists Propose to Sequence the DNA of All Life on Earth

You may think that it is impossible to sequence the DNA of all life on Earth, but at a meeting organized by the Smithsonian Initiative on Biodiversity Genomics and the Shenzhen, China-based sequencing powerhouse BGI, researchers announced their intent to start the Earth BioGenome Project (EBP). The news was reported in Science. There are other ongoing big sequencing projects such as the UK Biobank, which aims to sequence the genomes of 500,000 individuals.

The EBP would greatly help us “understand how life evolves”, says Oliver Ryder, a conservation biologist at the San Diego Zoo Institute for Conservation Research in California. Though the EBP researchers are still working out many details, they propose to carry out the project in three steps. First, they plan to sequence the genome of a member of each eukaryotic family (about 9,000 in all) in great detail to produce reference genomes. Second, they would sequence species from each of the 150,000 to 200,000 genera to a lesser degree. Finally, sequencing would be expanded to the 1.5 million remaining known eukaryotic species at lower resolution, which could be improved if needed. The EBP researchers suggest the eukaryotic work might be completed in a decade.

There are many challenges to starting this project. One significant challenge is sampling, which requires international efforts from developing countries, particularly those with high biodiversity. The Global Genome Biodiversity Network could supply much of the DNA needed, as it is compiling lists and images of specimens at museums and other biorepositories around the world. As not all DNA samples in museum specimens are good enough for high-quality genomes, getting samples from the wild would be the biggest challenge and the highest cost. The EBP researchers also need to develop standards to ensure high-quality genome sequences and to record associated information for each species sequenced. (Elizabeth Pennisi, ScienceInsider)

Have an interesting science policy link?  Share it in the comments!
