Science Policy For All

Because science policy affects everyone.

Posts Tagged ‘PMI’

Science Policy Around the Web – September 13, 2016


By: Daniël P. Melters, PhD

Giraffe by Muhammad Mahdi Karim through Wikimedia

Conservation Policy

There are four species of giraffe – right?

Recent work published in Current Biology by Axel Janke’s group at Goethe University in Frankfurt, Germany, examined seven genes to determine the genetic relationships among giraffes found throughout Africa. Previously, giraffes had been grouped into subspecies based on their coat patterns, but the genetic analysis showed that over the last 1 to 2 million years, four distinct groups of giraffes have evolved. The authors argue that their findings represent four distinct giraffe species.

This finding has profound implications for our understanding of African biogeography and, in turn, for conservation policy, especially after a recent report that 10% of Earth’s wilderness has been destroyed in the last two decades. But using genetic data to guide conservation policy is a poorly developed area, in part because of our limited understanding of when genetic variation indicates that two groups of animals are indeed distinct species. Genetic analysis showed that forest and savannah elephants are distinct from each other, yet they can form hybrids when they do meet. To avoid conservation limbo, the International Union for Conservation of Nature still considers the African elephant a single species. With regard to the giraffe study, evolutionary biologist Jerry Coyne wrote a critical note on his blog in response to Janke’s article and the subsequent media coverage. In short, the geographical dispersion of giraffes limits the potential for hybrids to form in the wild; yet zoo giraffes can form hybrids without much trouble. (Chris Woolston, Nature News)

US Cancer Moonshot Initiative

Blue Ribbon Report lays out wishlist for moonshot against cancer

Vice President Joe Biden proposed a moonshot to cure cancer last year after his son died of brain cancer. In his last State of the Union address, President Obama vowed to accelerate 10 years’ worth of scientific advances into five years. To create a framework, a blue ribbon panel of the National Cancer Institute’s (NCI) National Cancer Advisory Board (NCAB) consulted 150 experts and reviewed more than 1,600 suggestions from researchers and the public, culminating in a list of 10 recommendations.

One recommendation that stands out is the push for clinical trials for immunotherapy, a promising approach that harnesses the body’s own immune system to fight the disease. Another recommendation is the creation of a new national network that would allow patients across the country to have their tumors genetically profiled and included in a new database. This latter recommendation overlaps with another health initiative that recently came out of the White House, the Precision Medicine Initiative.

This leaves one question unanswered: will Congress fund the moonshot? So far, lawmakers have not included money in the draft spending bill, and inclusion in another bill remains uncertain. With the release of this Blue Ribbon Report, the NCI NCAB hopes to persuade Congress to fund the moonshot. Nevertheless, co-chair Dinah Singer suggests that even without new funding, NCI could begin funding some projects in the report on a small scale. (Jocelyn Kaiser, Science Insider)

Drug Policy

Public libraries frequently used for drug use

Libraries are an ideal location for studying and reading, with their public access, quiet corners, and minimal interaction with other people. An unforeseen consequence is that people who abuse heroin are increasingly using public libraries to do so.

The problem of heroin and painkiller overdoses is a growing epidemic. This was exemplified by a recent controversial picture, made public by Ohio’s East Liverpool police, that made worldwide headlines: it depicted two adults unconscious from a heroin overdose with their 4-year-old son in the backseat. Public libraries are especially exposed because everyone can walk in freely and linger as they please; no transaction or interaction is required. As a result, public libraries are turning to strategies to keep their spaces from being used for drug abuse. The American Library Association encourages libraries to get training on interacting with special populations, such as drug users and the homeless. In addition, librarians are partnering with police and social workers. Altogether, the role of a librarian now includes a mix of first responder and social worker. (Kantele Franko, Stat News)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

September 13, 2016 at 9:06 am

Science Policy Around the Web – August 19, 2016


By: Ian McWilliams, PhD

Photo source: pixabay

Climate Change

Melting ice sheet may expose cold war base, hazardous waste

During the Cold War, the US Army Corps of Engineers began a top-secret mission to determine the feasibility of launching nuclear missiles at Russia from a base in Greenland. The military base constructed for this mission, named Camp Century, lies approximately 125 miles inland from the Greenland coast and was abandoned in 1964 after the Joint Chiefs of Staff rejected the plans to create a nuclear base. When soldiers abandoned the base, it was thought that the leftover fuel and waste material would be safely interred, buried under ice for thousands of years.

However, climate change has now threatened those plans. The increased ice melt could reveal the base as early as 2090, and it is estimated that tens of thousands of gallons of diesel fuel, wastewater, sewage, and other chemicals could be exposed. Adding to concerns is the nuclear generator housed in the frozen base. Although the base never became a site for nuclear weapons, the low-level radioactive coolant from the nuclear generator is still stored there. If ice melt continues at an accelerated rate, some have expressed concern that these chemicals could be released into the environment by seeping into waterways, causing a potential environmental catastrophe. (Stephen Feller, UPI)


Mouse microbe may make scientific studies harder to replicate

Reproducibility is an issue that has been the subject of much debate in the scientific community recently. Now, scientists are concerned that the microbiome may further complicate the issue. The collection of commensal microorganisms that reside on or within the body is referred to as the microbiota, and it is now well known to affect the health of the host. Although researchers have taken meticulous steps to ensure that experimental animals are housed in identical conditions, including sterile bedding, strict temperature control, and standard light cycles, determining how much experimental variability is due to differences in their microbiomes has remained elusive. As researchers explore the issue further, they have found that mice from different vendors have very different compositions of gut bacteria, which could explain some inconsistencies in researchers’ experiments.

Although it is not mandated, taking steps to control for the microbiome may help address the reproducibility crisis. Segmented filamentous bacteria (SFB) have been identified as a notable concern, and some vendors are providing SFB-positive and SFB-negative animals separately. Although it is unlikely that SFB is the only culprit for differences in studies, researchers continue to explore new variables in rodent husbandry in an effort to improve the reproducibility of scientific results. To add to the dilemma, because the species that constitute the microbiome are constantly changing, the microbiome is difficult to characterize and impossible to standardize. Since mice share their microbes by eating each other’s feces, cage-mates can have similar microbiomes, which provides natural microbiota normalization for littermates. (Kelly Servick, Science)

Precision Medicine

Spiking genomic databases with misinformation could protect patient privacy

New initiatives, like the Precision Medicine Initiative (PMI), are helping to turn human genomic data into usable datasets for research purposes. This pursuit is founded upon the willingness of participants to allow their genetic information to be pooled for analysis, but many have expressed concerns over the privacy of this genetic information. It has previously been shown that individuals can be identified from their anonymized genomic data, which has prompted researchers to look for additional security measures. Computer scientists Bonnie Berger and Sean Simmons have developed a new tool to help achieve this goal using an approach called differential privacy. To increase privacy, a small amount of noise, or random variation, is added to the results of a user’s database query. The returned information still provides useful results, but it becomes much harder to conclusively connect the data to a patient’s identity. A similar method has been used by the US Census Bureau and the US Department of Labor for many years.

However, some scientists, including Yaniv Erlich, are concerned that adding noise to the dataset will reduce users’ ability to generate useful results. Erlich stated that “It’s nice on paper. But from a practical perspective I’m not sure that it can be used.” In the search for privacy, free-form access to the data is limited: a “privacy budget” limits the number of questions that can be asked and excludes hundreds or thousands of locations in a genome. Additionally, because noise naturally increases error, it weakens the conclusions that can be drawn from a query. Simmons expects that answers will be close enough to be useful for a few targeted questions. The payoff for the added security is that databases protected this way could be instantly accessible and searchable, cutting down the time spent applying for access to databases such as those managed by the National Institutes of Health. Simmons added that this method is “meant to get access to data sets that you might not have access to otherwise.” The group plans to continue refining the method to balance researchers’ need for access to these data sets with patient privacy. (Anna Nowogrodzki, Nature)
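The mechanics of differential privacy can be sketched in a few lines. The example below is a generic illustration of the classic Laplace mechanism, not Berger and Simmons’ actual tool: a counting query over a made-up cohort gets random Laplace noise scaled to 1/ε, so the answer stays statistically useful while masking any single participant’s presence.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Inverse-transform sample from a Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def private_count(records, predicate, epsilon: float) -> float:
    # A counting query has sensitivity 1: adding or removing one
    # person's record changes the true count by at most 1, so noise
    # with scale 1/epsilon suffices for epsilon-differential privacy.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Toy cohort (entirely hypothetical): does each participant carry a variant?
cohort = [{"id": i, "variant": i % 7 == 0} for i in range(1000)]
noisy = private_count(cohort, lambda r: r["variant"], epsilon=0.5)
# The noisy answer hovers near the true count of 143 without revealing
# whether any single individual is in the data.
```

A smaller ε gives stronger privacy but noisier answers, which is exactly the utility trade-off Erlich worries about.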


Written by sciencepolicyforall

August 19, 2016 at 11:08 am

The Debut of Health Care Data Science


By: Fabrício Kury, M.D.


It is easy for a millennial – a person born between the mid-1980s and the late 1990s – to be unaware of just how young the current methods used in health care research really are. Randomized controlled clinical trials (RCTs), dating only from the late 1940s, are probably younger than most millennials’ grandparents. Case-control methodology and Kaplan-Meier curves only originated in the 1950s, while meta-analyses were only accepted by medical researchers in the late 1970s. Step into the 1980s and early millennials are as old as, if not older than, propensity scores and the concept today called cost-effectiveness research. The term “Evidence-Based Medicine” is as young as a millennial born in the early 1990s, while the late 1990s and 2000s saw the explosion of genomics, proteomics, metabolomics, and other -omics research. Finally, the 2010s so far might be credited as the decade when the term Data Science (“the fourth paradigm of science“) gained widespread currency and established its modern meaning as – long story made short – the practice of producing knowledge out of data that had been created for other purposes.

While the second half of the 20th century transformed health care research into an ever more rigorous and technology-driven science, it also saw the cost of the U.S. health care sector grow unrelentingly, from a comfortable 5% of the Gross Domestic Product in 1960 to a crushing 18% in 2015. Medical bills have become the leading cause of personal bankruptcy in the nation, while life expectancy and other basic health indicators depict a country nowhere close to getting the same bang for each buck as other developed nations. In 2009, the Obama administration prescribed for the health care sector a remedy that had brought efficiency and cost savings to every industry it had previously touched: information technology. The Health Information Technology for Economic and Clinical Health (HITECH) Act (part of the American Recovery and Reinvestment Act of 2009) gave away as much as $36.5 billion of taxpayers’ money to hospitals and physician practices for them to buy and “meaningfully use” electronic health records (EHRs). This outpouring of money was overseen by the Office of the National Coordinator for Health Information Technology (ONC), which had existed since 2004 under a presidential executive order but became solidified as a legislative mandate via HITECH. The act rapidly transitioned the country from mostly paper-based health care in 2008 to near-universal EHR adoption by 2015, giving electronic life, and potential reuse for research, to streams of health data previously dormant in paper troves.

Moreover, in March 2010, the Patient Protection and Affordable Care Act (PPACA, a.k.a. “Obamacare”) was signed into law and, among many other interventions, secured a few hundred million dollars for the creation of the Patient-Centered Outcomes Research Institute (PCORI). PCORI’s mission is to support research that responds directly to the real-life concerns of patients. To that end, among PCORI’s first initiatives was the creation of PCORnet, a network of institutions capable of providing electronic health data for research. Most recently, in January 2015, President Obama announced the Precision Medicine Initiative (PMI). The PMI seeks to assemble a nationwide, representative cohort of 1 million individuals, from whom a wealth of health data will be collected with no definitive goal beyond serving as a multi-purpose, prime-quality dataset for observational electronic research. Meanwhile, private sector-led initiatives such as Informatics for Integrating Biology and the Bedside (i2b2) and Observational Health Data Sciences and Informatics (OHDSI) were also launched with the mission of accessing and doing research on health care’s big data, and their publications can easily be found in PubMed.

These initiatives depict a political and societal hope – or hype? – that information technology, among its other roles in health care as a whole, can make health care research faster, broader, more transparent, more reproducible, and perhaps also closer to the everyday lives of people. One premise is that by using existing EHRs for research, instead of data collected on demand for a particular study, the researcher gets closer to the “real world” individuals who ultimately receive the treatments and conclusions produced by the study. In traditional clinical trials and other studies, the patients who participate are highly selected and oftentimes remarkably unrepresentative of the general population. Moreover, EHR-based research has the potential to investigate more individuals than any previous method could possibly attempt. This broader reach makes rare conditions (or combinations of conditions) not so rare that they cannot be readily studied, and allows subtler variations in diseases to become detectable. On top of that, these studies can be done at the speed of thought. Indeed, EHR-based clinical research recently published in the Proceedings of the National Academy of Sciences (PNAS) has proven feasible at an international scale of hundreds of millions of patients, within a breathtakingly swift time span. Altogether, one can sense in this picture that the billions of dollars spent on HITECH, PCORnet, the PMI, and the NIH’s Data Science research grants might not have been just unfounded hype.

The relationship of IT and health care must, however, reckon with a rather long history of frustrated expectations. In 1968, for example, Dr. Lawrence Weed – the father of today’s prevailing paradigm of patient notes – predicted that in the future all text narratives present in electronic health records would be entered in a structured form that enables scientific analysis. Today, to say the least, we have become less confident about whether such a change is feasible, or even desirable to begin with. In 1987, Barnett and colleagues believed that “relatively simple computational models” could be used to construct “an effective [diagnostic] assistant to the physician in daily practice” and distributed nationwide, but such an assistant is yet to arrive at your physician’s office downtown (although, truth be told, it might be around the corner). While presently teeming with excitement and blessed with incentives, the journey of IT into health care and health care research is invariably one of uncertainties and risks. Health information technology has been accused of provoking life-threatening medical errors, as well as – like previous technological breakthroughs in the history of Medicine, including the stethoscope – harming the patient-physician relationship and the quality of care. The editors of the New England Journal of Medicine early this year went as far as to state that data scientists are regarded by some clinical researchers as “research parasites.”

Moreover, the Federal Bureau of Investigation has reported that medical information can be sold on the black market for 10 times more than a credit card number, while at the same time cybersecurity experts are stunned by the extreme vulnerability of current U.S. health care facilities. This provides sensible ground for concern about patient privacy violations and identity theft now that health records have moved from paper into computers. Unlike a credit card, your medical and identity information cannot be cancelled over the phone and replaced with new ones. Patient matching, i.e., techniques for recognizing that data produced at separate sites refer to the same person, oftentimes confronts blunt opposition from public opinion, while the ultimate ideal of a National Patient Identifier in the U.S. is explicitly prohibited by present legislation (HIPAA). Such seamless flow of interoperable health data between providers, however, is the very first recommendation expressed in 2012 by the Institute of Medicine for realizing the Learning Health Care System – one that revolves around the patient and where scientific discovery is a natural outgrowth of patient care.

With or without attaining the ideal of a Learning Health Care System, the U.S. health care system will undergo transformation sooner or later, by intervention or by itself, because the percentage of GDP spent on health care can only keep increasing for so long. Information technology is at minimum a sensible “bet” for improving efficiency – however, the power of IT for improving efficiency lies not in greasing the wheels of existing paradigms, but in outclassing them with novel ones. This might be part of the explanation for the resistance against IT, although there does exist some evidence showing that IT can sometimes do more harm than good in health care, and here the word “harm” can sometimes mean patient harm. The cold truth is that, in spite of decades of scientific interest in using computers for health care, the health care industry became computerized only very recently, so we remain not far from the infancy of health care informatics. Nevertheless, Clinical Informatics was unanimously approved in 2011 as a board-certified physician subspecialty by the American Board of Medical Specialties, signaling that the medical community sees in IT a permanent and complex duty for health care. Similarly, in late 2013 the NIH appointed its first Associate Director for Data Science, also signaling that this novel field holds importance for health care research. Finally, there might be little that can be done with the entire -omics enterprise, with its thousands upon thousands of measurements multiplied by millions of patients, that does not require data-scientific techniques.

The first cars were slower than horses, and today’s high-speed, road-only automobiles only became feasible after the country was dependably covered with a network of roads and freeways. That network was built not by the automobile producers, but by the government, upon recognition that it would constitute a public good. The same principle could very well apply to health care IT’s important issues with privacy, security, and interoperability, with the added complication that it is easy for an EHR producer to design a solution but then block its users from having their system interact with software from competing companies. Now that health care records are electronic, we need the government to step in once again and build or coordinate the dependable freeways of health care data and IT standards, which will likewise constitute a public good and unlock fundamental potentials of the technology. Health care, on top of its humanitarian dimension, is fundamentally intensive in data and information, so it is reasonable to conjecture that information technology can be important, even revolutionary, for health care. It took one hundred years for Einstein’s gravitational waves to evolve from a conjecture based on theoretical foundations to a fact demonstrated by experiment. Perhaps in the future – let us hope not a century from today! – some of the data-scientific methods such as Artificial Neural Networks, Support Vector Machines, Naïve Bayes classifiers, and Decision Trees, in the hands of the millennials, will withstand the trial of time and earn an entry in the standard jargon of medical research – just as, in their generations, meta-analyses, case-control studies, Kaplan-Meier curves, propensity scores, and the big grandpa of them all, the randomized controlled trial, were similarly accepted.

Written by sciencepolicyforall

July 13, 2016 at 11:15 am

Science Policy Around the Web – May 27, 2016


By: Sophia Jeon, Ph.D.


Drug regulation and rare diseases

FDA delays decision on whether to approve Sarepta drug for Duchenne

The Food and Drug Administration (FDA) is caught between a rock and a hard place in deciding whether to approve the controversial drug eteplirsen for Duchenne muscular dystrophy (DMD). DMD mainly affects boys and is considered a rare disease, affecting fewer than 200,000 people in the US. As the name implies, it is a neuromuscular condition whose symptoms include frequent falling, trouble getting up or running, and learning disabilities. Average life expectancy for those afflicted with DMD is about 25, and there is no cure. Considering these devastating facts, it is easy to understand why DMD patients and their parents would want to hasten the approval of a drug that could potentially save their lives.

On the other hand, it is also easy to understand why the FDA is hesitant to approve this drug. The FDA is a regulatory agency, and one of its missions is to evaluate whether drugs are safe and effective enough to be on the market. The issue is that the study Sarepta Therapeutics conducted to test the efficacy of eteplirsen was not well designed, making it difficult to come to a definitive conclusion that the drug works: the trial involved only 12 patients, without a placebo control group. If the FDA approves this drug under political pressure from various stakeholders, a drug that could prove ineffective for many kids with DMD would give them and their families false hope and decrease the motivation for pharmaceutical companies to develop more effective DMD drugs.

Understanding patients’ needs, the FDA has an expanded access program that, with the agency’s approval, allows patients to try experimental drugs. In addition, the Orphan Drug Act gives pharmaceutical companies more incentive to develop drugs for rare diseases. However, it is clear that patients whose lives are on the line do not think drug development is happening fast enough and are willing to try any option available. In 2015, a bill called the Right to Try Act was even introduced that would allow patients to access an experimental drug without the FDA’s approval. How much should public input or influence be taken into account in a drug approval process? Should the FDA have better strategies for effectively communicating and engaging with patient groups? These are good questions without definitive answers. (Ed Silverman, STATnews)

Research evaluation and bibliometrics

The pressure to publish pushes down quality

Let’s look at our current research culture. Whether you get an academic position, a grant, or a grant renewal depends largely on how much you publish. “Publish or perish” is a phrase frequently used in academic science, and it rings true for many researchers in the US. People evaluate your research and productivity based on the number of your publications and the impact factor of the journals you publish in. Daniel Sarewitz recently wrote in Nature about the negative consequences of promoting this “publish or perish” culture.

The first problem is that increasingly everyone in research, consciously or not, seems to contribute to this culture, and we need a cultural shift, which does not happen overnight. There are, however, efforts to change the way we evaluate science. For instance, the Declaration on Research Assessment (DORA) was initiated by the American Society for Cell Biology, together with a group of editors and journal publishers, to start that cultural shift and spread the realization that an impact factor, or how many times your paper has been cited, cannot and should not accurately reflect your productivity, assess your work’s value, or define your career.

The second problem, which Sarewitz discusses at greater length in his article, is the problem of rigor. Some researchers are so pressured to publish that they end up doing things that are unethical or producing hard-to-replicate findings from experiments that are not rigorously designed. Researchers also sometimes exaggerate the importance of their findings in order to publish, or hand-wave at inconsistencies in their discussion sections. The real harm is done not only when other researchers waste time chasing a false lead, but also when these not-rigorously-tested studies accumulate and adversely affect public health. A notorious example is the study by Andrew Wakefield, who published his (false) claim linking the MMR vaccine to autism. These problems should not stop at being “concerns”; instead, it is time to re-think how we evaluate and do science. Scientists could do better-quality science by spending more time thinking and rigorously testing hypotheses than strategizing how to write an attractive story to publish more in a “high impact” journal. (Daniel Sarewitz, Nature Comments)

Clinical trial design and personalized medicine

Personalized medicine: Time for one-person trials

Biomedical research is in a unique position right now. Recent technological advances have allowed scientists to easily and economically perform whole genome sequencing (WGS), big data analysis, mobile health data tracking, and tissue and cell engineering. These technologies, especially in combination, are powerful tools that not only offer scientific insight into human biology but also open up a number of exciting opportunities for disease prevention and treatment. These are a few of the many reasons the President’s Precision Medicine Initiative (PMI) is gaining so much attention.

With these advanced technologies, scientists are beginning to realize that personalized medicine – not just genetic counseling, but an approach that also includes measures such as your metabolic profile, lifestyle factors, and environmental exposures – is the future of biomedical science. One-person, or N-of-1, trials hope to address a number of issues that current clinical trial designs cannot address sufficiently, such as the fact that people respond differently to drugs and that minority and health-disparity populations have been underrepresented in many US clinical trials. N-of-1 trials could also reduce the ethical concerns of placing patients in a randomized placebo control group, especially when there is no standard of care, because everyone in the trial would receive the experimental drug for a certain period, wait for its effects to wear off, and then receive a placebo for another period so that their responses to the drug can be compared. (Nicholas J. Schork, Nature Comment)
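The logic of an N-of-1 crossover can be sketched in a few lines. The data below are entirely hypothetical symptom scores for a single patient alternating between drug and placebo blocks (with a washout before each switch); the point is that the treatment effect is a within-person contrast, not a population average.

```python
from statistics import mean

# Hypothetical N-of-1 crossover: one patient alternates between drug
# and placebo blocks, with a washout before each switch so the previous
# block's effect wears off. Scores are invented (higher = better).
blocks = [
    ("drug",    [62, 64, 63]),
    ("placebo", [55, 54, 57]),
    ("drug",    [65, 61, 66]),
    ("placebo", [56, 53, 55]),
]

drug_scores = [s for label, scores in blocks if label == "drug" for s in scores]
placebo_scores = [s for label, scores in blocks if label == "placebo" for s in scores]

# The treatment effect for THIS patient is the within-person difference
# in mean response between drug and placebo periods.
effect = mean(drug_scores) - mean(placebo_scores)
```

In a real trial the block order would be randomized and blinded, and the analysis would account for carryover; this sketch only shows why every participant serves as their own control.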



Written by sciencepolicyforall

May 27, 2016 at 1:30 pm

Is there a place for precision medicine in public health?


By: Megan Roberts, Ph.D.

photo credit: BWJones via photopin cc

In January 2015, President Obama announced the Precision Medicine Initiative (PMI), the goal of which is to transform treatment and prevention from a “one size fits all” approach into an increasingly tailored approach that accounts for an individual’s genes, environment and lifestyle. While bipartisan support and PMI leadership have propelled the initiative forward, some public health researchers have voiced skepticism over whether precision medicine will improve public health. Some have even called precision medicine “a distraction from the goal of producing a healthier population.” These arguments often point to the need for public health researchers to address social determinants of health in order to improve overall public health and reduce health disparities in a country that spends more on health care, but endures worse health outcomes, than other developed countries. Furthermore, by definition, public health refers to the prevention of disease and the promotion of health among populations as a whole, which seems antithetical to the “precision medicine” paradigm, with its focus on individual-level nuances. By focusing on individual health, there is a fear that we will lose sight of public health’s goal to improve the health of our whole population, particularly underserved groups. This raises the question: is there a place for precision medicine in public health?

While aspects of the PMI are transformative (e.g., the MATCH trial and the PMI cohort), precision medicine principles are foundational to current prevention and treatment practices. While we might immediately think “genetics” when we hear “precision medicine,” the precision medicine approach is actually much broader than genetics alone, also incorporating an individual’s environment and lifestyle for both disease prevention and treatment. Since the early 1990s, a large body of evidence has demonstrated that the effects of public health interventions are often moderated by individual-level characteristics, including biological, environmental and lifestyle factors. Often, tailored approaches addressing these moderators are more effective than non-tailored approaches. As such, to move public health research forward, we must consider the interactions between individual-level factors and public health strategies. This paradigm reflects the same thinking behind precision medicine, and aligns with conceptual frameworks that drive public health research and practice.

Precision medicine is already incorporated into current disease prevention strategies. Increasingly, cancer-screening programs tailor prevention strategies through targeted, risk-based screening. In a health care system with finite public health resources, targeting cancer prevention efforts to those who will receive the greatest benefit is critical. For example, breast MRI is a highly sensitive breast cancer-screening tool; however, the test has high rates of false positive results. As such, the benefits of breast MRI outweigh the harms only for women who are at high risk for breast cancer. To identify these women, researchers have developed risk models that incorporate individual-level risk factors, as well as genetic tests that detect mutations conferring an increased cancer risk. For those at significantly higher risk of breast cancer, clinical guidelines recommend MRI screening, as breast MRI is cost-effective and improves health outcomes in this setting. A risk-based approach extends to other cancers as well: lung cancer screening is tailored to factors such as smoking history, and HPV vaccination, which prevents cervical and other HPV-related cancers, is targeted to high-risk populations, including men who have sex with men and those with HIV. These screening programs demonstrate an important application of precision medicine in public health, and have led to effective prevention strategies for high-risk groups.
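At its core, the risk-based screening described above is a threshold rule: estimate an individual’s risk from her personal factors, then recommend the more intensive test only when that risk clears a cutoff. A minimal sketch of that logic is below; the function name and structure are illustrative assumptions rather than any specific clinical tool, and the 20% lifetime-risk cutoff echoes the figure commonly cited in clinical guidelines for recommending supplemental breast MRI.

```python
# Illustrative sketch only: assumes a lifetime-risk probability has
# already been produced by a validated risk model (e.g., from family
# history and genetic test results). Not a clinical tool.

def recommend_breast_mri(lifetime_risk: float, threshold: float = 0.20) -> bool:
    """Return True if supplemental MRI screening would be recommended.

    lifetime_risk -- estimated lifetime breast cancer risk as a probability
    threshold     -- cutoff above which MRI's benefits outweigh its harms
                     (0.20 mirrors commonly cited guideline thresholds)
    """
    if not 0.0 <= lifetime_risk <= 1.0:
        raise ValueError("lifetime_risk must be a probability between 0 and 1")
    return lifetime_risk >= threshold

print(recommend_breast_mri(0.25))  # high-risk patient -> True
print(recommend_breast_mri(0.10))  # average-risk patient -> False
```

The same shape of decision rule underlies the lung cancer screening example: swap the risk model inputs (smoking pack-years, age) and the cutoff, and the triage logic is unchanged.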

In addition to prevention, linking individuals to high quality care remains a tenet of public health. Improved understanding of the genetic basis of disease has improved treatment strategies, particularly in cancer care. Today, high quality cancer care relies on targeting treatment using genetic tumor markers. Breast cancer, once viewed as a single disease, is now known to comprise multiple subtypes that can be distinguished by tumor genetics. Conversely, other studies have uncovered similarities between tumors that originate in different organ sites. For example, one study found that lung squamous cell carcinomas, head and neck cancers, and a subset of bladder cancers cluster by gene expression patterns, meaning these cancers share genetic similarities. As such, therapeutics that target specific tumor markers have been developed. There are drugs on the market that target tumor markers occurring in multiple tumor sites, such that a lung cancer patient may receive the same drug as a pancreatic cancer patient who has a similar genetic mutation. This demonstrates a shift toward classifying cancer by a tumor’s genetics rather than by its organ site. Precision medicine programs have emerged that use this treatment approach, and the MATCH trial—a component of the PMI—will elucidate its effectiveness. Similar precision medicine approaches could potentially be extended to other disease areas in the future.

Overall, the use of individualized information in research, prevention and treatment is neither new nor incongruous with the goals of public health. “Precision” public health researchers must ensure that precision medicine is equally accessible to all patients, with a strong focus on dissemination and implementation research around precision medicine approaches. While the PMI and public health priorities may not always mirror one another, it is a mistake to pit public health against precision medicine. The two can synergize toward common goals of disease prevention and control. Precision medicine has helped researchers and clinicians identify important interactions between individual-level factors and life-saving prevention and treatment strategies. Research findings from the PMI will only further this progress and improve population health.

Written by sciencepolicyforall

April 7, 2016 at 12:00 pm

Posted in Essays


Science Policy Around the Web – February 26, 2016


By: Kimberly Leblanc, Ph.D.

photo credit: Alex E. Proimos via photo pin cc

Precision Medicine Initiative

NIH’s 1-million-volunteer precision medicine study announces first pilot projects

On Thursday, the President participated in a panel discussion at the White House Precision Medicine Initiative (PMI) Summit, marking the one-year anniversary of the announcement of the Precision Medicine Initiative, which aims to tailor medical treatments to individuals. The White House and the National Institutes of Health (NIH) announced several pilot projects, including one to work out how to recruit hundreds of thousands of volunteers online. The cohort program is the largest piece of the PMI: a 1-million-volunteer health study that will probe the interplay among genetics, lifestyle factors, and health. Vanderbilt University Medical Center (VUMC) will lead the Direct Volunteers Pilot Studies under the first grant to be awarded in the federal PMI Cohort Program. The university will work out how to engage participants with a website and a phone line for signing up. Verily, formerly Google Life Sciences (renamed in December 2015), in Mountain View, California, will advise the project. To facilitate the contribution of volunteer data, the NIH is launching a program called Sync for Science, which, according to Francis Collins, M.D., Ph.D., director of the NIH, will “pilot the use of open, standardized applications that will give individuals the opportunity to contribute their data to research, including for the PMI cohort.” Sync for Science will include participation by the electronic health records firms Allscripts, Athenahealth, Cerner, Drchrono, Epic, and McKesson, which have committed to deploying the applications required for individuals to donate their health data directly to the PMI cohort, he said. Such technologies will enable individuals to “control and manage their data … coordinate their care among their healthcare providers, and submit their data to researchers if they choose.” The White House also announced a batch of projects being launched by some 40 universities, patient groups, companies, and others to promote personalized medicine.
The PMI “is an all-hands-on-deck operation,” John Holdren, director of the White House Office of Science and Technology Policy, said during the press briefing. “We really need the participation of all of these groups to realize the potential of precision medicine.” (Jocelyn Kaiser, ScienceInsider; a genome web staff reporter, genomeweb)

Public Health and Nutrition

Judge upholds NYC rule on restaurant salt warnings

Justice Eileen Rakower of the New York state Supreme Court ruled to uphold a recent New York City regulation requiring restaurants with 15 or more locations nationwide, as well as concession stands at some movie theaters and sports stadiums, to post a salt-shaker warning symbol next to menu items with more than 2,300 milligrams of sodium. That is the recommended daily limit proposed in the latest Dietary Guidelines for Americans, released in January; yet Americans consume close to 3,440 milligrams a day on average. Most of the sodium we consume is already added to our food, whether it’s in the processed foods we buy at the grocery store or the meals we’re served in restaurants. Mandated salt warnings on menus are intended to make New Yorkers more aware of the link between excessive salt in their diets and high blood pressure, heart disease and stroke, according to health officials.

“I believe that the New York City salt label [on menus] does protect public health,” said Thomas Merrill of the Department of Health & Mental Hygiene. He says it gives people the information they need to make informed choices. New York City adopted the rule in December, and the National Restaurant Association then sued the city’s Board of Health, saying the rule unfairly burdened restaurant owners. In court on Wednesday, Rakower denied the restaurant group’s motion for a preliminary injunction to stop enforcement of the rule. Starting March 1, violators will face $200 fines. Unlike the city’s unsuccessful large-soda ban, she said, the rule does not restrict the use of sodium. S. Preston Ricardo, a lawyer for the restaurant group, said the association intended to appeal. Overall, health officials are happy with the judge’s decision. “This is really good news for the health of New Yorkers,” said Dr. Mary Travis Bassett, the city’s health commissioner. (Karen Freifeld with additional reporting by Jonathan Stempel, Reuters; Allison Aubrey, NPR)

FDA Leadership

Robert Califf confirmed as new FDA head

On Wednesday, after four months of delay, the U.S. Senate approved cardiologist Robert Califf, President Obama’s pick to head the Food and Drug Administration (FDA), by a vote of 89 to 4. The nomination, announced last September, faced significant opposition. Senators Edward Markey (D–MA) and Joe Manchin (D–WV) both used the nomination as a chance to express frustration with the FDA over its response to the epidemic of opioid abuse, arguing that the agency has been too permissive in its approval of prescription opioids and has failed to consult its scientific advisory board in those decisions. Ill will toward the agency over its approval of genetically modified salmon last November led Senator Lisa Murkowski (R–AK) to block the nomination and demand that the FDA put out guidelines requiring the fish to be labeled (Murkowski later lifted her hold). And Senator Bernie Sanders (D–VT) also blocked the nomination, citing concerns that Califf wouldn’t be motivated to help combat the rising cost of prescription drugs. There were also personal concerns about Califf’s long-standing ties to the pharmaceutical industry, which funded many clinical trials he oversaw as an academic, and questions about whether his interests lay more with drug companies than with ordinary patients. In a confirmation hearing last fall, Califf defended his past work and said he had no intention of lowering the FDA’s standards for the safety and effectiveness of drugs and devices. After winning confirmation Wednesday, he spoke in a brief interview with the Washington Post about his priorities as FDA commissioner, including a desire to better explain to the public how the agency operates. (Kelly Servick, Science Insider; Brady Dennis, Washington Post)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

February 26, 2016 at 9:00 am

Science Policy Around the Web – January 29, 2016


By: Daniël P. Melters, Ph.D.

Infectious Diseases

Zika virus, linked to microcephaly, on the rise

Only a few months after the scare of the chikungunya epidemic, a new virus has emerged in the Americas: Zika virus. The same mosquito (Aedes aegypti) that transmits yellow fever, dengue, and chikungunya also transmits this virus. In the last few months of 2015, there was a sharp rise in babies born with microcephaly, a condition marked by an abnormally small head; the virus has also been linked to a rise in the rare neurological disorder Guillain-Barré syndrome. Some hospitals in northern Brazil that would see only five cases a year now see over 300 in six months. The rise in microcephaly cases correlated strongly with an ongoing Zika virus epidemic in the north of Brazil. In addition, Zika virus RNA was found in the amniotic fluid of two fetuses, suggesting that women who became infected during pregnancy passed the virus to the growing fetus. Nevertheless, there is no formal evidence that the Zika virus causes microcephaly. In fact, a recent report argues that increased surveillance of birth defects is to blame for the rise in reported microcephaly cases in Latin America.

This has not stopped local and global authorities from warning people of the potential dangers of the Zika virus. Brazil has suggested its citizens in affected regions not get pregnant. The CDC in the U.S. is warning tourists traveling to regions where Zika virus is epidemic to take precautionary measures against mosquito bites. On Thursday, January 28th, the World Health Organization declared an International Emergency; the last International Emergency was the Ebola outbreak in West Africa. Another complicating factor is the expected increase in the number of mosquitoes due to El Niño. Although most people infected by Zika virus remain asymptomatic, some will develop a rash and a fever. As of now, no cure exists, so researchers around the world are rushing to develop a vaccine. Two potential vaccines against West Nile virus, after being repurposed for Zika, might enter clinical trials as early as late 2016, according to Dr. Fauci (NIH/NIAID) [recent talk by Dr. Fauci on emerging viruses]. But caution about a quick cure is warranted, as it might take several years before a Zika vaccine becomes commercially available. (BBC News website)

Mental Health

One step closer to understanding schizophrenia

Schizophrenia is a debilitating psychiatric disease that affects over two million people in the United States alone. The disease often starts in the later years of adolescence and early adulthood. Delusional thinking and hallucinations characterize schizophrenia, but the drugs available to treat it are blunt instruments, and patients frequently stop taking them because of their side effects. Although a new study will not lead to new treatments in the short term, it does provide researchers with a first firm biological handle on the disease.

The developing human brain is the site of neuronal pruning. At first, the brain makes an excessive number of connections between neurons, but as children grow up, most of these redundant connections are lost. You can see this as a competition between connections in which the strongest survive. Neuronal pruning in the prefrontal cortex, the part of the brain involved in thinking and planning, happens in adolescence and early adulthood. The new study, published in Nature, found that people who carry gene variants that accelerate or intensify that pruning are at higher risk of developing schizophrenia than those who do not. Until now, no specific genetic variant had been pinned down, although the MHC locus had long seemed a likely candidate. Indeed, one gene in this locus, the C4 gene, is involved in neuronal pruning. The C4 gene produces two products: C4-A and C4-B. Too much of the C4-A variant results in too much pruning in mice, which would explain why schizophrenic patients have a thinner prefrontal cortex. These new findings connect the dots better than ever before. The next step is developing drugs that regulate neuronal pruning, in the hope of creating a new class of anti-schizophrenia treatments. (Benedict Carey, New York Times)


Analyzing body chemistry through sweat sensor

A small, wearable sensor has been created that can measure the molecular composition of sweat and send those results in real time to your smartphone. The sensor, a flexible plastic patch, can be incorporated into wristbands. Several labs have been working on such patches for a while, but most could only detect one molecule at a time. This newly developed flexible printed plastic sensor can detect glucose, lactate, sodium, potassium, and body temperature. When the sensor comes into contact with sweat, an electrical signal is amplified and filtered, and the signal is then calibrated against skin temperature. This latter step is essential, according to the lead scientist, Javey. The data is then wirelessly transmitted to your smartphone. Because the sensor is not as accurate as a blood test, rigorous testing will be required before medical use.

The potential of this new device is that it could tell, for instance, a diabetic patient in real time that his blood sugar levels are too low or too high. It could also tell someone who is physically active that she is getting dehydrated and needs to drink water. One particular project could greatly benefit from this new technology. Last year President Obama announced the Precision Medicine Initiative. The goal of this initiative is to enroll over one million American participants and follow them over time to learn about the biological, environmental, and behavioral influences on health and disease. After all, most diseases still do not have a proven means of prevention or effective treatment. Technology that can monitor and track basic biological data in real time could provide a wealth of information to researchers looking to make connections between a person and a disease. (Linda Geddes, Nature News)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

January 29, 2016 at 9:00 am