Science Policy For All

Because science policy affects everyone.

Archive for February 2015

Science Policy Around the Web – February 25, 2015

leave a comment »

By: Kaitlyn Morabito

photo credit: Self II via photopin (license)

Forensics – DNA and Crime

Building face and a case on DNA

Although it may seem like a plot line from a prime-time crime drama, police are using DNA to generate sketches of suspects in crimes that lack eyewitnesses or photographic evidence but in which the perpetrator leaves behind DNA. This technique, called forensic DNA phenotyping, uses DNA to predict suspect characteristics, including eye and hair color. Experts hope that as the technology evolves, it will also be able to predict skin color and freckling, as well as traits relating to hair (baldness, curliness), tooth shape, and age. In addition to generating sketches, law enforcement may soon be able to use this technology to compare mug shots in their databases against the DNA sketch. And while forensic DNA phenotyping may not generate an exact likeness of a suspect, it can still be useful for ruling out suspects who do not match the predicted phenotype. Police have already released a DNA phenotyping sketch to the public in a South Carolina case involving the murder of a mother and daughter where there were no cameras or eyewitnesses. Opponents question the technique's accuracy and worry that it may encourage racial profiling. (Andrew Pollack, The New York Times)


Peer Review

Nature to let potential authors try double-blind date

In an effort to reduce prejudice in the peer review process, Nature and its associated journals are adding double-blind review as an option when submitting papers. Under the traditional single-blind peer review currently used by most journals, only the reviewers are anonymous; in the double-blind scenario, the identities of both the authors and the reviewers are concealed. The traditional system has raised concerns about bias, both conscious and unconscious, which may unfairly impact women, minorities, and authors from lesser-known institutions. Nature piloted double-blind review in two of its journals, Nature Climate Change and Nature Geoscience, and had enough success to expand the option to its other publications. However, the publisher notes that this is an ongoing process and that it will evaluate how different fields respond to the double-blind option. Withholding author names from reviewers doesn't guarantee anonymity; it may instead lead reviewers to try to guess the authors' identities, which can be easy in smaller fields. This action by Nature is just one of many strategies aimed at leveling the playing field in the peer review process. Some journals are taking the opposite route and making the process more transparent, either by identifying the reviewers or by using open review, in which comments are published alongside the paper. (Dalmeet Singh Chawla, ScienceInsider)


Regulatory Science

FTC fines marketers of two apps that claim to detect melanoma

Recently, New Consumer Solutions and Health Discovery Corp., marketers of the apps Mole Detective and MelApp, respectively, were fined by the Federal Trade Commission (FTC). These apps claimed to be able to detect melanoma from photos taken with one's phone, giving users a high, medium, or low rating of the likelihood that the moles in the pictures were melanoma. Despite warnings on the apps that users should see a doctor for a real diagnosis, the FTC determined that the apps misrepresented themselves as valid methods of melanoma detection. Both companies have reached settlements with the FTC, with fines ranging from about $4,000 to $20,000. Accuracy of these apps is a real concern, and a real diagnosis should come from a trained dermatologist. A study in JAMA looked at four melanoma apps, which were not identified, and found a 30% misdiagnosis rate. Health apps fall into a gray area in terms of US government regulation, since they are not typically considered medical devices. The FDA has recently released proposed guidelines to regulate apps that can be used as diagnostics. (Hayley Tsukayama, The Washington Post)



Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

February 25, 2015 at 11:24 am

Science Policy Around the Web – February 20, 2015

leave a comment »

By: Nicholas Jury, Ph.D.

Photo credit: Jared Rodriguez / Truthout / Flickr

Academic Records – Privacy Regulation

Open records laws becoming vehicle for harassing academic researchers, report warns

In this digital age of communication, where almost every public document is stored on a server, it is much easier for information concerning academic research to be disseminated to the public. However, a recent report by the Union of Concerned Scientists (UCS), entitled "Freedom to Bully: How Laws Intended to Free Information Are Used to Harass Researchers," states that the same open records laws meant to promote transparency are being used as a weapon by activists and lobbying firms to harass academic researchers with whom they disagree. The report specifically identifies ways in which states may need to revise some of these laws, balancing transparency against the rights of privacy and academic freedom when responding to requests for information.

Journalists and activists have typically used these laws, most notably the Freedom of Information Act (FOIA), to request documents that could expose mismanagement and financial conflicts of interest. However, some groups have requested such documents in an effort to discredit research on topics including water and air pollution, climate change, genetically modified organisms, and gun violence.

Recent attempts include a request by former Virginia Attorney General Ken Cuccinelli, a climate change denier, to obtain documents from the University of Virginia for research conducted by Michael Mann. Michael Halpern, a program manager with the UCS, says, "If lawmakers, universities, and researchers develop a shared understanding of what they should disclose and a system for proactively doing so, they can avoid costly and time-consuming lawsuits and other battles. And that, in turn, will allow researchers to get back to what they are supposed to be doing: learning more about our world." Source: Puneet Kollipara (Science Insider)


Source: Val Altounian, Science


Public Health – Ebola

Rapid test for Ebola now available

The World Health Organization (WHO) has just approved a new rapid diagnostic test for detecting the Ebola virus. The test is particularly useful in remote areas without electricity, far from a well-equipped laboratory. Prior to this rapid test, the only available technology for detecting the virus was a PCR-based test that required a significant amount of blood from a needle draw, and results could take more than a day. The new test takes only 15 minutes and requires just a few drops of blood from a finger prick.

The new test is produced by Corgenix, a Colorado-based company, and uses antibodies to detect a specific Ebola virus protein. Each test will cost roughly $15, says Robert Garry, a disease expert at Tulane University in New Orleans. The WHO has determined that the kit correctly identifies 92% of people infected with Ebola. This rapid test could assist health care workers and public health officials in identifying new hotspots of the Ebola outbreak. Source: Gretchen Vogel (ScienceInsider)

Source:  Ashley Fisher / Flickr


Federal Research Policy

A new shot at reducing research red tape

Running a research laboratory can be tough these days, with limited and highly competitive funding. On top of trying to maintain funding levels to keep a laboratory productive, there are the requirements of reporting progress to government entities. Scientists have long complained that federal oversight can hinder their research. Perhaps it is time for changes that make federal oversight of research work better.

A new panel at the National Academies is charged with determining exactly how the government monitors its nearly $40 billion per year investment in research. The panel considered a 2005 report finding that some researchers spend up to 42% of their time meeting reporting requirements for federally funded research projects.

One concern is that the federal government doesn't provide enough funding for universities to comply with new rules. Such rules may cost universities about $4,000 extra per student each year, said Arthur Bienenstock, physics professor and special assistant to the president of Stanford University. Larry Faulkner, president emeritus of the University of Texas at Austin, spoke at the National Academies panel and said, "It would be a mistake to think that the only purpose of this study is to lighten the regulatory burden on universities. Regulation is required, it's justified, and it's needed. What we're trying to do is guide both government and higher education to find more efficient ways to address those needs." Source: Jeffrey Mervis (ScienceInsider)




Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

February 20, 2015 at 8:01 pm

You Are What You Eat – The 2015 Dietary Guidelines for Americans

with 2 comments

By: Amanda Whiting, Ph.D.

photo credit: MyPlate Logo via photopin (license)

Many Americans, at one point or another, have probably heard about the “food pyramid” and know that it has something to do with what the government says makes up a healthy diet. But have you ever wondered where those federal food and nutrition guidelines actually come from, or what information they’re based on, or just who gets to decide what is “healthy” for everyone?

The United States Department of Agriculture's (USDA) Center for Nutrition Policy & Promotion (CNPP) is responsible for the general nutritional guidelines for Americans. The most recent nutritional icon, called "MyPlate," was released on June 2, 2011 by First Lady Michelle Obama and USDA Secretary Tom Vilsack, replacing the previous "MyPyramid" model. The visual guide features a colorful plate divided into approximate portions for fruits, vegetables, grains, protein, and dairy. The dinner-table imagery is meant to help children, parents, and other adults prioritize their food choices at meal times to include all of the groups listed, consumed in proportions relative to each other (e.g., half a plate of fruits and veggies), as a model diet to promote good health.

The nutritional recommendations behind the simpler MyPlate came from the 2010 Dietary Guidelines for Americans (DGA), a policy document jointly produced by the USDA's CNPP and the United States Department of Health and Human Services (HHS) Office of Disease Prevention and Health Promotion. Since the publication of the first edition in 1980, the Guidelines have been updated every five years to reflect the most current knowledge on health and nutrition. A Dietary Guidelines Advisory Committee (DGAC), made up of 13 to 17 nationally recognized experts in health and nutrition, meets in the year prior to each update to discuss what should be included, removed, or revised in the guidelines, conducting a thorough review of the scientific and medical literature and soliciting comments from the public. A scientific report containing the DGAC's recommendations for the next edition is then written and delivered to the Secretaries of the USDA and HHS. The next revision to the guidelines is currently in progress, with the final report due to the USDA and HHS by early 2015.

As with any guide that tries to cater to a population as large and diverse as the American public, MyPlate and the Dietary Guidelines are not without disagreements and multiple opinions. Everyone likes to think that how they eat is “healthy” – be it vegetarian, fruitarian, vegan, gluten-free, dairy-free, carnivore, paleo, primal, veggie-free, or what-have-you. In addition, MyPlate has been criticized for removing a reference to physical activity, another important contributor to good health, which was present on the MyPyramid icon as a person climbing stairs.

While some people might be of the opinion that what the federal government says is "good food" and "healthy" isn't all that important (because they're going to eat however they want anyway), the Dietary Guidelines for Americans does play an important role in public health. In addition to providing guidance for the general public's own consumption, the DGA is the document used to set nutrition-related policy within the government. In the USDA, the dietary guidelines set standards for school lunch and other feeding programs such as the Supplemental Nutrition Assistance Program (SNAP) and the Women, Infants, and Children (WIC) program. Within HHS, the DGA is used by parts of the National Institutes of Health (NIH) to produce consumer information materials supporting healthy lifestyles for various diseases (such as hypertension), while the Food and Drug Administration (FDA) uses parts of the DGA as the basis for the Nutrition Facts information found on all packaged food. Thus, it is important that the final DGA and the recommendations made by the DGAC are firmly based on rational, scientific facts and arguments and are not unduly influenced by groups with their own interests at heart.

This influential effect on other governmental policies is what makes the content of the DGA itself very political. It seems that every step toward recommendations based solely on scientific evidence for advancing human health is met head-on by opposition from groups with powerful incentives to make money and/or preserve the status quo. As one example, the 2015 DGA will likely include a recommendation that sugar be limited to no more than 10% of a person's daily calories. No previous edition of the DGA has included a recommended upper limit on daily sugar consumption, which is why there is no number for % daily value (%DV) for sugar on any food product nutrition label. Meanwhile, the World Health Organization (WHO) is currently in the process of updating its guidelines on sugar consumption. This guidance, expected to be published in early 2015, suggests that reducing sugar consumption from less than 10% of total energy intake per day (the current 2002 guideline) to below 5% would have additional health benefits for body mass and tooth decay. For an average adult, the 5% mark is equivalent to approximately 25 g of sugar per day or less. The American public currently consumes an average of 126 g of sugar per day, much of it from added sugars in processed foods and, specifically, from sweetened beverages. Success in this one area, reducing Americans' consumption of sugar-sweetened beverages, could have a significant impact on the overall health and body mass of Americans. Not surprisingly, the beverage industry has pushed back against the inclusion of any specific limits on added dietary sugar (among other concerns) in the newest DGA.
The American Beverage Association has submitted public comments for the DGA, suggesting that the WHO-commissioned review lacked scientific evidence and that the setting of Dietary Reference Intakes (DRIs) is not the responsibility of the DGAC and therefore should be done by other organizations. Similar arguments have been made by the Grocery Manufacturers Association, the Juice Products Association, the National Council of Farmer Cooperatives and the Sugar Association among others.
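The arithmetic behind these gram figures is straightforward. Here is a minimal back-of-the-envelope sketch (assuming a 2,000 kcal reference diet and the standard 4 kcal per gram of sugar; the function name is illustrative, not from any official source):

```python
# Convert a sugar limit expressed as a percentage of daily calories
# into grams per day, assuming a 2,000 kcal reference diet and
# 4 kcal per gram of sugar (the standard Atwater factor).
KCAL_PER_GRAM_SUGAR = 4

def sugar_limit_grams(percent_of_calories, daily_kcal=2000):
    limit_kcal = daily_kcal * (percent_of_calories / 100)
    return limit_kcal / KCAL_PER_GRAM_SUGAR

print(sugar_limit_grams(10))  # 10% of calories -> 50.0 g/day
print(sugar_limit_grams(5))   # 5% of calories  -> 25.0 g/day
```

On a 2,000 kcal diet, the WHO's 10% and 5% marks work out to 50 g and 25 g of sugar per day, both well below the 126 g Americans currently average.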

Other groups have tried to take the politics out of what we should eat and focus just on what the science of nutrition says. “Unfortunately, like the earlier U.S. Department of Agriculture pyramids, MyPlate mixes science with the influence of powerful agricultural interests, which is not the recipe for healthy eating,” said Walter Willett, professor of epidemiology and nutrition and chair of the Department of Nutrition at the Harvard School of Public Health (HSPH)1. HSPH released its own version of MyPlate known as the “Healthy Eating Plate”. This plate featured even more vegetables compared to fruit, an even split between grains and healthy protein, an emphasis on drinking water over dairy, and indicated that healthy oils should also be consumed. It also included a direction to “Stay Active” as a part of a healthy lifestyle. The goal of the Healthy Eating Plate is to give more specific information for a healthy diet in a way that is as clear and intuitive to follow as the MyPlate icon, without influence from the food industry or agricultural policies.

What one eats (and what one does) on a daily basis has a profound impact on one’s overall health and quality of life. “One of the most important fields of medical science over the past 50 years is the research that shows just how powerfully our health is affected by what we eat. Knowing what foods to eat and in what proportions is crucial for health,” said Anthony Komaroff, a professor of medicine at Harvard Medical School and editor in chief of Harvard Health Publications1. It will be interesting to see what the recommendations for the 2015 update to the Dietary Guidelines are and what recommendations actually make it into the final document. At the end of the day, what you choose to eat is up to you. However, everyone is entitled to accurate information about the health consequences of their personal food choices. Regardless of how you eat or what diet you follow, we are all human and the basic principles for good health and longevity remain the same for everyone. Like it or not, you are what you eat.


Written by sciencepolicyforall

February 18, 2015 at 9:00 am

Posted in Essays


Science Policy Around the Web – February 17, 2015

leave a comment »

By: Sara Cassidy, M.S., Ph.D.

Healthcare Policy

Everyone wants a piece of the action: The nation’s largest health insurer denied entrance into California’s marketplace

When the Affordable Care Act was initiated, it encouraged states to adopt their own insurance marketplaces to promote competitive pricing. Citizens and insurers alike were skeptical. To incentivize insurers' participation from the outset, California was one of a few states to impose a three-year waiting period on companies that did not take part on opening day. Covered California surpassed its enrollment targets in 2014 and is on track to increase enrollment by ~29% in 2015. UnitedHealthcare wants a piece of that pie, but its request to sell insurance statewide ex post facto was recently turned away by the Covered California advisory board. Consumer advocacy groups support the move by Covered California, as they believe the original insurers took a big risk as first adopters, knowing that they would likely be signing up an unequal share of sicker-than-average people. However, California's insurance commissioner believes restricting access is bad for business, since more competition would likely drive down prices for individuals. The outcome from the Covered California advisory board was a compromise: plans will be able to apply for entrance into the statewide market in 2016 if they were newly licensed after August 2012 (when the first adopters opted in) or are managed care plans for Medicaid. Otherwise, companies can apply to offer coverage in regions of the state where fewer than three carriers currently offer plans. This will allow large insurers like UnitedHealthcare only limited participation in the marketplace until the moratorium is over in 2017. (Michelle Andrews, NPR; Victoria Colliver, SF Gate)


Regulatory Policy

Digital health monitors avoid FDA regulation

The wearable electronic health market (FitBit, smart phone apps, and the like) is predicted to be worth $11.6 billion by 2020. Sustained lobbying by Apple, Intel, and other digital health monitor companies has successfully persuaded the Food and Drug Administration to stay out of their business. Recently, the FDA agreed not to regulate technologies that receive, transmit, store, or display data from medical devices, as well as most mobile medication applications. However, the industry and some members of Congress want more than promises; they want laws that assure the industry it can innovate and sell its products without government interference. For apps that monitor exercise routines and count calories, FDA regulation seems an overreach, but what about apps that let diabetics plug glucometers into their smartphones to track insulin levels, or apps that take electrocardiograms to record cardiac events? These already exist, and they sound like the types of medical devices the FDA would normally regulate to ensure safety and accuracy. And as the market expands, it is easy to imagine it becoming increasingly difficult for patients and physicians to evaluate the quality and utility of these devices. As we enter the era of precision medicine, the mobile health market is poised to play a major role in personalized healthcare, but that position could be compromised without FDA oversight. (Ashley Gold, Politico; Cortez et al. (2014) NEJM 371:372-379; FDA)


Energy Policy

Republicans sign Keystone Pipeline Bill, Obama expected to veto

Speaker John Boehner (R-Ohio) staged a signing ceremony on Friday the 13th for a bill approving construction of the Keystone XL pipeline, but the legislation won't be forwarded to the White House until after the holiday weekend. President Obama is expected to veto the bill. The State Department has just finished collecting comments on whether the project is in the nation's best interest, and once Secretary of State John Kerry finishes reviewing them, they will be sent to Obama, who will make the final decision. Opponents of the pipeline are optimistic that Obama will reject the project, based on his recent negative comments. "It's very good for Canadian oil companies, and it's good for the Canadian oil industry but it's not going to be a huge benefit to U.S. consumers, it's not even going to be a nominal benefit to U.S. consumers," Obama said in December. Despite this, the pipeline developer, TransCanada, has vowed not to give up on the project if Obama rejects the bill. (Laura Barron-Lopez, The Hill)



Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

February 17, 2015 at 11:24 am

Three’s a Crowd: Should We Cross This Line?

leave a comment »

By: Tamara Litwin, Ph.D.

photo credit: Mitochondria via photopin (license)

The British House of Commons recently voted to approve techniques that will enable couples to conceive children without inheriting deleterious mutations in their mitochondrial DNA1. In the United States, the FDA is considering the safety and efficacy of the same techniques, which raise a variety of medical and ethical issues because they involve manipulation of the human genome2. There are two closely related techniques, both known as mitochondrial replacement therapy, that aim to cure mitochondrial DNA disorders by combining mitochondrial DNA from a woman with healthy mitochondrial DNA, nuclear DNA from a woman with a mitochondrial DNA disorder, and nuclear DNA from a man.

Mitochondrial replacement therapy will create children who, in a sense, have three parents, because the healthy mitochondrial DNA is donated by a woman who would not otherwise be related to the child3. What are the biological implications of this three-parent model? Mitochondrial DNA contains fewer than 17,000 of the roughly 3 billion base pairs that make up the total human genome, so it accounts for only a tiny fraction of a person's DNA. Furthermore, mitochondrial DNA resides in the mitochondria, a compartment of the cell separate from the nucleus, where the nuclear DNA (all the rest) is located. While few in number, mitochondrial genes are essential to the biological processes by which cells extract and store energy from oxygen and sugar4. When the mitochondria do not function properly, the result can be mitochondrial myopathy, a family of disorders with symptoms including muscle weakness, vision problems, heart problems, and others5. Mitochondrial disorders may also accelerate aging6.
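That proportion is easy to check with quick arithmetic. A minimal sketch, assuming the commonly cited figures of ~16,569 base pairs for human mitochondrial DNA and ~3 billion for the nuclear genome:

```python
# Rough share of the genome contributed by the mitochondrial donor
# (human mtDNA is ~16,569 base pairs; the nuclear genome ~3 billion).
MT_BASE_PAIRS = 16_569
NUCLEAR_BASE_PAIRS = 3_000_000_000

fraction = MT_BASE_PAIRS / (MT_BASE_PAIRS + NUCLEAR_BASE_PAIRS)
print(f"{fraction:.6%}")  # well under a thousandth of a percent
```

By this estimate, the mitochondrial donor contributes roughly 0.0006% of the child's total DNA, which is why proponents argue the "three-parent" label overstates her genetic contribution.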

Mitochondria are inherited solely from the mother because mitochondria from the sperm are actively degraded7. Therefore mitochondrial DNA disorders are passed down to every child (both boys and girls) conceived by an affected woman. The disorders can vary in severity among children in the same family. The potential benefits of the new mitochondrial replacement therapy to the children of mothers with mitochondrial disorders are enormous. One intervention at or before conception can prevent a lifetime of symptoms of mitochondrial dysfunction. Even better, any daughters will not have to face the same decision when they grow up and hope to start their own families, because they will pass on the healthy mitochondria to their children.

Mitochondrial replacement therapy has the potential to help specific families stop the chain of transmission of mitochondrial disorders. It is a medical technology that is not very different from traditional in vitro fertilization (IVF) techniques and does not lie on a slippery slope to ethically challenging eugenics techniques. If we have the technology to help these families, how can we withhold it on the grounds that someone may someday misuse the technology for other purposes? The technology already exists. However, it is worth proceeding cautiously because there could be health ramifications to manipulating eggs in this way. Animal studies have already shown the growth of healthy offspring for several generations with this technique8, which is very promising, but it would be wise to generate more data in animals before proceeding with human trials of this technique in the United States. The FDA can also watch the developments across the pond to help determine when and how to introduce mitochondrial replacement therapy to the United States.


Written by sciencepolicyforall

February 13, 2015 at 10:01 am

Science Policy Around the Web – February 13, 2015

leave a comment »

By: Julia Shaw, Ph.D.

Climate Change – Geoengineering

Elite science panel calls on U.S. to study climate modification

A panel of experts selected by the National Research Council of the U.S. National Academy of Sciences recently released a report recommending government-sponsored research on the "risks and benefits" of geoengineering to alter albedo. "Albedo" in this context refers to how much light the Earth reflects back into space, and albedo modification is one potential approach to counteracting the effects of global warming. Marcia McNutt, editor-in-chief of Science, former director of the U.S. Geological Survey, and chair of the committee, stressed that the recommendation reflects a harsh reality and the need for action: "That scientists are even considering technological interventions should be a wake-up call that we need to do more now to reduce emissions, which is the most effective, least risky way to combat climate change." The report supports "small-scale field experiments," provided that an appropriate body governing geoengineering research and the surrounding ethical issues is established, and recommends that researchers design studies to simultaneously improve basic knowledge of climate regulation. Albedo experiments could include injecting large volumes of sulfate particles into the atmosphere and whitening marine clouds by introducing salt particles into coastal cloud belts. A separate report released by the National Research Council addressed carbon dioxide removal, a much less controversial tactic for moderating the effects of climate change. However, technological and financial hurdles continue to beleaguer removal approaches. (Chris Mooney, The Washington Post)


Agricultural researchers rattled by demands from group opposed to GM foods

Last month U.S. Right to Know (USRTK), a nonprofit opposed to genetically modified (GM) food, targeted at least four universities with freedom of information requests asking administrative officials to hand over all correspondence between certain researchers and specific companies associated with GM foods, including Monsanto, Syngenta, DuPont, and Dow, as well as the public relations firms Fleishman-Hillard and Ogilvy & Mather. The researchers involved had all written articles for GMO Answers, a website supported by food and biotechnology firms. According to Gary Ruskin, executive director of USRTK, the goal of the correspondence search is "to learn how these faculty members have been appropriated into the PR machine for the chemical-agro industry." Although Kevin Folta, a researcher at the University of Florida, Gainesville, is willing to hand over his records, he is wary of USRTK's intentions, saying, "They'll report, 'Kevin Folta had 200 e-mails with Monsanto and Syngenta' as a way to smear me." Many researchers are waiting for the final say from university lawyers before responding to USRTK's requests. Although USRTK contends its requests are only meant to increase transparency, some researchers are concerned about repercussions for academic freedom. As another targeted researcher, Alison Van Eenennaam of the University of California, Davis, put it, "Your first inclination . . . is to stop talking about the subject. But that's what they want. And I don't want to be intimidated." (Keith Kloor, ScienceInsider)

Health and the Environment

In Nevada, a Controversy in the Wind

Two geoscientists from the University of Nevada, Las Vegas, Brenda Buck and Rodney Metcalf, together with Francine Baumann, an epidemiologist from the University of Hawaii, are raising concerns about the effect of naturally occurring asbestos on cancer incidence in Nevada. Historically, naturally occurring veins of asbestos were actively mined; the health risks of such activity are now more fully appreciated. Asbestos fibers are easily inhaled and lodge in the lungs, causing inflammation that can lead to mesothelioma and other respiratory diseases over time. Buck and Metcalf previously published research identifying numerous asbestos deposits in the state, including potentially harmful asbestos fibers near Boulder City, eastern Henderson, and Las Vegas, possibly spread by natural erosion and commercial development in the area. More recently they teamed up with Dr. Baumann, who used data from Nevada's cancer registry to draft a preliminary report in 2012 noting an unusually high number of mesothelioma cases in younger residents and in women in the aforementioned areas, which suggested exposure to asbestos at an early age. The response from the Nevada Department of Health was unequivocal: the department revoked Dr. Baumann's access to the state cancer registry and, under threat of legal action, forced her to withdraw an abstract and cancel a pending presentation of the findings at the Geological Society of America's national meeting. Department officials maintain that their own analysis found no significant asbestos risks and further contend that "Dr. Baumann gave too much weight to a few anomalous cancer cases." Nonetheless, the Nevada Department of Transportation delayed plans for a highway project through Boulder City, and the health department recently increased its monitoring of airborne fibers in southern Nevada. Determined to publish what they saw as an important health concern, Dr. Baumann and her colleagues instead evaluated cancer data reported to the Centers for Disease Control and Prevention. On Tuesday their study, which reported elevated rates of mesothelioma in adults under age 55 and increased disease rates in women, all in southern Nevada and all potentially linked to exposure to naturally occurring asbestos, was published in the Journal of Thoracic Oncology. Rather than run from controversy, Dr. Baumann believes that "with public health research, the important thing is getting information into the open and then discussing it." (Deborah Blum, The New York Times)


Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

February 13, 2015 at 9:00 am

Science Policy Around the Web – February 10, 2015

with one comment

By: Agila Somasundaram, Ph.D.

photo credit: ynse via photopin cc

Precision Medicine

The Problem with Precision Medicine

President Obama unveiled the Precision Medicine Initiative last week. Precision medicine, i.e. personalized, genetics-based medical treatment, would deliver “the right treatments at the right time, every time, to the right person.” But is the science ready for it? DNA testing is increasingly used to detect and treat various diseases, including cancer and birth defects, and the cost of genetic analysis has dropped significantly. But many doctors are not adequately trained to interpret the data correctly, make the right connections between DNA and disease, and communicate the results successfully to their patients. Incorrect diagnoses based on genetic data are common; children have been mistakenly diagnosed with serious syndromes. Sometimes mistakes cause greater harm than simply increasing the anxiety of patients or their loved ones: in 2012, The Cancer Journal described the case of a woman who underwent major surgery because her genetic test results were not interpreted correctly. There is a paucity of genetic expertise among physicians, partly because most currently practicing physicians went to medical school before the human genome was sequenced, when only a handful of genes had been associated with diseases. “It’s very complicated, especially for generalists, who have a million other things on their minds besides genetics,” says Mary Norton, a clinical geneticist at the University of California, San Francisco. Doctors could seek help from specialists, but there is a dearth of trained medical geneticists, so doctors end up receiving instructions from companies that are pushing their products without adequate proof of their efficacy. A survey published in the journal Genetics in Medicine reported that a majority of participating physicians do not fully understand genetic test results or devote sufficient time to discussing outcomes with patients.
MedSeq, launched by Robert Green, a medical geneticist at Brigham and Women’s Hospital and Harvard Medical School, is an example of an initiative that educates physicians about genetic testing. Martin Solomon, a MedSeq participant and a physician at Brigham and Women’s, says genetics is simply a new tool with a learning curve, like the electrocardiogram. But Mary Norton does not think that it is that simple. Given the pace of genetics research, the variability of test methods and results, and the companies’ marketing strategies, she says that though “over time everyone will come to have a better understanding of genetics… It will probably be a bit worse before it gets better.” (Cynthia Graber, The New Yorker)



Psychological Biases Play A Part In Vaccination Decisions

Why do some people choose not to vaccinate their children? The recent outbreak of measles in the US has triggered discussions around this topic. Misinformation is one reason: people’s belief that there is a link between vaccines and autism. But what psychological biases might contribute to parents being unwilling to ‘intervene’ on their kids? Omission bias may be playing a role here, where parents judge vaccination (an action) as more harmful to their kids than failing to vaccinate (an omission), even when the risks associated with vaccination are lower than those of not vaccinating their children. People also exhibit this omission bias to varying degrees. A study published in the journal Medical Decision Making in 1994 showed that parents who objected to vaccinating their kids were more likely to think that vaccinating was more dangerous than not vaccinating. Participants were asked whether, in a hypothetical situation, they would vaccinate their child under 3 if 10 out of 10,000 unvaccinated kids would die from the flu, while the vaccine could have a fatal side effect in 5 out of 10,000 children. On a straight assessment of risk, parents should have opted to vaccinate. But the study showed that parents who did not believe in vaccination had a lower mean ‘tolerable risk’ than parents who did not object to the vaccine. In other words, they would vaccinate their children only if the hypothetical vaccine had a risk of 2.4 deaths per 10,000 (even though the risk from the flu itself is 10 in 10,000), while parents who were not opposed to vaccinating their kids had a mean tolerable risk of 5.4 deaths per 10,000. To vaccinate their children, both sets of parents needed a higher risk from the disease than from the vaccine itself, but the gap was greater for the non-vaccinators.
One reason could be that the non-vaccinators did not wish to ‘intervene with nature.’ Another could be causal responsibility for a death resulting from an action (vaccination) versus an omission (failure to vaccinate). And lastly, a related reason could be anticipated regret: parents who feel they would be causally responsible for negative consequences of vaccination also anticipate feeling greater regret about having vaccinated their children if something went wrong. But how do we define ‘act’ and ‘omission’? The author concludes that merely educating people about the benefits of vaccination may be insufficient to change people’s attitudes. However, in a society where vaccination is the norm and not vaccinating the exception, interpreting failure to vaccinate as the deliberate ‘act’ could have some positive effects. (Tania Lombrozo, Psychology professor at the University of California, Berkeley, NPR)
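The risk arithmetic in the 1994 study’s hypothetical can be sketched in a few lines. This is a minimal illustration using only the numbers quoted above; the variable and function names are ours, not the study’s:

```python
# Deaths per 10,000 children in the study's hypothetical scenario.
FLU_DEATHS = 10      # deaths among unvaccinated children
VACCINE_DEATHS = 5   # fatal vaccine side effects among vaccinated children

def expected_deaths(vaccinate: bool) -> int:
    """Expected deaths per 10,000 children under each choice."""
    return VACCINE_DEATHS if vaccinate else FLU_DEATHS

# A straight assessment of risk favors vaccination: 5 deaths vs. 10.
assert expected_deaths(True) < expected_deaths(False)

# Mean 'tolerable risks' reported in the study (deaths per 10,000).
# A parent vaccinates only if the vaccine's risk is at or below their threshold.
mean_tolerable_risk = {"objectors": 2.4, "non-objectors": 5.4}
for group, threshold in mean_tolerable_risk.items():
    print(f"{group} vaccinate at a vaccine risk of {VACCINE_DEATHS}/10,000:",
          VACCINE_DEATHS <= threshold)
```

Both thresholds sit below the disease risk of 10 per 10,000, which is the omission bias described above; the objectors’ threshold also sits below the vaccine’s stated risk of 5, so they decline even though vaccinating halves the expected deaths.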


Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

February 10, 2015 at 11:12 am

Posted in Linkposts


HIV and Ebola Pandemics: It all started with one

leave a comment »

By: Aminul Islam, Ph.D.

It is amazing to think that the pandemics of HIV and, more recently, Ebola each originated with just one infected person, otherwise known as ‘Patient Zero’. It is a testament to the globalized era that we all live in that an infectious disease can originate in a single individual and span the globe within a single decade. The CDC (Centers for Disease Control and Prevention) estimates that about 40 million people in the world are currently infected with HIV, with approximately 2 million new cases reported every year. Fortunately for us, Ebola has not had quite the same devastating reach as HIV, but, like HIV, it emerged from Africa (the recent epidemic began in West Africa, while HIV is thought to have originated in Central Africa), and this has led to some debate as to why we are seeing such viruses emerge and spread rapidly from this region across continents, and what we can do to tackle them.

Both HIV and Ebola are thought to have originated in species other than humans, namely chimpanzees (HIV) and fruit bats (Ebola), with zoonosis therefore playing a large part in their transmission. Whether it involves local practices of eating bushmeat, illegal trading of exotic animals, development of rural areas through deforestation, or exploration and mining of remote regions for precious materials and resources, it is evident that humans are coming into ever closer contact with isolated wild animals that are thought to be the natural reservoir hosts for these viruses. This can only increase the likelihood of zoonotic transmission. Should we now accept the burden of dealing with these viral pandemics as part of the price of developing low-income nations and promoting globalization? If we do, then as a global community we need to prioritize certain policies and back them up with appropriate resources. The recent failure of the WHO (World Health Organization) to respond to the Ebola outbreak in a timely fashion and with sufficient resources reminds us that such global infrastructures are weak and lack the right tools in the fight against viral pandemics emerging from regions such as West Africa.

I would suggest that a more balanced approach to this situation is to spend some time developing long-term science and global health policy countermeasures, setting global strategies to deal with the ecological, geopolitical, and socioeconomic changes that drive disease emergence. More precisely, let us truly develop and support a respected global infrastructure charged with unrestricted, evenly distributed worldwide public health surveillance, one that also has the right leadership, capability, and capacity to effectively predict, monitor, and respond to the global health challenges of the future, such as viral pandemics. This may mean setting up a permanent global fund to provide the best catalyst of them all, money, in an efficient, fair, and sustained manner to facilitate the prediction, prevention, and mitigation of pandemics and the countermeasures against them. As has been the case of late, relying on a single global superpower to make unilateral decisions and take action may not be the wisest, most stable, or most lasting way to tackle such issues going forward. In addition, U2 front man Bono will not be around forever to campaign for the plight of Africa.

Yes, it is true that these viral pandemics start with one individual, but in the end it is up to us all to act as a global community, one that engages with one another and resolves the challenges we all face from viral pandemics, particularly in the fine balancing act of promoting economic development while protecting long-term public health.



Written by sciencepolicyforall

February 8, 2015 at 1:40 pm

Posted in Essays


Science Policy Around the Web – February 6, 2015

with one comment

By: Courtney Pinard, Ph.D.

photo credit: via photopin (license)


Animal Welfare

U.S. lawmakers want more humane treatment of laboratory animals after an exposé published in The New York Times last month reported numerous cases of maltreatment, suffering, and death of cows, pigs, and other livestock at the U.S. Meat Animal Research Center, a federally funded research center in Nebraska. The Times interviewed two dozen current and former center employees and reviewed thousands of pages of internal records obtained under the Freedom of Information Act. Scientists at the center are trying to re-engineer farm animals to produce more offspring, yield more meat, and cost less to raise. As a result of these experiments, 10 million piglets are crushed by their mothers each year, lambs are dying at staggering rates from sickness, neglect, or predation, and cows produced in twinning experiments are born deformed. A key question is: how could the U.S. Agriculture Department fund projects that cause such massive death and suffering of farm animals? The answer lies in a lack of oversight combined with the industry’s incentive to make a profit. According to the Times’ examination of 850 of the center’s experimental protocols, most approvals were made by six or fewer staff members of the center and frequently included the lead researchers on the experiment. In addition to the lack of unbiased review committee members, the Agriculture Department does not review experiments proposed by the center. The proposed AWARE Act would expand the Animal Welfare Act to include farm animals used in research and would require closer monitoring and more inspections. (David Grimm, Science Insider; Michael Moss, The New York Times)


Climate Change

This weekend, NASA’s Deep Space Climate Observatory (DSCOVR) will finally launch from Cape Canaveral Air Force Station in Florida. Launching more than a decade later than originally planned, DSCOVR aims to provide more accurate data about solar storms and to monitor the planet’s radiation balance. Solar storms strongly influence the local space weather in the Earth’s vicinity and present radiation hazards to spacecraft and astronauts. For scientists interested in Earth’s climate change, however, the more important instrument aboard the satellite is the National Institute of Standards and Technology Advanced Radiometer (NISTAR), which will measure the radiation balance: the radiation Earth receives from the Sun minus the radiation it reflects and emits back into space. According to NASA, Earth’s radiative equilibrium shifts with natural forces such as volcanic eruptions and manmade forces such as air pollution and greenhouse gases. In an article published last year about the launch, NASA’s chief Earth scientist, Ghassem Asrar, expressed confidence that the science data from this satellite will be a major breakthrough for Earth science. (Craig Mellow, Air & Space Magazine; David Shultz, Science Insider; Joe Palca, NPR)
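The radiation balance described above, energy received from the Sun minus energy returned to space, can be illustrated with a back-of-envelope calculation. The solar constant and albedo figures below are standard textbook values, not numbers from the article:

```python
# Rough Earth radiation balance: absorbed solar input vs. energy returned to space.
# Textbook values, assumed for illustration only.
SOLAR_CONSTANT = 1361.0   # W/m^2 arriving at Earth's distance from the Sun
ALBEDO = 0.30             # fraction of sunlight reflected straight back to space
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W/m^2/K^4

incoming = SOLAR_CONSTANT / 4           # averaged over Earth's whole spherical surface
absorbed = incoming * (1 - ALBEDO)      # what remains after reflection, ~238 W/m^2

# In equilibrium, Earth emits thermal radiation matching what it absorbs,
# which fixes an effective radiating temperature of about 255 K (-18 C).
effective_temp = (absorbed / SIGMA) ** 0.25

print(f"absorbed: {absorbed:.0f} W/m^2, effective temperature: {effective_temp:.0f} K")
```

In equilibrium the two sides cancel; it is small departures from this balance, driven by volcanoes, aerosols, and greenhouse gases, that such a radiometer is designed to track.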


NIH – Study Bias

A new lung research survey concludes that fewer than 5% of lung disease studies funded by the National Institutes of Health (NIH) over the past two decades have included substantial participation by racial or ethnic minorities. The study, published last month in the American Journal of Respiratory and Critical Care Medicine and led by physician-scientist Esteban Burchard of the University of California, San Francisco, examined 58,160 respiratory disease studies and found that only 4.4% reported that minorities made up 25% or more of the study population. University of Illinois, Chicago, pulmonologist Patricia Finn is concerned by the results: “The findings are disturbing given that lung diseases disproportionately impact underrepresented minorities.” Concerns from experts about minority representation in clinical research are not new; in 1993, Congress ordered the NIH to recruit more minorities into federally funded studies. NIH officials say the new survey may not fully capture the many efforts made in this regard because not all studies have published results. This conversation about study design raises the question: why is it so important to include racial and ethnic groups in clinical studies? One reason is that genetic factors can be linked to condition severity. For example, African American children are 4 times as likely to die of asthma as non-Hispanic white children, possibly in part because of ancestry; one genetic mutation linked to asthma severity was about 40% more common in African Americans. A second reason for clinician-scientists to be aware of health disparities is that the effectiveness of treatments may vary by race or ethnicity. The asthma medication albuterol, for example, is less likely to work in Puerto Ricans and African Americans. Hopefully, it will not take another 20 years before major efforts are made to increase the implementation of inclusive studies. (Lindsay Konkel, Science Insider)


Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

February 6, 2015 at 9:13 pm

Science Policy Around the Web – February 4, 2015

leave a comment »

By: Jennifer Seedorff, Ph.D.

photo credit: Synapse journal via photopin cc

Global Warming

On Monday, the United Nations’ World Meteorological Organization (WMO) joined the National Aeronautics and Space Administration (NASA), the National Oceanic and Atmospheric Administration, and the Japan Meteorological Agency in concluding that 2014 was the hottest year on record. The only major science agency that has not explicitly agreed that 2014 was the hottest year on record is Britain’s Hadley Centre, which concluded that “the uncertainty ranges mean it’s not possible to definitively say which of several recent years was the warmest.” The World Meteorological Organization also noted that “the difference in temperature between the warmest years is only a few hundredths of a degree – less than the margin of uncertainty.” These record temperatures were achieved despite the absence of an El Niño event, which typically causes a temporary rise in temperatures. WMO Secretary-General Michel Jarraud noted, “The overall warming trend is more important than the ranking of an individual year.” Although experts may disagree on which particular year was the warmest on record, it is striking that 14 of the 15 hottest years on record have occurred in the 21st century, and we are only 14 years into it. (Chris Mooney, The Washington Post)


Public Health – Precision Medicine

Details have begun to emerge on President Obama’s Precision Medicine Initiative, first announced during his State of the Union Address.  On January 30th, President Obama rolled out the initiative in the East Room. Precision medicine refers to tailoring medical treatments to an individual, including their specific genetic makeup, disease mutations, microbiome, etc.  Francis Collins, director of the NIH, and Harold Varmus, head of the National Cancer Institute, recently described the Precision Medicine Initiative in the New England Journal of Medicine, “The proposed initiative has two main components: a near-term focus on cancers and a longer-term aim to generate knowledge applicable to the whole range of health and disease.” They further commented that this initiative “will also pioneer new models for doing science that emphasize engaged participants and open, responsible data sharing. Moreover, the participants themselves will be able to access their health information and information about research that uses their data.” Funding for this $215 million initiative still needs to be approved by Congress. Jo Handelsman, associate director for science in the White House Office of Science and Technology Policy, referred to precision medicine as a “game changer” that “holds the potential to revolutionize the way we approach health in this country and ultimately around the world.” (Jocelyn Kaiser, ScienceInsider and Francis Collins and Harold Varmus, The New England Journal of Medicine)


Public Health – Medicare Reform

Medicare is in the process of reforming how it pays for medical care, planning a transition from a fee-for-service system that pays providers based on the quantity of services provided to one that instead rewards providers for the quality of those services. This transition has intensified the debate over how to measure the “quality” of a service. Currently, quality is measured in terms of process: how many patients with a given diagnosis receive a specific intervention, for instance, how many back pain patients are advised against bed rest, or how many chest pain patients in the ER are given aspirin. On January 30th, a nonprofit advisory group, the National Quality Forum, submitted recommendations on 199 performance metrics to the Department of Health and Human Services for consideration. Christine Cassel, president of the National Quality Forum, said that many of the recommendations seek to replace narrow process metrics with “measures that matter” to patients. The Centers for Medicare and Medicaid Services is publicly releasing many quality metrics on its Hospital Compare and Physician Compare websites, although comparisons of complication rates between physicians are not yet available. Melinda Beck reports that some doctors have criticized tying physician reimbursements to these outcome-based measurements, arguing that whether a patient gets better is often out of their control. Still, the transition to quality-based measurement has had its successes, including a 50% reduction in the rate of central-line bloodstream infections since hospitals were required to report them. According to Scott Wallace, a visiting professor at Dartmouth, “Measurement fatigue is a real problem in hospitals. But, to me, the only metric that matters is, did you get better?” (Melinda Beck, The Wall Street Journal)



Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

February 4, 2015 at 10:22 am