Science Policy For All

Because science policy affects everyone.

Posts Tagged ‘reproducibility’

Science Policy Around the Web – April 27, 2018


By: Michael Tennekoon, PhD


Source: pixabay

Productivity of Science

Is Science Hitting a Wall?, Part 1

Scientific research is hitting a wall: that's the view from a recent study by four economists. Moore's law, the famous observation that the density of computer chips doubles roughly every two years, now takes 18 times as many researchers to sustain as it once did. The pattern extends to other areas of research as well. In medicine, for example, "the numbers of new drugs approved per billion U.S. dollars spent on R&D has halved every 9 years since 1950". In general, while research teams appear to be getting bigger, the number of patents produced per researcher has declined. Alarmingly, critics argue that some fields may even be regressing: the over-treatment of psychiatric and cancer patients, for example, may have caused more harm than good.

But why would science be hitting a wall? One major factor could be the reproducibility crisis: the problem that many peer-reviewed claims cannot be replicated, calling into question the validity of the original findings. Researchers suggest that intense competition for funding and jobs has created pressure to conduct innovative "high risk" research in as short a time as possible. While this type of research can attract plenty of press, it often lacks the scientific rigor that ensures findings are reliable. However, the perceived slowdown in research productivity could also be a result of the natural advancement of science, the low-hanging-fruit problem: most of the easier problems have already been solved, leaving only problems that require vast scientific resources to crack.

On the other hand, researchers in some fields can rightfully push back and argue that scientific progress is not stalling but in fact accelerating. For example, technologies such as CRISPR and optogenetics have produced a multitude of new findings, particularly in neuroscience and genetics research. Even with these new technologies, however, the end product for general society remains relatively disappointing.

Given these concerns, how scientific research should move forward raises tough questions for the field. Given funding limitations, how much do we, as a society, value 'pure science', the effort to understand rather than manipulate nature? Scientific curiosity aside, in purely economic terms, is it worth testing the out-of-Africa hypothesis of human origins, or sending humans to other planets? Is it worth investing in the latest innovative technology if it produces new findings with limited applicability to human health? Scientists and society at large must be open to weighing the costs and benefits of scientific enterprises and deciding which avenues of research are worth pursuing.

(John Horgan,  Scientific American)

Vaccine Ethics

The vaccine dilemma: how experts weigh the benefits for many against risks for a few

Cost-benefit analysis. Sure, it's easy when you're on an Amazon shopping spree. But what about when millions of lives are at stake? And what if those lives belong to children, who cannot give informed consent? Not so easy anymore. Yet that is the job of the World Health Organization's Strategic Advisory Group of Experts (SAGE), which last week decided to scale back the use of a new vaccine against dengue.

Two years ago, SAGE concluded the vaccine was safe to use in children in places with high dengue infection rates, despite theoretical concerns that the vaccine might increase the risk of developing a severe form of dengue in some children. Toward the end of last year, the vaccine's manufacturer, Sanofi Pasteur, released new data validating those concerns. How large was the risk? It was estimated that in a population where 70% of individuals had had dengue at least once, the vaccine would keep seven children out of the hospital for every one hospitalized as a result of the vaccine. If 85% of individuals had had dengue, that ratio becomes 18 to 1. Those numbers were deemed not worth the risk.

What goes into making these decisions?

One factor is the prevalence of the disease. For example, the oral polio vaccine could prevent millions of children from becoming paralyzed, but it could also cause paralysis in rare cases. In the 1950s and 1960s, when polio was highly prevalent, it made sense to recommend this vaccine; but as polio became nearly non-existent toward the end of the 20th century, using the oral vaccine was no longer prudent.

However, dengue is still rampant in today’s world, so what is different in this case?

Public perception. The modern world is highly litigious and has access to a wide variety of information, both factual and fake. The result is a skeptical view of science in which negative press for one vaccine can cause collateral damage for many other vaccines, unlike a few decades ago. For example, in the 1950s it was discovered that children had been given a polio vaccine that mistakenly contained live virus. The error paralyzed 51 children in the US and killed 5. Yet polio vaccinations resumed, the company responsible (Cutter Laboratories) stayed in business, and polio was virtually eradicated. RotaShield, a vaccine against rotavirus (a virus that causes severe diarrhea in young children), had a very different experience. Approved in 1998, it was suspended one year later after the CDC estimated that for every 10,000 children vaccinated, an extra 1 or 2 would develop intussusception (a type of bowel blockage) beyond what would normally be seen. Even though in developing countries the number of lives saved would have far exceeded the extra cases of intussusception, the vaccine was suspended. A safer rotavirus vaccine only reached the market in 2006; in the intervening years, an estimated 3 million children died from rotavirus infections. (Note: the risk of rotavirus infection persists even when the vaccine is given, but at far lower rates.)

Given the tremendously difficult decisions that need to be made with the implementation of vaccines and the impact that public perception can have on these decisions, society has a responsibility to become more informed about the potential benefits and drawbacks of vaccines and must actively tease apart fact from fiction.

(Helen Branswell, STAT)

Have an interesting science policy link? Share it in the comments!


Written by sciencepolicyforall

April 27, 2018 at 3:26 pm

Science Policy Around the Web – August 18, 2017


By: Nivedita Sengupta, PhD


Source: pixabay

Climate Science

Effort backed by California’s flagship universities comes as US President Donald Trump shrugs off global warming

As US President Donald Trump announced the United States' withdrawal from the Paris Agreement, renouncing climate science and policy, scientists in California decided to develop a home-grown climate research institute, the 'California Climate Science and Solutions Institute'. California has long sought to protect the environment through various initiatives, and this one has already been endorsed by California's flagship universities and warmly received by Governor Jerry Brown. The initiative is still in the early stages of development and will need clearance from the state legislature. The institute will aim to fund basic as well as applied research on topics related to climate change, ranging from ocean acidification to tax policy. Priority will be given to projects and experiments that engage communities, businesses and policymakers. "The goal is to develop the research we need, and then put climate solutions into practice," says Daniel Kammen, an energy researcher at the University of California, Berkeley, who adds that the work will have global impact. The California project may have an ally, too: Peter De Menocal, the science dean of Columbia University in New York City, plans to build an alliance of major universities and philanthropists to support research into pressing questions about the impacts of climate change. De Menocal has already tested the idea on a smaller scale by launching the Center for Climate and Life at Columbia University last year, which raised US$8 million in private funding. This is not the first time California has stepped in to support an area of science that fell out of favor in Washington DC. After President George W. Bush restricted federal support for research on human embryonic stem cells, the state's voters approved $3 billion in 2004 to create the California Institute for Regenerative Medicine in Oakland.

Since then, the center has funded more than 750 projects. The proposal for a new climate institute also started along a similar path, as a reaction to White House policies, but its organizers say the concept has evolved into a reflective exercise about academics' responsibility to help create a better future. The panel members intend to put a complete plan for the institute before the California legislature this year, in the hope of persuading lawmakers to fund the effort by September 2018, before Governor Brown's global climate summit in San Francisco.

(Jeff Tollefson, Nature News)

Retractions

Researchers pull study after several failed attempts by others to replicate findings describing a would-be alternative to CRISPR

The high-profile gene-editing paper on NgAgo was retracted by its authors on 2 August, after scientists around the globe were unable to replicate its main finding. The paper, published in Nature Biotechnology in May 2016, described an enzyme named NgAgo that could be used to knock out or replace genes in human cells by making incisions at precise sites on the DNA. The study presented the findings as a promising alternative to the CRISPR-Cas9 gene-editing system, which revolutionized gene editing and has even been used to fix genes for a heritable heart condition in human embryos. Han Chunyu, a molecular biologist at Hebei University of Science and Technology in Shijiazhuang who led the work, initially attracted a lot of applause for the findings. Within months, however, reports began emerging on social media of failures to replicate the results. These doubts were confirmed when a series of papers was published stating that NgAgo could not edit genomes as claimed. Earlier, Han had told Nature's news team that he and his team had identified a contaminant that could explain other groups' struggles to replicate the results, and promised that revised results would be published within two months. Yet on August 2 they retracted the paper, stating: "We continue to investigate the reasons for this lack of reproducibility with the aim of providing an optimized protocol."

The retraction of the paper, however, puts in question the future of the gene-editing center that Hebei University plans to build for 224 million yuan (US$32 million), with Han as its leader. Moreover, Novozymes, a Danish enzyme manufacturer, paid the university an undisclosed sum as part of a collaboration agreement. Dongyi Chen, Novozymes' Beijing-based press manager, told Nature's news team in January that the technology was being tested and showed some potential, but was at a very early stage of development, making it difficult to determine its relevance. Following news of the retraction, he stated that the company has explored the efficiency of NgAgo but has so far failed to see any obvious improvement. Yet they are not giving up hope, as scientific research takes time.

(David Cyranoski, Nature News)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

August 18, 2017 at 5:11 pm

Science Policy Around the Web – June 06, 2017


By: Kseniya Golovnina, PhD

Source: Flickr, by USDA, via Creative Commons (CC BY 2.0)

Food Security

What if Food Crops Failed at the Same Time?

While one group of people is fighting climate change and another considers it "mythical", researchers who specialize in the study of social-ecological systems are developing food-supply risk-assessment models. Food crops are essential to human existence, yet less than one-fourth of the planet's land (the "breadbaskets") produces three-fourths of the staple crops that feed the world's population, and climate change could cause crop losses in most of these breadbaskets.

Two important factors included in the models are shocks to major crop production and to the economy. Shocks such as the droughts and heat waves in Ukraine and Russia in 2007 and 2009 almost wiped out wheat crops and caused global wheat prices to spike. Meanwhile, demand assessments project that food production may have to double by 2050 to feed a growing population. Together, these potential environmental and economic stresses are making the world food production system less resilient, and will affect rich and poor nations alike. To measure the fragility of the system, researchers developed scenarios of small shocks (10 percent crop loss) and large shocks (50 percent crop loss). These were then applied to corn, wheat or rice output using an integrated assessment model, the Global Change Assessment Model, developed by the U.S. Department of Energy.

Among the critical findings is that "breadbasket" regions respond to shocks in different ways. For example, South Asia, where most arable land is already in use, is quite unresponsive to shocks occurring elsewhere in the world, because the total amount of land in agricultural production there cannot change significantly. Brazil is in the opposite situation: it has great potential to bring new land into production if large shocks occur. However, clearing Brazil's forests requires significant effort and would add significantly to global climate change. Within the research agenda of the Pardee Center, these risks and preventive actions are discussed in more detail. The warning is clear: humankind needs to be aware of, and prepared for, a potential failure of multiple "breadbaskets" if we want to reduce the potential for catastrophe. (Anthony Janetos, The Conversation)

Reproducibility in Science

Research Transparency: Open Science

Increasing amounts of scientific data, the complexity of experiments, and the hidden or proprietary nature of data have given rise to the "reproducibility crisis" in science. Reproducibility studies in cancer biology have revealed that only 40% or fewer of peer-reviewed analyses are replicable. Another large-scale project attempting to replicate 100 recent psychology studies succeeded in replicating less than 50% of the original results.

These findings are driving scientists to look for ways to increase study reliability and make research practices more efficient and open to evaluation. A philosophy of open science, in which scientists share their primary materials and data, makes analytical approaches more transparent and allows common research practices and standards to emerge more quickly. For scientific journals and associations, open-science methods enable new ways to store and utilize data. Some journals are specifically dedicated to publishing data sets for reuse (Scientific Data, Journal of Open Psychology Data); others require or reward open-science practices such as publicly posting materials and data.

The widespread use of online repositories to share study materials and data helps to store large data sets and physical materials, mitigating some of the problems of reproducibility. However, open-science practice is still very much in development and faces significant disincentives. Habits and reward structures are two major forces working against it. Researchers are used to being closed, guarding their data for fear of being scooped, and journal editors tend to favor papers that tell a tidy story with perfectly clear results. This encourages researchers to omit "failed" studies that don't clearly support their theories.

While efforts to overcome these obstacles are difficult, development of fully transparent science should be encouraged, as openness helps improve understanding, and acknowledges the truth that real data are often messy. (Elizabeth Gilbert and Katie Corker, The Conversation)

 

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

June 6, 2017 at 9:00 am

Science Policy Around the Web – January 27, 2017


By: Nivedita Sengupta, PhD

Source: NIH Image Gallery on Flickr, under Creative Commons

Human Research Regulation

US Agency Releases Finalized ‘Common Rule’, Which Governs Human-Subjects Research

On September 8, 2015, the US Department of Health and Human Services (HHS) proposed significant revisions to the Federal Policy for the Protection of Human Subjects, also known as the "Common Rule": the set of federal regulations governing the conduct of clinical research involving human subjects. Among the proposed changes, an important one concerned obtaining people's consent before using their biological samples in subsequent studies. On 18 January 2017 the final version of the rule was released, and that proposed change was abandoned. This is a blow to patient-privacy advocates; the US National Academies of Sciences, Engineering, and Medicine, however, had argued against that requirement and others, citing the undue burden the changes would impose on researchers, and had recommended that it be withdrawn.

The current version of the Common Rule has generated mixed feelings. Researchers are happy that the government listened to scientists' fears about increased research burdens, whereas people like Twila Brase, president and co-founder of the Citizens' Council for Health Freedom in St Paul, Minnesota, are disappointed, believing these changes ought to have been made. The new version of the Common Rule does require that scientists include a description of the study, along with its risks and benefits, on the consent forms used by patients, and that federally funded trials post patient consent forms online. These requirements do not, however, extend to trials conducted with non-federal funds. (Sara Reardon, Nature News)

Biomedical Research

An Open-Science Effort to Replicate Dozens of Cancer-Biology Studies is Off to a Confusing Start

The Reproducibility Project: Cancer Biology was launched in 2013 to scrutinize the findings of 50 cancer papers from high-impact journals, with the aim of determining what fraction of influential cancer-biology studies are sound. In 2012, researchers at the biotechnology firm Amgen performed a similar study and announced that they had failed to replicate 47 of 53 landmark cancer papers, but they did not identify the studies involved. In contrast, the reproducibility project makes all its findings open. Full results should appear by the end of the year, and eLife published the first five fully analyzed reports in January. Of the five, one failed to replicate and the remaining four produced results that are less clear-cut.

These five results paint a muddy picture for people hoping the project would determine the extent to which these studies hold up. Though some researchers praised the project, others feared unfair discredit of their work and careers. According to Sean Morrison, a senior editor at eLife, the results were "uninterpretable" because tests to measure tumor growth went wrong in the replication attempts, and the replication researchers were not allowed to deviate from the protocols agreed upon at the start of the project in consultation with the original authors. "Doing anything else — such as changing the experimental conditions or restarting the work — would have introduced bias," says Errington, the manager of the reproducibility project.

According to Errington, the clearest finding from the project is that the papers include very few details about their methods. The replication researchers had to spend hours working out detailed protocols and reagents with the original authors. Even after following the exact protocols, the final reports list many reasons why the replication studies might have turned out differently, ranging from variations in laboratory temperature to tiny differences in how a drug was delivered. He thinks the project does a great service for future follow-up work toward cancer cures by bringing such confounding details to the surface. Some scientists, however, think these conflicts mean the replication efforts are not very informative, cannot fairly be compared to the originals, and will only delay future clinical trials. (Monya Baker and Elie Dolgin, Nature News)

 

Have an interesting science policy link?  Share it in the comments!

Science Policy Around the Web – August 26, 2016


By: Leopold Kong, PhD

Adipose tissue. Source: Wikimedia Commons, by Blausen.com staff, “Blausen Gallery 2014”.

Health Policy

Is there such a thing as ‘fat but fit’?

Nearly 70% of American adults are overweight or obese, raising their risk for health problems such as heart disease, diabetes, and high blood pressure. However, about a third of obese individuals appear to have healthy levels of blood sugar and blood pressure. Whether these 'fat but fit' individuals are actually "fit" has been controversial. A recent study published in Cell Reports sought to dissect differences between the fat cells of the 'unfit' obese and the 'fit' obese using tools that probe the patterns of genes being turned on or off. Fat from non-overweight people was also examined in the study. Interestingly, the fat of non-overweight and obese individuals differed in over 200 genes, regardless of 'fitness'. However, the fat of 'fit' versus 'unfit' obese individuals differed in only two genes. Dr. Mikael Rydén, the lead author of the study, commented: "We think that adds fuel to the debate. It would imply that you are not protected from bad outcomes if you are a so-called fit and fat person." The study also highlights the complexity of fat's influence on health, and raises the possibility of 'fat' biopsies. For example, fat from normal-weight individuals following an unhealthy lifestyle may have marked differences that are diagnostic of future obesity. With the rising cost of treating chronic diseases associated with being overweight, further studies are warranted. (Lindzi Wessel, Stat News)

Biomedical Research

Half of biomedical research studies don’t stand up to scrutiny

Reproducible results are at the heart of what makes science 'science'. However, a large proportion of published biomedical research appears to be irreproducible. A shocking study by scientists at the biotechnology firm Amgen, which aimed to reproduce 53 "landmark" studies, found that only 6 of them could be confirmed. The stakes are even higher when it comes to pre-clinical cancer research. In fact, they are $30 billion higher, according to a recent study suggesting that only 50% of findings can be reproduced. Primary sources of irreproducibility can be traced to (1) poor study design, (2) instability and scarcity of biological reagents and reference materials, (3) unclear laboratory protocols, and (4) poor data analysis and reporting. A major stumbling block may be the present culture of science, which does not reward publishing replication studies or negative results. Higher-impact journals generally prioritize work that demonstrates something new and potentially groundbreaking or controversial. When winning grant money and academic posts hinges on impact factor, reproducibility suffers. With such high potential for wasting substantial funds in medically significant areas, radical changes in science policy toward publishing, peer review and science education are urgently needed. The recent reproducibility initiative, which aims "to identify and reward high quality reproducible research via independent validation", may be a step in the right direction. However, a paradigm shift in scientists' attitudes toward what constitutes important research might be necessary. (Ivan Oransky, The Conversation)

Biotechnology

In CRISPR fight, co-inventor says Broad Institute misled patent office

The intellectual property dispute over the multibillion-dollar CRISPR gene-editing technology has grown increasingly heated in recent months. With the FDA giving the go-ahead for the first U.S. clinical trial using CRISPR, and China beginning a clinical trial with the technology this month, the tension is high. On one side of the dispute is the University of California's Jennifer Doudna, whose initial work established the gene-editing technology in a test tube. On the other is the Broad Institute's Feng Zhang, who within a year made the technology work in cells and organisms, and therefore broadly applicable to biotechnology. Was Zhang's contribution a substantial enough advance to warrant its own patents? Was Doudna's work too theoretical and basic? This week, a potentially damning email that emerged from the dispute's legal filings was made public. The email, from Shuailiang Lin, a former graduate student of Zhang's, to Doudna, asks for a job but also claims that Zhang was unable to make the technology work until Doudna's 2012 publication revealed the key conceptual advances. Lin adds: "I think a revolutionary technology like this […] should not be mis-patented. We did not work it out before seeing your paper, it's really a pity. But I think we should be responsible for the truth. That's science." A spokesperson for the Broad Institute, Lee McGuire, said Lin's claims are false, pointing out that Lin was in a rush to renew his visa and had sent his explosive email to Doudna after being rejected for a new post at the Broad Institute. With CRISPR technology promising to change the face of biotechnology, the drama over its intellectual property continues to escalate. (Antonio Regalado, MIT Technology Review)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

August 26, 2016 at 9:00 am

Science Policy Around the Web – August 19, 2016


By: Ian McWilliams, PhD

Photo source: pixabay

Climate Change

Melting ice sheet may expose cold war base, hazardous waste

During the Cold War, the US Army Corps of Engineers began a top-secret mission to determine the feasibility of launching nuclear missiles at Russia from a base in Greenland. The military base constructed for this mission, named Camp Century, lies approximately 125 miles inland from the Greenland coast and was abandoned in 1964 after the Joint Chiefs of Staff rejected the plans for a nuclear base. When soldiers left, it was thought that the leftover fuel and waste material would remain safely interred under the ice for thousands of years.

However, climate change now threatens those plans. Increased ice melt could expose the base as early as 2090, and it is estimated that tens of thousands of gallons of diesel fuel, wastewater, sewage, and other chemicals could be released. Adding to the concern is the nuclear generator housed in the frozen base: although the base never became a site for nuclear weapons, the generator's low-level radioactive coolant is still stored there. If ice melt continues at an accelerated rate, some have expressed concern that these chemicals could seep into waterways, causing a potential environmental catastrophe. (Stephen Feller, UPI)

Microbiome

Mouse microbe may make scientific studies harder to replicate

Reproducibility has been the subject of much debate in the scientific community recently, and scientists are now concerned that the microbiome may further complicate the issue. The collection of commensal microorganisms that reside on or within the body, referred to as the microbiota, is now well known to affect the health of the host. Although researchers take meticulous steps to ensure that experimental animals are housed in identical conditions, including sterile bedding, strict temperature control, and standard light cycles, pinning down experimental variability due to differences in the microbiome has remained elusive. As researchers explore the issue, they have found that mice from different vendors have very different compositions of gut bacteria, which could explain some inconsistencies between researchers' experiments.

Although it is not mandated, taking steps to control for the microbiome may help address the reproducibility crisis. Segmented filamentous bacteria (SFB) have been identified as a notable concern, and some vendors now supply SFB-positive and SFB-negative animals separately. SFB are unlikely to be the only culprit, so researchers continue to explore new variables in rodent husbandry in an effort to improve the reproducibility of scientific results. Adding to the dilemma, because the species that constitute the microbiome are constantly changing, the microbiome is difficult to characterize and impossible to standardize. Since mice share microbes by eating each other's feces, cage-mates come to have similar microbiomes, providing natural microbiota normalization among littermates. (Kelly Servick, Science)

Precision Medicine

Spiking genomic databases with misinformation could protect patient privacy

New initiatives, like the Precision Medicine Initiative (PMI), are helping to turn the human genome into usable sets of data for research purposes. This pursuit is founded upon participants' willingness to allow their genetic information to be pooled for analysis, but many have expressed concerns over the privacy of that information. Individuals have previously been shown to be identifiable from their anonymized genomic data, prompting researchers to look for additional security measures. Computer scientists Bonnie Berger and Sean Simmons have developed a new tool toward this goal, using an approach called differential privacy: to increase privacy, a small amount of noise, or random variation, is added to the results of a user's database query. The returned information still provides useful results, but it becomes much harder to conclusively connect the data to a patient's identity. A similar method has been used by the US Census Bureau and the US Department of Labor for many years.

However, some scientists, including Yaniv Erlich, worry that adding noise to the data will reduce users' ability to generate useful results. Erlich stated, "It's nice on paper. But from a practical perspective I'm not sure that it can be used". In exchange for privacy, free-form access to the data is limited: a "privacy budget" caps the number of questions that can be asked and excludes hundreds or thousands of locations in a genome. Additionally, because noise naturally increases error, it weakens the conclusions that can be drawn from any query. Simmons expects that answers will be close enough to be useful for a few targeted questions. The tradeoff for the added security is that databases protected this way could be instantly accessible and searchable, cutting down the wait to obtain access to databases such as those managed by the National Institutes of Health. Simmons added that this method is "meant to get access to data sets that you might not have access to otherwise". The group plans to continue refining the method to balance researchers' need for access to these data sets with patient privacy. (Anna Nowogrodzki, Nature)
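The noise-addition idea described above is commonly implemented with the Laplace mechanism from the differential-privacy literature. As a rough illustration only (this is not Berger and Simmons' actual tool, and the cohort and variant name below are made up), a minimal sketch of a differentially private count query:

```python
import math
import random

def private_count(records, predicate, epsilon):
    """Epsilon-differentially-private count of records matching predicate.

    A count query has sensitivity 1 (one person joining or leaving the
    database changes the count by at most 1), so adding Laplace noise
    with scale 1/epsilon satisfies epsilon-differential privacy.
    """
    true_count = sum(1 for r in records if predicate(r))
    # Sample Laplace(0, 1/epsilon) noise via the inverse-CDF transform.
    u = random.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

# Hypothetical query: how many cohort members carry variant "rs123"?
cohort = [{"id": i, "variants": {"rs123"} if i % 7 == 0 else set()}
          for i in range(1000)]
noisy = private_count(cohort, lambda r: "rs123" in r["variants"], epsilon=0.5)
```

Here `epsilon` controls the privacy-accuracy tradeoff: smaller values add more noise, and each query spends part of the shared privacy budget, which is why the number of questions must be limited. The genomic setting in the article is more involved (queries spanning many genome positions), but the core mechanism is the same.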

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

August 19, 2016 at 11:08 am

Science Policy Around the Web – July 26, 2016

leave a comment »

By: Ian McWilliams, Ph.D.

photo credit: Newport Geographic via photopin cc

Infectious Diseases

Research charities help marry two major South African HIV/TB institutes

Two funders, the Wellcome Trust and the Howard Hughes Medical Institute (HHMI), have announced that they are joining forces to fund the fight against HIV and tuberculosis (TB) in South Africa. South Africa has the world's largest population of people infected with HIV. Because TB thrives in HIV-infected individuals, South Africa is experiencing a co-epidemic that has been challenging to battle. This collaboration marks the first time that HHMI and the Wellcome Trust have worked together on a global health institution.

The new Africa Health Research Institute combines the Africa Centre for Population Health's detailed population data, gathered from over 100,000 participants, with the basic laboratory science and medical research of the KwaZulu-Natal Research Institute for TB-HIV (K-RITH). The combined organization will work towards eliminating HIV and TB by training African scientists and will "link clinical and laboratory-based studies with social science, health systems research and population studies to make fundamental discoveries about these killer diseases, as well as demonstrating how best to reduce morbidity and mortality." Projects funded by the institute include maintaining the longest-running population-based HIV treatment as prevention (TasP) trial in Africa and using genomics to study drug-resistant TB.

The organization is funded by a $50 million grant from The Wellcome Trust that is renewable over the next five years. Additionally, HHMI has already spent $40 million for the construction of new facilities, including a new biosafety level 3 laboratory that is designed to handle dangerous pathogens. These new efforts aim to apply scientific breakthroughs to directly help the local community. Deenan Pillay, the director of the new institute, has expressed his support of the organization’s mission by stating “There’s been increasing pressure and need for the Africa Centre not just to observe the epidemic but to do something about it. How long can you be producing bloody maps?” (Jon Cohen, ScienceInsider)

Scientific Reproducibility

Dutch agency launches first grants programme dedicated to replication

While a reproducibility crisis is on the minds of many scientists, the Netherlands has launched a new fund to encourage Dutch scientists to test the reproducibility of 'cornerstone' scientific findings. The €3 million fund was announced on July 19th by the Netherlands Organisation for Scientific Research (NWO) and will focus on replicating findings that "have a large impact on science, government policy or the public debate."

The Replication Studies pilot program aims to increase transparency, quality, and completeness of reporting of results. Brian Nosek, who led studies to evaluate the reproducibility of over 100 reports from three different psychology journals, hailed the new program and stated “this is an increase of infinity percent of federal funding dedicated to replication studies.” This project is the first program in the world to focus on the replication of previous scientific findings. Dutch scientist Daniel Lakens further stated that “[t]his clearly signals that NWO feels there is imbalance in how much scientists perform replication research, and how much scientists perform novel research.” The NWO has stated that it intends to include replication in all of its research programs.

This pilot program will focus both on reproduction of findings using datasets from the original study and on replication of findings with new datasets gathered using the same research protocol as in the original study. The program expects to fund 8-10 projects each year, and importantly, scientists will not be allowed to replicate their own work. The call for proposals will open in September with an expected deadline in mid-December. (Monya Baker, Nature News)

Health Care Insurance

US sues to block Anthem-Cigna and Aetna-Humana mergers

United States Attorney General Loretta Lynch has announced lawsuits to block two mergers that involve four of the largest health insurers. Co-plaintiffs in the suits include the District of Columbia and a number of states: Delaware, Florida, Georgia, Illinois, Iowa, Ohio, Pennsylvania, Virginia, California, Colorado, Connecticut, Maine, Maryland, and New Hampshire. The lawsuits are an attempt by the Justice Department to block Humana's $37 billion merger with Aetna and Anthem's $54 billion acquisition of Cigna, the largest merger in the history of health insurers. The Justice Department says that the deals violate antitrust laws and could mean fewer choices and higher premiums for Americans. Antitrust officials also expressed concern that doctors and hospitals could lose bargaining power in these mergers.

Both proposed mergers were announced last year, and if the transactions close, the number of national insurers would be reduced from five to three large companies. Furthermore, the government says that Anthem and Cigna control at least 50 percent of the national employer-based insurance market. Lynch further added that "competition would be substantially reduced for hundreds of thousands of families and individuals who buy insurance on the public exchanges established under the Affordable Care Act." The Affordable Care Act (ACA) aimed to encourage more competition between insurers to improve health insurance options and keep plans affordable. The Obama administration has closely watched the health care industry since the passage of that legislation, having previously blocked mergers of large hospital systems and stopped mergers of pharmaceutical giants, such as the proposed merger of Pfizer and Allergan.

Health insurers argue that these mergers are necessary to make the health care system more efficient, and would allow doctors and hospitals to better coordinate medical care. In reaction to the announcement by the Justice Department, Aetna and Humana stated that they intend to “vigorously defend” the merger and that this move “is in the best interest of consumers, particularly seniors seeking affordable, high-quality Medicare Advantage plans.” Cigna has said it is evaluating its options. (Leslie Picker and Reed Abelson, New York Times)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

July 26, 2016 at 11:00 am