Science Policy For All

Because science policy affects everyone.

Posts Tagged ‘open access’

Science Policy Around the Web – September 22, 2017


By: Leopold Kong, PhD


Scientific Publishing

Network strives to ‘make your research visible’…unless that research is copyrighted

More and more, scientific research is facilitated by large-scale collaborations and the sharing of new results across multiple disciplines.  However, a large proportion of scientific publications sit behind paywalls, greatly limiting access to them.  Since launching in 2008, ResearchGate has provided a social network site where researchers can upload and share their work.  With more than 13 million members and having raised more than $87 million in funding, ResearchGate has become an invaluable resource to scientists.  However, a recent study found that 201 out of 500 randomly picked English-language journal articles shared on the website infringed on publishers’ copyright.

Now, the International Association of Scientific, Technical and Medical Publishers (STM) in Oxford, UK, representing over 140 scientific, technical and medical publishers, has written to ResearchGate about its concerns, citing copyright violation.  The letter asks the website to improve communications educating users about copyright issues.  “The STM hopes that ResearchGate will choose to work with publishers to achieve a long-term sustainable solution that makes sharing of content simple and seamless, but importantly that respects the rights of authors and publishers,” says Michael Mabe, CEO of STM.  Some users, such as neurobiologist Björn Brembs (University of Regensburg, Germany), may be dismayed by the potential loss of free paper sharing, viewing the letter as a “thinly veiled threat”. Jon Tennant, communications director of the professional research network ScienceOpen, says, “The unasked question that this all comes down to is: Do publisher-owned rights matter more than the sharing of research for whatever benefit?” Earlier this year, STM organized a teleconference to discuss efforts toward fostering fair sharing of papers without breaching copyright.  This may mean providing free links to final, read-only, non-downloadable articles. Ultimately, changes in the scientific publishing industry may be necessary to further facilitate the spread of knowledge and progress in research.

(Dalmeet Singh Chawla, Science)

The History of Science

Statues: an insufficient medium for addressing the moral shortcomings of scientific giants

The existence of monuments, place-names and building-names dedicated to individuals who committed extreme acts of racism in the past has come under heavy fire in recent years.  For example, Yale’s Calhoun College, named after a white supremacist who promoted slavery, was renamed earlier this year to Grace Hopper College, in honor of a trailblazing computer scientist.  There have also been numerous efforts to remove Confederate statues, including the Robert E. Lee statue in Charlottesville that sparked a major confrontation between violent white supremacists, who gathered as part of a Unite the Right rally, and liberal activists. David Blight, a professor of history at Yale, told Newsweek that, “This debate needs a big dose of humility […] There’s a danger here that we lose hold of learning from the past just by trying to make it feel and look better.”  He adds that it is hard when a city has a Confederate statue at its center because “that’s a city saying to the world ‘this is the most important part of our past.’”

Science and its monuments are not free from this debate.  Through her research, Harriet A. Washington, a medical ethicist and historian, has unearthed racist atrocities committed by James Marion Sims, the “father of American gynaecology” and founder of the New York Women’s Hospital.  Drawing on direct readings of Sims’ personal medical correspondence, in addition to contemporary scholarship, Washington revealed an extremely tainted past.  For example, Sims performed many fruitless surgeries on enslaved women, obtaining consent only from their masters and forgoing anesthesia, believing that African Americans did not feel pain.  These findings may lead to the dismantling of the many statues dedicated to Sims.  “As the statues and portraits of Sims make clear, art can create beautiful lies […] No scientist, no thinking individual, should be content to accept pretty propaganda,” concludes Washington, in a recent essay published in Nature.

(Harriet A. Washington, Nature)

Have an interesting science policy link?  Share it in the comments!


Written by sciencepolicyforall

September 22, 2017 at 5:46 pm

Science Policy Around the Web – August 1, 2017


By: Sarah L. Hawes, PhD


Climate Science

Conducting Science by Debate?

Earlier this year an editorial by former Department of Energy Under Secretary Steven Koonin suggested a “red team-blue team” debate between climate skeptics and climate scientists. Koonin argued that a sort of tribalism segregates climate scientists while a broken peer-review process favors the mainstream tribe. Science history and climate science experts published a response in the Washington Post reminding readers that “All scientists are inveterate tire kickers and testers of conventional wisdom,” and while “the highest kudos go to those who overturn accepted understanding, and replace it with something that better fits available data,” the overwhelming consensus among climate scientists is that human activities are a major contributor to planetary warming.

Currently, both Environmental Protection Agency Administrator Scott Pruitt and Department of Energy Secretary Rick Perry cite Koonin’s editorial while pushing for debates on climate change. Perry said, “What the American people deserve, I think, is a true, legitimate, peer-reviewed, objective, transparent discussion about CO2.” That sounds good, doesn’t it? However, we already have this: it’s called climate science.

Climate scientists have been forthright with politicians for years. Scientific consensus on the hazards of carbon emissions led to the EPA’s endangerment finding in 2009, which was upheld by EPA review again in 2015. A letter to Congress in 2016 expressed the consensus of over 30 major scientific societies that climate change poses real threats and that human activities are the primary driver, “based on multiple independent lines of evidence and the vast body of peer-reviewed science.”

Kelly Levin of the World Resources Institute criticizes the red team-blue team approach for “giving too much weight to a skeptical minority,” since 97% of actively publishing climate scientists agree that human activities are contributing significantly to recent climatic warming. “Re-inventing the wheel” by continuing the debate needlessly delays crucial remediation. Scientific conclusions and their applications are often politicized, but that does not mean the political processes of holding debates, representing various constituencies, and voting are appropriate methods for arriving at scientific conclusions.

(Julia Marsh, Ecological Society of America Policy News)


Data Sharing, Open Access

Open Access Science – getting FAIR, FASTR

Advances in science, technology and medicine are often published in scientific journals with costly subscription rates, despite originating from publicly funded research. Yet public funding justifies public access. Shared data catalyzes scientific progress. Director of the Harvard Office for Scholarly Communication and of the Harvard Open Access Project, Peter Suber, has been promoting open access since at least 2001. Currently, countries like The Netherlands and Finland are hotly pursuing open access science, and the U.S. is gearing up to do the same.

On July 26th, bipartisan congressional representatives introduced the Fair Access to Science and Technology Research Act (FASTR), intended to enhance the utility and transparency of publicly funded research by making it open access. Within the FASTR Act, Congress finds that the “Federal Government funds basic and applied research with the expectation that new ideas and discoveries that result from the research, if shared and effectively disseminated, will advance science and improve the lives and welfare of people of the United States and around the world,” and that “the United States has a substantial interest in maximizing the impact and utility of the research it funds by enabling a wide range of reuses of the peer-reviewed literature…”; the FASTR Act mandates that findings be made publicly available within six months. A similar memorandum was released under the Obama administration in 2013.

On July 20th, a new National Academies committee finished its first meeting in Washington D.C. by initiating an 18-month study on how best to move toward a default culture of “open science.” The committee is chaired by Alexa McCray of the Center for Biomedical Informatics at Harvard Medical School, and most members are research professors. They define open science as free public access to published research articles, raw data, computer code, algorithms, etc. generated through publicly funded research, “so that the products of this research are findable, accessible, interoperable, and reusable (FAIR), with limited exceptions for privacy, proprietary business claims, and national security.” Committee goals include identifying existing barriers to open science such as discipline-specific cultural norms, professional incentive systems, and infrastructure for data management. The committee will then recommend solutions to facilitate open science.

Getting diverse actors – for instance funders, publishers, scientific societies and research institutions – to adjust current practices to achieve a common goal will certainly require new federal science policy. Because the National Academies committee is composed of active scientists, their final report should serve as an insightful template for federal science agencies to use in drafting new policy in this area. (Alexis Wolfe & Lisa McDonald, American Institute of Physics Science Policy News)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

August 1, 2017 at 7:38 pm

Science Policy Around the Web – June 23, 2017


By: Saurav Seshadri, PhD

Drug Policy

Trump’s New Policy to Tackle Sky-High Drug Prices Makes Sense — Sort Of

Tackling high prescription drug prices was a repeated promise of the Trump campaign. The Trump administration has now taken its first step towards fulfilling this pledge, outlined in a blog post by Food and Drug Administration (FDA) commissioner Scott Gottlieb. The agency will pursue a Drug Competition Action Plan, whose goal will be to eliminate obstacles to the development of cheap generic drugs – particularly those caused by loopholes in existing FDA policies, which are exploited by pharmaceutical companies to extend their patent exclusivity period and maximize profits. An example of such ‘gaming’ the system, cited in the post, is the practice of limiting access to branded products for comparative testing by generic developers. Ultimately, the FDA will work closely with the Federal Trade Commission (FTC) to address such issues, since directly regulating business practices is outside its mandate.

On its face, the FDA’s effort is a step in the right direction. Availability of generics reduces the cost of medications by over half within the first year, and according to a recent Congressional report, manufacturers state that ‘competition…is the primary driver of generic drug prices’. However, it ignores evidence that the real driver of increased drug spending is new, branded medicines, not overpriced generics. In fact, early indications are that Trump’s policies will favor the pharmaceutical companies that produce such medicines, by reducing regulations and apparently abandoning his promise to enable the government to negotiate drug pricing through Medicare. Overall, these actions signal a commitment to promoting free market mechanisms in the pharmaceutical industry; time will tell whether this approach will actually lead to more affordable drugs. (Julia Belluz, Vox)

Cancer

In a Major Shift, Cancer Drugs go ‘Tissue-Agnostic’

With the landmark approval of Keytruda in May, the Food and Drug Administration (FDA) appears to have ushered in a new era of cancer drug development.  So far, cancer treatment and drug evaluation have largely used the tumor’s tissue of origin as a starting point. Keytruda (an immunotherapy drug developed by Merck and approved for melanoma in 2014) marked the first departure from this approach, receiving priority approval to treat any solid tumor containing a mutation in the mismatch repair pathway, regardless of context. Recently released data suggest that another tissue-agnostic cancer therapy is on the way: larotrectinib (a cell growth inhibitor developed by Loxo Oncology) showed high efficacy for any tumor with a certain biomarker (TRK fusion). Several other such drugs, whose indications will be based on tumor genetics rather than location, are in the clinical pipeline.

Although these advances have generated significant excitement in the cancer community, some caveats exist. First, identifying the patients that could benefit from tissue-agnostic treatments will require individual initiative and depend on the cost of screening, particularly when considering markers that are rare for a certain tumor type. A potential solution is suggested by the NCI-MATCH trial, part of the NIH’s Precision Medicine Initiative (PMI) – in it, patients can enroll in one of several parallel clinical trials if a corresponding drug-targeted mutation is found in their tumor’s genome. If these trials prove effective, patients could eventually be regularly matched with a personalized, tissue-agnostic, biologically valid treatment, based on a standardized screen.  Second, researchers caution that tissue-agnostic studies should have a strong scientific rationale and/or breakthrough-level efficacy. Otherwise, such efforts ‘could actually slow drug development if there are differential effects across tumor types by diverting resources from enrolling patients in a predominant population or in the tumor type most likely to respond’.

Despite these concerns, the tissue-agnostic paradigm offers great promise for cancer patients. NIH-funded resources such as The Cancer Genome Atlas could be invaluable to this field moving forward. (Ken Garber, Science)

Scientific Publishing

US Court Grants Elsevier Millions in Damages from Sci-Hub

A New York district court has awarded academic publishing giant Elsevier $15 million in damages from Alexandra Elbakyan, founder of the website Sci-Hub, for copyright infringement. Elbakyan, a 27-year-old neuroscientist turned programmer, started Sci-Hub in 2011 with the goal of ‘remov[ing] all barriers in the way of science’. The site allows users to download research papers that would normally be blocked by a paywall, by obtaining credentials from subscribing institutions and using them to access publisher-run databases like ScienceDirect. Over 60 million papers are posted on Sci-Hub, and users downloaded 28 million articles in 2016.

Elbakyan’s case is reminiscent of Aaron Swartz, another high-profile champion of open access to scientific research. Faced with federal charges related to his hacking of journal archive JSTOR, Swartz tragically committed suicide in 2013. Both Elbakyan and Swartz found publishers’ ability to profit from restricting access to scientific literature, effectively withholding knowledge from anyone outside of a privileged inner circle, as well as the legal protection provided to this system, to be deeply unethical. Their willingness to act upon these convictions has earned each a sizable following in the scientific community.

For their part, publishers claim that fees go towards overhead, and point to significant efforts to expand free and open access programs. While judges have so far been sympathetic, Elsevier’s legal battle has been largely one-sided. Elbakyan has been ignoring rulings requiring her to shut down Sci-Hub since 2015, opting to simply change domains instead, and since she is currently based in Russia and has no American assets, she is unlikely to pay any damages. (Quirin Schiermeier, Nature News)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

June 23, 2017 at 11:00 am

Scientific Activism: Voting to Speed Up Discovery with Preprint Publishing


By: Thaddeus Davenport, PhD


The election of Donald Trump to the Oval Office and the early actions of his administration have sparked a wave of protests in support of women’s rights and immigration, among other issues. Like other citizens, scientists have some cause to be concerned about the administration’s early actions that reveal a general disregard for facts and scientific evidence. In response, organizers have planned the March for Science for this Saturday, April 22nd, as an opportunity for people to gather in cities around the world to voice their support for factual information and scientific research. And while it is important to denounce the actions of the Trump administration that are harmful to science and health, it may be even more critical to acknowledge the underlying partisan divisions that created a niche for his rhetoric and to begin the difficult work of bridging the divide. For example, a Pew Research Center poll from 2015 indicates that 89% of liberal Democrats believe government investment in basic science pays off in the long-run, while only 61% of conservative Republicans feel the same way. Additionally, American adults with less knowledge of scientific topics are more likely to believe that government funding of basic science does not pay off. This suggests that improved science education and outreach will be important in building public support for scientific research. However, scientists often lead very busy lives and have little time outside of their professional activities to devote to valuable pursuits like science outreach. How, then, might scientists work towards building a better relationship with the public?

The products of science – knowledge, medicines, technology – are the clearest evidence of the value of research, and they are the best arguments for continued research funding. Efficiency in science is good not only for scientists hoping to make a name for themselves, but also for the public, who, as the primary beneficiaries of academic research, must benefit from the products of that research. If taxpayers’ demand for scientific inquiry dissipates because of a perceived poor return on their investment, then the government, which supposedly represents these taxpayers, will limit its investment in science. Therefore, in addition to communicating science more clearly to the public, scientists and funding agencies should ensure that science is working efficiently and working for the public.

Information is the primary output of research, and it is arguably the most essential input for innovation. Not all research will lead to a new product that benefits the public, but most research will yield a publication that may be useful to other scientists. Science journals play a critical role in coordinating peer review and disseminating new research findings, and as the primary gatekeepers to this information, they are in the difficult position of balancing accessibility to the content of their journals with the viability of their business. This position deserves some sympathy in the case of journals published by scientific societies, which are typically non-profit organizations that perform valuable functions including scientific outreach, education and lobbying. However, for-profit journals are less justified in making a significant profit out of restricting access to information that was, in most cases, obtained through publicly-funded research.

Restricting access to information gathered in the course of research risks obscuring the value of research to a public that is already skeptical about investing in basic science, and it slows down and increases the cost of innovation. In light of this, there is growing pressure on publishers to provide options for open-access publishing. In 2008, the National Institutes of Health adopted a public access policy, which requires that “investigators funded by the NIH submit or have submitted for them to the National Library of Medicine’s PubMed Central an electronic version of their final, peer-reviewed manuscripts upon acceptance for publication, to be made publicly available no later than 12 months after the official date of publication: Provided, that the NIH shall implement the public access policy in a manner consistent with copyright law.” This policy was extended through an executive order from the Obama Administration in 2013 to include all federal agencies with research budgets greater than $100 million, with additional requirements to improve accessibility.

These requirements are changing scientific publishing and will improve access to information, but they remain limited relative to the demand for access, as evidenced by the existence of paper pirating websites, and the success of open access journals like PLoS and eLife.  Additionally, other funding agencies like the Bill and Melinda Gates Foundation and the Wellcome Trust have imposed even more stringent requirements for open access. Indeed, researchers will find a spectrum of open-access policies among the available journals, with the most rapid access to information allowed by so-called ‘preprint’ publishers like biorxiv.org. Given that many research manuscripts require months or years of revision and re-revision during submission to (usually multiple) journals, preprint servers accelerate the dissemination of information that is potentially valuable for innovation, by allowing researchers to post manuscripts prior to acceptance in a peer-reviewed journal. Many journals have now adopted explicit policies for handling manuscripts that have been previously submitted to bioRxiv, with many of them treating these manuscripts favorably.

Given that most journals accept manuscripts that have been previously published on bioRxiv, and some journals even look to bioRxiv for content, there is little incentive to submit to journals without also submitting to bioRxiv. If the goal is, as stated above, to improve the transparency and the efficiency of research in order to make science work for the public, then scientists should take every opportunity to make their data as accessible as possible, and as quickly as possible. Similarly, funding agencies should continue to push for increased access by validating preprint publications as acceptable evidence of productivity in progress reports and grant applications, and incentivizing grant recipients to simultaneously submit manuscripts to preprint servers and peer-reviewed journals. Scientists have many options when they publish, and by voting for good open-access practices with their manuscripts, they have the opportunity to guide the direction of the future of scientific publishing. These small, but important, actions may improve the vitality of research and increase the rate at which discoveries tangibly benefit taxpayers, and, in combination with science outreach and education, may ultimately strengthen the relationship between scientists and the public.

March for Science this Saturday, if it feels like the right thing to do, and then strive to make science work better for everyone by sharing the fruits of research.

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

April 20, 2017 at 11:44 am

Science Policy Around the Web – February 17, 2017


By: Thaddeus Davenport, PhD


CRISPR

Decision in the CRISPR-Cas9 Patent Dispute

This week, Heidi Ledford from Nature News reported that the United States Patent and Trademark Office (USPTO) made a decision on the disputed patents for the gene editing technology known as CRISPR-Cas9 in favor of the Broad Institute of MIT and Harvard. The CRISPR-Cas9 system has been widely publicized, and this publicity is arguably not out of proportion with the potential of this technology to simplify and accelerate the manipulation of DNA of both microbial (prokaryotic) and higher order (eukaryotic) cells for research and therapy. A simplified, programmable version of CRISPR-Cas9 for use in gene editing was initially described by Charpentier and Doudna, and it was rapidly translated for use in eukaryotic cells by Zhang and colleagues at the Broad Institute in parallel with Doudna, Charpentier, and others.

The USPTO decision follows a dramatic and ongoing dispute over whether the patent application submitted by the University of California on behalf of Doudna and Charpentier – which was submitted before that of the Broad Institute, and described the technology in broad terms as a method of cutting desired DNA sequences – was sufficient to protect the CRISPR-Cas9 intellectual property when the Broad Institute later filed a fast-tracked patent application describing the use of CRISPR-Cas9 for use in eukaryotic cells. Because the Broad Institute’s application was expedited, it was approved before the University of California’s application. In January of 2016, the University of California filed for an ‘interference’ proceeding, with the goal of demonstrating to the USPTO that Doudna and colleagues were the first to invent CRISPR-Cas9, and that the patent application from the Broad Institute was an ‘ordinary’ extension of the technology described in the University of California application.

On February 15th of this year, the USPTO ruled that the technology described in the Broad Institute’s application was distinct from that of the University of California’s. The importance of this decision is that the patents granted to the Broad Institute for the use of CRISPR-Cas9 in mammalian cells will be upheld for now. It also creates some complexity for companies seeking to license CRISPR-Cas9 technology. Because of the overlapping content of the CRISPR-Cas9 patents held by the University of California and the Broad Institute, it is possible that companies may need to license the technology from both institutions. The University of California may still appeal the USPTO’s decision, but this is a significant victory for the Broad Institute for the time being. For many scientists, this dispute is a dramatic introduction to the inner workings of the patent application process. We would do well to familiarize ourselves with this system and ensure that it works effectively to accurately reward the discoveries of our fellow scientists and to facilitate the transfer of technology to those who need it most, without imposing undue economic burden on companies and consumers. (Heidi Ledford, Nature News)

Scientific Publishing

Open Access to Gates Foundation Funded Research

Also this week, Dalmeet Singh Chawla reported for ScienceInsider that the Bill and Melinda Gates Foundation had reached an agreement with the American Association for the Advancement of Science (AAAS) that will allow researchers funded by the Gates Foundation to publish their research in the AAAS journals Science, Science Translational Medicine, Science Signaling, Science Immunology, and Science Robotics. This agreement follows an announcement in January in which the Gates Foundation decided that research funded by the foundation would no longer be allowed to be published in subscription journals including Nature, Science, and New England Journal of Medicine, among others, because these journals do not meet the open access requirements stipulated by the new Gates open-access policies. The new Gates Foundation policy requires its grant recipients to publish in free, open-access journals and to make data freely available immediately after publication for both commercial and non-commercial uses. A similar policy is being considered by the nascent Chan Zuckerberg Initiative.

In the agreement with AAAS, the Gates Foundation will pay the association $100,000 in order to make Gates-funded published content immediately freely available online. Convincing a journal as prominent as Science to make some of its content open-access is a step in the right direction, but it is perhaps more important as a symbol of a changing attitude toward publishing companies. Michael Eisen, co-founder of the Public Library of Science (PLoS) open-access journals, was interviewed for the ScienceInsider article and noted, “[t]he future is with immediate publication and post-publication peer review, and the sooner we get there the better.” This sentiment seems to be increasingly shared by researchers frustrated with the hegemony of the top-tier journals, their power over researchers’ careers, and the constraints that subscription-based journals impose on the spread of new information. Funding agencies including the Gates Foundation, Howard Hughes Medical Institute, and the National Institutes of Health are in a unique position to be able to dictate where the research they fund may be published. A collective decision by these agencies to push the publishing market towards an improved distribution of knowledge – through open-access publishing and post-publication peer review – and away from the historical and totally imagined importance of validation through high-tier journal publication would enrich the scientific ecosystem and accelerate innovation. In this regard, the efforts by the Gates Foundation are laudable and should be extended further. (Dalmeet Singh Chawla, ScienceInsider)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

February 17, 2017 at 12:44 pm

Science Policy Around the Web – May 10, 2016


By: David Pagliaccio, Ph.D.


Scientific Publishing

Who’s downloading pirated papers? Everyone

Sci-Hub is an online repository for millions of scientific and academic articles, which has sparked major controversy among the scientific and publishing communities. The site, launched in 2011 by Alexandra Elbakyan, a graduate student in Kazakhstan, provides free access to ‘pirated’ articles. These articles would otherwise be accessible only through personal or institutional journal subscriptions or by purchasing individual articles, which can often cost around $30 each. A recent analysis of Sci-Hub’s 28+ million download requests from September 2015 through February 2016 found that requests came from over 3 million different IP addresses (potentially many more individual users, as those sharing a university network will often share an IP address). These download requests came from all over the world and across all scientific fields, and download rates reached more than 200,000 per day. An opinion survey regarding Sci-Hub found that at least half of users download articles from Sci-Hub because they do not otherwise have access to them at all. Interestingly, many others use Sci-Hub purely out of convenience when they would still have access through their institution. Many respondents also use Sci-Hub in objection to the profits publishers make off of academics, and feel that efforts like Sci-Hub have the power to disrupt the status quo of science publication. That said, Elsevier, one of the largest publishers affected by Sci-Hub, launched a lawsuit against Elbakyan last year for infringing on its legal rights as a copyright holder. Despite having its domain seized during the lawsuit, Sci-Hub remains largely beyond the reach of the U.S. legal system because it is based in Russia. This is a still-evolving situation and debate, which may have large effects on the state of scientific publishing, particularly given the major support from much of the scientific community. (John Bohannon, Science News)

Mental Health

New Study Shows Mental Health Diagnoses and Treatment Vary Significantly by Race and Ethnicity

The Department of Research and Evaluation at Kaiser Permanente published results of a large study in the journal Psychiatric Services on the diagnosis and treatment of mental health conditions. The study included data from the electronic health records of 7.5 million adult patients across 11 private, not-for-profit health care systems participating in the Mental Health Research Network. The results indicated that 15.6% (1.17 million) of these patients received a mental health diagnosis in 2011. This rate varied by race and ethnicity, from 7.5% among Asian patients to 20.6% among Native American/Alaskan Native patients, and most groups had lower diagnosis rates than non-Hispanic white patients. Importantly, regardless of race and ethnicity, patients with a diagnosed mental health condition were much more likely to receive psychiatric medications (73%) than formal psychotherapy (34%). While the study does not point to any specific causative factors, the findings indicate a need to evaluate the causes and effects of racial and ethnic differences in the diagnosis and treatment of mental health conditions, as well as the large discrepancy between treatment by medication and treatment by therapy. (PR Newswire)

Child Development Policies

Bringing Brain Science to Early Childhood

Researchers at Harvard’s Center on the Developing Child are pushing for better use of developmental psychology and neuroscience research in the creation and implementation of policy on early-childhood programs. In particular, they critique the incentives in the current policy system and call for research and development on the early-childhood programs most effective at stemming intergenerational poverty. Programs for child development, they argue, should be grounded in the rapidly evolving scientific knowledge base and should be allowed to adapt as we learn and understand more. Work in this area has shown lifelong consequences of early childhood stress, as well as lifelong mental and physical health benefits of early positive parenting. The Center has already begun pilot programs in Washington state aimed at improving executive function and self-control among parents and children, and at improving parental engagement; these pilots allow new interventions to be tested and refined based on the data collected. By contrast, many past interventions were enacted at large scale without adequate follow-up testing or methods for improvement based on outcomes. For example, the researchers cite the Head Start program, which aims to help young disadvantaged infants and children but has not utilized its infrastructure to evaluate the effectiveness of its various programs or to identify which programs benefit which types of individuals most. As research suggests, intervening early in development can be incredibly impactful, so we should be capitalizing on our scientific understanding to implement the most evidence-based programs and utilizing outcomes data to continually improve them. (Emily Deruy, The Atlantic)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

May 10, 2016 at 9:00 am

Science Policy Around the Web – April 24, 2015


By: Danielle Friend, PhD

photo credit: DSC03602.JPG via photopin (license)

Genetic-based Drug Discovery

23andMe will invent drugs using customer data

As of March 2015, 23andMe will no longer be known simply for direct-to-consumer genetic tests: the company has now made progress toward its long-term goal of influencing drug discovery. 23andMe claims to have collected DNA from approximately 850,000 consumers through marketing of its $99 kit, and it plans to use this genetic information to identify new drug targets. Additionally, 23andMe reports that approximately 80% of consumers who purchase the kits have agreed to let the company use their genetic information for this research. To help lead these discovery efforts, 23andMe recently hired Richard Scheller, who formerly led research and development at Genentech, as chief scientific director and head of operations. In addition to these in-house efforts, 23andMe has also recently formed partnerships with pharmaceutical companies, including Pfizer and Genentech, which plan to use the genetic information to develop drugs for diseases such as Parkinson’s disease. Although these partnerships are clearly defined around identifying drug targets for particular diseases, 23andMe plans to organize its in-house research as a broad sweep through its databases without a particular disease in mind. However, the company has noted a particular interest in metabolic and immune system disorders, eye disease, and cancer. (Matthew Herper, Forbes; Ron Winslow, Wall Street Journal)

Transparency in Clinical Trial Data

World Health Organization calls for increased transparency in clinical trials

In mid-April, the World Health Organization (WHO) released a statement recommending that findings from all clinical trials be made public regardless of the results of the study. Dr. Marie-Paule Kieny, the assistant director-general for health systems and innovation with the WHO, stated that the goal of this new mandate is to “…promote the sharing of scientific knowledge in order to advance public health.” Dr. Kieny also stated that “failure to publicly disclose trial results engenders misinformation, leading to skewed priorities for both [research and development] and public health interventions,” and that “it creates indirect costs for public and private entities, including patients themselves, who pay for sub-optimal or harmful treatments.” Several factors may come between completed research and the publication of results; however, unpublished results (even if negative) can lead to the perception that treatments are more or less effective than they actually are. The WHO mandate requires that results from clinical studies be submitted to peer-reviewed journals within 1 year of the completion of data collection, and that the work be published within 24 months in an open access journal. The WHO also asks that “key outcomes” — limited details of the study including the number of participants, main findings, and adverse events — be made available online within a year of study completion. Although these new requirements are a step in the right direction for clinical trial transparency, it remains unclear how the WHO plans to enforce them. (Chris Woolston, Nature Research Highlights; Martin Enserink, Science Insider; The World Health Organization)

Ebola Clinical Trials

Lack of patients hampers Ebola drug and vaccine testing

As attention on the Ebola outbreak in Africa has increased, more resources and medical assistance have been provided. Although the number of Ebola cases has significantly decreased thanks to these interventions, an unexpected and troubling scenario has developed: Ebola vaccine clinical trials are now having trouble testing the efficacy of their vaccines for lack of patients. Indeed, Chimerix, a company running a trial of its antiviral drug brincidofovir, has decided to end the trial altogether because it cannot enroll enough patients. The World Health Organization’s weekly report from April 19 states that new cases of Ebola are down to a total of 33. Because of this dramatic decrease, the public health community faces ethical questions about whether the most promising drugs should be prioritized and given preferential access to patients and geographical regions. (Andrew Pollack, The New York Times; Richard Harris, National Public Radio; The World Health Organization; Kai Kupferschmidt, Science)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

April 24, 2015 at 9:00 am