Science Policy For All

Because science policy affects everyone.


Science Policy Around the Web – November 17, 2017


By: Janani Prabhakar, PhD 


source: pixabay

Public Health

The ‘Horrifying’ Consequence of Lead Poisoning

Changes in water treatment practices in Flint, Michigan in 2014 introduced high levels of lead into the water supply, culminating in a state of emergency in January 2016. The contamination affected over 10,000 residents, who were forced to avoid the city’s water supply until at least 2020. Because state officials may have known about the lead contamination for months before it became public, several of these officials are now facing criminal charges. This negligence is particularly troubling given recent evidence of persisting effects of lead contamination on the health of Flint residents. In a working paper, Daniel Grossman at West Virginia University and David Slusky at the University of Kansas compared fertility rates in Flint before and after the change in water treatment practices that led to the crisis, and compared post-change fertility rates in Flint to those of unaffected Michigan towns. They found that Flint’s fertility rate declined by 12 percent while its fetal death rate increased by 58 percent, changes that mirror those seen in other cities after similar incidents of lead contamination in the water supply. Furthermore, given that the number of Flint children with elevated blood lead levels doubled after the change in treatment practices, the long-term cognitive, behavioral, and social effects of the contamination are only beginning to be examined and understood.

The circumstances in Flint are an example of how misplaced priorities in high-level policy decisions can harm local communities, particularly low-income black neighborhoods. Black neighborhoods are disproportionately affected by lead contamination, yet insufficient attention, along with the false suggestion by industry leaders and policy makers that affected individuals were to blame, has deterred progress on critical issues in at-risk and underserved communities.
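The Grossman–Slusky comparison, Flint before versus after the water switch relative to unaffected Michigan towns over the same period, is a difference-in-differences design. A minimal sketch of that logic, using invented placeholder rates rather than the paper’s actual data:

```python
# Difference-in-differences sketch of the Flint fertility comparison.
# All rates below are hypothetical placeholders, not the paper's figures.

rates = {
    ("flint", "before"): 62.0,    # births per 1,000 women, pre-switch
    ("flint", "after"): 55.0,     # post-switch
    ("control", "before"): 60.0,  # unaffected Michigan towns, pre-switch
    ("control", "after"): 59.5,   # post-switch
}

def diff_in_diff(r):
    """Change in Flint minus change in control towns: the control change
    nets out statewide trends that have nothing to do with the water."""
    flint_change = r[("flint", "after")] - r[("flint", "before")]
    control_change = r[("control", "after")] - r[("control", "before")]
    return flint_change - control_change

effect = diff_in_diff(rates)
print(f"Estimated effect on fertility rate: {effect:+.1f} per 1,000")
```

Subtracting the control towns’ change is what lets the authors attribute the remaining decline to the water crisis rather than to trends affecting all of Michigan.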

(Olga Khazan, The Atlantic)

Climate Change

Why China Wants to Lead on Climate, but Clings to Coal (for Now)

Home to 1.4 billion people, China is one of the world’s largest coal producers and carbon polluters. Yet it aims to spearhead the international agreement to address climate change, and despite this contradiction it is already on track to meet its commitment under the Paris climate accord. The move away from coal is a necessity for China because of internal pressure to curb air pollution. But according to NRDC climate and energy policy director Alvin Lin, given the country’s size and population, phasing out coal will be a long process with many ups and downs. For instance, while China has shown progress in meeting its commitments, a recent report projects higher emissions this year, which may reflect an uptick in economic growth and a shortfall in the rainfall needed to power hydroelectric dams. Lin portrays this uptick as an anomaly, but competing interests within the Chinese government make the future unclear: even as China builds coal plants abroad to increase its presence, it is also the world’s largest producer of electric cars. President Xi Jinping has derided the United States for being isolationist and reneging on the Paris climate accord, but how his government plans to hold up its end of the deal has not been revealed. An important caveat is that even if every country achieves its individual Paris pledge, the planet will still heat up by 3 degrees Celsius or more. Given that an increase of this size would have catastrophic effects on the climate, adherence to the Paris pledges is only a baseline for the action needed to combat global warming.

(Somini Sengupta, The New York Times)

Have an interesting science policy link?  Share it in the comments!


Written by sciencepolicyforall

November 17, 2017 at 4:51 pm

Science Policy Around the Web – November 14, 2017


By: Saurav Seshadri, PhD


source: pixabay

Alzheimer’s Disease

Bill Gates sets his sights on neurodegeneration

Microsoft founder Bill Gates has announced a new funding initiative for research into Alzheimer’s Disease, starting with a personal donation of $50 million to the Dementia Discovery Fund (DDF).  The DDF is a UK-based, public-private collaboration, launched in 2015 and designed to encourage innovative research into treatments for dementia, of which Alzheimer’s Disease is a leading cause.  Initial investment in the DDF, which came from the pharmaceutical industry and government entities, was $100 million, meaning Gates’ contribution will be significant.  Gates says his family history makes him particularly interested in finding a cure for Alzheimer’s Disease.  The DDF has already taken steps in this direction: its first investment was in the biopharma company Alector, which is moving forward with immune system-related research to combat Alzheimer’s Disease.

Gates is already famous for his philanthropy through the Bill and Melinda Gates Foundation, which funds efforts to fight poverty and disease throughout the world.  However, the Foundation has traditionally focused on infectious diseases, such as HIV and malaria, making Alzheimer’s Disease Gates’ first foray into neuroscience.  In this regard, he has some catching up to do to match the philanthropic contributions and business pursuits of other tech billionaires.  These include his Microsoft co-founder Paul Allen, who started the Allen Institute for Brain Science with $100 million in 2003.  The Allen Institute provides a range of tools for basic researchers, generating comprehensive maps of brain anatomy, connectivity, and gene expression in mouse models.  More recently, Tesla founder Elon Musk started Neuralink, a venture that aims to enhance cognitive ability using brain-machine interfaces.  Kernel, founded by tech entrepreneur Bryan Johnson, has a similar goal.  Finally, while the Chan Zuckerberg Initiative (started by Facebook CEO Mark Zuckerberg in 2015) doesn’t explicitly focus on neuroscience, its science program is led by acclaimed neuroscientist Cori Bargmann.

As pointed out by former National Institute of Mental Health Director Tom Insel, this infusion of money, along with the fast-moving, results-oriented tech mindset behind it, has the potential to transform neuroscience and deliver better outcomes for patients.  As government funding for science grows increasingly uncertain, such interest and support from private investors is encouraging.  Hopefully the results will justify their optimism.

(Sanjay Gupta, CNN)



Particle Physics

Elusive particles create a black hole for funding

The Large Hadron Collider (LHC) enabled a scientific breakthrough in 2012 when it was used to produce evidence for the Higgs boson, a physical particle that endows matter with mass.  In the wake of the worldwide excitement generated by that discovery, physicists finalized plans for a complementary research facility, the International Linear Collider (ILC), to be built in Japan.  While the LHC is circular and collides protons, the ILC would collide positrons and electrons, at lower energy but with more precise results.  Unfortunately, anticipated funding for the $10 billion project from the Japanese government has failed to materialize.  Following recent recommendations by Japanese physicists, the group overseeing the ILC has now agreed on a less ambitious proposal, for a lower energy machine with a shorter tunnel.  Though physicists remain optimistic that the ILC will still provide useful data, it will no longer be able to produce high-energy quarks (one of its planned uses), and will instead focus on previously detected particles and forces.  The ILC’s future is currently in limbo until the Japanese government makes a concrete financial commitment, and it is unlikely to be completed before 2030.

After the Higgs boson, the LHC struggled to find proof of the existence of other new particles.  One such high-profile disappointment was the search for dark matter.  When dark matter was hypothesized to be the source of unexplained gamma radiation observed with NASA’s Fermi Space Telescope, the search for a dark matter particle became a top priority for the LHC’s second run.  Such evidence would also have supported supersymmetry, a key theory in particle physics.  However, these efforts, as well as multiple others using different detectors, have thus far failed to find any signs of dark matter.  These unsuccessful experiments certainly contributed to scaling back the ILC, and illustrate larger problems with setting realistic expectations and/or valuing negative results among scientists, government officials, and the public.  As a result, in order to advance our understanding of the basic building blocks of our universe, particle physicists will now have to do more with less.

(Edwin Cartlidge, Nature News)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

November 14, 2017 at 5:40 pm

Science Policy Around the Web – November 10, 2017


By: Vanessa L.Z. Gordon-Dseagu, PhD


Source: Pexels


House Passes CHIP Funding, Sends Bill to Uncertain Future in Senate

The Children’s Health Insurance Program (CHIP) extends healthcare coverage to children living in low-income households. Funded jointly by the federal government and the states, the program currently covers around 8.9 million children and 370,000 pregnant women. Since its creation in the 1990s, the program has helped reduce the share of uninsured children from 13.9% in 1997 to 5.3% at the start of 2017. Federal funding for CHIP lapsed on October 1, 2017, although the majority of states have enough money to keep the program running into 2018.

On Friday, November 3rd, the House passed (by a 242-174 vote) the Healthy Kids Act to extend funding for CHIP for the next five years. Although passed, the act is not without contention: the majority of Democrats voted against the bill over concerns about how the program would be funded. The act aims to fund CHIP by increasing premiums for older, wealthier individuals on Medicare, narrowing the grace period for those covered under the Affordable Care Act (ACA) to pay their premiums, and drawing money from the ACA’s Prevention and Public Health Fund, the first fund of its kind mandated to focus on prevention and improvements in public health.

As reported in a number of sources, the issue of funding is likely to delay the bill once it reaches the Senate, with Democrats and Republicans accusing each other of delay tactics. Rep. Frank Pallone (D-N.J.) stated that the bill “will go to the Senate, and it will sit there,” while Rep. Greg Walden (R-Ore.) accused the Democrats of “Delay, delay, delay.” A number of commentators suggest that CHIP funding will be rolled into an end-of-year budget bill intended to prevent a government shutdown.

(Laura Joszt, The American Journal of Managed Care)


Maine Voters Approve Medicaid Expansion, a Rebuke of Gov. LePage

The ballot question was a relatively simple one:

“Do you want Maine to expand Medicaid to provide healthcare coverage for qualified adults under age 65 with incomes at or below 138% of the federal poverty level, which in 2017 means $16,643 for a single person and $22,412 for a family of two?”

And on November 7th, in the first successful vote of its kind, residents of Maine demonstrated overwhelming support for the policy, with 59% voting yes. The measure expands Medicaid under the umbrella of the Affordable Care Act; an estimated 70,000 residents who are currently uninsured and earn at or below 138% of the federal poverty line would receive coverage. Thirty-one states already have similar programs in place.

A key stumbling block to implementing the program is strong opposition from Maine’s Governor Paul LePage. The governor, who had previously vetoed health insurance expansion bills proposed by the state legislature on five occasions, responded to the vote in a statement on November 8th calling it “fiscally irresponsible” and “ruinous to Maine’s budget.” In defense of the expansion, Anne Woloson of Maine Equal Justice Partners commented that, with 90% of the funding coming from the federal government, “This will improve the Maine economy. It’s going to create good-paying jobs.”

LePage also stated that his administration was not prepared to expand Medicaid unless the state found a way to pay for it at the levels calculated by the Department of Health and Human Services, but the ability of the governor to veto the implementation of such legislation is limited and it is likely to become law in 2018.

Following the ballot’s success, and with increasing support for Medicaid expansion nationally, advocates in Utah and Idaho are planning similar votes in their own states.

(Abby Goodnough, The New York Times)


Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

November 10, 2017 at 6:55 pm

Science Policy Around the Web – November 7, 2017


By: Rachel F Smallwood Shoukry, PhD 


source: pixabay

Science and Society

What’s Your (Epistemic) Relationship To Science?

A recently published paper presented the results of a meta-analysis of studies published in the journal Public Understanding of Science. The goal of the paper was to determine whether these studies correctly distinguished scientific knowledge from scientific understanding. While the exact definitions are a topic of debate in the field of epistemology, understanding of a concept is generally considered to require a deeper level of comprehension than knowledge of it: a level of comprehension that allows one to draw inferences and make judgments. For the purposes of their paper, the authors set a specific definition of understanding, which included the ability to grasp “how a constellation of facts relevant to that subject are related to one another.” They found 67 papers that used the term “understanding” or a similar one, and compared those papers’ definitions and descriptions of understanding with their own. Only one paper defined understanding in line with their definition, and only six used it consistently without explicitly defining it. Forty-seven papers were unclear about their definition of understanding, two used alternate definitions, and 11 conflated knowledge and understanding.

In a follow-up analysis, the authors examined papers that attempted to measure an “epistemic state,” to determine whether the evaluation correctly targeted the specified definition of understanding. Of the 13 papers they found, only one correctly measured understanding, while two others attempted to do so. Those two attempts, and the other ten papers, measured an epistemic state closer to knowledge than to understanding.

The authors expressed concern over these findings and recommended that scientists and educators think carefully about the difference between knowledge and understanding, and be more deliberate in targeting understanding. The United States has consistently ranked behind many countries in science and mathematics, but international rankings are not the only reason for the public to pursue understanding over knowledge. Voters elect officials who determine policy, priorities, and funding for scientific concerns and research. Additionally, social media plays an increasingly prominent role in the public’s consumption of science news and information, and a public that better understands science can only help in shaping society’s response to scientific discoveries and to recommendations for health, environmental, technological, and social advancement.

(Tania Lombrozo, NPR)


Public Health

US government approves ‘killer’ mosquitoes to fight disease

The US Environmental Protection Agency (EPA) has approved a plan by the start-up company MosquitoMate to release lab-bred, non-biting mosquitoes infected with the bacterium Wolbachia pipientis in order to infect wild mosquito populations. The goal is to decrease the Asian tiger mosquito population: the infected males produce nonviable progeny when they mate with wild females. The Asian tiger mosquito can transmit diseases such as dengue, yellow fever, West Nile virus, and Zika. The EPA has approved the release of these mosquitoes in 20 states plus Washington, DC, stating that the approved states have climates similar to those of the regions where MosquitoMate performed its test trials.

The biggest obstacle MosquitoMate currently faces is production time: males must be separated from females, and the company separates them both by hand and mechanically. Improvement is possible, however, as demonstrated by a research team from Michigan State University and Sun Yat-sen University in China that releases millions of mosquitoes infected with the same bacterium in China every week, separating the males and females mechanically with 99% accuracy.

Another company using lab-altered mosquitoes to fight wild mosquitoes, Oxitec, has met resistance in the US. Although its mosquitoes are widely used in Brazil, voters in Florida were wary of releasing the genetically modified insects into the wild; people seem to fear the unanticipated consequences of having a large population of GM organisms on the loose. MosquitoMate appears to have avoided much of this controversy because its mosquitoes are infected with a bacterium common in insects rather than genetically engineered. There has been little attention to MosquitoMate’s activities in trial areas, and most of the feedback has been positive. The company will begin by selling its mosquitoes locally, expanding to nearby areas and beyond as it is able to boost production.

(Emily Waltz, Nature News)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

November 7, 2017 at 5:21 pm

Science Policy Around the Web – November 3, 2017


By: Liu-Ya Tang, PhD


source: pixabay

Cancer Research

Genomic studies track early hints of cancer

Does getting cancer mean the end of the world? Not if it is found and treated early. Early detection of cancer greatly increases the chances of successful treatment and survival, and researchers have been striving to develop new diagnostic approaches because effective early-detection screening tests do not exist for many types of cancer.

Liquid biopsy is one such technique developed in recent years. Unlike traditional biopsies, which are usually based on tissue samples, a liquid biopsy is a test done on blood, urine, or sputum. Blood-based biopsies, for example, detect circulating tumor cells, DNA, or other substances released by tumor cells. One prominent advantage of liquid biopsy over traditional biopsy is that it is less invasive. In addition to diagnosing cancer, liquid biopsy can monitor a patient’s response to a treatment plan. After several years of research at the bench, liquid biopsy is now being applied to diagnose and treat cancer patients.

Liquid biopsy can detect cancer at an early stage, but some researchers are aiming to detect cancer before its onset. Two recently funded genomic projects use DNA sequencing techniques similar to those used in liquid biopsy, but apply the technology to get one step ahead of cancer. To understand the molecular events that trigger benign tumors to become malignant, researchers want to create a “pre-cancer genome atlas” by sequencing DNA from precancerous tissues. The approach will be applied to lung, breast, prostate, and pancreatic cancer. It is particularly important for pancreatic cancer, which has no early detection method and consequently high mortality. Moreover, many pancreatic tumors seem to be driven by mutations in the same genes, which could make the disease predictable if known mutations can be detected at the precancerous stage.

Under the National Cancer Moonshot Initiative, cancer prevention and early detection are considered as important as treatment in fighting cancer. If researchers can successfully identify the early hints of cancer development, cancer could become more predictable or preventable. Avrum Spira at Boston University in Massachusetts, a leader of both projects, believes his work “could herald a change in how researchers approach cancer prevention.”

(Heidi Ledford, Nature News)


The Scientific Workforce

NIH-Funded Network to Foster Diversity: Achievements and Challenges

In October 2014, the National Institutes of Health launched a $250 million diversity initiative, which included a 5-year, $22 million grant to support the National Research Mentoring Network (NRMN). The purpose of NRMN is to increase workforce diversity by bringing more members of historically underrepresented populations, including blacks, Hispanics, and Native Americans, into biomedical research. The mentoring network covers the full spectrum of the research community, from undergraduates to senior faculty members, and the mentoring itself extends beyond training in grant writing and applications to many other aspects of professional development.

One advantage of NRMN is its expansion of the traditional model of mentoring, in which young scientists learn the profession through contact with their supervisors. Mentees can suffer under that model if good mentoring doesn’t come naturally to their mentors or if the extent and quality of mentoring are not tracked, and mentoring needs are particularly high at less research-intensive institutions, where potential mentors may be scarce. To overcome these problems, NRMN provides a mentoring resource that is especially important for minority students with less access to high-quality mentoring. After signing up online, a mentee is matched by the NRMN system based on his or her interests and professional goals. As of June 30, NRMN had 3,713 mentees and 1,714 mentors, up from only 37 mentees and 16 mentors in its first year. Demographically, blacks and Hispanics make up 57% of the mentee group, compared to only 11% of Bachelor of Science holders nationwide. NRMN has featured success stories from Rachel Ezieme, the daughter of Nigerian immigrants, and Crystal Lee, a Native American from the Navajo tribe.
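The article does not describe how NRMN’s matching system works internally; one common way to match on shared interests is to score the overlap between interest sets. A toy sketch under that assumption, with invented mentor profiles:

```python
# Toy interest-overlap matching (not NRMN's actual algorithm).
# Mentor names and interest lists are invented for illustration.

def jaccard(a, b):
    """Overlap between two interest sets: |intersection| / |union|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

MENTORS = {
    "Dr. A": ["genomics", "grant writing"],
    "Dr. B": ["neuroscience", "career development"],
}

def best_match(mentee_interests):
    """Return the mentor whose interests overlap most with the mentee's."""
    return max(MENTORS, key=lambda m: jaccard(MENTORS[m], mentee_interests))

print(best_match(["neuroscience", "grant writing", "career development"]))
```

A real system would also weight professional goals, career stage, and availability, but the core idea of scoring candidate pairs and taking the best is the same.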

Despite these successes, challenges remain. One big challenge is recruiting more mentors: because good mentoring is not directly tied to professional achievement, senior faculty members may not make the extra effort to seek out mentoring opportunities. Dr. Hannah Valantine, the National Institutes of Health’s chief officer for scientific workforce diversity, commented that “the burden is enormous.” The NRMN data bear this out: 58% of the network’s mentors are women, even though fewer than 40% of tenured faculty positions in the United States are held by women. One solution, suggested by Dr. Karen Winkfield at Wake Forest University in Winston-Salem, North Carolina, would be to “tie NIH funding to a university’s commitment to mentoring.” A second challenge is evaluating the program. Dr. Keith Norris, who leads an official assessment team for NRMN, is trying to identify what constitutes good mentoring and to determine which elements of NRMN benefit professional development. He notes that there is a large degree of variation in how mentees and mentors interact, and that mentoring relationships can be very brief or can last for years. To address this, Dr. Norris plans to apply “high-touch interventions” and hopes that the results will generalize to the entire population.

(Jeffrey Mervis, ScienceInsider)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

November 3, 2017 at 6:44 pm

Science Policy Around the Web – October 31, 2017


By: Michael S. Tennekoon, PhD


source: pixabay

Forensic DNA testing

Using DNA to Sketch What Victims Look Like; Some Call it Science Fiction

“CSI on Steroids.” That is how the latest forensic tool used by law enforcement agencies, called phenotyping, has been described. But is it all that it promises to be?

Phenotyping is a technique that uses human DNA, from materials such as skin or blood, to predict an individual’s appearance. By analyzing the genetic sequence, scientists can look for genes that code for physical characteristics such as skin color, eye color, geographic ancestry, and many others. This information is then plugged into a computer algorithm to predict an individual’s appearance. Today, dozens of law enforcement agencies from New York to Louisiana use phenotyping for cases where traditional forensics has resulted in no leads.
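As a rough illustration of the idea (not Parabon’s unpublished method), phenotype prediction can be thought of as scoring genetic variants against a trait model. The SNP names, weights, and threshold below are all invented for this sketch; real systems train proprietary models on large genotype-phenotype datasets:

```python
# Toy phenotype-prediction sketch: score SNP genotypes against a trait model.
# Every SNP identifier and weight here is hypothetical.

# Each copy of a variant allele contributes its weight to the trait score.
TRAIT_WEIGHTS = {"rs0001": 1.2, "rs0002": 0.8, "rs0003": -0.5}

def predict_trait_score(genotype):
    """Sum allele counts (0, 1, or 2 copies) times per-SNP weights."""
    return sum(TRAIT_WEIGHTS[snp] * count
               for snp, count in genotype.items()
               if snp in TRAIT_WEIGHTS)

# Hypothetical sample: allele counts observed at each SNP.
sample = {"rs0001": 2, "rs0002": 1, "rs0003": 0}
score = predict_trait_score(sample)
label = "trait likely present" if score > 2.0 else "trait likely absent"
print(f"score={score:.1f} -> {label}")
```

A face prediction would combine many such trait scores, which is where critics say the uncertainty compounds beyond what the composite sketches convey.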

However, critics argue that the technique is reaching far beyond its means, especially as there is a lack of peer-reviewed research to back up its claims. Indeed, Parabon NanoLabs, one of the pioneering companies offering phenotyping to law enforcement agencies, has yet to publish the methods underlying its techniques. Furthermore, Dr. Yaniv Erlich, a computer scientist who studies genetics at Columbia University, states that apart from basic predictions like human ancestry, phenotyping of faces is “on the verge of science fiction.”

In addition to concerns about the reliability of phenotyping, there are other ethical and legislative concerns. For example, the New York Civil Liberties Union points out that using ancestry to identify potential suspects in a criminal case will place many innocent people without any connection to the incident under suspicion. Theoretically, this could be used in a similar way to ‘stop and frisk’. These concerns are in addition to the already well-documented susceptibility of DNA testing to human error and bias.

States are still in the process of establishing laws and guidelines to regulate DNA testing, including phenotyping, for use in criminal cases. For example, in New York, one must have authorization from state officials before DNA testing is done.

Ethical issues notwithstanding, supporters such as Deputy Chief Katranakis, the commander of the New York Forensics Investigation Division, believe that phenotyping offers more benefits than drawbacks, especially in cases where there are no other alternatives. Still, as the use of phenotyping becomes more prevalent, caution must be urged when weighing its contribution to criminal cases.

(Ashley Southall, The New York Times)



Research Misconduct

The Cookie Crumbles: A Retracted Study Points to a Larger Truth

Generously estimated, a PhD student’s chances of getting a job in academia are less than 15%, so the pressure to publish has never been higher. Some argue that this pressure has increased the quantity of lower-quality research. Perhaps unsurprisingly, then, the social sciences face a widespread problem of studies that fail to replicate, and the number of retracted papers has risen sharply over the past decade.

On Friday, October 20th, another study that appeared to offer a cheap and simple tool in the fight against obesity was retracted. The study suggested that simply placing cartoon Elmo stickers on apples could nudge more children to pick an apple over cookies when offered the choice. Other researchers noted discrepancies in the paper’s numbers, which led the original authors to submit a replacement article. The problems continued, however, when it emerged that the study had actually been performed on children much younger than originally reported (3-5 years old rather than 8-11). The situation is exacerbated by the fact that these concerns may also affect other published reports from the same lab.

Studying ways to change complex eating behaviors in children is no easy task. Children are considered a vulnerable population, and research involving them carries several additional regulatory requirements, such as obtaining parental permission, working on school premises, and securing additional approval from Institutional Review Boards. However, these hurdles are no excuse to bypass scientific rigor. Given the ease with which scientific findings can reach the masses through social media and the press, scientists must be extra vigilant in ensuring their findings are accurate, or risk losing the public’s trust and, ultimately, public funding for the wider scientific community.

(Aaron E. Carroll, The New York Times)


Climate Change

Fighting Poverty Might Make it Harder to Fight Climate Change

At first glance, tackling poverty appears noble and entirely unrelated to tackling climate change. However, new research shows that eradicating poverty may actually make climate change harder to address. Why? As people escape extreme poverty, they may travel more and consume more energy, creating a larger carbon footprint.

Given this potential conflict, researchers from the University of Maryland in College Park modeled the impact that eradicating poverty would have on climate change. They found that eradicating extreme poverty (i.e., raising incomes from less than $1.90 a day to between $1.90 and $2.97 a day) would not jeopardize current climate targets. However, lifting everyone to the next income level (the global middle-income level, defined as living on $2.97-8.44 per day) would have a significant impact: an extra 0.6°C of warming.
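The asymmetry the study reports comes from simple aggregation: the extra emissions scale with both the number of people moved and the per-capita gap between income bands. A back-of-the-envelope sketch, with invented populations and per-capita emissions (the real study uses far more detailed data):

```python
# Back-of-the-envelope sketch of the income-band emissions trade-off.
# Per-capita emissions and the population figure are hypothetical placeholders.

BANDS = {                     # tonnes of CO2 per person per year (invented)
    "extreme_poverty": 0.2,   # < $1.90/day
    "low_income": 0.4,        # $1.90-2.97/day
    "middle_income": 1.5,     # $2.97-8.44/day
}

def extra_emissions(population, from_band, to_band):
    """Additional annual emissions from moving `population` between bands."""
    return population * (BANDS[to_band] - BANDS[from_band])

poor = 700e6  # hypothetical number of people in extreme poverty
small = extra_emissions(poor, "extreme_poverty", "low_income")
large = extra_emissions(poor, "extreme_poverty", "middle_income")
print(f"To low income:    {small / 1e9:.2f} Gt CO2/yr")
print(f"To middle income: {large / 1e9:.2f} Gt CO2/yr")
```

Even with made-up numbers, the shape of the result matches the study’s finding: the first step out of extreme poverty adds little, while the jump to middle income multiplies the burden several times over.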

This leaves global society in the precarious moral position of deciding what level of poverty is acceptable to ensure the sustainability of the planet. If we want not only to eradicate poverty but also to bring everyone into the middle class, we would need to dedicate almost seven times more resources than we currently do to tackling climate change.

However, all hope is not lost. If clean energy becomes cheaper than fossil fuels, developing nations could use it to fuel economic growth, reducing future carbon emissions. There are encouraging signs that this may be possible: in recent years the global economy has grown while carbon dioxide emissions have not followed suit, and emissions in the United States, Europe, and China have actually fallen, even though the amount of carbon dioxide accumulated in the atmosphere has continued to increase. In the meantime, the authors of the current study call for lifestyle changes such as taking public transportation, living in smaller houses, and eating less meat.

(Allie Wilkinson, Science Magazine)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

October 31, 2017 at 8:43 pm

Science Policy Around the Web – October 27, 2017


By: Cindo O. Nicholson, Ph.D.


source: pixabay

The Opioid Crisis

Engulfed in opioid deaths, Ohio turns to science

Ohio has been hit harder by the opioid scourge than any other US state. It suffered the most opioid-related overdoses of any state in 2014, and the number of overdoses has since increased by 32%, according to state health officials. The rise is not for lack of effort: Ohio has spent nearly $1 billion on measures like prevention, law enforcement, and treatment, all to no avail.

In May of this year, the state of Ohio embraced a new strategy to curb its opioid crisis: technology. The state approved $20 million to boost the development of new technologies that could stem the tide. While the three-phase, prize-based Ohio initiative is still accepting applications, the technologies vying for funding can be broadly grouped into two categories: non-opioid compounds or devices for treating pain, and chemical compounds that effectively reduce withdrawal symptoms in those already gripped by addiction. Non-opioid pain-treatment technologies seeking funding include an implantable mesh of a special polymer, loaded with sufficient doses of non-opioid pain relievers to allow controlled delivery of pain medication, and a wearable device that uses electric pulses to calm nerves sending pain feedback to the brain.

One FDA-approved product that helps recovering addicts manage withdrawal symptoms, and that could be put to use in Ohio immediately, is Probuphine. Probuphine is an implant consisting of the partial opioid-receptor agonist buprenorphine linked to a polymer. Once implanted, it provides a low dose of the drug over a six-month period.

These measures are not a cure-all, as their success will depend on patients' adherence to strict drug regimens. Patients will also need to resist the urge to discontinue follow-ups with their doctors after early success with these interventions. Nevertheless, there is hope that further research and development of effective non-opioid pain therapies, and of therapies that minimize withdrawal symptoms, will significantly reduce opioid overdoses in Ohio. If this approach succeeds, Ohio will serve as a model for other states and countries, showing how policies in support of scientific research can benefit communities.

(Alfonso Serrano, Scientific American)

Genomics Policy

The Navajo Nation is considering a new policy to allow genetic research

Tribal leaders of the Navajo Nation are considering putting an end to a fifteen-year moratorium on studying the DNA of its people. The Navajo Nation is an independent Native American territory occupying 71,000 square kilometers (27,413 square miles) of land in portions of Arizona, Utah, and New Mexico. Like many Native American nations, the Navajo were concerned about the potential for misuse of genetic research performed by scientists from outside the community, and for infringements of privacy. Additionally, in the early 2000s, the Navajo Nation's department of health did not feel it had enough expertise to pursue genetic research and wished to develop its own research policies.

Now the Navajo Nation will open its first oncology center, in Tuba City, Arizona. This will be a major help to Navajo people living on the reservation, who currently have to drive hundreds of kilometers to get specialized care. It has also prompted a reconsideration of the moratorium on genetic research. Lifting the moratorium would allow the collection of blood and other tissue samples for further study. The Navajo Nation's department of health is collaborating with traditional leaders, tribal officials, and other delegates to draft a policy that would allow approval of genetic research while maintaining control of DNA samples.

Being able to collect genetic material and keep it on the reservation would allow research to be conducted by the tribe itself, which now counts geneticists, bioethicists, and other medical experts among its members. Lifting the moratorium would allow Navajo scientists and medical experts to investigate the genetic and environmental factors underlying diseases exhibited by members of the Navajo population.

Ultimately, whatever new policy the Navajo Nation adopts should increase the availability of specialized and personalized care for its people. It would lay the groundwork for regulating the use of genetic material requested from Navajo repositories by scientists outside the territory. In addition, a successful genetic research policy would serve as a model for other Native American territories seeking to establish their own.

(Sara Reardon, Nature)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

October 27, 2017 at 5:05 pm