Science Policy For All

Because science policy affects everyone.

Archive for September 2017

Science Policy Around the Web – September 29, 2017

By: Allison Dennis, B.S.

source: pixabay

Hospital Budgets

Another health care deadline looms: Payments to safety net hospitals due to expire

On October 1, 2017, America’s healthcare safety net will face the first in a series of annual funding cuts to Disproportionate Share Hospital (DSH) payments scheduled by Obamacare. This safety net is held together by hospitals that shoulder the responsibility of serving uninsured, Medicaid, and economically vulnerable patients, often at a financial loss. The American Hospital Association reports that hospitals are reimbursed only 88 cents for every dollar they spend on Medicaid patients. Further shortfalls arise when hospitals adhere to the Emergency Medical Treatment and Labor Act of 1986, which mandates treating patients who seek emergency services regardless of their ability to pay. Highly profitable hospitals serving communities where these patients represent a marginal burden can absorb the cost through lost profits. Hospitals serving communities where these patients pose an undue burden, however, are supported by subsidies from federally funded, state-matched DSH payments. The cut scheduled for this year would reduce the $21 billion allocated state-by-state in fiscal year 2017 by a total of $3.6 billion: $2 billion from federal contributions combined with $1.6 billion from state contributions.
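The arithmetic of the scheduled cut is easy to sanity-check; a minimal sketch using only the figures quoted above:

```python
# Figures reported for fiscal year 2017 (billions of dollars)
total_dsh_allocation = 21.0   # DSH payments allocated state-by-state in FY2017
federal_cut = 2.0             # federal share of the scheduled cut
state_cut = 1.6               # state-matched share of the cut

total_cut = federal_cut + state_cut
share = total_cut / total_dsh_allocation

print(f"Total scheduled cut: ${total_cut:.1f}B")   # $3.6B
print(f"Share of FY2017 allocation: {share:.1%}")  # 17.1%
```

So this first cut alone trims roughly a sixth of the program's annual allocation, before any of the later scheduled reductions take effect.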

Obamacare expanded Medicaid to cover individuals earning up to 138% of the Federal Poverty Level, set at $33,948 for a four-person household and $16,642 for an individual, reducing national levels of uncompensated care. Further, the law provided opportunities for individuals to purchase insurance coverage through HealthCare.gov, while implementing penalties for those who chose to forgo coverage. Lawmakers included a schedule for reducing Disproportionate Share Hospital payments on the assumption that these changes to the healthcare system would sufficiently reduce the financial burden on safety net hospitals, lessening the need for federal and state assistance.

The rate of uninsured Americans did fall to 8.8% in 2016, from 16.3% in 2010. However, the extent to which this change has improved revenues for safety net hospitals remains unclear. Further obscuring hospitals’ readiness to cope with the budget cut are the unique challenges each may face. The agreement to extend discussions of the fiscal year 2018 budget past the October 1 deadline may give lawmakers a last-minute chance to forgo the cuts for yet another year.

(Max Blau, STATnews)

Pharmaceutical Regulation

You’ve heard about precision medicine. Now get ready for precision drug ads

In 2016, pharmaceutical companies spent $5.7 billion, about $17.50 per person, on traditional advertising. But how many of those ads were seen by the people who need them? The industry’s recent interest in the online giants Facebook and Pandora suggests that pharmaceutical companies are looking to enter the new age of advertising: targeted ads. The Health Insurance Portability and Accountability Act (HIPAA) of 1996 prevents insurance companies and healthcare providers from sharing an individual’s health information, especially when it may be used for marketing purposes. However, a person’s private interactions with their electronic devices may reveal far more about their health than conversations with their doctor.
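The per-person figure is simple division; a quick check, assuming a 2016 US population of roughly 326 million (the population figure is not given in the article):

```python
total_ad_spend = 5.7e9        # 2016 traditional pharma advertising spend, dollars
us_population = 326_000_000   # assumed 2016 US population (not from the article)

per_person = total_ad_spend / us_population
print(f"${per_person:.2f} per person")  # $17.48 per person
```

That works out to about $17.50 for every American, whether or not they will ever need the drugs being advertised, which is precisely the inefficiency targeted advertising promises to remove.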

The FDA has been tasked with regulating pharmaceutical advertising on the internet. However, applying the rules surrounding pharmaceutical advertising in the age of 140-character limits and browser-history mining will take some reimagining. So far, the FDA’s approach to limiting inappropriate advertising has been to call out violations as it sees them. In 2015, the FDA famously issued a warning letter regarding an Instagram post by Kim Kardashian that seemingly promoted a prescription-only anti-nausea pill. Nevertheless, in 2017, Pfizer successfully experimented with using geographical information to target ads to online consumers without drawing any FDA objection.

This comes as a growing field of research investigates the observation that the words used to describe a particular disease may influence the treatment options a patient gravitates toward. Parents who were told their child had “pinkeye” rather than an “eye infection” were more likely to give the child a course of antibiotics, even when doctors stated the treatment was likely ineffective. Altering the language appearing in targeted ads on platforms like Facebook may thus provide a means for social experimentation, adding another layer of concern for the FDA.

(Rebecca Robbins, STATnews)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

September 29, 2017 at 7:40 pm

Science Policy Around the Web – September 26, 2017

By: Rachel F Smallwood, PhD

source: pixabay

Public Health

Air Pollution Tied to Kidney Disease

A new study has reported a link between kidney disease and air pollution. Using data collected from over 2.4 million veterans, researchers examined this relationship by consulting NASA and EPA pollution data. They found that glomerular filtration rate, a measure of kidney function quantifying how much blood passes through the kidneys to be filtered, decreased as levels of fine particulate matter less than 2.5 microns in diameter (PM2.5) increased. These particles are small enough to enter the bloodstream, through which they reach the kidneys. The authors estimate that 44,793 cases of chronic kidney disease and 2,438 cases of end-stage renal disease can be attributed to PM2.5 levels that exceed EPA standards.

Kidney disease is just the latest disease partially attributable to air pollution. Pulmonary conditions, cardiovascular disease, and stroke are all established as being caused or exacerbated by air pollution, and earlier this year it was reported that air pollution also increases the risk of Alzheimer’s disease and dementia. With so many researchers reporting such a wide variety of adverse health effects, it is becoming ever more imperative to address these issues and not lose ground in the struggle for cleaner air. Citizens and policymakers need to be educated about the importance of clean air, vigilant about the risks of air pollution, and willing to work together to reduce pollution levels, not just for the planet and future generations, but for the health of people today.

(Nicholas Bakalar, The New York Times)

Biomedical Research

Scientists grow bullish on pig-to-human transplants

Speaking of kidney disease: the list of people in the United States waiting to receive a kidney transplant has almost 100,000 entries. One solution that scientists and physicians have long considered is xenotransplantation – harvesting donor kidneys and other organs from animals like pigs, whose anatomy is naturally human-like. Until now, no demonstration has come close to being acceptable for trials in humans. That may be changing, however: a few research groups report that they are close to moving into human trials and have begun early talks with the FDA.

They have been testing their methods by implanting pig kidneys and hearts into monkeys. The monkeys typically mount an extreme immune response, which the groups have been attempting to ameliorate. Researchers have not been able to eliminate the immune response completely, but they recently reported that a transplanted kidney lasted over 400 days in a macaque before rejection. A transplanted heart lasted 90 days in a baboon before experimental protocol required that the experiment be stopped. An earlier experiment kept a pig heart viable for over two years when implanted into the abdomen of an immune-suppressed baboon, though this only tested biocompatibility; the baboon retained its own heart.

This success is partially attributable to CRISPR technology, which has allowed scientists to remove portions of pig DNA that intensify the immune response. The International Society for Heart and Lung Transplantation has laid out bare-minimum requirements for moving xenotransplantation into human trials: at least 60% survival of life-supporting organs for at least 3 months, with at least 10 animals surviving, plus evidence that survival could be extended to 6 months. These experimental results and minimum requirements set expectations for xenotransplantation: it is not a permanent solution, at least not any time soon. However, xenotransplants may provide temporary solutions that buy patients time while they wait for transplants from human donors. This is good news; over 7,000 people died on an organ transplant waiting list in 2016. For those just trying to get through dialysis, or who need only a few more months before receiving a human heart, these xenotransplants could mean the difference between life and death.

(Kelly Servick, Science)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

September 26, 2017 at 4:58 pm

Science Policy Around the Web – September 22, 2017

By: Leopold Kong, PhD

source: pixabay

Scientific Publishing

Network strives to ‘make your research visible’…unless that research is copyrighted

More and more, scientific research is facilitated by large-scale collaborations and the sharing of new results across multiple disciplines.  However, a large proportion of scientific publications sit behind paywalls, greatly limiting access to them.  Since launching in 2008, ResearchGate has provided a social networking site where researchers can upload and share their work.  With more than 13 million members and more than $87 million in funding raised, ResearchGate has become an invaluable resource for scientists.  However, a recent study found that 201 of 500 randomly picked English-language journal articles shared on the website infringed publishers’ copyright.

Now, the International Association of Scientific, Technical and Medical Publishers (STM) in Oxford, UK, which represents over 140 scientific, technical and medical publishers, has written to ResearchGate about its concerns, citing copyright violation.  The letter asks the website to improve communications educating users about copyright issues.  “STM hopes that ResearchGate will choose to work with publishers to achieve a long-term sustainable solution that makes sharing of content simple and seamless, but importantly that respects the rights of authors and publishers,” says Michael Mabe, CEO of STM.  Some users, such as neurobiologist Björn Brembs (University of Regensburg, Germany), may be dismayed by the potential loss of free paper sharing, viewing the letter as a “thinly veiled threat”.  Jon Tennant, communications director of the professional research network ScienceOpen, says, “The unasked question that this all comes down to is: Do publisher-owned rights matter more than the sharing of research for whatever benefit?”  Earlier this year, STM organized a teleconference to discuss efforts toward fostering fair sharing of papers without breaching copyright.  This may mean providing free links to final, read-only, non-downloadable articles.  Ultimately, changes in the scientific publishing industry may be necessary to further facilitate the spread of knowledge and progress in research.

(Dalmeet Singh Chawla, Science)

The History of Science

Statues: an insufficient medium for addressing the moral shortcomings of scientific giants

Monuments, place names and building names dedicated to individuals who committed extreme racist acts have come under heavy fire in recent years.  For example, Yale’s Calhoun College, named after a white supremacist who promoted slavery, was renamed earlier this year to Grace Hopper College, in honor of the trailblazing computer scientist.  There have also been numerous efforts to remove Confederate statues, including the Robert E. Lee statue in Charlottesville that sparked a major confrontation between violent white supremacists, gathered as part of a Unite the Right rally, and liberal activists.  David Blight, a professor of history at Yale, told Newsweek that “This debate needs a big dose of humility […] There’s a danger here that we lose hold of learning from the past just by trying to make it feel and look better.”  He adds that it is hard when a city has a Confederate statue at its center, because “that’s a city saying to the world ‘this is the most important part of our past.’”

Science and its monuments are not free from this debate.  Through her research, Harriet A. Washington, a medical ethicist and historian, has unearthed racist atrocities committed by James Marion Sims, the “father of American gynaecology” and founder of the New York Women’s Hospital.  Drawing on direct readings of Sims’ personal medical correspondence, in addition to contemporary scholarship, Washington revealed an extremely tainted past.  For example, Sims performed many fruitless surgeries on enslaved women, with only their masters’ consent and without anesthesia, believing that African Americans did not feel pain.  These findings may lead to the dismantling of the many statues dedicated to Sims.  “As the statues and portraits of Sims make clear, art can create beautiful lies […] No scientist, no thinking individual, should be content to accept pretty propaganda,” concludes Washington in a recent essay published in Nature.

(Harriet A. Washington, Nature)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

September 22, 2017 at 5:46 pm

Science Policy Around the Web – September 19, 2017

By: Nivedita Sengupta, PhD

source: pixabay

Global Warming

The world may still hit its 1.5 °C global warming mark, if modelers have anything to say about it

On 18 September, Nature Geoscience published a paper delivering what could be good news regarding climate change and global warming. The paper’s team of climate scientists claims that it is still possible to limit global warming to 1.5 °C above pre-industrial levels, as set out in the 2015 Paris climate agreement. The paper argues that the global climate models used in a 2013 report from the Intergovernmental Panel on Climate Change (IPCC) overestimated the extent of warming that has already occurred. After adjusting for this in their model-based calculations, the authors conclude that the amount of carbon humanity can emit from 2015 onward while keeping temperatures from rising more than 1.5 °C is almost three times greater than the IPCC’s previous estimate. The conclusions have significant implications for global policymakers and suggest that the carbon budget can be met with a modest strengthening of the current Paris pledges up to 2030, followed by sharp cuts in carbon emissions thereafter.

Some scientists are already challenging the paper’s conclusions, questioning the analysis’s reliance on a period of slower warming, the so-called climate hiatus, that ran from the beginning of this millennium until about 2014. They argue that natural variability in the climate system produced lower temperatures during that period, so calculations anchored to it can underestimate the human contribution to warming. Moreover, the oceans and the land were probably absorbing more carbon than normal during this period, and natural processes may return some of that carbon to the atmosphere. Taking these factors into account reduces the predicted amount of carbon that can be released while keeping atmospheric temperatures under the 1.5 °C limit. But the paper’s authors argue that the climate hiatus should not significantly affect their conclusions: the multiple methodologies used to estimate the actual warming due to greenhouse gases should make the calculations robust to any short-term climate variability.

Nonetheless, humanity’s rapid ascent toward the global warming threshold is muddled by modeling scenarios, making 1.5 °C a very small target for policymakers and scientists to try to hit by 2030. The fine details of carbon emissions matter when scientists are looking for the precise effects of different greenhouse gases on global warming. But even if the paper’s prediction proves accurate, huge efforts to curb greenhouse-gas emissions will still be necessary to limit warming. As the paper’s lead author, Dr. Richard Millar, says, “We’re showing that it’s still possible. But the real question is whether we can create the policy action that would actually be required to realize these scenarios.”

(Jeff Tollefson, Nature News)

Debatable Technology

What’s in a face…prediction technology

A paper published on 5 September by genome-sequencing pioneer Craig Venter has drawn heavy criticism and stoked fears about genetic privacy. The paper, published in the Proceedings of the National Academy of Sciences (PNAS), claims to predict people’s physical traits from their DNA. Dr. Venter and his colleagues sequenced the whole genomes of 1,061 people of varying ages and ethnic backgrounds at Human Longevity, Inc. (HLI). Artificial intelligence was applied to analyze each participant’s genetic data in combination with high-quality 3D photographs of the participant’s face. This analysis revealed single-nucleotide differences in the genetic code between participants that corresponded with facial features such as cheekbone height, as well as with height, weight, age, vocal characteristics and skin color. Using this approach, the researchers could correctly pick an individual out of a group of ten people randomly selected from HLI’s database 74% of the time. Such technology could be tremendously consequential for any agency handling human genome data: simply removing personal identifying information, as is routinely done in practice, would not eliminate the possibility that individuals could be identified from the data itself.

However, reviewers of the paper say the claims are vastly overstated and that the ability to identify an individual from their genes is hugely overblown. According to the skeptics, knowing age, sex and ethnicity alone can eliminate most of the individuals in a randomly selected group of ten people drawn from a data set as small and diverse as HLI’s. Computational biologist Yaniv Erlich of Columbia University in New York City provided evidence for this by examining the age, sex and ethnicity data from HLI’s paper. According to his calculations, knowing only those three traits was sufficient to identify an individual out of a group of ten people in the HLI data set 75% of the time, without any information from the genome. He concluded that the paper does not demonstrate that individuals can be identified by their DNA, as it claims. HLI countered that it used multiple parameters to identify someone, of which a person’s face is just one.
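Erlich’s point, that demographics alone can single people out of a small lineup, can be illustrated with a toy simulation. The demographic distribution below is invented for illustration and is not HLI’s actual cohort:

```python
import random

random.seed(0)

# Toy demographic model (assumed for illustration, not HLI's cohort):
# integer age, binary sex, and one of five ethnicity labels.
def random_person():
    return (random.randint(18, 90), random.choice("MF"), random.randrange(5))

def identifiable(trials=20_000):
    """Fraction of trials in which the target's (age, sex, ethnicity)
    triple is unique within a random group of ten, i.e. the three traits
    alone pick the target out of the lineup with no genomic data."""
    hits = 0
    for _ in range(trials):
        target = random_person()
        others = [random_person() for _ in range(9)]
        if target not in others:
            hits += 1
    return hits / trials

print(f"Unique in a group of ten: {identifiable():.0%} of the time")
```

With this uniform toy distribution the triple is almost always unique; a real cohort, with clustered ages and unbalanced ethnicities, is less distinctive, which is how Erlich could reach a 75% identification rate on HLI’s data without touching the genome.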

The review process the paper underwent is not standard for most journals. By submitting to PNAS as a member of the US National Academies of Sciences, Engineering, and Medicine, Venter was allowed to hand-select the three reviewers who would evaluate his paper. While the issues surrounding the paper are being hotly debated within the scientific community, some fear Venter’s stature will give it undue weight with policymakers, who may become overly concerned about DNA privacy, in turn affecting rule- and regulation-making processes.

(Sara Reardon, Nature News)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

September 21, 2017 at 5:45 pm

Science Policy Around the Web – September 15, 2017

By: Liu-Ya Tang, PhD

source: pixabay

Public Health

A new flavor in cancer research: sugar

Sugar is an important energy source for fueling our bodies. However, eating too much sugar does our health no good: it can cause obesity or diabetes, both of which are considered risk factors for cancer. Moreover, there is evidence that sugar may play a direct role in cancer development and progression. Here’s how.

The first study concerns how sugar protects cancer cells from attack by the immune system. The immune system is the safeguard of our body, clearing out “foreign” invaders and bad cells. When a cell becomes cancerous, it may disguise itself and escape the challenges imposed by the immune system. Dr. Carolyn Bertozzi’s group at Stanford University found that cancer cells carry a denser coating of sialic acid, a type of sugar, on their surface than normal cells do. This sugary coating makes cancer cells invisible to the immune system, so they can divide freely in the body. To enable the immune system to attack cancer cells, Dr. Bertozzi has proposed using drugs to strip away the sugary coating.

In addition to masking cancer cells, sugar can also directly promote their growth. Glucose, a breakdown product of sugar, is important not only for the growth of normal cells but also for that of cancer cells. One distinct phenotype of cancer cells is uncontrolled growth, which may require more glucose. This notion is supported by a recent study from Dr. Jung-whan Kim’s group at The University of Texas at Dallas. The researchers found that high levels of a protein called glucose transporter 1 (GLUT1), which is responsible for transporting glucose, are associated with lung squamous cell carcinoma (SqCC). They did not observe similar results in lung adenocarcinoma (ADC), which indicates that different cancer cells may adopt different mechanisms to satisfy their energy needs. They further found that a GLUT1 inhibitor suppressed tumor growth in a SqCC mouse model, but not in an ADC mouse model. These findings will help in developing GLUT1-targeted treatment plans specific to SqCC patients.

The American Heart Association recommends a maximum daily sugar intake of 25 grams for women and 36 grams for men. The reality, however, is that in 2015 the average American consumed more than 93 grams of sugar a day. Reducing sugar consumption and eating a balanced diet would not only help decrease the high rate of obesity in the US, but also benefit cancer prevention.

(Erin Blakemore, Washington Post and University of Texas at Dallas, ScienceDaily)

Innovative Technology

Stem cells could help Parkinson’s patients get the dopamine they need

Parkinson’s disease (PD) is a chronic degenerative disorder of the central nervous system that mainly affects movement. It is progressive, and PD patients can develop very severe symptoms such as the inability to walk or talk. The cause of PD is the death of dopamine-producing neurons; dopamine, a neurotransmitter, is essential for motor neurons to function properly.

Medications and surgery can help alleviate the symptoms of PD, but there is no cure. Recently, a study published in Nature brought hope to doctors, PD patients and their families. The study was led by Dr. Jun Takahashi, a stem-cell scientist at Kyoto University in Japan. Since the loss of dopamine-producing neurons is the root cause of PD, implanting dopamine-producing cells into the brains of PD patients would be the most direct way to treat it. Embryonic stem cells have the versatility to develop into different cell types, but ethical issues surround their use. Dr. Takahashi’s group instead generated induced pluripotent stem (iPS) cells derived from both healthy people and those with PD, transformed the iPS cells into dopamine-making neurons, and implanted the neurons into monkeys with a neurodegenerative disorder. After two years, the monkeys are still alive and their symptoms are greatly mitigated. Dr. Takahashi hopes to begin a clinical trial by the end of next year.

Ideally, and in theory, deriving iPS cells from a patient’s own cells would allow them to avoid the immune-suppressing drugs that are usually necessary when introducing non-native tissue. But generating customized iPS cells is expensive and requires a couple of months of propagation. A good solution, planned by Dr. Takahashi, is to establish iPS cell lines from healthy people and match them to PD patients using immune cell biomarkers. This approach will probably be feasible, as it has been successfully applied in monkeys.

In addition to Dr. Takahashi, other scientists are conducting stem-cell research on PD. Dr. Jeanne Loring, of the Scripps Research Institute in La Jolla, California, prefers to transplant iPS-derived neurons made from a patient’s own cells, and hopes to start a clinical trial in 2019. Another stem-cell expert, Dr. Lorenz Studer of the Memorial Sloan Kettering Cancer Center in New York City, is working on a trial that will use neurons derived from embryonic stem cells. Although issues remain in this field, all these efforts may ultimately lead to better treatments for PD patients.

(Ewen Callaway, Nature News)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

September 15, 2017 at 4:05 pm

Science Policy Around the Web – September 12, 2017

By: Saurav Seshadri, PhD

Source: maxpixel

Infectious Diseases

Is a Zika vaccine worth the effort?

A collaboration between pharmaceutical giant Sanofi and the US Army to develop a vaccine for the Zika virus has come to an end.  About a year ago, Sanofi received $43.2 million in funding from the Biomedical Advanced Research and Development Authority (BARDA, a division of DHHS) to move a Zika vaccine candidate, generated by the Walter Reed Army Institute of Research, into Phase II development.  BARDA has now decided to ‘de-scope’ the project, leading Sanofi to abandon its efforts to develop or license the candidate.

The number of cases of Zika has declined sharply since its peak in early 2016.  While this ‘evolving epidemiology’ has hampered Zika-related clinical research and drug development, it may actually be a welcome relief for Sanofi. The French company has endured months of political pressure to agree to pricing assurances for any vaccine produced from the collaboration, with lawmakers, including Senator Bernie Sanders, arguing that it would be a ‘bad deal‘ for a private company to profit from research funded in part by American taxpayers.  In particular, the exclusivity of Sanofi’s license,  uncommon for such agreements, has been singled out as ‘monopolistic’.  Sanofi has been defending itself vigorously against this characterization, pointing out that it took on significant risk itself for a vaccine that was far from approval, and that it has already discussed reimbursing the US government for its investment through milestone and royalty payments.  Ultimately, ending the collaboration puts this PR-damaging debate to rest, while also providing Sanofi a face-saving opportunity to avoid committing to a drug with limited prospective demand and profitability (as recently transpired with the dengue fever vaccine Dengvaxia, which only reached 55 of its projected 200 million euros in sales in 2016).

In its statement, Sanofi says that it continues ‘to believe that public-private partnerships are the right model to address…public health challenges’ posed by infectious diseases.  Indeed, several pharmaceutical companies responded to the WHO’s declaration of Zika as a public health emergency in 2016; of these, Takeda and Moderna appear to still have ongoing large-scale collaborations with BARDA to develop Zika vaccines.  While the drop in Zika prevalence is clearly a good thing, it’s unclear how it will affect the economic and scientific feasibility of such collaborations in the future.  One solution is to promote vaccine development before an outbreak occurs: groups such as the Coalition for Epidemic Preparedness Innovations (CEPI) hope to facilitate this approach, but the need to allocate limited resources makes its practicality questionable.  However, the alternative is usefully illustrated by the Ebola epidemic of 2014.  Despite concerted global efforts that led to successful vaccine development by Merck, current outbreaks are small enough that the deployment of vaccines may not even be warranted.  Barring an overhaul of regulatory processes and/or financial priorities, it seems likely that when the next epidemic emerges, we’ll be playing catch-up again.

(Eric Sagonowsky, FiercePharma)

Neuroscience

Is every human brain study a clinical trial?

Basic research into the mechanisms underlying cognition and their impairment in a range of brain disorders is the primary focus of hundreds of neuroscience laboratories.  While such studies feed into drug discovery for diseases such as autism, schizophrenia, and bipolar disorder, since they do not directly involve testing any treatments, they are not commonly considered to be clinical trials.  This perception became technically incorrect in 2014, following an NIH announcement broadly redefining ‘clinical trial’ to include any study in which ‘one or more human subjects’ receive an intervention and ‘health-related biomedical or behavioral outcomes’ are observed.  Last year, the NIH revised its data reporting policies for such trials.  These more stringent policies are now being implemented, and will affect grant applications submitted in 2018.

Several members of the scientific community have begun to voice their concerns about the changes.  The Association for Psychological Science (APS) and the Federation of Associations in Behavioral & Brain Sciences (FABBS) have both sent critical letters to the NIH, and a petition by neuroscience researchers pushing back against the policy has garnered over 3,400 signatures.  Opponents argue that the requirements imposed by the ‘clinical trial’ label are overly burdensome and would impede basic research.  These requirements include timely study registration, public disclosure of results through ClinicalTrials.gov, and Good Clinical Practice training for all staff.  Investigators dread the bureaucracy involved in complying with these mandates.  Perhaps most concerning for scientists is the constraint that new proposals must respond to a Funding Opportunity Announcement, which carries specific stipulations about study objectives, design, oversight, and evaluation.  While these rules are intended to promote scientific rigor and transparency, their more immediate effects may be to stifle exploration and creativity and to deter basic researchers who may not know how to tailor their applications to reflect clinical values.

For its part, the NIH is steadfast that the broad redefinition is ‘intentional’ and that current standards of data reporting are ‘unacceptable’.  Policymakers argue that they are simply asking scientists to inform the public about the existence and outcome of their research.  While this sounds unimpeachable in theory, scientists are already reporting practical challenges: for example, asking potential study participants to sign a clinical trial consent form can scare them away.  While the NIH is making efforts to provide guidance to the community, it is running out of time to stamp out confusion before next January, let alone achieve enthusiastic compliance.  Neuroscientists are likely to face setbacks in funding and progress as a result.

(Sara Reardon, Nature News)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

September 12, 2017 at 5:52 pm

Science Policy Around the Web – September 8, 2017

leave a comment »

By: Emily Petrus, PhD

20170908_Linkpost_1

source: pixabay

Science funding

Congress Returns and Funding Anxiety Continues for Scientists

The summer recess is over, which means Congress needs to get to work and pass funding bills to keep the government running past the end of the fiscal year on September 30. The agenda is full: funding Hurricane Harvey relief, raising the debt ceiling, and allocating funds for 2018. The budget fight is bound to be full of surprises; just last night, Trump sided with Democrats on these three issues, throwing most conservative GOP members for a loop. It remains to be seen how the 2018 budget will impact research, but here's what we know so far.

The 2017 budget was considered a positive one for scientists because the large cuts demanded by the president went unheeded by Congress. The president had requested cuts to most federal agencies, with the EPA (-31%), NOAA (-22%), and FDA (-31%) as the largest targets. However, most research institutions did not see major cuts, and although the president requested a 22% reduction to the National Institutes of Health (NIH) budget, the agency instead received a $2 billion raise. The upcoming 2018 fight pits the president's proposed agenda against the Senate's and the House's, a three-way contest that leaves scientists in the middle of potentially hostile waters.

Proposed budgets from the House of Representatives and the Senate are still being formulated, but there are already discrepancies between the two. For example, the House proposes increasing NASA's budget by $94 million (+1.6%), while the Senate would reduce funds by $193 million (-3.3%). The discrepancies run even deeper within NASA's budget, with reversed positions on planetary science (increased spending from the House) and earth science research (cuts from the House, maintained spending from the Senate). These cuts could impact our ability to monitor distant planets and moons that might sustain human life. For example, an unmanned mission to Jupiter's moon Europa, slated to launch in 2020 and land in 2024, could be stalled. In flyby missions from 1995-2003, the moon was found to have brown sediment, a warm core, and probably a salty ocean under an icy surface, making it similar to our planet, albeit colder.

Back on earth, our ability to design new ways to produce renewable, sustainable energy could also take a hit, as funding may be cut from the Department of Energy's Advanced Research Projects Agency-Energy (ARPA-E). The agency funds "high-risk, high-reward" projects and has been in operation for only 8 years, which makes it difficult to determine whether the investment is worth its so-far limited outputs. The Senate proposes increasing this funding by 1.1%, while the House would scrap the agency entirely.

Finally, the National Oceanic and Atmospheric Administration (NOAA) is on the chopping block, with the House following the president in proposing a 22% decrease in funding, while the Senate seeks to cut the budget by only 1%. Controversial projects overseen by NOAA include the Polar Follow-On program, which monitors weather in collaboration with NASA. Cutting this program could impact our ability to predict hurricanes, something not likely to sit well with voters and representatives in states hit by the current weather catastrophes.

Although there are big discrepancies among the budgets proposed by the president, the House, and the Senate, time will tell how much cooperation Republicans and Democrats can achieve by the end of the month to avoid a government shutdown. On a positive note, the NIH can hope for a boost from both the House and the Senate, as funding human health is an issue that usually enjoys bipartisan support.

(Rachael Lallensack, Nature News)

The science of education

School’s Back in Session: Get your learning on!

School is back in session; teachers are teaching, students are learning, and education is supposed to be breaking down socioeconomic barriers. What can science do to help educators have the greatest impact on students? There is an intersection of teaching strategy, learning research, and education policy that can be leveraged for better student outcomes.

A recent report by Science News describes new strategies developed in the lab to enhance student learning. However, researchers are finding that studies performed in a lab setting with college students do not yield the same results when applied to a bustling classroom of younger students. For example, when college students were asked to read a passage and jot down notes, their recall of the reading assignment was improved a week later. Younger grade-school students, however, were shown to need an extra cue to help connect associations and make memories “stick”. This strategy teaches students how to recall information, providing extra support until they can perform the task without a second thought. Another ongoing study aims to improve executive function in students as young as middle schoolers. Researchers designed a video game that requires players to shift strategies as rules change mid-game, which thus far has positively impacted the students’ performance on cognitive tests.

Being able to adapt to new situations is a cornerstone of learning, and neuroscience has long been searching for the magic that makes this task easy sometimes and challenging at other times. The methods to study this process are becoming more sophisticated: researchers can now view single synapses coming and going, and in some cases receptors on those synapses popping in and out. But understanding brain-wide learning requires zooming out and looking at neural network activity. It seems intuitive that to learn something new, connections must be formed between brain areas; these associations “stick” that memory or fact somewhere in the brain. Indeed, people who are learning something new display greater “brain flexibility”: the ability not only to make new connections but also to let others fall apart. Children with low math performance actually had higher connectivity in brain scans while doing math problems, suggesting that forgetting unimportant information to make room for new ideas is as important as simply forming more connections. In addition, a researcher who scanned himself three times per week for a year found that his brain displayed greater flexibility on days when he was in a good mood. The balance between making new connections and letting others go may be the key to better learning.

As science puts more pieces together on how learning best occurs, some ways to enhance student learning are coming into focus. People who can make new connections and lose old ones in a dynamic fashion can be better learners. Being in a good mood, which is supported by a stable home and school environment with food and housing security, can lead to better brain flexibility. Teachers trying new strategies to enhance brain flexibility could help students learn how to absorb and use new information. All of this can inform policy on what makes for a successful student as we proceed through the academic year.

(Susan Gaidos, Science News; Laura Sanders, Science News)


Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

September 8, 2017 at 3:54 pm