Science Policy For All

Because science policy affects everyone.


Science Policy Around the Web – October 26, 2018


By: Mohor Sengupta, Ph.D.


Source: Pixabay

Environmental Problems

A 14-year-long oil spill in the Gulf of Mexico verges on becoming one of the worst in U.S. history

In 2004, Hurricane Ivan leveled an oil production platform in the Gulf of Mexico owned by Taylor Energy. The storm destroyed the colossal platform, which had drilled into several oil wells, leaving a huge mound of muck burying the broken steel structure and leaking oil. To date, efforts to seal off the leakage have not been successful.

Taylor Energy at first denied that there was any leakage and then underreported its extent. According to current estimates, about 700 barrels of oil are leaking per day, with each barrel holding 42 gallons. The company kept this information quiet for many years, and few people are aware of the actual level of spillage. The Taylor Energy spill in fact pre-dates the Deepwater Horizon oil spill (also called the BP leak), so far the largest marine oil spill in history at 168 million gallons. But while BP has paid out $66 billion in fines, legal settlements and cleanup costs, Taylor Energy is a comparatively small operation, too cash-strapped to afford cleanup on such a scale.

In these actions Taylor Energy flouted both the Oil Pollution Act of 1990, which mandates that spills be reported to the U.S. Coast Guard National Response Center (NRC), and the Clean Water Act of 1972, which created a structure for regulating water pollutants. Environmentalists took Taylor Energy to court, and Taylor Energy and the NRC were jointly found accountable for presenting false numbers and data. An assessment submitted to Taylor Energy in 2009 by Waldemar S. Nelson and Company, a private firm, discussed the risks of ingesting fish from the affected area. A recent independent analysis commissioned by the Justice Department showed that the original NRC estimate of 1 to 55 barrels of leakage per day was inaccurate. After several spillage tests, Oscar Garcia-Pineda, the author of that analysis, concluded that his results did not tally with those reported by the NRC, and that the actual rate of spillage was 48 to roughly 1,700 barrels per day.
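
A back-of-the-envelope calculation shows why this spill verges on one of the worst in U.S. history. The Python sketch below simply multiplies the leak-rate estimates quoted above over the spill's 14-year duration; the constant-rate assumption and the rounded day count are illustrative simplifications, not figures from the article.

```python
# Cumulative Taylor Energy spillage under the various leak-rate estimates,
# compared with the Deepwater Horizon total cited in the article.
# Assumes a constant leak rate over 14 years (an illustrative simplification).
GALLONS_PER_BARREL = 42
DAYS_LEAKING = 14 * 365  # 2004-2018

DEEPWATER_HORIZON_GALLONS = 168e6  # largest marine oil spill to date

estimates = {
    "NRC low (1 bbl/day)": 1,
    "NRC high (55 bbl/day)": 55,
    "current estimate (700 bbl/day)": 700,
    "Garcia-Pineda high (~1,700 bbl/day)": 1700,
}

for label, barrels_per_day in estimates.items():
    total_gallons = barrels_per_day * GALLONS_PER_BARREL * DAYS_LEAKING
    share = total_gallons / DEEPWATER_HORIZON_GALLONS
    print(f"{label}: {total_gallons / 1e6:6.1f} million gallons "
          f"({share:.0%} of Deepwater Horizon)")
```

At the current 700-barrel-per-day estimate, the running total comes to roughly 150 million gallons, within striking distance of Deepwater Horizon's 168 million.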

These disturbing findings have arrived at a delicate time for environmental protection policy. Earlier this year, the Trump administration proposed a wide expansion of leases to the oil and gas industry. This would render all off-shore areas on the continental shelf, including those along the Atlantic coast, amenable to drilling. Oil and gas representatives are lobbying for this cause and have provided financial justifications including billions of dollars’ worth of annual economic growth, increased jobs and lower heating costs. However, multiple governors representing states across the four planning areas, from Maine to the Florida Keys, are opposed to this proposal.

Reports show that on average there are 20 uncontrolled releases of oil per 1,000 wells under state or federal jurisdiction. In Louisiana alone, approximately 330,000 gallons of oil are spilled from off-shore and on-shore rigging platforms. With changing climate patterns, Atlantic hurricanes are predicted to become more intense in the future, and given the government's plans to extend rigging along the Atlantic coast, a bleak prospect looms ahead.

(Darryl Fears, Washington Post)

Health Supplements

The Problem with Probiotics

A healthy balance among the natural flora of the gut, also called the gut microbiome, is essential for a healthy digestive system. Antibiotics have been shown to disrupt the gut microbiome, resulting in conditions such as diarrhea and infection with Clostridium difficile. As an antidote, it has become common practice to take “good bacteria”, or probiotics, while on antibiotic treatment. These probiotics are essentially a mixture of supposedly healthy gut microbes, meant to replace those disrupted by the antibiotic.

Although people commonly take probiotics, this class of product is not regulated by the FDA, and there are rising concerns about the manufacturing standards and quality of these over-the-counter health supplements. Most recently, Dr. Pieter A. Cohen cautioned against overlooking the harmful effects of widely marketed probiotics in an article published in JAMA Internal Medicine.

There have been so many studies discussing the benefits of probiotics that the journal Nutrition recently published a systematic review of systematic reviews. In a nutshell, studies of probiotic efficacy have produced very limited positive results, and those studies used only pure microbial strains as the probiotic supplement. Meanwhile, there is no evidence that probiotics are beneficial in treating conditions such as Crohn’s disease, chronic diarrhea, ulcerative colitis or liver disease, all related in some way to the gut microbiome.

Safety assessment studies have found probiotics to be contaminated with unwanted microbial strains, and without FDA oversight of the manufacturing process, production often does not follow a well-defined pipeline. It is not known what kinds of health hazards these contaminants might cause, warns Dr. Cohen, and they can be lethal: in one notorious case, the death of an infant was attributed to a contaminated dietary supplement.

Unfortunately, none of these events have deterred Americans from using probiotics. Almost four million people, or 1.6 percent of adults in the United States, used probiotics in 2012, and the global market for probiotics is steadily growing. Given this, it is of great importance that dietary supplements be given the rigorous assessment and quality-control checks that a prescription drug undergoes, and that consumers be made more aware of adulteration in probiotics.

(Aaron E. Carroll, New York Times)

Have an interesting science policy link? Share it in the comments!


Written by sciencepolicyforall

October 26, 2018 at 12:36 pm

Science Policy Around the Web – October 19, 2018


By: Ben Wolfson, Ph.D.


Source: Pixabay

Climate Change

 

Climate Change prompts a rethink of Everglades management

The Florida Everglades is a large area of tropical wetlands that has received significant attention due to the degradation of its unique ecosystem by urban development. The Everglades were designated a World Heritage Site in 1979 and a Wetland Area of Global Importance in 1987, and in 2000 Congress approved the Comprehensive Everglades Restoration Plan (CERP) to combat further decline and provide a framework for Everglades restoration.

For the past 18 years, these efforts have been directed towards curtailing damage from urbanization and pollution. However, as outlined in a congressionally mandated report released on October 16th by the National Academies of Science, Engineering, and Medicine, new strategies may be necessary. In the biennial progress report, an expert panel called for CERP managers to reassess their plans in light of new climate change models. The report focuses on the 7 centimeters of sea level rise seen since 2000, and points out that Southern Florida is especially at risk from climate change and is expected to experience a 0.8-meter rise in sea level by the year 2100.
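
To put those two numbers side by side: the rise observed since 2000 works out to roughly 0.4 centimeters per year, while the 2100 projection implies an average rate about twice as fast. A minimal sketch, assuming linear rates purely for illustration:

```python
# Compare the observed sea-level-rise rate with the average rate implied
# by the report's 0.8 m projection for 2100 (linear rates assumed).
observed_rise_cm = 7.0    # rise since 2000, per the report
observed_years = 18       # 2000-2018

projected_rise_cm = 80.0  # 0.8 m by 2100
projected_years = 100     # 2000-2100

observed_rate = observed_rise_cm / observed_years   # ~0.39 cm/yr
implied_rate = projected_rise_cm / projected_years  # 0.80 cm/yr

print(f"Observed rate: {observed_rate:.2f} cm/yr")
print(f"Implied average rate to 2100: {implied_rate:.2f} cm/yr "
      f"(~{implied_rate / observed_rate:.1f}x observed)")
```

In other words, the projection assumes sea-level rise will accelerate well beyond the pace of the past two decades, which underscores the panel's call to reassess CERP plans.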

It is clear that as more is learned about the realities of climate change, the goals and methods of conservation projects are shifting, and past strategies must be adapted to fit the realities of a warming world.

(Richard Blaustein, Science)

Animal Research

NIH announces plan for chimp retirement

 

In 2015, the NIH announced that it would no longer support biomedical research on chimpanzees, two years after pledging to significantly reduce the number of chimpanzees used in research. These decisions were based on a combination of reduced demand for chimpanzees in research and the designation of captive chimpanzees as an endangered species in 2015.

On Thursday, October 18th, the NIH announced the next step in the process of retiring research chimps. While research was stopped in 2015, many of the chimpanzees had nowhere to go and remained housed at laboratories. One federal chimpanzee sanctuary, Chimp Haven, exists in Keithville, Louisiana; however, lack of space and the difficulty of relocating some animals have slowed their transition to better habitats.

In the Thursday announcement, NIH director Francis Collins outlined the guidelines for future chimpanzee relocation. These include streamlining medical records and determining whether chimpanzees are physically healthy enough to be relocated. Many of the chimpanzees are at an advanced age and have developed chronic illnesses similar to those experienced by humans. However, Collins emphasized that there must be a more acute medical problem for relocation not to take place. In addition, both the research facility and Chimp Haven must agree that the former research chimpanzees are capable of being relocated, and disagreements will be mediated by a panel of outside veterinarians.

Collins additionally stressed that while transfer to Chimp Haven is the ideal outcome for all retired chimps, those housed at NIH-supported facilities do not live isolated in cages or in laboratories and are housed in social groups with appropriate species-specific accommodations.

The development of these clear guidelines should expedite chimpanzee relocation while emphasizing chimpanzee health and comfort.

(Ike Swetlitz, STAT News)

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

October 19, 2018 at 3:25 pm

Science Policy Around the Web – September 21, 2018


By: Mohor Sengupta, Ph.D.


Source: MaxPixel

Inclusion in healthcare

India’s Anti-Gay Law Is History. Next Challenge: Treat LGBTQ Patients With Respect

On September 6 this year, in a landmark verdict, the Supreme Court of India officially decriminalized gay sex. It was a much-awaited move that toppled the archaic, colonial-era Section 377 of the Indian Penal Code, which was used to criminalize sexual activities “against the order of nature”. In 2009 Section 377 was provisionally invalidated, prompting more Indians to come out, but in 2013 the law was reinstated on the grounds that only “a miniscule fraction” of the population was in question. Although the country has made a decisive and progressive leap into the international arena of contemporary sexual norms, a big change is still needed in its healthcare sector to accommodate the repeal.

To this day, reporting to clinics or hospitals is an ordeal for the LGBTQ community. Routine prescriptions, like preventive post-exposure prophylaxis for homosexual men, are still met with confusion at the clinic. Things have improved since the days when HIV infection carried heavy social stigma, but they are not yet at a point where a transgender or homosexual person can talk freely about their medical problems with healthcare personnel. Many doctors view different sexual orientations as something that can be “cured”. Such attitudes have caused a large section of the LGBTQ community to avoid seeing a doctor altogether; most visit clinics recommended by others in the community.

In April 2014, the Supreme Court of India officially recognized the transgender community as a third gender and granted them the same fundamental rights that the Indian Constitution grants all citizens. Gender-reassignment surgeries were legalized. Yet in most government hospitals, patients are still segregated into a male or a female ward. Arnav Srinivasan, a transgender person approaching menopause, has never visited a gynecologist, even though such care is necessary. A government directive to construct more gender-neutral public toilets hasn’t seen the light of day.

Indians and people all over the world are rejoicing in the recent Supreme Court repeal of IPC Section 377, and rightfully so. But the major problem now confronting the government is how to educate healthcare personnel about LGBTQ-specific health issues and how to disseminate appropriate instructions to law enforcement agencies, where harassment of LGBTQ people has been common. The Supreme Court did mandate sensitization programs for schools and the police, and some non-profit organizations are planning to offer anti-discrimination workshops to district courts and law enforcement agencies.

In a country as ethnically, financially, and educationally diverse, and sometimes disjointed, as India, the repeal of IPC Section 377 is only the tip of the iceberg. It has heralded a new age of public health policy, but attitudes towards sexuality and sexual health still need a systematic and major re-orientation.

(Sushmita Pathak & Furkan Latif Khan, NPR)

Climate

Florence, Mangkhut bring data and destruction to coastal scientists

Two violent weather systems recently rocked opposite ends of the world. Hurricane Florence originated as a strong tropical wave off the west coast of Africa and steadily intensified into a Category 4 hurricane en route to North America. Subsequently weakened, it made landfall just south of Wrightsville Beach, NC on September 14. Typhoon Mangkhut arose from a tropical depression near the International Date Line and rapidly intensified as it moved westwards, making landfall as a Category 5-equivalent super typhoon in the Cagayan province of the Philippines on September 15. Both storms caused significant damage to life and property, mostly in the USA, the Philippines and Hong Kong.

Meteorologists in the USA have noted that recent tropical storms here have caused more damage through flooding than through wind. They attribute this to rising atmospheric temperatures, which let these storms hold greater amounts of moisture. Overall warmer weather also diminishes the temperature difference between land and ocean, making a storm hover over land for longer. Although wind speeds reached 215 km/hr, it was sustained winds driving large volumes of water onto land that caused the widespread flooding in affected areas.

Typhoon Mangkhut, on the other hand, brought damaging winds, with gusts of up to 228 km/hr in Hong Kong. Tall buildings in the city channeled the wind, and the resulting tunneling shattered building walls. We are all familiar with photographs of skeletonized buildings left in the wake of the storm.

Amid all this havoc, weather scientists have gained valuable information about the behavior of storms under a changing climate. Giant waves off the coast of Wilmington, NC dragged out a buoy equipped with sensors that measure wind speeds, wave heights and other storm conditions. All data management and web services connected to these sensors had been migrated to Amazon cloud services, so provided the buoy remained functional through the storm, the data collected from its sensors could be invaluable. Earlier this week it was still transmitting, and Debra Hernandez, executive director of the buoy’s operator, the Southeast Coastal Ocean Observing Regional Association in Charleston, SC, is waiting to see if that data can be tapped. Two automated submarine gliders, also known as autonomous underwater vehicles (AUVs), equipped with sensors to detect water temperature, chlorophyll a, salinity and more, have been deployed along the American continental shelf and could shed more light on specifics. Ocean researchers at the Swire Institute of Marine Science at the University of Hong Kong (HKU) are beginning to comb through the data collected on Mangkhut. They have learned that the storm passed over relatively cooler surface waters before it made landfall, which took away some of its power.

Climate change is very real, and its tangible effects are already showing. At best we can brace ourselves against such extreme weather, which makes it all the more important to gather as many facts as possible and get to “know” these storms.

(Frankie Schembri, with reporting by Dennis Normile and Paul Voosen, Science)

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

September 21, 2018 at 3:48 pm

Science Policy Around the Web – February 23, 2018


By: Janani Prabhakar, Ph.D.


Source: Pixabay

Climate Change

Permafrost experiments mimic Alaska’s climate-changed future

In Denali National Park and Preserve, you will find ecologist Ted Schuur near Eight Mile Lake, on an endeavor to answer some of the toughest questions in climate change research. His “laboratory” sits in the middle of the tundra, filled with instruments for measuring changes in carbon dioxide. A gas-sensing tower can detect carbon dioxide levels a quarter mile away, and polycarbonate chambers at the top of this tower trap CO2 as it drifts through the air and measure its amount. Using a clever manipulation, he seeks to determine how rising temperatures will impact this region’s CO2 emissions.

The key to understanding the impact of rising temperature is the dynamic between carbon dioxide, plants and soil. Microbes in the soil release CO2; plants absorb more CO2 than they release, keeping it out of the atmosphere. Critically, microbes release CO2 all year, while plants absorb CO2 only during the growing season. For a perfect balance, the CO2 that soil microbes release throughout the year must be matched by what plants absorb during the growing season. How do rising temperatures affect this balance? Schuur measured CO2 from two plots of land: one surrounded by snow fences, the other unfenced. Snow fences catch drifting snow, which insulates the ground they surround, leaving it 3 to 4 degrees Fahrenheit warmer than the unfenced plot. This amount of warming is significant because Alaska is projected to see an additional 4 to 5 degrees of warming by 2100. In the fenced plot, then, Schuur has created an environment that mimics the projected environment of 2100.

Schuur finds that, with the warmer temperatures, slumping permafrost causes the land to sink by several feet. This in turn deepens the summer thaw layer, allowing the permafrost to add more organic matter to the soil. More organic matter produces more plant growth, which means more CO2 is absorbed in the warmed, fenced plots than in the cooler, unfenced plots. But this only happens during the growing season. Because the deeper soil also supports more microbial growth, more CO2 is released from the soil all year round in the fenced plot than in the unfenced plot. Schuur finds that the amount of CO2 released in these warmer plots is not offset by what the plants absorb in the growing season, despite the extra plant growth.
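
The arithmetic behind that imbalance is worth making explicit. The toy model below uses entirely made-up flux numbers (the article gives none) just to show how warming can raise both release and uptake yet still push the annual balance further toward net emission, because release runs all 365 days while uptake is confined to the growing season.

```python
# Toy model of the annual CO2 balance Schuur describes. All numbers are
# hypothetical illustrations, not measurements from the study.
def net_annual_co2(release_per_day, uptake_per_growing_day, growing_days=100):
    """Net CO2 emitted per year (arbitrary units): microbes release
    every day of the year; plants absorb only in the growing season."""
    release = release_per_day * 365
    uptake = uptake_per_growing_day * growing_days
    return release - uptake  # positive = net source of CO2

# Unfenced (cooler) plot: modest release, modest uptake.
print(net_annual_co2(release_per_day=1.0, uptake_per_growing_day=3.0))  # 65.0
# Fenced (warmer) plot: more plant uptake, but year-round release grows more.
print(net_annual_co2(release_per_day=1.5, uptake_per_growing_day=4.0))  # 147.5
```

Even though the warmed plot absorbs more during the growing season, the 365-day release term dominates, which is the pattern Schuur reports.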

Altogether, this news is not good. Given the current rate of temperature rise, this imbalance between CO2 absorption and release may only grow. By the end of the century, the amount of carbon transferred from the thawing permafrost to the atmosphere could reach 1 billion tons, as much as present-day emissions of Germany and Japan.

(J. Madeleine Nash, Wired)

Healthcare

Synergy Between Nurses And Automation Could Be Key To Finding Sepsis Early

Sepsis is the body’s reaction to overwhelming infection, and it causes about a quarter of a million deaths in America each year. If caught early, it can be treated, but healthcare workers struggle to identify sepsis in patients in a timely manner: no blood test can specifically detect it, and there is nothing to search for under a microscope. Dr. David Carlbom, a pulmonologist at Harborview Medical Center in Seattle, devised a system to help healthcare providers identify sepsis symptoms and provide timely treatment. His system scans day-to-day electronic health records for subtle clues and sends warning flags for impending sepsis, capturing patterns in symptoms such as high or low temperature, low blood pressure, fast breathing, and high white blood cell count. The system is implemented at nursing stations in the hospital. After a patient is admitted, a red box appears in the patient record, prompting the nurse to answer questions about symptoms and determine whether they point to early signs of sepsis. If the nurse determines that they do, a provider is paged and responds within a half hour. Altogether, the system is intended to ensure the patient is seen within three hours.
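
To give a flavor of such a rule-based screen, here is a minimal sketch. The thresholds are illustrative SIRS-style values from the general sepsis literature, not Harborview’s actual criteria, and the two-flag trigger is an assumption for the example.

```python
# Minimal sketch of a rule-based sepsis screen of the kind described.
# Thresholds are illustrative, not Harborview's actual criteria.
def sepsis_flags(temp_c, systolic_bp, resp_rate, wbc_per_ul):
    """Return the list of warning signs present in one set of vitals."""
    flags = []
    if temp_c > 38.0 or temp_c < 36.0:
        flags.append("abnormally high or low temperature")
    if systolic_bp < 90:
        flags.append("low blood pressure")
    if resp_rate > 20:
        flags.append("fast breathing")
    if wbc_per_ul > 12_000:
        flags.append("high white blood cell count")
    return flags

# Hypothetical vitals pulled from a patient's electronic health record.
flags = sepsis_flags(temp_c=38.6, systolic_bp=85, resp_rate=24, wbc_per_ul=14_500)
if len(flags) >= 2:  # assumed trigger: two or more concurrent warning signs
    print("Possible sepsis - prompt nurse review:", ", ".join(flags))
```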

While this is a much more precise and efficient method than prior practice, there are circumstances that lead to false alarms. For example, faster breathing may be due to multiple factors, including simply walking down the hall. Or, symptoms such as high white blood cell count may not be due to sepsis, particularly in patients being seen for other health issues like cancer. One way to reduce false alarms is built into the system: the red box appears only every 12 hours. This ensures that providers are not paged throughout the day for false alarms. Furthermore, if nurses determine that the patient is not experiencing sepsis, they must report why and provide an explanation for the symptoms the patient is experiencing. This allows for thoroughness, accountability, and precision. It also ensures that nurses keep a close eye on their patients. The effectiveness of this system has been seen in the reduction of mortality rates since it was installed in 2011.

Despite the reduction in mortality rates, entering vital signs manually could have its shortcomings. Sepsis symptoms can arise quickly and affect the body rapidly. Nurses may miss these symptoms within the 12-hour window if they are not vigilant. Recent efforts have begun to address this issue. Dr. Matthew Churpek at the University of Chicago is partnering with a company to create a device that will go under a patient’s mattress to continuously calculate heart rate and respiratory rate. This will reduce false alarms and allow researchers to use an evidence-based approach to clinical practice. They can generate algorithms based on data to predict early onset of sepsis. Critically, this approach will allow clinicians to focus on preventative efforts rather than treatment.

(Richard Harris, NPR)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

February 23, 2018 at 10:43 pm

Science Policy Around the Web – November 17, 2017


By: Janani Prabhakar, PhD 


Source: Pixabay

Public Health

The ‘Horrifying’ Consequence of Lead Poisoning

Changes in water treatment practices in Flint, Michigan in 2014 introduced high levels of lead into the water supply, eventually culminating in a state of emergency in January 2016. The contamination affected over 10,000 residents, forcing them to refrain from using the city’s water supply until 2020. Because state officials may have been aware of lead contamination in the water supply for months before it became public, these officials are now facing criminal charges. This negligence is particularly troubling given recent evidence of persisting effects of lead contamination on health outcomes in Flint residents. In a working paper, Daniel Grossman at West Virginia University and David Slusky at the University of Kansas compared fertility rates in Flint before and after the change in water treatment practices that led to the crisis, and compared post-change fertility rates in Flint to those of unaffected towns in Michigan. They found that the fertility rate declined by 12 percent and the fetal death rate increased by 58 percent. Similar changes have been witnessed in other cities after comparable incidents of lead contamination in the water supply. Furthermore, given that the number of children with elevated levels of lead in their blood doubled after the change in treatment practices, the long-term effects of this contamination on cognitive, behavioral, and social outcomes are only beginning to be examined and understood. The circumstances in Flint are an example of how the misplaced focus of high-level policy decisions can hurt local communities, particularly low-income black neighborhoods. Black neighborhoods are disproportionately affected by lead contamination, but the lack of sufficient attention, along with the false suggestion propagated by industry leaders and policy makers that affected individuals were to blame, has deterred progress in addressing critical issues in at-risk and underserved communities.
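
The paper’s before/after comparison against unaffected towns is, in effect, a difference-in-differences design. The sketch below walks through that logic with made-up fertility figures; none of the numbers come from the working paper.

```python
# Difference-in-differences logic behind the Flint fertility comparison.
# All numbers are hypothetical, for illustration only.
flint_before, flint_after = 100.0, 88.0      # births per 1,000 women
control_before, control_after = 100.0, 99.0  # unaffected Michigan towns

flint_change = flint_after - flint_before        # -12.0
control_change = control_after - control_before  # -1.0

# Subtracting the control towns' trend isolates the effect of the water switch.
did_estimate = flint_change - control_change     # -11.0
print(f"Estimated effect: {did_estimate:+.1f} births per 1,000 women")
```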

(Olga Khazan, The Atlantic)

Climate Change

Why China Wants to Lead on Climate, but Clings to Coal (for Now)

Home to 1.4 billion people, China is one of the world’s largest coal producers and carbon polluters, yet it aims to spearhead the international agreement to address climate change. Despite this contradiction, China is already on track to meet its commitment under the Paris climate accord. This move towards reducing its dependence on coal is a necessity for China because of internal pressure to curb air pollution. But according to NRDC climate and energy policy director Alvin Lin, given China’s size and population, phasing out coal dependence will be a long process with many ups and downs. For instance, while China has shown progress in meeting its commitments, a recent report shows higher emission projections this year, which may reflect an uptick in economic growth and a reduction in the rains needed to power hydroelectric plants. While Lin portrays this uptick as an anomaly, competing interests in the Chinese government make the future unclear. In efforts to increase its presence abroad, China has built coal plants in other countries; at the same time, it is the largest producer of electric cars. President Xi Jinping has derided the United States for being isolationist and reneging on the Paris climate accord, but how his government plans to hold up its end of the deal has not been revealed. An important caveat is that even if every country achieves its individual Paris pledge, the planet will still heat up by 3 degrees Celsius or more. Given that this increase is large enough to have catastrophic effects on the climate, adherence to the Paris pledges serves only as a baseline for what is necessary to combat global warming.

(Somini Sengupta, The New York Times)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

November 17, 2017 at 4:51 pm

Science Policy Around the Web – August 18, 2017


By: Nivedita Sengupta, PhD


Source: Pixabay

Climate Science

Effort backed by California’s flagship universities comes as US President Donald Trump shrugs off global warming

With US President Donald Trump announcing a withdrawal from the Paris Agreement and renouncing climate science and policy, scientists in California have decided to develop a home-grown climate research institute, the California Climate Science and Solutions Institute. California has a long history of initiatives to protect the environment, and this one is already being endorsed by California’s flagship universities and warmly received by Governor Jerry Brown. The initiative is still in the early stages of development and will need clearance from the state legislature. The institute would fund basic as well as applied research on topics related to climate change, ranging from ocean acidification to tax policy, with priority given to projects and experiments that engage communities, businesses and policymakers. “The goal is to develop the research we need, and then put climate solutions into practice,” says Daniel Kammen, an energy researcher at the University of California, Berkeley, who adds that this work will have global impact.

The California project may have an ally: Peter De Menocal, the science dean of Columbia University in New York City, plans to build an alliance of major universities and philanthropists to support research answering pressing questions about the impacts of climate change. De Menocal has already tested the idea on a smaller scale by launching the Center for Climate and Life at Columbia University last year, which raised US$8 million of private funding.

This is not the first time California has stepped in to support an area of science that fell out of favor in Washington DC. In 2004, after President George W. Bush restricted federal support for research on human embryonic stem cells, the state’s voters approved $3 billion to create the California Institute for Regenerative Medicine in Oakland; since then, the center has funded more than 750 projects. The proposal for a new climate institute also started along a similar path, as a reaction to White House policies, but its organizers say the concept has evolved into a reflective exercise about academics’ responsibility to help create a better future. The panel members plan to put a complete proposal for the institute before the California legislature this year, in the hope of persuading lawmakers to fund the effort by September 2018, before Governor Brown’s global climate summit in San Francisco.

(Jeff Tollefson, Nature News)

Retractions

Researchers pull study after several failed attempts by others to replicate findings describing a would-be alternative to CRISPR

A high-profile gene-editing paper on NgAgo was retracted by its authors on August 2, citing the failure of scientists around the globe to replicate the main finding. The paper, published in Nature Biotechnology in May 2016, described an enzyme named NgAgo that could be used to knock out or replace genes in human cells by making incisions at precise regions of the DNA. The study presented the finding as an alternative to the CRISPR-Cas9 gene-editing system, which has revolutionized gene editing and has even been used to fix genes for a heritable heart condition in human embryos. Han Chunyu, a molecular biologist at Hebei University of Science and Technology in Shijiazhuang who led the work, immediately attracted wide applause for the findings. Within months, however, reports of failures to replicate the results started emerging on social media, and the doubts were confirmed after a series of papers stated that NgAgo could not edit genomes as described. Earlier, Han had told Nature’s news team that he and his team had identified a contaminant that could explain other groups’ struggles to replicate the results, and assured that revised results would be published within two months. Yet on August 2 the authors retracted the paper, stating, “We continue to investigate the reasons for this lack of reproducibility with the aim of providing an optimized protocol.”

The retraction, however, puts in question the future of the gene-editing center that Hebei University plans to build, with 224 million yuan (US$32 million) in funding and Han as its leader. Moreover, Novozymes, a Danish enzyme manufacturer, paid the university an undisclosed sum as part of a collaboration agreement. Dongyi Chen, Novozymes’ Beijing-based press manager, told Nature’s news team in January that the technology was being tested and showed some potential, but was at a very early stage of development, making it difficult to determine its relevance. Following the news of the retraction, he stated that the company has explored the efficiency of NgAgo but so far has failed to find any obvious improvement. Yet they are not giving up hope, as scientific research takes time.

(David Cyranoski, Nature News)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

August 18, 2017 at 5:11 pm

Science Policy Around the Web – August 1, 2017


By: Sarah L. Hawes, PhD


Source: Pixabay

Climate Science

Conducting Science by Debate?

Earlier this year an editorial by former Department of Energy Under Secretary Steven Koonin suggested a “red team-blue team” debate between climate skeptics and climate scientists. Koonin argued that a sort of tribalism segregates climate scientists while a broken peer-review process favors the mainstream tribe. Experts in science history and climate science published a response in the Washington Post reminding readers that “All scientists are inveterate tire kickers and testers of conventional wisdom” and that while “the highest kudos go to those who overturn accepted understanding, and replace it with something that better fits available data,” the overwhelming consensus among climate scientists is that human activities are a major contributor to planetary warming.

Currently, both Environmental Protection Agency Administrator Scott Pruitt and Department of Energy Secretary Rick Perry cite Koonin’s editorial while pushing for debates on climate change. Perry said, “What the American people deserve, I think, is a true, legitimate, peer-reviewed, objective, transparent discussion about CO2.” That sounds good, doesn’t it? However, we already have this: it’s called climate science.

Climate scientists have been forthright with politicians for years. Scientific consensus on the hazards of carbon emissions led to the EPA’s endangerment findings in 2009, which were upheld by EPA review again in 2015. A letter to Congress in 2016 expressed the consensus of over 30 major scientific societies that climate change poses real threats and that human activities are the primary driver, “based on multiple independent lines of evidence and the vast body of peer-reviewed science.”

Kelly Levin of the World Resources Institute criticizes the red team-blue team approach for “giving too much weight to a skeptical minority”, since 97% of actively publishing climate scientists agree that human activities are contributing significantly to recent climatic warming. “Re-inventing the wheel” by continuing the debate needlessly delays crucial remediation. Scientific conclusions and their applications are often politicized, but that does not mean the political processes of holding debates, representing various constituencies, and voting are appropriate methods for arriving at scientific conclusions.

(Julia Marsh, Ecological Society of America Policy News)


Source: Pixabay

Data Sharing, Open Access

Open Access Science – getting FAIR, FASTR

Advances in science, technology and medicine are often published in scientific journals with costly subscription rates, despite originating from publicly funded research. Yet public funding justifies public access, and shared data catalyzes scientific progress. Peter Suber, director of the Harvard Office for Scholarly Communication and of the Harvard Open Access Project, has been promoting open access since at least 2001. Currently, countries like the Netherlands and Finland are hotly pursuing open-access science, and the U.S. is gearing up to do the same.

On July 26th, a bipartisan group of congressional representatives introduced the Fair Access to Science and Technology Research Act (FASTR), intended to enhance the utility and transparency of publicly funded research by making it open access. Within the FASTR Act, Congress finds that the “Federal Government funds basic and applied research with the expectation that new ideas and discoveries that result from the research, if shared and effectively disseminated, will advance science and improve the lives and welfare of people of the United States and around the world,” and that “the United States has a substantial interest in maximizing the impact and utility of the research it funds by enabling a wide range of reuses of the peer-reviewed literature…”. The FASTR Act mandates that findings be publicly released within 6 months. A similar memorandum was released under the Obama administration in 2013.

On July 20th, a new National Academies committee finished its first meeting in Washington D.C. by initiating an 18-month study on how best to move toward a default culture of “open science.” The committee is chaired by Alexa McCray of the Center for Biomedical Informatics at Harvard Medical School, and most members are research professors. They define open science as free public access to published research articles, raw data, computer code, algorithms, etc. generated through publicly funded research, “so that the products of this research are findable, accessible, interoperable, and reusable (FAIR), with limited exceptions for privacy, proprietary business claims, and national security.” Committee goals include identifying existing barriers to open science, such as discipline-specific cultural norms, professional incentive systems, and infrastructure for data management. The committee will then recommend solutions to facilitate open science.

Getting diverse actors – for instance funders, publishers, scientific societies and research institutions – to adjust current practices to achieve a common goal will certainly require new federal science policy. Because the National Academies committee is composed of active scientists, their final report should serve as an insightful template for federal science agencies to use in drafting new policy in this area.

(Alexis Wolfe & Lisa McDonald, American Institute of Physics Science Policy News)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

August 1, 2017 at 7:38 pm