Science Policy For All

Because science policy affects everyone.


Science Policy Around the Web December 10th, 2019

By Kellsye Fabian MS, PhD

Image by allinonemovie from Pixabay 

Two New Drugs Help Relieve Sickle-Cell Disease. But Who Will Pay?

The FDA recently approved two new treatments for sickle-cell disease: Adakveo and Oxbryta. Sickle-cell disease is caused by a gene mutation that results in abnormal hemoglobin and sickle-shaped red blood cells. These cells clump together, restricting blood flow in vessels and limiting oxygen delivery to the body's tissues, which leads to severe pain and organ damage. Adakveo, made by Novartis, can prevent the bouts of severe pain that occur when misshapen blood cells get stuck in blood vessels. Oxbryta, made by Global Blood Therapeutics, can prevent disease-induced anemia that can result in permanent damage to the brain and other organs. For a disease that has historically been overlooked, these cutting-edge medicines may represent new hope for people with sickle-cell disease. However, the new drugs might not be accessible to all ~100,000 Americans with the disease. Each treatment must be taken for life and is priced at $100,000 a year, well above the median family income in the US.

Novartis and Global Blood Therapeutics have been in negotiation with insurance providers about covering the new drugs. Both companies are optimistic that most insurers will pay for the new treatments. The companies argue that without these drugs, the management of sickle-cell disease is expensive. Treating sickle-cell disease complications such as pain, stroke, and organ damage costs about $10,000 a year for children and about $30,000 for adults. Not included in that amount is the economic burden on adults who cannot work due to debilitating disabilities associated with sickle-cell disease and on family members who often wind up as caregivers.

Despite this, some experts and patient advocacy groups question the drug makers' justification for the treatments' hefty price tags, arguing that actual development costs and taxpayer support should be considered when setting prices.

More affordable options are available for sickle-cell disease patients. Hydroxyurea, which was approved in 1998 and can reduce the frequency of pain crises and stroke, costs about $1,000 a year. Yet even though some patients on public insurance programs have $0 co-pays for hydroxyurea, only 30% of sickle-cell patients take it. Insurers may therefore require patients to be treated with hydroxyurea before moving on to the more expensive Adakveo or Oxbryta.

Medicaid covers about 50% of sickle-cell disease patients while Medicare covers another 15%. It remains unclear how these programs can afford to pay for all who might need the new drugs. 

(Gina Kolata, The New York Times)

FDA warns Liveyon about selling unapproved stem-cell products that pose a risk to consumers

The FDA has issued a warning to Liveyon Labs and Liveyon LLC of Yorba Linda, California, for making and selling unapproved umbilical cord blood products. The agency also cited significant deviations from safety practices that create serious risks for patients who receive the stem-cell therapy.

In 2018, Liveyon LLC distributed contaminated non-FDA-approved umbilical cord blood products processed by the San Diego-based company, Genetech, Inc. The products were linked to the hospitalization of twelve patients who received the injections or infusions. Liveyon conducted a voluntary recall and began making its own products called Pure and Pure Pro through Liveyon Labs. These products are marketed mainly as a treatment for patients with back, knee, and other joint problems. 

An FDA inspection conducted in May revealed that Liveyon Labs and Liveyon LLC were manufacturing and distributing products that are considered drugs without the approval to do so. An approved biologics license application is needed to lawfully market a drug, and an approved investigational new drug application (IND) is required for a drug to be used in humans during development. No such licenses or INDs exist for Pure and Pure Pro. The inspection also documented that the companies did not meet safety standards, including failing to screen donors' relevant medical records for communicable diseases, inadequate aseptic practices to prevent contamination, and deficient environmental monitoring, such as failing to establish a system for cleaning and disinfecting the processing room and equipment. According to the FDA, Liveyon took some corrective actions after the inspection. However, Liveyon has yet to provide "proof of updated policies and procedures," and it has not addressed its lack of required approvals.

The FDA requested a response from the companies within 15 working days that details how the issues will be corrected. Failure to correct the problems could lead to seizure, injunction or prosecution. Liveyon said it would cooperate with the FDA. 

(Laurie McGinley, The Washington Post)

Written by sciencepolicyforall

December 10, 2019 at 10:57 am

Science Policy Around the Web November 29th, 2019

By Maria Disotuar, PhD

Source: Pixneo

To Drive Down Insulin Prices, W.H.O. Will Certify Generic Versions

Without insulin, a person with Type 1 diabetes cannot survive, yet the cost of and access to insulin remain a problem for individuals suffering from this incurable autoimmune disease. Diabetes mellitus is a chronic metabolic disease characterized by high blood glucose levels. There are two types: Type 1 diabetes results from the loss of pancreatic β-cell function and an inability to produce insulin, a peptide hormone, while people with Type 2 diabetes are resistant to insulin. Those with Type 1 diabetes require daily insulin therapy to stay alive, and patients with Type 2 diabetes require insulin therapy to maintain a healthy lifestyle. Currently, more than 400 million people worldwide have diabetes, and this number is expected to rise in the coming years. A central problem is that there are no generic forms of insulin, and the price of current insulin analogs has climbed from approximately $20 per vial to $250 per vial, depending on the type of insulin. This price increase over the past 20 years has made insulin unaffordable for many individuals, particularly younger Americans struggling to pay student loans. For these individuals, seeing the price of insulin jump from $4.34 to $12.92 per milliliter has meant rationing the lifesaving drug to the bare minimum, a deadly decision for some.

In response to the growing demand for insulin and its skyrocketing price, the World Health Organization (WHO) has proposed a two-year prequalification pilot project that will allow pharmaceutical companies to submit generic insulin to WHO for evaluation of efficacy and affordability. Similar pilot projects have previously been deployed to improve access to lifesaving drugs for malaria, HIV, and tuberculosis; those efforts increased production and market competition, reducing costs for individuals.

Currently, the major producers of insulin, Eli Lilly, Novo Nordisk, and Sanofi, have welcomed the prequalification program, vowing to be part of the solution rather than the problem. According to WHO, companies in several countries, including China and India, have already expressed interest in the pilot project. This shift in insulin production would allow companies producing insulin domestically to enter the global market. As WHO-certified suppliers, these new competitors could dramatically drive down the price of insulin and improve accessibility on a global scale. Despite this positive global outlook, there are still hurdles to cross before Americans can obtain these generic insulin products, chief among them that the US pharmaceutical market is regulated by the FDA, whose review process can be expensive for smaller companies. Nonetheless, Americans are fighting back against the cost of insulin and other life-saving drugs, prompting lawmakers, presidential candidates, and the President to prioritize reduced drug prices. These mounting pressures will hopefully lead to a faster solution for this life-or-death situation.

(Donald G. McNeil Jr., The New York Times)

Will Microneedle Patches Be the Future of Birth Control?

In 2018, The Lancet reported that between 2010 and 2014, 44% of all pregnancies in the world were unplanned. Despite medical advances in sexual and reproductive health, new contraceptive methods are needed to expand accessibility and improve reliability for women. In the United States, the establishment of the Affordable Care Act (ACA) and health policies such as the Federal Contraceptive Coverage Guarantee, which requires private health plans to cover contraceptives and sexual health services, have improved family planning for women of reproductive age. Despite the social and economic benefits of improved family planning and enhanced accessibility, conservatives continue to challenge these health policies. Unfavorable changes could create major barriers to some of the most effective, yet pricier, forms of contraception, such as intrauterine devices (IUDs) and implants. Studies show these long-acting forms of birth control are up to 20 times more effective at preventing unintended pregnancies than shorter-acting methods such as the pill or ring. New long-term contraceptives with reduced cost barriers would thus be essential to reducing unintended pregnancies and enhancing economic benefits on a global scale.

To address this issue, researchers at the Georgia Institute of Technology and the University of Michigan, in partnership with Family Health International (FHI), a nonprofit human development organization, have developed a long-acting contraceptive administered by a patch containing biodegradable microneedles. The patch is placed on the surface of the skin, where the microneedles painlessly come into contact with interstitial fluid; this triggers the formation of carbon dioxide bubbles that allow the microneedles to detach from the patch within one minute of application. The needles themselves do not introduce a new contraceptive hormone; rather, they deliver levonorgestrel (LNG), which is regularly used in IUDs and has been deemed safe and efficacious. After dissociating from the patch, the needles slowly release LNG into the bloodstream.

Thus far, the pharmacokinetics of the patches have been tested in rats, and a placebo version has been tested in humans to evaluate the separation of the needles from the patch. The in vivo animal studies indicate the patch can maintain LNG concentrations at acceptable levels for more than one month, and the placebo patch was well tolerated among study participants, with only 10% reporting transient pain or redness at the application site. Lastly, the researchers analyzed perceptions and acceptability of this new contraceptive method among American, Indian, and Nigerian women compared with oral contraceptives and monthly contraceptive injections administered by a physician. The results indicate women overwhelmingly preferred the microneedle patch over the daily pill (90%) or monthly injections (100%). The researchers expect the patch to be simple to mass-produce and a low-cost contraceptive option, which would reduce cost barriers and improve accessibility for women. Although the results of the study are promising, additional studies will have to address its limitations: future studies will need to include more animals and more human participants, the release profile for LNG will likely need to be extended beyond one month to truly address the need for new long-acting contraceptives, and clinical trials will have to test the efficacy and general reliability of this method at reducing unintended pregnancies. If the microneedle patch is approved, it would be the first self-administered long-term birth control to enter the market, which could ultimately enhance accessibility for women with limited access to health care.

(Claire Bugos, Smithsonian)

Science Policy Around the Web September 20th, 2019

By Allison Cross, PhD

Image from Flickr

Hunt for Cause of Vaping Illness Suggests Multiple Mechanisms of Damage

A vaping-related respiratory illness has affected nearly 500 individuals across three dozen states and has been linked to six deaths since the first case was reported in April. Experts, however, are still uncertain about what is causing the nationwide outbreak, and even about what exactly the condition is.

A report earlier this month from the FDA suggested the agency may have identified the source of the problem: vitamin E acetate, a common contaminant in vaping products. However, more recent information indicates that no single contaminant was identified in all product samples tested from sick individuals. To date, the only thing the nearly 500 individuals who have fallen ill have in common is that they recently vaped in the US or its territories.

On September 16th, the U.S. Centers for Disease Control and Prevention activated its Emergency Operations Center (EOC) to enhance operations and provide additional support to CDC staff working to identify the cause of the disease. The CDC advises those concerned about the outbreak to refrain from using e-cigarettes or vaping products.

E-cigarettes and other vaping products have recently come under scrutiny from those concerned about the rising popularity of vaping among adolescents. Many have been pushing for a ban on flavored e-cigarettes, as these products are believed to deliberately target youth. The recent outbreak has led to renewed calls for a total ban on these and other vaping products. In response, regulators in New York approved a ban on the sale of flavored e-cigarettes on Tuesday the 17th, and Michigan followed suit on Wednesday. The health and human services secretary, Alex M. Azar II, also announced that the FDA is outlining a plan to remove flavored e-cigarettes and nicotine pods from the market, though finalizing this ban will take several weeks.

(Emily Willingham, Scientific American)

Trump’s decision to block California vehicle emissions rules could have a wide impact

California has long struggled to reduce smog in its cities, and for almost four decades, under the federal Clean Air Act, it has been granted special permission by the EPA to set its own air pollution standards. This may soon change, however, as President Trump announced that the administration plans to revoke California's authority to set its own automotive emissions standards and instead aims to set a single national standard. Many are concerned about the more lenient national standard proposed by the Trump administration.

Although California is only one of 50 states, the implications of revoking its authority to set its own emissions standard are far-reaching. The Clean Air Act currently allows other states to adopt the standards set by California, and, as of today, thirteen other states and Washington DC abide by California's stricter standards.

The plan currently proposed by the Trump administration would freeze fuel-efficiency standards for all vehicles after 2020. Experts estimate that this new standard would increase average greenhouse gas emissions from new vehicles in 2025 by 20% compared with the level projected under the current rules.

California leaders have pledged to challenge the administration's decision in court. Other states and environmental groups will likely join in support of California, and the lawsuit may make its way all the way to the Supreme Court.

(Jeff Tollefson, Nature)

Written by sciencepolicyforall

September 20, 2019 at 5:44 pm

Recent trends and emerging alternatives for combating antibiotic resistance

By: Soumya Ranganathan, M.S.

Image by Arek Socha from Pixabay 

Antibiotic resistance is an ongoing and rising global threat. While bacteria and other microbes tend to develop resistance to antibiotics and antimicrobials slowly over time, the overuse and abuse of antibiotics has accelerated this effect and led to the current crisis. The new Global Antimicrobial Surveillance System (GLASS), developed by the World Health Organization (WHO), reveals antibiotic resistance in 500,000 people with suspected infections across 22 countries. A study supported by the UK government and the Wellcome Trust estimates that antimicrobial resistance (AMR) could lead to an annual death toll of about 10 million by 2050. It is also predicted to have a huge economic impact, potentially costing 100 trillion USD between 2017 and 2050.

Factors underlying the non-targeted use of antibiotics

Prescribing the right antibiotic for an infection takes about a week because of the process of identifying the infectious agent. To avoid the spread of infection, physicians are forced to act before the agent is identified and typically prescribe a broad-spectrum antibiotic. Since broad-spectrum antibiotics act against a wide range of bacterial strains, their rampant use has led to the emergence of strains resistant to even the most potent antibiotics available. This trend has made it difficult to treat previously curable hospital-acquired infections and other benign infections. Not only is the discovery of new antibiotics complicated (only one new class of antibiotics has been developed in the past three decades), but the development of an antibiotic, from discovery to medicine, generally takes one to two decades. Here we explore alternative strategies scientists around the world are pursuing in the fight against antibiotic resistance.

Antibiotic Susceptibility Test  

Reducing the time between a patient becoming ill and receiving treatment is critical for containing and effectively treating an infection. A key part of this effort entails improving the antibiotic susceptibility testing (AST) system, which typically has two steps: (i) identifying the infectious agent and (ii) identifying the most effective antibiotic to treat the infection.

Conceptually, new and rapid AST systems have been proposed and developed thanks to advances in phenotyping methods, digital imaging, and genomic approaches. But a plethora of factors act as roadblocks to implementing rigorous and standardized AST systems worldwide. A recently published consensus statement explores the major roadblocks to the development and effective implementation of these technologies while also suggesting ways to move past this stalemate. The major points of the statement are summarized below.

  • Regulation – Since different regions and countries have their own requirements for marketing and validating a diagnostic method, the onus is on developers to meet various demands. Addressing this requires harmonization and cooperation among policy makers to formulate and agree on a standard set of rules.
  • Collection and dissemination of information about strains and antibiotics – Antibiograms, summaries of the susceptibility rates of selected pathogens to a variety of antimicrobial drugs, provide comprehensive information about local antibiotic resistance. The challenge lies in making the data available in real time and in developing a "smart antibiogram". This is necessary to analyze samples more quickly and reduce the time to treatment, which ultimately translates into lives saved.
  • Cost involved in developing new, sensitive, and faster diagnostics – Though current diagnostics are cheap, they are slow to identify pathogenic bacteria. The transition to more advanced and sensitive diagnostics has been slow because their development takes time and incurs more cost. However, this scenario is likely to change as rising levels of antibiotic resistance make existing diagnostics obsolete, effectuating more investment in this sector.
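At its core, the "smart antibiogram" idea is a lookup over a table of local susceptibility rates. A minimal sketch in Python of what such a lookup might involve (the pathogens, drugs, and rates below are invented for illustration, not real surveillance data):

```python
# An antibiogram as a nested dict: pathogen -> {antibiotic: percent of
# local isolates susceptible}. All numbers here are hypothetical.
antibiogram = {
    "E. coli":   {"ampicillin": 55, "ciprofloxacin": 82, "nitrofurantoin": 96},
    "S. aureus": {"oxacillin": 70, "vancomycin": 100, "clindamycin": 85},
}

def best_empiric_choice(pathogen: str) -> tuple[str, int]:
    """Return the antibiotic with the highest local susceptibility rate."""
    rates = antibiogram[pathogen]
    drug = max(rates, key=rates.get)  # pick the key with the largest value
    return drug, rates[drug]

drug, rate = best_empiric_choice("E. coli")
print(f"{drug}: {rate}% of local isolates susceptible")
```

A real system would, of course, pull these rates from live laboratory data and update them continuously; the point of the sketch is only that once the data exist in a queryable form, the "smart" part is a simple, fast computation.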

Antivirulence therapy

Small molecules are gaining prominence as alternatives or adjuvants to antibiotic treatments. Recently, researchers from Case Western Reserve University developed two small molecules, F19 and F12, that show promise in treating methicillin-resistant Staphylococcus aureus (MRSA) infection in mouse models. The small molecules bind to a Staph. aureus transcription factor called AgrA, deterring it from making toxic proteins and rendering the bacteria harmless. Treatment with F19 on its own resulted in a 100% survival rate in a murine MRSA bacteremia/sepsis model, while only 30% of untreated mice survived. This kind of antivirulence therapy allows the immune system to clear the pathogens (since the bacteria are essentially harmless) without increasing pressure to develop resistance. When used as an adjuvant with an antibiotic, F19 resulted in one-tenth as many bacteria in the mouse bloodstream as treatment with the antibiotic alone. This kind of combination therapy could be used in immunocompromised patients. It has also been effective against other bacterial species such as Staph. epidermidis, Strep. pyogenes, and Strep. pneumoniae, and may serve as an arsenal against a broad variety of gram-positive bacterial infections. Overall, the small-molecule approach could also bring many previously shelved antibiotics back into use by improving their efficacy in treating bacterial infections. Another class of engineered proteins, called Centyrins, shows promise in treating Staph. aureus infection through a similar mechanism: they bind to the bacterial toxins and prevent them from disrupting the immune system.

Molecular Boosters

Stanford University chemists (in a study published in the Journal of the American Chemical Society) have developed a booster molecule called r8 which, when used in combination with vancomycin (a first-line antibiotic for MRSA infections), helps the antibiotic penetrate the biofilm and persist there, enabling it to attack pathogens once they resurge from their dormant stage. This small-molecule booster approach could be pursued further to give existing antibiotics additional abilities to besiege pathogens and arrest the spread of infections.

Photobleaching

A recent collaborative effort by scientists from Purdue University and Boston University has produced an innovative light-based approach called photobleaching (using light to alter the activity of molecules) to treat certain bacterial infections. Photobleaching of MRSA using low-level blue (460 nm) light breaks down STX, an antioxidant pigment found in the bacterial membrane. Since STX protects the bacteria against neutrophils (a class of white blood cells involved in the body's immune response), prior attempts were made to eliminate STX with medication, but those efforts were futile. Photolysis of STX transiently increases the permeability of the bacterial membrane, rendering the bacteria more susceptible to even mild antiseptics like hydrogen peroxide and other reactive oxygen species. Since pigmentation is a "hallmark of multiple pathogenic microbes," this technology could be extended to other microbes to tackle resistance. In addition to advantages such as ease of use and development, photobleaching could be applied with minimal or no adverse side effects.

Antisense Therapy

One consequence of the non-targeted use of antibiotics has been the occurrence of C. difficile infection in the colon, a condition caused by the elimination of useful gut bacteria along with the harmful ones. To tackle this infection, Dr. Stewart's team at the University of Arizona has developed an antisense therapy that silences genes responsible for the survival of the pathogenic bacteria while sparing other useful bacteria in the gut. The strategy uses molecules with two components: an antisense oligonucleotide moiety that targets the genetic material in C. diff and a carrier compound that transports the oligonucleotide into the bacterium. Though this approach shows potential as a targeted, less toxic, nimble, and cost-effective alternative against existing and evolving pathogens, clinical trials must be undertaken to see its effects in practice.

Future perspectives

In addition to the aforementioned strategies, the scientific community is pursuing immune modulation therapy, host-directed therapy, and probiotics to deal with the current AMR crisis. The problem with developing new antibiotics is that microbes will eventually develop resistance to them. Though time will reveal which approaches are truly effective at evading antibiotic resistance, the looming threat must be dealt with prudently. Clinicians, researchers, companies, global health experts, the public, and policy makers must adopt a holistic approach that restricts antibiotics to targeted use while pursuing alternative therapies, in order to curb the resistance emergency.

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

April 26, 2019 at 4:35 pm

Posted in Essays


The need for regulation of artificial intelligence

By: Jayasai Rajagopal, Ph.D.


Source: Wikimedia

The development and improvement of artificial intelligence (AI) portends change and revolution in many fields. A quick glance at the Wikipedia article on applications of artificial intelligence highlights the breadth of fields already affected by these developments: healthcare, marketing, finance, music, and many others. As these algorithms grow in complexity and in their ability to solve more diverse problems, the need to define rules by which AI is developed becomes ever more important.

Before explaining the potential pitfalls of AI, a brief explanation of the technology is in order. Attempting to define artificial intelligence raises the question of what is meant by intelligence in the first place. Poole, Mackworth, and Goebel clarify that to be considered intelligent, an agent must adapt to its surrounding circumstances, learn from changes in those circumstances, and apply that experience in pursuit of a particular goal. A machine that can adapt to changing parameters, adjust its programming, and continue to pursue a specified directive is an example of artificial intelligence. While such simulacra are found throughout science fiction, dating back to Mary Shelley's Frankenstein, they are a more recent phenomenon in the real world.

Development of AI technology has taken off within the last few decades as computer processing power has increased. Computers began successfully competing against humans in chess as early as 1997, with Deep Blue's victory over Garry Kasparov. In recent years, computers have started to earn victories in even more complex games such as Go and even video games such as Dota 2. Artificial intelligence programs have become commonplace at many companies, which use them to monitor their products and improve the performance of their services. A 2017 report found that one in five companies employed some form of AI in their workings. Such applications are only going to become more common in the future.

In the healthcare field, the prominence of AI is readily visible. A report by BGV predicted a total of $6.6 billion invested in AI within healthcare by 2021. Accenture found that this could lead to savings of up to $150 billion by 2026. With the recent push toward personalized and precision medicine, AI can greatly improve treatment and quality of care.

However, there are pitfalls associated with AI. At the forefront, AI poses a potential risk of abuse by bad actors. Companies and websites are frequently reported in the news for being hacked and losing customers' personal information. The 2017 WannaCry attack crippled the UK's healthcare system, as regular operations at many institutions were halted by compromised data infrastructure. While cyberdefenses will evolve with the use of AI, there is a legitimate fear that bad actors could just as easily use AI in their attacks. Regulating the use and development of AI can limit the number of such actors with access to these technologies.

Another concern with AI is the privacy question associated with the amount of data required. Neural networks, which seek to imitate the neurological processing of the human brain, require large amounts of data to reliably generate their conclusions. Such data must be curated carefully to ensure that identifying information that could compromise the privacy of citizens is not easily divulged. Additionally, data mining and other AI algorithms could reveal information that individuals may not want revealed. In 2012, a coupon suggestion algorithm used by Target was able to discern the probability that some of its shoppers were pregnant. This proved problematic for one teenager, whose father wanted to know why Target was sending his daughter coupons for maternity clothes and baby cribs. As with the cyberwarfare concern, regulation is a critical component of protecting the privacy of citizens.

Finally, in some fields, including healthcare, there is an ever-present concern that artificial intelligence may replace some operations entirely. In radiology, for example, there is a fear that improvements in image analysis and computer-aided diagnosis driven by neural networks could replace clinicians. For healthcare in particular, this raises several important ethical questions. What if an algorithm's diagnosis disagrees with a clinician's? Since an algorithm's knowledge is limited by the information it has been exposed to, how will it react when presented with a unique case? From this perspective, regulation of AI is important not only to address practical concerns but also to pre-emptively answer ethical questions.

While regulation as strict as Asimov's Three Laws may not be required, a more uniform set of rules governing AI is needed. At the international level, there is much debate among members of the United Nations about how to address cybersecurity. Other organizations, such as the European Union, have made more progress: a document recently released by the EU highlights ethical guidelines that may serve as the foundation for future regulations. At the domestic level, scientists and leaders in the field have pushed to harness the development of artificial intelligence for the good of all. In particular, significant headway has been made in the regulation of self-driving cars. Laws passed in California restrict how the cars can be tested, and by 2014, four states already had legislation applying to these kinds of cars.

Moreover, the FDA recently released a statement expressing its approach to the regulation of artificial intelligence in the context of medical devices. At the time of this writing, a discussion paper describing the FDA's proposed approach is open for comment. The agency notes that conventional methods of acquiring pre-market clearance for devices may not apply to artificial intelligence; the newly proposed framework adapts existing practices to the context of software improvements.

Regulation must also be handled with care. Overly restrictive limits on the use of, and research into, artificial intelligence could stifle development. Laws must be made with an understanding of the benefits that new technological advancements could bring. As noted by Gurkaynak, Yilmaz, and Haksever, lawmakers must strike a balance between preserving the interests of humanity and securing the benefits of technological improvement. Indeed, artificial intelligence poses many challenges for legal scholars.

In the end, artificial intelligence is an exciting technological development that can change the way we go about our daily business. With proper regulation, legislation, and research focus, this technology can be harnessed in a way that benefits the human experience while preserving both innovation and the security of persons.

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

April 18, 2019 at 2:25 pm

Science Policy Around the Web – April 2, 2019

leave a comment »

By: Patrice J. Persad Ph.D.

Image by Jason Gillman from Pixabay

Worrisome nonstick chemicals are common in U.S. drinking water, federal study suggests

What lurks in our drinking water—and all its effects on organismal health—may be more of a mystery than what resides in the deep recesses of our oceans. In a recent investigation conducted by the United States Geological Survey and the Environmental Protection Agency (EPA), drinking water samples were analyzed for contamination by manmade per- and polyfluoroalkyl substances (PFAS). PFAS, which put the "proof" in waterproof items, are substances of concern, or, more aptly, contaminants of emerging concern (CECs), given their potential carcinogenicity and persistence in ecosystems. Perfluorooctanoic acid (PFOA), a PFAS no longer produced domestically, was found in one sample at a concentration over 70 nanograms per liter (ng/l), and a trio of other PFAS surpassed this concentration level as well. Federal agencies have yet to issue a binding standard. However, the Centers for Disease Control and Prevention (CDC) maintains that the existing cut-off of 70 ng/l is not sufficiently low, or conservative, with respect to human health.

The Environmental Working Group (EWG) suspects that over 100 million individuals in the U.S. drink water containing PFAS. Citizens are currently advocating for authorities to test drinking water samples and disclose PFAS concentrations. Without set standards, accountability for future harms to health remains up in the air. Only through discussion among the public, policy makers, the research community, and parties that formerly or currently produce PFAS can we set safeguards to protect both our water supply and our well-being.

(Natasha Gilbert, Science)


To Protect Imperiled Salmon, Fish Advocates Want To Shoot Some Gulls

In recreating the fundamental question "Who stole the cookies from the cookie jar?", nature's version spins off as "Who stole the juvenile salmon from Miller Island?" In this spiraling whodunit, an unexpected avian culprit surfaces: the gull. According to avian predation coordinator Blaine Parker, surveys revealed that a fifth of imperiled juvenile salmon were whisked away by gulls near channels flowing out of dams. Gulls also spirited these juvenile fish away from other avian predators, such as Caspian terns. Parker maintains that not every gull is culpable for the species' decline; gulls can assist with population control of other birds that feast on the juveniles. He therefore supports killing only the individual gulls preying on juvenile salmon, an approach known as lethal management.

Although there is precedent for sacrificing avian species for the security of juvenile salmon, several entities denounce lethal management of wayward gulls affecting the young fish's survival rates. The Audubon Society of Portland points out that the Army Corps of Engineers' modifications to dams for warding off gulls and other airborne predators are slipshod and ineffective, if not nonexistent. The U.S. Army Corps, despite this criticism, avows that killing specific gulls is only a last resort. From Parker's and these organizations' opposing viewpoints, a new mystery migrates to the surface: will killing avian predators populating dams and waterways have a significant impact on the endangered salmon's survival? Collaborative research on ecological impacts may be a way to tell, or to reassess, the futures of both juvenile salmon and gulls.

(Courtney Flatt, Northwest Public Broadcasting/National Public Radio)



Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

April 3, 2019 at 10:32 am

Science Policy Around the Web – March 26, 2019

leave a comment »

By: Neetu M. Gulati Ph.D.

Image by Dimitris Vetsikas from Pixabay

Sunscreen bans aimed at protecting coral reefs spark debate – among scientists

Corals around the world have begun "bleaching," turning white and expelling the algae that live within them. After a 2015 study found that oxybenzone can harm corals, environmentalists have worked to bar the sale of sunscreens containing the chemical. Last year, Hawaii became the first US state to ban the sale of sunscreens containing oxybenzone, as well as another harmful chemical, octinoxate; the two are found in up to 75% of sunscreens on the US market. The ban will go into effect in 2021, and Florida and California are considering similar laws. However, while some are fighting to limit the use of these toxic chemicals, others say the major issue is not sunscreen; it's climate change.

Evidence indicates that harmful chemicals and oceans warming due to climate change are both damaging corals and leading to bleaching. Scientists agree that climate change is the major contributing factor and that the chemicals play a lesser role. Nevertheless, they disagree about what should be done. C. Mark Eakin, an oceanographer and coordinator of NOAA's Coral Reef Watch program, commented, "if we don't deal with climate change, it won't matter what we do about sunscreens." Furthermore, some believe there is not enough clear evidence of how damaging these chemicals are. While many scientists share this viewpoint, others think that every step toward saving the corals matters. Some lawmakers agree with this philosophy; Teri Johnston, the mayor of Key West, Florida, said of banning the harmful chemicals, "if it's something we can do to minimize damage to reefs, it's one small step we're going to take." The city of Key West banned the sale of sunscreens containing oxybenzone and octinoxate last month, a ban that will also take effect in 2021.

Damage to coral reefs is a complicated issue, with multiple stressors likely to be involved: not only climate change and sunscreens, but also pollution and other harmful chemicals. While many are worried about protecting the reefs, there is also concern as to how these bans will affect human health. In response to the Hawaii ban, the Skin Cancer Foundation put out a statement which said, “by removing access to a significant number of products, this ban will give people another excuse to skip sun protection, putting them at greater risk for skin cancer.” 

One possible solution is to expand the number of ingredients permitted in sunscreen, allowing for other protective chemicals that are less harmful to the environment. The FDA has not expanded its list of approved ingredients in approximately 20 years. Europe, by comparison, allows more chemicals, on the theory that any single chemical will have a less harmful environmental impact when a greater diversity of ingredients is permitted. Toward this end, the FDA recently proposed new regulations to improve American sunscreens.

(Rebecca Beitsch, Washington Post

In a first, U.S. private sector employs nearly as many Ph.D.s as schools do 

The career landscape for burgeoning PhDs has changed drastically in the last 20 years: while the number of PhDs awarded has increased, especially in the life and health sciences, the proportion of PhDs employed in tenured and tenure-track positions has declined. This contrasts with the view of some current faculty members, who may assume that tenure-track positions are the standard path for PhDs and that other career paths are "alternative." According to the Survey of Doctorate Recipients from the US National Science Foundation (NSF), in 2017, for the first time, private-sector employment of PhDs (42%) was nearly equivalent to employment by educational institutions (43%). This is in stark contrast to 1997, when educational institutions employed 11% more PhDs than the private sector. While the survey includes all PhDs under the age of 76 who are employed full-time in the US, newer PhDs are expected to be even less likely to secure tenure-track positions.

As career trajectories change, some universities are using new information about PhD outcomes to improve programming for current graduate students and prospective students. According to the Coalition for Next Generation Life Science, ten academic institutions have released data online about the career outcomes of their PhD graduates, with more institutions planning to release similar data by the end of next year. The data indicate that the traditional model of training, which treats graduate school like an apprenticeship to becoming faculty, is outdated. Skills that transfer beyond educational institutions may be necessary to successfully train the next generation of PhDs.

(Katie Langin, Science)



Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

March 26, 2019 at 5:00 pm