Science Policy For All

Because science policy affects everyone.

Archive for May 2021

On the negative public health and environmental impacts of fracking: evidence builds, but more data is needed

By Trisha Tucholski, PhD

Marcellus shale gas-drilling site along PA Route 87, Lycoming County by Nicholas A. Tonelli from Pennsylvania, USA, CC BY 2.0 https://creativecommons.org/licenses/by/2.0, via Wikimedia Commons

Hydraulic fracturing, or “fracking”, is a process that uses large volumes of fluid at high pressures to break impermeable rock formations (e.g., shale) deep below the earth’s surface to extract natural gas and crude oil. The practice of fracking has boomed over the past 20 years in the United States, largely due to two transformative technological advances – the use of slickwater (a cocktail of water, sand, and chemicals) and horizontal drilling. These advances allowed for the increased production of natural gas and crude oil in the U.S., positioning the country as a global leader in natural gas and petroleum production. The newfound ability to produce gas and oil at high volumes has decreased U.S. dependence on foreign sources and has led to lower energy prices. In addition, fracking advocates suggest that since natural gas is cleaner than coal in terms of carbon emissions (producing half the carbon dioxide per unit of energy), it may help curtail global warming. This stance remains controversial, since the greenhouse gas methane (80 times more potent than carbon dioxide) can leak from natural gas wells. Fracking has also been associated with economic growth and has produced new gas and oil industry jobs, especially in states with shale deposits. The mid-2000s saw an increase in drilling for oil and gas in states like North Dakota, Texas, and Pennsylvania. Despite the fracking boom, scientific investigations into the environmental and public health impacts of fracking have not kept up, and epidemiology experts have concluded that more data must be systematically collected to ensure public health and safety needs are met.

Though the fluid used to fracture shale, or “frack fluid,” contains mostly water, chemical additives are included for different purposes, such as to decrease the viscosity of the fluid to improve the flow of oil to the surface. The U.S. Environmental Protection Agency (EPA) reported 1,084 different frack fluid additives between 2005 and 2013. The concern is not the use of these chemicals to fracture shale and release gas and oil, but what happens to the frack fluid, which becomes hazardous wastewater during the process. A report prepared by Argonne National Laboratory estimated that 260 barrels (10,920 U.S. gallons) of wastewater were produced per unit of natural gas extracted in 2007. Wastewater from fracking is often stored in lined pits or underground wells, meaning that it has the potential to contaminate drinking water sources, such as wells or underground aquifers. Frack fluid leaks, unintentional rock fractures, poor well construction, and wastewater mismanagement are just a few sources of water contamination caused by fracking. Many of the chemicals found in frack fluid and the resulting wastewater are known to be harmful to human health (arsenic, mercury, cadmium, benzene, toluene, ethylene glycol, and methanol, to name a few). In 2016, a group from Yale systematically evaluated 1,021 chemicals found in fracking fluids and wastewater; of the substances with available toxicity data, many were found to have reproductive or developmental toxicity. Alarmingly, the group found that toxicity information did not exist for 76% of the chemicals on their list. Despite regulations in some states requiring disclosure of frack fluid additives, many companies avoid disclosing the chemicals in their recipes altogether by classifying them as trade secrets. The EPA reported that up to 70 percent of the chemical disclosure forms submitted between 2011 and 2013 listed at least one chemical as “confidential business information.” Without knowing the full repertoire of chemical additives used in frack fluid across the country, it is challenging to study the health effects of fracking pollution or to properly manage the large volume of wastewater.

It is not only the chemicals added to frack fluid that have the potential to cause harm, but also the naturally occurring substances, such as heavy metals, toxic hydrocarbons, and radioactive materials, that are picked up during the fracking process and carried with the wastewater into the environment. Residents of Estill County, Kentucky, were outraged to find that 1,900 tons of radioactive fracking waste had been illegally buried in a local landfill. Exposure to the toxic waste has real-life implications for people who live nearby, and there have been numerous reports of observed negative health impacts. A concerned mother and pharmacologist who lives within one mile of 158 wells in Colorado had her son’s blood tested for volatile organic compounds. The test found unusually high levels of the carcinogenic chemicals benzene, ethylbenzene, and o-xylene in his blood. In my home county in southwestern Pennsylvania, Westmoreland County, where fracking is prevalent, there are 251 active natural gas wells with 57 reported violations. Not insignificantly, a high number of rare cancer cases in young people has prompted the Centers for Disease Control and Prevention to investigate a possible causal link between fracking and public health consequences. Between 2011 and 2018, 12 cases of Ewing sarcoma (a rare cancer affecting 3 in 1 million children annually) were diagnosed in Westmoreland County. As of June 2019, the Pittsburgh Post-Gazette had documented 67 cases of cancer in children and young adults in four southwestern PA counties, with 27 of those being Ewing sarcoma. Despite the alarming number of cancer cases, there is hesitance to link them to fracking, partly due to a lack of data. Now, Pennsylvania’s governor, Tom Wolf, has asked the state’s Department of Health to systematically investigate the public health risks associated with fracking, citing that the “[economic] benefits [of Pennsylvania’s natural gas development] should not require a choice between them and public health or safety.” Governor Wolf has recently funded two two-year studies to be carried out at the University of Pittsburgh that will investigate the potential relationship between rare childhood cancers and fracking in southwestern PA.

Fracking has several other environmental impacts, including water usage, land usage, pollution of aquatic ecosystems, and natural gas leaks. The United States Geological Survey indicates that anywhere from 1.5 million to 16 million gallons of water are used to frack a single well. Fracking can draw from freshwater supplies that would otherwise be used for drinking water, irrigation, and aquatic ecosystems. The water used in the process may never be recycled, as extensive treatment of fracking wastewater would be required to return it to the freshwater cycle. The amount of water used and discarded by fracking continues to grow and could increase by up to 50-fold by 2030, putting an enormous strain on water supplies, especially in areas where freshwater is already a scarce resource. Air pollution related to fracking is also a concern, since methane, a potent greenhouse gas, can leak from natural gas wells after drilling. Dangerous levels of the flammable gas have also been found in drinking water wells near drilling sites.

Systematic investigation into the environmental and public health impacts of fracking has been limited, but is critically important if fracking is to continue. The 2016 report released by the EPA determined that under “some circumstances” fracking contaminated drinking water, but the report could not draw conclusions about widespread or systemic effects, largely due to a lack of data. Given the potential harm to the environment and public health, measures should also be taken to regulate and enforce safe fracking practices. This would entail reporting and mitigating chemical leaks and enforcing proper wastewater management. Currently, fracking is regulated at the state level, and policies vary from state to state. Despite sitting atop the Marcellus Shale, New York became the first state to ban fracking, determining that the environmental and public health harms outweigh the economic benefits. The federal government has taken few steps to regulate the process; instead, fracking has been exempted from parts of the Clean Water Act, the Clean Air Act, the Safe Drinking Water Act, and the EPA’s hazardous-waste laws. The Biden administration did place a temporary ban on new drilling permits for federal lands in January, but it is not enough. The federal government should fund systematic investigations into the environmental, climate, and public health consequences of fracking, and institute policies which require proper wastewater management, storage, and disposal. In addition, loopholes which exempt the fracking industry from essential environmental policies should be removed. One additional way to mitigate the environmental and public health harms imposed by fracking is to reduce our reliance on natural gas and crude oil by immediately investing in greener energy sources, such as solar and wind energy (which impose their own environmental impacts). After all, natural gas is thought to be a bridge to renewable energy. Reducing or removing fracking as a practice has been a contentious topic, primarily because it would affect local economies and individuals whose livelihoods are based in the fracking industry. However, given the growing evidence of environmental and public health impacts imposed on those same communities, it is worth considering whether the harms imposed by fracking outweigh the benefits.

Written by sciencepolicyforall

May 31, 2021 at 8:27 am

Science Policy Around the Web May 28, 2021

By Somayeh Hooshmand, PhD

Image by Bob Dmyt from Pixabay

FDA official says heart issue possibly linked to Covid vaccines is rare, would inoculate own kids

Getting vaccinated is the best way to defeat the COVID-19 pandemic; however, vaccines are still subject to debate over whether they are safe and effective for children. Although fewer children have been sick and tested positive for COVID-19 compared to adults, they can still get infected and transmit the virus to others.

The CDC recommends that everyone ages 12 and older get the Pfizer-BioNTech COVID-19 vaccine. It is important to vaccinate children to help protect them against COVID-19 as well as to establish herd immunity. Vaccination against COVID-19 has now started for older children; however, several dozen cases of a mostly mild heart problem called myocarditis have been reported after COVID-19 vaccination.

According to the Mayo Clinic, “Myocarditis is an inflammation of the heart muscle that can affect the heart’s electrical system, reducing its ability to pump and causing rapid or abnormal heart rhythms.” This condition was observed more often in males than in females; most cases were mild, usually occurred after the second dose, and often went away without requiring treatment.

Dr. Marks, the FDA’s top vaccine regulator, said that rare cases of this heart inflammation condition have been reported in vaccinated teens and young adults, and that it remains unclear whether the vaccine is the cause of the heart problem. There is consensus among scientists and health experts that a side effect is an expected reaction to the COVID-19 vaccine and a sign that your body is starting to build immunity against the disease. Most people experience only mild side effects after vaccination, and even if myocarditis were caused by the COVID-19 vaccine itself, the risk of this side effect is very low compared with the risks of being infected with COVID-19.

(Berkeley Lovelace Jr., CNBC)

Written by sciencepolicyforall

May 31, 2021 at 8:10 am

COVID-19 vaccination in low-income countries: Strategies to overcome the challenges

By Jubayer Rahman, PhD

Image by Frauke Riether from Pixabay

The aggressive spread of COVID-19 around the globe has claimed millions of lives and left survivors with serious post-COVID health issues. The COVID-19 pandemic has not only had a tremendous impact on health, but also on the economic and social fabrics of our societies. It is conceivable that the current pandemic is not going to be over soon. Moreover, experts have reiterated that, without global cooperation, COVID-19 infections will continue to persist. Making effective vaccines against COVID-19 widely available is perhaps the highest priority for bringing lives and economies back to normal.

Sadly, the whole world is currently going through the COVID-19 pandemic, and it is urgent that case numbers are reduced while at the same time ensuring public safety. Among the goals of vaccines are to protect people’s lives and to prevent future pandemics. The first challenge is to reduce the burden of infection in people who are the most vulnerable through COVID-19 vaccination. Next, it is important to offer mass vaccination to the remaining population. In this endeavor, rapid production, distribution, and dispensing of medical countermeasures are vital in the face of the COVID-19 surge. Mass vaccination is the goal for every country during a pandemic in order to reduce the burden on the healthcare system and economy and to restart social life. This will only be possible when a successful vaccine is available for everyone.

Mass vaccination and the ultimate goal

According to the World Health Organization (WHO), routine immunization is the most successful intervention in reducing morbidity and mortality associated with vaccine-preventable diseases. For example, routine vaccination has driven the global incidence of smallpox to zero and that of polio to near zero. Routine vaccination creates an immune barrier against pathogens, which are defeated by vaccine-induced immune responses before they can spread widely. Natural or vaccine-induced immunity develops slowly and can take a few weeks before a person is considered fully immune to a specific infection. When an infection is caused by a highly contagious pathogen, like SARS-CoV-2, the virus that causes COVID-19, it is a global priority to limit the spread of infection because some individuals in the community are extremely vulnerable to the infection. In order to protect the vulnerable population, we need either an effective treatment strategy or vaccination as a preventive measure. The idea behind mass vaccination is to immunize approximately three-fourths of the population. Once these numbers are reached, the rest of the non-vaccinated people should be protected as well because the infectious pathogen would not find enough hosts to propagate. Therefore, to achieve such herd immunity in the community, we must carry out an effective, safe, and massive vaccination program.
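
The “approximately three-fourths” figure reflects the standard herd-immunity threshold from basic epidemic models. As a rough illustration (the reproduction number used here is an assumed value for this sketch, not one given in the article): the threshold is 1 - 1/R0, so for a virus with R0 of about 4 it is 1 - 1/4 = 0.75, meaning roughly 75% of the population would need to be immune before the pathogen runs out of hosts; a less transmissible pathogen with R0 of about 2 would need only around 50%.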

Which vaccines should be targeted? COVID-19 specific frontline vaccine candidates

Vaccine developers made history by developing COVID-19 specific vaccines for human use in record time. The US government invested hundreds of millions of dollars early in the pandemic for vaccine discovery, resulting in two successful mRNA vaccines against COVID-19. Like the US, many other countries made deals to receive vaccines as soon as they were available. There are nearly 150 vaccine candidates around the world at different stages of development or in clinical trials. Only a few of them recently completed phase III clinical trials with very satisfactory data. Two different vaccine platforms have been used to develop successful vaccines against COVID-19: gene-based and protein-based technologies. Protein-based technology has been used for many other vaccines in the form of inactivated or subunit vaccines. The purpose is to expose the immune system to a foreign antigen in order to develop immunity ahead of encountering the pathogen. This older technology (inactivated vaccines) works for many other pathogens, but it can also generate unwanted immune responses (such as antibody-dependent enhancement) and may pose a risk of complications in vaccinated individuals. To avoid such problems, the gene-based vaccine platform (mRNA) has been under investigation for almost 10 years but had never entered large-scale human clinical trials until recently. Therefore, it is remarkable that the Pfizer and Moderna mRNA vaccines against COVID-19 received emergency authorization for human use.

Challenges towards mass vaccination

New regulations are desperately needed for COVID-19 vaccine storage, distribution, and speedy immunization programs. The new COVID-19 vaccines may potentially differ from the existing vaccines in the Expanded Program on Immunization (EPI). Thus, introduction of COVID-19 vaccines in low-income countries would require more information regarding the types of vaccines available. Incorporation of COVID-19 vaccines into current vaccination programs, making changes in policy for better training, and increasing efficiency to reach the target groups are all essential at this point. Unfortunately, the decision-making process is very complex due to the multiple factors and offices involved. Thus, it is important for experts to be up to date on recent data and to understand any new vaccines. In this pandemic, it is likely that multiple vaccine candidates will gain emergency use authorization, but few, if any, vaccine candidates will have time to gather complete research data on long-term safety and efficacy. Vaccine storage and distribution requirements will also differ between types of vaccines. Countries need to be prepared for smooth distribution of vaccines.

The biggest challenge and preparedness

It has become very challenging for many countries to secure the required doses of an effective vaccine because of vaccine ‘grabbing’ by rich countries. This unequal race for securing vaccines through pre-orders may deprive billions of people in developing countries of receiving vaccines. In this rapidly evolving situation, low-income countries should not simply wait to get a vaccine through COVAX, a global taskforce established for equitable vaccine distribution around the globe. COVAX is deviating from its primary objective of fair and equitable access for every country in the world due to the unequal race for securing vaccines.

COVAX ensures access to COVID-19 vaccines either by signing commitments with vaccine developers for a certain number of doses or by seeking help through GAVI (the Global Alliance for Vaccines and Immunizations) or the WHO to get enough doses. Unlike with other routine vaccination programs that have been running successfully throughout the globe, the shortage of COVID-19 vaccines has become a serious problem. Nearly all countries have put out requests, directly or indirectly, to secure COVID-19 vaccines within a short period of time.

It is likely that typical mass vaccination will not be possible in low-income countries because of the race to pre-order vaccines. Unfortunately, different COVID-19 vaccines are likely to have different requirements in terms of storage temperature. Some vaccines, whether shipped in frozen vials or in lyophilized powder form, require a certain temperature for storage. The Pfizer COVID-19 vaccine needs an ultra-cold storage temperature (-20 to -70°C). It would be an enormous task for many developing countries to establish a country-wide system for transportation, distribution, and storage of ultra-frozen vaccines. In such a scenario, governments should promote private initiatives for vaccines requiring ultra-cold storage temperatures. Though transportation could be done with dry-ice boxes, vaccine doses must be administered within days of thawing, as increased temperature may destroy the vaccine. Since the Pfizer vaccine is currently being used in the US, Canada, and the UK, a number of strategies have already been taken to distribute it efficiently. Identification of the first group of people to be vaccinated, shipping and storage of vaccine lots in different states within a country, and establishment of major distribution sites will ease the process. The most favorable solution to help the COVID-19 vaccination program includes the introduction of mobile vaccine clinics to reach local communities, which will allow more rural areas to be reached. Announcing the arrival dates and locations of mobile vaccine clinics in a community ahead of time will make the process faster and more effective.

There are a significant number of challenges in many countries regarding the distribution of COVID-19 vaccines. These challenges include vaccine storage, deciding who receives the vaccine first, and follow-up visits to determine whether vaccinated individuals are developing antibodies. Due to the extreme constraints on the availability of COVID-19 vaccines, every country needs to create a distribution strategy. First, we need to understand who the most at-risk or vulnerable individuals are based on pandemic incidence data. After determining those populations, the required health-care professionals must be located and incorporated into the distribution plan. Next, logistic support must be added as part of the preparedness plan, which could even include calling on military personnel to help monitor the storage and transportation of vaccines. Another critical point is to follow up with patients who have received a vaccine that requires two doses scheduled a month apart. It is also critical to keep a record of which vaccine was given to an individual to ensure they receive the same vaccine for their second dose.

Second, it is not as urgent to get the vaccine to those who are relatively healthy and young, as they fall into the lower-risk group. Similarly, people who have a history of COVID-19 exposure and recovered recently might not need to be prioritized for the vaccine. Instead, COVID-19 specific antibody tests could be given to people in zones of high COVID-19 cases, such as doctors, nurses, and associated workers who interact with COVID-19 patients on a daily or weekly basis. If people have a high antibody titer, they may be able to wait to get the vaccine.

Governments should work with pharmaceutical companies and vaccine developers to ensure that countries get priority access to suitable vaccines and secure more doses of a vaccine. Another possibility is to use the vaccine fill-finish platforms available at pharmaceutical companies in many low-income countries. Fill and finish is the process of filling vials with vaccine and packaging them ready for distribution. This would help to secure enough doses of vaccines within a short time.

Finally, it is noteworthy that the whole world needs to work together on COVID-19 vaccine production, distribution, and immunization to reduce the burden of the disease. Similarly, it is our responsibility at national and individual levels to understand the need and cooperate with the decisions taken by governments. It is important that, amid this global crisis and scarcity of vaccine production, individuals follow safety precautions until they are fully vaccinated and that precautions are taken not to waste any vaccine doses.

Written by sciencepolicyforall

May 28, 2021 at 10:11 am

The changing world of equitable sharing: from genetic resources in Nairobi to digital sequence information in Kunming

By Ann M. McCartney, PhD

Image by Darwin Laganzon from Pixabay

Open data sharing as it pertains to biodiversity and conservation research

The Bermuda Accord, Human Genome Project, and Fort Lauderdale Agreement revolutionized data sharing amongst the human genomics research community. Since then, open data sharing has spread into the burgeoning -omics fields, with over 1.4 billion Creative Commons licenses now in existence. The field of biodiversity and conservation genomics is no exception, as international efforts such as the Earth BioGenome Project and the Vertebrate Genomes Project champion open sharing of sequencing information on their mission to catalogue all of life, prevent further biodiversity loss, and inform conservation strategies for species threatened by extinction.

However, with the increasing volume of digital sequence information (DNA/RNA/proteins/metabolites/macromolecules/associated traditional knowledge) being created and stored and the growing number of international participants, the equity of “openness” is being questioned by the Parties to the Convention on Biological Diversity. This year in Kunming, China, the 15th Conference of the Parties will re-negotiate the implementation of its Nagoya Protocol on access and benefit-sharing and finalize the Post-2020 Biodiversity Framework. This re-negotiation will profoundly impact the biodiversity research community and could pose challenges to large-scale transnational biodiversity and conservation genomics efforts, where digital sequence information (DSI) and DSI comparisons can be seen as the scientific currency required for impactful research toward effective conservation management.

The history of access and benefit-sharing as a legal requirement

Since the 1980s, the equitable conservation and sustainable use of biodiversity have been at the forefront of international discussions with the International Union for Conservation of Nature’s (IUCN) Commission on Environmental Law, the IUCN Environmental Law Centre, and the United Nations Environment Programme (UNEP). In 1988, UNEP tasked an Ad Hoc Committee of Experts on Biological Diversity to explore the need for an international convention on biological diversity. In 1989, an Ad Hoc Group of Technical and Legal Experts was charged with preparing this international legal instrument for the conservation, sustainable use, and fair and equitable benefit-sharing of biodiversity. In 1991, the first formal draft was prepared and considered by an Intergovernmental Negotiating Committee (INC), and the text was developed over four subsequent INC negotiating sessions. By 1992, just 27 of the 42 developed articles had been agreed upon.

Despite this, later that year, this international instrument, namely the Convention on Biological Diversity (CBD), was adopted at the Nairobi Conference, opened for signature at the Earth Summit in Rio de Janeiro, and entered into force in 1993 with 168 contracting Parties; it considers access and benefit-sharing (ABS) of genetic resources (GR) throughout. However, it has been reported that the pressure of time and the rushed adoption caused inconsistencies amongst articles and many drafting deficiencies. The CBD provided a legal framework for both providers and users of GR to enter into a bilateral ABS agreement encompassing prior informed consent (PIC) and mutually agreed terms (MAT). This agreement is intended to mitigate the misappropriation and misuse of GR. Misappropriation occurs when a GR is acquired in violation of domestic ABS legislation, and misuse occurs when a GR is utilized in violation of the established MAT. In 1993, an Intergovernmental Committee on the Convention on Biological Diversity (ICCBD) convened in Geneva to ensure the early operation of the Convention and to prepare for the first Conference of the Parties (COP). The COP would meet every two years and act as the convention’s governing body.

In the two years from adoption to the first COP, several challenges of implementing the CBD were acknowledged. First, the convention’s bilateral agreement, intended to facilitate fair and equitable collaborations, lacked legal clarity regarding the misuse and misappropriation of GR; this led users to shy away from building partnerships with providers when ABS was required, contributing to cases of jurisdiction shopping. This was not the only challenge to implementation of the CBD: because the convention was subject to domestic implementation, ABS lacked consistency amongst Parties, a fact illustrated by differing definitions of GR across Parties, the absence of a definition for traditional knowledge (TK), and the lack of indicators to differentiate commercial from non-commercial uses of GR.

Importantly, digital sequence information (DSI) and its relationship to GR was not clearly defined. The convention suffered at both the institutional and individual user levels due to the lack of capacity, support, and expertise required for implementation. It became clear that the CBD required a supplementary agreement to provide legal clarity to both providers and users on the fair and equitable sharing of the benefits arising from the utilization of genetic resources. This became item 6.6 on the agenda of the first COP (COP1), held in Nassau in 1994.

The Road to Nagoya

Following the initial discussion in Nassau, the topic of ABS was addressed at both COP2 in Jakarta and COP3 in Buenos Aires, but it was not until COP4 in Bratislava that the conversation accelerated, involving the public sector as well as Indigenous Peoples and Local Communities (IPLCs). In Nairobi at COP5, an open-ended Ad Hoc Working Group (AHWG) was tasked with the development of an initial set of guidelines, evolutionary in nature, that would provide a starting point. These became known as the Bonn Guidelines and were adopted at COP6. The work of the AHWG was not over; at COP7, the group was charged with the mission of constructing an international regime for ABS. Throughout this process, certificates of source, certificates of origin, and certificates of legal provenance were drafted, and language was developed through the Paris Annex and Montreal Annex, with the goal of having a draft recommendation by COP10. As with the rushed adoption of the CBD, the initial draft was adopted by the AHWG only six months prior to COP10 in Nagoya. Again, Parties were painfully aware of the shortcomings of the draft.

In 2011, 18 years after the adoption of the CBD, the Nagoya Protocol opened for signature, and after receiving the required 50 ratifications, it entered into force in 2014. Its adoption was largely due to its flexibility and to the nature of the COP, with its ability to correct inconsistencies present after adoption (though notably errors generally remain uncorrected). The protocol consists of 36 articles; its association with the CBD is articulated in Article 32, as is its relationship to other ABS-related international instruments such as the International Treaty on Plant Genetic Resources for Food and Agriculture (ITPGRFA), which entered into force in 2004; the International Convention for the Protection of New Varieties of Plants; the United Nations Convention on the Law of the Sea, which entered into force in 1994; the Antarctic Treaty System; the Agreement on Trade-Related Aspects of Intellectual Property Rights (TRIPS), administered by the World Trade Organization (WTO), which entered into force in 1995; and the World Intellectual Property Organization (WIPO) Convention, established in 1967. Again, a concrete definition of DSI and its relationship with ABS of GR was neglected in the Protocol.

Beyond Nagoya  

Alongside the Nagoya Protocol, the Strategic Plan for Biodiversity 2011-2020 was adopted. The rationale of the plan was based on conclusions from the third edition of the Global Biodiversity Outlook (2010), and it was thought of as an initial stepping stone toward the ultimate vision of “Living in harmony with nature” by 2050. Its mission was to ensure that by 2020, ecosystems were resilient to further biodiversity loss, would continue to provide necessary services, and would contribute to well-being and poverty eradication. The plan was broken down into 20 Aichi Biodiversity Targets organized into five strategic goals: mainstreaming biodiversity; reducing pressures on and promoting sustainable use of biodiversity; safeguarding ecosystems, species, and genetic diversity; enhancing benefit-sharing from biodiversity; and implementing biodiversity management. These targets provided a flexible framework to enable national implementation and international progress reporting. However, the 98 indicators for the targets were not adopted until years later, at COP13 in decision XIII/28, and their uptake by Parties was inconsistent, with national indicators being prioritized over international indicators and the number of indicators per target varying significantly. This strategic plan expired at the end of 2020 with not a single Aichi target achieved, including Target 16, which related to NP implementation and ABS. Ambiguous language, a lack of a clear baseline, an unrealistic number of targets, and a lack of quantifiable target indicators have been identified as barriers to the success of the Aichi targets. IPBES and the OECD have provided detailed reports on each Aichi target and its pitfalls and have recommended a smaller number of SMART (Specific, Measurable, Attainable, Relevant, Time-Bound) targets developed simultaneously with standardized, quantifiable indicators. During COP14, decision 34 called for a comprehensive and participatory process for proposing new targets for a Post-2020 Biodiversity Framework through the Subsidiary Body on Scientific, Technical and Technological Advice (SBSTTA), aligned with other international processes such as the 2030 Agenda for Sustainable Development and the Paris Agreement. These suggestions will be deliberated upon at COP15 in Kunming, China. Notably, two proposed targets directly apply to the equitable sharing of benefits arising from genetic resources, calling for “50% of all financial ABS benefits shared through use of genetic resources is directly deployed for biodiversity conservation” and for “Genetic diversity is maintained and its benefits are shared equitably”; it remains to be seen whether GR will formally include its associated DSI.

The Road to Kunming

Kunming is due to host the UN Biodiversity Conference, including COP15 and the 4th Meeting of the Parties to the Nagoya Protocol (COP/MOP4). Of particular interest, in terms of the equitable sharing of genetic resources and DSI, are agenda items 14, 15, and 16, which address COP14’s decisions on DSI, its associated TK, the domestic implementation of ABS, and transboundary ABS agreements (14/20, NP-3/12, NP-3/13, NP-3/14). In 2019, the CBD issued a call for information and views from stakeholders, as well as for research and studies associated with these topics, and an Ad Hoc Technical Expert Group (AHTEG) was commissioned and tasked with providing clarity on the definition of DSI and whether its associated TK is covered. A follow-up series of informal webinars was hosted to present the findings of these studies, with the outcome that TK did not fall within the definition of DSI.

Webinars were also hosted to summarize findings on the domestic implementation of ABS by CBD Parties. It was reported that just 16 signatory countries have domestic legislation regarding ABS, and in 2017, two Parties, Brazil and India, adopted ABS legislation covering DSI as a GR. Moreover, just 18 additional signatories plan to, or are in the process of, drafting such legislation. Another webinar was held to discuss five potential ABS transboundary policy archetypes, drafted from 13 recent peer-reviewed publications, and the criteria for policy assessment were subsequently presented. The road to COP15 remains unclear, but the consensus of the findings will be presented in Kunming.

Why does this matter?

One of the many issues highlighted by the ongoing SARS-CoV-2 pandemic is the inequity of the status quo. The initial flexibility that facilitated the NP’s adoption, owing to its ambiguous and inconsistent language, has inhibited its effective implementation, thus promoting inequitable practices. In 2015, 915 commercial biology-related patents were registered in the EU alone. Currently, the naked mole-rat has eight registered patents and one pending; these cases highlight the importance of equitable benefit-sharing of GR. According to Moore’s Law, processing power per unit cost doubles roughly every 18 months. Human genome sequencing, which cost ~$300 million in 2001, now costs close to just $1,000. DSI is being generated more quickly and for a fraction of the cost. If DSI is formally included in the definition of GR, most of the policy options suggested would be a drastic shift for the research community from the overwhelmingly adopted open-science data sharing model facilitated through Creative Commons licenses. For this to be a success, it must be feasible and align with current DSI use. Biodiversity researchers rely on making large numbers of DSI comparisons to facilitate accurate, scientifically sound outcomes. The policy must consider the changing nature of both the presentation and the volume of DSI as sequencing technology evolves.
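
A quick back-of-the-envelope comparison of the figures quoted above (treating the 18-month doubling time and the 2001 and present-day price points as given) shows how much faster sequencing costs have fallen than computing costs: twenty years contain roughly 13 doubling periods, so a Moore’s-Law decline alone would take $300 million down to about $300,000,000 / 2^13 ≈ $37,000, yet the actual cost is approaching $1,000, roughly 30 to 40 times lower still.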

A standardized, multilateral, economically feasible, less cumbersome policy that aligns with current researcher use of DSI could support researchers if the implementation is practical, clear, and straightforward. Moreover, an approach that does not consider associated TK would be an egregious failure and would exacerbate the failures of the CBD and NP to date. Alternatively, if an uneconomical, misaligned, unclear, bilateral policy is adopted, it not only risks stagnating collaboration and innovative research, with unexpectedly low tangible benefits being returned, but it also increases the likelihood that the failures of the previous CBD Strategic Plan will be repeated in the Post-2020 Biodiversity Framework.

Written by sciencepolicyforall

May 22, 2021 at 1:31 pm

Will Covid-19 infection rates remain low in the less developed countries in the coming days?

By Sharmina Deloer, PhD

Image by PDPics from Pixabay

In December 2019, the first case of SARS-CoV-2 infection was reported in the city of Wuhan, China. In the US, the first Covid-19 case was reported on January 21, 2020. In South Asia, India reported its first Covid-19 case on January 27, 2020. By March 2020, the virus had spread across the globe, resulting in more than 95,000 confirmed cases worldwide (Table 1), and had been declared a global pandemic. SARS-CoV-2, a highly contagious virus that spreads rapidly through contact or airborne droplets from infected persons, can have very different impacts on those infected, depending on their health condition.

One and a half years after the initial cases were reported, the list of the most affected countries, in terms of both infections and deaths, includes both developed and emerging economies. One might expect that the world’s poorest countries, with weaker health infrastructure as indicated by their Human Development Index (HDI), would be at the top of this list. However, the US, the largest economy in the world, tops the list with both the most infections (31,103,006) and the most deaths (559,010) from Covid-19 (Table 2, data reported as of April 17, 2021). In terms of infections, the US is followed by India and Brazil (two major emerging economies), although these two countries recorded their worst single-day case counts recently, just as the US, along with other developed countries, began to improve its situation. In terms of fatalities, the US is followed by Brazil, Mexico, India, the UK, and Italy. Among the ten countries with the highest infection and mortality rates, there are currently four emerging and six developed economies. Developed countries such as France, the UK, Germany, Italy, Spain, and the US have experienced a seesaw of infection and death surges but have never become completely successful at flattening the curve. A similar situation is seen in Brazil, Russia, and India, which are all members of BRIC (Brazil, Russia, India, and China), an informal grouping of countries whose economic development is comparable to that of newly advanced economies. However, the remaining member of BRIC, China, successfully contained the spread. In the meantime, Turkey, Iran, Iraq, Indonesia, Bangladesh, and Pakistan have also reported high numbers of Covid-19 cases. Overall, a trend has started to emerge: at first, the developed countries were more severely affected in both infection and mortality rates than developing and emerging economies, but gradually the conditions in the latter countries started to worsen, and some of them (India, Brazil, Turkey, Mexico, etc.) surpassed most of the developed countries. An equally important aspect of this trend is that most of the least developed nations of the world, irrespective of the way they are classified, remain the countries least affected by SARS-CoV-2.

Table 1: First reported case in six global regions as classified by WHO

Region | First reported case
Americas | US, January 2020
Europe | France, January 2020
South East Asia | Thailand, January 2020
Eastern Mediterranean | United Arab Emirates, January 2020
Africa | Egypt, February 2020
Western Pacific | Japan, January 2020

Now the question is why infection rates are lower in many developing countries than in developed countries, and why they are lowest in the least developed countries (LDCs). The difficulty in answering this question is compounded when we compare the rate of infections in these countries to their populations. Indeed, the LDCs, which account for 14% of the global population, account for only 2% of global Covid-19 cases. Among the ten most populous countries in the world (Table 2), all except the US, which has the highest infection rate, can be broadly classified as developing. Another important question is why the spread is comparatively low in developing countries where population densities are relatively high. Among the countries listed in Table 2, population density is highest in Bangladesh (1,095/km2), while it is lowest in Russia (8/km2). Bangladesh is followed by India (403/km2) and Pakistan (293/km2). Interestingly, while the infection rate is 3.29% in Russia, it is 0.43%, 1.06%, and 0.31% in Bangladesh, India, and Pakistan, respectively (as of April 17, 2021). Except for India, which is now going through the worst Covid-19 outbreak in the world, the densely populated countries listed in Table 2 are doing better at containing the spread. Thus the question is why the spread of the virus is low in a country like Bangladesh, where it is difficult to maintain social distancing or isolate infected persons. In Bangladesh, there are an average of 4.5 people living in every house. In Dhaka, the capital of Bangladesh, 1.1 million people live in slums and shanty towns, where multiple families share a single cleaning and cooking facility and 10-16 people share the same bathroom. Crowded outdoor gatherings for shopping, traveling, and religious festivals are also regularly reported.

Table 2: The ten most populous countries and their Covid-19 cases

Country | Total population | Population density (/km2) | Cases of infection | % of infection
China | 1,397,897,720 | 145.26 | 103,203 | 0.007
India | 1,339,330,514 | 403.4 | 14,291,917 | 1.067
United States | 330,425,184 | 33.83 | 31,103,006 | 9.413
Indonesia | 275,122,131 | 140.2 | 1,589,359 | 0.578
Pakistan | 238,181,034 | 293.31 | 739,818 | 0.311
Nigeria | 219,463,862 | 231.69 | 164,000 | 0.075
Brazil | 213,445,417 | 24.86 | 13,673,507 | 6.406
Bangladesh | 164,098,818 | 1,095.59 | 707,362 | 0.431
Russia | 142,320,790 | 8.29 | 4,684,148 | 3.291
Mexico | 130,207,371 | 65.49 | 2,291,246 | 1.760

Source: https://covid19.who.int ; https://www.census.gov/library/stories/2020/07/census-bureau-estimates-united-states-population-reached-330-million-today.html
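
To show how the “% of infection” column in Table 2 is derived (it is simply reported cases divided by total population, expressed as a percentage), here is a minimal Python sketch; the figures are copied from Table 2 above, and the choice of three countries and the rounding are illustrative only:

# Recompute the "% of infection" column of Table 2:
# percent infected = 100 * confirmed cases / total population
data = {
    # country: (total population, confirmed cases), taken from Table 2
    "United States": (330_425_184, 31_103_006),
    "India": (1_339_330_514, 14_291_917),
    "Bangladesh": (164_098_818, 707_362),
}

for country, (population, cases) in data.items():
    percent = 100 * cases / population
    print(f"{country}: {percent:.3f}% of the population infected")

# Expected output, matching Table 2: 9.413%, 1.067%, and 0.431%, respectively.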

A prompt response to the question posed above is that comprehensive testing and accurate reporting of Covid-19 cases are lacking in most of these countries, for various reasons. One reason is that Covid-19 testing is not free or uniformly available, as in Bangladesh, and it is expensive for lower-income people. Moreover, in Bangladesh, the decision to charge for Covid-19 tests discouraged people from getting tested, and the number of tests declined. In India, PCR (polymerase chain reaction)-based tests cost $13-40 depending on the state, while in Bangladesh they cost $41 in the private sector. As for the LDCs, a recent report from the UN highlighted a set of factors behind the lower numbers of reported cases in these countries. They include effective policy responses (such as swift lockdowns and school closures), an emphasis on prevention, targeted investment and rapid institutional reforms, a higher proportion of young people, lower testing rates, and low-quality data. However, it is difficult to say whether this set of factors applies to those developing countries where the spread of the virus is relatively low.

Overall, infection rates remain low in many developing and most low-income countries. However, as the global trend clearly shows, there is little scope for easing preventive measures without risking a resurgence of infections and deaths, and governments should keep such measures in place to put an end to this pandemic. Most countries that experienced a second or third wave found the latest surge to be deadlier than the previous ones. For example, in India, where the first wave reached its peak in September 2020 with about 93,000 daily Covid-19 cases, the second wave in recent months reached a global peak of 412,431 daily cases in May 2021 and was reported to be caused by the emergence of a new, deadlier variant, premature relaxation of previous restrictions, and failure to take other supportive measures. Moreover, large political meetings and religious festivals went ahead. Religion is intricately connected with human emotion and life in India, and the Kumbh festival, a holy ritual for Hindu believers, drew millions of devotees to the banks of the Ganges river this year (starting April 9, 2021). A BBC piece reported that more than 1,600 of these devotees tested positive for Covid-19 between April 10 and 14. That report also warned that more people might have transmitted the disease as they traveled home across the country. However, the containment of Covid-19 in China can be cited as a successful example of continued preventive measures, despite the fact that this is the country where Covid-19 originated. Some of this success in containment can be explained by the rigor of the measures taken by the government and by the fact that the major outbreak was limited to Wuhan (although later there were small outbreaks in other large cities). In Wuhan, the authorities instituted a 76-day full lockdown, tested 9 million people for SARS-CoV-2, separated Covid-19 patients, and sent critical ones to hospitals. International travel was also strictly monitored and restricted. Compared to countries like the US and the UK, where more relaxed and later lockdowns were imposed, the rigor of Chinese lockdowns can be seen in the decision to put almost 11 million people into lockdown in the city of Shijiazhuang.

However, such wide and strict lockdowns carry a huge economic cost for developing and low-income countries, where a large part of the population is still living in poverty and governments are less capable financially, institutionally, and logistically. Although these countries vary in their employment composition, some have larger informal economies than others. For example, rickshaw pullers in Dhaka, Bangladesh, number about 2,200,000 among all daily wage earners (the minimum wage in Bangladesh is 8,100 taka per month, or about $95.18). With reduced working hours or joblessness, this vulnerable group could be the hardest hit. Moreover, lockdowns can result in a reduced flow of foreign direct investment (FDI), which could lead to lower employment in the export-oriented sector, which in turn would not only hurt national income but also disproportionately affect the group of workers who are predominantly employed in this sector.

Thus, while it is clear that strict lockdowns effectively contain the spread, this measure can cause other short- and long-term impacts on various aspects of life. Moreover, in nations where infection has already spread widely, lockdown procedures are not as successful. In this situation, mass vaccination remains the only viable solution to flatten the pandemic curve without wreaking havoc on future development. However, rolling out vaccines to achieve herd immunity is not easy, and even developed countries are having mixed success in this regard. Even India, the largest producer of vaccines in the world, is struggling to vaccinate its own population, which is admittedly a mammoth task in a country of 1.3 billion people. Other related issues that complicate the rollout of vaccines include the protection of intellectual property rights, restrictions on the movement of raw materials and vaccines, the very small number of vaccine-producing countries, complicated vaccine movement agreements among countries, vaccine hesitancy, and the fact that only a handful of vaccines have been approved.

Thus, after more than a year of struggling with a global pandemic, the containment of spread remains a challenge for governments around the world. Right now, it seems that many LDCs and other developing countries are reporting less severe Covid-19 outbreaks than developed countries. But this may not remain the case. There are already concerns over the future of poor countries, particularly in the wake of the massive infection surge in India and over the issue of vaccine distribution. The most important short-term initiatives that can be applied or kept in place in developing countries are increasing screening and testing and isolating infected and vulnerable individuals until enough people receive a vaccination to reach herd immunity. Covid-19 testing and treatment should be free; though this is admittedly a herculean burden on governments, it will encourage people to get tested and seek treatment. Mass vaccination programs should also be implemented. An important strategy for the governments of developing countries could be to obtain vaccines from multiple sources. At the same time, these governments can also seek assistance, in the form of knowledge or personnel, from the experienced logistical systems of developed countries.

As for long-term policies, there is no doubt that setting up a preparedness infrastructure to fight global infectious threats should be the priority at national and international levels. The Global Health Security Index already cautioned in 2019 that the world was not ready to deal with a global pandemic, as most countries were not prepared. The report provided a range of recommendations, emphasizing building capacity within a coordinated global network that connects national and international bodies. However, any such coordination and cooperation will be futile for keeping the human and financial costs to a minimum unless they are honest, accountable, empathetic, and conscientious. At a national level, preparedness should begin with establishing a network of health research institutions and making it future-oriented, to project and prepare for fast-spreading infectious diseases. The importance of connected research agencies and organizations is evident from the astounding speed of developing and rolling out vaccines in such a short time in the US and UK, despite all their failings in containing the spread. In the US, the development of a vaccine started in early January 2020, even before the first fatality was officially reported. For developed countries, there should be better coordination of short- and long-term measures. For developing countries, the building of health service infrastructure is a priority. For both developed and developing countries, the efforts include providing medical services as well as producing vaccines, medicines, and other supporting materials.

Written by sciencepolicyforall

May 21, 2021 at 8:56 pm

Science Policy Around the Web May 20, 2021

By Ben Wolfson, PhD

Image by Jens Junge from Pixabay

Senate Weighs Investing $120 Billion in Science to Counter China

On Monday, the Senate voted 86 to 11 to advance the Endless Frontier Act and allow floor debate to begin. The act was introduced last April and is co-sponsored by Senators Chuck Schumer (D-NY) and Todd Young (R-IN). The bill would funnel $120 billion towards scientific research and innovation in new technologies such as robotics, artificial intelligence, and semiconductors. Of that, $10 billion would be used for the creation of 10 technology hubs across the United States, with the aim of connecting manufacturing and research universities. While the entire sum of funding was initially intended for National Science Foundation initiatives, including the creation of a technology directorate, an amendment introduced last week redirected approximately half towards labs run by the Energy Department, a change that is being contested by some Senators.

The goal of the Endless Frontier Act, aside from spurring research and innovation, is to compete directly with China in these realms. China’s research and development funding has been on a steady increase over the past several decades, and a recent 5-year plan announced in March aims to further this trend. China will increase research and development spending by 7% every year over the next five years, with the plan highlighting seven key new technologies: artificial intelligence, quantum information, brain science, semiconductors, genetic research and biotechnology, clinical medicine and health, and space, deep-sea, and polar exploration. China’s latest 5-year plan will also increase its self-reliance in some industrial applications such as semiconductors, something that the Endless Frontier Act seeks to mimic for American industry.
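
For scale, a simple compounding calculation (assuming, purely as an illustration, that the 7% increase applies to each successive year’s budget): 1.07^5 ≈ 1.40, so China’s annual R&D spending would be roughly 40% higher at the end of the five-year plan than at its start.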

The Endless Frontier Act also seeks to remedy supply chain issues made clear by the COVID-19 pandemic. The entanglement of the Chinese and American economies means each is progressively more reliant on the other, and the Act will attempt to bring some industries back to American soil.

While the Endless Frontier Act would represent one of the most significant investments in U.S. science and technology in decades, some believe it will not be as effective as desired in its current form. By focusing on key areas of research, it risks eclipsing critical new technologies that have not yet emerged. Furthermore, changes to the structure of the National Science Foundation may damage a highly effective agency.

It remains to be seen what form the Endless Frontier Act will take when (or if) it is made into law; however, with (currently) strong bipartisan support and President Biden indicating he supports the bill, some form of it is likely to be enacted in the coming months.

(Catie Edmondson, NYT)

Written by sciencepolicyforall

May 20, 2021 at 8:14 pm

Science Policy Around the Web May 18, 2021

By Maria Disotuar, PhD

Image by ParallelVision from Pixabay

China’s Mars Rover Mission Lands on the Red Planet

China’s space program may have started a new space race on Saturday morning when it landed the Zhurong rover on the surface of Mars. Although the China National Space Administration (CNSA) had said little about the mission, the news broke via social media and news outlets. China launched the orbiter spacecraft, Tianwen-1, in July 2020, during the period when the distance between Mars and Earth is shortest. The spacecraft began to orbit Mars in February and finally released Zhurong onto the Martian plain called Utopia Planitia. The rover is expected to spend 90 days on the planet and search for evidence of life.

Landing on Mars is not as easy as landing on the moon, due to the heat encountered upon entering the atmosphere. Landing safely requires heat shields, parachutes, and retro-rockets, all of which have to be used at the right time to avoid a crash landing. China has now become only the second nation in history to land a rover on Mars; NASA’s Spirit and Opportunity rovers landed successfully on Mars in 2004. This historic landing brings excitement to the global scientific community, particularly since China will openly share the data collected from the mission. The data may mark a new age for deep-space exploration.

Not surprisingly, China’s space program has big plans for the years to come. Next month, it plans to send three astronauts into space. In the future, it hopes to launch a Jupiter probe, collect samples from an asteroid, return samples from Mars in 2028, and launch spacecraft to explore the edge of the solar system. These future plans and Saturday’s successful landing make China one of the top nations to watch as the space race continues. While these events highlight exciting advances, they also highlight the growing concern over space debris. Earlier this spring, China received criticism from NASA Administrator Bill Nelson and the international community when Chinese rocket debris crashed into the Indian Ocean. Since the mid-1990s, experts have voiced concerns about the growing number of objects launched and left in space. As more nations join the space race, it will be vital for agencies and governments to implement policies and mitigation plans to reduce the amount of junk orbiting the Earth. The goal should be to reduce collisions, which could spark chain reactions leading to disastrous consequences for Earth and the environment.

(Steven Lee Myers and Kenneth Chang, NYT)

Written by sciencepolicyforall

May 18, 2021 at 2:40 pm

Where are we in addressing climate change?

By Surangi Perera, PhD

Image by Anja from Pixabay

The impacts of climate change were felt in every corner of the world in 2020. We observed these impacts closer to home, in the unprecedented wildfires in the western US and more frequent flooding in the central US, and farther away, in the record heat waves in Siberia and torrential rains in Africa. Climate change – the long-term shift in our planet's average weather patterns – can no longer be ignored.

Sadly, these climate impacts were not limited to last year; they have been increasing at an alarming rate over the last few decades. Since the early 20th century, Earth's climate has changed because of an increase in the level of heat-trapping greenhouse gases, such as carbon dioxide and methane, in the atmosphere. These greenhouse gases cause extra heat to be trapped in the atmosphere, raising Earth's average surface temperature. Atmospheric greenhouse gas levels are now higher than they have been in the last 3.6 million years, a rise driven by human activities that our modern civilization depends upon, such as fossil fuel burning. The human-induced increase in Earth's temperature is commonly referred to as global warming. If current levels of greenhouse gas emissions continue, we are on a pathway to a world that is 3 to 4 degrees Celsius warmer by the end of the century, resulting in climate extremes beyond what we have observed so far and rising sea levels, making some countries and regions uninhabitable.

To combat global warming, almost 200 countries signed the Paris Climate Agreement in 2015, committing to be held accountable and to take actions that limit warming. Its goal is to limit global warming to below 2 degrees Celsius compared to pre-industrial levels, while striving for the tougher target of 1.5 degrees Celsius. To avoid the worst impacts of climate change, countries aim to rapidly reduce greenhouse gas emissions to net-zero by 2050. Net-zero means achieving a balance between the greenhouse gases put into the atmosphere and those taken out: emissions are cut to a minimal level, and any residual greenhouse gases are removed from the atmosphere either naturally (for example, by planting new trees) or artificially (through new technologies). However, according to data from Climate Action Tracker, a project run by a group of three climate-research organizations, very few countries that signed the Paris Agreement appear to be making the efforts needed to meet their goals.

While a cohesive global effort may be lacking, a few countries are showing promise in achieving their climate goals. Gambia and Morocco are the only countries on track to meet the 1.5-degree emissions reduction target. Both countries' principal pathway to reduction is generating electricity from renewables. Costa Rica is on track to achieve the 2-degree target – it aims for electricity production to be 100 percent renewable by 2021 and to use renewable energy across roads and rails (in addition to implementing electric transportation over time). Like Costa Rica, India is on track to be 2 degrees Celsius compatible through its investments in renewable energy. However, there is always more that can be done, and a country like India, which has high carbon emissions (third highest in the world), could aim to do more to reach the 1.5 degrees Celsius goal. While the UK is not yet on track to achieve its climate goals, it is regarded as a leader in the developed world for decarbonizing its economy, which has been attributed in part to having not just a target but also a legislative framework for achieving it. The UK has also created an independent scientific commission to determine its goals and evaluate the country's progress, which appears to have had a positive impact. Norway is another country that deserves recognition for its efforts, mainly through its policies on low-carbon transportation.

In comparison, the countries with the highest carbon dioxide emissions (e.g., China, the US, and Russia) are lagging in reaching their climate goals under the Paris Agreement. The US is the second largest emitter of greenhouse gases, with some of the highest per-person emissions, and is currently categorized as making "critically insufficient" efforts to meet its climate goals, with a lack of actionable changes at the federal level in the last four years. However, there is hope for change with the shifting administration in the US this year. Newly elected President Biden rejoined the Paris Agreement (which his predecessor withdrew from during his tenure in office) as one of his first acts in the Oval Office, and last month he pledged to cut the nation's global warming emissions at least in half by 2030. While such pledges deserve recognition and are certainly achievable, it is important to critically address the challenges likely to be faced in reaching such goals. Biden has proposed a $2 trillion infrastructure plan that would steer the country's economy in a direction that helps achieve its climate goals by improving infrastructure for clean energy and transportation. Achieving such sweeping climate change reform to curb carbon emissions, of course, hinges on these reforms passing through Congress, not an easy task in the polarized US political climate. Furthermore, clean energy and transportation alone will not be enough to meet the long-term goal of reducing greenhouse gas emissions to net-zero by mid-century; other major sources of greenhouse gas pollution in the US, such as heavy industry, agriculture, land use, and international shipping, which together produce a bulk of global carbon and methane pollution, will also need to be restructured. However, change has to start somewhere, and an administration that is committed to reducing greenhouse gas emissions is certainly a step in the right direction.

Modern living is a major cause of global warming, and while we wait for green policies to pass, we too have a responsibility to act with urgency to reduce our carbon footprint. Modern conveniences and habits – electricity, transportation, air conditioning, and our diets – contribute to climate change, and lifestyle changes are necessary. While it will certainly be challenging and require sacrifice, we should consistently educate ourselves about how those lifestyle changes can be achieved. After all, a global increase in Earth's temperature will affect all of us. Even if some regions and countries will be affected more than others, all of us will see negative impacts on our health, agriculture and food security, and water supply.

Climate change is the greatest threat to humanity, and we all have a role to play in fixing it. To change the future, we must make change possible.

Written by sciencepolicyforall

May 14, 2021 at 10:20 am

The Growing Presence of Artificial Intelligence in Healthcare

By Adam Swiercz, PhD

Image by Brother from Flickr

In the summer of 1955, a computer scientist named John McCarthy submitted his proposal for the Dartmouth Summer Research Project on Artificial Intelligence (AI). The following year, a small group of mathematicians and scientists met for nearly two months in a brainstorming session that is now widely considered the founding event of the field. Over the past 65 years, AI has gone through periods of optimism and excitement interspersed with periods of frustration and pessimism. Now, however, we are in an era of great enthusiasm for AI. Today's society has everything AI needs to thrive: high-speed processors, cheap and seemingly limitless data storage, confident investors, and successful tech companies constantly developing more sophisticated algorithms.

Artificial Intelligence is the field of research devoted to creating computing machines with human-like intelligence. AI runs on algorithms that allow it to adapt to changing environments, construct analytical models, and continuously learn through multiple rounds of trial and error. Although the two are often used interchangeably, AI is not the same as machine learning, another technological buzzword that has recently become part of our vocabulary. Machine learning is a branch of AI that uses large data sets, rather than explicitly coded instructions, to "train" itself to accomplish a task. AI applications also utilize deep learning, a subtype of machine learning that uses multi-layered artificial neural networks to identify patterns in data. Inspired by the architecture of the human brain, where neurons are connected through multiple axonal junctions, artificial neural networks consist of interconnected groups of nodes. Deep learning algorithms can be largely autonomous. With very little guidance, they can identify complex patterns, continuously improving as they work their way through massive sets of raw data.
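
To make the idea of layered, interconnected nodes more concrete, the short sketch below trains a small neural network on made-up data using the scikit-learn library; the synthetic data, layer sizes, and labels are purely illustrative assumptions and are not drawn from any system described in this post.

```python
# A minimal sketch of a multi-layer neural network learning a pattern from
# examples rather than from explicitly coded rules. Everything here
# (the synthetic data, layer sizes, and labels) is illustrative only.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic "measurements": 1,000 examples with 4 numeric features each.
X = rng.normal(size=(1000, 4))
# The hidden pattern the network must discover: the label depends on a
# nonlinear combination of the features (no rule is ever given to the model).
y = ((X[:, 0] * X[:, 1] + X[:, 2] ** 2) > 0.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Two hidden layers of interconnected "nodes" (16 and then 8 of them).
model = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
model.fit(X_train, y_train)

print(f"Accuracy on held-out examples: {model.score(X_test, y_test):.2f}")
```

The point of the sketch is simply that the network infers the hidden pattern from examples, rather than from rules anyone wrote down.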

Many of us encounter AI-based technology on a daily basis through exposure to social networks, movie recommendations, and interactions with our smart speakers. While these technologies aim to make our everyday lives more convenient, AI's most important role in society may actually be in fields like healthcare, where machine learning and deep learning have become powerful tools that help analyze medical data, generate predictions, and assist in making clinical decisions. AI's growing presence in medicine warrants careful consideration of how it might impact our healthcare system. Healthcare providers and patients should be aware of what AI is and how it has the potential to significantly affect our health and well-being.

The most recent resurgence of interest in AI-based healthcare can be attributed in part to the COVID-19 pandemic. The HealthMap system at Boston Children's Hospital was able to sound the alarm about the novel coronavirus as early as December of 2019, and companies like BlueDot used AI models based on media, airline data, and livestock health reports to accurately predict where the worst COVID-19 outbreaks would occur. As hospitals became overwhelmed in the spring of 2020, the limitations of our healthcare system and the operational constraints of our hospitals were exposed. As a result, there was a rapid shift toward virtual health, spurred by necessity and regulatory flexibility. The pandemic strained health systems all over the world, leading researchers to turn to AI for tasks like tracking hospital capacity and sourcing personal protective equipment (PPE). Researchers were even able to develop a machine learning-based rapid screening test for COVID-19 using clinical information that is routinely collected when patients are admitted to the hospital (e.g., blood tests and vital signs).
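
As a rough sketch of how a screening model like that might be assembled, the example below trains a simple classifier on simulated admission data; the feature names, simulated values, and model choice are hypothetical assumptions for illustration and are not the published COVID-19 screening test.

```python
# Hypothetical sketch: train a screening classifier on routine admission data.
# All values (ages, vital signs, blood counts) are simulated, not real.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 500

# Simulated "routinely collected" admission values (names and numbers are made up).
data = pd.DataFrame({
    "age": rng.integers(18, 90, n),
    "temperature_c": rng.normal(37.2, 0.8, n),
    "oxygen_saturation": rng.normal(95, 3, n),
    "lymphocyte_count": rng.normal(1.5, 0.5, n),
})
# Simulated test result, loosely tied to temperature and oxygen saturation,
# purely so the example has a pattern to learn.
signal = 0.6 * (data["temperature_c"] - 37) - 0.3 * (data["oxygen_saturation"] - 95)
label = (signal + rng.normal(0, 0.5, n) > 0.2).astype(int)

X_train, X_test, y_train, y_test = train_test_split(data, label, random_state=0)

# Scale the features, then fit a simple logistic regression classifier.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)

# Evaluate on held-out "patients" using the area under the ROC curve.
probabilities = model.predict_proba(X_test)[:, 1]
print(f"Held-out ROC AUC: {roc_auc_score(y_test, probabilities):.2f}")
```

In practice, of course, such a model would be trained and validated on real patient records under appropriate oversight; the sketch only illustrates the general workflow.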

Computer-assisted decision support systems have been slowly increasing their footprint in healthcare since the 1970s. Generating diagnoses and interpreting electrocardiograms (ECGs) were initially made possible with rule-based approaches, which arrive at decisions using human-curated rule sets. However, because of their dependence on human-authored rules and updates, the performance and accuracy of such systems are limited. Still, the use of computer programs steadily rose as technology progressed, and by the early 1990s over 50 percent of the ECGs recorded each year in the United States were being interpreted with the aid of computers.
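
To illustrate what a "human-curated rule set" means in practice, here is a deliberately tiny, made-up example of a rule-based check; real clinical systems encode far more rules, authored and maintained by experts, which is exactly the dependence on human authorship described above.

```python
# Illustrative only: a tiny "human-curated rule set" for flagging a single
# measurement. The thresholds are hand-written by a person, not learned from data.
def classify_heart_rate(heart_rate_bpm: float) -> str:
    """Apply a few hand-written rules to a resting heart rate."""
    if heart_rate_bpm < 60:
        return "possible bradycardia - flag for review"
    if heart_rate_bpm > 100:
        return "possible tachycardia - flag for review"
    return "within normal limits"

print(classify_heart_rate(48))   # -> possible bradycardia - flag for review
print(classify_heart_rate(72))   # -> within normal limits
```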

Modern AI-based healthcare technology relies more heavily on machine learning and deep learning methods, which can identify patterns in large data sets and account for complex interactions with minimal human intervention. Recent studies have shown successful diagnosis of heart attacks and detection of arrhythmias with accuracy comparable to that of cardiologists. Current AI technology may also be particularly useful in the fields of radiology and neurology, which rely heavily on the examination of medical scans. Studies have shown high accuracy in analyzing computed tomography (CT) scans for lung nodules, brain scans for intracranial hemorrhages, and chest radiographs for pulmonary tuberculosis. AI also shows promise in fields like mental health, where it may help to objectively redefine mental illnesses, identify illness earlier while patients are more responsive to interventions, and personalize treatments based on the unique characteristics of each patient. A recent study found that AI can be used to automate screening for patients who need advanced care for depression. Machine learning can also be used to build predictive models, based on clinical trial data, that identify which patients are most likely to respond to specific antidepressants.

One sign that the future of healthcare will be dramatically influenced by AI is the large financial commitment that big tech companies and hospital systems have made in the last few years. Earlier this month, Microsoft announced that it will acquire Nuance Communications Inc., a provider of conversational artificial intelligence and cloud-based ambient clinical intelligence. This acquisition, on the heels of the Microsoft Cloud for Healthcare launch in November 2020, highlights the company's intent to become a fixture in the competitive industry of developing AI for healthcare providers. Google is also betting on the future of AI in healthcare. After some initial rough starts with now-defunct projects like Google Health and Google Flu Trends, the tech giant has been aggressively expanding its roster of healthcare initiatives. Through subsidiaries like Verily, DeepMind, and Calico, Google and its parent company Alphabet plan to apply their AI technology to everything from data generation and disease detection to lifestyle management and health insurance.

In addition to tech companies, healthcare providers themselves are actively seeking AI solutions to improve patient care. For instance, the Mayo Clinic recently launched the Remote Diagnostic and Management Platform (RDMP). RDMP promises to help clinicians make faster and more accurate diagnoses by augmenting clinical workflows with AI-driven decision support tools, diagnostic insights, and care recommendations. The launch of RDMP will be supported by Mayo Clinic's two new companies: Anumana Inc. and Lucem Health Inc. Anumana will use neural network algorithms in combination with Mayo's repository of medical data to enable early detection and treatment of heart disease. Lucem Health and the technology company Commure will provide an AI-enabled platform to collect and curate data from remote patient telemetry devices. According to a recent report from Deloitte, the Mayo Clinic is not alone: nearly 3 out of 4 healthcare organizations in the US plan to increase their AI funding.

Despite its potential, the growing presence of AI in healthcare presents some unique challenges to policymakers, and the rapid adoption of this technology raises a number of concerns. First, an AI system can only be as good as the data on which it is trained. Electronic health records (EHRs) and insurance data represent only individuals who have access to the healthcare system, and often do not contain information about social determinants of health. Incomplete and error-filled data sets can result in skewed interpretations and recommendations – errors that could have disastrous outcomes in the field of medicine. Furthermore, when data collection is not representative of a population, bias can be introduced into and propagated by the algorithm, potentially increasing health disparities. If DNA sequencing information, for instance, is based primarily on people of European descent, training AI on this data alone could lead to inaccurate generalizations in populations not represented in the data. Algorithms must be designed to minimize potential biases introduced by incomplete or non-representative data.
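
One simplified way to surface this kind of problem is to report a model's performance separately for each patient subgroup rather than as a single aggregate number. The sketch below assumes you already have predictions, true outcomes, and a (hypothetical) group label for each patient; the groups, labels, and numbers are made up for illustration.

```python
# Simplified illustration: break a model's accuracy out by subgroup instead of
# reporting one overall number. Large gaps between groups are a warning sign
# that the training data may not be representative.
from collections import defaultdict

def accuracy_by_group(y_true, y_pred, groups):
    """Report accuracy separately for each subgroup."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for truth, prediction, group in zip(y_true, y_pred, groups):
        total[group] += 1
        correct[group] += int(truth == prediction)
    return {group: correct[group] / total[group] for group in total}

# Toy inputs: outcomes, model predictions, and a made-up group label per patient.
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 0, 0, 1]
groups = ["A", "A", "B", "B", "A", "B", "A", "B"]

print(accuracy_by_group(y_true, y_pred, groups))
# -> {'A': 1.0, 'B': 0.25}; a gap this large suggests the model may be
#    underserving group B, for example because of unrepresentative training data.
```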

A second concern for medical-AI involves privacy.  The strict confidentiality and privacy laws protecting medical data present a challenge to successful integration of AI into medical systems.  Healthcare organizations need to determine how to access and share data with their AI tools legally, and with informed patient consent.  AI will have to be designed from the beginning to comply with the Health Insurance Portability and Accountability Act (HIPAA).  At the same time, healthcare organizations need to identify digital vulnerabilities and prevent the inadvertent sharing of private data, both by flawed algorithms and by cyberattacks. 

There is also the issue of accountability. When errors are made, who will be held responsible? Will it be the developers of the algorithm, the safety engineers who fail to properly vet the technology, or the physicians who do not detect and correct the errors? As deep learning algorithms become more advanced, it is likely that neither the patient nor the physician will fully comprehend how the technology arrives at certain conclusions. The term "black box" is used to describe these situations, in which opaque computational models are used to make healthcare decisions. While deep learning is a useful tool capable of creating models that outperform traditional methods used to predict things like disease risk, the inner workings of deep neural networks can involve thousands of nodes arranged in hundreds of interconnected layers. This complexity makes it difficult, or even impossible, to fully understand how outputs are generated. Even properly trained clinicians may be unable to identify when algorithms are incorrect. If human physicians do not have a complete understanding of, control over, and trust in an AI system's recommendations, they will not be able to explain to patients why those recommendations were followed. Dealing with these issues will require updated views on moral accountability that include AI technology.

To capitalize on the full potential of AI, government and industry will need to work together to design federal strategies supported by patients, clinicians, and developers.  To this end, the Consumer Technology Association (CTA) created a working group made up of more than 50 organizations with the goal of developing a standard of common language so that stakeholders can better understand AI technologies.  That standard was released last year and became the first ever ANSI-certified standard for the use of AI in healthcare.   Another working group convened by CTA released a standard to advance trust in AI solutions earlier this year.  These efforts demonstrate that tech companies and healthcare providers are aware of the critical need for transparency, common language, and trust as AI-based medicine becomes more mainstream.

In terms of federal recommendations, the U.S. Food and Drug Administration published a discussion paper titled "Proposed Regulatory Framework for Modifications to Artificial Intelligence/Machine Learning-Based Software as a Medical Device" in 2019. The agency has since released an action plan describing its approach to oversight of Artificial Intelligence/Machine Learning (AI/ML)-Based Software as a Medical Device (SaMD). In agreement with developing FDA and congressional guidance, a recent report from the National Academy of Medicine argues that a graduated approach should be taken toward the regulation of AI, based on factors like patient risk and AI autonomy. The Government Accountability Office also released a joint report in collaboration with the National Academy of Medicine describing the benefits and challenges of using AI technologies to augment patient care, and how AI may be used in health settings outside the hospital and clinic.

Industry, government, and health experts tend to agree that in the near future, AI is likely to have a significant impact on our healthcare system. As with any new technology, proper implementation will require a delicate balance of regulation that protects consumers while allowing industry enough leeway to grow and improve. Effective regulation of AI technology will need to take into consideration several variables, including accurate and representative collection of medical data, consent frameworks to guide data sharing, and liability. Policymakers and innovators should clearly define outcome measures and expectations, establish requirements for clinical validation, and provide suggestions for best practices as we integrate new AI technology into our healthcare system.

Science Policy Around the Web May 13, 2021

By Joshua Collins, PhD

Image by athree23 from Pixabay

Concrete steps to diversify the scientific workforce

A cadre of prominent scientists have called for the creation of a federal program aimed at diversifying the scientific workforce.  In a recent communication in the policy forum of Science, the authors use their collective voice to correctly assert the importance of diversity in science with the following rationale:  scientific progress is slowed when the full range of talent is unavailable, inclusivity broadens scientific curiosity and creativity, barriers to inclusion limit the potential impact of science on society, and diversity increases the strength of US science on the competitive global stage.   

Citing several programs that have increased the number of minority scientists pursuing successful careers, the authors identified three key features of these programs that will serve as the core of diversification initiatives. Each program's success was predicated on reducing any sense of isolation through the creation of community cohorts, strong institutional and individual commitments to mentoring, and full financial support during training.

With these principles in mind, the authors first recommend the creation of a coordinated federal program to establish a long-range national strategic plan for diversifying the scientific and engineering workforce, with support that begins in K-12 education and extends through professional training programs and employment.

Second, the authors recommend reshaping institutional policies so that the criteria for hiring and promoting all scientists include evidence of a commitment to diversity, equity, and inclusion.

Lastly, recognizing the long timeframe from inception to impact for the primary proposals, the authors suggest more immediate actions that the NIH can take without the need for Congressional action. In short, they suggest the NIH address financial barriers faced by minority scientists, in part through grant supplements paired with a student loan repayment program, reduce the funding gap for grants awarded to minority scientists, and expand funding for businesses that employ minority scientists.

(Shirley Tilghman, Bruce Alberts, Daniel Colón-Ramos, Kafui Dzirasa, Judith Kimble, Harold Varmus, Science)

Written by sciencepolicyforall

May 13, 2021 at 12:19 pm