Science Policy For All

Because science policy affects everyone.

Posts Tagged ‘climate change’

Plastics, Problems, and Progress

By: Jedidiah Acott, PhD

Image by Steve Buissinne from Pixabay 

Plastic is a staple of modern society, due in part to its malleable and durable properties, which make it useful in innumerable contexts. The first plastic, known as Parkesine, was created in 1862 by heating, molding, and cooling organic cellulose; Alexander Parkes found that after processing, Parkesine could maintain a rigid shape. Less than 50 years later, the commercially manufactured synthetic plastic Bakelite was introduced at a chemical conference by Leo Hendrik Baekeland. Interest in the material was immediate, and soon Bakelite was widely used in the public sphere. Throughout the 20th century, new synthetic polymers were invented and brought to the industrial forefront, eventually replacing these progenitors. Synthetic plastics now fill such a crucial function in modern society that pondering the hypothetical state of the world in their absence makes them seem like a cornerstone of necessity. Indeed, we have become so dependent on plastics that some deem the current era the plastic age of human history. What are the effects of our plastic addiction? This is the burning question that emerges from witnessing such dependence, and as current conditions show, where there is smoke there is fire.

It is not a stretch to say that plastics interact with almost every sphere of the global ecosystem, and much as mercury is inert inside a barometer yet toxic to human physiology, the consequences of using a material are determined by its context of use. The pure forms of synthetic plastics appear to be non-toxic, but additives that leach into the environment turn the polymer into a mysterious and possibly dangerous material. Plastics may take a thousand years or more to degrade, and as they do, they shed microplastics that marine organisms readily consume. On the ocean surface, microplastics smaller than 1 cm in diameter have been documented in heavy abundance, and researchers in Honolulu have observed these microplastics within the shortbill spearfish, a species native to the area. In larger marine life, plastic bags have been found obstructing the digestive tracts of beached whales and the stomachs of sea turtles. As a testament to the abundance of oceanic plastics, works of art composed entirely of plastic removed from the stomachs of seabirds hang on the walls of the National Oceanic and Atmospheric Administration in Honolulu, Hawaii. Another set of researchers predicts that by the year 2050, 99% of all seabird species will have ingested plastics. It is obvious enough that mechanical obstruction can cause issues, but what of the environmental and biological consequences of plastic consumption?

It is well known that the Amazon rainforest serves as a carbon sink for atmospheric carbon dioxide, but it is lesser known that sea ice serves a similar function for microplastics. One study shows that, as a plastic sink, sea ice traps microplastic particulates at concentrations several orders of magnitude above those of highly contaminated waters. Even if every bit of plastic floating in the ocean were removed, there would remain a frozen reservoir of plastic waiting to be re-released into the environment. On one front, increasing ocean acidity threatens the formation of calcium carbonate shells in growing organisms, while on another, plastic waste promotes the colonization of disease-associated pathogenic microbes that threaten coral reefs. In 2017, scientists studying coral reefs provided evidence that contact with plastics increases corals’ risk of disease from 4% to 89%. As one of the most diverse ecosystems on Earth, coral reefs harbor plants and animals that actively contribute to drug discovery and development for human ailments; molecules from these organisms have relevance for conditions ranging from cancer and arthritis to bacterial and viral infections. The present circumstances do not project a promising future for the world’s oceans. Ecosystem imbalance, plastic reservoirs, threats to marine life, and microplastics in ocean-derived resources are immediately visible consequences, but are there tangible causes for concern for the human species in particular?

The current literature on the effects of micro- and nanoplastics on human health is sparse, but absence of evidence is not evidence of absence. As new knowledge is created, present paradigms are revised, and a kind of hindsight bias may emerge that leaves future generations confounded by the current apathy. A recent study published in Canada measured the contents of a cup of liquid after a manufactured plastic tea bag had been steeped normally. The researchers found 11.6 billion microplastic and 3.1 billion nanoplastic particles in the beverage, several orders of magnitude above the plastic loads reported in other foods. In response to this report, the Tea and Herbal Association of Canada issued a statement that no evidence shows harm to human health from microplastics, and that polyethylene terephthalate (PET) and nylon have been deemed safe for use in tea bags for hot foods and beverages. Not so long ago, tobacco companies and health professionals claimed cigarettes were not harmful. Today, though, we have ample evidence of tobacco’s role in cancer, heart disease, impaired blood circulation, and addiction. It would be highly irresponsible, and quite the historical oversight, to hide behind a thin veil of ignorance as justification for allowing plastics to continue polluting our environment and our bodies. Allowing these conditions to set a precedent now, and only asking questions later, is actively participating in our own dissolution. Research has already revealed that plastic-derived BPA and DEHP are detrimental to human health, increasing the risk of breast and uterine cancer and interfering with testosterone levels and childhood development. More than enough evidence already points to the need to address the plastic crisis with urgency, and as we attempt damage control and eventual reparations, a multi-layered approach may now be the best option.

As with climate change, the current generation did not create the plastic crisis: we inherited it. But because we have also contributed to it, it is our burden to create meaningful solutions and demand institutional changes that prevent continued indifference toward, and destruction of, the world. Several possibilities, such as government policy, institutional accountability, mechanical recycling, clean-up groups, and enzyme-based depolymerization, are already being enacted. In India, 17 states have joined together to “ban the manufacture, use, storage, distribution, sale, import, and transportation of many plastic goods and materials.” Even as the change was being made, industrial plastic and clothing manufacturers filed a lawsuit challenging the ban, and on the basis of adverse effects to businesses, the Indian government gave the companies 3 months to dispose of banned items. A ban on imported recyclables was put into effect by the Chinese government in 2017, and in its wake many American counties have cancelled their recycling programs, leaving consumers to throw plastics in the trash, which may actually reduce ocean-bound plastics. Agilyx, a company in Tigard, Oregon, has taken a small-scale approach to the problem: by using chemical depolymerization to break the molecular bonds within plastic polymers, the Oregon company can turn plastic into reusable raw materials. Carbios, a plastic depolymerization startup in France, is using an enzyme specific for the synthetic PET molecule, which the CEO calls a “conceptually…infinite recycling process.” Some studies have shown that chemical recycling may even reduce greenhouse gas emissions, addressing two environmental issues with one method.

Human-created problems require human-created solutions. Although the plastic crisis is actively being worked on by engineers, scientists, companies, and governments around the world, international accountability may be a bottleneck impeding authentic solutions. Yet with the perseverance of the human spirit, we may still clear the streams of pollution toward an unimpeded flow of environmental conscientiousness, and re-forge the bottle into a favorable material for the future life of the planet.

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

November 23, 2019 at 1:42 pm

Science Policy Around the Web November 19th, 2019

By: Andrew Wright, BSc

Source: Pixabay

EPA’s ‘secret science’ plan is back, and critics say it’s worse

The Environmental Protection Agency (EPA) has been exploring new rules on the incorporation of scientific data into its rulemaking process. The so-called “secret science” rules were originally proposed in 2018 under the EPA’s previous administrator, Scott Pruitt, and have since been revised by its new administrator, Andrew Wheeler, in response to harsh criticism from scientific, environmental, and patient groups. Rather than addressing these criticisms to mollify the proposal’s detractors, the draft of the newly proposed rule, which was leaked to the New York Times, seems to drastically broaden the scope of data that cannot be used.

According to the 2018 proposed rule, all raw data would have to be made available for studies that assessed a “dose-response” relationship, a bedrock of toxicity research. This could be difficult, if not impossible, given patient privacy laws and proprietary information requirements that would prevent the dissemination of such data. In the new draft rule, this set of constraints is imposed on all scientific studies used to guide agency procedures, rather than just dose-response studies. The draft also seeks comment on whether these restrictions should be imposed retroactively. According to the draft rule, if the underlying data were not made available, the EPA would be able to “place less weight” on, or “entirely disregard,” those studies.
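For context, a dose-response study models how an observed effect changes with exposure level; a common functional form is the four-parameter logistic (Hill) curve. Below is a minimal illustrative Python sketch of that model; the parameter values are invented for demonstration and are not drawn from any EPA study or dataset.

import numpy as np

def hill(dose, bottom, top, ec50, slope):
    # Four-parameter logistic: response rises from `bottom` toward `top`,
    # passing the halfway point at dose = ec50 (all values hypothetical).
    return bottom + (top - bottom) / (1.0 + (ec50 / dose) ** slope)

doses = np.array([0.1, 1.0, 10.0, 100.0, 1000.0])  # hypothetical exposure levels
for d in doses:
    print(f"dose {d:7.1f} -> predicted response {hill(d, 0.0, 1.0, 10.0, 1.0):.3f}")

Verifying a fitted curve like this independently requires the raw measurements behind it, not just the fitted parameters, which is presumably why dose-response studies were singled out by the 2018 rule.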

While the draft does provide room for a tiered data-sharing approach, such as those implemented at the National Institutes of Health and the Food and Drug Administration, and allows political appointees to grant exemptions, critics worry that these new requirements will effectively remove science from the EPA’s decision-making process. Thus far, the EPA’s scientific advisory board has not been afforded the opportunity to weigh in.

(David Malakoff, Science)

‘Insect apocalypse’ poses risk to all life on Earth, conservationists warn

A recent study looking at insect populations in the UK suggests that up to half of all insects have been lost since 1970 and that 40% of all known insect species are facing extinction. Due to the complexity of ecological systems that rely on insect biodiversity to function properly, this level of insect loss could lead to “catastrophic collapse” on a global scale. 

This study demonstrates a severity of insect decline similar to what has been seen in other regions around the world. In Puerto Rico, for example, insect biomass has declined 10- to 60-fold, destroying the rainforest’s food web. In Germany, 75% of flying insects have vanished in the past 27 years.

Solutions to address what is now considered Earth’s sixth mass extinction event are becoming increasingly complex, as the failing components of anthropogenic damage to the global ecosystem are beginning to interact. However, conservationists suggest that insect numbers could recover rapidly through a combination of pesticide reduction and land management.

(Damian Carrington, The Guardian)

Written by sciencepolicyforall

November 19, 2019 at 11:59 am

Science Policy Around the Web October 11th, 2019

By: Ben Wolfson, PhD

Image by Thomas B. from Pixabay 

Massive California power outage triggers chaos in science labs

On Wednesday and Thursday of this week, upwards of 600,000 California residents lost power when Pacific Gas & Electric (PG&E), the state’s largest utility company, instituted rolling blackouts. Due to high winds, PG&E worried that keeping the power on could result in sparking and an increased risk of wildfires.

PG&E has been found liable for approximately two dozen wildfires, including the deadly 2018 Camp Fire, and filed for bankruptcy in January of 2019 due to the lawsuits it faced. This week’s rolling blackouts were instituted in an attempt to prevent further wildfires. State Senator Jerry Hill (D-CA) stated that the decision to subject such large numbers of people to blackouts demonstrated the serious risk of fire, but also showed that PG&E has so far failed to improve the safety of its power system.

In addition to affecting residential customers, the rolling blackouts have also thrown scientists and research labs into disarray as they struggle to protect valuable reagents and samples. Many labs have limited or no access to backup power, meaning refrigerated or frozen items are at risk of being lost as they warm. In addition, tissue culture requires a stable environment maintained by a powered incubator, and laboratory animals need filtration and temperature control systems that may shut off during a power loss.

While California has always had a high risk of wildfires, the warming climate has increased the likelihood and frequency of deadly fires. California’s annual burned area has increased 5-fold since 1972, and seven of the ten most destructive fires have occurred in the last decade.

(Jeff Tollefson, Nature)

Written by sciencepolicyforall

October 11, 2019 at 3:47 pm

Science Policy Around the Web September 27th, 2019

By: Andrew Wright, BSc

Image by Herm from Pixabay 

Extreme sea level events ‘will hit once a year by 2050’

According to the Intergovernmental Panel on Climate Change (IPCC) Special Report on the Ocean and Cryosphere in a Changing Climate, released on September 25th, the effects of climate change on the world’s oceans and ice formations are so severe that they are partially irreversible even with steep cuts to emissions by 2050. While such cuts would forestall far more drastic and damaging changes in the latter half of the 21st century and beyond, increased temperatures, acidification, oxygen decline, marine heatwaves, and a weakening of critical ocean currents (which affect weather patterns, among other ecological systems) are all but certain. This means extreme sea level events that are “historically rare”, occurring roughly once per century, will instead occur every year, especially in tropical regions. Further, sea level rise is projected to continue beyond 2100 even under optimal emission reduction scenarios, with an estimated 1 meter of ocean rise in the best-case scenario and multi-meter rise in the worst case. To compare, since 1993 the ocean has risen by about 8 cm, less than 10% of the projected rise, and flooding in the United States has already increased by over 200%.

Almost 2 billion people live on the coast, and sea level rise will therefore cost several trillion dollars a year in damages and displace millions of people. These damages will come alongside collapsing coastal ecosystems that supply 10% of the world’s population with their livelihoods and 4.3 billion people with a significant part of their food. The magnitude of these effects was revised upward in this most recent report to account for accelerating ice melt from Greenland and Antarctica, which is now surpassing thermal expansion as the primary contributor to ocean rise.

(Damian Carrington, The Guardian)

Grad student unions dealt blow as proposed new rule says students aren’t ‘employees’

The question of whether graduate students at private universities are considered employees has been revisited several times by the National Labor Relations Board (NLRB) since its original 2000 decision allowing graduate students to form a union at New York University. As NLRB members are politically appointed, decisions about the validity of university graduate student unions have vacillated with the priorities of successive presidential administrations. In 2004, the NLRB under the George W. Bush administration ruled that graduate students are not employees, while in 2016, under the Barack Obama administration, the NLRB overturned that decision and allowed graduate students at Columbia University to form a union if they were compensated for teaching.

The most recent proposed rule, under the Donald Trump administration, counters previous guidance by stating that graduate students are not “employees” regardless of the compensatory mechanism, and therefore cannot form a union. In this case, the NLRB is addressing the graduate student employee issue comprehensively, through the official rulemaking process, rather than deciding the issue case by case as it has previously.

This new rule, which has a 60-day public comment period, will affect the private universities where students have decided to unionize, as at least 12 schools have done so far. What is less clear is how it will affect ongoing negotiations between graduate student unions and universities as those unions face delegitimization.

Interestingly, part of the reason the NLRB is using the rulemaking process is the withdrawal of petitions by students at the University of Chicago, Yale University, Boston College, and the University of Pennsylvania, without which the NLRB was unable to rule on their individual cases. However, according to William Herbert, executive director of the National Center for the Study of Collective Bargaining in Higher Education and the Professions, Congress has the authority to stipulate who is and is not an employee under U.S. labor law, which will most likely open the proposed rule to litigation.

(Katie Langin, Science)

Written by sciencepolicyforall

September 27, 2019 at 10:55 am

Science Policy Around the Web August 30th, 2019

By: Andrew Wright, BSc

Image by Steve Buissinne from Pixabay

EPA’s controversial ‘secret science’ plan still lacks key details, advisers say

In early 2018, under its previous administrator Scott Pruitt, the U.S. Environmental Protection Agency (EPA) first proposed rules to restrict the use of scientific findings whose data and methodologies are not public or cannot be replicated. Following the removal in late 2017 of all sitting Science Advisory Board (SAB) members who receive EPA grants (roughly half of the board), there was concern that environmental experts were being sidelined from EPA decision-making, a concern the proposed rule seemed to confirm. While making data public and replicable has merits, the SAB has countered that the proposed rule would make it impossible to use the most accurate information, as many environmental studies are long-term studies that assess human exposure to toxins and cannot be ethically or efficiently replicated. Now, under administrator Andrew Wheeler, how this proposed rule will be implemented is still unclear.

A central concern is how to maintain privacy over personally identifiable information (PII) in compliance with existing privacy laws, such as the Health Insurance Portability and Accountability Act (HIPAA). One proffered strategy is a tiered approach based on the model currently used by the National Institutes of Health, whereby the more sensitive the PII, the more restricted its access.

As the SAB has decided to engage in a consultation on the proposed rule, individual members will have their comments written up in a report to be sent to Wheeler, but they will not have to reach a consensus for the proposed rule to move forward.

(Sean Reilly, Science (reprinted from E&E News))

Brazilian Amazon deforestation surges to break August records

While the recent spate of fires in the Amazon rainforest has been capturing international attention, regular deforestation via cutting and clearing has also been rapidly increasing. In August alone, 430 square miles, a region the size of Hong Kong, were cut down. This comes after July’s loss of 870 square miles, a 275% jump from the previous year. At the current rate of deforestation, Brazil is on track to lose more than 3,800 square miles of rainforest this year, an area roughly one and a half times the size of Delaware.

“The August data from Deter is hardly surprising,” said Claudio Angelo of Climate Observatory, referencing the Deter-B satellite that was put into place in 2015 to monitor Brazil’s rainforests. According to him and other representatives from non-governmental organizations, the Bolsonaro government is delivering on its promises to support local industries such as mining, ranching, farming, and logging rather than enforcing environmental protections. 

While this deforestation data is separate from data on forest fires, felled trees are often left to dry before being set aflame, leading forest engineers to predict that the fires will worsen in the coming months.

Since the Amazon rainforest generates its own weather patterns, studies have demonstrated that once roughly 40% deforestation is reached, the biome may irreversibly convert to savannah. This could impact global weather patterns, affecting Brazilian weather most severely. However, recent estimates place that tipping point closer to 20-25% due to the synergistic effects of climate change. According to the World Wildlife Fund, approximately 17% of the rainforest has been lost in the past 50 years, putting uncontrollable forest conversion much closer than previously assumed.

(Jonathan Watts, The Guardian)

Written by sciencepolicyforall

August 30, 2019 at 11:08 am

Science Policy Around the Web August 16th, 2019

By: Neetu M. Gulati, PhD

Image by vegasita from Pixabay 

How Eating Less Meat Could Help Protect the Planet from Climate Change

A recent report by the United Nations climate science body, the Intergovernmental Panel on Climate Change (IPCC), warns that now is a moment of reckoning for how humans use the planet. The report highlights how the planet has been impacted by land-use practices, deforestation, agriculture, and other activities, which threaten our ability to limit the global temperature increase as outlined by the 2015 Paris climate agreement. The report further outlines how humans can help stop the impacts of climate change by drastically changing what food we eat and how it is produced.

Explaining this logic, Debra Roberts, co-chair of IPCC Working Group II, commented, “some dietary choices require more land and water, and cause more emissions of heat-trapping gases than others.” If people eat more sustainably grown and produced foods, as well as more plant-based diets, this could provide opportunities to adapt to and mitigate potential climate issues. Meats like beef and lamb are particularly taxing on the environment for the amount of meat obtained, partially because such livestock require large spaces to graze. Reducing the amount of land used to produce meat, and using that land more efficiently through sustainable farming practices, will be imperative to ensure that land remains usable as the planet warms.

While much of the world already eats a majority plant-based diet, the countries that eat the most meat tend to be wealthier ones. As countries with lower meat consumption gain wealth, there is a risk that they will eat more meat and put a greater strain on the environment. While not every country will stop eating meat, the recent popularity of meatless products is encouraging, and hopefully the public will begin to recognize that food and agriculture are central to the fight against climate change.

(Abigail Abrams, Time)

“Qutrit” Experiments are a First in Quantum Teleportation

Many believe that quantum information science is a key avenue of research for future technologies. Now, for the first time, researchers have used this technology to teleport a qutrit, a three-level unit of quantum information. This is an important advance for the field of quantum teleportation, previously limited to the quantum equivalent of binary bits of information, known as qubits. The two research teams that independently achieved this feat first had to create qutrits from photons, a challenge in and of itself. Because qutrits can carry more information and are more resistant to noise than qubits, these experiments may mean that qutrits become an important part of future quantum networks.
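To make the capacity claim concrete, here is a minimal numpy sketch, not drawn from either team’s work: a qubit is a normalized vector in a 2-dimensional complex space and a qutrit in a 3-dimensional one, so each qutrit can in principle carry log2(3) ≈ 1.58 bits instead of 1.

import numpy as np

rng = np.random.default_rng(0)

def random_pure_state(dim):
    # Any normalized complex vector of length `dim` is a valid pure state.
    amplitudes = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return amplitudes / np.linalg.norm(amplitudes)

qubit = random_pure_state(2)   # superposition of |0> and |1>
qutrit = random_pure_state(3)  # superposition of |0>, |1>, and |2>

# Classical information per carrier scales as log2(d) for a d-level system.
print(f"qubit capacity:  {np.log2(2):.2f} bits")  # 1.00
print(f"qutrit capacity: {np.log2(3):.2f} bits")  # 1.58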

In quantum science, the states of entangled particles are connected. Thus, in quantum teleportation, the state of one entangled particle, for example the spin of an electron, influences the second particle instantaneously, even across great distances. While this sounds like something out of a science-fiction story, the milestone may have important real-world implications. Quantum teleportation may be important for secure communications in the future; in fact, much of quantum teleportation research is funded because of its importance for the future of cybersecurity.

The qutrit teleportation experiments were performed independently by two research teams. One team, led by Guang-Can Guo at the University of Science and Technology of China (USTC), reported its results in a preprint paper in April 2019. The other team, co-led by Anton Zeilinger of the Austrian Academy of Sciences and Jian-Wei Pan at USTC, reported its findings in a preprint paper in June 2019 that has been accepted for publication in Physical Review Letters. The two teams agree that each has successfully teleported a qutrit, and both plan to go beyond qutrits, to at least ququarts (four-level systems). Other researchers are less convinced, saying the methods used by the two teams are slow and inefficient, and therefore not suited to practical purposes. In response, Chao-Yang Lu, one of the authors of the paper by Zeilinger and Pan’s team, said, “science is step by step. First, you make the impossible thing possible. Then you work to make it more perfect.”

(Daniel Garisto, Scientific American)

Written by sciencepolicyforall

August 16, 2019 at 3:15 pm

Science Policy Around the Web August 1st, 2019

By: Andrew Wright, BSc

Image by Steve Buissinne from Pixabay 

Major U.S. cities are leaking methane at twice the rate previously believed

While natural gas emits less carbon dioxide (CO2) than other fossil fuels when burned, gas that escapes unburned into the atmosphere as methane (CH4) acts as a greenhouse gas 20-80 times more potent than CO2. Some of this impact is supposed to be mitigated by the relatively low amount of leaked methane, roughly 370,000 tons across six major urban areas studied, according to a 2016 report from the EPA. However, a new study in the journal Geophysical Research Letters analyzed those same metropolitan centers and found that the EPA has underestimated methane release by more than half. By taking simultaneous measurements of ethane, which appears only in the natural gas supplied to homes and businesses, researchers were able to delineate the sources of leakage, as natural sources and landfills do not give off ethane.
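The tracer logic is simple enough to sketch in a few lines of Python. The numbers below are hypothetical, chosen only for illustration: the method relies on pipeline gas carrying a known ethane-to-methane ratio while biogenic sources carry essentially none.

# Illustrative sketch of the ethane-tracer method; all values are hypothetical.
measured_ch4 = 100.0          # total excess methane observed (arbitrary units)
measured_c2h6 = 3.4           # excess ethane observed alongside it
pipeline_c2h6_per_ch4 = 0.04  # assumed ethane:methane ratio of the local gas supply

# Every unit of observed ethane must have arrived with pipeline methane,
# so the ethane measurement pins down the natural-gas contribution.
natural_gas_ch4 = measured_c2h6 / pipeline_c2h6_per_ch4
biogenic_ch4 = measured_ch4 - natural_gas_ch4

print(f"natural-gas share: {natural_gas_ch4 / measured_ch4:.0%}")  # 85%
print(f"biogenic share:    {biogenic_ch4 / measured_ch4:.0%}")     # 15%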

From their analysis, the total estimate for the six cities studied was 890,000 tons of CH4, 84% of which came from natural gas leaks. While the authors of the study are unsure why the EPA estimates are so low, they suggest it could be because the EPA only estimates leaks in the distribution system, rather than endpoint leaks in homes and businesses. While these results cannot be reliably extrapolated to newer cities, which may have infrastructure more resilient to leakage, they could engender further study to build a clearer picture of national methane release.

(Sid Perkins, Science)

Japan approves first human-animal embryo experiments

On March 1st, the Japanese science ministry lifted a ban on growing human cells in animal embryos and transferring those embryos to animal uteri. While human-animal hybrid embryos have been made before, functional offspring have not been allowed to develop. The first researcher to take advantage of the new regulatory scheme is Hiromitsu Nakauchi, director of the Center for Stem Cell Biology and Regenerative Medicine at the Institute of Medical Science at the University of Tokyo and a faculty member at Stanford University. His long-term goal is to grow human organs in animals such as pigs, from which functional organs could be extracted and transplanted into human patients. His intent is to start with an early embryonic mouse model, then a rat model, and finally a pig model with embryos that develop for up to 70 days.

This measured approach is in stark contrast to the recent controversy over CRISPR-edited babies in China, but it has still been met with a certain level of ethical skepticism. Bioethicists are particularly concerned that the human cells injected into animal embryos, induced pluripotent stem (iPS) cells, may deviate from their intended target (in this case the pancreas) and affect the host animal’s cognition. According to Nakauchi, the experimental design, which involves eliminating the gene for the target organ and injecting human iPS cells to compensate, ensures that the cells should only contribute to a specific part of the animal.

While Nakauchi’s group used this method to successfully grow a pancreas in a rat from mouse cells, they have had limited luck putting human iPS cells into sheep embryos. Given the evolutionary distance between mice, rats, pigs, and humans, it may be difficult for experimenters to produce more satisfactory results. To address this, Nakauchi has suggested that he will try genetic editing techniques as well as various developmental stages of iPS cells.

(David Cyranoski, Nature)

Written by sciencepolicyforall

August 1, 2019 at 12:23 pm