Science Policy For All

Because science policy affects everyone.

Posts Tagged ‘Global Warming’

Science Policy Around the Web August 16th, 2019

leave a comment »

By Neetu M. Gulati PhD

Image by vegasita from Pixabay 

How Eating Less Meat Could Help Protect the Planet from Climate Change

A recent report by the United Nations climate science body, the Intergovernmental Panel on Climate Change (IPCC), warns that now is a moment of reckoning for how humans use the planet. The report highlights how the planet has been impacted by land-use practices, deforestation, agriculture, and other activities, which threaten our ability to limit the global temperature increase as outlined in the 2015 Paris climate agreement. The report further outlines how humans can help stop the impacts of climate change by drastically changing what food we eat as well as how it is produced.

Explaining this logic, Debra Roberts, the co-chair of the IPCC Working Group II, commented, “some dietary choices require more land and water, and cause more emissions of heat-trapping gases than others.” If people eat more sustainably grown and produced foods, and adopt more plant-based diets, this could provide opportunities to adapt to and mitigate potential climate issues. Meats like beef and lamb are particularly taxing on the environment for the amount of meat obtained, partly because such livestock require large areas to graze. Reducing the amount of land used to produce meat, and using that land more efficiently through sustainable farming practices, will be imperative to ensure that land remains usable as the planet warms.

While much of the world already eats mostly plant-based diets, the countries that eat the most meat tend to be wealthier ones. As countries with lower meat consumption gain wealth, there is a risk that they will eat more meat and put greater strain on the environment. While not every country will stop eating meat, the recent popularity of meatless products is encouraging, and hopefully the public will begin to recognize that food and agriculture are important in the fight against climate change.

(Abigail Abrams, Time)

“Qutrit” Experiments are a First in Quantum Teleportation

Many believe that quantum information science is a key avenue of research for future technologies. Now, for the first time, researchers have teleported a qutrit, a three-level unit of quantum information. This is an important advance for the field of quantum teleportation, previously limited to the quantum equivalent of binary bits of information, known as qubits. The two research teams that independently achieved this feat first had to create qutrits from photons, a challenge in and of itself. Because qutrits can carry more information and are more resistant to noise than qubits, these experiments may mean that qutrits become an important part of future quantum networks.
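The capacity advantage is easy to quantify: when measured, a d-level quantum unit can carry log2(d) classical bits, so a qutrit holds roughly 1.58 bits versus a qubit's 1. A quick illustration (plain arithmetic, not a quantum simulation):

```python
import math

# Information capacity of a d-level quantum unit: log2(d) bits.
# A qutrit (d=3) carries ~58% more information per unit than a
# qubit (d=2); a ququart (d=4) carries exactly 2 bits.
def capacity_bits(levels: int) -> float:
    return math.log2(levels)

for name, d in [("qubit", 2), ("qutrit", 3), ("ququart", 4)]:
    print(f"{name} ({d} levels): {capacity_bits(d):.3f} bits")
```

This is one reason higher-dimensional units are attractive for quantum networks: more information moves per teleported particle.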

In quantum science, the states of entangled particles have a connection. Thus, in quantum teleportation, the state of one entangled particle, for example the spin of an electron, influences the second particle instantaneously, even when the two are far apart. While this sounds like something out of a science-fiction story, this milestone may have important real-world implications: quantum teleportation may be important for secure communications in the future. In fact, much of quantum teleportation research is funded because of its importance for the future of cybersecurity.

The qutrit teleportation experiments were independently performed by two research teams. One team, led by Guang-Can Guo at the University of Science and Technology of China (USTC), reported their results in a preprint paper in April 2019. The other team, co-led by Anton Zeilinger of the Austrian Academy of Sciences and Jian-Wei Pan at USTC, reported their findings in a preprint paper in June 2019 that has been accepted for publication in Physical Review Letters. The two teams agree that each has successfully teleported a qutrit, and both have plans to go beyond qutrits, to at least ququarts (four-level systems). Other researchers are less convinced, saying the methods used by the two teams are slow and inefficient, and therefore not suited for practical purposes. In response, one of the authors of the paper by Zeilinger and Pan’s team, Chao-Yang Lu, said, “science is step by step. First, you make the impossible thing possible. Then you work to make it more perfect.”

(Daniel Garisto, Scientific American)

 


Written by sciencepolicyforall

August 16, 2019 at 3:15 pm

Homegrown Apocalypse: A Guide to the Holocene Extinction


By: Andrew Wright BSc


One of the unifying factors of mass extinctions is a rapid change in global average temperature. The end-Ordovician extinction, the second largest, occurred when newly forming mountains made of silicate rock quickly absorbed atmospheric CO2. The global average temperature plunged, leading to the formation of enormous glaciers, drastically lower ocean levels, and much colder waters. Since complex life was still relegated to the oceans, this killed 86% of all species. The most well-known extinction is the end-Cretaceous or K-Pg event caused in part by a massive asteroid impact in Chicxulub, Mexico. The immediate impact, roughly one billion times stronger than the atomic bombings of Japan, was devastating in its own right. However, the subsequent ejection of sulfate-bearing rock into the atmosphere was the real killer, dropping global temperatures by 2-7°C, inhibiting photosynthesis, and acidifying the oceans. Coming right after a period of global warming, this extinction killed about 76% of all species.

These extinctions pale in comparison to the end-Permian extinction, also known as the Great Dying. When Pangea was the sole continent, an enormous pool of lava called a flood-basalt plain slowly erupted over what is modern-day Siberia. Over 350,000 years, magmatic rock up to a mile thick solidified and covered an area roughly half the size of the United States. This igneous cap forced underground lava to move sideways and spread in paths called sills. As the lava traveled, it vaporized increasing amounts of carbonates and oil and coal deposits, leading to an immense build-up of CO2. Once the sills reached the edge of the cap, these gases were violently expelled, ejecting up to 100,000 gigatons of CO2. The immediate effect was a global average temperature increase of roughly 5°C. Subsequently, oceanic methane hydrate (or methane clathrate) crystals, which become unstable at high temperatures, broke down. Since methane is 20-80 times more potent than CO2 as a greenhouse gas, global average temperature increased a further 10°C, bringing the total to 15°C. This left the planet barren, desertified most of Pangea, strongly acidified the oceans, and killed 96% of marine species and 90% of all life on Earth.

We are currently living through the beginnings of the sixth mass extinction event, known as the Holocene extinction. Species are dying off 10-100 times faster than they should, and that rate is accelerating. Insects, including pollinators, are dying off so quickly that 40% of them may disappear within decades. One in eight birds is threatened with extinction, 40% of amphibians are in steep decline, and marine biodiversity is falling as well. At current rates, half of all species on Earth could be wiped out by the end of the century.

What is the commonality between our present circumstances and the past? As with previous mass extinctions, global average temperature has increased. Since 1880, global average temperature has risen by 0.8°C, and the rate of warming has doubled since 1975. This June was the hottest month ever recorded on Earth, with global average temperature reaching 2°C above pre-industrial levels. Greenland lost two billion tons of ice in one day. This increase in temperature is because we are currently adding 37.1 gigatons of CO2 per year to the atmosphere, and that number is rising.

From the most recent Intergovernmental Panel on Climate Change (IPCC) report, we know that the best outcome is to keep the increase in global average temperature below 1.5°C. Instead, let us consider what would happen if current trends continue and CO2 emissions keep increasing at similar rates until 2100. This is known as the RCP 8.5 scenario. Under this paradigm, atmospheric CO2 levels will rise from 410 parts per million (ppm) to 936 ppm, and the global average temperature will increase by 6°C from pre-industrial levels. That puts the Earth squarely within the temperature range of previous mass extinction periods.

Given this level of warming, the following can be expected to occur. First and foremost, the extreme heat on the planet will massively decrease glaciation, causing a surge in ocean levels. Since water expands as it warms, ocean levels will rise even further, to about 12 ft above current levels. This means most coastal areas will perpetually flood while others will be completely underwater. Unfortunately, non-coastal areas won’t be free from hardship, as high air temperatures will cause desertification, crop die-off, drought, and widespread wildfires. Second, as the ocean absorbs CO2 from the atmosphere, it will become increasingly acidic. So far, the pH of the ocean has only dropped by 0.1, but under an RCP 8.5 scenario, that decrease could be as large as 0.48 pH units. Since pH is a logarithmic scale, this means the oceans would be acidic enough to break down the calcium carbonate out of which shellfish and corals are built. Warmer water also cannot hold oxygen as effectively as cold water, meaning many water-breathing species will suffocate. In combination, these two factors will serve to eliminate a huge source of the human food supply. Finally, since weather patterns are based on ocean and air currents, and increasing temperatures can destabilize them, massive hurricanes, dangerously cold weather systems, and flood-inducing rainfall will become the norm.
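The logarithmic scale is what makes a seemingly small pH change so alarming: pH is the negative base-10 logarithm of hydrogen-ion concentration, so acidity multiplies by ten raised to the size of the drop. A quick back-of-the-envelope check:

```python
# pH is -log10 of hydrogen-ion concentration, so a decrease of x pH
# units multiplies acidity by 10**x.
def acidity_factor(ph_drop: float) -> float:
    return 10 ** ph_drop

print(f"0.10 pH drop (observed so far): {acidity_factor(0.10):.2f}x more acidic")
print(f"0.48 pH drop (RCP 8.5):         {acidity_factor(0.48):.2f}x more acidic")
```

In other words, the projected 0.48-unit drop corresponds to roughly a threefold increase in ocean acidity, versus the 26% increase seen so far.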

One parallel to the end-Permian extinction might result as well. Over millions of years, methane clathrate re-stabilized in the permafrost of Siberia and in the deep ocean floor. But in what has been termed the clathrate gun hypothesis, if methane clathrate destabilizes again at high temperatures, the resultant methane emissions and planetary warming could form a positive-feedback loop, releasing ever more crystallized methane until we end up in another “great dying”. While short-term warming probably won’t cause a runaway temperature increase, a 6°C increase in global average temperature might. New research suggests methane release may not even be necessary, as the ocean is reaching a critical point in the carbon cycle where it could rapidly expel an amount of CO2 on par with flood-basalt events. Moreover, like the end-Permian extinction, anthropogenic climate change is occurring on a near-instantaneous geological time scale, and species, including our own, will not have the requisite time to adapt.

Of course, none of these effects exists in a vacuum. They will occur alongside increasing deforestation for agriculture, plastic and chemical pollution, and resource extraction. The end result would be a planet with less space, little food, mass migration, and devastating weather. So, what can be done to stop this scenario from coming true? The latest IPCC report essentially places humanity at an inflection point. Either CO2 output is cut in half by 2030 and humans become carbon neutral by 2050, or the planet is irrevocably thrust past the point of no return.

This timeframe may seem short, but it takes into account that even if civilization were to completely stop emitting greenhouse gases today, it would take hundreds of years for global average temperature to go back down, since it takes time for the ocean to absorb CO2 from the atmosphere. Like any problem of scale, there is no one solution to reaching carbon neutrality, and it will take a multifaceted approach. Some solutions include enacting carbon tax measures, subsidizing and implementing renewable energy (while divesting from new coal and oil production), an increased reliance on nuclear power, large-scale reforestation, livestock reduction, and carbon-sequestration technology. Some of these efforts have come a long way, and some have gone in the wrong direction.

This is, of course, a global problem to be solved. At a time when the United States has signaled its intention to withdraw from the Paris Climate Accord as soon as possible and states are rejecting carbon cap-and-trade measures, other nations are moving ahead with unprecedented boosts in renewable energy and bold commitments to reducing greenhouse gas emissions. India, the third-largest polluter after China and the United States, is on track to surpass its Paris Accord commitments. Should the United States re-engage with and lead the international effort to tackle what is an existential threat, then it is not improbable that the end of this century could be a pleasant one. So, if the idea of living through a global extinction event is disconcerting, one can be assured that the problem is still just barely a solvable one.

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

July 11, 2019 at 4:24 pm

Science Policy Around the Web – December 21, 2018


By: Mohor Sengupta, Ph.D.


Medical Detectives: The Last Hope For Families Coping With Rare Diseases

Rare diseases affect far fewer people than common diseases, and consequently many are difficult to diagnose or have not yet been identified at all. Current approaches seek to identify rare diseases by examining mutations one gene at a time, choosing the gene by roughly informed guesses based on the symptoms. This method may soon be a thing of the past, thanks to the Undiagnosed Diseases Network (UDN). The UDN is an ongoing, open-ended research study: it collects genomic data from people with rare diseases and identifies the culprit mutation(s). The findings are cataloged, and doctors encountering novel symptoms in a patient can search the UDN database for the disease that best matches those symptoms, giving them a possible starting point. In this respect, the sole purpose of the UDN is to find solutions for rare medical challenges that doctors cannot solve on their own.

The UDN is made up of three components: a coordinating center based at the Department of Biomedical Informatics at Harvard Medical School, twelve clinical sites in the USA (including the NIH Undiagnosed Diseases Program in Bethesda), and core facilities. The UDN is backed by the National Institutes of Health Common Fund, which seeks to provide answers for patients and families affected by mysterious conditions.

There are more than 6,000 known rare diseases, each of which, by definition, affects fewer than 200,000 Americans at a given time. Eighty percent of all known rare diseases are genetic in origin, and half of all rare diseases affect children. Symptoms of a single rare disease may vary from patient to patient, and the disease itself is often masked by common symptoms, which greatly confounds diagnosis. For many rare diseases the underlying cause is unknown, and information on disease progression is limited. Without the correct intervention, patients and their families experience a decline in quality of life over time.

Research on rare diseases needs to be collaborative across nations, and a global network of physicians and researchers is needed to facilitate knowledge sharing about these diseases. A comprehensive approach to understanding rare diseases should be built around virtual knowledge bases like the UDN database and Orphanet. Happily, progress is being made in this direction. Several countries have appropriate policies in place, and there are organizations that serve as the voice of patients with rare diseases, such as EURORDIS in the EU, the National Organization for Rare Disorders in the USA, and the Organization for Rare Diseases in India. It is imperative that these cohorts have greater interaction and knowledge sharing with one another.

Finally, public awareness is crucial. The patient community plays a central role in raising awareness: it forms the voice of the rare disease community and the starting point for the development of policies. Rare Disease Day was created by EURORDIS in 2008, and the last day of February (February 29th, in leap years, being the rarest date) was chosen to mark our fight against rare diseases and our support for those living with them. On its tenth anniversary, 94 countries and regions from every corner of the globe commemorated the day, and 2018 saw the addition of five more countries to the group.

As a person afflicted with a rare disease myself, I would say: we may be few, but with your support, we have the best shot at it!

(Lesley McClurg, NPR)

 

Will We Survive Climate Change?

 

The holidays are upon us, and many will head out in different directions outside the city. Writing the last blog post of this year, I thought we could ponder some rather worrying issues and offer solace to one another.

It’s December, and today is the winter solstice. The day with the longest night arrived amidst torrential rains. Each year we are seeing more storms than the last. Hurricanes like Florence and Mangkhut have rocked the world with damage and destruction. The temperature on Earth is already 1 degree Celsius above pre-industrial levels. A record number of people experienced extreme heat waves in 2018. Countries like Canada and Japan had much warmer summers than they are used to, and July 2018 ranked as one of the hottest months in Europe. Changing wind patterns and a drier climate have ravaged the state of California with wildfires: as of December 6, a total of 8,434 fires had burned 1,890,438 acres (765,033 ha), the largest burned acreage ever recorded in a fire season, according to the California Department of Forestry and Fire Protection and the National Interagency Fire Center. From June through mid-July, severe downpours in southwestern Japan caused devastating floods and mudflows, killing nearly 300 people. A month later, the southern Indian state of Kerala was hit by an unusual monsoon, causing the worst flood in nearly a century in a state with traditionally high rainfall. It left nearly 500 people dead.

The Paris Climate Accord has set a goal of keeping global temperatures from rising more than 2 degrees above pre-industrial levels. At 2 degrees above, things are this bleak:

Arctic sea ice is ten times more likely to vanish in the summers.

Most of the world’s coral reefs are to disappear.

37 percent of all people on Earth are to experience extreme heat waves.

411 million people are to experience severe urban drought.

80 million people will be threatened by rising sea levels.

However, at 0.5 degrees Celsius lower, many of these situations look slightly less devastating. Arctic sea ice is more likely to survive the summers. Coral reefs will not be wiped out completely. 14 percent of people will be exposed to extreme heat waves, and 20 million to urban drought.

Seems slightly better? Yet no industrialized nation is expected to meet the 2-degree goal, let alone the 1.5-degree mark, given current fossil fuel consumption. The effects of today’s atmospheric carbon dioxide will be felt for generations to come.

Enough of grim talk. As stated in John Schwartz’s article, “there is no scientific support for inevitable doom”.

Reducing greenhouse gas emissions could address the most troubling issues of global warming. Many countries are making efforts to rely on renewable, cleaner energy sources like solar. There are increased efforts to use public transportation in some countries. Electric cars are trending. The world is changing, only not as fast as we want it to.

Let us resolve to consciously cut down on fossil fuel consumption in 2019. No single measure is the perfect solution to this self-created menace, and not everyone will be touched by the problem in the same way. But collective awareness and effort can go a long way, for everyone and for the Earth.

(John Schwartz, New York Times)

 

 

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

December 21, 2018 at 11:20 am

Science Policy Around the Web – September 19, 2017


By: Nivedita Sengupta, PhD


source: pixabay

Global Warming

The world may still hit its 1.5 °C global warming mark, if modelers have anything to say about it

Nature Geoscience published a paper on September 18th delivering what could be good news regarding climate change and global warming. The team of climate scientists behind the paper claims it is possible to limit global warming to 1.5 °C above pre-industrial levels, as mentioned in the 2015 Paris climate agreement. The paper argues that the global climate models used in a 2013 report from the Intergovernmental Panel on Climate Change (IPCC) overestimated the extent of warming that has already occurred. By adjusting this parameter in their model-based calculations, the authors concluded that the amount of carbon humanity can emit from 2015 onward while keeping temperatures from rising above 1.5 °C is almost three times greater than the previous IPCC estimate. The conclusions have significant implications for global policymakers and suggest that the IPCC’s carbon budget could be met with modest strengthening of the current Paris pledges up to 2030, followed by sharp cuts in carbon emissions thereafter.
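The arithmetic behind a carbon budget can be sketched with the standard roughly linear relationship between cumulative CO2 emissions and warming (the "transient climate response to cumulative emissions", or TCRE). The numbers below are illustrative assumptions, not the paper's actual calculation, but they show why a lower estimate of warming-to-date inflates the remaining budget:

```python
# Sketch of carbon-budget arithmetic via TCRE (warming per unit of
# cumulative emissions). All numbers here are illustrative assumptions.
TCRE = 0.45 / 1000.0  # assumed: degrees C of warming per GtCO2 emitted

def remaining_budget(target_c: float, warming_so_far_c: float) -> float:
    """GtCO2 that can still be emitted before exceeding the target."""
    return (target_c - warming_so_far_c) / TCRE

# If models imply 1.1 C of warming has "already happened" but observations
# suggest only 0.9 C, the budget for a 1.5 C target grows substantially.
for observed in (1.1, 0.9):
    print(f"warming so far {observed} C -> "
          f"budget {remaining_budget(1.5, observed):.0f} GtCO2")
```

The real analysis is far more involved, but the direction of the effect is the same: shave a couple of tenths of a degree off the warming already attributed to humans, and the allowable future emissions grow by hundreds of gigatons.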

Some scientists are already challenging the paper’s conclusions, questioning the analysis’s reliance on a period of slower warming, the so-called climate hiatus, which began around the start of this millennium and continued until 2014. They say that natural variability in the climate system produced lower temperatures during that period, so any calculation of the human contribution to warming based on it can be artificially low. Moreover, the oceans and the land were probably absorbing more carbon than normal during this period, and natural processes may return some of that carbon to the atmosphere. Taking these factors into account reduces the predicted amount of carbon that can be released while keeping atmospheric temperatures under the 1.5 °C limit. But the authors of the paper argue that the climate hiatus should not significantly affect their conclusions. They believe the multiple methodologies used to estimate the actual warming due to greenhouse gases should keep the calculations accurate irrespective of any short-term climate variability.

Nonetheless, humanity’s rapid ascent toward the global warming threshold is muddled by modelling scenarios, framing 1.5 °C as a very small target for policymakers and scientists to try to hit by 2030. The fine details of carbon emissions matter when scientists are looking for the precise effects of the different greenhouse gases on global warming. But even if the paper proves accurate in its predictions, huge efforts to curb greenhouse-gas emissions will still be necessary to limit warming. As one of the paper’s authors, Dr. Millar, says, “We’re showing that it’s still possible. But the real question is whether we can create the policy action that would actually be required to realize these scenarios.”

(Jeff Tollefson, Nature News)

Debatable Technology

What’s in a face…prediction technology

A paper by genome-sequencing pioneer Craig Venter, published on September 5th, has drawn heavy criticism and stirred fears about genetic privacy. The paper, published in the Proceedings of the National Academy of Sciences (PNAS), claims to predict people’s physical traits from their DNA. Dr. Venter and his colleagues sequenced the whole genomes of 1,061 people of varying ages and ethnic backgrounds at Human Longevity, Inc. (HLI). Artificial intelligence was applied to analyze each participant’s genetic data in combination with high-quality 3D photographs of the participant’s face. This analysis revealed single-nucleotide changes in the genetic code between participants that corresponded with facial features such as cheekbone height, as well as other factors like height, weight, age, vocal characteristics, and skin color. Using this approach, they could correctly pick an individual out of a group of ten people randomly selected from HLI’s database 74% of the time. This technology could be tremendously powerful for any agency handling human genome data: simply removing personal identifying information, as is routinely done in practice, would not eliminate the possibility that individuals could still be identified from the data itself.

However, reviewers of the paper say the claims are vastly overstated and that the ability to identify a person from their genes is hugely overblown. According to the skeptics, knowing age, sex, and race alone can eliminate most of the individuals in a randomly selected group of ten people from a data set as small and diverse as HLI’s. Computational biologist Yaniv Erlich of Columbia University in New York City provided evidence for this by looking at the age, sex, and ethnicity data from HLI’s paper. According to his calculations, knowing only those three traits was sufficient to identify an individual out of a group of ten people in the HLI data set 75% of the time, without any information on the genome. He concluded that the paper does not demonstrate that individuals can be identified by their DNA, as it claims. HLI countered that it used multiple parameters to identify someone, of which a person’s face is just one.
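Erlich's point, that a few demographic traits act as powerful quasi-identifiers, can be illustrated with a toy simulation. The trait distributions below are invented for illustration and are not HLI's data; the point is only that coarse traits alone often single a person out of a group of ten:

```python
import random

random.seed(0)  # reproducible toy example; distributions are invented

def random_person():
    """A person described only by three coarse demographic traits."""
    return (random.randint(2, 8),   # age decade (20s through 80s)
            random.choice("MF"),    # sex
            random.randrange(5))    # one of five ethnicity groups

def fraction_uniquely_identified(group_size=10, trials=10_000):
    hits = 0
    for _ in range(trials):
        target = random_person()
        others = [random_person() for _ in range(group_size - 1)]
        if target not in others:    # nobody else shares all three traits
            hits += 1
    return hits / trials

print(f"singled out of a group of 10: {fraction_uniquely_identified():.0%}")
```

With these made-up distributions the three traits alone single out the target most of the time, echoing Erlich's observation that demographic data can dominate re-identification even before any genomic information is considered.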

The review process the paper underwent is not standard for most journals. By submitting to PNAS as a member of the US National Academies of Sciences, Engineering, and Medicine, Venter was allowed to hand-select the three reviewers who would evaluate his paper. While the issues surrounding the paper are being hotly debated by members of the scientific community, some fear Venter’s stature will give the paper undue weight with policymakers, who may become overly concerned about DNA privacy, thereby affecting rule- and regulation-making processes.

(Sara Reardon, Nature News)

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

September 21, 2017 at 5:45 pm

Science Policy Around the Web – June 06, 2017


By: Kseniya Golovnina, PhD

Source: Flickr, by USDA, via Creative Commons (CC BY 2.0)

Food Security

What if Food Crops Failed at the Same Time?

While one group of people is fighting climate change and another considers it “mythical”, researchers specializing in the study of social-ecological systems are developing food supply risk assessment models. Food crops are among the most important resources for human existence, and less than one-fourth of the planet (the “breadbaskets”) produces three-fourths of the staple crops that feed the world’s population. Climate change could cause crop losses in most of these breadbaskets.

Two important factors included in the models are shocks to major land crop production and to the economy. Shocks like the droughts and heat waves in Ukraine and Russia in 2007 and 2009 almost wiped out wheat crops and caused global wheat prices to spike. Meanwhile, demand assessments project that food production may have to double by 2050 to feed a growing population. Together, these potential environmental and economic stresses are making the world food production system less resilient, and will affect both rich and poor nations. To measure the fragility of the system, researchers developed scenarios of small shocks (10 percent crop loss) and large shocks (50 percent crop loss). These were then applied to corn, wheat, or rice output using an integrated assessment model, the Global Change Assessment Model, developed by the U.S. Department of Energy.

Among the critical findings is that “breadbasket” regions respond to shocks in different ways. For example, South Asia, where most of the arable land is already in use, is quite unresponsive to shocks occurring elsewhere in the world, because the total amount of land in agricultural production cannot be changed significantly. In Brazil the situation is the opposite: it has a lot of potential to bring new land into production if large shocks occur. However, clearing Brazil’s forests requires significant effort and would add substantially to global climate change. Within the research agenda of the Pardee Center, these risks and preventive actions are discussed in more detail. The warning is clear: humankind needs to be aware of and prepared for the potential failure of multiple “breadbaskets” if we want to reduce the potential for catastrophe. (Anthony Janetos, The Conversation)

Reproducibility in Science

Research Transparency: Open Science

Increasing amounts of scientific data, the complexity of experiments, and the hidden or proprietary nature of data have given rise to the “reproducibility crisis” in science. Reproducibility studies in cancer biology have revealed that 40% or fewer of peer-reviewed analyses are replicable. Another large-scale project attempting to replicate 100 recent psychology studies succeeded in replicating fewer than half of the original results.

These findings are driving scientists to look for ways to increase study reliability and to make research practices more efficient and available for evaluation. A philosophy of open science, in which scientists share their primary materials and data, makes analytical approaches more transparent and allows common research practices and standards to emerge more quickly. For scientific journals and associations, open science methods enable the creation of new ways to store and utilize data. Some journals are specifically dedicated to publishing data sets for reuse (Scientific Data, Journal of Open Psychology Data); others require or reward open science practices like publicly posting materials and data.

The widespread use of online repositories to share study materials and data helps to store large data sets and physical materials, mitigating some of the problems of reproducibility. However, open science practice is still very much in development and faces some significant disincentives. Habits and reward structures are two major forces working against it. Researchers are used to being closed, hiding their data for fear it will be stolen. Journal editors tend to favor publishing papers that tell a tidy story with perfectly clear results. This leads researchers to omit “failed” studies that don’t clearly support their theories.

While efforts to overcome these obstacles are difficult, development of fully transparent science should be encouraged, as openness helps improve understanding, and acknowledges the truth that real data are often messy. (Elizabeth Gilbert and Katie Corker, The Conversation)

 

Have an interesting science policy link?  Share it in the comments!

Written by sciencepolicyforall

June 6, 2017 at 9:00 am

How GMOs Could Help with Sustainable Food Production


By: Agnes Donko, PhD

World Population estimates from 1800 to 2100

The world population has exceeded 7.5 billion, and by 2050 it is expected to reach 9.7 billion. The challenge of feeding this ever-growing population is exacerbated by global warming, which may lead to more frequent droughts and the melting of Arctic sea ice and Greenland ice. The year 2016 was the warmest ever recorded, with the average temperature 1.1 °C above the pre-industrial period and 0.06 °C above the previous record set in 2015. According to the United Nations, the world faces its largest humanitarian crisis since the organization’s founding in 1945, particularly in Yemen, South Sudan, Somalia and Nigeria. In these countries, 20 million people face starvation and famine this year because of drought and regional political instability.

How could genetically modified organisms (GMO) help?

The two main GMO strategies are herbicide-tolerant (HT) and insect-resistant crops. HT crops were developed to survive application of specific herbicides (such as glyphosate) that would otherwise destroy the crop along with the targeted weeds. Insect-resistant crops contain a gene from the soil bacterium Bacillus thuringiensis (Bt) encoding a protein that is toxic to specific insects, protecting the plant. Insect-resistant crops reduce the ecological footprint of cultivation in two ways: by cutting insecticide use, which in turn reduces the environmental impact of insecticide production, and by cutting fuel use and carbon dioxide (greenhouse gas) emissions through fewer spraying rounds and reduced tillage. Thus, adoption of GM technology by African nations and other populous countries like India could support sustainable agriculture and help ameliorate the burden of a changing climate and growing populations.

In developed nations, especially the US, GM technology has been widely used since the mid-1990s, mainly in four crops: canola, maize, cotton and soybean. GM varieties accounted for 93 percent of cotton, 94 percent of soybean and 92 percent of corn acreage in the US in 2016. Although the emergence of glyphosate-resistant weeds has increased herbicide usage, in 2015 the global insecticide savings from insect-resistant maize and cotton were 7.8 million kg (an 84% decrease) and 19.3 million kg (a 53% decrease), respectively, compared with the usage expected with conventional crops. Globally, the reduced spraying and tillage associated with GM crops also saved more than 2.8 billion kg of carbon dioxide, equivalent to taking 1.25 million cars off the road for one year.
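As a back-of-envelope check on the cars-off-the-road equivalence, dividing the total carbon savings (taken here as 2.8 billion kg) by the number of cars gives the per-car annual emission the comparison assumes, which lands in the plausible ~2 tonnes-per-year range for a passenger car:

```python
# Sanity check of the carbon-savings equivalence cited above.
co2_saved_kg = 2.8e9    # total CO2 savings in kg (2.8 billion kg)
cars_removed = 1.25e6   # stated equivalent: 1.25 million cars for one year

# Implied annual emission per car
kg_per_car = co2_saved_kg / cars_removed
print(f"{kg_per_car:,.0f} kg of CO2 per car per year")  # 2,240 kg
```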

Another way in which GM crops can help sustainable food production is by reducing food wastage in developed nations. The Food and Agriculture Organization of the United Nations (FAO) estimates that one-third of all food produced for human consumption (around 1.3 billion tons) is lost or wasted each year, including 45% of all fruits. When an apple is bruised, for example, an enzyme called polyphenol oxidase oxidizes the fruit's polyphenols, turning its flesh brown. Nobody wants to buy brown apples, so bruised fruit is simply trashed. In Arctic apples, the level of the enzyme is reduced by gene silencing, preventing browning. The Arctic Apple obtained USDA approval in 2015 and is expected to reach the market in 2017.

In 2015, the FDA approved the first genetically modified animal for human consumption, an Atlantic salmon called AquAdvantage. Conventional salmon farming has terrible effects on the environment. AquAdvantage, however, contains a growth-hormone-regulating transgene that accelerates growth, cutting farming time from three years to 16-18 months. This would dramatically reduce the ecological footprint of fish farming, leading to more sustainable food production. Even though the FDA found no difference in nutritional profile between AquAdvantage and its natural counterpart, AquAdvantage will not hit the U.S. market any time soon: the FDA has banned import and sale until guidelines on how the product must be labelled are published.

This FDA action was initiated by bill S. 764, signed by former president Barack Obama in 2016. Bill S. 764 requires food companies to disclose GMOs without necessarily using a GMO text label on packaging. They may choose to label GM ingredients with a symbol or a QR code (quick response code) that, when scanned by a smartphone, leads the consumer to a website with more information on the product. But this requires the consumer to have both a smartphone and internet access. The bill has also been criticized for its "lax standards and broad definition." For instance, if the majority of a product is meat but a less significant ingredient is produced from GM crops, the product need not be labelled. Oil extracted from GM soybean, or starch purified from GM corn, is exempt from labeling because it is merely derived from GM sources and no longer contains any genetic material. By contrast, European Union (EU) regulations require that the phrase "genetically modified" or "produced from genetically modified [name of the organism]" appear clearly next to the ingredient list; if the food is not packaged, the same phrase must be on or next to the food display. The EU also sets an explicit threshold (0.9%) below which trace GMO content in conventional food or feed is exempt from labelling.
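The contrast between the two regimes can be sketched as simple predicates. This is a deliberately simplified illustration of the rules described above, not legal guidance, and the function names are invented for the example:

```python
EU_GM_THRESHOLD = 0.009  # EU labelling threshold: 0.9% GM content

def eu_label_required(gm_fraction: float) -> bool:
    """EU rule (simplified): label any food at or above 0.9% GM content."""
    return gm_fraction >= EU_GM_THRESHOLD

def us_disclosure_required(contains_gm_material: bool, majority_meat: bool) -> bool:
    """S. 764 rule (simplified): exempt if the product is majority meat, or if
    the GM-derived ingredient no longer contains genetic material
    (e.g. refined soybean oil or purified corn starch)."""
    return contains_gm_material and not majority_meat

# A 2% GM ingredient must be labelled in the EU...
print(eu_label_required(0.02))               # True
# ...but refined oil from GM soybeans is exempt under S. 764
print(us_disclosure_required(False, False))  # False
```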

Despite its controversial labeling guidelines, bill S. 764 could end the long-fought battle of the Just Label It campaign. The bill is a major step toward the right to know, letting individuals decide whether they want to consume GM foods. GMOs can significantly support sustainable food production and reduce humanity's destructive environmental impact, but only if we let them.


Written by sciencepolicyforall

May 12, 2017 at 5:13 pm

Science Policy Around the Web – May 9, 2017


By: Emily Petrus, PhD

By Robert A. Rohde (Own work) [CC-BY-SA-3.0], via Wikimedia Commons

Environment

Please Pass the Crickets!

Most people know that eating beef is bad for the environment. A new study from the University of Edinburgh and Scotland's Rural College quantifies the impact human carnivores could have by switching half of our current meat intake to insects such as crickets and mealworms. Cattle require huge swaths of pasture and produce enormous amounts of greenhouse gases: methane is released during normal digestion, and both methane and nitrous oxide are released from manure.

The idea of switching from a plate of steak to a bowl of mealworms may be too much for most Westerners, so what's the human meat lover to do? Luckily, the study suggests that swapping beef for chicken or imitation meat (such as tofu) can yield large environmental benefits, because poultry and soy both require less land and produce fewer greenhouse gas emissions than cattle. The study also concluded that "meat in a dish", or lab-grown meat, is no more sustainable than chicken or eggs.

Although insects might not replace meat on human plates any time soon, they can still enter the farming discussion. Cattle raised for human consumption are currently fed diets of hay, soy, grain and other surprising items. Cattle need high levels of protein, which is one reason mad cow disease became so prevalent: rendered parts of cattle were fed back to other cattle, making them sick. Insects could help fill this protein gap, an idea supported by a survey of farmers, agricultural stakeholders and the public in Belgium.

Our eating practices affect the environment; moving towards a sustainable agricultural system is a commendable goal. Every person can decide for themselves how far they’re willing to go along the food chain to achieve a smaller carbon footprint. (ScienceDaily)

Vision Loss

Letting the Blind See Again

Vision loss is devastating: vision is the sensory input humans rely on most. It can result from injury or from genetic and physiological disorders. Retinitis pigmentosa, a degenerative disease of the retina, affects about 100,000 people in the US. There is currently no cure, but clinical trials are exploring gene therapy, dietary changes, and other drugs to slow the disease.

A new synthetic, soft-tissue retina has been invented by a graduate student at Oxford University. The artificial retina is biodegradable and uses synthetic yet biological materials to mimic the human retina, a composition less likely to trigger an adverse reaction in the body and less invasive than current retinal implants made of hard metal. Its inventor, Restrepo-Schild, developed a bilayer of water droplets that responds to light with electrical impulses, which stimulate the cells at the back of the eye just as healthy retinal cells would. The prototype has yet to be tested in animals.

Another way to restore vision is gaining traction: xenotransplants (transplants from animals to humans). Just last year a Chinese boy’s vision was restored after a corneal transplant from a pig. Pigs are good candidates for human transplantation because they are anatomically and physiologically similar, and they are ethically more desirable sources than non-human primates. Although pigs are not immunologically similar to humans, the eye transplants are unlikely to be rejected by the recipient because this part of the body is immune-privileged.

Restoring vision is an important and admirable task. Scientists and clinicians have multiple avenues to explore to help people regain their sight. (ScienceDaily)


Written by sciencepolicyforall

May 9, 2017 at 9:43 am