Science Policy For All

Because science policy affects everyone.


Science Policy Around the Web August 30th, 2019


By Andrew Wright, BSc

Image by Steve Buissinne from Pixabay

EPA’s controversial ‘secret science’ plan still lacks key details, advisers say

In early 2018, under then-administrator Scott Pruitt, the U.S. Environmental Protection Agency (EPA) first proposed rules to restrict the use of scientific findings whose data and methodologies are not public or cannot be replicated. Following the late-2017 removal of all sitting Science Advisory Board (SAB) members who received EPA grants (roughly half of the board), there was concern that environmental experts were being sidelined from EPA decision-making, a concern the proposed rule seemed to confirm. While making data public and replicable has merit, the SAB has raised concerns that the proposed rule would make it impossible to use the most accurate information available, as many environmental studies are long-term assessments of human exposure to toxins that cannot be ethically or efficiently replicated. Now, under administrator Andrew Wheeler, how this proposed rule will be implemented is still unclear.

A central concern is how to maintain privacy over personally identifiable information (PII) in order to comply with existing privacy laws and concerns (such as the Health Insurance Portability and Accountability Act, or HIPAA). One proffered strategy is a tiered approach based on the model currently used by the National Institutes of Health, whereby the more sensitive the PII, the more restricted its access.

As the SAB has decided to engage in a consultation on the proposed rule, individual members will have their comments written up in a report to be sent to Wheeler, but they will not have to reach a consensus for the proposed rule to move forward.

(Sean Reilly, Science, reprinted from E&E News)

Brazilian Amazon deforestation surges to break August records

While the recent spate of fires in the Amazon rainforest has captured international attention, regular deforestation via cutting and clearing has also been rapidly increasing. In August alone, 430 square miles, an area the size of Hong Kong, were cut down. This comes after July's loss of 870 square miles, a 275% jump from the previous year. At the current rate of deforestation, Brazil is on track to lose more than 3,800 square miles of rainforest this year, an area roughly one and a half times the size of Delaware.

“The August data from Deter is hardly surprising,” said Claudio Angelo of Climate Observatory, referencing the Deter-B satellite that was put into place in 2015 to monitor Brazil’s rainforests. According to him and other representatives from non-governmental organizations, the Bolsonaro government is delivering on its promises to support local industries such as mining, ranching, farming, and logging rather than enforcing environmental protections. 

While this deforestation data is separate from data on forest fires, felled trees are often left to dry before being set aflame, leading forest engineers to warn that the fires will get worse in the coming months.

Since the Amazon rainforest generates its own weather patterns, studies have demonstrated the possibility that, after 40% deforestation has occurred, the biome may irreversibly convert to savannah. This could alter global weather patterns, affecting Brazilian weather most severely. However, recent estimates place that tipping point closer to 20-25% due to the synergistic effects of climate change. According to the World Wildlife Fund, approximately 17% of the rainforest has been lost in the past 50 years, putting uncontrollable forest conversion much closer than previously assumed.

(Jonathan Watts, The Guardian)


Written by sciencepolicyforall

August 30, 2019 at 11:08 am

Science Policy Around the Web August 16th, 2019


By Neetu M. Gulati, PhD

Image by vegasita from Pixabay 

How Eating Less Meat Could Help Protect the Planet from Climate Change

A recent report by the United Nations climate science body, the Intergovernmental Panel on Climate Change (IPCC), warns that now is a moment of reckoning for how humans use the planet. The report highlights how the planet has been impacted by land-use practices, deforestation, agriculture, and other activities. These threaten our ability to limit the global temperature increase as outlined by the 2015 Paris climate agreement. The report further outlines how humans can help stop the impacts of climate change by drastically changing what food we eat as well as how it is produced.

Explaining this logic, Debra Roberts, the co-chair of IPCC Working Group II, commented, "some dietary choices require more land and water, and cause more emissions of heat-trapping gases than others." If people eat more sustainably grown and produced foods, as well as more plant-based diets, this could provide opportunities to adapt to and mitigate potential climate issues. Meats like beef and lamb are particularly taxing on the environment for the amount of meat obtained, partially because such livestock require large spaces to graze. Reducing the amount of land used to produce meat, and using that land more efficiently through sustainable farming practices, will be imperative to ensure that land remains usable as the planet warms.

While much of the world already eats a largely plant-based diet, heavy meat consumption tends to be concentrated in wealthier countries. As countries with lower meat consumption gain wealth, there is a risk that they will eat more meat and put a greater strain on the environment. While not every country will stop eating meat, the recent popularity of meatless products is encouraging, and hopefully the public will begin to recognize that food and agriculture are important in the fight against climate change.

(Abigail Abrams, Time)

“Qutrit” Experiments are a First in Quantum Teleportation

Many believe that quantum information science is a key avenue of research for future technologies. Now, for the first time, researchers have used this technology to teleport a qutrit, a three-level unit of quantum information. This is an important advance for the field of quantum teleportation, previously limited to qubits, the quantum equivalent of binary bits. The two research teams who independently achieved this feat first had to create qutrits from photons, a challenge in and of itself. Because qutrits can carry more information and are more resistant to noise than qubits, these experiments may mean that qutrits become an important part of future quantum networks.
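For readers unfamiliar with the notation, the distinction is easy to state (a textbook-level sketch, not taken from either paper): a qubit is a superposition of two basis states, while a qutrit superposes three, which is where the extra information capacity comes from.

```latex
% A qubit occupies a two-dimensional state space:
\[
|\psi_{\mathrm{qubit}}\rangle = \alpha|0\rangle + \beta|1\rangle,
\qquad |\alpha|^2 + |\beta|^2 = 1
\]
% A qutrit adds a third basis state:
\[
|\psi_{\mathrm{qutrit}}\rangle = \alpha|0\rangle + \beta|1\rangle + \gamma|2\rangle,
\qquad |\alpha|^2 + |\beta|^2 + |\gamma|^2 = 1
\]
% so each qutrit can carry log2(3) = 1.585 bits of classical
% information, versus exactly 1 bit for a qubit.
\[
\log_2 3 \approx 1.585
\]
```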

In quantum science, the states of entangled particles are connected. Thus, in quantum teleportation, the state of one entangled particle, for example the spin of an electron, influences the second particle instantaneously, even when the two are far apart. While this sounds like something out of a science-fiction story, this milestone may have important real-world implications: quantum teleportation may prove important for secure communications in the future. In fact, much of quantum teleportation research is funded because of its importance for the future of cybersecurity.

The qutrit teleportation experiments were independently performed by two research teams. One team, led by Guang-Can Guo at the University of Science and Technology of China (USTC), reported their results in a preprint paper in April 2019. The other team, co-led by Anton Zeilinger of the Austrian Academy of Sciences and Jian-Wei Pan at USTC, reported their findings in a preprint paper in June 2019 that has been accepted for publication in Physical Review Letters. The two teams agree that each has successfully teleported a qutrit, and both have plans to go beyond qutrits to at least ququarts (four-level systems). Other researchers are less convinced, saying the methods used by the two teams are slow and inefficient, and therefore not suited for practical purposes. In response, one of the authors of the paper by Zeilinger and Pan's team, Chao-Yang Lu, said, "science is step by step. First, you make the impossible thing possible. Then you work to make it more perfect."

(Daniel Garisto, Scientific American)

 

Written by sciencepolicyforall

August 16, 2019 at 3:15 pm

Science Policy Around the Web August 1st, 2019


By Andrew Wright, BSc

Image by Steve Buissinne from Pixabay 

Major U.S. cities are leaking methane at twice the rate previously believed

While natural gas emits less carbon dioxide (CO2) when burned, natural gas that escapes unburned into the atmosphere as methane (CH4) acts as a greenhouse gas 20-80 times more potent than CO2. Some of this impact is supposed to be mitigated by the relatively low amount of leaked methane: roughly 370,000 tons across six major urban areas studied, according to a 2016 EPA report. However, a new study in the journal Geophysical Research Letters analyzed those same metropolitan centers and found that the EPA had underestimated methane release by more than half. By taking simultaneous measurements of ethane, which appears only in the natural gas supplied to homes and businesses, researchers were able to delineate the sources of the leaked methane, as natural sources and landfills do not give off ethane.
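As a rough illustration of the tracer logic (a sketch with placeholder numbers chosen to reproduce the study's headline 84% share; the actual ethane-to-methane ratio of pipeline gas varies by city and supplier and is not given here):

```python
# Illustrative sketch of the ethane tracer method described above.
# The ratio below is a placeholder, not a value from the study.

ETHANE_TO_METHANE_RATIO = 0.03  # tons of ethane per ton of leaked pipeline methane (assumed)

def methane_from_natural_gas(ethane_flux_tons: float) -> float:
    """Infer the natural-gas-derived methane flux from a measured ethane flux.

    Landfills and biological sources emit methane but essentially no
    ethane, so observed ethane is attributed to leaked pipeline gas.
    """
    return ethane_flux_tons / ETHANE_TO_METHANE_RATIO

total_ch4 = 890_000   # tons/yr of methane across the six cities (from the study)
ethane = 22_428       # tons/yr, a hypothetical co-measured ethane flux
from_gas = methane_from_natural_gas(ethane)
print(f"natural-gas share of methane: {from_gas / total_ch4:.0%}")  # -> 84%
```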

From their analysis, the total estimate for the six cities studied was 890,000 tons of CH4, 84% of which came from natural gas leaks. While the authors of the study are unsure why the EPA estimates are so low, they suggest it could be because the EPA only estimates leaks in the distribution system, rather than endpoint leaks in homes and businesses. While these results cannot be reliably extrapolated to newer cities, whose infrastructure may be more resilient to leakage, they could prompt further study to build a clearer picture of national methane release.

(Sid Perkins, Science)

 

Japan approves first human-animal embryo experiments

On March 1st, the Japanese science ministry lifted a ban on growing human cells in animal embryos and transferring them into animal uteri. While human-animal hybrid embryos have been made before, functional offspring have not been allowed to develop. The first researcher to take advantage of this new regulatory scheme is Hiromitsu Nakauchi, director of the Center for Stem Cell Biology and Regenerative Medicine at the Institute of Medical Science at the University of Tokyo and a faculty member at Stanford University. His long-term goal is to grow human organs in animals such as pigs, from which the functional organs could be extracted and transplanted into human patients. His plan is to start with an early embryonic mouse model, then a rat model, and finally a pig model in which embryos develop for up to 70 days.

This measured approach is in stark contrast to the recent controversy over CRISPR-edited babies in China, but it has still been met with a certain level of ethical skepticism. Bioethicists are particularly concerned that the human cells being injected into animal embryos, induced pluripotent stem (iPS) cells, may deviate from their intended target (in this case, the pancreas) and affect the host animal's cognition. According to Nakauchi, the experimental design, which involves eliminating the gene for the target organ and injecting human iPS cells to compensate, ensures that the human cells should only contribute to a specific part of the animal.

While Nakauchi's group has used this method to successfully grow a pancreas from mouse cells in a rat, it has had limited luck putting human iPS cells into sheep embryos. Given the evolutionary distance between mice, rats, pigs, and humans, it may be difficult to produce more satisfactory results. To address this, Nakauchi has suggested that he will try genetic editing techniques as well as various developmental stages of iPS cells.

(David Cyranoski, Nature)

 

 

Written by sciencepolicyforall

August 1, 2019 at 12:23 pm

Homegrown Apocalypse: A Guide to the Holocene Extinction


By: Andrew Wright BSc

One of the unifying factors of mass extinctions is a rapid change in global average temperature. The end-Ordovician extinction, the second largest, occurred when newly forming mountains made of silicate rock quickly absorbed atmospheric CO2. The global average temperature plunged, leading to the formation of enormous glaciers, drastically lower ocean levels, and much colder waters. Since complex life was still relegated to the oceans, this killed 86% of all species. The most well-known extinction is the end-Cretaceous or K-Pg event caused in part by a massive asteroid impact in Chicxulub, Mexico. The immediate impact, roughly one billion times stronger than the atomic bombings of Japan, was devastating in its own right. However, the subsequent ejection of sulfate-bearing rock into the atmosphere was the real killer, dropping global temperatures by 2-7°C, inhibiting photosynthesis, and acidifying the oceans. Coming right after a period of global warming, this extinction killed about 76% of all species.

These extinctions pale in comparison to the end-Permian extinction, also known as the Great Dying. When Pangea was the sole continent, an enormous pool of lava called a flood-basalt plain slowly erupted over what is modern-day Siberia. Over 350,000 years, magmatic rock up to a mile thick solidified and covered an area roughly half the size of the United States. This igneous cap forced underground lava to move sideways and spread in paths called sills. As the lava traveled, it vaporized increasing amounts of carbonates and oil and coal deposits, leading to an immense build-up of CO2. Once the sills reached the edge of the cap, these gases were violently expelled, ejecting up to 100,000 gigatons of CO2. The immediate effect was a global average temperature increase of roughly 5°C. Subsequently, oceanic methane hydrate (or methane clathrate) crystals, which become unstable at high temperatures, broke down. Since methane is 20-80 times more potent than CO2 as a greenhouse gas, global average temperature increased a further 10°C, bringing the total to 15°C. This left the planet barren, desertified most of Pangea, strongly acidified the oceans, and killed 96% of marine life and 90% of all life on Earth.

We are currently living through the beginnings of the sixth mass extinction event, known as the Holocene extinction. Species are dying off 10-100 times faster than the background rate, and that rate is accelerating. Insects, including pollinators, are dying off so quickly that 40% of them may disappear within decades. One in eight bird species is threatened with extinction, 40% of amphibians are in steep decline, and marine biodiversity is falling as well. At current rates, half of all species on Earth could be wiped out by the end of the century.

What is the commonality between our present circumstances and the past? As with previous mass extinctions, global average temperature has increased. Since 1880, global average temperature has risen by 0.8°C, and the rate of warming has doubled since 1975. This June was the hottest month ever recorded on Earth, with global average temperature reaching 2°C above pre-industrial levels. Greenland lost two billion tons of ice in one day. This increase in temperature is because we are currently adding 37.1 gigatons of CO2 per year to the atmosphere, and that number is rising.

From the most recent Intergovernmental Panel on Climate Change (IPCC) report, we know that the best outcome is to keep the increase in global average temperature below 1.5°C. Instead, let us consider what would happen if current trends hold and CO2 emissions continue to increase at similar rates until 2100. This scenario is known as the RCP 8.5 model. Under this paradigm, atmospheric CO2 levels will rise from 410 parts per million (ppm) to 936 ppm, and the global average temperature will increase by 6°C from pre-industrial levels. That puts the Earth squarely within the temperature range of previous mass extinction periods.
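As a rough sanity check on those numbers (a sketch using commonly cited conversion factors rather than figures from the IPCC report itself: about 7.8 gigatons of CO2 per 1 ppm of atmospheric concentration, and an airborne fraction of roughly 45%, the share of emissions that stays in the atmosphere):

```python
# Back-of-the-envelope link between annual CO2 emissions and the rise in
# atmospheric concentration. Both constants are assumptions, not figures
# from the IPCC report: ~7.8 Gt CO2 corresponds to 1 ppm, and ~45% of
# emitted CO2 stays airborne (the rest is absorbed by oceans and land).

GT_CO2_PER_PPM = 7.8
AIRBORNE_FRACTION = 0.45

def ppm_rise_per_year(annual_emissions_gt: float) -> float:
    """Approximate annual increase in atmospheric CO2 concentration (ppm)."""
    return annual_emissions_gt * AIRBORNE_FRACTION / GT_CO2_PER_PPM

print(f"{ppm_rise_per_year(37.1):.1f} ppm/yr")  # ~2.1, close to the observed rate
```

Note that climbing from 410 ppm to 936 ppm by 2100 averages about 6.5 ppm per year, roughly three times today's pace, which is why RCP 8.5 assumes emissions keep growing rather than holding steady.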

Given this level of warming, the following can be expected to occur. First and foremost, the extreme heat will massively decrease glaciation, causing a surge in ocean levels. Since water expands as it warms, ocean levels will rise even further, to about 12 feet above current levels. This means most coastal areas will perpetually flood while others will be completely underwater. Non-coastal areas won't be spared either, as high air temperatures will cause desertification, crop die-off, drought, and widespread wildfires. Second, as the ocean absorbs CO2 from the atmosphere, it will become increasingly acidic. So far, the pH of the ocean has changed by only 0.1, but under the RCP 8.5 model that decrease could be as large as 0.48. Since pH is measured on a logarithmic scale, the oceans would then be acidic enough to dissolve the calcium carbonate out of which shellfish and corals are built. Warmer water also cannot hold oxygen as effectively as cold water, meaning many water-breathing species will suffocate. In combination, these factors will eliminate a huge source of the human food supply. Finally, since weather patterns are driven by ocean and air currents, and increasing temperatures can destabilize those currents, massive hurricanes, dangerously cold weather systems, and flood-inducing rainfall will become the norm.
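To unpack the logarithmic point with one worked step (standard chemistry, not a figure from the studies cited above): pH is the negative base-10 logarithm of hydrogen ion concentration, so a drop of 0.48 means roughly a threefold increase in acidity.

```latex
\[
\mathrm{pH} = -\log_{10}[\mathrm{H}^+]
\quad\Longrightarrow\quad
\frac{[\mathrm{H}^+]_{\text{new}}}{[\mathrm{H}^+]_{\text{old}}}
= 10^{\Delta\mathrm{pH}} = 10^{0.48} \approx 3.0
\]
% Even the 0.1 drop already observed corresponds to
% 10^{0.1} = 1.26, i.e. a ~26% increase in hydrogen ion concentration.
```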

One parallel to the end-Permian extinction might emerge as well. Over millions of years, methane clathrate re-stabilized in the permafrost of Siberia and in the deep ocean floor. Under what has been termed the clathrate gun hypothesis, if methane clathrate destabilizes again at high temperatures, the resultant methane emissions and planetary warming could form a positive-feedback loop, releasing ever more crystallized methane until we end up in another "great dying". While short-term warming probably won't cause a runaway temperature increase, a 6°C increase in global average temperature might. New research suggests methane release may not even be necessary, as the ocean is reaching a critical point in the carbon cycle where it could rapidly expel an amount of CO2 on par with flood-basalt events. Moreover, like the end-Permian extinction, anthropogenic climate change is occurring on a near-instantaneous geological time scale, and species, including our own, will not have the requisite time to adapt.

Of course, none of these effects exists in a vacuum. They will occur alongside increasing deforestation for agriculture, plastic and chemical pollution, and resource extraction. The end result would be a planet with less space, little food, mass migration, and devastating weather. So, what can be done to stop this scenario from coming true? The latest IPCC report essentially places humanity at an inflection point: either CO2 output is cut in half by 2030 and humans become carbon neutral by 2050, or the planet is irrevocably thrust past the point of no return.

This timeframe may seem short, but it accounts for the fact that even if civilization were to completely stop emitting greenhouse gases today, it would take hundreds of years for global average temperature to come back down, since the ocean absorbs CO2 from the atmosphere only slowly. Like any problem of scale, there is no single solution to reaching carbon neutrality; it will take a multifaceted approach. Some solutions include enacting carbon tax measures, subsidizing and implementing renewable energy (while divesting from new coal and oil production), an increased reliance on nuclear power, large-scale reforestation, livestock reduction, and carbon-sequestration technology. Some of these efforts have come a long way, and some have gone in the wrong direction.

This is, of course, a global problem to be solved. At a time when the United States has signaled its intention to withdraw from the Paris Climate Accord as soon as possible and states are rejecting carbon cap-and-trade measures, other nations are moving ahead with unprecedented boosts in renewable energy and bold commitments to reducing greenhouse gas emissions. India, the third-largest polluter after China and the United States, is on track to surpass its Paris Accord commitments. Should the United States re-engage with and lead the international effort to tackle what is an existential threat, it is not improbable that the end of this century could be a pleasant one. So, if the idea of living through a global extinction event is disconcerting, one can be assured that the problem is still just barely a solvable one.

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

July 11, 2019 at 4:24 pm

Science Policy Around the Web – June 14th, 2019


By: Andrew Wright BSc

Image by David Mark from Pixabay 

The Pentagon emits more greenhouse gases than Portugal, study finds 

A recent study published by Brown University quantified the Pentagon's total greenhouse gas emissions from 2001 to 2017 using Department of Energy estimates and fuel consumption data. The results demonstrated that during the period studied, the Pentagon's emissions were "in any one year…greater than many smaller countries' greenhouse gas emissions". In 2017 alone, the Pentagon emitted 59 million metric tons of CO2, ranking it higher than Sweden (42 million metric tons), Portugal (55 million metric tons), or North Korea (58 million metric tons). The Pentagon's energy consumption is dominated by aircraft operations (~55%) and diesel use (~14%), with the rest dedicated to powering and heating military facilities.

Were it to be considered a standalone country, the Pentagon would be the 55th-largest contributor of CO2 emissions, according to the study's author, Neta Crawford. In a separate article, she noted that "…the Department of Defense is the U.S. government's largest fossil fuel consumer, accounting for between 77% and 80% of all federal government energy consumption since 2001". While the Pentagon has put measures in place to reduce its emissions in recent years, its own threat assessment warns that fully two-thirds of U.S. military installations are or will be at risk due to climate change, suggesting further efforts may be needed.

(Sebastien Malo, Reuters)

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

June 14, 2019 at 3:58 pm

How human health depends on biodiversity


By: Lynda Truong

Image by V Perez from Pixabay 

By many measures, the Earth is facing its sixth mass extinction. The fifth mass extinction, a result of a meteorite approximately 10 km in diameter, wiped out the dinosaurs and an estimated 40-75% of species on Earth. This time around, the natural disaster that is threatening life on Earth is us.

In May, the United Nations released a preliminary report on the drastic risk to biodiversity (not to be confused with the recent report on the drastic consequences of climate change). The assessment, compiled by the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES), draws on information from 15,000 scientific and government sources, with contributions from 145 global experts. It projects that one million species face risk of extinction. Scientists have estimated that the historical base rate of extinction is one per million species per year, and more recent studies suggest rates as low as 0.1 per million species per year. At the established base rates, it would take one to ten million years to see the same magnitude of extinction the planet currently faces. This accelerated rate of extinction can be linked to a variety of man-made causes, including changes in land and sea use, direct exploitation of organisms, climate change, pollution, and the introduction of invasive species.
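The arithmetic behind that one-to-ten-million-year figure can be sketched as follows (the species-pool size here is an illustrative assumption, not a number published in the report):

```latex
% At a background rate r (extinctions per million species-years) across a
% pool of S million species, expected extinctions per year = r * S, so the
% time to lose N species is:
\[
t = \frac{N}{r \cdot S}
\]
% Example: with N = 10^6 at-risk species, r = 1 E/MSY, and an assumed pool
% of S = 1 million assessed species, t = 1 million years; at the lower
% background rate r = 0.1, t = 10 million years, matching the range above.
```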

For some, that may not seem important. If humans are not on the endangered species list, why should it matter? As the IPBES Global Assessment indicates, however, healthy ecosystems provide a variety of services, including improving air quality, purifying drinking water, and mitigating floods and erosion. The vast canopies of rainforests worldwide sequester 2.6 billion tons of carbon dioxide a year. Plants and soil microbes found in wetlands can remove toxins from water, including explosive chemicals such as nitroglycerin and trinitrotoluene (TNT). Mangrove forests serve as an important buffer against ocean storm surges for those on land. Nature is a powerful resource, and declines in biodiversity have broad implications for global development and health.

The importance of biodiversity to global health is immediately apparent in middle- and low-income countries, which rely heavily on natural remedies and seasonal harvests for health and nutrition. The loss of entire plant species can eliminate valuable sources of traditional medicine for indigenous communities. Genetically diverse crops are more resilient to pests and disease, ensuring a stable food supply and bolstering food security. Beyond this, ecosystem disturbances also have complex implications for infectious diseases, which are often endemic to developing nations.

However, these effects are also seen in first world countries. A well-cited example of the impact of biodiversity loss on infectious disease involves Lyme disease, which is endemic to parts of the United States. The white-footed mouse is a common carrier of Lyme disease, and in areas with high densities of these mice, ticks are likely to feed on the mice and subsequently transmit the disease to humans. However, the presence of other mammals that ticks can feed on dilutes the disease reservoir, lowering the likelihood of an outbreak (commonly referred to as the "dilution effect"). While biodiversity has complicated effects on the spread of infectious diseases, drastic changes to ecosystems often provide a breeding ground for disease vectors and lead to increases in transmission.

In addition to the direct effects that declines in biodiversity have on global health, an often-neglected aspect of biodiversity's importance is its role as a resource for biomedical science. The IPBES assessment reports that 70% of cancer drugs are natural products or inspired by natural sources such as traditional medicines. This merely scratches the surface of nature's influence on modern biomedical research.

Much like the communities that rely on natural products as medicine, many drug compounds produced by pharmaceutical companies are derived from nature. Morphine has been one of the most revolutionary drug compounds in history, effectively treating both acute and chronic pain. The compound was originally isolated from the opium poppy, and its chemical structure has since been modified to reduce negative effects and improve potency. While the current opioid crisis in the United States has highlighted the importance of moderate use, morphine and its analogues are some of the most useful and reliable pain relievers in modern medicine. Similarly, aspirin has been regarded as a wonder drug for its analgesic, anti-inflammatory, and cardioprotective effects. Aspirin is a chemical analogue of salicylic acid, a compound originally isolated from willow tree bark. 

Beyond general pain relief, many naturally derived drugs have also been useful for disease treatment. Quinine, the first effective antimalarial drug, was extracted from the bark of cinchona trees, and quinine and its analogues are still used to treat malaria today. Penicillin, serendipitously discovered in a fungus, has been useful for treating bacterial infections and informing modern antibiotic development. These medicines and many more have been crucial to the advancement of human health, yet could have just as easily been lost to extinction.

On a more fundamental level, scientific research has benefited from many proteins isolated from nature. Thermophilic polymerases, isolated from a bacterium residing in hot springs, are now an essential component of the polymerase chain reaction (PCR), a common laboratory technique that amplifies segments of DNA. This method is critical in molecular biology labs for basic research and in forensic labs for criminal investigations. Fluorescent proteins, which have been isolated from jellyfish and sea anemones, revolutionized the field of molecular biology by allowing scientists to visualize dynamic cellular components in real time. More recently, CRISPR/Cas systems were discovered in bacteria and have been developed into gene-editing tools capable of easily and precisely modifying genetic sequences. These basic tools have vastly improved the scope of biomedical research, and all of them would have been close to impossible to develop without their natural sources.
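To give a sense of why PCR is so central to both basic research and forensics, the amplification is exponential; a minimal sketch (idealized, assuming perfect doubling each cycle):

```python
# PCR amplification is exponential: each thermal cycle roughly doubles
# the number of copies of the target DNA segment.

def pcr_copies(initial_copies: int, cycles: int, efficiency: float = 1.0) -> float:
    """Expected copy number after a given number of PCR cycles.

    efficiency=1.0 models perfect doubling each cycle; real reactions
    run somewhat below that.
    """
    return initial_copies * (1 + efficiency) ** cycles

# A single DNA molecule becomes about a billion copies in 30 cycles,
# which is what makes trace forensic samples analyzable.
print(f"{pcr_copies(1, 30):.2e}")  # -> 1.07e+09
```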

In addition to medicines and tools, nature has often informed biomedical research. Denning bears are commonly studied for potential solutions to osteoporosis and renal disease. Their ability to enter a reduced metabolic state, in which they do not eat, drink, or defecate for months at a time, provides valuable insight into how these biological processes might be adapted to benefit human disease and physiology. Even more interestingly, a few species of frogs freeze nearly solid in winter and thaw fully recovered in spring. In this frozen state, much of the water in their bodies turns to ice, their hearts stop beating, and they stop breathing. When temperatures rise, they thaw from the inside out and continue life as usual. Crazy cryonics and immortality aside, these freeze/thaw cycles could inform improved preservation of organs for transplant.

Nature is a much better experimentalist than any human, having had billions of years to refine its experiments through the process of evolution and natural selection. Depleting these living resources, which provide invaluable benefits to human health and ecosystems, lacks foresight and is dangerously reckless. The techno-optimist approach of ceaseless development, in the blind belief that whatever problem humanity encounters can be solved with research and innovation, neglects to account for the dependency of research and innovation on nature. Most biomedical scientists, most physicians, and much of the general public have probably devoted a minimal amount of consideration to the importance of biodiversity. But for the one million species currently at risk, and for the hundreds of millions more yet to be discovered, it's worth a thought.

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

June 7, 2019 at 9:51 am

Science Policy Around the Web – April 30th, 2019


By: Andrew Wright, BSc

Source: Pixabay

North American drilling boom threatens big blow to climate efforts, study finds

At a time when the most recent Intergovernmental Panel on Climate Change (IPCC) report has determined that CO2 emissions must be halved by 2030 to prevent irreversible climate change (and the consequences thereof), energy investments appear to be following the opposite trend. According to the Global Energy Monitor's assessment of pipeline infrastructure, 302 new pipelines are under development, 51.5% of them in North America. This reflects a current pipeline expansion investment of $232.5 billion, part of a total $1.05 trillion in investments that include processing, storage, export, and other oil and gas related expenses. Even though 80% of these pipelines are dedicated to natural gas infrastructure, should each project in the United States be completed and fully utilized, they would lead to approximately an 11% increase in national CO2 emissions by 2040, at a time when those emissions should be approaching a 75% reduction.

Setting aside the impacts on global climate, human health, and the associated societal costs, the authors of the infrastructure assessment argue that these pipelines may yield a poor return on investment. To start, the output of the new North American pipelines far exceeds domestic energy demand and will therefore rely on exporting oil and natural gas to foreign markets. However, these same markets are boosting their own fuel production capacity and will likely become less reliant on North American imports. Furthermore, renewable energy sources have become as cheap as or cheaper than their oil and gas counterparts and are expected to become more affordable still as technology improves. Both factors threaten to upend the future market these pipeline investments require, much as cheap natural gas production disrupted the US coal market, which relied on the same foreign-export model before its collapse.

(Oliver Milman, The Guardian)

Sexual harassment is pervasive in US physics programs

Sexual harassment is a problem across United States academia. For example, a 2018 National Academies of Sciences, Engineering, and Medicine (NASEM) report found that roughly 22% of female respondents in non-STEM majors said they had experienced sexual harassment, whereas in STEM majors that percentage ranged from 20% in the sciences to 47% in medicine. However, research published in the journal Physical Review Physics Education Research shows that sexual harassment is particularly pervasive among women pursuing undergraduate degrees in physics. Of the women who responded, 338 of 455, or 74.3%, reported experiencing harassment. In addition, 20.4% of respondents said they had experienced all three forms of sexual harassment evaluated: sexual gender harassment, sexist gender harassment, and unwanted sexual attention.

Much as the NASEM report indicated for all academic fields, the high incidence of sexual harassment observed in physics programs is correlated with negative academic outcomes for those experiencing it, including a diminished sense of belonging and a higher propensity toward the imposter phenomenon, or attributing personal success to external factors. While large funding institutions, such as the National Institutes of Health and the National Science Foundation, have recently pushed harder to combat sexual harassment, it is clear that such efforts should be expanded and that particular attention should be paid to certain academic fields.

(Alexandra Witze, Nature News)



Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

April 30, 2019 at 10:46 am