By: Thaddeus Davenport, PhD
DNA Data Storage
In a recent Nature News article, Andy Extance described the growing need for novel data storage methods and materials. It is estimated that between 2013 and 2020 there will be a tenfold increase in digital information, requiring 44 trillion gigabytes of storage. This number is difficult to comprehend, but its magnitude and the rapid rate of digital data growth are put in context by a second, more shocking, estimate: if the expansion of digital information continues at the forecasted rates, the data requiring storage in 2040 would demand “10 to 100 times the expected supply of microchip-grade silicon.” For this reason, researchers have begun considering alternative data storage materials, including DNA, which stores information at an impressive density; it is estimated that 1 kg of DNA would be sufficient to store the world’s digital archives. DNA is also stable: while hard disks begin losing data after less than ten years of storage, Nick Goldman, a researcher pioneering DNA data storage at the European Bioinformatics Institute (EBI), notes that in 2013 researchers successfully read the genome of a horse that had been trapped in permafrost for 700,000 years. But a number of hurdles must be overcome before we are able to stream our favorite show out of a test tube: 1) reading and (especially) writing DNA sequences is slow, 2) DNA synthesis is error-prone, 3) DNA synthesis is currently expensive, and 4) it is difficult to access specific pieces of information stored within DNA. There have been exciting advances over the last few years from researchers at EBI, Harvard, the University of Washington, and Microsoft that begin to address these problems. This year, researchers at Microsoft and the University of Washington reported successfully storing and retrieving 200 megabytes of data in DNA. This is a far cry from the 44 trillion gigabytes of storage we will require in 2020, but progress in science is non-linear, and the need for alternative storage media will motivate the growth of this exciting field. (Andy Extance, Nature News)
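To give a sense of how digital data can be represented in DNA at all, the sketch below is a hypothetical toy example in Python, not the encoding scheme used by the EBI or Microsoft/University of Washington teams: it simply maps binary data onto the four nucleotides at two bits per base. Real schemes add constraints to avoid error-prone sequence features (such as long runs of the same base) and include redundancy for error correction.

```python
# Toy illustration of DNA data storage: map binary data onto the four
# nucleotides, two bits per base. A simplified, hypothetical sketch for
# intuition only; not the encoding used by any of the groups mentioned above.

BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Convert bytes into a nucleotide string, four bases per byte."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Recover the original bytes from a nucleotide string."""
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

message = b"science policy"
strand = encode(message)
assert decode(strand) == message
print(strand)  # four nucleotides per stored byte
```

Even in this naive scheme, every stored byte costs four bases to synthesize and read back, which is why the speed, cost, and error rate of DNA synthesis and sequencing dominate the practical hurdles listed above.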
Oklahoma Shuts Down Wastewater Injection Wells Following Earthquake
A significant amount of wastewater is released in the process of extracting oil and gas from both traditional and hydraulic fracturing (“fracking”) wells. One way to dispose of this wastewater is to inject it deep into the earth’s crust. As oil production has increased in the continental United States over the last few years, wastewater injection has increased in stride. Recent evidence suggests that wastewater injection into rock formations alters pre-existing stresses within faults, in some cases leading to slippage that results in an earthquake. A recent article by Niraj Chokshi and Henry Fountain for the New York Times reported that on September 3rd, Oklahoma experienced a 5.6-magnitude earthquake, tying the state’s record for its most severe earthquake, set in 2011. In response, Oklahoma government officials ordered the shutdown of three dozen wastewater injection wells in the area most affected by the earthquake. The quake comes amid a dramatic increase in earthquake frequency for the state. In 2009, there were only three earthquakes of magnitude 3 or greater, but in 2015 this number increased to over 900. To address this increase, state officials ordered a reduction in wastewater injection last year in the hope of decreasing earthquake activity. To date in 2016, there have been over 400 earthquakes of magnitude 3 or greater in Oklahoma. While it is widely accepted that oil and gas production and the associated wastewater injection have set off a number of earthquakes in Oklahoma and other states, it remains unclear whether last Saturday’s earthquake was the result of this activity. In the future, additional monitoring of injection wells will provide valuable data to inform decisions on the placement and operation of wastewater injection wells. (Niraj Chokshi and Henry Fountain, New York Times)
Early Support for Amyloid Plaques as the Causative Agent of Alzheimer’s Disease
As humans live longer, Alzheimer’s disease is becoming an increasingly significant public health problem. The prevailing hypothesis is that aggregation of proteins such as amyloid-β (Aβ) into larger “plaques” leads to Alzheimer’s disease, but there is still no direct evidence demonstrating that Aβ plaques cause the disease. In a Nature News & Views article this week, Eric M. Reiman summarizes the results of an article published in the same journal, which showed that a human antibody, called aducanumab, was able to reduce Aβ plaques in a dose-dependent manner in a small, 12-month, placebo-controlled human trial. Though other Aβ-targeting therapies have successfully reduced Aβ aggregates, the most tantalizing result of this study comes from early exploratory analysis of the trial data, which suggested, based on a study population too small to support definitive conclusions, that higher doses of aducanumab and larger reductions in Aβ plaques were associated with slower cognitive decline. Before accepting the hypothesis that Aβ plaques cause Alzheimer’s disease, it will be critical to repeat the experiment in larger clinical trials appropriately powered to measure the impact of antibody treatment and plaque reduction on cognitive decline. The study authors also noted that high doses of antibody were sometimes associated with inflammation within the brain, leading them to limit the maximum antibody dose tested. Overall, these are exciting results, which, if confirmed in larger clinical trials, would provide much-needed clarity about the mechanism of Alzheimer’s disease and inform future treatments. (Eric M. Reiman, Nature News & Views)
Have an interesting science policy link? Share it in the comments!