Science Policy For All

Because science policy affects everyone.

Posts Tagged ‘innovation’

Science Policy Around the Web – February 26, 2019

leave a comment »

By: Mary Weston, Ph.D.

Source: Wikimedia

A Century-Old Debate Over Science Patents Is Repeating Itself Today

In 1923, after the economic devastation of World War I, the Italian senator Francesco Ruffini wanted to bolster scientific research by giving scientists ownership of their discoveries. His scheme would have awarded scientists a patent of sorts on the laws of nature they found. Although he had considerable scientific support and the backing of the newly formed League of Nations, scientists around the world ultimately rejected the plan for various reasons. Recently proposed changes to patent law on scientific discoveries bear a striking similarity to those proposals of nearly 100 years ago.

Ruffini, desiring to increase scientific research, argued that scientists should be able to receive “scientific property” for a discovery, similar to patents awarded for inventions. He cited the example of “Hertzian waves” (i.e. radio waves) as a discovery that resulted in many valuable products. The proposal was a major departure from existing law, under which patents could be assigned only for inventions – artificial things made by humans, like machines – but not for discoveries of the natural world. Ruffini “was clear that scientific property would not prevent all uses of a natural law, but only practical commercial applications.”

In 2017, the American Intellectual Property Law Association (AIPLA) and the American Bar Association’s Intellectual Property Section (ABA’s IP) both submitted proposals to change current law (35 U.S.C. Section 101) and allow for patents on scientific discoveries. Motivation for change stems from recent Supreme Court decisions regarding patents for medical techniques (use of the BRCA1/2 genes for detecting breast cancer risk, and a blood diagnostic test to fine-tune autoimmune disease treatments). Legislators, specifically Senators Thom Tillis and Chris Coons, are currently revisiting these guidelines, and roundtables were held in both January and February of this year.

The demise of the 1920s proposal was due to details of implementation, very similar to the problems current proposals face. These include how to:

  • attribute scientific property when there are many contributors to one discovery (i.e. who “discovered” electricity? Benjamin Franklin? Georg Ohm?). 
  • deal with unexpected liability, potentially requiring some sort of scientific property insurance scheme. 
  • deal with the scope of some scientific discoveries, which may be so large that it leads to tremendous and costly amounts of litigation. 
  • write the patents with the specificity required without being too vague and/or speculative. 

Edward S. Rogers, a Chicago lawyer who assisted Ruffini with his proposals in the 1920s, ultimately warned against it in 1931, saying that while the plan was appealing, “the whole scheme seems impractical.”

If changes to the patent law are to occur, the same issues that prevented change nearly 100 years ago will need to be solved – a daunting and challenging task.

(Charles Duan, Slate)

Japanese Spacecraft Successfully Snags Sample of Asteroid Ryugu

The Hayabusa2, a Japanese asteroid-sampling spacecraft, just successfully retrieved surface pieces from Ryugu, a 3,000-foot-wide asteroid. To obtain the sample, the probe fired a 0.2-ounce tantalum “bullet” into the boulder-covered surface at close range, and then collected the disturbed particles using a “sampling horn” located on the underside of the machine. 

Japan’s space agency (JAXA) launched the Hayabusa2, whose name is Japanese for “peregrine falcon,” in December 2014. JAXA told CNN that even reaching the asteroid, 180 million miles from Earth, is the “equivalent of hitting a 2.4-inch target from 12,400 miles away”. Upon arrival, the probe circled the small asteroid collecting data. Then, last September, two small rovers were successfully released to image and document the asteroid surface. 

The goal of this exploration journey is to better understand the early history and evolution of the solar system. Ryugu is a C-type asteroid, the category that ~75% of known asteroids fall into, and is thought to contain water and other organic materials. One theory suggests that much of Earth’s water and organic compounds may have been delivered by asteroids and comets. This will be the first time scientists have visited and collected samples from this type of asteroid, and evaluation of its composition may “clarify interactions between the building blocks of Earth and the evolution of its oceans and life,” JAXA explained.

JAXA is planning two additional sampling expeditions in the next couple of weeks. The second will collect additional surface material. The third will use a copper projectile to create a crater in order to obtain samples from beneath the asteroid’s surface, which has been weathered by deep-space radiation. The Hayabusa2 will depart the asteroid in December 2019 and should arrive back on Earth in December 2020.


Have an interesting science policy link? Share it in the comments!


Written by sciencepolicyforall

March 1, 2019 at 12:58 pm

Publications and Patents: Laying the foundation for future innovation and economic growth

leave a comment »

By: Xavier Bofill de Ros, Ph.D.


Source: Pixabay


Many hours at the laboratory bench made me wonder: What is the real impact of our science? How do the thousands of publications appearing in scientific journals every month, and the funds poured into research, benefit our society? We all know the history: Fleming’s research on mold resulted in the discovery of penicillin, which has saved millions of lives ever since, and GPS systems rely heavily on basic trigonometry. These examples embody the power of science as a driver of technological progress and motivate public policies to support scientific research. For example, NIH receives $37 billion annually to fund intramural and extramural biomedical research[1]. Some of this investment in research generates intellectual property, returning private money to the system through license agreements. For instance, the NIH Technology Transfer Office had an income of $138 million from royalties in 2015[2]. However, many critics are quick to point out that basic research rarely pays off in practical R&D.

To understand where we are, we need to know where we came from. A big part of the current legislation governing the intellectual property derived from publicly-funded research is inspired by the Patent and Trademark Law Amendments Act, also known as the Bayh–Dole Act, passed in 1980. This act established that universities, small businesses and non-profit institutions are entitled to ownership of inventions made with federally-funded research, in preference to the government. Prior to the act, the government accumulated ownership of large numbers of patents derived from the $75 billion per year of funding disbursed through different agencies; however, fewer than 5% of those patents were licensed[3]. In exchange for this new source of revenue, institutions receiving public money are required to educate the research community about patenting procedures and to protect the government’s interests in funded inventions, among other requirements. Despite criticisms for forcing consumers to “pay twice” for patented products, the economic impact of the Bayh-Dole Act has been important. Recent reports suggest that academic licenses to industry contributed between $148 billion and $591 billion to US gross domestic product (GDP) between 1996 and 2015[4].

Besides economic performance, another approach to assessing the impact of scientific publications on intellectual property comes from bibliometric analysis of the prior art cited in issued patents. A recent study from the Kellogg School of Management analyzed the content of 4.8 million patents and 32 million research articles to find out how research is connected to inventions[5]. By analyzing the prior art references of patents, and the references of those references, the authors revealed that 80% of research articles could be linked to a future patent. This connection is often indirect: direct citations of research articles in patents account for only about 10%, but the figure quickly accumulates to 42% and 74% when second-degree and third-degree citations are included. This indicates that the vast majority of the publication corpus ends up in the pool of knowledge from which inventions arise. The analysis of the distance between research articles and patents also revealed differences between fields of research. Areas such as “Computer science”, “Nanotechnology” and “Biochemistry and Molecular Biology” show a more immediate impact on patents compared to fields that are less easily applicable. The authors of the study also went on to address which institutions yield research articles with a more significant impact on patents. To this aim, they compared the publications of universities, government laboratories and publicly traded firms. Consistent with previous studies, firms’ scientific production is the most directly linked to patent production. However, universities and government publications follow at a very close distance, despite generally engaging with more long-term research goals.
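The degree-of-citation idea in the Kellogg study can be pictured as a shortest-path search over a citation graph: a patent’s direct paper citations sit at degree 1, references of those references at degree 2, and so on. A minimal, purely illustrative Python sketch (the five-node toy graph is invented, not the paper’s 4.8-million-patent dataset):

```python
from collections import deque

# Toy citation graph: each node lists what it cites.
# Patents cite prior art (patents or papers); papers cite other papers.
citations = {
    "patent_A": ["patent_B", "paper_1"],  # paper_1 is a direct (degree-1) citation
    "patent_B": ["paper_2"],
    "paper_1": ["paper_3"],
    "paper_2": [],
    "paper_3": [],
}

def paper_citation_degrees(patent, citations):
    """Breadth-first search from a patent; return {paper: degree}, where
    degree is the length of the shortest citation chain to that paper."""
    degrees = {}
    visited = {patent}
    queue = deque([(patent, 0)])
    while queue:
        node, depth = queue.popleft()
        for cited in citations.get(node, []):
            if cited in visited:
                continue
            visited.add(cited)
            if cited.startswith("paper"):
                degrees[cited] = depth + 1
            queue.append((cited, depth + 1))
    return degrees

print(paper_citation_degrees("patent_A", citations))
# {'paper_1': 1, 'paper_2': 2, 'paper_3': 2}
```

Run over a real prior-art corpus, counting how many papers appear at degree ≤ 1, ≤ 2 and ≤ 3 would yield figures analogous to the 10%, 42% and 74% reported above.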

Other, less tangible contributions from academic research and industry take place through open access to data, reagents and knowledge[6]. Examples of these are The Cancer Genome Atlas (TCGA), with genomic data from more than 11,000 patients; the Jackson Laboratory (JAX) collection and distribution of mouse strains modeling human diseases; and the Addgene repository, with a collection of more than 67,000 plasmids. Similarly, collaboration agreements like CRADAs (Cooperative Research and Development Agreements) allow industry to partner with academic labs[7]. Under such agreements, which can last years, researchers from academic labs and companies can engage in joint ventures by providing each other with resources, skills and funds. In these partnerships, the ownership of any resulting intellectual property is discussed upfront, as are first-option rights for licensing. Such collaboration formulas have a positive impact on the market readiness of the technologies developed, and can directly shorten the pathway to market through the same industrial partner. There are also specific agreements allowing for joint clinical trials, particularly for rare diseases, or for the transfer of research materials.

Overall, this illustrates that, with the right policy measures, public investment can be used to generate innovation and economic growth. Contrary to the belief that technological and scientific advances move independently, there is a well-connected flow of ideas between patented inventions and scientific articles. There are already good incentives for the research community to facilitate collaboration between academia and industry. However, there is still room for novel policies to further leverage what can be achieved through public investment in research.



[3]GAO/RCED-98-126 Transferring Federal Technology. Page 3.

[4]The Economic Contribution of  University/Nonprofit  Inventions in the United  States: 1996-2015. Biotechnology Innovation Organization and the Association of University Technology Managers

[5]Ahmadpoor M, Jones BF. “The dual frontier: Patented inventions and prior scientific advance”. Science. 2017 Aug 11;357(6351):583-587.

[6]Bubela T, FitzGerald GA, Gold ER. Recalibrating intellectual property rights to enhance translational research collaborations. Sci Transl Med. 2012 Feb 22;4(122).

[7]Ben-Menachem G, Ferguson SM, Balakrishnan K. Beyond Patents and Royalties: Perception and Reality of Doing Business with the NIH. J Biolaw Bus. 2006 Jan 1;24(1):17-20.


Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

November 28, 2018 at 10:41 am

Science Policy Around the Web – August 16, 2018

leave a comment »

By: Sarah Hawes, Ph.D.


source: pixabay

Scientific Discoveries

Ambitious ‘Human Cell Atlas’ Aims To Catalog Every Type Of Cell In The Body

You’ve no doubt heard of the Human Genome Project – a 15-year-long endeavor to sequence the entire human genome, fueled by $3 billion and completed in 2003. But have you heard of the Human Cell Atlas Consortium? MIT biology professor and Broad Institute member Aviv Regev has led this collective of hundreds of international scientists since its founding in 2016.

Just as the intent of the Human Genome Project was to sequence all our genetic material, the intent of the Consortium is to identify every single type of cell in the human body so that they can be investigated and understood independently, and in the complex context of cell-cell interactions, and their role in our physiology. This is a huge endeavor. We currently lack even a rough estimate of how many cell types exist. “People guess anything from the thousands to the tens of thousands. I’m not guessing,” Regev says. “I would rather actually get the measurements done and have a precise answer.”

And how does one delineate cell types in a meaningful way, to take these measurements? This is done by determining which genetic material a cell activates, i.e. which genes it mobilizes from quietly bundled DNA to RNA, and from there into a meaningful protein product to carry out the particular cellular functions defining a heart cell, or a lung cell, or a glial cell in the brain. The technology enabling description of a cell’s genetic character is called single-cell RNA sequencing.
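As a rough caricature, assigning a cell a “type” from its expression profile can be sketched in a few lines of Python. The genes are real marker genes (MYH6 for heart muscle, SFTPC for lung, GFAP for glia), but the counts and the single-marker rule are invented simplifications; real single-cell RNA-seq pipelines cluster cells across thousands of genes:

```python
# Each cell: counts of transcripts detected for a few marker genes
cells = {
    "cell1": {"MYH6": 90, "SFTPC": 1, "GFAP": 0},   # heart-like profile
    "cell2": {"MYH6": 85, "SFTPC": 0, "GFAP": 2},   # heart-like profile
    "cell3": {"MYH6": 0,  "SFTPC": 80, "GFAP": 1},  # lung-like profile
}

def dominant_marker(profile):
    """Crudely label a cell by its most highly expressed marker gene."""
    return max(profile, key=profile.get)

types = {name: dominant_marker(profile) for name, profile in cells.items()}
print(types)  # {'cell1': 'MYH6', 'cell2': 'MYH6', 'cell3': 'SFTPC'}
```

Cells 1 and 2 group together as one type despite slightly different counts, which is the essence of defining cell types from expression rather than appearance.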

In 2014, Aviv Regev, together with Steve McCarroll and David Weitz at Harvard, improved on this process to substantially speed it up. “All of a sudden, we moved from something that was very laborious and we could do maybe a few dozen or a few hundred, to something where we could do many, many thousands in a 15- to 20-minute experiment,” Regev says. “We said, ‘That’s at the right scale that we could actually do the human body.’” And this is what they have set out to do.

Today, after just two years and $200 million in NIH funding, the Human Cell Atlas Consortium is beginning to bear fruit.

The first major Human Cell Atlas finding was published this month and simultaneously confirmed by a separate lab at Harvard Medical School. Both papers were published in the journal Nature, and report the identification of a new cell type in the windpipe that is responsible for producing a faulty protein linked to cystic fibrosis. Previously it was believed that the faulty protein originated in common cells lining the windpipe. Discovery of the new and rarer cells, dubbed ‘pulmonary ionocytes,’ will let scientists target faulty protein production directly, thereby speeding up the development of treatments for cystic fibrosis.

A second consortium-related discovery appeared last week in the journal Science, in which British consortium members published the finding that childhood kidney cancer begins in a cell type which is distinct from the cells giving rise to kidney cancer in adults.

“We knew the lessons from the Human Genome Project were [that] rallying together the entire community would really let you get a full answer to a question. And that full answer will empower everyone to do better and faster and higher-resolution biology,” says Regev. Together, cataloguing all human cell types is predicted to take just another five to ten years. Discoveries building off this catalogue will dramatically enhance medical progress globally, and in perpetuity.

(Karen Weintraub, NPR)

Science and Innovation

It’s ‘Shark Tank’ For Global Health Inventions

A fascinating program to bolster innovation in global health, and particularly for impoverished mothers and children, is coming from a union between government agencies in the US, Norway, Korea, and the UK together with the Bill & Melinda Gates Foundation. This program, called Saving Lives At Birth: A Grand Challenge For Development, hosted a special conference in Washington, D.C. last week to empower global health inventors with the gift of pitch.

Nearly 500 applicants were winnowed down to ten participants, each of whom presented a global health innovation before a panel of judges rating them not on their science, but on their business plan – including target market, competition and revenue model.

Prior to presenting, each participant worked with consultants to enhance their slides and speeches. Rachele Haber-Thomson, one of the consultants, explained that many participants came from academia or non-profits. She said that while they knew how to write a grant, they needed help to capture the attention of investors looking for compact, business-savvy strategies.

Presentations were compact, seven-minute pitches on practical advances – describing a health problem, a solution, and convincing details on its implementation – all before taking questions from judges on feasibility, marketing, etc. The advances presented included an oxygen concentrator powered by running water, a medicine pouch to prevent transmission of HIV from mother to infant, a new subcutaneous contraceptive, and a system to generate disinfectant from salt and water. The winning pitch came from Gradian Health Systems, and was for a network of mobile medical training centers to enhance quality of care in areas with few physicians.

The prize for a winning pitch was not funding (though the audience included potential backers). Each participant had already been awarded $250,000 to $2 million from Saving Lives At Birth to develop their ideas prior to the conference. Instead of direct funding, the winner was given a choice between competing in a similar sales-pitch show-down in Berlin or else receiving free business-consulting.

As Sofia Stafford of USAID and Saving Lives At Birth puts it, “if these global health projects want to scale up, they need to know how to communicate their vision, grab investors’ interest and attract more funds.” Bearing this in mind, the July conference invokes the proverb “give a man a fish and he’ll eat for a day; teach a man to fish and he’ll eat for a lifetime.” Saving Lives At Birth does both.

(Vicky Hallett, NPR)

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

August 16, 2018 at 4:31 pm

Science Policy Around the Web – July 31, 2018

leave a comment »

By: Patrice J. Persad, PhD


source: pixabay

Science and Society

The ethics of computer science: this researcher has a controversial proposal

It is only natural for a well-intentioned computer scientist to be optimistic about the societal implications of his or her discoveries or findings. Unfortunately, this naivety, or lack of foresight, regarding the secondary uses and repercussions of computer applications in everyday life can be damaging. As illustrations of unpremeditated consequences, automated tasks based on machine-learning algorithms may be time-efficient but displace millions of workers from their jobs. Also, seemingly unlimited data storage capabilities and powerful graphics processing units (GPUs) permit building prediction models of consumers’ behavior. This unrestricted data access and use can infringe on individuals’ privacy and call into question the voluntary nature of the consent process.

In order to magnify the importance of all computer applications’—notably, artificial intelligence’s (AI’s)—shortcomings in relation to society, Dr. Brent Hecht of Northwestern University has a plan. Instead of lauding their findings’ positive influences on society, computer science researchers must disclose negative implications of their research in publications and other press-related media.

The Future of Computing Academy (FCA), which Hecht oversees and which is a branch of the Association for Computing Machinery (ACM), promotes this duty of negative-impact disclosure during the peer review process. Motivation for such a proposal stems from fostering accountability of researchers to the general public; this emphasizes the computer scientist’s role not as a mindless mass producer but as a mindful protector of the public’s welfare. Acknowledging the downsides of applications encourages discussing and implementing solutions. This deepening of accountability also revitalizes the public’s trust in the computer science community. As expressed by Hecht, here is what fellow computer scientists, as authors and peer reviewers, can do right now to contribute to these efforts of recognizing negative societal impacts:

  1. As an author, include a section entitled “Broader Impacts” or “Societal Impacts,” which discloses negative impacts in addition to positive impacts. Readers are not expecting the authors to be seers; in the context of pre-existing literature, discussing secondary uses with possible dastardly effects on citizens should be a start (if not sufficient).
  2. As a peer reviewer, outright ask, if unlisted in the submission, “What are the work’s negative societal impacts?” Stress that disclosing such information will not warrant rejection of the manuscript. (On the other hand, if negative impacts outweigh positive ones, funding agencies can use their discretion in supporting projects.)
  3. When communicating with the press, remember to mention negative societal impacts, and be prepared to address relevant questions/comments.

(Elizabeth Gibney, Nature)


Did a study of Indonesian people who spend most of their days under water violate ethical rules?

At the heart of any study involving human subjects, the potential for an ethical dilemma to arise is strong in the face of unclear and/or inaccessible research policies and regulations. Or, to put it bluntly, a question torments the researcher when ethical matters cross over into legal waters: “Will I go to jail if I unknowingly breach research protocol (even if that protocol is under debate or revision)?” The ethical dilemma is especially imminent when the principal investigators are foreign and from developed countries, but the proposed study’s focus is on indigenous populations in developing nations. Consider the research presented in the April 2018 Cell article “Physiological and Genetic Adaptations to Diving in Sea Nomads” by Dr. Melissa A. Ilardo and colleagues. The investigation’s results demonstrated that genetic variation in PDE10A is associated with a larger spleen size in the Bajau people, Indonesian “Sea Nomads” who have practiced extreme breath-hold diving for over a thousand years. The Ministry of Research, Technology and Higher Education (RISTEK) in Indonesia granted the team a permit to pursue the study. However, the bona fide ethical conflict stems from:

  1. local organizations’ claims that the team did not receive approval from at least one Indonesian research ethics commission/committee (see Council for Internal Organizations of Medical Science, CIOMS, guidelines).
  2. failure to procure approval from the Indonesian National Institute of Health Research and Development to transport human DNA samples out of Indonesia.
  3. lack of research involvement on the part of Indonesian scientists, especially geneticists.
  4. inadequate presentation of overall research results to study populations, including the Bajau, before publication.

In defense of Ilardo and colleagues, supporters point out that the Indonesian government has not reprimanded any team members for their research indiscretions, and Cell finds no issues with the documents the group provided from said government. As for engaging more with Indonesian scientists on local research projects, Ilardo’s unanswered e-mails to several local professionals prior to data and specimen collection are evidence that involvement was attempted. In hindsight (or perhaps coincidence), RISTEK in early July launched an online portal where foreign researchers can easily access all protocols/documentation for permits.

Foreign researchers are urged to realize that these presented ethical concerns—among them, governmental/national organizations’ approval, or consent, and transfer of biological specimens out of developing countries—are not trivial. Scientists should not be alarmed at just the prospects of jail time. Research cooperation with other nations’ institutions/entities can impact international relations between nations and local denizens’ trust in foreign researchers. Both international relations and trust influence the success of future research endeavors in developing and other nations.

(Dyna Rochmyaningsih, Science)

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

July 31, 2018 at 4:56 pm

Science Policy Around the Web – July 20, 2018

leave a comment »

By: Mohor Sengupta, PhD


source: Max Pixel


3-D Color X-Rays Could Help Spot Deadly Disease Without Surgery

Traditional X-ray or CT (computerized tomography) scanners pass X-ray beams through the body and detect the transmitted radiation. Dense tissues, which absorb more of the X-rays, appear white, while softer tissues, which transmit most of the beam, appear darker. Dr. Anthony Butler of the University of Otago in New Zealand, along with his father Phil Butler, has made a crucial breakthrough in this imaging technique. In building their scanner, they applied the principle of a pixel-detecting tool used in the Large Hadron Collider at CERN. Their tool records how the X-rays change as they pass through different materials and assigns each measurement a pixel (the smallest individual unit of a digital image) of a certain color. This colored pixel identifies the material the X-ray beam has passed through. For example, if the X-ray passes through bone, the calcium atoms in the bone will alter the beam, which will then be recorded as a certain color, say pink, different from the color assigned if the beam had passed through another tissue. The tool then translates this data into a 3D color image.
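The color-assignment idea can be illustrated with a toy sketch. The energy bins, color choices and tissue assignments below are invented for illustration; the real scanner’s photon-counting detector bins individual photons by measured energy at each pixel:

```python
# Hypothetical energy-binning sketch: each detector pixel reports the energy
# (in keV) of the photons it counted; we map energy ranges to display colors,
# mimicking how spectral ("color") CT distinguishes materials.
ENERGY_BINS = [
    (0, 30, "red"),      # e.g. soft tissue / fat range (invented bin edges)
    (30, 60, "green"),   # e.g. contrast agent range
    (60, 120, "pink"),   # e.g. calcium-in-bone range
]

def pixel_color(energy_kev):
    """Return the display color for a measured photon energy."""
    for lo, hi, color in ENERGY_BINS:
        if lo <= energy_kev < hi:
            return color
    return "black"  # outside the detector's calibrated range

detector_readout = [[25.0, 70.5], [45.2, 10.1]]  # toy 2x2 pixel grid
image = [[pixel_color(e) for e in row] for row in detector_readout]
print(image)  # [['red', 'pink'], ['green', 'red']]
```

Stacking many such 2D energy-colored slices is what lets the system assemble a 3D color volume rather than a single grayscale projection.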

This imaging technique can provide very high-resolution images of tissues without any invasive procedures. Its developers have detected minute details of various tissues – cartilage, bone, adipose tissue – in scanned ankles and wrists, and they plan to scan the entire human body eventually. Since this imager can take pictures of areas deep inside the body, it will hopefully simplify the diagnosis of many hard-to-detect medical issues, like cancer, heart abnormalities and blood disorders. “It’s about being able to first find the explanation for somebody’s symptoms, like a tumor, and then find the best way to reach it with the least amount of detours and misadventures,” said Dr. Gary E. Friedlaender, an orthopedic surgeon at Yale University.

Aurélie Pezous is a knowledge transfer officer at CERN. She promotes outside uses of research techniques developed by the organization. Of the recent applications of the pixel detecting tool in medicine, she said, “This is the beauty of it: Technology that was first intended for the field of high-energy physics is being used to improve society. It’s very exciting for CERN”.

In the coming months, clinical trials will enroll orthopedic and rheumatology patients to test the novelties of the 3D color X-ray scanner.

(Emily Baumgaertner, New York Times)

Drug pricing

Trump administration to explore allowing drug imports to counter price hikes

In February of last year, Bernie Sanders, along with many of his Democratic colleagues, introduced legislation in the House and Senate to allow drug importation from Canada, to rein in rising drug prices in the United States. Drugs are cheaper in many countries because of government regulations on pricing. Since that legislation, the idea has been championed by Sanders and Trump alike.

Steep and regular drug price hikes plague American consumers constantly. This is particularly true for off-patent drugs produced by a single manufacturer. The case of Martin Shkreli, who became infamous for hiking the price of Daraprim, a drug used by AIDS patients, to 5,000 percent of its original price after his company Turing Pharmaceuticals acquired its manufacturing rights in 2015, was cited as an example of blatant abuse of the current system. Alex Azar, secretary of Health and Human Services, has suggested that in such situations an effective solution could be to import drugs from a reliable foreign source, effectively introducing competition into the local manufacturing arena and curbing prices within the United States in the process. As federal law on drug importation currently stands, it is illegal to import foreign-approved medicines except to meet shortages in supply, something that happened after the hurricanes in Puerto Rico last year. FDA commissioner Scott Gottlieb has likened steep price hikes to drug shortages, as they create similar public health consequences for consumers, and he believes that temporary importation of foreign-approved drugs could be helpful, at least until competition resumes and prices are brought down.

Yesterday, Gottlieb criticized makers of high-priced medicines for stalling the manufacture and availability of low-priced alternative versions of the same compounds. The focus of the federal government here is the importation of medically necessary drugs approved in other countries as a reasonable substitute for the FDA-approved version in the USA. If import legalization comes about, it will be a major disappointment for domestic pharmaceutical companies, which, along with many Republicans, have strongly opposed the move.

Shortly after the legislation introduced by Sanders in 2017, four former FDA commissioners issued an open letter to members of Congress, citing the dangers of exposing Americans to imported drugs that have not undergone the established standards of scrutiny the FDA has in place for American-made drugs. Despite the criticisms, the proposed importation of foreign-approved drugs seems to be encouraged by the federal government, which is good news for many consumers and a blow to the monopoly of local drug-makers.

(Laurie McGinley, The Washington Post)

Have an interesting science policy link? Share it in the comments!

Written by sciencepolicyforall

July 20, 2018 at 4:01 pm

Science Policy Around the Web – April 27, 2018

leave a comment »

By: Michael Tennekoon, PhD


source: pixabay

Productivity of Science

Is Science Hitting a Wall?, Part 1

Scientific research is hitting a wall – that’s the view from a recent study published by four economists. Upholding Moore’s law, the famous observation that the density of computer chips doubles every two years, now takes 18 times as many researchers as it once did. This pattern extends to other areas of research as well. For example, in medicine, “the numbers of new drugs approved per billion U.S. dollars spent on R&D has halved every 9 years since 1950”. In general, while research teams appear to be getting bigger, the number of patents produced per researcher has declined. Alarmingly, critics argue that some fields may even be regressing – for example, the over-treatment of psychiatric and cancer patients may have caused more harm than good.

But why would science be hitting a wall? One major factor could be the reproducibility crisis – the problem that many peer-reviewed claims cannot be replicated, calling into question the validity of the original research findings. Researchers suggest that intense competition for funding and jobs has created pressure to conduct innovative “high risk” research in as short a time as possible. While this type of research can gain plenty of press, it often lacks the scientific rigor that ensures findings are reliable. However, the perceived slow-down in research productivity could also be a result of the natural advancement of science – the “low-hanging fruit” problem. Said another way, most of the easier problems have already been solved, leaving only problems that require vast scientific resources to solve.

On the other hand, researchers in some fields can rightfully push back and argue that scientific progress is not stalling but in fact accelerating. Technologies such as CRISPR and optogenetics, for example, have produced a multitude of new findings, particularly in neuroscience and genetics research. It must be noted, however, that even with these new technologies, the end products reaching general society remain relatively disappointing.

These concerns raise tough questions about how scientific research should move forward. Given funding limitations, how much do we, as a society, value 'pure science', the effort to understand rather than manipulate nature? Scientific curiosity aside, in purely economic terms, is it worth testing the out-of-Africa hypothesis of human origins, or sending humans to other planets? Is it worth investing in the latest innovative technology if it produces new findings with limited applicability to human health? Scientists and society at large must be open to weighing the costs and benefits of scientific enterprises and deciding which avenues of research are worth pursuing.

(John Horgan, Scientific American)

Vaccine Ethics

The vaccine dilemma: how experts weigh the benefits for many against risks for a few

Cost-benefit analysis. Sure, it's easy when you're on an Amazon shopping spree. But what about when millions of lives are at stake? And what if those millions of lives belong to children, who cannot give informed consent? Not so easy anymore, but that is the job of the World Health Organization's Strategic Advisory Group of Experts (SAGE), which last week decided to scale back the use of a new vaccine against dengue.

Two years ago, SAGE concluded the vaccine was safe to use in children in places with high dengue infection rates, despite theoretical concerns that it might increase the risk of severe dengue in some children. Towards the end of last year, the vaccine's manufacturer, Sanofi Pasteur, released new data validating those concerns. How did the numbers stack up? It was estimated that in a population where 70% of individuals had had dengue at least once, the vaccine would prevent seven times as many hospitalizations as it caused; if 85% had had dengue, that ratio rose to 18 to 1. Even those numbers were deemed not worth the risk.

What goes into making these decisions?

One factor is the prevalence of the disease. The oral polio vaccine, for example, could prevent millions of children from becoming paralyzed, but it could also cause paralysis in rare cases. In the 1950s and 1960s, when polio was highly prevalent, it made sense to recommend this vaccine; as polio became nearly non-existent towards the end of the 20th century, using the oral vaccine was no longer prudent.

However, dengue is still rampant in today’s world, so what is different in this case?

Public perception. The modern world is highly litigious and has access to a wide variety of information, both factual and fake. This has bred a skepticism of science in which negative press for one vaccine can cause collateral damage to many others, unlike a few decades ago. In the 1950s, for example, it was discovered that children had been given a polio vaccine that mistakenly contained live virus. The incident, traced to Cutter Laboratories, paralyzed 51 children in the US and killed five. Yet polio vaccinations resumed, and polio was virtually eradicated. RotaShield, a vaccine against rotavirus (a virus that causes severe, sometimes fatal, diarrhea in young children), had a very different experience. Approved in 1998, it was suspended a year later after the CDC estimated that for every 10,000 children vaccinated, an extra one or two would develop intussusception (a type of bowel blockage) over the background rate. Although in developing countries the number of lives saved would have far exceeded the extra cases of intussusception, the vaccine was still suspended. A safer rotavirus vaccine only reached the market in 2006; in the intervening years, an estimated 3 million children died from rotavirus infections. (Note: the risk of rotavirus infection persists even when the vaccine is given, but at a far lower rate.)
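The trade-off regulators faced can be sketched as simple expected-value arithmetic. In the sketch below, only the one-to-two-extra-cases-per-10,000 figure comes from the CDC estimate cited above; the cohort size and the per-child rotavirus mortality risk are made-up placeholders for illustration, not measured rates.

```python
# Illustrative expected-value comparison behind the RotaShield decision.
# Only the intussusception rate is from the article; the rest is assumed.

cohort = 1_000_000                    # hypothetical number of vaccinated children
extra_intussusception_per_10k = 1.5   # CDC estimate: 1-2 extra cases per 10,000

extra_cases = cohort / 10_000 * extra_intussusception_per_10k
print(f"Expected extra intussusception cases: {extra_cases:.0f}")

# In high-mortality settings the same cohort would avert far more deaths.
# This per-child mortality risk is a placeholder chosen for illustration.
rotavirus_death_risk = 1 / 500
deaths_averted = cohort * rotavirus_death_risk
print(f"Deaths potentially averted (assumed risk): {deaths_averted:.0f}")
```

Under these assumed numbers, the vaccine would cause on the order of 150 extra bowel blockages per million children while averting thousands of deaths, which is why critics argued the suspension cost far more lives in developing countries than it saved.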

Given the tremendously difficult decisions involved in implementing vaccines, and the impact that public perception can have on those decisions, society has a responsibility to become better informed about the potential benefits and drawbacks of vaccines and to actively tease apart fact from fiction.

(Helen Branswell, STAT)


April 27, 2018 at 3:26 pm

Hold the Mayo: Supreme Court Ruling Blocks Patent Protection for Important Medical Diagnostics


By: Jon Nye, Ph.D.


Source: George Hodan

In his 2015 State of the Union address, President Obama announced the Precision Medicine Initiative. The study will follow 1 million or more volunteers over a long period at a cost of $215 million. The goal of this ambitious initiative is to fundamentally change the way we diagnose and treat patients by moving from a "one-size-fits-all" approach to one that tailors disease prevention and treatment to each individual, factoring in differences such as genetic makeup, lifestyle, and environment. As we move into the era of precision medicine and gain a better understanding of the complex mechanisms underlying disease, we will need diagnostic tests that let caregivers identify the specific causes of each patient's disease and select an appropriate treatment. Although the future market for diagnostic tests looks bright, recent Supreme Court rulings that prevent companies from obtaining patent protection threaten to hamper the development of these important tools and may adversely affect patient care.

Mayo Collaborative Services v. Prometheus Laboratories

The purpose of the patent system is to encourage research and innovation by rewarding inventors with a temporary government-granted monopoly. The system has existed since the founding of the country and stems from Article I of the Constitution, which states, "The Congress shall have power … To promote the progress of science and useful arts, by securing for limited times to authors and inventors the exclusive right to their respective writings and discoveries". In the medical diagnostics field, however, recent court rulings have dramatically narrowed what is considered patent eligible, most notably the 2012 Supreme Court case Mayo Collaborative Services v. Prometheus Laboratories. The case centered on a method for determining the proper dose of a specific drug by measuring the levels of a drug metabolite in patients' blood. The test provided a way to individualize dosing so that doctors could maximize the drug's effectiveness while minimizing the side effects of receiving too much. In a unanimous decision, the Supreme Court ruled the patent invalid on the ground that the test stated a "law of nature":

“Prometheus’ patents set forth laws of nature – namely, relationships between concentrations of certain metabolites in the blood and the likelihood that a dosage of a thiopurine drug will prove ineffective or cause harm.”

In other words, any patent claim based on a test that measures compounds, metabolites, or any other disease-specific marker was now patent ineligible simply because it measures processes that occur in the human body. This extremely broad definition of a natural law led to the widespread rejection of most diagnostic-test claims, now referred to as a Mayo rejection.

Aftermath of the Mayo Decision

A recent study analyzed the effect the Mayo decision has had on the medical diagnostic patent landscape in the United States. The authors examined 31 patent applications containing diagnostic or prognostic claims that were filed in both the US and the European Union, revealing a huge discrepancy between the two systems. In the EU, which has no legal equivalent of the Mayo rejection, 30 of the 31 applications had either been granted or were still pending. In the US, by contrast, 29 of the 31 had been abandoned or were still pending after receiving a Mayo rejection. The study highlights the profound, and possibly unintended, consequences of the Supreme Court's Mayo ruling for medical diagnostic patent eligibility.

Strikingly, recent federal court rulings have supported the notion that, in the wake of Mayo, current guidelines are too restrictive and bar even well-deserving new diagnostics from patent approval. Most notable was the recent case Ariosa Diagnostics, Inc. v. Sequenom, Inc., which involved a patent on a new technique for non-invasive genetic testing of a fetus. Sequenom's diagnostic relied on two novel findings. First, the test used fetal DNA circulating in the mother's blood, material previously discarded as useless. Second, the company developed a method to selectively amplify the fetus's DNA apart from the mother's by focusing on the paternal DNA contributions. Although the Royal Society in the United Kingdom called the technique "a paradigm shift in non-invasive prenatal diagnosis", multiple US courts ruled it patent ineligible based on Mayo. These rulings were unanimous, and the judges remarked that the Supreme Court's guidelines tied their hands, even though some believed the test merited a patent. Judge Linn of the Federal Circuit wrote:

“This case represents the consequence—perhaps unintended—of that broad language in excluding a meritorious invention from the patent protection it deserves and should have been entitled to retain.”

These lower court rulings left many hopeful that the Sequenom case would reach the Supreme Court and allow it to refine the broad language of Mayo. However, the Court declined to hear the case later that year, signaling that a solution was unlikely to come from the courts. Instead, Congress would have to act.

A Path Forward

In the wake of the Sequenom case, it was clear to many that the only way to change the restrictive patent guidelines in the medical diagnostics field was to pass legislation superseding the Supreme Court ruling. To build momentum and start a discussion about what such a bill would entail, a conference was held at Berkeley, attended by leading industry experts, scholars, policymakers, and a retired jurist. The participants agreed that the current patent guidelines were inhibiting research and development in the diagnostics field by undercutting incentives; at least in the biosciences, the current system no longer fulfills the patent system's original purpose, "To promote the Progress of Science and useful Arts". The consensus was that future legislation should expand patent eligibility to include conventional applications of scientific discovery, a framework consistent with current EU guidelines and with the spirit and intent of the patent system outlined in the Constitution. With the continued focus on tailored, individualized treatment of patients, it will be important to promote research and development in this area. Changing the current guidelines should therefore be a priority before they negatively affect patient health.


March 1, 2018 at 7:45 pm