In 2015, Forbes magazine proclaimed Elizabeth Holmes “the world’s youngest self-made woman billionaire.”[1: “The World’s Billionaires: #435 Elizabeth Holmes,” Forbes online, http://www.forbes.com/profile/elizabeth-holmes/, accessed August 22, 2016.] About a decade earlier, when she was nineteen, she dropped out of Stanford to transform the medical industry by introducing a faster, cheaper kind of blood test requiring only a finger stick rather than a needle jab in the vein.[2: Ken Auletta, “Blood, Simpler,” The New Yorker, December 15, 2014, http://www.newyorker.com/magazine/2014/12/15/blood-simpler.] The company she founded to produce and administer this technology, Theranos, raised more than $400 million in venture capital and came to be valued at an estimated $9 billion.[3: Abigail Stevenson, “World’s Youngest Female Billionaire—Next Steve Jobs?” CNBC online, September 23, 2015, http://www.cnbc.com/2015/09/23/worlds-youngest-female-billionaire-next-steve-jobs.html.] Here was a classic Silicon Valley and biotech success story: By breaking free of old institutions, a young genius could design a new device that would radically disrupt the medical testing industry. Or so the story briefly ran.
Reality was less exciting. Holmes, it seems, had exploited tremendous hype and her father’s extensive business connections to amass financial backing for a technology that never demonstrably worked.[4: The first investigative report was John Carreyrou, “Hot Startup Theranos Has Struggled with Its Blood-Test Technology,” Wall Street Journal, October 16, 2015.] Her fall from grace was swift. Forbes revised its estimate of her net worth from $4.5 billion to zero, and in July 2016 the federal Centers for Medicare and Medicaid Services banned her from operating blood-testing facilities for two years.[5: John Carreyrou, Michael Siconolfi, and Christopher Weaver, “Theranos Dealt Sharp Blow as Elizabeth Holmes Banned from Operating Labs,” Wall Street Journal, July 8, 2016, http://www.wsj.com/articles/u-s-regulator-bans-theranos-ceo-elizabeth-holmes-from-operating-labs-for-two-years-1467956064.] [Editor’s Note: In July 2018, she was indicted for criminal fraud.]
But even if Theranos had delivered on its promises, what would its product really have amounted to? Holmes promised a technology that would transform health care and even the relationship people have to their own bodies. But once you set aside Theranos’s self-generated mythology, all that its founder seems to have attempted was to make one area of the already successful and profitable field of medical testing a little bit cheaper and easier for patients. If you didn’t like having your blood drawn from a vein, Theranos promised, you would never again have to feel the sting of a hypodermic needle.
In trying to turn humdrum research into a transformative medical disruption, Theranos was not unusual: Early biotech successes such as Genentech’s synthetic human insulin also followed this model, providing nothing more than a high-tech alternative to existing treatments.[6: For an account of the early biotechnology industry, see Nicolas Rasmussen, Gene Jockeys: Life Science and the Rise of Biotech Enterprise (Baltimore, MD: Johns Hopkins University Press, 2014).] But it’s not even clear in Theranos’s case that a cheaper testing technology would have benefited public health. Contemporary health care, as many have noted, already suffers from excessive testing and subsequent overtreatment.
Dreaming of a Bioeconomy
What kind of knowledge should scientists pursue, and how should they pursue it? The answer to that question has changed. Today, research and development is no longer driven by the older scientific urge to arrive at better understandings of the world. Nor is it driven by the free-ranging curiosity of the research scientist that not long ago shaped the investigative agendas in government- and corporation-funded research settings such as the storied Bell Labs.
Instead, research now responds strictly and immediately to the demands of the competitive market. It aims to produce patentable knowledge and products rather than revolutionary conceptual elaborations. And it must do so quickly, because venture capital can dry up, competitors may trump one’s research at any moment, and only the first to the patent can profit. Under such conditions, researchers are unlikely to make groundbreaking discoveries about the workings of the natural world.
For the last forty years, policymakers and business leaders have striven to make biological research and biotechnology an engine of economic growth, motivated by the hope of transforming the US economy into a “bioeconomy.”[7: For more on the ethics of the bioeconomy and entrepreneurial science in general, see my article “Knowledge and the Scientist-Entrepreneur,” Pro Ecclesia 24 (2015): 308–25.] To support that dream, Congress and the courts strengthened patents and extended patent protection to new entities such as genes and genetically modified organisms. Treaties expanded the reach of US intellectual property law around the globe. The Bayh-Dole Act of 1980 allowed universities and other recipients of federal research funding to patent the resulting inventions. Universities faced with decreasing government funding encouraged faculty to pursue patents and even to start new companies.
Government agencies and venture capitalists have now invested billions of dollars of research money in the dream of a bioeconomy, hoping eventually to profit from the cures and new businesses such an economy might produce. Yet while there have been obvious developments in biomedical research during the last forty years, these decades have seen a relative decline in research productivity as measured by therapies, patents, and publications. Only 15 percent of the drugs approved by the Food and Drug Administration during the 1990s were deemed real innovations, the others being reformulations or copies of existing therapies. Philip Mirowski, an economist and historian of science, has found that the number of new molecular entities (genuinely new drugs) approved each year has declined since the mid-1990s, from fifty-three in 1996 to nineteen in 2009.[8: Philip Mirowski, Science-Mart (Cambridge, MA: Harvard University Press, 2011). For an early argument on these trends, see Paul Nightingale and Paul Martin, “The Myth of the Biotech Revolution,” Trends in Biotechnology 22, no. 11 (2004): 564–69.] More ominously, the competitiveness and constraints that have resulted from market and government incentives are threatening the future of fundamental research by driving away young scientists. Today, only 59 percent of recipients of PhDs in biomedical fields are employed in fields related to biomedical research.[9: National Institutes of Health, Biomedical Research Workforce Working Group Report (Bethesda, MD: National Institutes of Health, 2012).]
Time and freedom are essential to fundamental science, transformative science, and science that, not incidentally, eventually has far-reaching practical consequences.[10: For a similar account of the problems facing science, but a very different explanation of their causes and possible solutions, see Daniel Sarewitz, “Saving Science,” The New Atlantis 49 (Spring/Summer 2016): 4–40. Sarewitz supports intrusive management of science and an increased focus on technical application. While he is correct that bureaucratically organized efforts to manage science, like those of the Department of Defense, are very effective at bringing about technical applications of science like the atomic bomb, he does not acknowledge that these technical applications are based on theoretical insights generally arising from curiosity-driven research.] It takes years, sometimes a decade or more, for new thoughts to percolate. And even when a scientist makes a breakthrough, years may pass before his or her work becomes socially useful or economically profitable. Insights come from unexpected quarters, often, as the great sociologist of science Robert Merton made clear, as a result of serendipity. Even in a field as far removed from pure science as the development of biological tools, the new CRISPR/Cas9 gene editing technology was made possible by research on microbes from salt marshes and the ways bacteria fight viruses—hardly the best-funded or most commercially promising areas of biological concern. Scientists need the time and security to investigate questions that may lead to dead ends or, alternatively, yield new understandings of the world. This rarely happens if their research is driven by short-term responses to market demands.
The focus of research did not shift in a cultural or social vacuum. As a society, we have moved from valuing knowledge as understanding to placing a higher priority on knowledge as a producer of technology and practical results.[11: There have been many commentators on science in the past, beginning with Francis Bacon, who prized science primarily for its practical effects, but historian Paul Forman has vigorously argued that even in many of these cases, primacy was always given to scientific knowledge rather than technology. See Forman, “The Primacy of Science in Modernity, of Technology in Postmodernity, and of Ideology in the History of Technology,” History and Technology 23, nos. 1–2 (2007): 1–152.] This new emphasis has deep implications for the practice of science and may even portend the obsolescence of scientists themselves. It is not implausible, some say, that computer algorithms mining Big Data might soon replace human imagination and creativity in formulating and realizing market-driven research agendas.
More Initiatives, Fewer Findings
In a 2015 interview with an editor of the British biological journal Development, the geneticist Didier Stainier explained why he relocated his lab from the United States to the Max Planck Institute for Heart and Lung Research in Germany: “In the USA, and in other countries, the emphasis—and hence the funding—is increasingly being placed on translational research [which focuses on turning scientific findings into therapies, moving ‘from bench to bedside’]…. In order to keep running a lab in the USA investigating several different topics, I would have had to move to more translational work. I had a hard time imagining myself not being able to pursue the kind of basic research required for innovative translational work.”[12: Seema Grewal, “An Interview with Didier Stainier,” Development 142, no. 17 (2015): 2861–63.]
Unfortunately, government policy has helped create a research culture that focuses on the short-term production of tools rather than the long-term development of understanding—a competitive, entrepreneurial culture that extends even into the academy. The dangers of such a research ecology were foreseen fifty-five years ago by no less a figure than President Dwight D. Eisenhower, who, in his farewell address, warned of problems that might result from the “technological revolution” he saw aborning:
In this revolution, research has become central; it also becomes more formalized, complex, and costly. A steadily increasing share is conducted for, by, or at the direction of, the Federal government…. The free university, historically the fountainhead of free ideas and scientific discovery, has experienced a revolution in the conduct of research. Partly because of the huge costs involved, a government contract becomes virtually a substitute for intellectual curiosity. For every old blackboard there are now hundreds of new electronic computers. The prospect of domination of the nation’s scholars by Federal employment, project allocations, and the power of money is ever present—and is gravely to be regarded.[13: Dwight D. Eisenhower, “Farewell Address” (January 17, 1961), Eisenhower Archives, https://www.eisenhower.archives.gov/research/online_documents/farewell_address/1961_01_17_Press_Release.pdf.]
Federal funding is not in itself a threat to fundamental research. Even astute critics of government’s role in the management of science, such as the philosopher of science Michael Polanyi, see a role for the state in subsidizing research. The potential problems lie in the way in which government (or any other funder) influences the aims and practices of science, perhaps most perniciously through a growing fondness for short-term funding initiatives. President Barack Obama, for instance, has proposed the BRAIN Initiative (2013), the Precision Medicine Initiative (2015), and the Cancer Moonshot Task Force (2016). But he is only the latest in a long line of presidents who have proposed big projects of this type, including stem cell research, the Human Genome Project, and even the original War on Cancer of the 1970s. Such initiatives typically encourage researchers to look for immediate technological solutions and fixes, not deeper understandings of underlying processes and mechanisms.
The problems of this approach are compounded by the inconsistency of government funding practices. From time to time, science funding dramatically surges, as in the doubling of the National Institutes of Health budget from 1998 to 2003 or the research investment in the economic stimulus package of 2009. Such expansions typically lead universities to train more students, start new labs, and develop new centers. Then real investment in research suddenly decreases, and the labs and centers are left scrambling to find grants, while students desperately try to track down new jobs.
Since funding is aimed at specific initiatives rather than a broad portfolio of basic research, researchers have little choice but to respond to government incentives and enter the areas tied to those initiatives. The process engenders chronic cynicism toward grant writing, as young faculty strain to explain why their investigations of the cell cycle or embryonic development are essential to curing cancer. Such a model of research funding offers no security, leaving scientists demoralized and many of the best young researchers seeking other careers.
Even when scientists obtain government grants, they are effectively discouraged from engaging in deep theoretical exploration. With a few exceptions, government grants fund only two to five years of research and oblige the investigator to focus on the straightforward analysis of a hypothesis that is almost predictably confirmable. Anything outside the mainstream or overly ambitious faces a difficult struggle in a grant-review panel, especially if it is proposed by a young scientist at the peak of his or her creativity.
Scientists who rely on government grants must continually calibrate their research interests to the changing funding regimes. Usually this requires that they design projects with near-term goals promising immediate practical benefits. Recent congressional debates over National Science Foundation funding have produced proposals to require that every grant demonstrate a clear payoff for the national interest. Many fear this will only sharpen the emphasis on economic competitiveness at the expense of real scientific progress.[14: Paul Basken, “US House Backs New Bid to Require ‘National Interest’ Certification for NSF Grants,” Chronicle of Higher Education, February 11, 2016, http://chronicle.com/article/US-House-Backs-New-Bid-to/235275.]
The Disempowering of the Scientist
The National Institutes of Health and biotech start-ups are not the only funders in the world of biological science. Many companies and foundations are taking steps to mitigate the problems resulting from cautious, short-term funding. Following the example of German industrial cartels and Bell Labs, Google uses some of the profits from its near-monopoly on Internet advertising to fund long-term research at its Google X center and its biotech start-ups Verily and Calico. Yet even Google expects these research ventures to make money within a few years, offering employees bonuses to kill projects whose payoffs might be too distant.[15: Conor Dougherty, “They Promised Us Jet Packs. They Promised the Bosses Profit,” New York Times, July 23, 2016, http://www.nytimes.com/2016/07/24/technology/they-promised-us-jet-packs-they-promised-the-bosses-profit.html.] The Howard Hughes Medical Institute (HHMI), in Chevy Chase, Maryland, has also pursued a different strategy from that of the government, funding people instead of projects. In doing so, the HHMI has made it possible for scientists to follow their interests rather than cleave to detailed grant plans.
The HHMI has established the Janelia Research Campus, a facility in Ashburn, Virginia, which is meant “to recreate the close-knit feeling of legendary labs such as the Laboratory of Molecular Biology in Cambridge, UK, where well-funded investigators free of grant-seeking pressures work in small groups.”[16: Jocelyn Kaiser, “Janelia Farm to Recruit First Class,” Science 306, no. 5694 (2004): 210–11.] But even Janelia puts its primary focus on new technologies rather than basic research; even scientists freed from the constraints of a short-term grant remain imprisoned by the assumption that new technology is the goal.
The disempowering of the scientist by technology threatens to go even further. Some techno-enthusiasts suggest that technology will become powerful enough to interpret data and provide new conceptual resources on its own. Since the launch of the Human Genome Project in 1990, biomedical research has been devoted to gathering ever-larger amounts of information. Genomic researchers try to sequence as many individual genomes as they can. This practice has given rise to further fields of research, such as epigenomics and proteomics, which are also dedicated largely to gathering information rather than testing hypotheses. Despite the vast quantities of data these “omic” sciences have generated, so far only genomics has yielded an appreciable number of major insights. What is more troubling is that scientists don’t know what to do with the data or how to give it theoretical form.
Faced with this mass of uninterpreted data, the most enthusiastic promoters of the digital revolution argue that we should do away with the human element altogether.[17: Chris Anderson, “The End of Theory: The Data Deluge Makes the Scientific Method Obsolete,” Wired, June 23, 2008, http://www.wired.com/2008/06/pb-theory/.] If humans cannot process this much data, why not leave the job to computers? On this view, science can do without hypotheses, since, with enough data, programs should be able to identify patterns and detect correlations. Science can yield completely to technology.
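To see what the enthusiasts are proposing, consider a minimal, purely illustrative sketch in Python (my own construction, not any actual genomics pipeline): it generates random “measurements” containing no biological signal at all, then flags every pair of features whose correlation clears a conventional significance threshold. Even pure noise yields a flood of “discoveries,” which is one reason skeptics doubt that pattern detection alone can substitute for hypotheses.

# Hypothesis-free "pattern mining" on pure noise: a toy sketch, not a real pipeline.
# With enough features, naive correlation screening reports many spurious hits.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_samples, n_features = 50, 200                    # hypothetical study size
data = rng.normal(size=(n_samples, n_features))    # random data: no real biology

hits = []
for i in range(n_features):
    for j in range(i + 1, n_features):
        r, p = stats.pearsonr(data[:, i], data[:, j])
        if p < 0.05:                               # conventional threshold, no correction
            hits.append((i, j, round(r, 2)))

print(f"{len(hits)} 'significant' correlations found in data with no signal at all")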
Most working scientists, even those in bioinformatics, whose biological investigations are grounded in quantitative theory and method, reject such an extreme vision of autonomous Big Data, insisting that research will always require human creativity and insight. The most fervent rejection has been expressed, surprisingly, by the most reductionist of molecular biologists. Craig Venter, who launched a private-sector alternative to the government-sponsored Human Genome Project, has criticized the focus on emergent properties arising from “vast numbers of interacting chemical processes forming interconnected feedback cycles,” calling it “a new kind of vitalism,” the belief that life must be explained by a nonmaterial force.[18: J. Craig Venter, Life at the Speed of Light (New York, NY: Penguin, 2013), 17.]
Sydney Brenner, one of the founders of molecular biology and a Nobel Prize winner for his work on roundworms as model systems, spent much of the first decade of this century criticizing “omics” sciences in lectures and papers. As Brenner argued in a 2009 interview with the journal Studies in History and Philosophy of Science, the current trend in molecular biology, “systems biology,” “is technology driven… [and] does not have to pose any hypothesis. In other words, it claims to release people from thinking. You do not have to think, you just make an array and get a lot of numbers.”[19: Soraya de Chadarevian, “Interview with Sydney Brenner,” Studies in History and Philosophy of Science Part C: Studies in History and Philosophy of Biological and Biomedical Sciences 40, no. 1 (2009): 65–71.]
Brenner claims that biological systems such as cells and organisms are too complex and too “noisy” to allow scientists to go from the collection of data about the system to an understanding of it.[20: Sydney Brenner, “Sequences and Consequences,” Philosophical Transactions of the Royal Society B: Biological Sciences 365, no. 1537 (2010): 207–12.] There are so many random variations among different measurements of a living thing that it is difficult to determine a true causal effect. Because of this random variation and the complexity of these systems, it is almost impossible to start with observation and move backward to a solution. The complexity of the data allows too many possible solutions. Consequently, Brenner argues, scientists must begin with simple concepts and hypotheses that might explain the phenomena, and then correct these conceptual understandings against experimental data.
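Brenner’s point about working backward from data can be put in miniature. The toy sketch below (again my own construction, not his, and in Python purely for illustration) sets up a system with a hundred candidate causes but only twenty noisy measurements; two wildly different sets of “effects” reproduce the observations identically, so nothing in the data alone can decide between them.

# Underdetermination in miniature: many "solutions" fit the same noisy data.
import numpy as np

rng = np.random.default_rng(1)
n_measurements, n_candidates = 20, 100                # far fewer measurements than causes
X = rng.normal(size=(n_measurements, n_candidates))   # how each candidate shows up in each measurement

true_effects = np.zeros(n_candidates)
true_effects[:3] = [2.0, -1.5, 1.0]                   # only three real causes
y = X @ true_effects + rng.normal(scale=0.5, size=n_measurements)  # noisy observations

# One set of effect sizes that fits the data, plus a second built by adding a
# component the measurements cannot "see" (a null-space direction of X).
fit_a = np.linalg.pinv(X) @ y
hidden = rng.normal(size=n_candidates)
hidden -= np.linalg.pinv(X) @ (X @ hidden)            # project onto the null space of X
fit_b = fit_a + 5.0 * hidden

print(np.allclose(X @ fit_a, X @ fit_b))              # True: identical predicted observations
print(round(float(np.linalg.norm(fit_a - fit_b)), 1)) # yet the two "explanations" differ greatly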
While I have doubts as to whether the classic paradigm of molecular biology championed by Brenner is sufficient for the task he sets it, he is correct that scientists must work from concepts toward explanations rather than from data alone, as the relative barrenness of recent “omics” science shows. His critique of method is particularly germane in light of recent government initiatives on precision medicine and neuroscience that uncritically embrace the Big Data paradigm.
Today’s research paradigm brings to mind the doubts about midcentury physics, particularly quantum mechanics, expressed by thinkers such as Martin Heidegger and Hannah Arendt: They worried that experimental results no longer yielded representational theories of nature. Instead, as Heidegger wrote in “The Question Concerning Technology,” those results were merely “orderable as a system of information…. It seems as though causality is shrinking into a reporting.”[21: Martin Heidegger, “The Question Concerning Technology,” in The Question Concerning Technology, and Other Essays (New York, NY: Harper Torchbooks, 1977), 23; see also Hannah Arendt, The Human Condition (Chicago, IL: University of Chicago Press, 1998), 285–90. Sydney Brenner criticizes the field of systems biology because it “wants to avoid detailed questions of causality”; see de Chadarevian, “Interview with Sydney Brenner,” 67.]
There is growing evidence that this structure of scientific research will not allow us to achieve even the utilitarian goods it is meant to provide. More alarmingly, this structure undermines science by shifting researchers away from the goods internal to the practice of science. Arguing against an instrumentalist conception of science, Michael Polanyi described the practice of science as allowing researchers to develop conceptual tools “in the hope of making contact with reality,” a hope that springs from wonder at the world and a basic desire to understand the workings of natural phenomena.[22: Michael Polanyi, Personal Knowledge: Towards a Post-Critical Philosophy (Chicago, IL: University of Chicago Press, 1962), 5.]
Of course, true knowledge will always be useful, whether in an instrumental manner or in the less straightforward classical vision of a liberal education, in which deeper engagement with and understanding of the natural world shape a person’s character in salutary ways. But such knowledge must spring from the proper practice of research. If a rich biological research community is to endure, researchers must be allowed to follow their curiosity across a wide variety of topics, with a modicum of security and a long time horizon. Otherwise, our country will continue to squander billions of dollars and reduce some of the best research institutions in history to slavish and short-sighted subservience to an agenda of immediate marketable results.