The science was elegant, powerful, and troubling, especially to those researchers deeply involved in the field. It could, the scientists acknowledged, revolutionize our understanding of fundamental biochemical processes in cells, propel the development of a new scientific discipline, and hold near limitless promise for medicine, industry, and agriculture.
It could also disrupt living beings in a most intimate way: It would change DNA.
Those involved in the work gathered to discuss how they might manage this science and to examine whether they should, in fact, consider halting it to allay their concerns and those growing among the public.
It was within this swirl that an international cohort of biologists together with a smattering of lawyers and physicians—about 140 in all—traveled to the Monterey Peninsula in California more than four decades ago. There, within a complex of Arts and Crafts–style buildings constructed just after the turn of the twentieth century, the scientists met. For nearly four days, they discussed the state of the science. When they emerged, they delivered what is now known as the Asilomar statement, a cautionary letter to scientists involved in recombinant DNA research. The letter, published in 1975 in Science, was disseminated throughout the scientific community and discussed in the popular press of the time.
According to many who have analyzed these events, the researchers’ willingness to self-police and be accountable to the scientific community was rooted in their concerns for the safety of the science. Their actions also echo a “contract” science has had with society since at least the Second World War. At its most fundamental, argue Adam Briggle and Carl Mitcham in their 2012 text, Ethics and Science, that contract holds that society’s progress derives from the “distinctive form of knowledge production” that comes from scientific inquiry, that is, ideas developed freely, then explored and vetted before being shared broadly. In exchange for this free agency, the public expects scientists to be accountable for the quality and safety of their research. That expectation of responsibility does not extend to the applications of research. Traditionally, note Briggle and Mitcham, that has been society’s responsibility.
That division is increasingly being questioned as science becomes more adept at teasing out the mechanisms that drive human biology. Calls for scientists to accept greater social responsibility for the use of their research have them considering anew how they can, or should, contribute to the public good.
Dear Mr. President
The structure of today’s scientific research establishment, and its handshake agreement with society, can be traced to Vannevar Bush, an engineer who headed the U.S. Office of Scientific Research and Development through World War II. In late 1944, President Franklin D. Roosevelt wrote to Bush and asked him to analyze how the wartime scientific enterprise might be retooled to allow its application “in the days of peace ahead for the improvement of the national health, the creation of new enterprises bringing new jobs, and the betterment of the national standard of living.”
In a report titled Science, the Endless Frontier, issued three months after Roosevelt’s death, Bush called for strong centers of basic research set within the nation’s academic institutions. The report emphasized the need for a long-term commitment by the government to support the training of new generations of scientists and the pursuit of novel and existing avenues of research. It also proposed the establishment of an entity that would retain “discretion in the allocation of funds,” ensuring “complete independence and freedom for the nature, scope, and methodology of research.” Above all, it advocated for scientific progress, specifically the type that the report said resulted from “the free play of free intellects.”
“Are scientists simply free actors, able to research whatever they want, or do they have an obligation to be socially responsible, to contribute to the public good?” asks David Jones ’97, the A. Bernard Ackerman Professor of the Culture of Medicine at Harvard and the HMS Department of Global Health and Social Medicine.
“You can make a claim that they have an obligation because of their training. No matter how a scientist is funded at present—public funds, private donors, philanthropies, or corporations—everyone who is a practicing scientist now has had a substantial education in universities, and that training almost certainly involved a substantial investment of government research funds,” says Jones.
“Scientists,” he adds, “have some obligation to the public because of the investment society has made in them.”
For scientists such as George Q. Daley ’91, an HMS professor of biological chemistry and molecular pharmacology, the School’s Robert A. Stranahan Professor of Pediatrics, and its dean designate, that obligation gets addressed on local, national, and international levels.
“I’ve struggled with these issues myself,” says Daley, “and have engaged in deliberative processes with colleagues to consider our science and its implications. In doing so, I hope I’m providing a leadership example for the next generation.
“Scientists should be allowed to pursue their curiosity and creativity to discover new truths about the way the physical and natural worlds work, but how that work is applied, and whether certain work should be supported and others not, those discussions really have to involve the public.”
A Firm Footing
Daley also directs the Stem Cell Transplantation Program at the Dana-Farber/Boston Children’s Cancer and Blood Disorders Center. In 2006, in his role as chair of an international task force on behalf of the International Society for Stem Cell Research, he helped develop a set of principles for self-governance and ethical conduct among his stem-cell research peers. Two years later, the group issued guidelines for the clinical translation of stem cell research, which included a firm statement against unethical efforts: “The ISSCR is deeply concerned about the potential physical, psychological, and financial harm to patients who pursue unproven stem cell-based ‘therapies’ and the general lack of scientific transparency and professional accountability of those engaged in these activities.”
More recently, Daley and colleagues, under the auspices of the Innovative Genomics Initiative, weighed in on embryo engineering using CRISPR technology and called for responsible behavior by scientists working in this galloping field. In April 2015, in Science, the group offered its collective thoughts on “a prudent path forward.” Noting the potential slippery slope from disease curing to less acceptable applications, the group wrote, “it would be wise to begin discussion that bridges the research community, relevant industries, medical centers, regulatory bodies, and the public to explore responsible uses of this technology.”
Daley says that although the waters can be choppy for scientists who enter into the debate on issues such as genome editing, it’s worth the effort.
“These are difficult issues to navigate,” says Daley, “but I think it’s part of the responsibility of the scientist. It takes one outside of the comfort zone of just doing experiments and into the realm of interpretation.
“As scientists we invite peer review, we invite scrutiny of our animal research and independent scrutiny of work that involves human subjects. All this helps preserve the integrity of the research process.
“But issues like genome editing raise larger questions: What are the priorities of the research, and what are its applications from a social perspective? Is it something society should endorse? Such questions should be decided in a greater social context.”
Questions like those are fundamental to shaping how the public views the potential consequences of scientific research and how the scientific community approaches questions of social responsibility.
A 2016 Pew Research Center survey of 4,726 adults in the United States shows a nation wary of using biomedical technologies, especially to enhance human abilities. Asked about technologies that would allow gene editing to give infants a lifetime of reduced risk of serious disease, the implantation of brain chips to give people a greater ability to concentrate and process information, and the transfusion of synthetic blood to allow for greater speed, stamina, and strength, more than 60 percent of the respondents said they were very or somewhat worried about each such application. The use of gene editing to prevent disease in children revealed the slimmest split, with 48 percent saying they would want such an option for their child and 50 percent indicating they wouldn’t.
Respondents also thought the technologies might be outpacing scientists’ understanding of the consequences of using them in humans. Seventy-four percent predicted brain chip implants would become available before the technology had been fully tested or understood, and 73 percent thought the same about the use of synthetic blood and gene-editing technologies.
Only a year before, in early 2015, the Ethics and Human Rights working group of the Science and Human Rights Coalition of the American Association for the Advancement of Science released findings from a pilot project that surveyed the scientific community on its responsibilities to society. The majority of the roughly 2,100 responses were from academic scientists in North America.
The working group’s analysis showed that nearly 94 percent of respondents thought it was important to explain their work to the public, and 82 percent considered it important to engage in public-service activities. When considering the social responsibilities specific to their research, 92 percent said they thought they had an obligation to “serve in advisory roles in the public arena in their areas of expertise,” and 88 percent saw a need to contribute to public policy deliberations in their areas of study. Interestingly, although nearly 96 percent thought it was important to “take steps to minimize anticipated risks associated with their work,” only 82 percent felt it was vital to take steps to prevent the inappropriate use of their research by others.
Consider the Possibilities
A bone-deep concern over the inappropriate use of research could be said to have fueled Jonathan Beckwith’s career as a scientist-activist.
In his nearly five decades at HMS, Beckwith, the American Cancer Society Professor of Microbiology and Immunobiology Emeritus, has often been a part of public debates on the safety, necessity, or ethics of research.
Early in his career, Beckwith led the research team that was the first to isolate a single gene, the lac gene. Despite the excitement generated by the work, one member of the team, Lawrence Eron ’71, then a third-year HMS student, told a reporter for The Harvard Crimson that the research team was concerned that the technology it had pioneered in bacteria could be the first step toward genetically engineering humans. The team then called a news conference.
“We did talk about the science,” recalls Beckwith, “but we also talked about what we thought were the social implications of the research. We thought the public should be more involved.”
In the late 1960s, Beckwith introduced the social implications of science and historical perspectives on past research, blemishes and all, into a bacterial genetics course he was teaching. Two decades later, at the request of two former students from that class, Beckwith launched the Social Issues in Biology class at Harvard. That class still draws young scientists and medical students.
In the 1990s, Beckwith became involved in a small but significant movement to consider the ethical and social components of research. The Human Genome Project was taking off, and James Watson, of DNA fame, announced that 1 to 3 percent of the project’s funds were to be used to study the ethical, legal, and social implications (ELSI) of the research.
Beckwith was named to what became known as the ELSI committee for that project. He credits the effort not only with informing the public about the genome project but also with seeding a field of inquiry that continues to affect social policy. In a 2007 interview in BioEssays, Beckwith noted that the ELSI phenomenon was embraced elsewhere in the world, leading to a body of research on issues related to genetic testing programs.
The importance of educating the next generation of scientists continues to loom large for Beckwith. He co-wrote a 2005 commentary in Nature Biotechnology that included a proposal that graduate-level science education include the study of the social implications of science and the historical instances where scientists have raised concerns about the use of research. Although that proposal has yet to be acted upon, Beckwith was pleased to note a recent change to the Training in the Responsible Conduct of Research guidelines issued by the National Institutes of Health for grants that fund instruction and training.
That change, which took place in 2009, urges grant applicants to develop pedagogies that include instruction on “the scientist as a responsible member of society, contemporary ethical issues in biomedical research, and the environmental and societal impacts of scientific research.”
“Not many people may have noticed that change,” says Beckwith, “but I think it’s a particularly important one.”
Just as having scientists who are comfortable discussing the effects of science in the marketplace has become vital to the scientific community, the preparation of young physicians for the responsibilities of translating science for their patients has taken an ever-larger role in medical education. It’s an education that should continue, according to their profession’s ethical code.
Within the American Medical Association’s nine Principles of Medical Ethics, which anchor the larger AMA Code of Medical Ethics, sits this precept: “A physician shall continue to study, apply, and advance scientific knowledge, maintain a commitment to medical education, make relevant information available to patients, colleagues, and the public. . . .”
From his vantage as a physician-scientist, Daley underscores the principle’s importance.
“Physicians are often the interface between patients and new technologies,” he says. “I think it’s incumbent upon clinicians to understand new technologies and to continue to educate themselves, whether it’s on stem cells or novel applications of gene editing. The clinic is where these issues get played out.”
Jones considers these patient-doctor responsibilities continually. Together with Edward Hundert ’84, the School’s dean for medical education, Jones oversees the Pathways curriculum’s social and population sciences courses. These courses explore everything from truth-telling, reproductive ethics, patients’ capacity to make informed decisions, and the ethical dilemmas faced by medical students to the responsibility that physicians have to relieve health inequities and help their patients achieve the best health outcomes. The goal is to address “the role of the physician and the moral framework of modern health care practice.”
“One role physicians have,” says Jones, “is to serve as the mediator, or the translator, between patients and the basic science community. I think this gives physicians a real obligation to be intelligent evaluators of science so that they can educate patients and help them make the most appropriate medical decisions.”
The payoff for preparing young minds for the ethical dilemmas that arise within their profession was recently made clear to Beckwith. He had been invited to attend a seminar in the Department of Genetics. The speaker was Ethan Bier, a professor of cell and developmental biology at the University of California, San Diego. Curious, Beckwith accepted.
Beckwith learned that the speaker had been a graduate student at HMS years ago, so after the presentation, Beckwith approached Bier.
“I felt stupid doing this,” Beckwith recalls, “but I asked him, ‘Do I know you?’ ”
“I was a student in your class,” Bier responded. He had taken Beckwith’s bacterial genetics course.
They sat down to talk. “Bier told me, ‘I wanted to see you to thank you for what you taught me,’ ” explains Beckwith. “He was talking about the social implications material we covered in the course.”
Beckwith paused. “Bier then told me the course had made him think about those issues and that it helps him think about those things in his research today.”
What is Bier’s work? Gene drives, naturally occurring genetic elements that bias inheritance so that particular traits spread preferentially through populations of sexually reproducing species.
Recently, gene drives have been built using CRISPR technology with the aim of one day controlling the spread of viruses such as Zika. It’s work that at least one newspaper described as technology that offers life-transforming power.
Ann Marie Menting is the editor of Harvard Medicine magazine.
Images: iStock, Mattias Paludi (top); John Soares