Who’s Responsible If a Scientist’s Work Is Used for Harm?

A face-off in medical ethics.


Are scientists morally responsible for the uses of their work? To some extent, yes. Scientists are responsible for both the uses that they intend with their work and for some of the uses they don't intend. This is because scientists bear the same moral responsibilities that we all bear, and we are all responsible for the ends we intend to help bring about and for some (but not all) of those we don't.

It should be obvious that the intended outcomes of our work are within our sphere of moral responsibility. If a scientist intends to help alleviate hunger (by, for example, breeding new drought-resistant crop strains), and they succeed in that goal, they are morally responsible for that success, and we would praise them accordingly. If a scientist intends to produce a new weapon of mass destruction (by, for example, developing a lethal strain of a virus), and they are unfortunately successful, they are morally responsible for that as well, and we would blame them accordingly. Intention matters a great deal, and we are most praised or blamed for what we intend to accomplish with our work.

But we are responsible for more than just the intended outcomes of our choices. We are also responsible for unintended but readily foreseeable uses of our work. This is in part because we are all responsible for thinking not just about what we intend, but also what else might follow from our chosen course of action. In cases where severe and egregious harms are plausible, we should act in ways that strive to prevent such outcomes. To not think about plausible unintended effects is to be negligent -- and to recognize, but do nothing about, such effects is to be reckless. To be negligent or reckless is to be morally irresponsible, and thus blameworthy. Each of us should think beyond what we intend to do, reflecting carefully on what our course of action could entail, and adjusting our choices accordingly.

It is this area, of unintended but readily foreseeable (and plausible) impacts, that often creates the most difficulty for scientists. Many scientists become so absorbed in their work (which is often demanding) and so focused on achieving their intended goals that they fail to stop and think about other possible implications.

Debates over "dual-use" research exemplify these concerns: the harmful potential uses of a line of research might mean the work should not be pursued, or that full publication of its results should be curtailed. When researchers perform gain-of-function research, pushing viruses to become more transmissible or more deadly, it is clear how dangerous such work could be in the wrong hands. In these cases, it is not enough to simply claim that such uses were not intended and that it is someone else's job to ensure that the materials remain secure. We know securing infectious materials can be error-prone (recall events at the CDC and the FDA).

Further, securing viral strains does nothing to secure the knowledge that could allow the strain to be reproduced (particularly when the methodologies and/or genetic sequences are published after the fact, as was the case for H5N1 and horsepox). It is, in fact, the researcher's moral responsibility to be concerned not just about the biosafety controls in their own labs, but also about which projects should be pursued (Will the gain in knowledge be worth the possible downsides?) and which results should be published (Will a result make it easier for a malicious actor to deploy a new bioweapon?).

To my knowledge, gain-of-function research has not yet been used to harm people. If that ever happens, those who actually release the virus on the public will be most blameworthy; intentions do matter. But the scientists who developed the knowledge deployed by the malicious actors may also be held blameworthy, especially if the malicious use was easy to foresee, even if it was unpleasant to think about.

In some areas of research, scientists are already worrying about the unintended possible downsides of their work. Scientists investigating gene drives have thought beyond the immediate desired benefits of their work (e.g. reducing invasive species populations) and considered the possible spread of gene drives to untargeted populations. Modeling the impacts of such possibilities has led some researchers to pull back from particular deployment possibilities. It is precisely such thinking through both the intended and unintended possible outcomes that is needed for responsible work.

The world has gotten too small, too vulnerable for scientists to act as though they are not responsible for the uses of their work, intended or not. They must seek to ensure that, as the recent AAAS Statement on Scientific Freedom and Responsibility demands, their work is done "in the interest of humanity." This requires thinking beyond one's intentions, potentially drawing on the expertise of others, sometimes from other disciplines, to help explore implications. Such thinking does not guarantee good outcomes, but it ensures that we are doing the best we can, and that is what being morally responsible is all about.

Heather Douglas
Heather Douglas is an Associate Professor in the Department of Philosophy at Michigan State University. She received her Ph.D. from the History and Philosophy of Science Department at the University of Pittsburgh in 1998, and has held tenure-line positions since then at the University of Puget Sound, the University of Tennessee, and the University of Waterloo. She is the author of Science, Policy, and the Value-Free Ideal (2009) as well as numerous articles on values in science, the moral responsibilities of scientists, and the role of science in democratic societies. Her work has been supported by the National Science Foundation. In 2016, she was named a AAAS fellow.
