Society Needs Regulations to Prevent Research Abuses

A tension exists between scientists/doctors and government regulators.

(© wladimir1804/Fotolia)


[Editor's Note: Our Big Moral Question this month is, "Do government regulations help or hurt the goal of responsible and timely scientific innovation?"]

Government regulations help more than hurt the goal of responsible and timely scientific innovation. Opponents might argue that without regulations, researchers would be free to do whatever they want. But without ethics and regulations, scientists have performed horrific experiments. In Nazi concentration camps, for instance, doctors forced prisoners to stay in the snow to see how long it took them to freeze to death. These researchers also amputated prisoners' limbs in attempts to develop techniques for reattaching body parts, but all of these experiments failed.


Due to these atrocities, after the war the Nuremberg Tribunal established the first ethical guidelines for research, mandating that all study participants provide informed consent. Yet many researchers, including those at leading U.S. academic institutions and government agencies, failed to follow these dictates. The U.S. government, for instance, secretly infected Guatemalan men with syphilis in order to study the disease, and exposed soldiers, without their consent, to biological and chemical warfare agents. In the 1960s, researchers at New York's Willowbrook State School purposely fed intellectually disabled children stool extracts infected with hepatitis in order to study the disease. In 1966, in the New England Journal of Medicine, Henry Beecher, a Harvard anesthesiologist, described 22 cases of unethical research published in the nation's leading medical journals; these studies were mostly conducted without informed consent and at times harmed participants without offering them any benefit.

Despite heightened awareness and enhanced guidelines, abuses continued. Until a 1972 journalistic exposé, the U.S. government continued to fund the now-notorious Tuskegee syphilis study, which observed poor African-American men infected with the disease in rural Alabama while refusing to offer them penicillin once it became available as an effective treatment.

In response, in 1974 Congress passed the National Research Act, establishing research ethics committees, or Institutional Review Boards (IRBs), to guide scientists, allowing them to innovate while protecting study participants' rights. IRBs now routinely detect ethical problems and prevent unethical studies from starting.

Still, even with these regulations, researchers have at times conducted unethical investigations. In 1999 at the Los Angeles Veterans Affairs Hospital, for example, a patient twice refused to participate in a study that would prolong his surgery. The researcher proceeded to experiment on him anyway, using an electrical probe in the patient's heart to collect data.

Part of the problem and consequent need for regulations is that researchers have conflicts of interest and often do not recognize ethical challenges their research may pose.

Pharmaceutical company scandals involving Avandia, Neurontin, and other drugs raise added concerns. In marketing Vioxx, OxyContin, and tobacco, corporations have hidden findings that might undercut sales.

Regulations become increasingly critical as drug companies and the NIH conduct growing amounts of research in the developing world. In 1996, Pfizer conducted a study of bacterial meningitis in Nigeria in which 11 children died. The families sued. Pfizer produced a Nigerian IRB approval letter, but the letter turned out to have been forged; no Nigerian IRB had ever approved the study. Fourteen years later, WikiLeaks revealed that Pfizer had hired detectives to find evidence of corruption against the Nigerian Attorney General in order to pressure him to drop the lawsuit.

Researchers not only in industry but also in academia have violated research participants' rights. Arizona State University scientists wanted to investigate the genes of a Native American group, the Havasupai, who were concerned about their high rates of diabetes. The investigators also wanted to study the group's rates of schizophrenia, but feared that the tribe would oppose the study, given the stigma. Hence, these researchers decided to mislead the tribe, stating that the study was only about diabetes. The university's research ethics committee knew of the scientists' plan to study schizophrenia, but approved the study, including the consent form, which did not mention any psychiatric diagnoses. The Havasupai gave blood samples, but later learned that the researchers had published articles about the tribe's rates of schizophrenia and alcoholism and about its genetic origins in Asia (while the Havasupai believed they originated in the Grand Canyon, where they now lived and which they thus argued they owned). A 2010 legal settlement required the university to return the blood samples to the tribe, which then destroyed them. Had the researchers instead worked with the tribe more respectfully, they could have advanced science in many ways.


Such violations threaten to lower public trust in science, particularly among vulnerable groups that have historically been systemically mistreated, diminishing public and government support for research and for the National Institutes of Health, the National Science Foundation, and the Centers for Disease Control and Prevention, all of which conduct large numbers of studies.


In popular culture, myths of immoral science and technology loom large, from Frankenstein to Big Brother and Dr. Strangelove.

Admittedly, regulations involve inherent tradeoffs. Following certain rules can take time and effort. Certain regulations may in fact limit research that could potentially advance knowledge but would be grossly unethical. For instance, if our society's sole goal were to have scientists innovate as much as possible, we might let them stick needles into healthy people's brains to remove cells, in return for cash payments that many vulnerable poor people might find hard to refuse. But such studies would clearly pose major ethical problems.

Research that has failed to follow ethics has in fact impeded innovation. In 1999, a young man named Jesse Gelsinger died in a gene therapy experiment whose investigator was subsequently found to have major conflicts of interest; his death set back innovation in the field of gene therapy for years.

Without regulations, companies might market products that prove dangerous, leading to massive lawsuits that could also ultimately stifle further innovation within an industry.

The key question is not whether regulations help or hurt science alone, but whether they help or hurt science that is both "responsible and innovative."

We don't want "over-regulation." Rather, the right amount of regulation is needed, neither too much nor too little. Policy makers in this area have therefore worked to develop regulations in fair and transparent ways and to reduce the burden on researchers, for instance by allowing a single IRB to review a multi-site study rather than requiring separate review by multiple IRBs, which can create obstacles.

In sum, society requires a proper balance of regulations to ensure ethical research, avoid abuses, and ultimately aid us all by promoting responsible innovation.

[Ed. Note: Check out the opposite viewpoint here, and follow LeapsMag on social media to share your perspective.]

Robert Klitzman
Robert Klitzman, MD, is a professor of psychiatry at the Vagelos College of Physicians and Surgeons and the Joseph Mailman School of Public Health, and the director of the Masters in Bioethics program at Columbia University. He has published over 130 scientific journal articles and eight books, including When Doctors Become Patients; A Year-Long Night: Tales of a Medical Internship; In a House of Dreams and Glass: Becoming a Psychiatrist; Being Positive: The Lives of Men and Women With HIV; The Trembling Mountain: A Personal Account of Kuru, Cannibals and Mad Cow Disease; Mortal Secrets: Truth and Lies in the Age of AIDS (with Ronald Bayer); Am I My Genes? Confronting Fate and Other Genetic Journeys; and The Ethics Police?: The Struggle to Make Human Research Safe. He has received numerous awards for his work, is a Distinguished Fellow of the American Psychiatric Association, a member of the Council on Foreign Relations, and a regular contributor to the New York Times and CNN.

