Society Needs Regulations to Prevent Research Abuses

A tension exists between scientists/doctors and government regulators.

(© wladimir1804/Fotolia)

[Editor's Note: Our Big Moral Question this month is, "Do government regulations help or hurt the goal of responsible and timely scientific innovation?"]

Government regulations help more than hurt the goal of responsible and timely scientific innovation. Opponents might argue that without regulations, researchers would be free to do whatever they want. But without ethics and regulations, scientists have performed horrific experiments. In Nazi concentration camps, for instance, doctors forced prisoners to stay in the snow to see how long it took them to freeze to death. These researchers also amputated prisoners' limbs in attempts to develop techniques for reattaching body parts; all of these experiments failed.

Due to these atrocities, after the war, the Nuremberg Tribunal established the first ethical guidelines for research, mandating that all study participants provide informed consent. Yet many researchers, including those in leading U.S. academic institutions and government agencies, failed to follow these dictates. The U.S. government, for instance, secretly infected Guatemalan men with syphilis in order to study the disease, and experimented on soldiers, exposing them without consent to biological and chemical warfare agents. In the 1960s, researchers at New York's Willowbrook State School purposefully fed intellectually disabled children stool extracts infected with hepatitis in order to study the disease. In 1966, in the New England Journal of Medicine, Henry Beecher, a Harvard anesthesiologist, described 22 examples of unethical research published in the nation's leading medical journals; most of these studies were conducted without informed consent, and some harmed participants without offering them any benefit.

Despite heightened awareness and enhanced guidelines, abuses continued. Until a 1972 journalistic exposé, the U.S. government continued to fund the now-notorious Tuskegee syphilis study of infected poor African-American men in rural Alabama, refusing to offer these men penicillin even after it became available as an effective treatment for the disease.

In response, in 1974 Congress passed the National Research Act, establishing research ethics committees or Institutional Review Boards (IRBs), to guide scientists, allowing them to innovate while protecting study participants' rights. Routinely, IRBs now detect and prevent unethical studies from starting.

Still, even with these regulations, researchers have at times conducted unethical investigations. In 1999 at the Los Angeles Veterans Affairs Hospital, for example, a patient twice refused to participate in a study that would prolong his surgery. The researcher nonetheless experimented on him, using an electrical probe in the patient's heart to collect data.

Part of the problem and consequent need for regulations is that researchers have conflicts of interest and often do not recognize ethical challenges their research may pose.

Pharmaceutical company scandals involving Avandia, Neurontin, and other drugs raise added concerns. In marketing Vioxx, OxyContin, and tobacco, corporations have hidden findings that might undercut sales.

Regulations become increasingly critical as drug companies and the NIH conduct growing amounts of research in the developing world. In 1996, Pfizer conducted a study of bacterial meningitis in Nigeria in which 11 children died. The families sued. Pfizer produced a Nigerian IRB approval letter, but the letter turned out to have been forged; no Nigerian IRB had ever approved the study. Fourteen years later, WikiLeaks revealed that Pfizer had hired detectives to find evidence of corruption against the Nigerian attorney general in order to compel him to drop the lawsuit.

Researchers not only in industry but also in academia have violated research participants' rights. Arizona State University scientists wanted to investigate the genes of a Native American group, the Havasupai, who were concerned about their high rates of diabetes. The investigators also wanted to study the group's rates of schizophrenia, but feared that the tribe would oppose the study, given the stigma. Hence, these researchers decided to mislead the tribe, stating that the study was only about diabetes. The university's research ethics committee knew of the scientists' plan to study schizophrenia, but approved the study, including the consent form, which did not mention any psychiatric diagnoses. The Havasupai gave blood samples, but later learned that the researchers had published articles about the tribe's rates of schizophrenia and alcoholism and about its genetic origins in Asia (the Havasupai believe they originated in the Grand Canyon, where they now live, and which they thus argued they owned). A 2010 legal settlement required that the university return the blood samples to the tribe, which then destroyed them. Had the researchers instead worked with the tribe more respectfully, they could have advanced science in many ways.

Such violations threaten to lower public trust in science, particularly among vulnerable groups that have historically been systemically mistreated, diminishing public and government support for research and for the National Institutes of Health, National Science Foundation and Centers for Disease Control, all of which conduct large numbers of studies.

In popular culture, myths of immoral science and technology loom large, from Frankenstein to Big Brother and Dr. Strangelove.

Admittedly, regulations involve inherent tradeoffs. Following certain rules can take time and effort, and some regulations may indeed limit research that could potentially advance knowledge but would be grossly unethical. For instance, if our society's sole goal were to have scientists innovate as much as possible, we might let them stick needles into healthy people's brains to remove cells, offering cash payments that many poor, vulnerable people might find hard to refuse. But such studies would clearly pose major ethical problems.

Research that has failed to follow ethics has in fact impeded innovation. In 1999, a young man named Jesse Gelsinger died in a gene therapy experiment whose investigator was subsequently found to have major conflicts of interest; his death delayed innovation in gene therapy research for years.

Without regulations, companies might market products that prove dangerous, leading to massive lawsuits that could also ultimately stifle further innovation within an industry.

The key question is not whether regulations help or hurt science alone, but whether they help or hurt science that is both "responsible and innovative."

We don't want "over-regulation." Rather, the right amount of regulation is needed, neither too much nor too little. Policy makers in this area have therefore worked to develop regulations in fair and transparent ways and to reduce the burden on researchers, for instance by allowing a single IRB to review a multi-site study, rather than having each site's IRB do so, which can create obstacles.

In sum, society requires a proper balance of regulations to ensure ethical research, avoid abuses, and ultimately aid us all by promoting responsible innovation.

[Ed. Note: Check out the opposite viewpoint here, and follow LeapsMag on social media to share your perspective.]

Robert Klitzman
Robert Klitzman, MD, is a professor of psychiatry at the Vagelos College of Physicians and Surgeons and the Joseph Mailman School of Public Health, and the director of the Masters in Bioethics program at Columbia University. He has published over 130 scientific journal articles and eight books, including When Doctors Become Patients; A Year-Long Night: Tales of a Medical Internship; In a House of Dreams and Glass: Becoming a Psychiatrist; Being Positive: The Lives of Men and Women With HIV; The Trembling Mountain: A Personal Account of Kuru, Cannibals and Mad Cow Disease; Mortal Secrets: Truth and Lies in the Age of AIDS (with Ronald Bayer); Am I My Genes? Confronting Fate and Other Genetic Journeys; and The Ethics Police?: The Struggle to Make Human Research Safe. He has received numerous awards for his work, is a Distinguished Fellow of the American Psychiatric Association, a member of the Council on Foreign Relations, and a regular contributor to the New York Times and CNN.
