On the evening of November 28, 1942, more than 1,000 revelers from the Boston College-Holy Cross football game jammed into the Cocoanut Grove, Boston's oldest nightclub. When a spark from faulty wiring accidentally ignited an artificial palm tree, the packed nightspot, designed to accommodate only about 500 people, was quickly engulfed in flames. In the ensuing panic, hundreds of people were trapped inside, most exit doors locked. Bodies piled up by the only open entrance, jamming the exits, and 490 people ultimately died in the worst fire the country had seen in forty years.
"People couldn't get out," says Dr. Kenneth Marshall, a retired plastic surgeon in Boston and president of the Cocoanut Grove Memorial Committee. "It was a tragedy of mammoth proportions."
Within half an hour of the start of the blaze, the Red Cross mobilized more than five hundred volunteers in what one newspaper called a "Rehearsal for Possible Blitz." The mayor of Boston imposed martial law. More than 300 victims—many of whom subsequently died—were taken to Boston City Hospital in one hour, roughly one every eleven seconds, while Massachusetts General Hospital admitted 114 victims in two hours. In the hospitals, 220 victims clung precariously to life, in agonizing pain from massive burns, their bodies ravaged by infection.
The scene of the fire. (Boston Public Library)
Tragic Losses Prompted Revolutionary Leaps
But there is a silver lining: this horrific disaster prompted dramatic changes in safety regulations to prevent another catastrophe of this magnitude and led to the development of medical techniques that eventually saved millions of lives. It transformed burn care and the use of plasma on burn victims, but most importantly, it introduced to the public a new wonder drug that revolutionized medicine, midwifed the birth of the modern pharmaceutical industry, and helped raise American life expectancy from about 48 years at the turn of the 20th century to 78 years by its end.
The devastating grief of the survivors also led to the first published study of post-traumatic stress disorder by pioneering psychiatrist Alexandra Adler, daughter of famed Viennese psychoanalyst Alfred Adler, an early colleague of Freud. Dr. Adler studied the anxiety and depression that followed the catastrophe and, according to the New York Times, "later applied her findings to the treatment of World War II veterans."
Dr. Ken Marshall is intimately familiar with the lingering psychological trauma of enduring such a disaster. His mother, an Irish immigrant and a nurse in the surgical wards at Boston City Hospital, was on duty that cold Thanksgiving weekend night, and didn't come home for four days. "For years afterward, she'd wake up screaming in the middle of the night," recalls Dr. Marshall, who was four years old at the time. "Seeing all those bodies lined up in neat rows across the City Hospital's parking lot, still in their evening clothes. It was always on her mind and memories of the horrors plagued her for the rest of her life."
The sheer magnitude of casualties prompted overwhelmed physicians to try experimental new procedures that were later successfully used to treat thousands of battlefield casualties. Instead of cutting away blisters and treating burned tissue with dyes and tannic acid, which can harden the skin, they applied gauze coated with petroleum jelly. Doctors also refined the formula for using plasma—the fluid portion of blood, a medical technology then just four years old—to replenish the fluids patients lost along with the protective covering of their skin.
"Every war has given us a new medical advance. And penicillin was the great scientific advance of World War II."
"The initial insult with burns is a loss of fluids and patients can die of shock," says Dr. Ken Marshall. "The scientific progress that was made by the two institutions revolutionized fluid management and topical management of burn care forever."
Still, they could not halt the staph infections that kill most burn victims—which prompted the first civilian use of a miracle elixir that was being secretly developed in government-sponsored labs and that ultimately ushered in a new age in therapeutics. Military officials quickly realized the disaster could serve as a natural laboratory to test the drug's effectiveness against the acute traumas of combat, in this unfortunate civilian approximation of battlefield conditions. At the time, the very existence of this wondrous medicine—penicillin—was a closely guarded military secret.
From Forgotten Lab Experiment to Wonder Drug
In 1928, Alexander Fleming discovered the antibacterial powers of penicillin, which promised to eradicate infectious pathogens that killed millions every year. But the road to mass-producing the highly unstable mold was littered with seemingly insurmountable obstacles, and penicillin remained a forgotten laboratory curiosity for over a decade. Its eventual rescue from obscurity was a landmark in scientific history.
In 1940, a group at Oxford University, funded in part by the Rockefeller Foundation, isolated enough penicillin to test it on twenty-five mice, which had been infected with lethal doses of streptococci. Its therapeutic effects were miraculous—the untreated mice died within hours, while the treated ones played merrily in their cages, undisturbed. Subsequent tests on a handful of patients, who were brought back from the brink of death, confirmed that penicillin was indeed a wonder drug. But Britain was then being ravaged by the German Luftwaffe during the Blitz, and there were simply no resources to devote to penicillin during the Nazi onslaught.
In June of 1941, two of the Oxford researchers, Howard Florey and Ernst Chain, embarked on a clandestine mission to enlist American aid. Samples of the temperamental mold were stored in their coats. By October, the Roosevelt Administration had recruited four companies—Merck, Squibb, Pfizer and Lederle—to team up in a massive, top-secret development program. Merck, which had more experience with fermentation procedures, swiftly pulled away from the pack and every milligram they produced was zealously hoarded.
After the nightclub fire, the government ordered Merck to dispatch to Boston whatever supplies of penicillin it could spare and to refine any crude penicillin broth brewing in its fermentation vats. After workers labored in round-the-clock relays for three days, a refrigerated truck containing thirty-two liters of injectable penicillin left Merck's Rahway, New Jersey plant on the evening of December 1, 1942. Accompanied by a convoy of police escorts through four states, it arrived in the pre-dawn hours at Massachusetts General Hospital. Dozens of people were rescued from near-certain death in this first public demonstration of the antibiotic's powers, and the existence of penicillin could no longer be kept secret from inquisitive reporters and an exultant public. The next day, the Boston Globe called it "priceless" and Time magazine dubbed it a "wonder drug."
Within fourteen months, penicillin production escalated exponentially, churning out enough to save the lives of thousands of soldiers, including many from the Normandy invasion. And in October 1945, just weeks after the Japanese surrender ended World War II, Alexander Fleming, Howard Florey and Ernst Chain were awarded the Nobel Prize in medicine. But penicillin didn't just save lives—it helped build some of the most innovative medical and scientific companies in history, including Merck, Pfizer, Glaxo and Sandoz.
"Every war has given us a new medical advance," concludes Marshall. "And penicillin was the great scientific advance of World War II."
When Rita Levi-Montalcini decided to become a scientist, she was determined that nothing would stand in her way. And from the beginning, that determination was put to the test. Before Levi-Montalcini became a Nobel Prize-winning neurobiologist, the first to discover and isolate a crucial chemical called Nerve Growth Factor (NGF), she would have to battle both the sexism within her own family and the racism and fascism that were slowly engulfing her country.
Levi-Montalcini was born to two loving parents in Turin, Italy at the turn of the 20th century. She and her twin sister, Paola, were the youngest of the family's four children, and Levi-Montalcini described her childhood as "filled with love and reciprocal devotion." But while her parents were loving, supportive and "highly cultured," her father refused to let his three daughters engage in any schooling beyond the basics. "He loved us and had a great respect for women," she later explained, "but he believed that a professional career would interfere with the duties of a wife and mother."
At age 20, Levi-Montalcini had finally had enough. "I realized that I could not possibly adjust to a feminine role as conceived by my father," she is quoted as saying, and asked his permission to finish high school and pursue a career in medicine. When her father reluctantly agreed, Levi-Montalcini was ecstatic: In just under a year, she managed to catch up on her mathematics, graduate high school, and enroll in medical school in Turin.
By 1936, Levi-Montalcini had graduated medical school at the top of her class and decided to stay on at the University of Turin as a research assistant for histologist and human anatomy professor Giuseppe Levi. Levi-Montalcini started studying nerve cells and nerve fibers – the tiny, slender tendrils that are threaded throughout our nerves and that determine what information each nerve can transmit. But it wasn't long before another enormous obstacle to her scientific career reared its head.
Science Under a Fascist Regime
Two years into her research assistant position, Levi-Montalcini was fired, along with every other "non-Aryan Italian" holding an academic or professional position, under a series of antisemitic laws passed by Italy's then-leader Benito Mussolini. Forced out of academia, Levi-Montalcini went to Belgium for a fellowship at a neurological institute in Brussels – but was forced back to Turin when the German army invaded.
Levi-Montalcini decided to keep researching. She and Giuseppe Levi built a makeshift lab in Levi-Montalcini's apartment, borrowing chicken eggs from local farmers and using sewing needles to dissect them. By dissecting the chicken embryos in her bedroom laboratory, she was able to see how nerve fibers formed and died. The two continued this research until they were interrupted again – this time, by British air raids. Levi-Montalcini fled to a country cottage to continue her research, and then two years later was forced into hiding when the German army invaded Italy. Levi-Montalcini and her family assumed different identities and lived with non-Jewish friends in Florence to survive the Holocaust. Despite all of this, Levi-Montalcini continued her work, dissecting chicken embryos from her hiding place until the end of the war.
"The discovery of NGF really changed the world in which we live, because now we knew that cells talk to other cells, and that they use soluble factors. It was hugely important."
A Post-War Discovery
Several years after the war, when Levi-Montalcini was once again working at the University of Turin, a German embryologist named Viktor Hamburger invited her to Washington University in St. Louis. Hamburger was impressed by Levi-Montalcini's research with her chicken embryos, and secured an opportunity for her to continue her work in America. The invitation would "change the course of my life," Levi-Montalcini would later recall.
During her fellowship, Levi-Montalcini grew tumors in mice and then transferred them to chick embryos to see how they would affect the embryos. To her surprise, she noticed that the tumor samples caused nerve fibers to grow rapidly. From this, Levi-Montalcini discovered and isolated a protein that she determined was responsible for this rapid growth. She later named it Nerve Growth Factor, or NGF.
From there, Levi-Montalcini and her team launched new experiments to test NGF, injecting it and suppressing it to see its effects on a test subject's body. When the team injected NGF into embryonic mice, they observed nerve growth, and the mouse pups developed faster than the untreated group – their eyes opening earlier and their teeth coming in sooner. When the team purified the NGF extract, however, it had no such effect, leading them to believe that something else in the crude extract was influencing the growth of the newborn mice. Stanley Cohen, Levi-Montalcini's colleague, identified another growth factor called EGF – epidermal growth factor – that caused the mouse pups' eyes and teeth to grow so quickly.
Levi-Montalcini continued to experiment with NGF for the next several decades at Washington University, illuminating how NGF works in our body. When Levi-Montalcini injected newborn mice with an antiserum for NGF, for example, her team found that it "almost completely deprived the animals of a sympathetic nervous system." Other experiments done by Levi-Montalcini and her colleagues helped show the role that NGF plays in other important biological processes, such as the regulation of our immune system and ovulation.
"The discovery of NGF really changed the world in which we live, because now we knew that cells talk to other cells, and that they use soluble factors. It was hugely important," said Bill Mobley, Chair of the Department of Neurosciences at the University of California, San Diego School of Medicine.
Her Lasting Legacy
After years of setbacks, Levi-Montalcini's groundbreaking work was recognized in 1986, when she was awarded the Nobel Prize in Medicine for her discovery of NGF (Cohen, the colleague who discovered EGF, shared the prize). Researchers continue to study NGF to this day, deepening our understanding of diseases like HIV and Alzheimer's.
Levi-Montalcini never stopped researching either: In January 2012, at the age of 102, Levi-Montalcini published her last research paper in the journal PNAS, making her the oldest member of the National Academy of Sciences to do so. Before she died in December 2012, she encouraged other scientists who would suffer setbacks in their careers to keep pursuing their passions. "Don't fear the difficult moments," Levi-Montalcini is quoted as saying. "The best comes from them."
Sarah Watts is a health and science writer based in Chicago. Follow her on Twitter at @swattswrites.
In July 1956, a new drug hit the European market for the first time. The drug was called thalidomide – a sedative that was considered so safe it was available without a prescription.
Sedatives were in high demand in post-war Europe – but barbiturates, the most widely used sedatives at the time, caused overdoses and death when consumers took more than the recommended amount. Thalidomide, on the other hand, didn't appear to cause any side effects at all: Chemie Grünenthal, thalidomide's manufacturer, dosed laboratory rodents with over 600 times the normal dosage during testing and observed no evidence of toxicity.
The drug was therefore considered universally safe, and Grünenthal supplied thousands of doctors with samples to give to their patients. Doctors were encouraged to recommend thalidomide to their pregnant patients specifically because it was so safe, to relieve the nausea and insomnia associated with the first trimester of pregnancy.
By 1960, thalidomide was being sold in countries throughout the world, and the United States was expected to soon follow suit. Dr. Frances Oldham Kelsey, a pharmacologist and physician, was hired by the Food and Drug Administration (FDA) in September of that year to review drug applications for the agency. Immediately, Kelsey was tasked with reviewing thalidomide for commercial use in the United States under the name Kevadon. Her approval was supposed to be a formality, since the drug was so widely used in other countries.
But Kelsey did something that few people expected – she paused. Rather than approving the drug offhand as she was expected to do, Kelsey asked the manufacturer – William S. Merrell Co., which was manufacturing thalidomide under license from Chemie Grünenthal – to supply her with more safety data, noting that Merrell's application for approval relied mostly on anecdotal testimony. Kelsey – along with her husband, who worked as a pharmacologist at the National Institutes of Health (NIH) – was highly suspicious of a drug that had no lethal dose and no side effects. "It was just too positive," Kelsey said later. "This couldn't be the perfect drug with no risk."
At the same time, rumors were starting to swirl across Europe that thalidomide was not as safe as everyone had initially thought: Physicians were noticing an "unusual increase" in the birth of severely deformed babies, and they were beginning to suspect thalidomide as the cause. The babies, whose mothers had all taken thalidomide during pregnancy, were born with conditions like deafness, blindness, congenital heart problems, and even phocomelia, a malformation of the arms and legs. Doctors and midwives were also starting to notice a sharp rise in miscarriages and stillbirths among their patients.
Kelsey's skepticism was rewarded in November 1961 when thalidomide was yanked abruptly off the market, following a growing outcry that it was responsible for hundreds of stillbirths and deformities.
Kelsey had heard none of these rumors, but she did know from her post-doctoral research that fetuses can metabolize drugs differently than adults – in other words, a drug that was perfectly safe for adults could be detrimental to a patient's unborn child. Noting that thalidomide could cross the placental barrier, she asked for safety data, such as clinical trials, showing specifically that the drug was non-toxic for fetuses. Merrell supplied Kelsey with anecdotal data – accounts from patients who attested that they had taken thalidomide with no adverse effects – but she rejected it, insisting on stronger data: clinical studies that included pregnant women.
The drug company was annoyed at what it considered Kelsey's needless bureaucracy. After all, Germans were consuming around 1 million doses of thalidomide every day in 1960, with ample anecdotal evidence that it was safe, even among pregnant women. As the holidays approached – the most lucrative time of year for sedative sales – Merrell executives started hounding Kelsey to approve thalidomide, even phoning her superior and paying her visits at work. But Kelsey was unmovable. Her skepticism solidified in December 1960, when she read a physician's letter published in the British Medical Journal warning that his long-term thalidomide patients were starting to report pain in their arms and legs.
"The burden of proof that the drug is safe … lies with the applicant," Kelsey wrote in a letter to Merrell executive Joseph F. Murray in May of 1961. Despite increasing pressure, Kelsey held fast to her insistence that more safety data – particularly for fetuses – was needed.
Kelsey's skepticism was rewarded in November 1961 when Chemie Grünenthal yanked thalidomide off the market overseas, following a growing outcry that it was responsible for hundreds of stillbirths and deformities. In early 1962, Merrell conceded that the drug's safety was unproven in fetuses and formally withdrew its application at the FDA.
Thanks to Kelsey, the United States was spared the effects of thalidomide – although Europe and Canada were not so lucky. Thalidomide remained in people's homes under different names long after it was pulled from the market, and so women unfortunately continued to take thalidomide during their pregnancies, unaware of its effects. All told, thalidomide is thought to have caused around 10,000 birth defects and anywhere from 5,000 to 7,000 miscarriages. Many so-called "thalidomide babies" are now adults living with disabilities.
Niko von Glasow, born in 1960, is a German film director and producer who was born disabled due to the side effects of thalidomide.
Just two years after joining the FDA, Kelsey was presented with the President's Award for Distinguished Federal Civilian Service and was appointed head of the Investigational Drug Branch at the FDA. Not only did Kelsey save the U.S. public from the horrific effects of thalidomide, but she forever changed the way drugs are developed and approved for use in the United States: drugs must now be proven both safe and effective, adverse drug reactions must be reported to the FDA, and informed consent must be obtained from all participants before they volunteer for clinical trials. Today, the United States is safer because of Frances Kelsey's bravery.