How Smallpox Was Wiped Off the Planet By a Vaccine and Global Cooperation

The world's last recorded case of endemic smallpox was in Ali Maow Maalin, of Merka, Somalia, in October 1977. He made a full recovery.

(© WHO / John F. Wickett)

For 3,000 years, civilizations all over the world were brutalized by smallpox, a highly contagious and deadly viral disease characterized by fever and a rash of painful, oozing sores.


Smallpox was merciless, killing roughly one-third of the people it infected and leaving many survivors permanently pockmarked or blind. Although smallpox was most widespread during the 18th and 19th centuries, it remained a leading killer well into the 20th century, causing an estimated 50 million cases worldwide each year as late as the early 1950s.

A Primitive Cure

Sometime during the 10th century, Chinese physicians figured out that exposing people to a small amount of smallpox would sometimes result in a milder infection and, if the person survived, immunity to the disease afterward. Desperate for protection, people would inhale powders made of ground smallpox scabs or have smallpox pus scratched into their skin, all in the hope of gaining immunity without getting too sick. However, this method – called inoculation, or variolation – didn't always work. People could still develop the full-blown disease, spread it to others, or even catch another infection such as syphilis in the process.

A Breakthrough Treatment

For centuries, inoculation – however imperfect – was the only protection the world had against smallpox. But in the late 18th century, an English physician named Edward Jenner developed a safer and more effective method. Jenner discovered that inoculating a person with cowpox – a much milder relative of the smallpox virus – made that person immune to smallpox as well, this time without the risk of catching or transmitting smallpox in the process. His breakthrough became the world's first vaccine against a contagious disease. Other researchers, like Louis Pasteur, would use the same principles to make vaccines against global killers like anthrax and rabies. Vaccination was considered a miracle, conferring all of the rewards of having gotten sick (immunity) without the risk of death or blindness.

Scaling the Cure

As vaccination became more widespread, the number of global smallpox deaths began to drop, particularly in Europe and the United States. But as late as 1967, smallpox was still infecting an estimated 10 to 15 million people a year – and killing roughly two million of them – mostly in poorer parts of the globe. That year, the World Health Assembly (the decision-making body of the World Health Organization) decided to launch an intensified, coordinated effort to eradicate smallpox from the planet completely, aiming for 80 percent vaccine coverage in every country where the disease was endemic – a total of 33 countries.

But officials knew that eradicating smallpox would be easier said than done. Doctors had to contend with wars, floods, and language barriers to make their campaign a success. The vaccination initiative in Bangladesh proved the most challenging, due to its population density and the prevalence of the disease, writes journalist Laurie Garrett in her book, The Coming Plague.

In one instance, French physician Daniel Tarantola, on assignment in Bangladesh, learned of a murderous gang thought to be spreading smallpox through the countryside during its crime sprees. Without police protection, Tarantola confronted the gang and "faced down guns" in order to immunize its members, protecting the villagers from repeated outbreaks.

Because not enough vaccine existed to immunize everyone in a given country, doctors used a strategy called "ring vaccination": locating individual outbreaks and vaccinating all known and possible contacts to stop each outbreak at its source. In Nigeria, for example, fewer than 50 percent of the population received the vaccine, but thanks to ring vaccination, smallpox was eliminated there nonetheless. Doctors worked tirelessly for the next eleven years to immunize as many people as possible.


A Resounding Success

In November 1975, officials discovered a case of variola major – the more virulent strain of the smallpox virus – in a three-year-old Bangladeshi girl named Rahima Banu. Banu was forcibly quarantined in her family's home with armed guards until the risk of transmission had passed, while officials went door-to-door vaccinating everyone within a five-mile radius. Hers proved to be the last case of variola major in human history. Two years later, the world's last naturally occurring case of smallpox – caused by the milder variola minor strain – was reported in Somalia. When no new community-acquired cases appeared after that, the World Health Organization declared smallpox officially eradicated on May 8, 1980.

Smallpox eradication proved that it's possible to eliminate a disease completely. But is it likely to happen again with other diseases, like COVID-19? Some scientists aren't so sure. As dangerous as smallpox was, it had a few characteristics that made it easier to eradicate than many other diseases. Smallpox, for instance, had no animal reservoir, meaning it could not circulate in animals and resurge in the human population at a later date. Additionally, a person who survived smallpox was guaranteed immunity from the disease thereafter – which is not the case for COVID-19.

In The Coming Plague, Japanese physician Isao Arita, who led the WHO's Smallpox Eradication Unit, admitted to routinely defying WHO orders, deploying teams to parts of the world without official approval and sometimes even vaccinating people against their will. "If we hadn't broken every single WHO rule many times over, we would have never defeated smallpox," Arita said. "Never."

Still, thanks to the life-saving technology of vaccines – and the tireless efforts of doctors and scientists across the globe – a once-lethal disease is now a thing of the past.

Sarah Watts

Sarah Watts is a health and science writer based in Chicago. Follow her on Twitter at @swattswrites.
