One of the oddest political hoaxes of recent times was Pizzagate, in which conspiracy theorists claimed that Hillary Clinton and her 2016 campaign chief ran a child sex ring from the basement of a Washington, DC, pizzeria.
Millions of believers spread the rumor on social media, abetted by Russian bots; one outraged netizen stormed the restaurant with an assault rifle and shot open what he took to be the dungeon door. (It actually led to a computer closet.) Pundits cited the imbroglio as evidence that Americans had lost the ability to tell fake news from the real thing, putting our democracy in peril.
Such fears, however, are nothing new. "For most of history, the concept of widespread credulity has been fundamental to our understanding of society," observes Hugo Mercier in Not Born Yesterday: The Science of Who We Trust and What We Believe (Princeton University Press, 2020). In the fifth century BCE, he points out, the historian Thucydides blamed Athens' defeat by Sparta on a demagogue who hoodwinked the public into supporting idiotic military strategies; Plato extended that argument to condemn democracy itself. Today, atheists and fundamentalists decry one another's gullibility, as do climate-change accepters and deniers. Leftists bemoan the masses' blind acceptance of the "dominant ideology," while conservatives accuse those who do revolt of being duped by cunning agitators.
What's changed, all sides agree, is the speed at which bamboozlement can propagate. In the digital age, it seems, a sucker is born every nanosecond.
The Case Against Credulity
Yet Mercier, a cognitive scientist at the Jean Nicod Institute in Paris, thinks we've got the problem backward. To fight disinformation more effectively, he suggests, humans need to stop believing in one thing above all: our own gullibility. "We don't credulously accept whatever we're told—even when those views are supported by the majority of the population, or by prestigious, charismatic individuals," he writes. "On the contrary, we are skilled at figuring out who to trust and what to believe, and, if anything, we're too hard rather than too easy to influence."
He bases those contentions on a growing body of research in neuropsychiatry, evolutionary psychology, and other fields. Humans, Mercier argues, are hardwired to balance openness with vigilance when assessing communicated information. To gauge a statement's accuracy, we instinctively test it from many angles, including: Does it jibe with what I already believe? Does the speaker share my interests? Has she demonstrated competence in this area? What's her reputation for trustworthiness? And, with more complex assertions: Does the argument make sense?
This process, Mercier says, enables us to learn much more from one another than do other animals, and to communicate in a far more complex way—key to our unparalleled adaptability. But it doesn't always save us from trusting liars or embracing demonstrably false beliefs. To better understand why, leapsmag spoke with the author.
How did you come to write Not Born Yesterday?
In 2010, I collaborated with the cognitive scientist Dan Sperber and some other colleagues on a paper called "Epistemic Vigilance," which laid out the argument that evolutionarily, it would make no sense for humans to be gullible. If you can be easily manipulated and influenced, you're going to be in major trouble. But as I talked to people, I kept encountering resistance. They'd tell me, "No, no, people are influenced by advertising, by political campaigns, by religious leaders." I started doing more research to see if I was wrong, and eventually I had enough to write a book.
With all the talk about "fake news" these days, the topic has gotten a lot more timely.
Yes. But on the whole, I'm skeptical that fake news matters very much. And all the energy we spend fighting it is energy not spent on other pursuits that may be better ways of improving our informational environment. The real challenge, I think, is not how to shut up people who say stupid things on the internet, but how to make it easier for those who say correct things to be convincing.
"History shows that the audience's state of mind and material conditions matter more than the leader's powers of persuasion."
You start the book with an anecdote about an encounter you had several years ago with a con artist who scammed you out of 20 euros. Why did you choose that anecdote?
Although I'm arguing that people aren't generally gullible, I'm not saying we're completely impervious to attempts at tricking us. It's just that we're much better than we think at resisting manipulation. And while there's a risk of trusting someone who doesn't deserve to be trusted, there's also a risk of not trusting someone who could have been trusted. You miss out on someone who could help you, or from whom you might have learned something—including figuring out who to trust.
You argue that in humans, vigilance and open-mindedness evolved hand-in-hand, leading to a set of cognitive mechanisms you call "open vigilance."
There's a common view that people start from a state of being gullible and easy to influence, and get better at rejecting information as they become smarter and more sophisticated. But that's not what really happens. It's much harder to get apes than humans to do anything they don't want to do, for example. And research suggests that over evolutionary time, the better our species became at telling what we should and shouldn't listen to, the more open to influence we became. Even small children have ways to evaluate what people tell them.
The most basic is what I call "plausibility checking": if you tell them you're 200 years old, they're going to find that highly suspicious. Kids pay attention to competence; if someone is an expert in the relevant field, they'll trust her more. They're likelier to trust someone who's nice to them. My colleagues and I have found that by age 2 ½, children can distinguish between very strong and very weak arguments. Obviously, these skills keep developing throughout your life.
But you've found that even the most forceful leaders—and their propaganda machines—have a hard time changing people's minds.
Throughout history, there's been this fear of demagogues leading whole countries into terrible decisions. In reality, these leaders are mostly good at feeling the crowd and figuring out what people want to hear. They're not really influencing [the masses]; they're surfing on pre-existing public opinion. We know from a recent study, for instance, that if you match cities in which Hitler gave campaign speeches in the late '20s through early '30s with similar cities in which he didn't give campaign speeches, there was no difference in vote share for the Nazis. Nazi propaganda managed to make Germans who were already anti-Semitic more likely to express their anti-Semitism or act on it. But Germans who were not already anti-Semitic were completely immune to the propaganda.
So why, in totalitarian regimes, do people seem so devoted to the ruler?
It's not a very complex psychology. In these regimes, the slightest show of discontent can be punished by death, or by you and your whole family being sent to a labor camp. That doesn't mean propaganda has no effect, but you can explain people's obedience without it.
What about cult leaders and religious extremists? Their followers seem willing to believe anything.
Prophets and preachers can inspire the kind of fervor that leads people to suicidal acts or doomed crusades. But history shows that the audience's state of mind and material conditions matter more than the leader's powers of persuasion. Only when people are ready for extreme actions can a charismatic figure provide the spark that lights the fire.
Once a religion becomes ubiquitous, the limits of its persuasive powers become clear. Every anthropologist knows that in societies that are nominally dominated by orthodox belief systems—whether Christian or Muslim or anything else—most people share a view of God, or the spirit, that's closer to what you find in societies that lack such religions. In the Middle Ages, for instance, you have records of priests complaining of how unruly the people are—how they spend the whole Mass chatting or gossiping, or go on pilgrimages mostly because of all the prostitutes and wine-drinking. They continue pagan practices. They resist attempts to make them pay tithes. It's very far from our image of how much people really bought the dominant religion.
"The mainstream media is extremely reliable. The scientific consensus is extremely reliable."
And what about all those wild rumors and conspiracy theories on social media? Don't those demonstrate widespread gullibility?
I think not, for two reasons. One is that most of these false beliefs tend to be held in a way that's not very deep. People may say Pizzagate is true, yet that belief doesn't really interact with the rest of their cognition or their behavior. If you really believe that children are being abused, then trying to free them is the moral and rational thing to do. But the only person who did that was the guy who took his assault weapon to the pizzeria. Most people just left one-star reviews of the restaurant.
The other reason is that most of these beliefs actually play some useful role for people. Before any ethnic massacre, for example, rumors circulate about atrocities having been committed by the targeted minority. But those beliefs aren't what's really driving the phenomenon. In the horrendous pogrom of Kishinev, Moldova, 100 years ago, you had these stories of blood libel—a child disappeared, typical stuff. And then what did the Christian inhabitants do? They raped the [Jewish] women, they pillaged the wine stores, they stole everything they could. They clearly wanted to get that stuff, and they made up something to justify it.
Where do skeptics like climate-change deniers and anti-vaxxers fit into the picture?
Most people in most countries accept that vaccination is good and that climate change is real and man-made. These ideas are deeply counter-intuitive, so the fact that scientists were able to get them across is quite fascinating. But the environment in which we live is vastly different from the one in which we evolved. There's a lot more information, which makes it harder to figure out who we can trust. The main effect is that we don't trust enough; we don't accept enough information. We also rely on shortcuts and heuristics—coarse cues of trustworthiness. There are people who abuse these cues. They may have a PhD or an MD, and they use those credentials to help them spread messages that are not true and not good. Mostly, they're affirming what people want to believe, but they may also be changing minds at the margins.
How can we improve people's ability to resist that kind of exploitation?
I wish I could tell you! That's literally my next project. Generally speaking, though, my advice is very vanilla. The mainstream media is extremely reliable. The scientific consensus is extremely reliable. If you trust those sources, you'll go wrong in a very few cases, but on the whole, they'll probably give you good results. Yet a lot of the problems that we attribute to people being stupid and irrational are not entirely their fault. If governments were less corrupt, if the pharmaceutical companies were irreproachable, these problems might not go away—but they would certainly be minimized.
In December 1958, on a vacation with his wife in Kenya, a 28-year-old British tea broker named Robin Cavendish became suddenly ill. Neither he nor his wife Diana knew it at the time, but Robin's illness would change the course of medical history forever.
Robin was rushed to a nearby hospital, where the medical staff delivered the crushing news: Robin had contracted polio, and the paralysis creeping up his body was almost certainly permanent. Because the infection had left him unable to breathe on his own, the doctors placed him on a ventilator through a tracheotomy in his neck – and, based on the life expectancy of such patients at the time, gave him only three months to live. Robin and Diana (who was pregnant with their first child, Jonathan) flew back to England so he could be admitted to a hospital. They mentally prepared to wait out Robin's final days.
But Robin did something unexpected when he returned to the UK – just one of many things that would astonish doctors over the next several years: He survived. Diana gave birth to Jonathan in February 1959 and continued to visit Robin regularly in the hospital with the baby. Despite doctors warning that he would soon succumb to his illness, Robin kept living.
After a year in the hospital, Diana suggested something radical: She wanted Robin to leave the hospital and live at home in South Oxfordshire for as long as he possibly could, with her as his nurse. At the time, this suggestion was unheard of. People like Robin who depended on machinery to keep them breathing had only ever lived inside hospital walls, as the prevailing belief was that the machinery needed to keep them alive was too complicated for laypeople to operate. But Diana and Robin were up for the challenges – and the risks. Because his ventilator ran on electricity, if the house were to unexpectedly lose power, Diana would either need to restore power quickly or hand-pump air into his lungs to keep him alive.
In an interview as an adult, Jonathan Cavendish reflected on his parents' decision to live outside the hospital on a ventilator: "My father's mantra was quality of life," he explained. "He could have stayed in the hospital, but he didn't think that was as good of a life as he could manage. He would rather be two minutes away from death and living a full life."
After a few years of living at home, however, Robin became tired of being confined to his bed. He longed to sit outside, to visit friends, to travel – but had no way of doing so without his ventilator. So in 1962 he collaborated with his friend Teddy Hall, a professor and engineer at Oxford University, on an entirely new invention: a battery-operated wheelchair prototype with a built-in ventilator. With it, Robin could venture outside the house – and soon the Cavendish family became famous for taking vacations, something that, by all accounts, had never been done before by someone who was ventilator-dependent. Robin and Hall also designed a van in which the wheelchair could be plugged in and powered during travel. Jonathan Cavendish later recalled a particular family vacation that nearly ended in disaster when the van broke down outside Barcelona, Spain:
"My poor old uncle [plugged] my father's chair into the wrong socket," Cavendish later recalled, causing the electricity to short. "There was fire and smoke, and both the van and the chair ground to a halt." Johnathan, who was eight or nine at the time, his mother, and his uncle took turns hand-pumping Robin's ventilator by the roadside for the next thirty-six hours, waiting for Professor Hall to arrive in town and repair the van. Rather than being panicked, the Cavendishes managed to turn the vigil into a party. Townspeople came to greet them, bringing food and music, and a local priest even stopped by to give his blessing.
Robin had become a pioneer, showing the world that a person with severe disabilities could still have mobility, access, and a fuller quality of life than anyone had imagined. His mission, along with Hall's, then became gifting this independence to others like himself. Robin and Hall raised money – first from the Ernest Kleinwort Charitable Trust, and then from the British Department of Health – to fund more ventilator chairs, which were then manufactured by Hall's company, Littlemore Scientific Engineering, and given to fellow patients who wanted to live full lives at home. Robin and Hall used themselves as guinea pigs, testing out different models of the chairs and collaborating with scientists to create other devices for those with disabilities. One invention, called the Possum, allowed paraplegics to control things like the telephone and television set with just a nod of the head. Robin's wheelchair was not only the first of its kind; it became the model for the respiratory wheelchairs that people still use today.
Robin went on to enjoy a long and happy life with his family at their house in South Oxfordshire, surrounded by friends who would later attest to his "down-to-earth" personality, his sense of humor, and his "irresistible" charm. When he died peacefully at his home in 1994 at age 64, he was considered the world's longest-surviving person to use a ventilator outside a hospital – breaking yet another barrier for what medical science thought was possible.
Sarah Watts is a health and science writer based in Chicago. Follow her on Twitter at @swattswrites.
In June 2012, Kirstie Ennis was six months into her second deployment to Afghanistan and had recently been promoted to sergeant. The helicopter gunner and seven others were three hours into a routine combat-resupply and troop-transport mission when their CH-53D helicopter went down hard.
Miraculously, all eight people onboard survived, but Ennis' injuries were many and severe: a torn rotator cuff, a torn labrum, crushed cervical discs, facial fractures, deep lacerations and traumatic brain injury. Her ankle was also severely fractured, but doctors managed to save her foot – for a while, at least.
In November 2015, after three years of constant pain and too many surgeries to count, Ennis relented. She elected to undergo a lower leg amputation but only after she completed the 1,000-mile, 72-day Walking with the Wounded journey across the UK.
On Veterans Day of that year, on the other side of the country, orthopedic surgeon Cato Laurencin announced a moonshot challenge he was setting out to achieve on behalf of wounded warriors like Ennis: the Hartford Engineering A Limb (HEAL) Project.
Laurencin, who is a University of Connecticut professor of chemical, materials and biomedical engineering, teamed up with experts in tissue bioengineering and regenerative medicine from Harvard, Columbia, UC Irvine and SASTRA University in India. Laurencin and his colleagues at the Connecticut Convergence Institute for Translation in Regenerative Engineering made a bold commitment to regenerate an entire limb within 15 years – by the year 2030.
Dr. Cato Laurencin pictured in his office at UConn.
Photo Credit: UConn
Regenerative Engineering -- A Whole New Field
Limb regeneration in humans has been a medical and scientific fascination for decades, with little to show for the effort. However, Laurencin believes that if we are to reach the next level of 21st century medical advances, this puzzle must be solved.
An estimated 185,000 people in the United States undergo upper- or lower-limb amputation every year. Despite significant advances in electromechanical prosthetics, these devices still cannot restore complex functions such as tactile sensation, normal gait and movement feedback. As far as Laurencin is concerned, the only clinical answer that makes sense is to regenerate a whole functional limb.
Laurencin feels earlier regeneration efforts were hampered by siloed research methods, with chemists, surgeons, and engineers all working separately. Success, he argues, requires a paradigm shift to a trans-disciplinary approach that brings together cutting-edge technologies from disparate fields such as biology, materials science, physics, chemistry and engineering.
As the only surgeon ever inducted into the National Academies of Science, Medicine, and Inventors, Laurencin is uniquely suited for the challenge. He is regarded as the founder of Regenerative Engineering, defined as the convergence of advanced materials science, stem cell science, physics, developmental biology and clinical translation for the regeneration of complex tissues and organ systems.
But none of this is achievable without early clinician participation across scientific fields to develop new technologies and a deeper understanding of how to harness the body's innate regenerative capabilities. "When I perform a surgical procedure or something is torn or needs to be repaired, I count on the body being involved in regenerating tissue," he says. "So, understanding how the body works to regenerate itself and harnessing that ability is an important factor for the regeneration process."
The Birth of the Vision
Laurencin's passion for regeneration began when he was a sports medicine fellow at Cornell University Medical Center in the early 1990s. There he saw a significant number of injuries to the anterior cruciate ligament (ACL), the major ligament that stabilizes the knee. Believing he could develop a better way to address those injuries by using biomaterials to regenerate the ligament, he sketched a preliminary design on a napkin one night over dinner. He has spent the 30 years since regenerating tissues, including the patented L-C ligament.
As chair of Orthopaedic Surgery at the University of Virginia during the peak of the wars in Iraq and Afghanistan, Laurencin treated military personnel who survived because of improved helmets, body armor and battlefield medicine but were left with more devastating injuries, including traumatic brain injuries and limb loss.
"I was so honored to care for them and I so admired their steadfast courage that I became determined to do something big for them," says Laurencin.
When he tells people about his plans to regrow a limb, he gets a lot of eye rolls, which he finds amusing but not discouraging. Growing bone cells was relatively new when he was first focused on regenerating bone in 1987 at MIT; in 2007 he was well on his way to regenerating ligaments at UVA when many still doubted that ligaments could even be reconstructed. He and his team have already regenerated torn rotator cuff tendons and ACL ligaments using a nano-textured fabric seeded with stem cells.
Even as a finalist for the $4 million NIH Pioneer Award for high-risk/high-reward research, he faced a skeptical scientific audience in 2014. "They said, 'Well what do you plan to do?' I said 'I plan to regenerate a whole limb in people.' There was a lot of incredulousness. They stared at me and asked a lot of questions. About three days later, I received probably the best score I've ever gotten on an NIH grant."
In the Thick of the Science
Humans are born with regenerative abilities--two-year-olds have regrown fingertips--but lose them with age. Salamanders are the only vertebrates that can regenerate lost body parts as adults; the axolotl, a rare Mexican salamander, can even grow extra limbs.
The axolotl is important as a model organism because it is a four-footed vertebrate with a body plan similar to ours. The mapping of the axolotl genome in 2018 deepened scientists' genetic understanding of the animal's evolution, development, and regeneration. And because axolotls are easy to breed in captivity, the HEAL team was able to study the amphibians closely and discover a new cell type that the researchers believe may shed light on how to mimic the regenerative process in humans.
"Whenever limb regeneration takes place in the salamander, there is a huge amount of something called heparan sulfate around that area," explains Laurencin. "We thought, 'What if this heparan sulfate is the key ingredient to allowing regeneration to take place?' We found these groups of cells that were interspersed in tissues during the time of regeneration that seemed to have connections to each other that expressed this heparan sulfate."
Called GRID cells (Groups that are Regenerative, Interspersed and Dendritic), they were also recently discovered in mice. While GRID cells don't support regeneration in mice as well as they do in salamanders, finding them in mammals at all was significant.
"If they're found in mice. we might be able to find these in humans in some form," Laurencin says. "We think maybe it will help us figure out regeneration or we can create cells that mimic what grid cells do and create an artificial grid cell."
What Comes Next?
Laurencin and his team have individually engineered every single tissue in the lower limb, including bone, cartilage, ligament, skin, nerve and blood vessels. Regenerating joints and joint tissue is the next big milestone, which Laurencin sees as essential to regenerating a limb that functions and performs the way he envisions.
"Using stem cells and amnion tissue, we can regenerate joints that are damaged, and have severe arthritis," he says. "We're making progress on all fronts, and making discoveries we believe are going to be helping people along the way."
That focus and advancement are vital to Ennis. After laboring over the decision to have her leg amputated below the knee, she contracted MRSA two weeks after surgery. In less than a month, she went from a below-the-knee amputee to a through-the-knee amputee to an above-the-knee amputee.
"A below-the-knee amputation is night-and-day from above-the-knee," she said. "You have to relearn everything. You're basically a toddler."
Kirstie Ennis pictured in July 2020.
Photo Credit: Ennis' Instagram
The clock is ticking on the timeline Laurencin set for himself. Nine years might seem like forever if you're doing time, but it can feel fleeting when you're trying to create something that's never been done before. Laurencin isn't worried, though. He's convinced time is on his side.
"Every week, I receive an email or a call from someone, maybe a mother whose child has lost a finger or I'm in communication with a disabled American veteran who wants to know how the progress is going. That energizes me to continue to work hard to try to create these sorts of solutions because we're talking about people and their lives."
He devotes about 60 hours a week to the project, and the roughly 100 students, faculty and staff who make up the HEAL team at the Convergence Institute seem acutely aware of what's at stake and appear equally dedicated.
"We're in the thick of the science in terms of making this happen," says Laurencin. "We've moved from making the impossible possible to making the possible a reality. That's what science is all about."