This article is part of the magazine, "The Future of Science In America: The Election Issue," co-published by LeapsMag, the Aspen Institute Science & Society Program, and GOOD.
Whenever you hear something repeated, it feels more true. In other words, repetition makes any statement seem more accurate. So anything you hear again will resonate more each time it's said.
Do you see what I did there? Each of the three sentences above conveyed the same message. Yet each time you read the next sentence, it felt more and more true. Cognitive neuroscientists and behavioral economists like myself call this the "illusory truth effect."
Go back and recall your experience reading the first sentence. It probably felt strange and disconcerting, perhaps with a note of resistance, as in "I don't believe things more if they're repeated!"
Reading the second sentence did not inspire such a strong reaction. Your reaction to the third sentence was tame by comparison.
Why? Because of a phenomenon called "cognitive fluency," meaning how easily we process information. Much of our vulnerability to deception in all areas of life—including to fake news and misinformation—revolves around cognitive fluency in one way or another. And unfortunately, such misinformation can swing major elections.
The news sources that you consume can kill you -- or save you. That's the fundamental insight of a powerful new study about the impact of watching either Sean Hannity's news show Hannity or Tucker Carlson's Tucker Carlson Tonight. One saved lives and the other resulted in more deaths, due to how each host covered COVID-19.
This research illustrates the danger of falling for health-related misinformation due to judgment errors known as cognitive biases. These dangerous mental blind spots stem from the fact that our gut reactions evolved for the ancient savanna environment, not the modern world. Yet the vast majority of advice on decision making is to "go with your gut," despite the disastrous outcomes that doing so produces. These blind spots affect all areas of our life, from health to politics and even shopping, as a survey by a comparison purchasing website reveals. We need to be wary of cognitive biases in order to survive and thrive during this pandemic.
Sean Hannity vs. Tucker Carlson Coverage of COVID-19
Hannity and Tucker Carlson Tonight are the top two U.S. cable news shows, both on Fox News. Hannity and Carlson share very similar ideological profiles and have similar viewership demographics: older adults who lean conservative.
One notable difference, however, relates to how both approached coverage of COVID-19, especially in February and early March 2020. Researchers at the Becker Friedman Institute for Economics at the University of Chicago decided to study the health consequences of this difference.
Carlson took the threat of COVID-19 seriously early on, more so than most media figures on the right or left. Already on January 28, way earlier than most, Carlson spent a significant part of his show highlighting the serious dangers of a global pandemic. He continued his warnings throughout February. On February 25, Carlson told his viewers: "In this country, more than a million would die."
By contrast, Hannity was one of the Fox News hosts who took a more extreme position in downplaying COVID-19, frequently comparing it to the flu. On February 27, he said, "And today, thankfully, zero people in the United States of America have died from the coronavirus. Zero. Now, let's put this in perspective. In 2017, 61,000 people in this country died from influenza, the flu. Common flu." Moreover, Hannity explicitly politicized COVID-19, claiming that "[Democrats] are now using the natural fear of a virus as a political weapon. And we have all the evidence to prove it, a shameful politicizing, weaponizing of, yes, the coronavirus."
However, after President Donald Trump declared COVID-19 a national emergency in mid-March, Hannity -- and other Fox News hosts -- changed their tune to align more with Carlson's, acknowledging the serious dangers of the virus.
The Behavior and Health Consequences
The Becker Friedman Institute researchers investigated whether the difference in coverage impacted behaviors. They conducted a nationally representative survey of over 1,000 people who watch Fox News at least once a week, evaluating both viewership and behavior changes in response to the pandemic, such as social distancing and improving hygiene.
Next, the study compared people's behavior changes to viewing patterns. The researchers found that "viewers of Hannity changed their behavior five days later than viewers of other shows, while viewers of Tucker Carlson Tonight changed their behavior three days earlier than viewers of other shows." The difference was highly statistically significant; in other words, the probability that it arose by chance was negligible.
Did these behavior changes lead to grave consequences? Indeed.
The paper compared the popularity of each show in specific counties to data on COVID-19 infections and deaths. Controlling for a wide variety of potential confounding variables, the study found that areas of the country where Hannity is more popular had more cases and deaths two weeks later, the time that it would take for the virus to start manifesting itself. By March 21st, the researchers found, there were 11 percent more deaths among Hannity's viewership than among Carlson's, again with a high degree of statistical significance.
The study's authors concluded: "Our findings indicate that provision of misinformation in the early stages of a pandemic can have important consequences for health outcomes."
Cognitive Biases and COVID-19 Misinformation
It's critically important to recognize that the study's authors did not seek to score any ideological points, given the broadly similar ideological profiles of the two hosts. The researchers simply explored the impact of accurate and inaccurate information about COVID-19 on the viewership. Clearly, the false information had deadly consequences.
Such outcomes stem from the excessive trust that our minds tend to give those we see as having authority, even if they don't possess expertise in the relevant subject area -- such as media figures that we follow. This excessive trust -- and consequent obedience -- is called the "authority bias."
A related mental pattern is called "emotional contagion," in which we are unwittingly infected with the emotions of those we see as leaders. Emotions can motivate action even in the absence of formal authority, and are particularly important for those with informal authority, including thought leaders like Carlson and Hannity.
Thus, Hannity telling his audience that Democrats were using anxiety about the virus as a political weapon led his viewers to reject fear of COVID-19 -- even though that fear, and the behavioral changes it prompts, was the appropriate response. Carlson's emphasis on the deadly nature of the illness motivated his audience to take appropriate precautions.
Authority bias and emotional contagion facilitate the spread of misinformation and its dangers, at least when we don't take the steps necessary to figure out the facts. Such steps can range from following best fact-checking practices to getting your information from news sources that commit publicly to being held accountable for truthfulness. Remember, the more important and impactful such information may be for your life, the more important it is to take the time to evaluate it accurately to help you make the best decisions.
Today's growing distrust of science is not an academic problem. It can be a matter of life and death.
Take, for example, the tragic incident in 2016 when at least 10 U.S. children died and over 400 were sickened after being given homeopathic teething medicine laced with a poisonous herb, deadly nightshade. Sold by CVS, Walgreens, and other major American pharmacies, the pills contained this poison based on the principle of homeopathy, an alternative-medicine practice that treats medical conditions with heavily diluted doses of substances that, at full strength, produce the symptoms of the disease.
Such "alternative medicines" take advantage of the lack of government regulation and people's increasing hostility toward science. Polling shows that the number of people who believe that science has "made life more difficult" increased by 50 percent from 2009 to 2015. According to a 2017 survey, only 35 percent of respondents have "a lot" of trust in scientists; the number of people who do "not at all" trust scientists increased by over 50 percent from a similar poll conducted in December 2013.
Children dying from deadly nightshade is only one consequence of this crisis of trust. For another example, consider the false claim that vaccines cause autism. This belief has spread widely across the US, and led to a host of problems. For instance, measles was practically eliminated in the US by 2000. However, in recent years outbreaks of measles have been on the rise, driven by parents failing to vaccinate their children in a number of communities.
The Internet Is for… Misinformation
The rise of the Internet, and more recently social media, is key to explaining the declining public confidence in science.
Before the Internet, the information accessible to the general public about any given topic usually came from experts. For instance, researchers on autism were invited to talk on mainstream media, they wrote encyclopedia articles, and they authored books distributed by large publishers.
The Internet has enabled anyone to be a publisher of content, connecting people around the world with any and all sources of information. On the one hand, this freedom is empowering and liberating, with Wikipedia a great example of a highly curated and accurate source on the vast majority of subjects. On the other, anyone can publish a blog piece making false claims about links between vaccines and autism or the effectiveness of homeopathic medicine. If they are skilled at search engine optimization, or have money to invest in advertising, they can get their message spread widely. Russia has done so extensively to influence elections outside of its borders, whether in the E.U. or the U.S.
Unfortunately, research shows that people lack the skills to differentiate misinformation from accurate information. This skills gap has clear real-world effects: U.S. adults believed 75 percent of fake news stories about the 2016 U.S. Presidential election. And the more often someone sees a piece of misinformation, the more likely they are to believe it.
Blogs with falsehoods are bad enough, but the rise of social media has made the situation even worse. Most people re-share news stories without reading the actual article, judging the quality of the story by the headline and image alone. No wonder research has indicated that misinformation spreads as much as 10 times faster and further on social media than true information. After all, creators of fake news are free to devise the most appealing headline and image, while credible sources of information have to stick to factual headlines and images.
To make matters worse, we all suffer from a series of thinking errors such as the confirmation bias -- our tendency to seek out and interpret information in ways that conform to our intuitions and preferences, as opposed to the facts. Combined with the Internet's power to amplify, these inherent thinking errors have exploded the prevalence of misinformation.
So it's no wonder we see troubling gaps between what scientists and the public believe about issues like climate change, evolution, genetically modified organisms, and vaccination.
What Can We Do?
Fortunately, there are proactive steps we can take to address the crisis of trust in science and academia. The Pro-Truth Pledge, founded by a group of behavioral science experts (including myself) and concerned citizens, calls on public figures, organizations, and private citizens to commit to 12 behaviors listed on the pledge website that research in behavioral science shows correlate with truthfulness.
Signers are held accountable through a crowdsourced reporting and evaluation mechanism while getting reputational rewards because of their commitment. The scientific consensus serves as a key measure of credibility, and the pledge encourages pledge-takers to recognize the opinions of experts -- especially scientists -- as more likely to be true when the facts are disputed.
Launched in December 2016, the pledge has gained surprising traction. Over 6,200 private citizens have taken the pledge. So have more than 500 politicians, including members of U.S. state legislatures Eric Nelson (PA), James White (TX), and Ogden Driskell (WY), and national politicians such as members of U.S. Congress Beto O'Rourke (TX), Matt Cartwright (PA), and Marcia Fudge (OH). Over 700 other public figures, such as globally known public intellectuals Peter Singer, Steven Pinker, Michael Shermer, and Jonathan Haidt, have taken the pledge, as well as 70 organizations such as Media Bias/Fact Check, Fugitive Watch, Earth Organization for Sustainability, and One America Movement.
The pledge is effective in changing behaviors. Michael Smith, a candidate for Congress who took the Pro-Truth Pledge, later posted on his Facebook wall a screenshot of a tweet by Donald Trump criticizing minority and disabled children. After a commenter pointed out that the tweet was fake, he searched Trump's feed and could not find the original. While Trump may have deleted it, Smith edited his own Facebook post to say, "Due to a Truth Pledge I have taken, I have to say I have not been able to verify this post." He indicated that he would be more careful with future postings.
U.S. Army veteran and pledge-taker John Kirbow described how the pledge "really does seem to change one's habits," helping push him both to correct his own mistakes with an "attitude of humility and skepticism, and of honesty and moral sincerity," and also to encourage "friends and peers to do so as well."
His experience is confirmed by research on the pledge. Two studies at Ohio State University demonstrated, with strong statistical significance, that the pledge changes the behavior of pledge-takers to be more truthful.
Taking the pledge yourself, and encouraging people you know and your elected representatives to do the same, is an easy and effective way to fight misinformation and to promote a culture that values the truth.