
Misinformation fails to sway voters in 2020 election, study finds

[Apr. 23, 2023: JD Shavit, The Brighter Side of News]

People have become more adept at detecting misinformation online in the lead-up to the 2020 election. (CREDIT: Kristin Lord / Katya Vogt)

A recent study by Stanford researchers, published in the journal Nature Human Behaviour, suggests that people became more adept at detecting misinformation online in the lead-up to the 2020 election.

The study found a decline in clicks on unreliable websites, indicating increased awareness of misleading information. By contrast, prior research showed that during the 2016 U.S. election, 44.3 percent of Americans visited websites that repeatedly spread false or misleading information.

Stanford scholars observed a significant drop in this number during the 2020 election, with only 26.2 percent of Americans visiting such websites. This implies that people have become more discerning in their online habits and are better equipped to identify fake news, a positive trend for the democratic process.


Despite the positive implications of these findings, the scholars remain cautious when interpreting the results of the study. As stated in the paper, the potential consequences of exposure to misinformation, even among a smaller number of individuals, should not be underestimated.

Based on their research, the scholars estimated that approximately 68 million Americans made a total of 1.5 billion visits to untrustworthy websites during the 2020 election.


Jeff Hancock, a senior author of the study and a professor of communication in Stanford's School of Humanities and Sciences, emphasized that while there was a significant decrease in the number of people exposed to misinformation on the web, the issue of misinformation still persists, particularly among older adults and diverse communities. He added that despite the progress made, there is still much work to be done in addressing this serious problem within the information ecosystem.

According to the scholars' research, individuals who frequent websites spreading false information are typically older and have a stronger political leaning to the right, which is in line with the 2016 findings. However, they now visit fewer untrustworthy websites and spend less time on them compared to their 2016 behavior.


Study Details

The study is an extension of the earlier research conducted by Andrew Guess from Princeton University. Guess's research in 2016 identified approximately 490 websites known for spreading disinformation, including pages previously flagged by reputable disinformation researchers, such as Matthew Gentzkow, a prominent Stanford economist.

The researchers, led by Hancock, alongside Stanford PhD students Ryan Moore and Ross Dahlke, expanded the list of unreliable domains by adding 1,240 more from NewsGuard.

This organization is responsible for assessing the credibility of news and information websites through manual reviews conducted by experienced journalists and editors. NewsGuard evaluates websites based on several criteria, such as their tendency to publish false content, issue corrections for errors in their reporting, and distinguish between news and opinion.

Next, the Stanford team recruited a representative sample of 1,151 American adults using the polling firm YouGov. The participants were asked to complete an online survey and install a browser plugin, which allowed the researchers to passively monitor their web activity between October 2, 2020, and November 9, 2020. The researchers collected a total of 7.5 million website visits across both desktop and mobile devices used by the participants.


Who read false news online, and how did they find it?

Facebook referred only 5.6 percent of visits to untrustworthy websites in 2020, a significant drop from the 15.1 percent recorded in 2016. This decrease in referrals can be attributed to the social media giant's efforts to tackle the problem of false news on its platform.

Referrers to untrustworthy news websites and other sources. (CREDIT: Nature Human Behaviour)

“The drop in visits referred by Facebook may reflect investment in trust and safety efforts to decrease the prevalence of misinformation on their platform, such as flagging, content moderation, and user education, which they and other platforms weren’t doing as much of in 2016,” said Moore.

According to the researchers, while the average number of visits to misinformation websites has decreased from 32 in 2016 to 23 in 2020, there are still individuals who visit such sites excessively. Dahlke stated that there are some people who visit hundreds of these websites, and further research is necessary to determine how this affects their beliefs and actions.
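The two estimates reported in this article are mutually consistent: dividing the estimated 1.5 billion total visits to untrustworthy websites by the roughly 68 million Americans who made them gives an average in the low twenties, in line with the reported 2020 average of 23 visits per person. A quick arithmetic check (figures from the article; the calculation itself is just a sanity check, not part of the study's methods):

```python
# Sanity-check the article's figures: total visits divided by the
# number of people visiting should roughly match the reported
# average number of visits per person in 2020.
total_visits = 1.5e9   # estimated visits to untrustworthy sites, 2020
visitors = 68e6        # estimated Americans who made those visits

avg_visits = total_visits / visitors
print(round(avg_visits, 1))  # ≈ 22.1, close to the reported average of 23
```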


The study also found that older adults were twice as likely to visit misinformation websites as individuals aged 18-29. Although the percentage of Americans aged 65 and above exposed to misinformation decreased from 56.2% in 2016 to 37.4% in 2020, older adults continue to consume misinformation at higher rates than younger adults.

Visits to untrustworthy websites by media slant decile. (CREDIT: Nature Human Behaviour)

Misinformation peddlers target older adults because they tend to be wealthier and more politically active than younger generations. This makes them attractive targets for bad actors seeking to profit or to influence election results, according to Hancock.


Misinformation swiftly evolves and mutates

According to scholars, misinformation is a harmful and swiftly evolving phenomenon. In their paper, they suggest that although their findings could be seen as evidence of some improvement in the issue of online misinformation, it could also indicate a change in its nature.

The scholars' research only focused on web browsing behavior, which means that false information could have migrated to other social media platforms or encrypted messaging services like WhatsApp or Signal. Furthermore, consuming fake news is not limited to clicks; individuals may passively encounter unreliable information while scrolling through their news feeds, such as through a meme or headline. All of these factors make studying the topic of misinformation challenging.

Looking ahead to the 2024 election season

The research trio of Hancock, Moore, and Dahlke is already contemplating the implications of their findings for the propagation of misinformation during the upcoming 2024 general election.

They are particularly concerned about the susceptibility of elderly people to fabricated news, a problem that Hancock's Social Media Lab has been tackling with the backing of the Stanford Impact Lab program. In collaboration with the nonprofit news organization Poynter, Hancock and Moore created a digital media literacy intervention in 2020 to assist senior citizens in detecting fake news on the internet.


In addition, the trio is apprehensive about the influence of misinformation in areas that are deprived of resources, such as non-English speaking communities. This issue was recently addressed in a paper co-written by Hancock and Moore, along with Angela Y. Lee, a PhD candidate at Stanford.



Note: Materials provided above by The Brighter Side of News. Content may be edited for style and length.





