The Factual’s media literacy scholarship was open to applicants worldwide in Spring 2021, and The Factual received over 150 submissions for the competition. The essay prompt was: “How did misinformation impact our lives in 2020? What steps can we take to ensure we get credible news?”

Claire Woodcock’s essay was chosen as an honorable mention. Her essay appears below:

In the first scene of 2002’s post-apocalyptic zombie film 28 Days Later, we see a chimpanzee kept in a primate research lab undergoing forced exposure to a looped reel of violent news clips. Looking back, the premise of Danny Boyle’s canonical horror film about a human-made “Rage virus” that implodes society doesn’t feel all that far off from the virulent spread of misinformation and the internalization of fear and uncertainty that broke America’s rage-o-meter in 2020.

Even before the COVID-19 pandemic reached American shores, many would say the country was already in the midst of an identity crisis caused by polarization (the effects perhaps muted by rising antidepressant use, though correlation is not causation). A 2019 Gallup report states that one in five Americans felt anger “a lot” in 2018, an increase from 17% in prior years. Declining trust in the federal government, healthcare systems, law enforcement, corporate structures, mass media, and each other was already plaguing the nation.

Though we are close, but not that close, biological relatives of Pan troglodytes, the outbreak in Boyle’s zombie flick was an allegory for pent-up human rage. In 2020, we watched on our tablets and smartphones as a life-threatening disease took millions of lives in what felt like real time. Isolated, afraid, and unsure whom to trust, we watched an airborne and algorithmic contagion infect our society with misinformation and disinformation, and the threat lives on today.

It’s no wonder Americans are so angry. 

With mis/disinformation storming our online lives, it’s essential to consider where the virus comes from. Research this year shows that 65% of the misinformation about vaccines circulating on social media platforms originates from just a dozen individuals. These superspreaders microtarget vulnerable audiences through bots and trolls to spread misinformation and disinformation, hiding grandiose but baseless claims among breadcrumbs of truth.

Susceptible individuals, who on any given day could be any of us, may then share the content with their inner circles. By design, algorithms are trained to rely on preexisting cognitive biases. Out of a deep-seated survival instinct, individuals prefer to receive information from people with similar beliefs. Humans are also susceptible to the misinformation effect, in which an individual’s memory of an event grows less accurate over time, distorted by information encountered after the fact.

As the Brookings Institution explains, social media companies rely on deep learning algorithms that have learned that users keep using the platform when content with greater prior engagement is prioritized. A 2015 study measuring “emotional contagion” on social media found that individuals exposed to negative content are more likely to share negative content with their followers, while those exposed to positive content are more likely to share positive content. Cognitive bias paired with algorithmic manipulation may thus lead to the spread of mis/disinformation.

It’s probable that people are more likely to share negative posts because of what psychologists call “negativity bias,” or our tendency to crave and remember negative content. As a BBC article about the human tendency to gravitate toward bad news in the media explains, “we’ve evolved to react quickly to potential threats,” since bad news could “signal that we need to change what we’re doing to avoid danger.” Between the number of negative news reports we saw in 2020 and the spread of negative content on social media, I suspect any onlooker would be fearful.

But it’s when fear turns to anger that irrationality may kick in, and the direct correlation between anger and the spread of misinformation, once seen, becomes difficult to unsee. A recent study published in Harvard Kennedy School’s Misinformation Review reveals that “angry individuals are less likely to search for information” about COVID-19 and are likely to favor “information that bolsters their beliefs and are against information that undermines their views.”

We have what the World Health Organization (WHO) calls an “infodemic.” If our society does not take steps to regulate and educate about the harmful effects of the malevolent spread of mis/disinformation, the consequences for a free press and the democratic process may not be zombie-flick grotesque, but they will be catastrophic. A staunch cynical optimist, I trust that technology companies, elected officials, and the Q-trolls following my Instagram account will take actionable steps toward a society that values healthy debate guided by credible news.

Social media entities must offer misinformation researchers access to complete data sets, and, as a society, we must call for transparent explanations of how social media algorithms work. Commentators for the Misinformation Review write that “scientists are often left working with partial or biased data and must rush to archive relevant data as soon as it appears on the platforms, before it is suddenly and permanently removed by deplatforming operations.”

Legislators must look to set a federal precedent for holding social media companies accountable for the role their platforms’ algorithms play in the spread of mis/disinformation. The U.S. Senate can agree to discuss more bills aimed at holding social media companies liable for inaction. Our elected officials can advocate for bringing critical literacies for the digital age into K-12 classrooms and preserve funding for educational institutions such as public universities and libraries.

Finally, we must support the nonprofit news organizations providing open access to information, knowledge, and culture. New York University scholar Rodney Benson found that subscriber-funded news perpetuates a socio-economic divide between audiences, functioning as a gatekeeper of information and knowledge. “Even if subscriptions contribute to higher quality news, if that news fails to reach a broader audience, it’s not really a solution to the civic crisis of an uninformed, often misinformed and distrustful citizenry,” Benson writes.

America is an angry nation, and in 2020, an airborne and algorithmic contagion infected our society. It was a one-two blow to democracy that broke America’s rage-o-meter. Not all horror movies conclude with a cheap jump scare. Let’s now resolve to fix what’s broken.

Published by Claire Woodcock

Claire Woodcock is an M.A. candidate in Media & Public Engagement at the University of Colorado Boulder. She received an honorable mention in the 2021 Factual media literacy scholarship program.