This is a guest post by Marielle DeVos from the Institute for Youth in Policy. The original version can be found here.
When it comes to politics, social media heavily segregates different user groups by creating an echo chamber effect among its users. Not only does this segregation reinforce bias, but it can also lead those within a particular echo chamber to shift their positions further toward the extreme as they become more isolated from external information. This phenomenon strengthens ideologies and creates a sense of “us vs. them,” pitting one ideological group against the other and exacerbating the political divide within America.
Group polarization theory describes echo chambers as mechanisms which reinforce existing group opinions and, by extension, shift the entire group’s ideology to the extreme (Cinelli et al. 2021). When the group is confronted with information from outside the shared narrative, it seeks information from within to counter the contradicting information and strengthen its ideology. This group mentality exists on both sides of the political aisle. In 2020, 89% of Trump supporters believed Biden’s election would lead to lasting harm to the United States; similarly, 90% of Biden supporters believed Trump’s re-election would lead to lasting harm to the country (Dimock and Wike 2020). These statistics are an example of the polarization and bias that has been amplified by social media. Conservative users are surrounded by conservative content and liberals by liberal content. When confronted with opposing views, they interpret the information to suit their narrative rather than modifying their position. Over time, this causes users’ perceptions to shift further from center (Cinelli et al. 2021).
Social media sites foster confirmation bias because of their basic function. Regardless of the specific algorithm, social media sites like Facebook, Reddit, Twitter, Instagram, and even YouTube serve the same basic function: to connect groups of like-minded users based on shared content preferences. This process of grouping users is not inherently problematic; however, when it comes to politics, it favors the formation of groups with common beliefs. This system isolates the members of these groups from outside influence because their feed is flooded with content that appeals to their group, thus reinforcing their shared narrative. This phenomenon is most obviously seen with Facebook and Reddit; Facebook groups and subreddits are notorious for fostering conspiracy theories and creating a “group vs. group” mentality that further fuels political divide and bias (Cinelli et al. 2021).
This behavior is amplified because of how social media sites are built. Search engine optimization and recommendation algorithms limit “selection processes” because they constantly suggest content similar to what the user has already shown a preference for (Cinelli et al. 2021). Platforms suggest groups and content to users based on observed preferences, so if a user shows a preference for conservative news sources, algorithms will suggest more conservative accounts and groups. This moves the “range” of content a user is exposed to farther from the center. If a user shows a preference for liberal news sources or content, the range of content they are shown will shift farther to the left as the algorithm observes their preferences. Users who get news primarily from Facebook are especially susceptible because Facebook’s algorithm segregates its users by preference more than other platforms like Twitter or Reddit, so users on Facebook are exposed to even less diverse content (Cinelli et al. 2021).
It must be noted that some studies have found social media useful in combating political bias because of the sheer amount and variety of information available. However, this argument is only a half-truth. While social media sites can combat the polarization caused by echo chambers, whether they do often comes down to two factors: the specific platform and the specific conditions. Some platforms, such as Instagram, have algorithms that isolate their users less than others, such as Facebook.
Another factor that plays into this is the conditions within the echo chamber. A study in Scientific American found that when groups of Democrats and Republicans were isolated into social media “bubbles” and left to discuss issues amongst themselves, their opinions moved closer to the center rather than farther from it (Centola 2020). However, as the author of the study acknowledged, this setup does not mimic the conditions that naturally surround users on social media. Outside of a controlled study such as Centola’s, echo chambers do not equate to virtual discussion groups. Instead, they are affected by, and centered around, members and figures who influence the group. For example, the QAnon conspiracy theory and the members of its many Facebook groups are influenced by the anonymous “Q,” who claims to know deep state secrets.
A less extreme example can be seen with prominent media and political figures such as Ben Shapiro or Representative Alexandria Ocasio-Cortez. Both hold significant media “clout” and are prominent figures on the right and left, respectively. In a controlled study such as Centola’s, neither Ben Shapiro nor Representative Ocasio-Cortez would have an effect because, in a controlled study, the echo chamber equates to a virtual discussion group composed of peers. Under “normal” conditions, however, echo chambers center around figures with political sway and are influenced by their ideology. This moves users closer to the ideology of these influencers, which in turn introduces them to, and surrounds them with, influencers with increasingly extreme ideologies.
When it comes to the content that users interact with, a study by Sounman Hong and Sun Hyoung Kim published in Government Information Quarterly found that politicians with more extreme ideologies garnered more followers on Twitter (Hong and Kim 2016). The beliefs of these politicians, whether left or right, reinforce the views of the social media groups that already support them, thus reinforcing the perceived realities within these groups and contributing to the movement towards more extreme positions.
Because of this segregation of users and movement toward extreme viewpoints, polarization extends beyond partisan politics. Americans on different sides of the political spectrum disagree on more than just the issues — they disagree on basic reality. Even though accurate and unbiased information is just one Google search away, groups such as Democrats and Republicans have fundamentally different perceptions of the reality surrounding the issues they disagree on. According to a Pew Research study, 73% of Americans say that Democrats and Republicans cannot even agree on basic facts (Pew Research Center 2019). The echo chambers created by social media are largely to blame for this. Subreddits, Twitter feeds, and Facebook groups provide their members with only the “facts” that align with their beliefs. This cherry-picking of information does not even rely on confirmation bias; it pits one group’s version of reality against the other’s, causing both to seek out other people and information to support their positions.
Another contributor to this divide is the perceived censorship of conservative viewpoints by social media companies. Among Republicans, 90% believe that social media sites likely censor political viewpoints (Vogels et al. 2020). The majority of conservative Republicans surveyed had no confidence in social media companies’ ability to determine which posts should be labeled as misleading. And when it comes to flagging posts from elected officials as misleading, 71% of Republicans disapprove, while only 25% of Democrats disapprove (Vogels et al. 2020). This perception of censorship is a side effect of social media echo chambers. Rather than reevaluate the information that was labeled as misleading or the source from which it came, conservatives find it easier to blame censorship, because all the information they interact with creates a heavy bias and a skewed perception of reality. While these statistics are specific to Republicans, this behavior is not unique to Republicans or conservative Americans; it is simply a product of the effect social media has on its users.
A study published in the Proceedings of the National Academy of Sciences of the United States of America found that when Democrats and Republicans followed bots that retweeted messages by elected officials and leaders with opposing views, Republicans expressed more extreme conservative views after following a liberal bot, and Democrats expressed slightly more liberal views after following the conservative bot (Bail et al. 2018). Even though the bots shared information that contradicted their beliefs, the participants strengthened their viewpoints rather than changing them or moving closer to the center. Once again, this phenomenon has to do with the impact of social media echo chambers on political beliefs. Users have different perceptions of reality based on their ideology and are used to being surrounded by other users and content that align with that reality. When confronted with positions or information that contradicts it, they react by retreating further into their group and its perceived reality.
Beyond the effect of group polarization on politics, most Americans (64%) believe social media has a negative effect on the country. The three most commonly cited reasons for this were the spread of misinformation (28%), harassment and extremism (16%), and not knowing what to believe (11%) (Auxier 2020). The spread of misinformation — on both the left and right — through social media content such as misleading infographics, tweets, and Facebook posts sensationalizes and misconstrues facts. This misinformation often enters echo chambers and quickly becomes accepted as fact within those groups, thus contributing to polarization and bias. Extremism, as mentioned earlier, is part of group polarization theory. As groups absorb misinformation and that misinformation becomes fact, their ideology shifts more and more toward the extreme. An example of this can be seen with the popular conspiracy theories that have come out of Facebook groups and subreddits, such as QAnon and the belief that the pandemic was planned by the government. All of this makes it difficult for users to know what information is truly factual and what has been misconstrued; so even though accurate information is readily available, users are often wary of seeking it out and instead choose to believe whatever their group’s ideology says is true (Cinelli et al. 2021).
Political polarization in America has only continued to increase as Americans have become more active on social media platforms. These sites function by connecting users based on their observed content preferences and, as a result, users are grouped together based on their political preferences. This practice of limiting selection processes has amplified bias and partisanship within the political landscape, and has led to an even deeper divide between American citizens. In order to combat the effects of social media on political bias, the most practical method, despite being controversial, is to fact-check information that is shared. Platforms like Instagram flag posts that are misleading so that users know the information they consume may not be accurate. In a perfect world, platforms like Facebook would flag and remove extremist users and groups; however, as has been seen, it is difficult to control or regulate a company as large as Facebook.
Ultimately, it is up to individuals to be intentional about how they interact on social media and to seek out new information and research beyond what they see or hear there.