Literature Review: The Impact of Social
Media Algorithms on News Consumption
Introduction
The advent of social media has revolutionized the way news is
consumed. Platforms like Facebook, Twitter, and Instagram employ
sophisticated algorithms to personalize content for users. These
algorithms determine what news stories users see, shaping their
understanding of current events. This literature review examines the
impact of these algorithms on news consumption, focusing on aspects
such as information diversity, echo chambers, misinformation, and user
engagement.
Information Diversity
Reduction in News Diversity
Several studies indicate that social media algorithms often reduce the
diversity of news that users are exposed to. Pariser (2011) introduced
the concept of the "filter bubble," where algorithms prioritize content
that aligns with users' existing beliefs and preferences. This
personalization can lead to a narrower range of viewpoints being
presented, potentially limiting users' exposure to diverse perspectives
(Pariser, 2011).
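To make this mechanism concrete, the following minimal Python sketch (purely illustrative; the field names, weights, and function are hypothetical rather than any platform's actual system) ranks stories by how closely their topics match a user's learned interests, so preference-aligned items crowd out everything else.

def rank_by_preference(stories, user_interests, top_k=5):
    """Return the top_k stories whose topics best match the user's interests.

    stories: list of dicts such as {"title": ..., "topics": {"politics", ...}}
    user_interests: dict mapping topic -> weight inferred from past clicks
    """
    def score(story):
        # Sum the user's weights over every topic the story covers.
        return sum(user_interests.get(topic, 0.0) for topic in story["topics"])

    # Stories on unfamiliar topics score near zero and rarely surface.
    return sorted(stories, key=score, reverse=True)[:top_k]

Because stories on topics a user has never engaged with score near zero under such a rule, repeated application narrows the range of viewpoints shown, which is the narrowing the filter-bubble critique describes.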
Counterarguments
However, some research suggests that social media can also introduce
users to a broader array of news sources. Bakshy et al. (2015) found
that while Facebook's ranking algorithm does tend to favor content
aligned with users' preferences, users are still regularly exposed to
ideologically cross-cutting news, and individual choices limit that
exposure more than the algorithm does. This
suggests that the impact on information diversity may vary depending
on the platform and user behavior.
Echo Chambers and Polarization
Formation of Echo Chambers
Echo chambers, where users are predominantly exposed to information
that reinforces their existing beliefs, are a significant concern associated
with social media algorithms. Sunstein (2001) argued that this
phenomenon can exacerbate political polarization, as users become more
entrenched in their views. Algorithms that prioritize engagement tend
to amplify this effect: controversial or emotionally charged content
often generates more interaction and is therefore ranked more
prominently.
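As a concrete illustration of this engagement-first logic, the short Python sketch below ranks posts by a weighted sum of interaction signals; the signal names and weights are assumptions chosen for illustration, not a description of any real platform's model.

def engagement_score(post):
    # Weighted sum of predicted interactions; comments and shares count more
    # than passive likes, and strong emotional reactions add further weight.
    # All weights here are illustrative assumptions.
    return (1.0 * post.get("likes", 0)
            + 3.0 * post.get("comments", 0)
            + 5.0 * post.get("shares", 0)
            + 4.0 * post.get("angry_reactions", 0))

def rank_feed(posts):
    # Highest predicted engagement first, regardless of accuracy or depth.
    return sorted(posts, key=engagement_score, reverse=True)

Under such a rule, a divisive post that draws many comments and shares outranks a sober report that is merely read, which is the amplification effect described above.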
Empirical Evidence
Empirical studies have shown mixed results regarding the extent of echo
chambers. Barberá et al. (2015) found that social media users are often
exposed to diverse political opinions, though the degree of exposure
varies considerably across individuals. Conversely, Flaxman, Goel, and
Rao (2016) found that social media users encounter more partisan news
than those who browse news websites directly, even as the same channels
modestly increase exposure to opposing perspectives.
Misinformation
Spread of False Information
The spread of misinformation is a critical issue linked to social media
algorithms. Vosoughi, Roy, and Aral (2018) found that false news
spreads farther, faster, and more broadly on social media than true
news, driven largely by people sharing novel, emotionally charged
stories. Algorithms that prioritize high-engagement content can amplify
this dynamic by surfacing precisely the stories that provoke such
reactions. The result can be significant societal harm, as users may
make decisions based on inaccurate information.
Algorithmic Interventions
In response to the proliferation of misinformation, social media
platforms have implemented various algorithmic interventions. For
instance, Facebook has employed machine learning to identify and
reduce the visibility of false news stories (Mosseri, 2017). However, the
effectiveness of these measures remains a topic of ongoing debate, with
some arguing that they are insufficient or inconsistently applied (Gorwa,
Binns, & Katzenbach, 2020).
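One common form of intervention is to demote rather than delete flagged content. The Python sketch below is a simplified, hypothetical version of such down-ranking; the threshold and penalty values are assumptions for illustration, and the debates Gorwa et al. (2020) describe turn on exactly these choices and how consistently they are applied.

def adjusted_score(base_score, false_probability, threshold=0.7, penalty=0.2):
    """Demote, rather than remove, content a classifier rates as likely false.

    base_score: the ordinary engagement-based ranking score
    false_probability: classifier estimate (0..1) that the item is false
    """
    if false_probability >= threshold:
        # Keep the post visible but push it far down the feed.
        return base_score * penalty
    return base_score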
User Engagement
Influence on Engagement Patterns
Algorithms significantly influence how users engage with news on
social media. Tandoc, Lim, and Ling (2018) found that algorithmic
curation can lead to increased engagement with news content, as users
are more likely to see stories that interest them. This heightened
engagement can enhance user satisfaction
and platform loyalty.
Implications for News Consumption
While increased engagement can be beneficial, it also raises concerns
about the quality of information being consumed. Thorson and Wells
(2016) noted that engagement-driven algorithms may prioritize
sensationalist or emotionally charged content over in-depth, high-quality
journalism. This can contribute to a skewed
perception of news events and undermine informed public discourse.
Conclusion
The impact of social media algorithms on news consumption is
multifaceted, influencing the diversity of information, the formation of
echo chambers, the spread of misinformation, and user engagement.
While these algorithms can enhance user experience by personalizing
content, they also pose significant challenges to the quality and diversity
of news consumed. Ongoing research and policy interventions are
necessary to mitigate these challenges and promote a more informed and
balanced news consumption environment.
References
Bakshy, E., Messing, S., & Adamic, L. A. (2015). Exposure to
ideologically diverse news and opinion on Facebook. Science,
348(6239), 1130-1132.
Barberá, P., Jost, J. T., Nagler, J., Tucker, J. A., & Bonneau, R.
(2015). Tweeting from left to right: Is online political
communication more than an echo chamber? Psychological
Science, 26(10), 1531-1542.
Flaxman, S., Goel, S., & Rao, J. M. (2016). Filter bubbles, echo
chambers, and online news consumption. Public Opinion
Quarterly, 80(S1), 298-320.
Gorwa, R., Binns, R., & Katzenbach, C. (2020). Algorithmic
content moderation: Technical and political challenges in the
automation of platform governance. Big Data & Society, 7(1),
2053951719897945.
Mosseri, A. (2017). News feed FYI: Addressing hoaxes and fake
news. Facebook Newsroom. Retrieved from
https://about.fb.com/news/2017/01/news-feed-fyi-addressing-
hoaxes-and-fake-news/
Pariser, E. (2011). The filter bubble: What the Internet is hiding
from you. Penguin Press.
Sunstein, C. R. (2001). Republic.com. Princeton University Press.
Tandoc, E. C., Lim, Z. W., & Ling, R. (2018). Defining
“fake news”: A typology of scholarly definitions. Digital
Journalism, 6(2), 137-153.
Thorson, E., & Wells, C. (2016). Curated flows: A framework for
mapping media exposure in the digital age. Communication
Theory, 26(3), 309-328.
Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and
false news online. Science, 359(6380), 1146-1151.