Sunday, July 7, 2024

Social Media Algorithms: Enhancing Engagement or Limiting Perspectives?

In today's digital age, social media platforms have become fundamental to how we communicate, access information, and perceive the world around us. However, the algorithms that shape these platforms have sparked considerable public interest and controversy. Designed to tailor content to individual users, these sophisticated algorithms maximize engagement and keep us glued to our screens. While this personalized approach can enhance the user experience, it also raises critical concerns about its broader societal impact. These algorithms have the power to shape our perceptions of reality, influence our opinions, and reinforce existing biases.

Given the immense influence of these algorithms, understanding their impact is essential. One perspective is that social media algorithms enhance our user experience by showing us content that is relevant to our interests. From this viewpoint, algorithms are tools that make our online experiences more enjoyable and tailored to our individual needs. Conversely, there is a growing concern that these algorithms create filter bubbles, isolating users from diverse viewpoints and reinforcing existing biases. This can lead to a distorted view of reality, in which we are less likely to encounter challenging perspectives. A survey on the acceptability of social media platforms using personal data to recommend products and services found the public nearly evenly split, with 52% of users finding the practice acceptable and 47% finding it unacceptable [1].

Understanding the impact of social media algorithms is crucial because it touches on the core of how we interact with information and form opinions. By examining scientific findings about these algorithms, we can better understand their influence and develop strategies to mitigate potential negative effects. In this discussion, we will explore the impact of social media algorithms by examining how these algorithms work, their benefits and drawbacks, and what steps can be taken to ensure a more balanced and informed digital experience. By the end, it will become clear why understanding and critically evaluating the role of social media algorithms is essential for both individual users and society.

How Social Media Algorithms Function

I'm sure we've all shared the experience of discussing a topic with friends only to later see ads, articles, and posts directly related to that conversation on our phones. This phenomenon isn't just a coincidence; it is the work of sophisticated algorithms designed to predict our interests and keep us engaged. The process begins with data collection: platforms log vast amounts of data about our online behavior, such as the pages we visit, the likes we give, the comments we leave, and the amount of time we spend on specific pages [2]. This data is then used to create detailed profiles of our preferences, which enables algorithms to tailor content to our interests, maximizing engagement by showing us more of what we like. While this can be incredibly convenient, it also raises significant concerns about privacy, the formation of echo chambers, and the potential for misinformation and manipulation.
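The loop described above — collect behavioral signals, build an interest profile, then rank content by predicted engagement — can be sketched as a toy ranker. All field names, signal weights, and the scoring rule below are illustrative assumptions, not any platform's actual system:

```python
from collections import Counter

def build_profile(interactions):
    """Build a simple interest profile (topic -> weight) from logged
    interactions. Heavier signals (comments) count more than passive
    ones (views); weights here are purely illustrative."""
    profile = Counter()
    for event in interactions:
        weight = {"view": 1, "like": 3, "comment": 5}[event["type"]]
        profile[event["topic"]] += weight * event.get("dwell_seconds", 1)
    return profile

def rank_feed(posts, profile):
    """Rank candidate posts by predicted engagement: the user's affinity
    for the post's topic times the post's overall popularity."""
    def score(post):
        affinity = profile.get(post["topic"], 0)
        return affinity * post["popularity"]
    return sorted(posts, key=score, reverse=True)

interactions = [
    {"type": "like", "topic": "cooking"},
    {"type": "comment", "topic": "cooking"},
    {"type": "view", "topic": "politics"},
]
posts = [
    {"id": 1, "topic": "politics", "popularity": 10},
    {"id": 2, "topic": "cooking", "popularity": 4},
]
feed = rank_feed(posts, build_profile(interactions))
print([p["id"] for p in feed])  # → [2, 1]
```

Even in this tiny sketch, the cooking post outranks the far more popular politics post because the user's logged behavior dominates the score — the same dynamic that makes feeds feel personal, and that narrows what we see.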

Figure 1: How Social Media Algorithms function (A. Biswas et al.)

Personalization vs Privacy

In qualitative interviews, teenagers were found to perceive these algorithms as accurate reflections of themselves, appreciating how the content aligns with their tastes and preferences [3]. This perception highlights the primary benefit of social media algorithms: personalization. By tailoring content to individual users, algorithms enhance the user experience, making social media platforms more engaging and relevant. One study even found that users perceive news items tailored to their interests as having superior journalistic quality compared to news items that are not personalized [4], emphasizing just how much these algorithms can ‘enhance’ a user’s experience. Users often find this personalization convenient and enjoyable, as it helps them discover content that genuinely interests them without much effort.

However, this level of personalization comes at a cost to user privacy since social media algorithms rely on collecting extensive data about users' online behavior. This data collection raises significant privacy concerns, as users' personal information is constantly being monitored and analyzed [5], often without their awareness. According to a Pew Research Center survey, 74% of Facebook users were unaware that the platform maintains a list of their interests and traits [6]. The trade-off between personalized content and privacy is a critical issue in the digital age, as users must weigh the convenience of tailored content against the potential risks to their privacy.

The Echo Chamber Effect

When algorithms prioritize content that aligns with our existing beliefs, they can limit our exposure to differing opinions and critical information. One of the most significant concerns surrounding social media algorithms is their potential to create echo chambers. Echo chambers occur when people are primarily exposed to information and opinions that reinforce their existing beliefs, leading to a narrower worldview and increased polarization [7].

Figure 2: The cycle of bias due to social media algorithms (F. Zimmer et al.)

It has been found that algorithms on social media platforms do indeed tend to limit users' exposure to content opposing their beliefs [8], thereby reinforcing their pre-existing views. A study revealed that the average Facebook user’s feed consists of 50.4% content from like-minded individuals, only 14.7% from sources with opposing views, and the remainder from groups or other connections they follow [9]. Over time, this can produce more extreme viewpoints and reduced open-mindedness, deepening social and political divisions.

Furthermore, echo chambers have been found to perpetuate misinformation [10]. When users are primarily exposed to content that aligns with their beliefs, they are more likely to encounter and accept false or misleading information that supports their views. This can create a feedback loop where misinformation is continuously reinforced and spread within like-minded communities.
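This feedback loop can be illustrated with a deliberately simple simulation. A user's opinion lives on a scale from -1 to 1 and drifts a small step toward each item shown; the feed shows like-minded content for a fixed fraction of items. Every number here (step size, content extremity, schedule) is an illustrative assumption, not an empirical model:

```python
def simulate_opinion(steps=200, like_minded_fraction=0.9):
    """Toy echo-chamber model: opinion in [-1, 1] drifts toward each
    content item shown. A fixed fraction of items are like-minded
    (same sign, slightly more extreme); the rest are moderate views
    from the other side. Deterministic schedule for reproducibility."""
    opinion = 0.1  # slight initial lean
    for i in range(steps):
        if (i % 10) < like_minded_fraction * 10:
            # Like-minded item: same sign, a bit more extreme than the user.
            sign = 1 if opinion >= 0 else -1
            content = sign * min(1.0, abs(opinion) + 0.2)
        else:
            # Cross-cutting item: a moderate view from the other side.
            content = -0.5 if opinion >= 0 else 0.5
        opinion += 0.1 * (content - opinion)  # drift toward what was shown
    return opinion

print(round(simulate_opinion(like_minded_fraction=0.9), 2))  # mostly like-minded feed
print(round(simulate_opinion(like_minded_fraction=0.5), 2))  # balanced feed
```

With a feed that is 90% like-minded, the simulated opinion ratchets toward the extreme; with a balanced feed, it hovers near the center. The mechanism, not the specific numbers, is the point: when each item shown depends on the opinion it helped create, small biases compound.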

Misinformation and Manipulation

As briefly discussed in the previous section, social media algorithms play a significant role in the spread of misinformation and the potential for manipulation. It is not the algorithms themselves that create misinformation, but their ability to amplify users’ information behavior that allows for the spread of misinformation [11]. Studies have shown that false news is 70% more likely to be retweeted than the truth and reaches its first 1,500 people six times faster [12]. Algorithms prioritize content that garners high engagement, often favoring sensational, emotional, or controversial posts. This tendency can significantly boost the viewership of misleading or false information, as such content is more likely to be shared and commented on rapidly. For instance, a USC study uncovered that 15% of the most habitual news sharers were responsible for spreading about 30% to 40% of the fake news, likely due to the platforms' algorithms prioritizing engagement over accuracy [13].
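Why does a modestly higher reshare rate translate into such outsized reach? Because sharing compounds across generations of a cascade. The toy model below makes this concrete; the audience size per share, number of generations, and reshare probabilities are illustrative assumptions, with the 70% uplift borrowed from the statistic above:

```python
def expected_reach(reshare_prob, views_per_share=10, generations=6):
    """Toy sharing cascade: each share is seen by `views_per_share`
    users, each of whom reshares with probability `reshare_prob`.
    Returns the expected total views over a fixed number of
    generations. All parameters are illustrative."""
    shares, total_views = 1.0, 0.0
    for _ in range(generations):
        views = shares * views_per_share
        total_views += views
        shares = views * reshare_prob  # expected reshares next generation
    return total_views

ordinary = expected_reach(reshare_prob=0.10)
sensational = expected_reach(reshare_prob=0.17)  # ~70% higher reshare rate
print(round(sensational / ordinary, 1))  # → 5.5
```

In this sketch, a 70% higher per-view reshare rate yields roughly five and a half times the total reach after six generations — the gap widens every generation, which is why engagement-optimized ranking disproportionately amplifies the content people are most eager to pass along.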

This increase in the spread of misinformation due to social media algorithms can significantly manipulate public opinion. False stories on social media platforms often gain massive traction, being shared and viewed millions of times, which can sway public perception and even impact election outcomes. For example, during the 2016 U.S. presidential election, a study found that a 10% increase in a county’s number of Twitter users was associated with a 0.2 percentage point decrease in Donald Trump’s vote share [14]. While correlational rather than causal, findings like this suggest that algorithmically amplified information flows can shift public perception at scale.

Conclusion

The evidence presented in this post supports the idea that while these algorithms offer significant benefits in terms of personalization and user engagement, they also pose serious risks related to privacy, the formation of echo chambers, and the spread of misinformation. The personalization provided through these algorithms comes at the cost of extensive data collection, raising significant privacy concerns. The prioritization of content that aligns with users' existing beliefs can create echo chambers. The amplification of sensational, emotional, or controversial posts by these algorithms significantly boosts the viewership of misleading or false information.

Given these dynamics, it is crucial for users, tech companies, and policymakers to understand and address the dual-edged nature of social media algorithms. Users should strive to diversify their information sources and remain vigilant about the content they consume. Tech companies need to implement more transparent and fair algorithms that prioritize content diversity and accuracy over mere engagement. Policymakers should create regulations that protect user privacy and promote a balanced informational ecosystem. By working together, we can harness the power of social media algorithms to enhance engagement while minimizing their potential negative impacts.


References

[1]  A. Smith, “2. Algorithms in action: The content people see on social media,” Pew Research Center. Accessed: Jul. 07, 2024. [Online]. Available: https://www.pewresearch.org/internet/2018/11/16/algorithms-in-action-the-content-people-see-on-social-media/

[2]    H. Metzler and D. Garcia, “Social Drivers and Algorithmic Mechanisms on Digital Media,” Perspect. Psychol. Sci., Art. no. 17456916231185057, Jul. 2023, doi: 10.1177/17456916231185057.

[3]    N. McDonald, “Teens see social media algorithms as accurate reflections of themselves, study finds,” The Conversation. Accessed: Jul. 03, 2024. [Online]. Available: http://theconversation.com/teens-see-social-media-algorithms-as-accurate-reflections-of-themselves-study-finds-226302

[4]    S. S. Sundar and S. S. Marathe, “Personalization versus Customization: The Importance of Agency, Privacy, and Power Usage,” Hum. Commun. Res., vol. 36, no. 3, pp. 298–322, 2010, doi: 10.1111/j.1468-2958.2010.01377.x.

[5]    S. Bamdad, D. A. Finaughty, and S. E. Johns, “‘Grey areas’: ethical challenges posed by social media-enabled recruitment and online data collection in cross-border, social science research,” Res. Ethics, vol. 18, no. 1, pp. 24–38, Jan. 2022, doi: 10.1177/17470161211045557.

[6]    P. Hitlin, L. Rainie, and K. Olmstead, “Facebook Algorithms and Personal Data,” Pew Research Center. Accessed: Jul. 07, 2024. [Online]. Available: https://www.pewresearch.org/internet/2019/01/16/facebook-algorithms-and-personal-data/

[7]    J. Jiang, X. Ren, and E. Ferrara, “Social Media Polarization and Echo Chambers in the Context of COVID-19: Case Study,” JMIRx Med, vol. 2, no. 3, p. e29570, Aug. 2021, doi: 10.2196/29570.

[8]    M. Cinelli, G. De Francisci Morales, A. Galeazzi, W. Quattrociocchi, and M. Starnini, “The echo chamber effect on social media,” Proc. Natl. Acad. Sci., vol. 118, no. 9, p. e2023301118, Mar. 2021, doi: 10.1073/pnas.2023301118.

[9]    B. Nyhan et al., “Like-minded sources on Facebook are prevalent but not polarizing,” Nature, vol. 620, no. 7972, pp. 137–144, Aug. 2023, doi: 10.1038/s41586-023-06297-w.

[10]  P. Törnberg, “Echo chambers and viral misinformation: Modeling fake news as complex contagion,” PLoS ONE, vol. 13, no. 9, p. e0203958, Sep. 2018, doi: 10.1371/journal.pone.0203958.

[11]  F. Zimmer, K. Scheibe, M. Stock, and W. G. Stock, “Fake News in Social Media: Bad Algorithms or Biased Users?,” J. Inf. Sci. Theory Pract., vol. 7, no. 2, pp. 40–53, Jun. 2019, doi: 10.1633/JISTAP.2019.7.2.4.

[12]  S. Vosoughi, D. Roy, and S. Aral, “The spread of true and false news online,” Science, vol. 359, no. 6380, pp. 1146–1151, Mar. 2018, doi: 10.1126/science.aap9559.

[13]  G. Ceylan, I. A. Anderson, and W. Wood, “Sharing of misinformation is habitual, not just lazy or biased,” Proc. Natl. Acad. Sci., vol. 120, no. 4, p. e2216614120, Jan. 2023, doi: 10.1073/pnas.2216614120.

[14]  “How Twitter affected the 2016 presidential election,” CEPR. Accessed: Jul. 03, 2024. [Online]. Available: https://cepr.org/voxeu/columns/how-twitter-affected-2016-presidential-election

 
