Algorithms: Our Personalized Reality
By: Rhea Kumar
Picture yourself sitting on the couch, phone in hand. Your eyes dart up and down, following the swipe of your thumb as you endlessly scroll through videos on Instagram. You pause and like a video. Moments later, videos strikingly similar to the one you just liked begin to appear. You keep liking them. Though this act may seem fleeting, ordinary, even insignificant, it plays a critical role in shaping the content you consume.
An algorithm, as defined by the Oxford English Dictionary, is a “Data-tracking system in which an individual's internet search history and browsing habits are used to present them with similar or related material on social media or other platforms.” In practice, it processes data, analyzes patterns, and makes predictions. These are systems that learn and evolve, modeling our preferences in sometimes unexpected ways. Using those predictions, they show users the content they are most likely to engage with. Over time, this selective exposure influences not only what users see but how they think, reinforcing certain viewpoints while limiting others. As a result, algorithms become a powerful and often overlooked influence on the formation of individual opinions.
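To make that loop concrete, here is a minimal, purely illustrative sketch in Python of how an engagement-driven feed might work. The post fields, scoring formula, and numbers are all invented for this example; real platforms use far more signals and far more sophisticated models.

```python
# A toy engagement-based recommender. Every field, weight, and number
# here is invented for illustration; real systems use far more signals.

def engagement_score(post, user_history):
    """Predict how likely a user is to engage with a post, based on
    how often they engaged with the same topic before."""
    past_likes = user_history.get(post["topic"], 0)
    total_likes = sum(user_history.values()) or 1
    topic_affinity = past_likes / total_likes        # the learned "pattern"
    return topic_affinity * post["avg_engagement"]   # the prediction

def build_feed(posts, user_history, size=3):
    """Rank candidate posts by predicted engagement, highest first."""
    return sorted(posts, key=lambda p: engagement_score(p, user_history),
                  reverse=True)[:size]

posts = [
    {"topic": "cooking",    "avg_engagement": 0.4},
    {"topic": "conspiracy", "avg_engagement": 0.9},
    {"topic": "news",       "avg_engagement": 0.5},
]
history = {"conspiracy": 5, "cooking": 1}   # one user's past likes
print(build_feed(posts, history))           # conspiracy content ranks first
```

Notice that nothing in this toy score asks whether a post is true or healthy to watch, only whether it will be engaged with. That single design choice drives much of what follows.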
With this in mind, you might ask: why is this bad? How much of what you believe is what the algorithm wants you to believe? As your feed becomes increasingly personalized, social disruption and misinformation can follow, with each social media experience perfectly curated by data points and code. Yet most people have grown so accustomed to this that they no longer notice it happening. Engaging content is being pushed onto neighbors, coworkers, students, and even you, contributing to mental health issues rooted in addictive behaviors and narrowing exposure to diverse viewpoints. Our interactions with online algorithms affect how we learn from others, fueling social misinformation and unrest. They can also reinforce discrimination and bias along racial, gender, and economic lines. How can pieces of unseen code determine what we know, what we believe, and what we question? How do they shape reality?
The internet contains a vast ocean of information, and algorithms filter what reaches us. For example, Google Search ranks results by “relevance,” yet those rankings are heavily influenced by popularity, advertising, and profit. Because people tend to trust the top search results as the most accurate, algorithms quietly determine what is seen as the right answer. Why are we allowing search engines to make that decision for us?
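As a rough illustration of that claim, consider a toy ranking formula in which relevance is only one weighted signal among several. The weights, fields, and example pages below are invented; real search rankers combine hundreds of signals.

```python
# Illustrative only: a toy ranking formula where "relevance" is just
# one signal among several. All weights and values are invented.

def rank_score(result):
    return (0.4 * result["relevance"]      # how well it answers the query
            + 0.4 * result["popularity"]   # clicks, links, brand recognition
            + 0.2 * result["ad_spend"])    # paid promotion

results = [
    {"name": "obscure expert page", "relevance": 0.9, "popularity": 0.2, "ad_spend": 0.0},
    {"name": "viral listicle",      "relevance": 0.5, "popularity": 0.9, "ad_spend": 0.6},
]
for r in sorted(results, key=rank_score, reverse=True):
    print(r["name"], round(rank_score(r), 2))
# The less relevant but more popular, promoted page ranks first.
```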
Similarly, news aggregators like Google News and Apple News select which headlines you see first. As a result, algorithms don’t merely portray reality; they shape it. Each feed is personalized, so different users are exposed to different narratives. This gives platforms control over which voices are heard and which are not, creating a difficult balance between free speech, misinformation, and engagement.
What you see online isn’t impartial or neutral, but carefully curated. Platforms take what they know about you, put it in a blender, and feed you a concentrated smoothie of ingredients chosen specifically for your digital palate. Past searches, likes, and shares dictate the people you see online, the news you read, and the ideas you encounter. If you like one conspiracy video on TikTok, the algorithm suggests more. If you search for recipes on Instagram, your feed fills with ingredient lists and cooking videos. Over time, this personalization can swallow you, narrowing what you’re exposed to and leaving you unaware of alternatives and different perspectives. While this may seem harmless with topics like food or hobbies, it becomes far more concerning when applied to politics and social issues. For example, a 2018 study found that Facebook’s algorithms promoted politically divisive content. Why? Extreme posts get the most engagement. People get hooked on emotion. Content that provokes anger or outrage spreads faster, gaining more reactions and activity.
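This narrowing effect can be simulated in a few lines. The sketch below is a hypothetical feedback loop, not any platform’s actual code: each time a topic is shown and liked, it becomes more likely to be shown again.

```python
# Hypothetical filter-bubble simulation: every like reinforces a topic,
# so the feed's diversity shrinks over time. All numbers are arbitrary.
import random

random.seed(0)                              # reproducible run
topics = ["food", "sports", "politics", "music"]
weights = {t: 1.0 for t in topics}          # start with a balanced feed

for _ in range(200):
    # The feed samples content in proportion to the learned weights...
    shown = random.choices(topics, weights=[weights[t] for t in topics])[0]
    # ...and each like reinforces the topic that was shown.
    weights[shown] += 0.5

total = sum(weights.values())
for t in topics:
    print(f"{t}: {weights[t] / total:.0%} of the feed")
# After enough iterations, one or two topics typically dominate the feed.
```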
Because content promotion hinges above all on engagement, misinformation can slowly slip through the cracks of the media as long as it aligns with our biases and beliefs. For example, Nieman Lab reports that “Facebook’s profit-driven algorithms supported a surge of hateful misinformation targeting the Rohingya people (a predominantly Muslim population of Myanmar), [contributing] to their genocide by the Myanmar military.” For context, in 2017, Facebook hosted a large quantity of extremist speech due to its lack of moderation. As a result, many commenters voiced hatred on the app, referring to the Rohingya as “dogs” and other cruel names that needed to be “exterminated.” Through hateful messages and targeted rumors circulating through Facebook’s feed, violence was ultimately incited. This example highlights the power an algorithm can have over others: it can reinforce extremist beliefs or ruin lives, whichever proves more engaging. It is a call for users to evaluate the information they come across critically.
The term “media literacy” is commonly plastered across comments and videos on social media. With the rise of content creation, the information being shared increasingly preys on users’ lack of critical thinking. As defined by the Oxford English Dictionary, media literacy is the “Ability to critically analyze any story or event presented in the media to determine its accuracy or credibility.” What we see online can blur the line between truth and fiction, slowly leaving no room for nuance or ambiguity. Because algorithms power so much of what we see, they shape what we believe and understand, making media literacy all the more essential.
Nowadays, many people don’t feel the need to verify the accuracy of the information they see, trusting whatever appears on their feed. The algorithm then amplifies their existing beliefs, affirming them with similar videos. This is known as confirmation bias: the tendency to favor information that confirms what you already believe. Confirmation bias creates a digital cycle that deepens divides, promotes misinformation, and breeds apathy in social and political settings.
Despite the persuasive hold algorithms may have on you, you can break out of your digital bubble and reclaim control. Tactics include following varied perspectives to broaden your understanding, using multiple news platforms to ensure a balanced narrative, and practicing digital literacy: verifying sources and developing a healthy skepticism. It also helps to be mindful of each comment or video you engage with, as that is what the algorithm will show you more of in the future. Mix up the content you consume and widen your lens.
Algorithms are complex systems that balance convenience with potential harm. In a world of rapidly developing technology, it is important to understand how these systems affect us and how we can minimize their damage.