Data from Backlinko shows that globally, the average person spends 2 hours and 24 minutes a day on social media. 4.48 billion people currently use social media worldwide, more than double the 2.07 billion recorded in 2015. In a world where we spend ever more time on the internet, influenced by algorithms and targeted ads, what does this mean for our ability to form our own opinions based on reason and logic? Critical thinking is the skill of analysing available facts, evidence, observations and arguments to form an informed and balanced judgment. Integral to critical thinking is analysing different sources, to learn about alternative interpretations, viewpoints and perspectives. This enables us to question our own beliefs and assumptions, and encourages tolerance for others’ views. Without access to different viewpoints, or the willingness to take them into account, we risk polarising our own views and producing weak arguments.
Critical thinking is the ability to think clearly and rationally about what to do or what to believe. It includes the ability to engage in reflective and independent thinking. Critical thinking skills include analysing and weighing up arguments, evaluating the evidence presented, distinguishing between fact and opinion, considering the potential for bias, and reaching conclusions based on your own reasoning. Social media algorithms work by learning from the user: they use machine learning and attributed data points to ensure that the content in a feed is weighted towards the user’s interests, with the aim of encouraging further interaction. In its simplest form, if I like a friend’s post on Facebook claiming that the world is flat, the algorithm records the interaction and registers it as a point of interest. It will then automatically organise my Facebook feed to show me more flat-earth-related content. In other words, algorithms use data on the user’s previous behaviour to predict what content will encourage another reaction. This means I am shown material and content that I already agree with or like.
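The feedback loop described above can be illustrated with a deliberately minimal sketch. This is not any platform’s actual algorithm, and the topic labels and function names are hypothetical; it simply shows how recording interactions as interest weights, then ranking a feed by those weights, narrows what a user sees.

```python
from collections import Counter

def update_interests(interests, post_topics):
    """Record a 'like' by bumping the weight of each topic on the post."""
    for topic in post_topics:
        interests[topic] += 1

def rank_feed(posts, interests):
    """Order candidate posts by how much they overlap with recorded interests."""
    def score(post):
        return sum(interests[topic] for topic in post["topics"])
    return sorted(posts, key=score, reverse=True)

interests = Counter()
update_interests(interests, ["flat-earth"])  # the user likes one flat-earth post

posts = [
    {"id": 1, "topics": ["flat-earth", "conspiracy"]},
    {"id": 2, "topics": ["science", "astronomy"]},
    {"id": 3, "topics": ["flat-earth"]},
]

feed = rank_feed(posts, interests)
# The flat-earth posts now outrank the science post, and each further like
# reinforces the loop: the feed narrows toward what the user already agrees with.
```

Even in this toy version, the essential property of the filter bubble is visible: the ranking never asks whether a post is accurate or whether the user might benefit from a contrary view, only whether it resembles what they engaged with before.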
This phenomenon, named in 2010 by Eli Pariser, is known as the filter bubble. The Cambridge Dictionary defines the filter bubble as ‘the situation in which someone only hears or sees news and information that supports what they already believe and like, especially a situation created on the internet as a result of algorithms.’ In his book, Pariser explained how Google searches bring up vastly differing results depending on the history of the user. He cites an example in which two people searched for British Petroleum. One user saw news related to investing in the company. The other user received information about a recent oil spill. Once users show interest in a specific topic or category, they are directed to other items in the same category, filtering out content they may disagree with or dislike. This, along with the ever-increasing time that society spends on social media, particularly in a post-Covid world, means that the information individuals are exposed to is increasingly uniform and lacking in variation.
The danger of this is that, by being constantly shown content we agree with, we limit our knowledge of counterarguments, the very material that helps us question our own assumptions and scrutinise our own opinions. As Pariser said: “The filter bubble tends to dramatically amplify confirmation bias—in a way, it’s designed to. Consuming information that confirms our ideas of the world is easy and pleasurable; consuming information that challenges us to think in new ways or question our assumptions is frustrating and difficult. You become trapped in a loop, as your identity shapes your media, and your media then shapes what you believe and what you care about.” Our critical thinking skills become eroded because we are no longer exposed to the differing viewpoints that help us assess our own opinions.
There is also evidence to suggest that the recent political polarisation seen in many countries has in part been caused by the filter bubble. A US study (Levendusky 2013) found that exposure to like-minded partisan media under experimental conditions can strengthen the views of already partisan individuals. There is also the concern of fake news spreading on social media. Filter bubbles do not distinguish between reliable and unreliable sources; as with the flat earth theory, those who begin to like conspiracy-related material will be shown more and more of it. This can distort an individual’s sense of reality and make them distrustful of the real world around them.
TikTok, the fastest-growing social media platform, is said to be tackling the filter bubble. In December 2021, it was reported that adjustments were being made to the app’s algorithm to ensure it isn’t inadvertently reinforcing viewpoints that could be bad for a person’s wellbeing. That being said, it is important to remember that the app’s primary goal is to keep the user on it for as long as possible, by prioritising content the user is predicted to like based on their previous behaviour.
Increasing our knowledge of the filter bubble, what it is and what its negative effects are, helps us retain our critical thinking skills. To burst the bubble, we can turn off customisation features and targeted ads on websites to limit the effect that algorithms have on our content. Making an active effort to follow people who share different opinions, having conversations with those you may disagree with, and aiming to be empathetic can all help burst a filter bubble. Seeking feedback on our opinions from those who hold differing viewpoints, and being open to refining our own perspective, is also useful. Using multiple news sources, blogs and outlets with differing ideological stances and editorial lines helps us to understand alternative viewpoints. Understanding that content on social media may not be accurate, and seeking out good-quality information, will aid our ability to question assumptions and make us more tolerant towards those who come from a different perspective.