We’ve already established that information can be biased. Now it’s time to look at our own bias.

Studies have shown that we are more likely to accept information when it fits into our existing worldview, a phenomenon known as confirmation or myside bias (for examples see Kappes et al., 2020; McCrudden & Barnes, 2016; Pilditch & Custers, 2018). Wittebols (2019) defines it as a “tendency to be psychologically invested in the familiar and what we believe and less receptive to information that contradicts what we believe” (p. 211). Quite simply, we may reject information that doesn’t support our existing thinking.

This can manifest in a number of ways; Hahn and Harris (2014) suggest four main behaviours:

  1. Searching only for information that supports our held beliefs
  2. Failing to critically evaluate information that supports our held beliefs (accepting it at face value) while explaining away or being overly critical of information that might contradict them
  3. Becoming set in our thinking once an opinion has been formed, and deliberately ignoring any new information on the topic
  4. Being overconfident about the validity of our held beliefs

Peters (2020) also suggests that we’re more likely to remember information that supports our way of thinking, further cementing our bias. Taken together, the research suggests that bias has a huge impact on the way we think. To learn more about how and why bias can impact our everyday thinking, watch this short video.

Filter bubbles and echo chambers

The theory of filter bubbles emerged in 2011, proposed by the Internet activist Eli Pariser. He defined a filter bubble as “your own personal unique world of information that you live in online” (Pariser, 2011, 4:21). At the time, Pariser focused on the impact of the algorithms used by social media platforms and search engines, which prioritise content and personalise results based on an individual’s past online activity, suggesting “the Internet is showing us what it thinks we want to see, but not necessarily what we should see” (Pariser, 2011, 3:47). Watch his TED talk if you’d like to know more.

Our understanding of filter bubbles has since expanded to recognise that individuals also select and create their own filter bubbles. This happens when you seek out like-minded individuals or sources, or follow friends and people you admire on social media: people with whom you’re likely to share common beliefs, points of view, and interests. Barack Obama (2017) addressed the concept of filter bubbles in his presidential farewell address:

For too many of us it’s become safer to retreat into our own bubbles, whether in our neighbourhoods, or on college campuses, or places of worship, or especially our social media feeds, surrounded by people who look like us and share the same political outlook and never challenge our assumptions… Increasingly we become so secure in our bubbles that we start accepting only information, whether it’s true or not, that fits our opinions, instead of basing our opinions on the evidence that is out there. (Obama, 2017, 22:57)

Filter bubbles are not unique to the social media age. Previously, the term echo chamber was used to describe the same phenomenon in the news media, where different channels cater to different points of view. Within an echo chamber, people can seek out information that supports their existing beliefs without encountering information that might challenge or contradict them.

Other forms of bias

There are many different ways in which bias can affect how you think and how you process new information. Try the quiz below to discover some additional forms of bias, or check out BuzzFeed’s 2017 article on cognitive bias.