Hide and Seek on Your Feed

How algorithms influence your information

Do you disagree with a family member, friend, or colleague on hot-button issues like politics, divisions in society, or health? If your answer is yes, you’re not alone.

While it’s normal for people to have differing beliefs and opinions, divisiveness and polarization seem to be on the rise. Maybe you and a friend or loved one have completely different impressions of the same event. It might be because of the algorithms operating behind the scenes of most major apps and websites you use to get your news, and the effects they can have on your perceptions.

As much as social media platforms like Facebook, Instagram, Twitter, and TikTok bring us together, they also drive us apart. In the same way YouTube can be considered a radicalization engine, Facebook divides its users by recommending the posts that get the most emotional reactions ... even if that results in the spread of misinformation. Similar designs and setups can be found on other popular websites and apps.

In this Data Detox, you’ll learn about algorithms and filter bubbles to gain more insights into how you (and those around you) may be swayed by the technology behind popular platforms. You’ll have the opportunity to reflect on your news feed, and gather suggestions to widen your pool of information in order to better understand other perspectives.

Let’s get started!

What are you most likely to click on?

Over your morning coffee, you skim through your social media feed to stay informed, but there’s no way you can see everything. In fact, you’re only seeing a selection of the news or posts that the app thinks you’ll be most interested in, based on what you’ve looked at before. These headlines have been cherry-picked for you by a machine.

Algorithmic curation is when an app’s technology recommends content to you that it calculates you’re most likely to click on or engage with. An algorithmic decision-making system does this automatically and uniquely behind the scenes for each of the millions of people who use the app. Algorithms are created and maintained by people: engineers, developers, and specialists, to name a few ... but the ins and outs of these systems are typically kept on a need-to-know basis.

Did you know? Algorithms have been shown to be biased. Because algorithms are built by people and rely on the data sets and information those people upload to them, the results can be preferential, exclusionary, or discriminatory, depending on what information is or is not fed to the machines. To learn more about how algorithms can be biased, check out this article or watch this video.

These algorithmic systems depend on user profiles, which are also generated by calculations combined over time and across platforms, based on your behaviors, interests, and more. This profile may include information you’ve shared or that’s publicly available online – such as your name, age, and where you live – but it may also include information that’s been inferred about you based on your online habits, like which posts you’ve been liking, commenting on, and sharing.

You can learn more about profiling, and how to take back control, in Renovate Your Social Media Profile.

The accuracy of the recommendations is also based on a number of other factors that make up a profile, created from insights such as:

  • What news headlines, advertisements, and pages you’ve clicked on in the past
  • How long you spend on an image or article, or the speed at which you’re scrolling
  • Demographics you entered into your profile
  • People you’re connected with and their interests
  • People who have a similar profile to you
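To make the idea concrete, here is a purely hypothetical sketch of how signals like these could be combined into a feed ranking. This is not any platform’s actual algorithm (those are kept private, as noted above); the weights, field names, and scoring rule are all invented for illustration:

```python
# Hypothetical sketch of engagement-based ranking; NOT any real
# platform's algorithm. A feed ranker scores each post by how likely
# this user is to engage, using signals like the ones listed above.

def score_post(post, user):
    """Return a predicted-engagement score for one post."""
    score = 0.0
    for topic in post["topics"]:
        # Past behavior: topics the user has clicked on weigh heavily.
        score += user["topic_clicks"].get(topic, 0) * 1.0
        # Dwell time: topics the user lingers on get a boost.
        score += user["topic_seconds"].get(topic, 0) * 0.1
    # Social graph: content the user's connections engaged with.
    score += post["friend_likes"] * 0.5
    return score

def rank_feed(posts, user):
    """Highest-scoring posts are shown first; the rest are buried."""
    return sorted(posts, key=lambda p: score_post(p, user), reverse=True)

# A user who mostly clicks and watches dog content...
user = {
    "topic_clicks": {"dogs": 12, "politics": 1},
    "topic_seconds": {"dogs": 340, "politics": 15},
}
posts = [
    {"id": "a", "topics": ["politics"], "friend_likes": 3},
    {"id": "b", "topics": ["dogs"], "friend_likes": 0},
]
# ...gets the dog post ranked first, regardless of anything else.
feed = rank_feed(posts, user)
```

Even in this toy version, notice that nothing in the score measures whether a post is accurate or useful, only how likely you are to engage with it.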

Algorithms can be used to curate information in whichever way they’re trained, which could include producing a more balanced feed, but most of the time it’s quite the opposite. It all depends on how the algorithms are designed, which reflects the goals of the company.

Most news-based algorithms – like the ones used by Facebook and Twitter, for instance – give you more of what you already seem to like because these companies prioritize attention, with the goal of getting as many clicks and likes as possible.

These algorithms not only select a very precise and narrow set of results for you to view, but they may also lead you down unfavorable paths. The goal becomes less about serving you and more about showing you posts that you’re most likely to respond to, click on, react to, like, or forward, even if those posts are low quality, come from questionable sources, or echo the same messages you’ve already been receiving.

To learn more about how to spot questionable information, check out 6 Tips to Steer Clear of Misinformation Online.

Burst Your Bubble

When services feed you more stories like the ones you’re already clicking on, this creates what’s known as a filter bubble.

YouTube is the most obvious example of a platform that recommends content based on what you already watch, but similar setups can be found on Netflix, Spotify, on the Instagram and Twitter Explore pages, in your Facebook feed, and on Amazon. Algorithms may also personalize which search results you see first.

“When you read books on your Kindle, the data about which phrases you highlight, which pages you turn, and whether you read straight through or skip around are all fed back into Amazon’s servers and can be used to indicate what books you might like next.” — Eli Pariser, The Filter Bubble

How does a filter bubble limit or change what you hear about?

Let’s say your go-to app is TikTok. That’s where you connect with friends, follow your favorite celebrities, and learn new things. But depending on whether you’re in or out of a certain bubble, you may not realize certain events are happening, like a protest across town, because the TikTok algorithm hides these results from you. The same scenario can be experienced across the most popular apps and websites.

Being in a filter bubble can cause people to see completely different stories, news headlines, articles, and advertisements. This can result in the spread of misinformation, like poorly researched health advice targeted at those without access to trustworthy sources of medical information. It can also lead to political echo chambers, as demonstrated in the interactive article Blue Feed, Red Feed, which shows just how different news stories are for people on different sides of the political spectrum in the US.

It might sound like a good thing to only see content that has been tailored for you and what you like. But consider this example: in the same way that your interest in dog training videos on YouTube will get you more dog-related recommendations, your neighbor’s interest in conspiracy theories on the same platform will lead them further down that same path.
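The dynamic at work here is a feedback loop: the system recommends what your profile already favors, and watching that recommendation strengthens the same preference. Here is a deliberately simplified simulation of that loop; the numbers and the reinforcement rule are invented for illustration, not taken from any real recommender:

```python
# Hypothetical sketch of the filter-bubble feedback loop; not any
# real recommender. Each round, the system recommends the user's
# current top topic, and watching it counts as even more interest
# in that topic, narrowing the feed further.

from collections import Counter

def simulate_feedback_loop(initial_interests, rounds):
    interests = Counter(initial_interests)
    history = []
    for _ in range(rounds):
        # Recommend whatever the profile currently ranks highest...
        recommended = interests.most_common(1)[0][0]
        history.append(recommended)
        # ...and record the resulting view as added interest in it.
        interests[recommended] += 1
    return history

# A mild initial preference for dog videos quickly crowds out
# every other topic in the recommendations.
history = simulate_feedback_loop({"dogs": 3, "news": 2, "music": 2}, 5)
```

The starting preferences are nearly even, yet the loop never recommends anything but the initial favorite: a small head start compounds into a one-topic feed.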

In fact, YouTube is known to nudge people toward certain extremist beliefs, particularly through the auto-play feature, which continues to play a line-up of automatically selected videos. And you don’t even need to go to the extremes to see it at work.

Try it yourself

  • Log out of Google and YouTube and open an incognito or private tab in your browser
  • Go to youtube.com
  • Search for the term “vegetarian recipes” and click on the first video in the list (yum!)
  • Check out the recommendations matched to that video

Do you notice any video recommendations that seem to be more provocative or extreme, for example, going beyond simple vegetarian meals and focusing on weight loss and more restrictive diets?

While the example of YouTube “vegetarian recipes” is relatively mild, it’s helpful to see how quickly the algorithm takes you further from your original search. And part of the issue is that by default, YouTube plays its videos continuously. This means that you may find yourself falling down a so-called rabbit hole without realizing it until hours later. But there’s something you can do to prevent it:

Turn off auto-play

If you find yourself easily falling down the YouTube or Netflix rabbit holes, turn off auto-play. You’ll find that this extra effort to manually hit “play” on videos will help bring you back to the present moment.


On YouTube:

  • Above the list of recommended videos, you will see the word “Autoplay”. If the switch is blue, autoplay is on; click it to turn it off (it will turn gray).
  • If you’re signed in, you can also turn it off in your settings: click on your profile → Settings → Autoplay → turn off if it’s on.


On Netflix:

  • Sign in to netflix.com/youraccount
  • Scroll down to the Profile & Parental Controls section
  • For each profile go to: Playback settings → Change
  • Uncheck the boxes next to ‘Autoplay next episode in a series on all devices.’ and ‘Autoplay previews while browsing on all devices.’
  • Save
  • Sign out if you’re done

If you know you’re viewing algorithmically curated content designed specifically for you across your apps and websites, the question is: how can you step outside of your filter bubble?

Mix up your news

A good way to burst your filter bubble is to subscribe to services that aggregate news and information from a variety of sources and with a diverse pool of perspectives. RSS feeds, forums, and mailing lists that exercise a broad range of opinions and themes may help you see outside of your bubble. GlobalVoices and The Syllabus are great options to start with.

You may also want to intentionally visit media websites that have different ways of framing issues, in order to gain insights into how topics you care about are being presented to others. Check out AllSides for a list of media bias rankings and for a selection of headlines as they’re framed on different news websites.

Mind the Gap

If you find that a friend or family member shares news stories that have opposing views to your own and you’d like to start a conversation about it, one way to break the ice is to compare your feeds and discuss the different content that’s recommended to you both.

Face your filter bubble

Sit down with a friend or family member who you think may have different social media and online content habits from yours, and compare what topics you see first in your news app or on your social media feeds.

  • Do the results surprise you?
  • Are you in different bubbles or the same one?
  • Why do you think you’re getting those specific recommendations and advertisements?
  • Can you connect the recommendations back to specific websites you’ve visited or searches you’ve made? Or maybe you share your device with a family member or partner and their online habits are influencing your recommendations?

If you decide to engage with your friend or family member in order to dive deeper into a difficult topic, we’ve rounded up a few recommendations to help you navigate the discussion:

And last, but certainly not least, there’s also no shame in taking a pass on a challenging discussion. The choice of whether you engage – online or offline – should feel right to you.

Looking for a next step? Check out Smartphones Call for Smart Habits for tips to enhance your digital wellbeing.

Last updated on: 4/22/2021