How Social Media Algorithms Push Harmful Content

Daniel Carter
March 2, 2026 · 5 min read

Quick Overview

Social media algorithms shape what you see. They are built to keep you scrolling. They often push upsetting content. This is not an accident. It is by design. Here is what you need to know.

How algorithms work:

  • They track every click and scroll.
  • They learn what grabs your attention.
  • They show you more of what you engage with.
  • Upsetting content gets high engagement.
  • So algorithms push more of it to you.
  • This can make you feel worse over time.
  • It is called the engagement loop.
  • It is hard to break on your own.

Signs the algorithm is harming you:

  • You feel worse after scrolling.
  • You feel anxious or angry online.
  • You spend more time than you plan to.
  • You keep seeing upsetting content.
  • You feel unable to stop scrolling.
  • You feel bad about yourself after using apps.
  • You sleep less because of screen time.

What you can do:

  • Set a daily time limit for each app.
  • Unfollow accounts that upset you.
  • Follow accounts that uplift you.
  • Take regular breaks from social media.
  • Turn off push notifications.
  • Use apps at set times, not all day.
  • Talk to friends in person instead.
  • Report harmful content when you see it.
  • Use platform tools to hide upsetting posts.
  • Check in with how you feel after using apps.

The bottom line: you have more control than you think. You can take back your feed. Read on to learn how algorithms work and what you can do about it.

Every time you open a social media app, an algorithm decides what you see. These systems are built to maximize engagement. Research shows that what keeps people engaged is often upsetting or disturbing content. Understanding how algorithms work is the first step to taking back control.

How Social Media Algorithms Work

Social media algorithms track every interaction you make. They record likes, shares, and comments. They measure how long you pause on a post. They even track what you scroll past quickly. This data builds a profile of your interests and emotional triggers. The algorithm uses that profile to serve content. The goal is to keep you on the platform as long as possible.
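The profile-and-ranking mechanism described above can be sketched in a few lines of Python. This is a hypothetical toy model, not any platform's actual code: `UserProfile`, `record_interaction`, and `rank_feed` are illustrative names, and real systems use machine-learned engagement predictions rather than a simple running affinity score.

```python
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    # Running affinity score per topic, learned from past interactions
    # (hypothetical stand-in for a real platform's interest model).
    topic_affinity: dict = field(default_factory=dict)

    def record_interaction(self, topic: str, weight: float) -> None:
        """Update the profile: a share weighs more than a brief pause."""
        self.topic_affinity[topic] = self.topic_affinity.get(topic, 0.0) + weight

def rank_feed(posts: list, profile: UserProfile) -> list:
    """Order posts by the user's affinity for each post's topic."""
    return sorted(
        posts,
        key=lambda p: profile.topic_affinity.get(p["topic"], 0.0),
        reverse=True,
    )

# A user who shared an angry post now sees outrage content ranked first.
user = UserProfile()
user.record_interaction("outrage", 3.0)   # shared an angry post
user.record_interaction("hobbies", 1.0)   # briefly paused on a craft video

feed = rank_feed(
    [{"topic": "hobbies", "id": 1}, {"topic": "outrage", "id": 2}],
    user,
)
print([p["topic"] for p in feed])  # ['outrage', 'hobbies']
```

Note how nothing in this sketch asks what the user *wants* to see; it only measures what held their attention, which is exactly why upsetting content rises to the top.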

The Engagement Trap

Research from multiple universities has demonstrated that content triggering strong emotional reactions, such as anger, fear, outrage, and sadness, generates significantly more engagement than neutral or positive content. This means algorithms naturally amplify the most emotionally charged material: graphic violence and disturbing imagery, misinformation that provokes outrage, divisive political content, and content that triggers anxiety or despair.

The Facebook Live incident involving Ronnie McNutt in 2020 demonstrated how algorithmic amplification can spread graphic content far beyond its original audience, reaching millions of people — including children — before platforms intervened.

Doomscrolling: The Algorithmic Rabbit Hole

Doomscrolling — the compulsive consumption of negative news and content — is partly a human tendency and partly an algorithmic creation. When you engage with negative content, the algorithm interprets this as a preference. It then serves more of the same. This creates a feedback loop. Over time, the loop can significantly worsen anxiety and depression.
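The feedback loop can be illustrated with a small deterministic simulation. This is a hypothetical sketch, not a model of any real platform: each step, engagement with negative posts nudges the feed's negative share up, engagement with neutral posts nudges it down, and the engagement rates are assumed values chosen to reflect the research finding that negative content engages more.

```python
def simulate_feedback_loop(steps: int = 50,
                           engage_negative: float = 0.8,
                           engage_neutral: float = 0.3) -> float:
    """Return the feed's share of negative content after `steps` ranking
    updates, starting from a balanced 50/50 mix (toy model, assumed rates)."""
    share = 0.5
    for _ in range(steps):
        # Expected engagement pulls the mix toward whatever gets clicked more.
        pull_up = share * engage_negative          # engagement with negative posts
        pull_down = (1 - share) * engage_neutral   # engagement with neutral posts
        share += 0.05 * (pull_up - pull_down)
        share = min(max(share, 0.05), 0.95)        # keep the share in bounds
    return share

print(f"negative share after 50 steps: {simulate_feedback_loop():.2f}")
```

Because negative content engages more in this model, the share drifts steadily upward from the balanced starting point; flipping the engagement rates (engaging more with neutral content) makes it drift down instead, which is the logic behind the "train your algorithm" advice later in this article.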

[Image] The Facebook platform interface, one of the major social media networks whose algorithms influence content visibility. (Facebook via Wikimedia Commons, public domain.)

How To Take Back Control

1. Audit Your Feed

Spend 10 minutes scrolling through each of your social media feeds and honestly assess how the content makes you feel. If more than half of what you see leaves you feeling anxious, angry, or drained — it is time for a reset.

2. Train Your Algorithm

Algorithms learn from your behavior, which means you can retrain them. Actively engage with positive, informative, or uplifting content. Use the “not interested” or “see less” option on content that disturbs you. Unfollow or mute accounts that consistently post harmful content. Follow accounts focused on your hobbies, education, or personal growth.

3. Use Platform Safety Tools

Most major platforms offer content filtering tools that many users never discover. Instagram and TikTok have sensitive content controls. YouTube allows you to remove specific channels from recommendations. Facebook lets you snooze or unfollow without unfriending. Twitter/X has muted words and content preferences. For a deeper guide on these tools, see our comprehensive digital safety guide.

4. Set Time Boundaries

[Image] Viral TikTok content, illustrating how algorithm-driven platforms can amplify both helpful and harmful material. (Gabbybratcher via Wikimedia Commons, CC BY-SA 4.0.)

  • Use built-in screen time tools to set daily limits per app
  • Remove social media apps from your home screen
  • Designate phone-free times, especially the first and last hour of your day
  • Turn off non-essential notifications

5. Choose Intentional Consumption

Replace passive scrolling with intentional media consumption. Subscribe to newsletters from trusted sources. Listen to podcasts on topics you care about. Bookmark specific websites rather than relying on algorithm-driven feeds. Set specific times for news consumption rather than constant checking.

Protecting Young People

Children and teenagers are especially vulnerable to algorithmic harm. Their brains are still developing emotional regulation skills. Parents should:

  • Have open conversations about what their children encounter online
  • Use parental controls and content filters
  • Follow and engage with their children on social platforms
  • Model healthy digital habits themselves
  • Create technology-free family time

The Bigger Picture

While individual action matters, systemic change is also necessary. Advocacy for algorithmic transparency, stronger content moderation, and platform accountability continues to grow worldwide. By understanding how these systems work, we become better equipped to protect ourselves and push for the changes that will protect everyone.


Written by

Daniel Carter

Editor and curator of RonnieMcnutt.com — a mental health awareness site focused on veteran suicide prevention, PTSD, and the legacy of Ronnie McNutt.
