

Why TikTok Isn't Safe for Kids: 5 Shocking Features Parents Need to Know Now

Behind the fun dance videos and viral challenges, TikTok is hiding some serious risks. From virtual strip clubs to addictive algorithms and harmful beauty filters that erode self-esteem, the dangers are real, leaving many parents asking, "Is TikTok safe for kids?"

Several investigations and lawsuits reveal that TikTok prioritizes profits over safety, keeping harmful features live even when it knows kids are at risk. Let’s break it down.


TikTok LIVE: A “Virtual Strip Club” for minors

Utah’s latest lawsuit reveals shocking findings: TikTok LIVE — a feature that allows users to stream in real time — has turned into a virtual strip club. Investigators found minors being paid with digital gifts, such as cute emojis or digital plushies, to perform sexually explicit acts. With one-third of U.S. TikTok users aged 14 or younger, the platform has opened up an alarming space where children are exposed to exploitation on a massive scale.

According to the Utah attorney general, TikTok not only knew about these abuses but failed to act. An internal probe discovered that in just one month, kids received over 1 million gifts for “transactional” behavior. Investigators reported finding underage girls performing sexually suggestive acts within minutes of browsing the Live feed.

What TikTok knew:

  • Minors were performing sexual acts on TikTok LIVE for digital gifts, with TikTok directly profiting from its virtual currency feature.
  • TikTok claimed users must be 18 to go live, but age verification wasn’t enforced.
  • Investigators found boys using filters to pose as girls and receive gifts.
  • Despite knowing these risks, TikTok kept the feature active because it was too profitable to shut down.

Utah News Dispatch reported:

“Attorneys for Utah paint TikTok LIVE as an at times seedy corner of the internet that allows the company ‘to profit from crime and the sexual exploitation of children.’
‘In countless live streams, minors have been encouraged by adults to—among other illicit acts—strip, spread their legs, and flash body parts to the camera, in exchange for virtual Gifts,’ the complaint said.
And those gifts, according to court documents, are designed to be ‘tempting’ to children, described as ‘cute, colorful emojis reminiscent of cartoons and Disney characters.’
‘To target this young of an audience shows that even TikTok knows that many TikTok LIVE users are under 18 years old, despite TikTok saying the opposite,’ the complaint reads.”

Utah Governor Spencer Cox said,

“Such disregard for the safety of young users on the platform, much less profiting off their exploitation, cannot and will not be tolerated. We will take all necessary actions to protect them from TikTok’s egregious behavior.”

How TikTok addiction develops in just 35 minutes

TikTok knows exactly how fast it can pull users in. According to NPR, internal documents reveal that TikTok staff compared their algorithm to slot machines. The company calculated that 260 videos—about 35 minutes—is all it takes for users to form a habit. And with 95% of smartphone users under 17 using TikTok, kids are getting caught in addictive loops every day. These rapid-fire videos, often as short as 8 seconds, keep them scrolling endlessly.

What TikTok knew:

  • Addictive design: TikTok’s rapid-fire videos are designed to be irresistible.
  • Compulsive TikTok use leads to:
    • anxiety, 
    • poor memory, and 
    • decreased empathy.
  • TikTok knew that kids' use of the app disrupts:
    • sleep, 
    • homework, and 
    • relationships with family and friends.
  • Minors lack the executive function to control their screen time—but TikTok intentionally targeted children anyway.
  • California AG Rob Bonta criticized the platform for targeting kids’ inability to set healthy boundaries.
  • Tristan Harris, co-founder of the Center for Humane Technology, described the addictive nature of social media as “a race to the bottom of the brain stem.”

Internal documents revealed:

“One TikTok executive referred to American teens as ‘the golden audience,’ and also stated, ‘It’s better to have young people as an early adopter.’”

Another executive admitted TikTok’s algorithm even pulls attention away from basic needs: 

“I think we need to be cognizant of what it might mean for other opportunities, and when I say other opportunities, I literally mean sleep, and eating, and moving around the room, and looking at someone in the eyes.”

TikTok’s time limit feature fails to protect kids

TikTok introduced a 60-minute time limit tool to address concerns about screen time, but according to NPR, it was more of a PR move than a real solution. 

Did you catch that? 

TikTok’s “helpful” time limit tool is more about good PR than helping kids unplug.

What TikTok knew:

  • The time limit tool reduced screen time by just 1.5 minutes—from 108.5 minutes to 107 minutes per day.
  • Internal chats revealed the real goal was to boost public trust through media coverage, not to cut usage.
  • One employee said outright: “Our goal is not to reduce the time spent.”
  • Another employee confirmed the goal was to increase daily active users (DAU) and retention.


TikTok beauty filters harm kids' self-esteem and mental health

TikTok’s beauty filters use AI to make users look thinner, younger, or flawless. But these seemingly harmless filters are causing serious harm—especially among young girls. New York’s attorney general warned that these filters contribute to body image issues, eating disorders, and body dysmorphia.

What TikTok knew:

  • TikTok prioritized attractive users in its main feed. Kentucky investigators found that TikTok deliberately adjusted the algorithm to amplify “beautiful” users and reduce the visibility of those deemed “unattractive.”
  • Internal discussions acknowledged that beauty filters harm self-esteem, yet the company didn’t provide warnings or resources.
  • New York AG Letitia James criticized TikTok for pushing beauty filters to young users to keep them on the app longer.

Related: Body Image Talking Points from Dr. Lexie Kite Part 1

Filter bubbles and dangerous content

TikTok’s algorithm doesn’t just keep users engaged — it also traps them in harmful filter bubbles, feeding them more of the content they interact with. This can lead to extreme negative content, like videos promoting self-harm and eating disorders.

What TikTok knew:

  • Within 20 minutes of following certain accounts, users were fed a steady stream of self-harm and suicide-related content.
  • TikTok acknowledged content moderation failures, with disturbing leakage rates (the share of violating content that slipped past moderation), including:
    • 35.71% of content normalizing pedophilia
    • 33.33% of content involving minor sexual solicitation
    • 39.13% of content involving minor physical abuse
    • 30.36% of content leading minors off-platform
    • 50% of content glorifying minor sexual assault
    • 100% of content fetishizing minors

No response from TikTok

At the Coalition to End Sexual Exploitation (CESE) Summit in August, Lili Nguyen, a Trust and Safety lead at TikTok, spoke about the platform’s community guidelines and “zero tolerance” policies. However, when we contacted her with questions about how TikTok enforces these policies, Nguyen didn’t respond. TikTok's public messaging paints a positive picture, but the company's actions tell a different story.

What can parents do to keep their kids safer from TikTok’s dangers?

These findings make it clear: TikTok’s design is NOT just about fun videos—it’s about keeping kids hooked, no matter the cost. Here’s how you can help protect your child from the dangers of social media:

  • Delay social media: Hold off on introducing social media until at least age 16.
  • Set clear rules: Establish tech-free zones or times (like during meals or bedtime) to help your family unplug.
  • Monitor usage: Use tools like Bark to keep an eye on your child’s activity and get alerts for risky behavior. Canopy filters nudity when TikTok is accessed through a browser (rather than the app).
  • Talk about what they see: Encourage open conversations about social media, body image, and online safety. Teach your kid to question what they see and spot filtered, curated content.
  • Find safer alternatives: Use tech-healthy phones like Gabb, Bark, Pinwheel, Troomi, or MMGuardian, which limit risky apps but keep kids connected. Consider smartwatches for younger kids.
  • Build a support network: Team up with other parents to create shared guidelines and reduce the pressure on kids to conform to social media norms.


Your child’s digital safety starts with you

TikTok’s inner workings reveal a platform that prioritizes profits over safety—one that targets kids, amplifies harmful content, and uses addictive design to keep them scrolling. 

But parents aren’t powerless.

You can’t control TikTok’s design, but you can control how your family engages with it. With powerful tools and strategies, you can keep your kids safer from the grip of harmful platforms like TikTok.

Brain Defense: Digital Safety Curriculum - Family Edition

"Parents are desperate for concepts and language like this to help their children. They would benefit so much from this program - and I think it would spur much needed conversations between parents and children.” --Jenet Erikson, parent

Learn more or buy