Investigation Into TikTok’s Algorithm Reveals It’s Promoting Sexual & Drug Content To Children As Young As 13



TikTok might not be all fun and games: a recent investigation has found that the app’s algorithm is promoting sexual, drug-related, and violent content to children as young as 13 years old.

As part of a Wall Street Journal investigation, a bot account registered as a 13-year-old user searched for ‘OnlyFans’-type content and was served numerous sexual and pornographic videos on the China-based app.


The investigation did not involve real teens; instead, the WSJ used numerous bot accounts registered as minors to test what the searches surfaced.

RELATED: High School Students Facing Criminal Charges After Tripping Black Student For TikTok Video

One of the bots was served 569 videos about drug use, including references to cocaine and meth as well as advertisements for drugs. In addition, more than 100 videos promoting pornography sites and sex shops appeared; although the accounts posting them were labeled adults-only, their content still reached the adolescent accounts.

Other adult content that surfaced included alcohol abuse, disorders, and violence. TikTok acknowledges it does not currently have a measure in place to separate adult content from content intended for children.

Despite this, TikTok said in a statement: “Protecting minors is vitally important, and TikTok has taken industry-first steps to promote a safe and age-appropriate experience for teens.”

TikTok did note that it is working on a filtering tool to restrict content for teens and younger users.
