Social Media Pushing Suicide-Related Content to Teens Despite New UK Safety Laws

A new study by the Molly Rose Foundation, reported by The Guardian, reveals that social media platforms are still pushing large volumes of suicide-related content to teenagers despite new UK safety laws. The foundation created dummy accounts posing as a 15-year-old girl and found that the recommendation algorithms behind Instagram Reels and TikTok’s “For You” page quickly began surfacing harmful content once the accounts engaged with initial posts about depression, suicide, and self-harm.
The study found that a staggering 97% of recommended videos on Instagram Reels and 96% on TikTok were harmful. On TikTok, 55% of the harmful content referenced suicidal or self-harm ideation, and 16% referenced specific suicide methods.
According to Andy Burrows, chief executive of the Molly Rose Foundation, the platforms have failed to properly address the issue, and the scale of harm on TikTok has worsened. The report also highlights that social media companies profit from advertising placed alongside these harmful posts and rely on overly narrow definitions of what constitutes “harm.”
In response to the findings, an Ofcom spokesperson stated that new measures to protect children online have been introduced since the research was conducted and will require services to “tame toxic algorithms.” The technology secretary, Peter Kyle, added that 45 sites are under investigation. Both TikTok and Meta (Instagram’s parent company) have released statements disputing the report’s methodology and defending their efforts to protect teenagers online.