Uncovering #sadtok – how we journeyed into TikTok’s rabbit holes of despair and how to make a Big Tech giant care

Lisa Dittmer, David Nolan

We will share insights from our journey wading into TikTok’s mental health rabbit holes and from speaking with young people in Kenya and the Philippines about the lure and impact of TikTok’s dark side. A not-too-grim talk about a platform’s lack of care and about fighting for rights with research and data science.
Stage 4 / T

Content warning: self-harm and suicide

“Have you ever just wanted for it all to end? Have you ever thought about the release it would give, all the pain that would go away?” It doesn’t take much for a teen’s TikTok ‘For You’ Feed to fill up with video clips romanticizing suicide and normalizing self-harm.

In 2023, Amnesty International researchers in four countries, together with partners at the Algorithmic Transparency Institute and AI Forensics, set out to understand the algorithmic pipeline behind #sadtok. We’ll share what we learned along the way about the challenges of pushing the limits of civil society research, and about the value of combining technical research (adversarial audits) with young people’s lived experiences and perspectives from beyond the US-European digital policy bubble. Half a year into our campaign to make TikTok safer, we’ll also discuss the use of research in holding a Big Tech giant to account for its human rights responsibilities.

Lisa Dittmer – Researcher, Children and Young People’s Digital Rights
David Nolan – Senior Investigative Researcher