re:publica 25
26-28 May 2025
STATION Berlin
Content warning: self-harm and suicide
“Have you ever just wanted for it all to end? Have you ever thought about the release it would give, all the pain that would go away?” It doesn’t take much for a teen’s TikTok ‘For You’ feed to fill up with video clips romanticizing suicide and normalizing self-harm.
In 2023, Amnesty International researchers in four countries, together with partners at the Algorithmic Transparency Institute and AI Forensics, set out to understand the algorithmic pipeline behind #sadtok. We’ll share what we learned along the way about the challenges of pushing the limits of civil society research, and about the value of combining technical research (adversarial audits) with young people’s lived experiences and perspectives beyond the US-European digital policy bubble. Half a year into our campaign to make TikTok safer, we’ll also discuss how research can be used to hold a Big Tech giant to account for its human rights responsibilities.