Wheel of Misfeeds – Why transparency reporting is broken and the Digital Services Act won’t fix it.

Svea Windwehr, Jillian York, Robert Gorwa

Summary
Content moderation at scale can massively interfere with users’ fundamental rights. To counter the opacity of corporate content moderation, regulation like the DSA relies heavily on transparency reports, a fundamentally broken tool. We’ll explore how we got here and discuss meaningful alternatives.
Stage 6
Talk
English
Conference

Platforms like Instagram, YouTube and X are the interfaces through which millions of users experience the web, access information, and interact with each other. They moderate users’ content at scale and routinely restrict or remove content and accounts. Many of these decisions are made in a subjective and opaque manner which, as experts have highlighted for decades, can have massively negative implications for users’ freedom of expression. Transparency reporting obligations have been regulators’ favorite tool for shedding some light onto the darkness that is corporate content moderation. Consequently, the Digital Services Act (DSA), the EU’s new regulatory framework for online platforms, also contains extensive rules requiring platforms to publish reports on their content moderation operations. However, this highly quantified approach fails to give accessible, meaningful insight into platforms’ practices and their interpretation of laws and terms of service. In this session, we will critique the current state of transparency reporting, discuss how we got there, and explore meaningful alternatives that will benefit users.

Jillian York, Director for International Freedom of Expression