Wheel of Misfeeds: Why Transparency Reporting Is Broken and the Digital Services Act Won’t Fix It

Svea Windwehr, Jillian York, Robert Gorwa

Content moderation at scale can massively interfere with users’ fundamental rights. To counter the opacity of content moderation, regulation relies on transparency reports, a fundamentally broken tool. Join our game show contestants to explore how we got here and discuss meaningful alternatives.

Platforms like Instagram, YouTube and X are the interfaces through which millions of users experience the web, access information, and interact with each other. They moderate users’ content at scale and routinely restrict or remove content and accounts. Many of these decisions are made in a subjective and opaque manner which, as experts have highlighted for decades, can have massively negative implications for users’ freedom of expression. Transparency reporting obligations have been regulators’ favorite tool for shedding some light onto the darkness that is corporate content moderation. Consequently, the Digital Services Act (DSA), the EU’s new regulatory framework for online platforms, also contains extensive rules requiring platforms to publish reports on their content moderation operations. However, this highly quantified approach fails to give accessible, meaningful insight into platforms’ practices and their interpretation of laws and terms of service. In this session, we will critique the current state of transparency reporting, discuss how we got here, and explore meaningful alternatives that will benefit users.