re:publica 25
26–28 May 2025
STATION Berlin
Svea Windwehr, Jillian York, Robert Gorwa
Platforms like Instagram, YouTube and X are the interfaces through which millions of users experience the web, access information, and interact with each other. They moderate users’ content at scale, routinely restricting or removing content and accounts. Many of these decisions are made in a subjective and opaque manner that, as experts have highlighted for decades, can have massively negative implications for users’ freedom of expression. Transparency reporting obligations have been regulators’ favorite tool for shedding some light on the darkness that is corporate content moderation. Consequently, the Digital Services Act (DSA), the EU’s new regulatory framework for online platforms, also contains extensive rules requiring platforms to publish reports on their content moderation operations. However, this highly quantified approach to content moderation fails to give accessible, meaningful insight into platforms’ practices and their interpretation of laws and terms of service. In this session, we will critique the current state of transparency reporting, discuss how we got here, and explore meaningful alternatives that will benefit users.