re:publica 25
26-28 May 2025
STATION Berlin
Amos Wasserbach, Friederike Quint
As harmful content rises across social media and platforms dismantle their protections against it, the debate on content moderation is intensifying. Countries are taking different approaches, balancing freedom of expression against the need to protect individuals from harm. But who should bear the responsibility for managing harmful content: governments, platforms, civil society, or others? And how strict should content moderation regimes be? The Content Moderation Lab at the TUM Think Tank will present data from 10 democracies on public preferences for balancing free speech and moderation. In our workshop, run together with Das NETTZ, we will apply this data to real-world scenarios, exploring how different populations would handle harmful content. Each group will examine the preferences of one country and decide, on the basis of real-life examples, how best to moderate content. By comparing the resulting solutions, we will explore whether culturally sensitive moderation approaches are essential to meet the needs of different democratic communities.
This programme session is supported by Stiftung Mercator.

