The Ethics of Content Moderation: Who Should Decide What We Say Online?
Tech Against Terrorism | November 30, 2023
00:27:37 | 19 MB

In this week's episode, we discuss the challenges and complexities of content moderation in the online space, asking who gets to decide what we say online and why. We situate this debate in the context of tech platforms facing ever-increasing pressure to moderate content that is considered harmful or otherwise undesirable.

We delve into the mechanics of how content is moderated, focusing on the evolving roles and responsibilities of tech companies and governments in determining what counts as acceptable content. Decisions to deplatform individuals or groups, even when their content is not explicitly illegal, raise questions about the legitimacy of tech companies as arbiters of public dialogue.

Join Archie Macfarlane as he speaks with Alastair Reed, Associate Professor at the Cyber Threats Research Centre (CYTREC) at Swansea University and former Director of the International Centre for Counter-Terrorism (ICCT) in the Hague.

You can read a transcript of this episode here.

If you want to find out more about Tech Against Terrorism and our work, visit our website or follow us on Twitter, where you can find resources on this topic.
