🗓️ 10 June 2025
⏱️ 47 minutes
Attempts to moderate online hate might actually be creating more harmful content. Tamar Mitts is a professor of international and public affairs at Columbia University, where she is a faculty member at the Saltzman Institute of War and Peace Studies, the Institute of Global Politics, and the Data Science Institute. She joins host Krys Boyd to discuss the difficult task of policing online hate, why content moderation isn’t working as intended, and the sites that users go to for the most extreme ideas. Her book is “Safe Havens for Hate: The Challenge of Moderating Online Extremism.”
Learn about your ad choices: dovetail.prx.org/ad-choices
0:00.0 | The effort to eliminate hate speech and violent ideologies on social media platforms is sometimes likened to playing the carnival game Whac-A-Mole. |
0:18.0 | If only it were that easy. In Whac-A-Mole, as soon as you force a |
0:22.9 | critter back down into its hole, it does emerge somewhere else on the board, but there are a finite |
0:28.1 | number of places where it can pop back up, and the moles themselves never change. For all their |
0:33.7 | dangerous effects on society, you have to give it to the purveyors of hate speech. |
0:37.6 | They are way more sophisticated than this. |
0:40.6 | From KERA in Dallas, this is Think. |
0:43.3 | I'm Krys Boyd.
0:44.7 | Take out hate speech on Facebook. |
0:46.7 | And sure, that same content can easily be posted to a site like Telegram, which has been |
0:51.4 | less aggressive about content moderation. |
0:53.7 | In that case, the mole goes to another board entirely. |
0:57.2 | And the people who want to spread these ideas might also find ways to lure in eyeballs on a highly |
1:01.8 | moderated site by changing how and what they communicate there. |
1:05.8 | So the mole on Facebook gets transformed into something like a bunny that nobody wants to whack. |
1:10.6 | And anybody who |
1:11.6 | finds that bunny appealing enough can get to the uncensored violent version of the same message with |
1:16.7 | just a couple of clicks. My guest has been working to understand how extremist groups adapt |
1:22.4 | their messaging to allow versions of their ideas to escape platform censorship, use bans on more mainstream sites |
1:29.2 | to stoke grievances, and migrate their most dangerous content to less restrictive sites. |
1:34.9 | Tamar Mitts is a professor of international and public affairs at Columbia University, where she's |
1:40.0 | a faculty member at the Saltzman Institute of War and Peace Studies, the Institute of Global Politics, and the Data Science Institute. Her book is called Safe Havens for Hate: The |
... |
Disclaimer: The podcast and artwork embedded on this page are from KERA, and are the property of its owner and not affiliated with or endorsed by Tapesearch.
Generated transcripts are the property of KERA and are distributed freely under the Fair Use doctrine. Transcripts generated by Tapesearch are not guaranteed to be accurate.
Copyright © Tapesearch 2025.