Do Platform Migrations Compromise Content Moderation? Evidence from r/The_Donald and r/Incels
Moderation reduces activity, but at the expense of radicalizing migrated users.
Best Paper Honorable Mention Award
Manoel Horta Ribeiro, Shagun Jhaver, Savvas Zannettou, Jeremy Blackburn, Gianluca Stringhini, Emiliano De Cristofaro, and Robert West, “Do Platform Migrations Compromise Content Moderation? Evidence from r/The_Donald and r/Incels,” Proc. ACM Hum.-Comput. Interact. 5, CSCW2, Article 316 (October 2021), 24 pages, DOI: 10.1145/3476057
Media coverage
- “Sanctioned Online Communities May Become More Radicalized, New Study Finds,” Rutgers News, October 22, 2021
- Will Bedingfield, “Deplatforming works, but it’s not enough to fix Facebook and Twitter,” Wired, January 15, 2021
- Craig Timberg and Drew Harwell, “TheDonald’s owner speaks out on why he finally pulled plug on hate-filled site,” The Washington Post, February 5, 2021
- Billy Perrigo, “Big Tech’s Crackdown on Donald Trump and Parler Won’t Fix the Real Problem With Social Media,” Time, January 12, 2021
- Jeremy Blackburn, Robert W. Gehl and Ugochukwu Etudo, “Does ‘deplatforming’ work to curb hate speech and calls for violence? 3 experts in online communications weigh in,” The Conversation, January 15, 2021
- Jessica Colarossi, “Banning Trump from Social Media Makes Sense. But Beware the Downside,” The Brink, January 8, 2021
- David Ingram, “Does ‘deplatforming’ work? Trump’s most extreme fans will find him, research says,” NBC News, January 10, 2021
- Brandy Zadrozny, “Trump’s blog isn’t lighting up the internet,” NBC News, May 11, 2021
- Kaitlyn Tiffany, “The Secret Internet of TERFs,” The Atlantic, December 8, 2020
Abstract
When toxic online communities on mainstream platforms face moderation measures, such as bans, they may migrate to other platforms with laxer policies or set up their own dedicated websites. Previous work suggests that within mainstream platforms, community-level moderation is effective in mitigating the harm caused by the moderated communities. It is, however, unclear whether these results also hold when considering the broader Web ecosystem. Do toxic communities continue to grow in terms of their user base and activity on the new platforms? Do their members become more toxic and ideologically radicalized? In this paper, we report the results of a large-scale observational study of how problematic online communities progress following community-level moderation measures. We analyze data from r/The_Donald and r/Incels, two communities that were banned from Reddit and subsequently migrated to their own standalone websites. Our results suggest that, in both cases, moderation measures significantly decreased posting activity on the new platform, reducing the number of posts, active users, and newcomers. In spite of that, users in one of the studied communities (r/The_Donald) showed increases in signals associated with toxicity and radicalization, which justifies concerns that the reduction in activity may come at the expense of a more toxic and radical community. Overall, our results paint a nuanced portrait of the consequences of community-level moderation and can inform their design and deployment.
BibTeX citation
@article{ribeiro2021migrations,
author = {Horta Ribeiro, Manoel and Jhaver, Shagun and Zannettou, Savvas and Blackburn, Jeremy and Stringhini, Gianluca and De Cristofaro, Emiliano and West, Robert},
title = {Do Platform Migrations Compromise Content Moderation? Evidence from r/The_Donald and r/Incels},
year = {2021},
issue_date = {October 2021},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
volume = {5},
number = {CSCW2},
url = {https://doi.org/10.1145/3476057},
doi = {10.1145/3476057},
journal = {Proc. ACM Hum.-Comput. Interact.},
month = {oct},
articleno = {316},
numpages = {24},
}