Online Harassment and Content Moderation: The Case of Blocklists

Themes: harassment tactics, who is vulnerable, and algorithmic solutions.

Featured in Editor's Spotlight

Shagun Jhaver, Sucheta Ghoshal, Amy Bruckman, and Eric Gilbert (2018), “Online Harassment and Content Moderation: The Case of Blocklists,” ACM Trans. Comput.-Hum. Interact. (TOCHI) 25, 2, Article 12 (March 2018), 33 pages. DOI: 10.1145/3185593

Media coverage


Online harassment is a complex and growing problem. On Twitter, one mechanism people use to avoid harassment is the blocklist: a list of accounts that are preemptively blocked from interacting with a subscriber. In this paper, we present a rich description of Twitter blocklists: why they are needed, how they work, and their strengths and weaknesses in practice. Next, we use blocklists to interrogate online harassment: the forms it takes, as well as the tactics used by harassers. Specifically, we interviewed both people who use blocklists to protect themselves and people who are blocked by blocklists. We find that users are not adequately protected from harassment, and at the same time, many people feel they are blocked unnecessarily and unfairly. Moreover, we find that not all users agree on what constitutes harassment. Based on our findings, we propose design interventions for social network sites with the aim of protecting people from harassment while preserving freedom of speech.

BibTeX citation

@article{10.1145/3185593,
	author = {Jhaver, Shagun and Ghoshal, Sucheta and Bruckman, Amy and Gilbert, Eric},
	title = {Online Harassment and Content Moderation: The Case of Blocklists},
	year = {2018},
	issue_date = {April 2018},
	publisher = {Association for Computing Machinery},
	address = {New York, NY, USA},
	volume = {25},
	number = {2},
	issn = {1073-0516},
	url = {https://doi.org/10.1145/3185593},
	doi = {10.1145/3185593},
	journal = {ACM Trans. Comput.-Hum. Interact.},
	month = mar,
	articleno = {12},
	numpages = {33}
}