Abstract
Content moderation is an essential part of online community health and governance. While much extant research centers on what happens to the content, moderation also involves the management of violators. This study focuses on how moderators (mods) make decisions about their actions after a violation takes place but before the sanction, by examining how they "profile" the violators. Through observations and interviews with volunteer mods on Twitch, we found that mods engage in a complex process of collaborative evidence collection and profile violators into different categories to decide the type and extent of punishment. Mods consider violators' characteristics as well as their behavioral history and the violation context before taking moderation action. The main purpose of the profiling was to avoid excessive punishment and to integrate violators more fully into the community. We discuss the contributions of profiling to moderation practice and suggest design mechanisms to facilitate mods' profiling processes.
| Original language | English (US) |
| --- | --- |
| Article number | 410 |
| Journal | Proceedings of the ACM on Human-Computer Interaction |
| Volume | 5 |
| Issue number | CSCW2 |
| DOIs | |
| State | Published - Oct 18 2021 |
All Science Journal Classification (ASJC) codes
- Social Sciences (miscellaneous)
- Human-Computer Interaction
- Computer Networks and Communications
Keywords
- content moderation
- live streaming
- profiling
- volunteer moderator