Abstract
Content moderation is a critical service performed by a variety of people on social media, protecting users from offensive or harmful content by reviewing posts and removing either the content itself or the user who posted it. These moderators fall into one of two categories: employees or volunteers. Prior research has suggested that the two types of moderators differ in effectiveness, with the more transparent user-based moderation being useful for educating users. However, direct comparisons between commercially moderated and user-moderated platforms are rare, and apart from the difference in transparency, we still know little about what other disparities in user experience these two moderator types may create. To explore this, we conducted cross-platform surveys of over 900 users of commercially moderated (Facebook, Instagram, Twitter, and YouTube) and user-moderated (Reddit and Twitch) social media platforms. Our results indicated that although user-moderated platforms did seem to be more transparent than commercially moderated ones, this did not lead to user-moderated platforms being perceived as less toxic. In addition, users of commercially moderated platforms want companies to take more responsibility for content moderation than they currently do, while users of user-moderated platforms want designated moderators and those who post on the site to take more responsibility. Across platforms, users seem to feel powerless and want to be taken care of when it comes to content moderation rather than engaging in it themselves.
| Original language | English (US) |
|---|---|
| Article number | 626409 |
| Journal | Frontiers in Human Dynamics |
| Volume | 3 |
| DOIs | |
| State | Published - 2021 |
| Externally published | Yes |
All Science Journal Classification (ASJC) codes
- Human-Computer Interaction
- Management, Monitoring, Policy and Law
- Sociology and Political Science
- Demography
Keywords
- Content moderation
- Human Computer Interaction (HCI)
- Social network sites (SNSs)
- Survey Methodology
- Quantitative research