Identifying opportunities to improve content moderation
This thesis contributes a nuanced understanding of the challenges inherent in designing and implementing fair and efficient content moderation systems. Drawing on large-scale data analyses, participant observation, survey data, and in-depth qualitative interviews, this research describes the perspectives and practices of different stakeholders: users who suffer online harassment, individuals whose posts are removed, people who rely on blocking tools to censor others, and community managers who volunteer their time to regulate content. This work provides theoretical and practical guidelines for moderating against online harassment without impinging on free speech, for designing solutions that incorporate the needs of different user groups, and for adopting automated moderation tools that explain their decisions and remain sensitive to localized contexts.