Experiences of Moderation, Moderators, and Moderating by Online Users Who Engage with Self-Harm and Suicide Content [Open Access]


Abstract

Despite the growing role of content moderation online, particularly in mental health spaces, there is limited research into the effectiveness of platform practices and a lack of user-driven evidence to inform regulatory guidance. This study aimed to explore user accounts of moderation related to self-harm and suicide (SH/S) content online, including their experiences of being moderated and their perspectives on moderation practices. Where participants were also moderators, their experiences of moderating SH/S content were explored as well. Fourteen participants were interviewed at baseline, n = 8 at 3 months, and n = 7 at 6 months; they also completed daily diaries of online use between interviews. Thematic analysis was used to explore their perspectives. Three key themes were identified: ‘content reporting behaviour’, exploring factors influencing decisions to report content; ‘perceptions of having content blocked’, exploring experiences and speculative accounts of SH/S content moderation; and ‘content moderation and moderators’, examining participant views on moderation approaches and their experiences of moderating. The study revealed challenges in moderating SH/S content online and highlighted inadequacies in current procedures. Participants struggled to self-moderate online SH/S spaces, showing the need for proactive platform-level strategies. Additionally, whilst the lived experience of moderators was valued, associated risks emphasised the need for supportive measures. Policymakers and industry leaders should prioritise transparent and consistent moderation practices.

Publication
Digital Society

Read Open Access Here

Dr Zoë Haime
Senior Research Associate

My research interests include social cognition, mental health and online harms.
