Trust and Safety in Social XR
Keywords

Extended Reality (XR)
content moderation
platform governance
immersive harms
trust and safety
spatial turn

How to Cite

Redeker, D., Pfannenschmidt, N., Baron Romero, M., Durán, G., & Villa Hernandez, A. S. (2026). Trust and Safety in Social XR: Mapping the Spatial Turn in Content Moderation. Journal of Online Trust and Safety, 3(2). https://doi.org/10.54501/jots.v3i2.290

Abstract

Social Extended Reality (XR) platforms introduce new challenges for content moderation. Unlike traditional social media, XR enables embodied, immersive interaction—intensifying the psychological and social impacts of online harms such as violence, sexual harassment, manipulation, and impersonation. Drawing on an analysis of platform policies and moderation practices, this paper examines how social XR platforms govern these risks. We find that legacy content moderation strategies, such as algorithmic content moderation, are insufficient for the novel characteristics of XR, where harmful material can consist of non-verbal, spatial, and highly engaging behavior. Comparing VRChat’s structured policy framework with Horizon Worlds’ (now Worlds’) more fragmented approach, we highlight gaps in policy clarity, enforcement transparency, and user protection. The paper contributes to emerging debates on platform governance in immersive media, arguing that both state and platform actors should recalibrate their approaches to accountability, real-time moderation, and jurisdictional oversight. We argue that content moderation in XR is not merely a technical challenge—it is a socio-political dilemma requiring participatory, rights-respecting solutions rooted in human rights norms.

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

Copyright (c) 2026 Journal of Online Trust and Safety