Protecting Young Users on Social Media: Evaluating the Effectiveness of Content Moderation and Legal Safeguards on Video-Sharing Platforms

Keywords

Social media
content moderation
online harm
algorithmic transparency
child safety
age-restricted content
platform policies

How to Cite

Eltaher, F., Gajula, R. K., Miralles-Pechuán, L., Crotty, P., Martínez-Otero, J., Thorpe, C., & Mckeever, S. (2026). Protecting Young Users on Social Media: Evaluating the Effectiveness of Content Moderation and Legal Safeguards on Video-Sharing Platforms. Journal of Online Trust and Safety, 3(2). https://doi.org/10.54501/jots.v3i2.251

Abstract

Video-sharing platforms such as TikTok, YouTube, and Instagram implement content moderation policies to reduce minors' exposure to harmful videos. As video has become the dominant and most immersive form of online content, assessing how effectively these systems protect younger users is increasingly important. This study evaluates the effectiveness of video moderation for different age groups on TikTok, YouTube, and Instagram, using a focused set of experimental accounts. Accounts were created for simulated users aged 13 and 18, and 3,000 recommended videos were analyzed in two interaction modes: passive scrolling and search-based scrolling. Each video was manually assessed for harm severity using a unified harm classification framework. While low-severity harm was the most prevalent form encountered, the results show that accounts configured as 13-year-olds encountered harmful videos more frequently and more rapidly than accounts configured as 18-year-olds. On YouTube, 15% of videos recommended to 13-year-old accounts during passive scrolling were classified as harmful, compared with 8.17% for adult accounts, with exposure occurring within an average of 3:06 minutes. This exposure arose without any user-initiated searches, highlighting weaknesses in algorithmic filtering. Results from our targeted study point to gaps in video moderation systems and suggest the need for more effective safeguards to better protect minors from harmful online content.


This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

Copyright (c) 2026 Journal of Online Trust and Safety