A new investigation has found that TikTok is hosting sexually suggestive videos featuring AI-generated images of children, some of which have amassed millions of likes, raising fresh concerns about the platform’s ability to enforce its own safety rules.
The report, released by Spanish nonprofit Maldita.es, identified more than a dozen accounts posting AI-generated images of girls in lingerie, school uniforms, or otherwise sexualized poses. Comments on some videos linked to Telegram chats advertising child pornography.
We investigated the creation of AI-sexualized images of minors and the theft of content from real teenagers on TikTok, and how the platform serves as a gateway to real and AI-generated child pornography.
We’re opening a thread 🧵 https://t.co/ECvIKlvF3p
— Maldita.es (@maldita) December 11, 2025
Researchers flagged 15 accounts and 60 videos to TikTok, but the platform determined that most did not violate its policies, despite rules that explicitly ban AI-generated sexualized depictions of minors.
Several accounts remained active even after appeals. TikTok later removed a handful of videos, though without explanation.
The findings renew pressure on tech companies to police AI imagery as governments adopt tougher online safety laws. TikTok says it maintains zero tolerance for youth exploitation but did not address the specific cases.