AI-generated images of young girls, some depicted as young as five, are spreading on TikTok and Instagram, drawing inappropriate comments from a troubling audience consisting mostly of older men, a Forbes investigation has uncovered. The images show children in provocative outfits and, while not illegal, are highly sexualised, prompting child safety experts to warn that they could serve as a stepping stone to more severe exploitation.
The sense of urgency is understandable: platforms like TikTok and Instagram, both popular with minors, are struggling to address the problem. One popular account, "Woman With Chopsticks," had 80,000 followers, and its viral posts were viewed nearly half a million times across both platforms. A recent Stanford study found that the dataset used to train the AI tool Stable Diffusion 1.5 contained child sexual abuse material (CSAM) involving real children, scraped from across the internet.
Under federal law, tech companies must report suspected CSAM and child exploitation to the National Center for Missing and Exploited Children (NCMEC), which then informs law enforcement. However, they are not required to remove the type of images discussed here. Nonetheless, NCMEC believes social media companies should take these images down, even though they exist in a legal grey area.
TikTok and Instagram assert that they maintain strict policies against AI-generated content involving minors in order to protect young people. TikTok bans AI-generated content depicting anyone under 18, while Meta removes material that sexualises or exploits children, whether real or AI-generated. Both platforms removed the accounts and posts identified by Forbes. Even so, the ease of creating and sharing AI-generated images means safeguarding children online will remain a significant challenge.
The Forbes story reveals that such content, made increasingly easy to find by powerful recommendation algorithms, worsens online child exploitation by acting as a gateway to the exchange of more severe material and by facilitating networking among offenders. A 13 January TikTok slideshow of young girls in pyjamas found by the investigation prompted users to move their conversations to private messages. The Canadian Centre for Child Protection stressed that companies need to look beyond automated moderation and examine how these images are shared and who follows them.