Wed, 07 Jan 2026 IWF finds sexual imagery of children which 'appears to have been' made by Grok

It said analysts discovered the images on a dark-web forum, posted by users who claimed to have used Grok

* The Internet Watch Foundation (IWF) has found "criminal imagery" of girls aged 11-13 on a dark-web forum, created using Grok, the AI tool owned by xAI.
* The images, described as "sexualized and topless", were photo-realistic child sexual abuse material (CSAM) generated with Grok.
* The IWF is concerned that tools like Grok are bringing CSAM into the mainstream and making it easier for people to create realistic images of child abuse.
* In one case, a user created a Category C image with Grok, then used another AI tool to produce a more serious Category A image.
* X and xAI were previously contacted by Ofcom over concerns that Grok can be used to make "sexualized images of children".
* The IWF has received reports of people using Grok on X to alter real images of women without their consent, but these have not been assessed as CSAM yet.
* X has previously stated that it takes action against illegal content, including CSAM, and that anyone using Grok to create illegal content will face consequences.
