Thu, 15 Jan 2026
Grok will no longer allow users to remove clothing from images of real people in jurisdictions where it is illegal.
* Elon Musk's AI model Grok will no longer edit photos of real people to depict them in revealing clothing, including bikinis and underwear, in jurisdictions where doing so is illegal.
* The change applies to all users, including paid subscribers, and was implemented after widespread concern over sexualized AI deepfakes.
* X has also reiterated that only paid users will be able to edit images using Grok on its platform.
* The company said the new measures are intended to help prevent abuse of the tool and ensure that those who try to use it for illegal activities are held accountable.
* California's top prosecutor has announced an investigation into the spread of sexualized AI deepfakes, including those generated by Grok.
* Leaders around the world have criticized Grok's image editing feature, with some countries banning the tool altogether.
* Britain's media regulator, Ofcom, is investigating whether X failed to comply with UK law over the creation and sharing of explicit images.
* The move comes after Elon Musk defended the feature by posting AI-generated images of UK Prime Minister Sir Keir Starmer in a bikini.
Business News Top © 2024-2025