By Shubhangi Goel
Grok will no longer be allowed to create AI photos of real people in sexualized or revealing clothing, after widespread global backlash.
"We have implemented technological measures to prevent the Grok account from allowing the editing of images of real people in revealing clothing such as bikinis," X's safety account said in a blog post on the platform on Wednesday. "This restriction applies to all users, including paid subscribers."
The social media company added that image creation and the ability to edit images via Grok on the X platform will now only be available to paid users as an additional safety measure.
The change was announced hours after California's top prosecutor said he had launched an investigation into sexualized AI deepfakes generated by Grok, including deepfakes of children. Indonesia and Malaysia suspended Grok over the images, and lawmakers in the UK publicly considered doing the same.
Elon Musk, who owns X and xAI, the maker of Grok, had asked users to see whether they could get around the AI model's image restrictions just hours before X's official account posted about the changes on Wednesday.
This is a breaking news story. Please check back for updates.