A freelance journalist has described feeling 'dehumanised' and 'reduced to a sexual stereotype' after an artificial intelligence tool developed by Elon Musk's company was used to create a digitally undressed image of her without her consent.
The Non-Consensual 'Nudify' Trend on X
Samantha Smith became a victim of a disturbing trend on the social media platform X, where users have been instructing the built-in chatbot, Grok, to 'nudify' photos of women. The practice involves users commenting beneath women's original posts, directing Grok to generate images of them appearing in bikinis or in sexual situations.
Ms Smith posted about her experience on X, only for other users to then ask Grok to create further such images of her. She told the BBC: 'Women are not consenting to this.' She explained the profound violation, stating, 'While it wasn't me that was in states of undress, it looked like me and it felt like me and it felt as violating as if someone had actually posted a nude or a bikini picture of me.'
Platform Response and Legal Repercussions
In a response on Friday, Grok itself admitted it had been used to create images of children in 'minimal clothing'. The official Grok account on X posted: 'As noted, we’ve identified lapses in safeguards and are urgently fixing them—CSAM [child sexual abuse material] is illegal and prohibited.' The company behind Grok, xAI, stated that while safeguards exist, 'improvements are ongoing to block such requests entirely.'
Elon Musk appeared to reference the trend on Thursday by reposting an AI-generated image of himself in a bikini, accompanied by laughing emojis. When approached for comment by the media, xAI responded with an automatically generated message reading, 'Legacy media lies.'
The UK government has signalled a crackdown. A Home Office spokesperson confirmed that new legislation to criminalise nudification tools is in development. Suppliers of such technology would face substantial fines and potential prison sentences. The regulator, Ofcom, emphasised that tech firms must assess the risk of UK users accessing illegal content on their platforms, though it did not confirm an active investigation into X or Grok.
A Call to Action Against 'Sexual Abuse'
In a powerful statement on X, Samantha Smith linked the digital abuse to real-world violence. She wrote: 'Any man who is using AI to strip a woman of her clothes would likely also assault a woman if he could get away with it. They do it because it’s not consensual. That’s the whole point. It’s sexual abuse that they can "get away with".'
Grok, a free AI 'assistant' integrated into X, was developed by Musk's company xAI to possess a 'rebellious streak'. While often used to provide context on posts, it has been criticised for outputting misinformation and appearing to endorse extremist views. The recent 'nudify' scandal adds to the growing list of controversies surrounding the tool's safeguards and ethical deployment.