Artificial Intelligence
Ofcom to investigate after reports of Grok AI making sexualised images of children
Communications regulator Ofcom, which enforces the Online Safety Act, has requested “urgent contact” with Elon Musk’s xAI after reports that its AI chatbot, Grok, can be used to digitally undress pictures of women and produce sexualised images of children.
Violating and dehumanising
Journalist Samantha Smith spoke to the BBC about a picture that had been produced of her without her consent. When she posted about it on X, she received comments from others who had experienced the same thing. She said that the experience made her feel “dehumanised and reduced into a sexual stereotype”.
“While it wasn't me that was in states of undress, it looked like me and it felt like me and it felt as violating as if someone had actually posted a nude or a bikini picture of me,” she said.
Those whose images have been digitally de-clothed using xAI’s Grok include Catherine, Princess of Wales.
Legal warnings
The government is currently working on legislation to ban ‘nudification’ tools, under which those who develop such technology would “face a prison sentence and substantial fines”.
While Ofcom is investigating concerns over Grok, the European Commission also said that it was “seriously looking into this matter”, along with authorities in France, Malaysia and India. European Commission spokesperson Thomas Regnier said it was aware of posts “showing explicit sexual content”, including “some output generated with childlike images”. “This is illegal,” he said, calling the posts “appalling” and “disgusting” with “no place in Europe”.
xAI’s acceptable use policy prohibits “depicting likenesses of persons in a pornographic manner”, but this has not prevented people from using the platform to de-clothe others without their consent. xAI has issued a warning about using Grok to generate illegal content, following a post by Elon Musk saying: “Anyone using Grok to make illegal content will suffer the same consequences as if they upload illegal content.” xAI’s post also made it clear that this includes child sexual abuse material.