In 2021, Jodie* received an email from an anonymous sender. It contained a link directing her to a pornographic site featuring dozens of images and videos of herself, naked. But the images weren’t real.
It was her face in them. But it wasn’t her body; she had been ‘deepfaked’, a victim of emerging technologies that allow users to stitch footage of someone’s face onto the body of another person, such as a performer in pornographic content.
The original pictures of her had been normal photos, fully clothed, taken by a close friend. But that friend had then posted them, without her consent, onto an online forum alongside the words: “She makes me really horny, have never done this before, would love to see her faked.” Other users, skilled at using the technology, had created the images in response.
Jodie described how she felt in that moment: “I was completely freaking out, I thought my whole life was over. I broke down and was screaming and crying, I practically blacked out.”
Some of the images had depicted her naked. Others had her dressed as a schoolgirl. Many featured her having sex. She said this year: “The shock and devastation haunts me to this day.”
A growing problem
“In the ad, a woman in a white lace dress makes suggestive faces at the camera, and then kneels. There’s something a bit uncanny about her; a quiver at the side of her temple, a peculiar stillness of her lip. But if you saw the video in the wild, you might not know that it’s a deepfake fabrication. It would just look like a video, like the opening shots of some cheesy, low-budget internet porn.
In the top right corner, as the video loops, there is a still image of the actress Emma Watson, taken when she was a teenager, from a promotional shoot for the Harry Potter movies. It’s her face that has been pasted on to the porn performer’s. Suddenly, a woman who has never performed in pornography is featured in it.” (Moira Donegan, the Guardian)
The situation described is not an uncommon one: the app behind the advert featuring Emma Watson’s face charges just $8 a week to produce sexual content. In May 2024, analysis from Channel 4 News found that sexual content featuring around 4,000 different celebrities was being advertised on the biggest deepfaking websites. Fake pornographic images of Taylor Swift were widely circulated on Facebook, Instagram and X in January 2024.
Pornography is hardly a new concept, with examples of erotic content common in all times and places throughout human history. However, our developing technological capability today has meant not simply an increase in the volume of content being produced and the scale of its distribution, but also an expansion in the range of material.
The term ‘deepfaking’ is still less than a decade old, having only been coined in 2017. The word derives from the Reddit profile ‘deepfakes’, which belonged to a male user who shared images he had created, featuring celebrities’ faces being inserted onto the bodies of people in pornographic videos.
But the number of deepfakes has exploded in that time. The livestreaming analyst Genevieve Oh estimated that one prominent deepfake streaming site hosted 1,897 videos in 2018. By 2022, that figure was over 13,000, with over 16 million monthly views. Other research indicates that by 2023, there were over 95,000 deepfake videos online, of which 98% were pornographic. This represented a 550% increase over a four-year period. Across the top ten dedicated deepfake pornography websites, videos have been viewed more than 300 million times.
The problem only continues to get worse: analysis in 2024 from the campaign group ‘My Image, My Choice’, which campaigns against image-based abuse, found that 80% of apps which create deepfakes had been launched within the previous year; one of these had already created 600,000 images within its first three weeks. Seven of the top ten pornographic websites now host deepfaked content. Elsewhere, adverts abound, with messages like “Swap ANY FACE in the video!” or “Replace face with anyone. Enjoy yourself with AI face swap technology.”
Laura Bates, the founder of the Everyday Sexism Project, described the impact which deepfakes can have upon the victim as follows: “Of all the forms of abuse I receive they are the ones that hurt most deeply – the ones that stay with me. It’s hard to describe why, except to say that it feels like you. It feels like someone has taken you and done something to you and there is nothing you can do about it. Watching a video of yourself being violated without your consent is an almost out-of-body experience.”
A misogynistic industry
“The problem of sexually explicit deepfakes is one that is inherently sexist and rapidly proliferating…They have been described as the new frontier of violence against women. The content is created using generative AI and can be made in a matter of seconds with easily downloadable nudification apps or online platforms.” (Baroness Charlotte Owen)
One type of technology, the nudification app, has become particularly widely used. This software takes an image of a fully clothed person and uses AI to generate a picture of them naked. Researchers from ‘The Indicator’ estimated, based on web traffic, that 18 such sites made between $2.6 million and $18.4 million between December 2024 and May 2025. One such site, ClothOff, produces an average of 200,000 pictures a day.
Links to nudification apps can often be found on other websites: ‘The Indicator’ found that Undress CC, which receives around 2.8 million unique visitors every month, obtains 1.1 million of those visitors from referrals by other websites. Search engines also provide a gateway for many users to access such content: searching for terms like “deepnude AI” or “AI clothes remover” returns multiple links to such sites near the top of Google Search.
It is also a deeply misogynistic industry: many deepfake apps only work on women’s bodies, meaning that around 99% of sexually explicit deepfakes accessible online are of women or girls. Baroness Owen, who has been leading the Parliamentary work to criminalise the creation of deepfake sexual images, has described it as “the new frontier of violence against women.”
The Labour MP Jess Asato, who chairs the All-Party Parliamentary Group on perpetrators of domestic abuse, said that such apps “digitally strip women and girls”. Her assessment was damning: “I think this is a tool that facilitates digital sexual assault.” The links with real-world assault are not difficult to draw: in July 2024, Gavin Plumb was jailed for plotting to kidnap, rape and murder Holly Willoughby; he had downloaded around 10,000 images of the TV presenter, including deepfaked pornographic content.
Terrifyingly, around 25% of people either regard the creation, sharing or viewing of such deepfakes as legally and morally acceptable, or feel neutral about it, according to a recent survey commissioned by the Office of the Police Chief Scientific Adviser. That figure rises among those who regularly watch pornography, and particularly among men under the age of 45.
Ease of access
“It worries me that it’s so normalised. He obviously wasn’t hiding it. He didn’t feel this was something he shouldn’t be doing. It was in the open and people saw it. That’s what was quite shocking.”
Those are the words of a headteacher, who described how one of his male teenage students had pulled out his phone on the bus, selected an image of a girl from a neighbouring school from her social media profile, and used a nudifying app to create a deepfaked image of her.
Younger generations are particularly likely to encounter new forms of pornographic content: a 2024 survey from Ofcom found that 24% of 18-24s had seen a sexual deepfake, compared with 14% of older adults. A Girlguiding survey this year found that around 25% of 13-18s have seen a sexually explicit deepfake image of a celebrity, a friend, a teacher or even of themselves. Tragically, access to this material is so easy that it is commonplace for children and teenagers not just to see it, but to create it while still at school.
Last year, in Victoria, Australia, a schoolboy was arrested after deepfaked images of 50 students at Bacchus Marsh Grammar School were created and distributed. One of the girls affected was sick in the car on the way home. In Extremadura in Spain, fifteen boys were put on probation for a year for producing and distributing naked images of around 20 female classmates (the youngest of whom was just 11). Margaret Mulholland, a special needs and inclusion specialist at the Association of School and College Leaders, said: “A year ago I was using examples from the US and Spain to talk about these issues. Now it’s happening on our doorstep and it’s really worrying.”
A recent poll of 4,300 secondary school teachers in England found that around 1 in 10 were aware of students in their school creating “deepfake, sexually explicit videos” within the last year. Around 75% of those incidents involved children aged 14 or younger, and around 1 in 10 involved children aged 11.
In 2024, two private schools found themselves at the centre of a police inquiry after pupils created and shared deepfaked pornographic images and videos, produced by manipulating pictures from the social media accounts of almost a dozen girls at a nearby school. One parent summarised the impact upon her daughter as follows: “To find out that these videos had been created of her and had been circulated was a horrible shock. For her to see, seven weeks later, that no one has been disciplined and that she has had no form of apology is even harder. What has happened is totally unacceptable. As time passes she is sadly coming to the realisation that this is how it is going to be — something that she will just have to put up with. Not something I ever imagined my daughter, in 2024, would have to accept.”
Tragically, many teenagers are now living in fear, because of these new technologies. The Children’s Commissioner for England, Rachel de Souza, compared the anxiety teenage girls feel around deepfaking technology to the anxiety they might feel when walking home at night. She said: “Children have told me they are frightened by the very idea of this technology even being available, let alone used.”
A report from Internet Matters in October 2024 revealed that 55% of teenagers believed it would be worse to have a deepfake nude of them shared than a real nude image, citing a lack of consent, the removal of autonomy, and the fear that friends, family or teachers might believe it was real. Elsewhere, we read stories about young girls being afraid to go to school, teenage boys exhibiting criminal behaviours, and teachers not being trained how to respond.
What does the law say?
When Jodie learnt about the images which had been created and circulated of her, there were no laws against deepfake intimate image abuse. She compiled a 60-page dossier of evidence and handed it to the police, only to be told at the first police station she visited that no crime had been committed. Her abuser was eventually convicted, but under the Communications Act.
That was itself a rarity: most intimate image abuse goes unreported, whether because victims do not believe it will lead to a conviction or because they feel a sense of shame. Data from the Revenge Porn Helpline suggests that only 4% of people who reported their abuse to the helpline also reported it to the police.
In recent years it has felt as though the pace of change in the technology has vastly outstripped progress in the law; the Law Commission’s Intimate Image Abuse report in 2022 recommended that the sharing of sexually explicit deepfakes should be criminalised - something which was enacted through the Online Safety Act - but stopped short of recommending that the creation of such deepfakes without consent should also be outlawed.
However, in 2025, Baroness Owen led the charge in the House of Lords - supported by victims - to criminalise both the creation of these deepfakes and the soliciting of them. The Bill, which passed in June 2025, mandates the forced deletion of any images created, and breaching the law will carry the risk of a criminal sentence. The law is based around consent, which means that the intent of the perpetrator (something which is notoriously difficult to prove) will not matter.
Clare McGlynn, a professor of law at Durham University and an expert in this area, has commented: “There’s so much more to do…If I had my way, I’d introduce a general legal provision that would cover all forms of intimate violation and intimate intrusion so we don’t need to start a new campaign every single time the technology moves on and the abuse changes…We also need better processes to get material removed swiftly and easily, and hold non-compliant platforms to account.”
Technologies will continue to evolve, and the law will have to keep up with the rate of change. However, enormous progress has been made: the creation of sexually explicit deepfakes without consent is now clearly illegal.
*Not her real name