One in four unconcerned about non-consensual sexual deepfakes
A survey commissioned by the Office of the Police Chief Scientific Adviser found that 25% of respondents agreed with or were neutral about the acceptability of sexual deepfakes even when the person depicted had not consented.
Deepfakes – AI-produced imagery of real people – are becoming much more common, and the technology behind them is becoming much more sophisticated. The vast majority of deepfake videos are pornographic and disproportionately target women and girls.
Report findings
Around two-thirds of the 1,700 people surveyed for the report had seen a deepfake. While many of these were humorous or political in nature, 21% were sexual, and 14% were sexual content involving someone the respondent knew.
60% of participants – predominantly women and girls – said they were worried about a deepfake being made of them. Those who felt deepfakes were acceptable were mostly younger men with positive views of AI who actively consume pornography.
Only 14% of those surveyed were aware of current legislation, including new laws that make creating sexually explicit deepfakes an offence which carries a penalty of up to two years in prison.
Worrying attitudes
Access to deepfake technology is easier than ever, and the use of such platforms is increasingly seen as normal. Yet the psychological impact of abuse by deepfake videos and imagery can mirror the effects of sexual assault.
The author of the report, Callyane Desroches, Head of Policy and Strategy for Crest Advisory, said:
“While some deepfake content may seem harmless, the vast majority of video content is sexualised – and women are overwhelmingly the targets. We are deeply concerned about what our research has highlighted – that there is a cohort of young men who actively watch pornography and hold views that align with misogyny who see no harm in viewing, creating and sharing sexual deepfakes of people without their consent.
“People under the age of 45 are more likely to be aware of and exposed to deepfakes. And at the same time, there is a lack of awareness about the legal implications of creating and sharing deepfakes.”
Paul Taylor, Chief Scientific Adviser for Policing, added that the report highlighted “concerns around the growing use of deepfake technology as a form of gender-based violence against women and girls. Their focus on the psychological and emotional impact of this abuse is essential to influencing the approach required across the justice system to tackle this threat.”