Pornography

Deepfake pledge welcome but further action needed


CARE has welcomed a government pledge on deepfake pornography and urged ministers to go further in tackling the issue.

On Tuesday, it was announced that people who create sexually explicit “deepfakes” of adults could be prosecuted under new plans aimed at protecting women and girls.

It is already illegal for such images to be created of children, but new legislation will cover over-18s after the government warned of an “alarming” proliferation of the images.

AI tech crackdown required

Tim Cairns, CARE’s policy lead on online safety, commented:

“The government’s promise to widen the scope of criminal legislation is welcome. To expedite the ban, we would encourage the government to work with Baroness Owen to ensure her Private Member’s Bill gets the support it needs to move through parliament quickly. This Bill is supported by many campaigners in the sector.

“To tackle this issue comprehensively, given the increasing anxiety it is causing across society, we would underline the need to expressly outlaw the artificial intelligence tools, readily available online, that allow sexual deepfakes to be created. Links advertising so-called ‘nudification’ apps and websites have proliferated in the last few years. The content they create is extremely realistic. As well as still images, some platforms allow users to create pornographic videos in which subjects appear to do whatever the user asks.

“We would urge Baroness Owen to amend her Bill to cover a ban on deepfake technology, and we ask the Government to support this ban. Sexual deepfakes pose a particular danger to women and girls. They fuel a culture in which women and girls are dehumanised and treated as mere sex objects. The creation and sharing of deepfake images cause serious mental and physical distress to victims. Swift action to ban these apps would make a real difference; no cogent argument can be made in favour of technology that creates sexualised images without a person’s consent.”

Public support for a ban

Polling commissioned by CARE found that almost 8 in 10 Brits support a ban on AI tools that allow users to digitally undress women and children.

The question:

There are now websites and apps that use artificial intelligence (AI) to simulate sexually explicit content, such as undressing women and children. Do you agree or disagree with this statement? 'Websites and apps that use AI to simulate sexually explicit content such as undressing women and children should be banned by the government.'

Polling headline findings:

  • 69% of respondents strongly agree with the government banning websites and apps that use AI to simulate sexually explicit content such as undressing women and children.
  • Excluding those who preferred not to answer, the figure rises to 75% strongly agreeing.
  • 10% somewhat agree, 2% somewhat disagree, 3% strongly disagree, and 9% don't know.
  • 57% of 18–24-year-olds strongly agree, compared with 73% of those aged 65+.
  • 60% of men strongly agree, compared with 73% of women.

Full data tables can be accessed here: AI-Explicit-Imagery-Survey-Final.pdf (care.org.uk)

ENDS

About CARE

Christian Action Research and Education (CARE) is a social policy charity, bringing Christian insight to the policies and laws that affect our lives.

Baroness Owen’s Bill can be accessed here: Non-Consensual Sexually Explicit Images and Videos (Offences) Bill [HL] - Parliamentary Bills - UK Parliament

Contact us: press@care.org.uk
