AI generating images of child sexual abuse
The Internet Watch Foundation (IWF) has found thousands of images of child abuse on the dark web (hidden sections of the internet) which have been generated by Artificial Intelligence (AI).
In September, one forum hosted around 3,000 images of child abuse, with more than 500 of those images featuring the most serious type of imagery, including depictions of sexual assault and torture. It is estimated that almost 1,400 of the images were of children under 10 years old.
Some images are of real children, where humans have used AI to “nudify” fully clothed images of children that already exist online. Technology is also being used to “de-age” pictures of celebrities, in order to create images of them as children experiencing sexual abuse.
The IWF has warned that some of the images created by AI are so convincing that even trained analysts would struggle to tell the artificial images from the real ones, making it harder for the police to safeguard children.
Susie Hargreaves, chief executive of the IWF, said: “We are seeing criminals deliberately training their AI on images of real victims who have already suffered abuse… Children who have been raped in the past are now being incorporated into new scenarios because someone, somewhere, wants to see it."
A number of leading figures are warning that the proliferation of artificially-generated images of child abuse is likely to enable and normalise abuse of children in the real world.
A spokesperson for the Home Office has commented: “Online child sexual abuse is one of the key challenges of our age, and the rise in AI-generated child sexual abuse material is deeply concerning.
"Last month, the home secretary announced a joint commitment with the US government to work together to innovate and explore development of new solutions to fight the spread of this sickening imagery."
Next week, the UK government will hold a summit on the safety of AI, examining both the opportunities and the risks that artificial intelligence brings.