Officers from Hertfordshire Police conducting a raid to arrest people suspected of contacting children online (Picture: Jacob King/PA Wire)
The UK is becoming the first country in the world to criminalise making child sex abuse images using Artificial Intelligence (AI).
Such tools are already being used by predators to generate child abuse images, for example by removing clothing from real photographs of children.
Perpetrators have also stitched other children's faces onto existing child abuse images, and victims' voices have been used to create new material, revictimising abuse survivors.
Worse, AI-generated images are being used to blackmail victims into live-streaming images of themselves or submitting to further abuse – and the tools can also be used to help obscure perpetrators' identities.
It is hoped the new criminal offences, announced by Home Secretary Yvette Cooper this weekend, will counter the impact of AI, which is helping offenders groom and abuse children online more effectively.
If someone is found to possess, create or distribute AI tools designed to generate child sex abuse material, they could face up to five years in prison.
The Home Secretary appeared on the BBC this morning to talk about the new laws (Picture: BBC)
Possessing AI ‘paedophile manuals’, which teach people how to use AI to sexually abuse children, will also become illegal, with a maximum prison sentence of three years.
Predators who run websites designed to allow paedophiles to share child sex abuse content or child grooming advice will face 10 years in prison for a new specific offence.
Plus, Border Force will be given new powers to force people to unlock their devices for inspection if they suspect they pose a sexual risk to children – with sentences for any resulting offences of up to three years in prison.
The new measures will form part of the Crime and Policing Bill when it comes to parliament for MPs to vote on.
Yvette Cooper said: ‘We know that sick predators’ activities online often lead to them carrying out the most horrific abuse in person.
‘This government will not hesitate to act to ensure the safety of children online by ensuring our laws keep pace with the latest threats.
‘These four new laws are bold measures designed to keep our children safe online as technologies evolve.
‘It is vital that we tackle child sexual abuse online as well as offline so we can better protect the public from new and emerging crimes as part of our Plan for Change.’
The Internet Watch Foundation (IWF) has warned that increasing numbers of AI-generated child sexual abuse images are being produced.
Their analysis last year found the prevalence of Category A images, the most severe kind, went up by 10% compared to 2023.
AI-generated child sex abuse material rose by 380%, with 245 confirmed reports last year compared with 51 in 2023.
Those numbers may seem low, but each report can contain thousands of images.
The IWF warned that some of this AI-generated content is so realistic that its analysts are sometimes unable to distinguish it from abuse filmed in real life.
Of the 245 reports, 193 included AI-generated images which were so sophisticated and life-like, they were treated under UK law as though they were actual, photographic images of child sexual abuse.
Police dogs are being trained to find hidden digital devices such as small memory cards and mobile phones to help sniff out child abuse images (Picture: Jacob King/PA Wire)
Interim Chief Executive of the IWF, Derek Ray-Hill, said: ‘We have long been calling for the law to be tightened up, and are pleased the government has adopted our recommendations.
‘These steps will have a concrete impact on online safety.
‘The frightening speed with which AI imagery has become indistinguishable from photographic abuse has shown the need for legislation to keep pace with new technologies.
‘Children who have suffered sexual abuse in the past are now being made victims all over again, with images of their abuse being commodified to train AI models.
‘It is a nightmare scenario, and any child can now be made a victim, with life-like images of them being sexually abused obtainable with only a few prompts, and a few clicks.
‘The availability of this AI content further fuels sexual violence against children. It emboldens and encourages abusers, and it makes real children less safe.
‘There is certainly more to be done to prevent AI technology from being exploited, but we welcome today’s announcement, and believe these measures are a vital starting point.’