
A report from the Children’s Commissioner reveals how young people – especially girls – are now withdrawing from online life, afraid that their images will be taken and manipulated by AI “nudification” tools that create sexually explicit deepfakes.
Dame Rachel de Souza is calling for “a total ban” on apps that allow users to create deepfake sexually explicit images, saying that “there is no positive reason for these to exist”.
While it is illegal to create or share a sexually explicit image of a child, the report warns that the technology enabling deepfake images to be created remains legal and is becoming more easily available via large social media platforms and search engines. GenAI technology, it adds, has “supercharged” the growth of these tools.
The emerging threat of deepfakes has become a key safeguarding issue for schools. In a recent article discussing 10 ways schools can protect children, SecEd’s resident safeguarding writer Elizabeth Rose said there was a clear need for CPD for school staff on the deepfake threat, and she urged schools to address the issue with students through the curriculum, too.
Dame Rachel’s report is based on a review of online search engines to establish how accessible the “sexually explicit deepfake economy” is, as well as in-depth focus groups with young people aged 16 and 17.
The findings show that young girls now fear nudification technology “in much the same way as they would fear the threat of sexual assault in public places – for girls, the threat is just there”.
The report finds that women and girls are “almost exclusively” the victims of sexually explicit deepfakes with the AI technology often not working on men and boys due to how it has been trained. The report reveals that 99% of sexually explicit deepfakes accessible online are of women and girls.
It adds: “There is a high and an ever-increasing demand for sexually explicit deepfakes. Content on the most visited bespoke websites for sexually explicit deepfakes received 40 million views in the month of January 2024. Google searches for the term ‘deepfakes’ increased by 76% between 2019 and 2023.
“Despite the demand for pseudo images of high-profile women, it is ordinary women who have been identified as the most targeted group for nudifying apps.”
Research cited by the Children’s Commissioner reveals that 13% of teenagers have now had experience with a deepfake nude, whether that be seeing, receiving or sending one. Another study suggests that 26% of 13 to 18-year-olds have seen a “sexually explicit deepfake image of a celebrity, a friend, a teacher, or themselves”.
What is more, young people aged 16 to 24 were more likely to be exposed to sexually explicit deepfakes than adults.
In the focus groups, young girls told Dame Rachel about taking preventative steps to keep themselves safe from becoming victims of nudification tools by limiting their online participation.
The report states: “The team heard girls describe how they were trying to reduce the chance of featuring in a sexually explicit deepfake by limiting their participation in the online world – a space which could enhance their social lives, play and learning, if it were safe for them.”
Young girls in the focus group voiced their fears about falling victim to deepfakes. One 16-year-old said: “I'm worried that one day this could happen to me. It could happen to anyone around you and with how good AI is getting they wouldn't know whether it's real or not. It's kind of scary.”
Another added: “That risk that people can just get a photo of you from the internet and make it – it's more and more worrying.”
The Online Safety Act 2023 puts legal responsibility on online platforms operating in the UK to prevent UK users from encountering illegal content online, including child sexual abuse material.
The Act also requires technology companies to prevent children from coming to harm, including from pornographic content. Nudification tools, including nudification apps and websites, fall within the scope of the Act’s pornography provisions – currently being implemented – which require services that provide pornographic content to use age verification to ensure that children cannot access it.
Ofcom’s new Children’s Codes, which will come into force in July 2025, will improve the protection of children on services designed for adults and from exposure to some harmful content. However, they do not make the provision of nudification services illegal.
Dame Rachel, in her recommendations, is now calling for these apps to be outlawed. She also wants Ofcom to act in the short term to strengthen the Children’s Codes to ensure their risk assessment process is “proactive against emerging harms”. The report adds: “Nudification apps are an example of a harm that has emerged as a result of a new technology. The risk assessment that social media companies are required to carry out in order to comply with the Online Safety Act should be outcomes-based, and include proactive inputs to assess the risk of children encountering harmful content, such as pornographic material.”
The government should also ask Ofcom to require technology companies in scope of the Online Safety Act to embed the Report Remove tool into their services, Dame Rachel added.
In schools, Dame Rachel says that education about sexually explicit deepfake technology should be included in the PSHE curriculum.
Speaking this week, Dame Rachel said: “In our lifetime, we have seen the rise and power of AI to shape the way we learn, connect and experience the world. It has enormous potential to enhance our lives, but in the wrong hands it also brings alarming risks to children’s safety online.
“Children have told me they are frightened by the very idea of this technology even being available, let alone used. They fear that anyone – a stranger, a classmate, or even a friend – could use a smartphone as a way of manipulating them by creating a naked image using these bespoke apps.
“Girls have told me they now actively avoid posting images or engaging online to reduce the risk of being targeted by this technology – we cannot sit back and allow these bespoke AI apps to have such a dangerous hold over children’s lives.
“There is no positive reason for these particular apps to exist. They have no place in our society. Tools using deepfake technology to create naked images of children should not be legal and I’m calling on the government to take decisive action to ban them, instead of allowing them to go unchecked with extreme real-world consequences.”
- Children’s Commissioner: ‘One day this could happen to me’: Children, nudification tools and sexually explicit deepfakes, April 2025: www.childrenscommissioner.gov.uk/resource/children-nudification-tools-and-sexually-explicit-deepfakes
Resources & support
- Report Remove is here to help young people under 18 in the UK to confidentially report sexual images and videos of themselves and remove them from the internet: www.childline.org.uk/info-advice/bullying-abuse-safety/online-mobile-safety/report-remove/
- Childline is a free and confidential service for under-19s living in the UK: www.childline.org.uk and 0800 1111.
- Shout provides 24/7 urgent mental health support via text: www.giveusashout.org or text SHOUT to 85258.
- Stop It Now helpline is for anyone worried about child sexual abuse, including their own thoughts or behaviour. Visit www.stopitnow.org.uk or call 0808 1000 900.