Market News - Privacy

One in six UK adults report seeing deepfake porn

An article published by The Christian Institute reports findings from the second International AI Safety Report, which highlights the growing problem of AI-generated pornographic imagery featuring real people. The report describes such content as a major concern and notes that deepfake sexual material is becoming more widespread and easier to produce.

The article’s headline figure is that 15 percent of adults in the United Kingdom say they have seen deepfake pornographic images. It also points to an increase in incidents involving children, including cases where children use these tools to create sexualized images of other children.

To illustrate the impact in schools, the piece cites a poll of 4,300 secondary school teachers in England commissioned by The Guardian. Around one in ten teachers said they were aware of students generating deepfake sexually explicit videos during the last academic year. The article adds that some incidents involved children as young as eleven, and a smaller share involved even younger children.

The article also quotes a professor at Anglia Ruskin University who says headteachers she spoke to had dealt with deepfake incidents and viewed them as an emerging problem. It then references reported increases in similar behaviour on the Isle of Man, where police said children have used pornographic image-generation tools to bully or take revenge on peers.

On the policy side, the article says the Westminster Government plans to criminalise the creation of explicit deepfake images made without consent. It contrasts this with existing rules under the Online Safety Act 2023, which already make it illegal to share, or seek to share, non-consensual intimate images, and says the Data (Use and Access) Act 2025 is intended to criminalise the creation of such content. It also states that the Government plans to ban dedicated nudification tools through the Crime and Policing Bill currently moving through Parliament.

Finally, the article includes a statement from Liz Kendall emphasizing that the Government intends to act against technology being used to abuse, humiliate, and exploit people through non-consensual sexually explicit deepfakes.

View the original full article here: https://www.christian.org.uk/news/one-in-six-uk-adults-report-seeing-deepfake-porn/
