‘Deepfake AI porn’ and non-consensual sexual deepfakes: What can be done about it?
The article examines the growing problem of non-consensual deepfake pornography, in which AI is used to create fake sexual images or videos of real people without their consent. It explains that this form of abuse is spreading rapidly because AI tools are widely accessible and realistic content is easy to generate.
Deepfake porn is considered a form of image-based sexual abuse and most often targets women and girls. Victims can suffer serious psychological harm, reputational damage, and long-term personal consequences, even though the content is fabricated.
A key issue is that laws are not keeping pace with the technology. In many jurisdictions, existing legislation was not designed for AI-generated content, creating gaps in how these cases are prosecuted and how victims can seek justice.
The article highlights difficulties in enforcement, including the anonymity of perpetrators, the speed at which content spreads online, and challenges in removing material once it is widely shared.
Possible solutions discussed include:
- stronger and clearer laws specifically targeting deepfake sexual content
- faster takedown mechanisms for platforms
- better tools to identify perpetrators
- increased responsibility for tech companies to prevent misuse
It concludes that addressing deepfake porn requires a combination of legal reform, platform accountability, and public awareness, as current systems are insufficient to protect victims.