Deepfake porn apps downloaded 705 million times on Apple, Google
A United Press International article reports that dozens of AI “nudify” apps capable of generating sexualized deepfake images have been widely available on both Apple’s App Store and Google Play, despite platform policies meant to block such content. According to an investigation by the Tech Transparency Project, researchers found roughly 55 such apps on Google Play and 47 on Apple’s App Store that can take a photo of a person and use AI to generate a nude or minimally clothed image of them. These apps have been downloaded hundreds of millions of times worldwide, raising serious concerns about enforcement of content policies and user safety.
The investigation found that the apps collectively exceeded 700 million downloads and generated an estimated $117 million in revenue, a portion of which both app stores collect through their commission fees, despite rules that ostensibly prohibit sexualized or non-consensual imagery. Apple removed some of the identified apps after being confronted with the findings, and Google says it has suspended several pending review, but many still appear in the stores. The persistence of these “nudify” apps highlights the ongoing challenge of policing AI tools that can easily create non-consensual deepfake content, underscoring the gap between stated platform policies and real-world enforcement.