People continue to seek out ‘nudify’ websites to create explicit AI images without a person’s consent. A recent analysis of 85 such platforms revealed that, collectively, they could be making up to $36 million annually.
We’ve been writing about nonconsensual deepfakes for well over seven years now, and the problem shows no sign of abating.
Piggybacking on the widespread proliferation of generative AI, illicit websites continue to spawn under the creepy umbrella of ‘nudify’ (or undress) apps. Despite lawmakers and tech companies mobilising against dodgy services like this, millions of users are still accessing the sites each month, providing their creators with serious profits.
Indicator, an organisation of digital deception experts, has released an analysis of 85 nudify websites – which allow people to generate nude images of subjects from uploaded photos – and uncovered some pretty shocking revelations.
Nudify websites are terrible. As highlighted by 60 Minutes last year and Der Spiegel/The Guardian recently. Looks like this prominent one has popped up under a slightly different domain but is still using @Google login against their ToS. It should be blocked cc @sundarpichai pic.twitter.com/QizxNBrE7f
— Rob Leathern (@robleathern) July 5, 2025
For one, major tech companies like Google, Amazon, and Cloudflare provide hosting and content delivery services to 62 of the 85 sites in question. Traffic data showed 18.5 million visitors across the sites over the last six months, which the analysis estimates could collectively generate approximately $36 million in annual revenue. That figure doesn’t account for other avenues like Telegram, either.
Like legitimate digital businesses, the lion’s share of the cash is brought in through subscriptions or microtransactions. Many of the websites reportedly use their own digital currency (such as coins or credits), which users spend to generate fake nudes, and offer bundle packages for purchasing them.