
AI ‘nudify’ websites are reportedly making millions

People continue to seek out ‘nudify’ websites to create explicit AI images without a person’s consent. A recent analysis of 85 such platforms revealed that, collectively, they could be making up to $36 million annually.

We’ve been writing about nonconsensual deepfakes for well over seven years now, and the problem shows no sign of abating.

Piggybacking on the widescale proliferation of generative AI, illicit websites continue to spawn under the creepy umbrella of ‘nudify’ (or undress) apps. Despite lawmakers and tech companies mobilising against dodgy services like this, millions of users still access the sites each month, providing their creators with serious profits.

Indicator, an organisation of digital deception experts, has released an analysis of 85 nudify websites – which allow people to generate nude images of subjects from uploaded photos – and uncovered some pretty shocking revelations.

For one, major tech companies like Google, Amazon, and Cloudflare provide hosting and content delivery services to 62 of the 85 sites in question. Traffic data showed 18.5 million visitors across the sites over the last six months, which would collectively translate into approximately $36m in annual revenue. That doesn’t account for other avenues like Telegram, either.

As with legitimate digital businesses, the lion’s share of revenue comes from subscription offerings or microtransactions. Many of the websites reportedly use their own digital currency (such as coins or credits) to pay for generating fake nudes, and offer bundle packages for acquiring them.

The majority of the payment systems themselves are provided by mainstream companies, and Google’s sign-on system was found on 54 of the websites. Apple and Discord sign-ons were also popular choices for quickly creating throwaway accounts.

Nudify website creators continue to evade detection by using an ‘intermediary site’ to ‘pose as a different URL for the registration.’ Users, meanwhile, are equipping themselves with VPNs and proxies for an extra layer of anonymity.

While the host companies will pass the buck by pointing out that their terms and conditions prohibit nonconsensual content, merely requesting that users adhere to policy isn’t a sufficient barrier. Relying on user reports isn’t working, either. Good Samaritans don’t exactly stumble across these platforms by accident, and those who use nudify generators are there on purpose with a very specific agenda.

The overall battle against deepfake porn is frankly moving at a glacial pace. Notable instances of offenders being held accountable are few and far between considering the scope of the problem. In terms of making a dent, the UK has been the most proactive nation, moving to make the creation of explicit deepfakes illegal. The US’s Take It Down Act, on the other hand, only targets the spread of nonconsensual images on social media – not their creation directly.

According to previous research, one in 17 teenage students claims to have fallen victim to deepfake images or videos. Given the average class size in the US is around 18 students, that works out to roughly one victim in every class (18 × 1/17 ≈ 1.06).

The elusive and dynamic nature of the internet means that illicit deepfake applications won’t ever subside completely, but are we giving the issue enough attention?
