AI ‘Nudify’ Websites Are Raking in Millions of Dollars
By Matt Burgess | July 14, 2025
For years, so-called “nudify” apps and websites have mushroomed online, allowing people to create nonconsensual and abusive images of women and girls, including child sexual abuse material. Despite steps by some lawmakers and tech companies to limit the harmful services, millions of people still access the websites every month, and the sites’ creators may be making millions of dollars each year, new research suggests.
An analysis of 85 nudify and “undress” websites—which allow people to upload photos and use AI to generate “nude” pictures of the subjects with just a few clicks—has found that most of the sites rely on tech services from Google, Amazon, and Cloudflare to operate and stay online. The findings, revealed by Indicator, a publication investigating digital deception, say that the websites had a combined average of 18.5 million visitors for each of the past six months and collectively may be making up to $36 million per year.
Alexios Mantzarlis, a cofounder of Indicator and an online safety researcher, says the murky nudifier ecosystem has become a “lucrative business” that “Silicon Valley’s laissez-faire approach to generative AI” has allowed to persist. “They should have ceased providing any and all services to AI nudifiers when it was clear that their only use case was sexual harassment,” Mantzarlis says of tech companies. Meanwhile, it is increasingly illegal to create or share explicit deepfakes.
According to the research, Amazon and Cloudflare provide hosting or content delivery services for 62 of the 85 websites, while Google’s sign-on system has been used on 54 of the websites. The nudify websites also use a host of other services, such as payment systems, provided by mainstream companies.
Amazon Web Services spokesperson Ryan Walsh says AWS has clear terms of service that require customers to follow “applicable” laws. “When we receive reports of potential violations of our terms, we act quickly to review and take steps to disable prohibited content,” Walsh says, adding that people can report issues to its safety teams.
“Some of these sites violate our terms, and our teams are taking action to address these violations, as well as working on longer-term solutions,” Google spokesperson Karl Ryan says, pointing out that Google’s sign-in system requires developers to agree to its policies that prohibit illegal content and content that harasses others.
Cloudflare had not responded to WIRED’s request for comment at the time of writing. WIRED is not naming the nudifier websites in this story, so as not to provide them with further exposure.
Nudify and undress websites and bots have flourished since 2019, after originally spawning from the tools and processes used to create the first explicit “deepfakes.” Networks of interconnected companies, as Bellingcat has reported, have appeared online offering the technology and making money from the systems.
Broadly, the services use AI to transform photos into nonconsensual explicit imagery; they often make money by selling “credits” or subscriptions that can be used to generate photos. They have been supercharged by the wave of generative AI image generators that have appeared in the past few years. Their output is hugely damaging. Social media photos have been stolen and used to create abusive images; meanwhile, in a new form of cyberbullying and abuse, teenage boys around the world have created images of their classmates. Such intimate image abuse is harrowing for victims, and images can be difficult to scrub from the web.
Using various open source tools and data, including the website analysis tool BuiltWith, Indicator staff and investigative researcher Santiago Lakatos looked into the infrastructure and systems powering 85 nudifier websites. Content delivery networks, hosting services, domain name companies, and webmaster services are all provided by a mixture of some of the biggest tech companies and some smaller businesses.
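Checks like these can be partly automated. As a rough illustration (not Indicator’s actual tooling, and using a placeholder domain rather than any site in the study), the Python sketch below reads a site’s HTTP response headers for Cloudflare’s well-known fingerprints and compares its IP address against the public ranges AWS publishes:

```python
# Illustrative sketch only: a rough infrastructure check of the kind the
# researchers describe. "example.com" is a placeholder, not a studied site,
# and this is not Indicator's actual methodology or tooling.
import ipaddress
import json
import socket
import urllib.request

AWS_RANGES_URL = "https://ip-ranges.amazonaws.com/ip-ranges.json"  # published by AWS

def response_headers(domain: str) -> dict:
    """Fetch a site's HTTP response headers, lowercasing the keys."""
    req = urllib.request.Request(f"https://{domain}", method="HEAD")
    with urllib.request.urlopen(req, timeout=10) as resp:
        return {k.lower(): v for k, v in resp.headers.items()}

def behind_cloudflare(headers: dict) -> bool:
    """Cloudflare fronting usually shows up in standard response headers."""
    return headers.get("server", "") == "cloudflare" or "cf-ray" in headers

def hosted_on_aws(domain: str) -> bool:
    """Check whether the domain's IPv4 address falls inside AWS's ranges."""
    ip = ipaddress.ip_address(socket.gethostbyname(domain))
    with urllib.request.urlopen(AWS_RANGES_URL, timeout=10) as resp:
        prefixes = json.load(resp)["prefixes"]
    return any(ip in ipaddress.ip_network(p["ip_prefix"]) for p in prefixes)

if __name__ == "__main__":
    domain = "example.com"  # hypothetical placeholder
    headers = response_headers(domain)
    print(f"{domain}: Cloudflare={behind_cloudflare(headers)}, AWS={hosted_on_aws(domain)}")
```

One caveat: a Cloudflare-fronted site resolves to Cloudflare edge IPs, so the origin host has to be inferred from other signals, which is where services like BuiltWith come in.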
Based on calculations combining subscription costs, estimated customer conversion rates, and the web traffic the sites sent to payment providers, the researchers estimate that 18 of the websites made between $2.6 million and $18.4 million in the past six months, which could equate to around $36 million a year. (They note this is likely a conservative estimate, as it doesn’t incorporate all the websites or the transactions that take place away from them, such as those on Telegram.) Recently, whistleblower accounts and leaked data reported on by the German media outlet Der Spiegel indicated that one prominent website may have a multimillion-dollar budget. Another website has claimed to have made millions.
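As a back-of-the-envelope illustration of how such an estimate is shaped (with invented inputs, not Indicator’s actual figures), the model reduces to traffic times conversion times price:

```python
# Hypothetical inputs for a single site; none of these figures come from
# the Indicator research, they only illustrate the shape of the estimate.
monthly_visitors = 500_000    # visitors to the site in a month
checkout_share   = 0.05      # share of visitors passed to a payment provider
conversion_rate  = 0.20      # assumed share of those who actually pay
subscription_usd = 20.00     # assumed average monthly subscription price

paying_users    = monthly_visitors * checkout_share * conversion_rate
monthly_revenue = paying_users * subscription_usd
print(f"~${monthly_revenue:,.0f}/month, ~${monthly_revenue * 12:,.0f}/year")
# -> ~$100,000/month, ~$1,200,000/year for this one hypothetical site
```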
Of the 10 most-visited sites, the research says, the largest share of visitors came from the United States; India, Brazil, Mexico, and Germany round out the top five countries from which people accessed the sites. While search engines direct people to nudify websites, the sites have increasingly received visitors from other online sources. Nudifiers have become so popular that Russian hackers have created fake, malware-laced versions. Over the past year, 404 Media has reported on one site making sponsored videos with adult entertainers, and the websites have also increasingly used paid affiliate and referral programs.
“Our analysis of the nudifiers’ behavior strongly indicates their desire to build and entrench themselves in a niche of the adult industry,” Lakatos says. “They will likely continue to try to intermingle their operations into the adult content space, a trend that needs to be countered by mainstream tech companies and the adult industry as well.”
The problems created by tech companies allowing nudify platforms to use their systems are well documented. For years, tech journalists have reported on how the deepfake economy has used mainstream payment services, social media advertisements, search engine exposure, and technology from big companies to operate. Yet little comprehensive action has been taken.
“Since 2019, nudification apps have moved from a handful of low-quality side projects to a cottage industry of professionalized illicit businesses with millions of users,” says Henry Ajder, an expert on AI and deepfakes who first uncovered growth in the nudification ecosystem in 2020. “Only when businesses like these who facilitate nudification apps’ ‘perverse customer journey’ take targeted action will we start to see meaningful progress in making these apps harder to access and profit from.”
There are signs the nudify websites are updating their tactics to avoid potential crackdowns and evade bans. Last year, WIRED reported on how nudify websites used single sign-on systems from Google, Apple, and Discord to allow people to quickly create accounts. Many of the developer accounts were disabled following the reporting. However, Indicator says Google’s sign-in system is still being used on 54 of the 85 websites, and the website creators have taken steps to evade detection by Google. They would, the report says, use an “intermediary site” to “pose as a different URL for the registration.”
While tech companies and regulators have taken a glacial approach to tackling abusive deepfakes since they first emerged more than a decade ago, there has been some recent movement. San Francisco’s city attorney has sued 16 nonconsensual-image-generation services, Microsoft has identified developers behind celebrity deepfakes, and Meta has filed a lawsuit against a company allegedly behind a nudify app that, Meta says, repeatedly posted ads on its platform. Meanwhile, the controversial Take It Down Act, which US president Donald Trump signed into law in May, has put requirements on tech companies to remove nonconsensual image abuse quickly, and the UK government is making it illegal to create explicit deepfakes.
The moves may chip away at some nudifier and undress services, but more comprehensive crackdowns are needed to slow the burgeoning harmful industry. Mantzarlis says that if tech companies are more proactive and stricter in enforcing their policies, nudifiers’ ability to flourish will diminish. “Yes, this stuff will migrate to less regulated corners of the internet—but let it,” Mantzarlis says. “If websites are harder to discover, access, and use, their audience and revenue will shrink. Unfortunately, this toxic gift of the generative AI era cannot be returned. But it can certainly be drastically reduced in scope.”