How AI ‘Nudify’ Sites Quietly Make $36 Million a Year

A new investigation exposes how shady AI-powered “nudify” sites are cashing in on stolen images, lax oversight, and the dark side of generative AI—raking in millions while victims struggle to fight back.

Millions of people each month are quietly fueling a disturbing corner of the AI world—one that experts say is raking in tens of millions by turning ordinary photos into nonconsensual fake nudes.

A fresh deep-dive from Indicator, a publication known for tracking digital deception, reveals that 85 so-called “nudify” and “undress” websites are not just surviving; they’re thriving. Worse, they’re doing it with the help of everyday tech giants like Google, Amazon, and Cloudflare.

Key Takeaways:

  • Around 18.5 million visitors hit nudify sites every month.
  • These sites may be earning up to $36 million a year.
  • Major tech firms’ services keep them running—often in violation of terms.
  • Victims include women and girls whose stolen images become explicit deepfakes.
  • New laws are emerging, but loopholes and slow enforcement keep the industry alive.

Silicon Valley’s Blind Spot

The numbers are shocking. Indicator’s researchers found that these sites pull in an average of 18.5 million visitors monthly and might be making up to $36 million each year—mostly by selling credits or subscriptions to produce fake explicit images in seconds.

So how are they still online? The short answer: big tech. Over 70% of these sites rely on cloud services, hosting, and content delivery from Amazon and Cloudflare. More than half use Google’s convenient single sign-on feature, letting users create accounts with just a click.

Alexios Mantzarlis, a cofounder of Indicator, didn’t hold back: “They should have stopped supporting AI nudifiers the moment it was obvious they’re designed for sexual harassment,” he said. And he’s not alone—online safety experts warn that tech companies’ sluggish response has turned a fringe threat into a booming underground economy.

The Disturbing Rise of Nudify Apps

What exactly are nudify sites? Imagine uploading a selfie—yours, or someone else’s—and in seconds, AI spits out a fake nude. These apps and bots first appeared around 2019, piggybacking on deepfake technology.

Since then, they’ve exploded. Some operate through slick networks of interconnected companies, payment processors, and even shady Telegram channels to hide their tracks. And the money? It adds up fast. A batch of 18 sites alone may have made between $2.6 million and $18.4 million in just six months.

Who’s Using Them—and Who Gets Hurt

The top countries feeding traffic to these sites include the United States, India, Brazil, Mexico, and Germany. While search engines help drive clicks, researchers found that nudify sites increasingly lure users through affiliate marketing, sketchy sponsored videos featuring adult performers, and malware-laced clone sites set up by opportunistic hackers.

Meanwhile, the human toll is severe. Social media photos are scraped and misused. Teenage boys have used these apps to humiliate classmates. Once images are out there, removing them is a nightmare. “This is the latest twist on intimate image abuse,” says Henry Ajder, an expert who has tracked the deepfake economy for years.

The Fight to Shut It Down

Major companies insist they have rules in place to stop this. Amazon spokesperson Ryan Walsh says AWS quickly investigates reports and takes down prohibited content. Google notes that its sign-in terms bar services that host illegal or harassing material. But according to Indicator, some developers have gotten crafty, using “intermediary” sites to slip past detection.

Cloudflare, which handles site security and delivery for dozens of these sites, declined to comment for the original report. That silence frustrates watchdogs who say the same loopholes have lingered for years.

Despite the grim stats, signs of progress are emerging. Microsoft recently tracked down developers behind celebrity deepfakes. Meta sued a firm for allegedly pushing nudify ads on its platforms. San Francisco’s city attorney filed suit against 16 AI image abuse sites. And new laws, like the controversial Take It Down Act signed last spring, are finally putting pressure on tech companies to react faster.

Will Anything Change?

Experts say it’s a race between shady site owners and the companies that keep their servers humming. So far, the sites are adapting quickly—using sneaky sign-ups, new payment options, and back-alley marketing to dodge bans.

Still, Alexios Mantzarlis argues that stronger enforcement would shrink the industry: “If they’re harder to find, they’re harder to use—and that hits their bottom line. This toxic part of the AI boom can’t be undone, but we can starve it.”

Conclusion

Generative AI has unlocked amazing possibilities. But as this hidden world of nudify sites proves, it’s also unleashed a dark market profiting from stolen images and silent victims. It’s up to regulators—and Silicon Valley’s biggest names—to decide whether to pull the plug or keep looking the other way.
