Major mobile app stores have a troubling problem. Although Apple and Google both claim to ban apps that digitally undress people, a recent study found that many such apps remain available. The Tech Transparency Project identified 55 of these apps on Google’s Play Store and 47 in Apple’s App Store. Collectively, they have been downloaded a staggering 705 million times and generated around $117 million in revenue.
This finding comes at a time of increased scrutiny, especially after Elon Musk’s AI tool, Grok, generated global outrage for enabling similar actions. UK regulators have even considered banning Grok due to public outcry.
“Grok is just the beginning,” said Katie Paul, director of the Tech Transparency Project. “While Grok got the attention, there are even more graphic apps out there.” The sheer volume of downloads points to real demand for these tools, raising serious questions about safety and ethics.
In response to the findings, both companies have begun reviewing the apps in question. Google confirmed it has removed several problematic apps while its investigation continues. Apple did not respond to requests for comment, and it has faced criticism for running ads for apps it supposedly bans. “Users could search for ‘nudify,’ and not only would they find the app, Apple was also promoting it,” Paul noted.
The researchers tested the apps and found they could easily create non-consensual images by modifying existing photographs. Although 25 of the apps have been taken down since the report was published, the fact that they were available at all is alarming.
In a prior investigation, the Tech Transparency Project found that both app stores had hosted apps connected to entities sanctioned by the U.S. government. Google and Apple pulled those apps at the time, but the continued presence of harmful apps like these highlights a critical gap in policy enforcement.
Paul emphasized that these apps are marketed not just to adults but also to minors, with some aimed at children as young as nine years old. “This isn’t just a grown-up issue,” she pointed out. “Kids are getting targeted too.”
Another red flag involves data privacy. There are concerns that these non-consensual images could end up in the hands of foreign governments, particularly in light of certain data laws in countries like China. “That raises serious privacy and national security issues,” Paul warned, especially if political figures become victims of such apps.
In a world increasingly reliant on technology, the gap between policy and practice is widening. While both Apple and Google assert that their platforms are safe and reliable, the reality suggests otherwise. “Their review processes look good on paper, but in practice, we see a different story,” said Paul. It’s clear that until robust measures are taken, consumers remain at risk.

