
A recent investigation has sparked a serious conversation about how effectively app store platforms are being policed. According to a detailed report from the Tech Transparency Project (TTP), both Apple and Google have been hosting and, in some cases, actively directing users toward so-called “nudify” apps—tools that use generative AI to create non-consensual explicit images—through their app stores, advertising, and search systems.
AI nudify apps are bypassing Apple and Google’s filters in app stores and search systems
On paper, the rules are clear. Both tech giants maintain strict policies against pornographic content and apps that facilitate sexual exploitation. Google even has specific language prohibiting apps that “nudify” real people. Despite these guardrails, the TTP report (via Bloomberg) reveals that searching for terms like “undress” or “nudify” on the iOS App Store and Google Play often yields dozens of results.
Perhaps more startling is the role of the platforms’ own discovery tools. The investigation found that search algorithms suggested these terms via autocomplete and even displayed paid advertisements for deepfake-capable apps. In other words, while the companies ban the content, their automated systems are occasionally working in the opposite direction.
A multi-million dollar problem
The scale of this niche market is significant. Data from mobile analytics firms cited by Engadget shows that the apps identified in the report have collectively generated over 483 million downloads and $122 million in revenue.
A major point of contention involves age ratings. The report identified 31 apps rated “E” for Everyone or otherwise marked as suitable for minors.
The cleanup effort
Following the report’s release, both companies took action. Apple reportedly removed 15 of the flagged apps, while Google suspended several others. Both stated that their investigation and enforcement processes are ongoing. Google reiterated to Android Authority that it does not allow apps containing sexual content and takes action whenever violations are reported.
However, the very nature of these apps remains a challenge. As soon as one batch is removed, others often reappear under different names or with vaguely described features that obscure their AI capabilities. Lawmakers in regions like Minnesota and the UK are moving toward outright bans on this specific technology. Meanwhile, the pressure is on Silicon Valley to ensure its automated search and ad systems aren’t inadvertently profiting from the very content its policies claim to forbid.
The post Apple & Google Filters Are Still Letting AI Nudify Apps Slip Through the Cracks appeared first on Android Headlines.