Apple and Google reportedly hosting deepfake nudity apps in breach of their own policies

A new investigation by the Tech Transparency Project (TTP) has found that Apple and Google are not just failing to remove AI-powered “nudify” apps from their platforms; their own search and advertising systems are actively promoting them. Type “nudify,” “undress,” or “deepnude” into either app store’s search bar, and the results speak for themselves.


The findings are concerning, to say the least. Around 40 percent of the top apps returned for those search terms were capable of digitally stripping clothes from photographs of women. Both Apple and Google have also run sponsored advertisements for nudify apps within those same search results, and even autocomplete suggested additional search terms that lead users further down the rabbit hole.

The apps being so easily available is one thing; the scale at which they are used is another. The nudify apps in the TTP report have been downloaded a combined 483 million times and generated over $122 million in revenue. Apple and Google collect a cut of in-app purchases and subscription fees from these apps, which makes it difficult to take their policies seriously. When that much profit is involved, the incentive to act vanishes altogether.


If you think that is bad, it gets worse: 31 of the identified apps were rated suitable for minors. At a time when schools across the world are grappling with sexual deepfake scandals involving students, the idea that a minor could not just stumble across but be actively steered toward an app capable of generating nonconsensual nude imagery is deeply dangerous.

Both companies have long maintained policies that should, in theory, keep these apps out. Apple bars content that is “offensive” or “pornographic.” Google bans apps that “claim to undress people or see through clothing.” Yet TTP’s testing found that nearly half of the apps surfacing in search results violated those very standards. After the report was shared with the companies and Bloomberg News, Apple removed 14 apps and Google removed seven. So removal was always possible; it simply wasn’t a priority.

This is not a case of technology outpacing regulation, or AI evolving faster than policy can keep up. This is a case of two of the most powerful companies in the world knowing that harmful apps exist on their platforms and choosing revenue over accountability. The tools to act were always there. The will, it seems, was not.


Vyom Ramani

A journalist with a soft spot for tech, games, and things that go beep. While waiting for a delayed metro or rebooting his brain, you’ll find him solving Rubik’s Cubes, bingeing F1, or hunting for the next great snack.
