An investigation by the Tech Transparency Project has raised fresh concerns about the presence of so-called nudify apps in the Apple and Google app stores. According to the report, both Apple's App Store and the Google Play Store continue to feature apps capable of creating deepfake nude images, sometimes surfaced through search suggestions and paid promotions.
The report found that many of the top search results for terms related to such content included apps that can digitally alter images to depict women in explicit or semi-nude forms. It also cited several instances in which promoted listings appeared at the top of search results.
In one case, a face-swapping app was displayed as a sponsored result for a deepfake-related query, and testing revealed that it could insert a person's face into explicit video content with no meaningful safeguards. Similar behaviour was observed in other apps: all of them offered face-swap templates that let users combine clothed and nude images with few restrictions.
The report also flags autocomplete suggestions as a concern. Partial search inputs were found to steer users towards more explicit queries, which in turn surfaced additional apps of this type among the top results.
TTP also noted that some developers may not fully understand the capabilities of the AI tools they are using. In one case, a developer admitted to using AI image generation technology and promised to implement stronger moderation measures after the issue was raised.
Neither company has issued a detailed public response to the findings, though Apple has removed several of the apps, the report said. Still, the watchdog group argues that more consistent enforcement and stronger safeguards are needed to keep such apps from reappearing on mainstream platforms.