In its review, the group found 55 apps on Google Play and 47 on the App Store. TTP contacted both companies requesting the services be removed. Apple removed 28 nudify tools and warned developers that their products could be taken down if they violate store policies.
Two of the apps were later reinstated after their issues were reportedly resolved.
A Google spokesperson said the company had suspended several of the apps and opens investigations whenever it receives such reports.
“Both companies claim they care about user safety, yet they host apps in their stores that can turn an innocent photo of a woman into a degrading sexual image,” TTP wrote in its report.
The analysts identified the apps by searching for terms like “nudify” and “undress,” and tested them using AI-generated images. They examined two types of services:
- those that used AI to render images of women without clothing;
- others that placed people’s faces onto explicit photos.
“It’s completely obvious these aren’t just ‘outfit change’ apps. They’re clearly designed to sexualize people without their consent,” said Katie Paul, director at TTP.
TTP added that 14 of the apps were created in China.
“China’s data storage laws mean the government has the right to access any company’s data anywhere in the country. So if someone created deepfakes using your image, those images may now be in the hands of Chinese authorities,” Paul added.
AI used for harm
AI has made “undressing” women and creating non-consensual deepfake porn easier than ever. In January, the chatbot Grok faced backlash over a similar capability, and its developer, xAI, later disabled the generation of explicit images of real people.
In August 2024, San Francisco City Attorney David Chiu filed a lawsuit against the owners of 16 of the world’s largest websites that enable users to “undress” women and girls in photos using AI without consent.
The filing alleges violations of state and federal laws related to deepfake pornography, revenge porn, and child sexual abuse material.
“Generative AI has enormous potential, but as with all new technologies, there are unintended consequences and criminals looking to exploit it. We must be clear: this is not innovation — it is sexual abuse,” Chiu said.
According to the statement, the sites named in the case offer streamlined photo-upload tools that generate realistic pornographic versions of the images, often nearly indistinguishable from real photos, which are then used for extortion, intimidation, threats, and humiliation.