Apple and Google are facing scrutiny after a technology watchdog reported that both companies allowed dozens of artificial intelligence (AI) apps capable of creating non-consensual nude images to operate on their app stores, despite policies that prohibit such content.
The findings were published on January 27, 2026, by the Tech Transparency Project (TTP), a nonprofit research group based in Washington, D.C.
According to the report, TTP identified 55 so-called “nudify” apps on Google Play and 47 similar apps on the Apple App Store. The apps use AI technology to digitally remove clothing from images of women or to place real faces onto explicit images, creating what are commonly known as deepfake nudes.
“These apps make it easy to create sexualized images of women without their consent,” TTP said in its report.
TTP estimated that the apps had been downloaded more than 705 million times worldwide and generated approximately $117 million in revenue through subscriptions and in-app purchases.
Under standard app store rules, Apple and Google collect a commission of up to 30 percent on developer earnings, meaning both companies benefited financially for as long as the apps remained available.
Researchers also found that some of the apps were rated suitable for children as young as nine.
Both Apple and Google have clear policies banning sexual nudity, pornographic material and apps that exploit or dehumanize people. However, the TTP report said enforcement of those policies failed in practice.
Many of the nudify apps were promoted using neutral descriptions such as “AI photo editor” or “virtual fitting room,” allowing them to bypass app review systems.
In one case cited by researchers, Apple was found displaying sponsored advertisements for a nudify app when users searched for restricted terms, generating additional advertising revenue for the company.
“When you search the word ‘nudify,’ nothing should appear under Apple’s own rules,” said Katie Paul, Director of the Tech Transparency Project. “Instead, these apps were available and even promoted.”
Following inquiries from researchers and media outlets, Apple said it had removed 28 apps flagged in the report, though two were later restored after developers modified them. TTP said that, as of January 27, only 24 of the flagged apps had been fully removed from Apple’s store.
Google said it had suspended several apps and launched an internal review.
“When violations of our policies are reported, we investigate and take appropriate action,” a Google spokesperson said in an email response.
Earlier this month, Elon Musk’s AI chatbot Grok faced international backlash after users prompted it to generate nude images, including of minors. Regulators in the European Union launched investigations, and Indonesia blocked access to the service.
TTP warned that nudify apps present risks beyond sexual abuse, including data privacy and national security concerns, especially where developers operate in countries with limited data protection laws.
“Non-consensual intimate images could end up stored or shared without users’ knowledge,” Paul said. “That creates long-term harm for victims.”
Apple and Google continue to promote their app stores as safe and carefully moderated. However, the TTP report argues that the continued presence of nudify apps shows a gap between policy and practice.
“The rules are written,” Paul said. “What’s missing is consistent enforcement.”