Apple Removes Apps Generating Non-Consensual Nude Images Using AI

Apple has removed three applications from its App Store that were capable of generating non-consensual nude images using artificial intelligence (AI).
The apps came to light after they were promoted through advertisements on Instagram and subsequently reported by 404 Media.
The apps in question offered face swaps onto adult images and AI-powered “undressing” features.
Meta, the parent company of Instagram, promptly removed the ads following the reports, while Apple initially refrained from commenting on the matter.
This is not the first incident of its kind; similar apps were reported as early as 2022.
Tech giants such as Apple and Google have faced criticism for not acting more swiftly against such apps, instead merely urging developers to stop advertising these capabilities on adult websites.
The proliferation of these apps has raised particular concern in schools and colleges, where they have gained popularity among students worldwide.
The issue reflects broader concerns about privacy and consent in the digital age, and it has prompted calls for technology companies to take more vigilant, proactive measures against such threats to user privacy and security.
