Apple has removed apps from its App Store following reports that they could create nonconsensual nude images. This crackdown, prompted by a report from 404 Media, highlights the growing concern around the misuse of AI image generation.
How the Apps Were Found
404 Media discovered the apps through Instagram ads promoting them as “free” ways to “undress” people. On the App Store itself, they were deceptively listed as “art generators,” hiding their true capabilities.
Apple’s Response
Apple initially did not respond to 404 Media’s request for comment. After the story was published, however, the company contacted the outlet for more information and removed the offending apps once it received direct links to them.
Ongoing Challenge
As 404 Media’s Emanuel Maiberg points out, this is likely to be an ongoing battle. Apple was unable to locate the violating apps on its own, which suggests it needs more proactive monitoring to catch similar apps before they reach users.
