AI Nudify Apps Spark Concern in App Stores
A recent report by the Tech Transparency Project has highlighted a troubling trend: dozens of "nudify" apps on Apple's and Google's app stores that use artificial intelligence to generate deepfake nude images of real people. The findings have ignited widespread concern and raised questions about how such applications were allowed to proliferate despite both companies' existing app store policies.
The investigation uncovered 55 nudify apps on the Google Play Store and 47 on the Apple App Store. Alarmingly, these apps have collectively been downloaded over 700 million times and have generated more than $117 million in revenue. Because Apple and Google take a commission on in-app purchases, both companies share in that revenue, raising ethical questions about their responsibility to protect users.
Particularly alarming are the age ratings assigned to these apps. Many, such as DreamFace, are rated as suitable for teenagers and even children: Google Play rates it for ages 13 and up, while the Apple App Store rates it for ages 9 and up. Ratings like these raise serious concerns about minors' exposure to potentially harmful content.
In response to the growing scrutiny, Apple announced it had removed 24 apps from its store, though that is fewer than the total the report identified. Google said it suspended several apps for violating its store policies but did not disclose specific numbers. These responses have led to calls for more robust regulatory measures to prevent the spread of such harmful applications.
Compounding the issue, nudify apps and websites use generative AI to produce realistic deepfake nude images, digitally removing clothing from photos or otherwise manipulating them. The practice predominantly targets women and children, and it poses serious mental health risks for victims, with women and girls bearing the brunt of sexually explicit deepfakes online.
While it is illegal to possess AI-generated sexual content depicting minors, the AI models used to create these images are not themselves prohibited. In December, the UK government announced plans to ban nudification apps and to make it illegal to create or distribute AI tools that remove clothing from images, an initiative that forms part of a broader strategy to combat misogyny and violence against women and girls.
The situation underscores the urgent need for stricter regulation and greater accountability from tech companies so that their platforms do not become breeding grounds for harmful content. As scrutiny of user safety intensifies, both Apple and Google will need to take decisive action to protect vulnerable populations from the dangers these nudify apps pose.