The use of AI-powered apps that digitally undress women in photos is on the rise.

Apps and websites employing artificial intelligence to digitally undress women in photos are experiencing a surge in popularity, as revealed by researchers. In September alone, undressing websites attracted visits from 24 million individuals, according to findings from the social network analysis company, Graphika.

Graphika noted that many of these undressing, or “nudify,” services rely heavily on popular social networks for marketing. Since the start of the year, the researchers observed an increase of more than 2,400% in links advertising undressing apps on platforms such as X and Reddit. The services use AI to alter images so that the subject appears nude, and many of them are designed to work only on images of women.

These applications are part of a worrying trend of non-consensual pornography enabled by advances in artificial intelligence, a type of fabricated media known as deepfake pornography. Such content raises serious legal and ethical problems, as the images are often taken from social media without the subject's authorization.

The surge in popularity coincides with the release of several open-source diffusion models, AI systems that can generate images far more realistic than those produced just a few years ago, as Graphika highlighted. Because the models are open source, developers can obtain them for free.

Santiago Lakatos, an analyst at Graphika, noted that earlier deepfakes were often blurry, while these images are far more realistic. Some apps advertise, in explicit language, that users can create nude images and send them to the person who was digitally undressed, potentially inciting harassment.

Notably, one of the apps has paid for sponsored content on Google’s YouTube, appearing prominently in searches for the term “nudify.” Google stated that it doesn’t allow ads containing sexually explicit content and that it is removing ads that violate its policies. Reddit, which prohibits the non-consensual sharing of faked explicit material, banned several domains following the research. X did not respond to requests for comment.

Aside from the rising traffic, these services, some of which charge $9.99 per month, claim substantial user bases; one app advertises more than a thousand users per day. Privacy experts are growing increasingly concerned as advances in AI make deepfake software more accessible.

While there’s currently no federal law against creating deepfake pornography, the US government prohibits generating such images involving minors. In November, a North Carolina child psychiatrist received a 40-year sentence for using undressing apps on patient photos, marking the first prosecution under laws banning deepfake generation of child sexual abuse material.

TikTok and Meta Platforms Inc. have begun blocking keywords associated with searches for undressing apps, warning users that such content may violate their guidelines. TikTok declined to elaborate, and Meta declined to comment.
