“Everyone is subject to being objectified or pornographied by everyone else,” University of California, Irvine law professor Ari Ezra Waldman told a 2024 House committee hearing on deepfakes. AI image-generation technology can now put your head on someone else’s body, or create a convincing facsimile of you from a single photo, and that capability has spawned an avalanche of AI deepfake apps designed specifically to sexually harass women by putting them in nonconsensual deepfake porn. In September, CNBC published a report on a group of women in Minnesota who were victimized by pornographic deepfakes, an ordeal that sent them on a horrifying journey to try to get the fake but highly realistic sexual images of themselves off the internet.
“He did not break any laws that we’re aware of,” Molly Kelley told CNBC, referring to her friend’s estranged ex-husband, who created the nonconsensual pornographic images of them using an AI tool. “And that is problematic.” CNBC interviewed experts who said “that many apps that have nudify services advertise on Facebook and are available to download from the Apple App Store and Google Play Store.” This week, the Tech Transparency Project (TTP) published an exposé building on what those researchers told CNBC, titled “Nudify Apps Widely Available in Apple and Google App Stores.”
Elon Musk’s Grok chatbot is perhaps the most shameless nudify app, even admitting to producing Child Sexual Abuse Material (CSAM) after it acquiesced to a user’s request to undress an underage girl. Congressional Democrats subsequently called on Apple and Google to pressure Musk, because the nonconsensual porn Grok creates is a feature of Musk’s LLM, not a bug, and it violates both Apple’s and Google’s terms of service. But TTP found that Grok is far from the only nonconsensual porn creation app available for almost anyone to download from Google and Apple.