Last week, the creators of the deepfake app DeepNude took it down after receiving flak from all around the world. The app used AI to superimpose nude body parts onto real photos of women, resulting in thousands of fake, non-consensual obscene images. When they took it down, the creators themselves admitted that the app had a high probability of being misused. Duh!
Now, The Verge reports that it has found multiple copies of the DeepNude app floating around on the internet, including in YouTube video descriptions, Telegram channels, on 4chan, GitHub and other sources.
The app was earlier being sold on a Discord server for $20, as reported by Motherboard. The server was later removed, but the seller on Discord had written, “We are happy to announce that we have the complete and clean version of DeepNude V2 and cracked the software and are making adjustments to improve the program.”
Meanwhile, The Verge reports that the person who uploaded an open-source version of the app to GitHub expressed resentment at people trying to “censor knowledge.” The publication says it was able to find copies of the app from several sources, including versions that did not add the watermark flagging the images as fake.
This is not the first time deepfake technology has been misused. In the past, multiple people, including celebrities, have suffered the consequences of fake images created using this method. Deepfakes are also used to spread fake news on social media platforms and create fake content across video platforms. Sadly, there is no stopping the internet from sharing and creating more copies of apps like DeepNude. There’s always a flipside to technology, and this is it for computer vision.