GitHub disabled the project after finding it in violation of the platform's acceptable use policy.
A week after media reports surfaced about the presence of the DeepNude app's code on GitHub, the Microsoft-owned software development hosting platform said it has removed all repositories of the controversial app. DeepNude was designed to create realistic nude images of women without their consent, using AI to remove clothes from photos. The underlying technology superimposes existing images (and videos) onto source images using machine learning techniques.
“We do not proactively monitor user-generated content, but we do actively investigate abuse reports. In this case, we disabled the project because we found it to be in violation of our acceptable use policy. We do not condone using GitHub for posting sexually obscene content and prohibit such conduct in our Terms of Service and Community Guidelines,” Motherboard quoted a GitHub spokesperson as saying.
In the “Sexually Obscene” section of Community Guidelines, GitHub notes, “Don’t post content that is pornographic. This does not mean that all nudity, or all code and content related to sexuality, is prohibited. We recognise that sexuality is a part of life and non-pornographic sexual content may be a part of your project, or may be presented for educational or artistic purposes. We do not allow obscene sexual content or content that may involve the exploitation or sexualisation of minors.”
Soon after the creators of DeepNude pulled the plug on the app, multiple versions of it popped up on platforms like GitHub, YouTube and Telegram. There is no word yet on how platforms other than GitHub plan to remove links that direct users to download copies of the app.
The app was originally withdrawn following a public backlash: after acknowledging that it was being misused, the developers stopped offering downloads.
“We never thought it would be viral and (that) we would not be able to control the traffic. Despite the safety measures adopted (watermarks), if 500,000 people use it, the probability that people will misuse it is too high. We don't want to make money this way,” DeepNude creators said on Twitter.
While the same results could be achieved with other photo editing software, DeepNude apparently produced "passably realistic" images. Moreover, conventional photo editing requires expertise to create such images, whereas DeepNude let anyone without technical or artistic skill produce them. Unfortunately, DeepNude is just one example of deepfake technology being misused; the technology is widely seen as a vehicle for spreading nonconsensual pornographic content and as a means of revenge porn.