Facebook to deploy AI to tackle revenge porn on its platforms
The social network will deploy machine learning and artificial intelligence to detect near-nude images and videos that have been shared without the owner's permission, and delete them. Facebook claims it will be able to find the content even before anyone reports it.
Highlights:
- Facebook to use machine learning and AI to detect intimate photos of people shared without their consent.
- The technology will be used in addition to human reviewers, who will assess the images flagged by the AI.
- If found offensive, the image will be taken down and the account responsible for spreading it will be disabled.
Facebook announced on Friday that it will be using artificial intelligence to curb 'revenge porn' on its social network. The obscene practice, which involves sharing intimate photos of someone on social media without their consent, has plagued Facebook for a long time. The new technology will work in addition to a pilot program Facebook is running with trained representatives who review offending images to keep them in check.
The social network will deploy machine learning and artificial intelligence to detect near-nude images and videos that have been shared without the owner's permission, and delete them. Facebook claims it will be able to find the content even before anyone reports it. Once the AI flags offending content, a member of Facebook's community operations team will review the finding and, if it is found offensive, remove it and potentially disable the account responsible for posting and spreading it.
Revenge porn is a practice that disproportionately affects women, who are targeted by former partners spreading sexually explicit, intimate photos of them on social media in order to humiliate or extort them. Presently, Facebook works with at least five outsourcing companies in eight countries to review content posted on its social networks. In total, Facebook has about 15,000 people, a mix of contractors and employees, working on content review, according to Reuters.
Facebook also wrote in a blog post that it will open a hub called "Not Without My Consent" on its Safety Center page to help people whose intimate photos have been made public on social media without their consent.
Related Read:
New Zealand shootings: Facebook removed 1.2 million videos at point of upload, 300K videos bypassed filtration systems
Digit NewsDesk
Digit News Desk writes news stories across a range of topics, getting you updates on the latest in the world of tech.