Facebook to add 3,000 moderators to monitor videos and other flagged reports

HIGHLIGHTS

Facebook CEO Mark Zuckerberg made the announcement in response to recent high-profile incidents of violent content being posted on the social media platform.

Facebook CEO Mark Zuckerberg has announced that the company will add 3,000 people over the next year to review videos and other flagged reports on the social media platform. The announcement comes in light of numerous high-profile incidents of violent videos being posted on the service.

Zuckerberg wrote, “If we're going to build a safe community, we need to respond quickly. We are working to make these videos easier to report so we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down.”

The new moderators will work “to review the millions of reports we get every week, and improve the process of doing it quickly.” Zuckerberg also said the company will work on simplifying the flagging process, making it easier for users to report problems and for moderators to contact law enforcement.

Last month, Facebook faced heavy criticism after a video of a deadly shooting in Cleveland was posted on the platform and stayed on the site for hours. Facebook apologized for the way it handled the situation and pledged “to do better”. Soon after, however, another incident took place in Thailand, where a video of a child’s murder remained on the platform for a full day.

Zuckerberg said the 3,000 new hires will join the 4,500 people already working on the company’s community operations team. The statement did not explain how exactly the moderators will operate within Facebook.

Paurush Chaudhary
