“What’s your Facebook trustworthiness score?” could be the next big question people ask their friends. The reason is a new algorithm Facebook has developed to assign its users a reputation score, which in turn helps the company predict users’ trustworthiness on a scale from zero to one. According to a report in The Washington Post, the rating system has been developed over the past year to measure the credibility of users and identify malicious actors who spread fake news on the platform.
According to Tessa Lyons, a Product Manager at Facebook who is in charge of fighting misinformation, Facebook developed its reputation assessments as part of its effort against fake news. To curb the spread of fake news, Facebook gave its users tools to report problematic content. But things did not go as planned, and users began falsely reporting items as untrue. “It’s not uncommon for people to tell us something is false simply because they disagree with the premise of a story or they’re intentionally trying to target a particular publisher,” Lyons was quoted as saying.
But the trustworthiness score is not a definitive measure of a user’s credibility. Lyons said that there is no single unified reputation score that users are assigned, and that “the score is one measurement among thousands of new behavioural clues that Facebook now takes into account as it seeks to understand risk.” Facebook is monitoring which users have a tendency to flag content as problematic and which publishers are considered trustworthy by users.
Currently, there is no known way for users to check their scores. Facebook is also keeping under wraps the other criteria it uses to determine a user’s score.