Facebook is rating the trustworthiness of its users to fight fake news
The company has long relied on its users to report problematic content, but some began falsely reporting items as untrue
Facebook has begun to assign its users a reputation score, predicting their trustworthiness on a scale from zero to one.
The previously unreported ratings system, which Facebook has developed over the past year, shows that the fight against the manipulation of tech platforms has evolved to include assessing the credibility of users themselves in order to help identify malicious actors.
Facebook developed its reputation assessments as part of its effort against fake news, Tessa Lyons, the product manager who is in charge of fighting misinformation, said in an interview.
The company, like others in the tech sector, has long relied on its users to report problematic content – but as Facebook gave people more options, some began falsely reporting items as untrue, a new twist on information warfare that the company had to account for.
It’s “not uncommon for people to tell us something is false simply because they disagree with the premise of a story or they’re intentionally trying to target a particular publisher”, said Lyons.
A user’s trustworthiness score is not meant to be an absolute indicator of a person’s credibility, Lyons said, nor is there a single unified reputation score that users are assigned.
Rather, the score is one measurement among thousands of new behavioural clues that Facebook now takes into account as it seeks to understand risk.