How trustworthy would you say you are? How trustworthy would your friends say you are? The answers to these may be very different, and it turns out that Facebook may have its own opinion on the question too.
In an interview with The Washington Post, Tessa Lyons, a Facebook product manager, has revealed that users of the service have their trustworthiness predicted on a scale of zero to one.
It’s all part of Facebook’s attempt to get a handle on its reputation for helping to spread fake news. By putting a figure on a user’s credibility, it can flag those who are likely trying to push or suppress stories based on an ideological agenda.
One aspect of this is people reporting content – and Lyons and her team discovered that some users were routinely flagging content as untrue for partisan reasons. It’s “not uncommon for people to tell us something is false simply because they disagree with the premise of a story or they’re intentionally trying to target a particular publisher,” Lyons explained.
It’s not clear how many Facebook users have this score enabled, and how big an impact it has on an algorithm that takes in thousands of behavioural cues. But ever since the company started employing independent fact checkers, it has had the ability to spot when people are routinely crying fake news on stories that turn out to be real, and vice versa.
“One of the signals we use is how people interact with articles,” Lyons explained. “For example, if someone previously gave us feedback that an article was false and the article was confirmed false by a fact-checker, then we might weight that person’s future false-news feedback more than someone who indiscriminately provides false-news feedback on lots of articles, including ones that end up being rated as true.”
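The weighting idea Lyons describes could be sketched roughly as follows. This is purely an illustrative assumption on our part, not Facebook’s actual system: the function names, the smoothing prior and the formula are all hypothetical, and the real algorithm reportedly combines thousands of behavioural cues.

```python
def reporter_weight(confirmed_false, total_reports, prior=1.0):
    """Hypothetical reporter credibility in (0, 1): the fraction of a
    user's past "this is false" reports that fact-checkers later
    confirmed, smoothed with a small prior so brand-new reporters
    start near 0.5 rather than at either extreme."""
    return (confirmed_false + prior) / (total_reports + 2 * prior)


def weighted_report_score(reports):
    """Sum each reporter's flag on an article, scaled by their
    historical accuracy. `reports` is a list of
    (confirmed_false, total_reports) tuples, one per user who
    flagged the article as false."""
    return sum(reporter_weight(c, t) for c, t in reports)
```

Under this sketch, a reporter whose flags were confirmed 9 times out of 10 carries far more weight (about 0.83) than one who indiscriminately flagged 20 articles with only 1 confirmed (about 0.09), matching the behaviour Lyons describes.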
This is just one signal of trustworthiness, and Lyons declined the opportunity to reveal more, as it could make it easier to game the system.
What’s interesting, however, is that Lyons’ comments seem to contradict what CEO Mark Zuckerberg said recently about the company’s reluctance to judge individual intent.
“It’s hard to impugn intent and to understand the intent,” he replied when interviewer Kara Swisher put it to him that Holocaust deniers might be intentionally spreading misinformation, rather than genuinely believing what they post.
How effective these steps are at stemming the tide of misinformation spreading across Facebook may not become clear for years. But it’s a bold revelation to make at a time when the President of the US – a big beneficiary of social media companies’ reluctance to get to grips with the issue – is campaigning against internet companies’ ability to censor political voices.
How do you feel about Facebook judging an individual’s trustworthiness? Let us know on Twitter @TrustedReviews.