
Trusted Reviews is supported by its audience. If you purchase through links on our site, we may earn a commission. Learn more.

Facebook has started rating users’ “trustworthiness”

How trustworthy would you say you are? How trustworthy would your friends say you are? The answers to these may be very different, and it turns out that Facebook may have its own opinion on the question too.

In an interview with The Washington Post, Tessa Lyons, a Facebook product manager, has revealed that users of the service have their trustworthiness predicted on a scale of zero to one.

It’s all part of Facebook’s attempt to get a handle on its reputation for helping to spread fake news. By putting a figure on a user’s credibility, it can flag those who are likely trying to push or suppress stories based on an ideological agenda.

Related: How to delete a Facebook account

One aspect of this is people reporting content – and Lyons and her team discovered that some users were routinely flagging content as untrue for partisan reasons. It’s “not uncommon for people to tell us something is false simply because they disagree with the premise of a story or they’re intentionally trying to target a particular publisher,” Lyons explained. 

It’s not clear how many Facebook users have been assigned this score, or how big an impact it has on an algorithm that takes in thousands of behavioural cues. But ever since the company started employing independent fact-checkers, it has been able to spot when people routinely cry fake news on stories that turn out to be real, and vice versa.

“One of the signals we use is how people interact with articles,” Lyons explained. “For example, if someone previously gave us feedback that an article was false and the article was confirmed false by a fact-checker, then we might weight that person’s future false-news feedback more than someone who indiscriminately provides false-news feedback on lots of articles, including ones that end up being rated as true.”

This is just one signal of trustworthiness, and Lyons declined the opportunity to reveal more, as it could make it easier to game the system.
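Facebook hasn’t published how the score is computed, but the one mechanism Lyons did describe – weighting a user’s future false-news reports by how often their past reports matched fact-checker verdicts – can be sketched roughly. Everything below (the `Reporter` class, the Laplace smoothing) is an illustrative assumption, not Facebook’s actual method:

```python
from dataclasses import dataclass

@dataclass
class Reporter:
    """Hypothetical tracker of one user's false-news reporting track record."""
    confirmed: int = 0   # reports later confirmed false by a fact-checker
    total: int = 0       # all false-news reports the user has filed

    def record(self, confirmed_false: bool) -> None:
        """Log one report and whether a fact-checker upheld it."""
        self.total += 1
        if confirmed_false:
            self.confirmed += 1

    def weight(self) -> float:
        # Laplace-smoothed hit rate in (0, 1): a strong track record
        # approaches 1.0, while indiscriminate flagging drags the
        # weight down toward the prior of 0.5-and-below.
        return (self.confirmed + 1) / (self.total + 2)

def weighted_report_score(reporters) -> float:
    """Sum of reliability weights of the users who flagged an article."""
    return sum(r.weight() for r in reporters)
```

Under this toy model, a user whose reports are consistently confirmed ends up counting for far more than one who flags everything they disagree with, which is the behaviour Lyons describes.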

What’s interesting, however, is that Lyons’ words seem to go against what CEO Mark Zuckerberg said recently about the company’s reluctance to judge individual intent.

“It’s hard to impugn intent and to understand the intent,” he replied, when interviewer Kara Swisher put it to him that Holocaust deniers might be intentionally spreading misinformation, rather than genuinely believing what they share.

How effective these steps are at stemming the tide of misinformation spreading across Facebook may not become clear for years. But it’s a bold revelation to make at a time when the President of the US – a big beneficiary of social media’s reluctance to get a grip on the issue – is campaigning against internet companies’ ability to censor political voices.

How do you feel about Facebook judging an individual’s trustworthiness? Let us know on Twitter @TrustedReviews.

Why trust our journalism?

Founded in 2004, Trusted Reviews exists to give our readers thorough, unbiased and independent advice on what to buy.

Today, we have 9 million users a month around the world, and assess more than 1,000 products a year.


Editorial independence

Editorial independence means being able to give an unbiased verdict about a product or company, while avoiding conflicts of interest. To ensure this is possible, every member of the editorial staff follows a clear code of conduct.


Professional conduct

We also expect our journalists to follow clear ethical standards in their work. Our staff members must strive for honesty and accuracy in everything they do. We follow the IPSO Editors’ code of practice to underpin these standards.