Facebook begins rating users on how trustworthy they are at flagging fake news!

Facebook has started rating its users’ trustworthiness to help the social network decide how much weight to give user reports that a certain news story might be fake. The system certainly sounds a touch dystopian, but Facebook sees it as a valuable tool for weeding out disinformation.

The trust ratings were rolled out over the past year and were developed as part of Facebook’s fight against fake and malicious stories. Facebook relies, in part, on reports from users to help catch these stories. If enough people report a story as false, someone on a fact-checking team will look into it. But checking every story that racks up “fake news” reports would be overwhelming, so Facebook uses other information to decide what it should and shouldn’t bother looking into.

One of those signals is the trust rating. Facebook didn’t tell The Washington Post everything that goes into the score, but it is partly based on a user’s track record of reporting stories as false. If someone regularly reports stories that a fact-checking team later confirms to be false, their trust score goes up; if they regularly report stories that later turn out to be true, it goes down.
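Facebook hasn’t published the details, but the mechanism described above can be illustrated with a small, purely hypothetical sketch: a reliability score that rises when a user’s reports are later confirmed by fact-checkers and falls when they aren’t, plus a triage check that only queues a story for human review once enough reasonably reliable users have flagged it. Every function name and threshold below is an assumption for illustration, not Facebook’s actual system.

```python
# Hypothetical sketch only: a 0-1 "reliability" score per reporter,
# nudged up or down by fact-check outcomes, and a simple triage rule.

def update_reliability(score: float, report_confirmed: bool,
                       step: float = 0.05) -> float:
    """Raise the score when a report is confirmed false by fact-checkers,
    lower it when the story turns out to be true; clamp to [0, 1]."""
    score += step if report_confirmed else -step
    return min(1.0, max(0.0, score))


def should_prioritize(report_count: int, avg_reporter_score: float,
                      count_threshold: int = 20,
                      score_threshold: float = 0.6) -> bool:
    """Queue a story for human fact-checking only if enough reasonably
    reliable users have flagged it."""
    return report_count >= count_threshold and avg_reporter_score >= score_threshold


if __name__ == "__main__":
    score = 0.5                                                 # new user starts in the middle
    score = update_reliability(score, report_confirmed=True)    # confirmed report: score rises
    score = update_reliability(score, report_confirmed=False)   # rejected report: score falls
    print(round(score, 2))                                      # back to 0.5
    print(should_prioritize(report_count=30, avg_reporter_score=0.7))  # True
```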

In that sense, this may be less of a “trust” score and more of a “fact-check” score, and the name isn’t likely to do it any favors. Algorithms are often flawed and can have larger, deleterious effects that aren’t immediately visible, so Facebook will have to be careful about what other information it factors in and how else this score is used, lest it accidentally discount reports from a specific community of people.
