Facebook has given scores to its users based on their trustworthiness in an attempt to fight misinformation, the Washington Post reported on Tuesday, citing a company executive.
The social media giant developed the rating system over the past year, the newspaper reported, citing an interview with Facebook product manager Tessa Lyons, who is tasked with the company’s efforts to identify malicious actors.
For years, Facebook and many other tech companies have asked users to report fake news and false stories. But as users were given more reporting power, many were caught gaming the system, flagging accurate posts as false or mixing genuine stories with fabricated ones, which has caused the company considerable trouble in recent years.
It’s “not uncommon for people to tell us something is false simply because they disagree with the premise of a story or they’re intentionally trying to target a particular publisher,” Tessa Lyons, Facebook’s product manager for fighting misinformation, told the Post.
Facebook has now built a rating system to judge users' reliability as reporters. If a user regularly flags real news as fake, that user's credibility score falls; if a user's reports help surface genuinely problematic content, the score improves.
Source(s): Reuters