Facebook Boosts Fact Checkers’ Abilities To Identify Misinformation
Making it easier for fact-checkers to flag false content, Facebook is piloting a new program that will enlist the help of groups of users.
“The program will have community reviewers work as researchers to find information that can contradict the most obvious online hoaxes or corroborate other claims,” Henry Silverman, a Facebook product manager, explains in a new blog post.
Though not official Facebook employees, community reviewers will be hired as contractors through one of the tech titan’s partners. They will not have final say on the validity of content.
Rather, their findings will be shared with third-party fact-checkers, each of whom is responsible for doing their own official reviews.
“For example, if there is a post claiming that a celebrity has died and community reviewers don’t find any other sources reporting that news … they can flag that the claim isn’t corroborated,” according to Silverman. “Fact-checkers will then see this information as they review and rate the post.”
As part of its war on misinformation, Facebook has been developing this particular effort for a number of months.
During that time, the company has worked with experts across various fields to learn how to better support its fact-checking partners in reviewing content more quickly.
Assisting with the cause, Facebook’s machine-learning model will identify potential misinformation using a variety of signals. These include comments on posts that express disbelief, and whether a post is being shared by a Page that has spread misinformation in the past.
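Silverman’s post does not go into implementation detail, but a signal-based flagging step like the one he describes could look roughly like the sketch below. This is purely illustrative: the phrase list, weights, threshold, and function names (disbelief_ratio, misinformation_score, should_route_to_reviewers) are assumptions for the example, not Facebook’s actual code.

```python
# Hypothetical sketch of combining two of the signals mentioned above:
# comments expressing disbelief, and a Page's history of spreading
# misinformation. Names, weights, and thresholds are assumptions.

DISBELIEF_PHRASES = ("fake", "hoax", "not true", "this is false")

def disbelief_ratio(comments):
    """Fraction of comments that contain a disbelief phrase."""
    if not comments:
        return 0.0
    flagged = sum(
        any(phrase in comment.lower() for phrase in DISBELIEF_PHRASES)
        for comment in comments
    )
    return flagged / len(comments)

def misinformation_score(comments, page_prior_strikes, strike_weight=0.15):
    """Blend the comment signal and the Page-history signal into a score in [0, 1]."""
    history_signal = min(1.0, page_prior_strikes * strike_weight)
    return 0.6 * disbelief_ratio(comments) + 0.4 * history_signal

def should_route_to_reviewers(comments, page_prior_strikes, threshold=0.5):
    """Decide whether a post gets sent to community reviewers."""
    return misinformation_score(comments, page_prior_strikes) >= threshold
```

In a real system the score would come from a trained model over many more signals; the point here is simply that multiple weak indicators are combined before a post is routed to human reviewers.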
If there is an indication that a post may contain misinformation, it will be sent to different groups of community reviewers, who will be asked to identify the main claim in the post. They will then research other sources that either support or refute that claim, much the way a Facebook user might search for other news articles to assess whether the main claim in a post is believable.
Fact-checking partners will then see the collective assessment of community reviewers as a signal in selecting which stories to review and rate.
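Again purely as an illustration, and not based on any published Facebook code, one way that collective assessment might be summarized before being surfaced to fact-checkers is sketched below. The labels ("corroborated", "not_corroborated", "unsure"), the minimum review count, and the function name collective_assessment are all assumed for the example.

```python
# Hypothetical sketch: aggregating individual community reviewers'
# findings into a single collective assessment that fact-checking
# partners could use when choosing which stories to review and rate.
from collections import Counter

def collective_assessment(reviewer_findings, min_reviews=3):
    """Summarize reviewer findings for a single post.

    reviewer_findings: list of strings, each one of
    "corroborated", "not_corroborated", or "unsure".
    """
    counts = Counter(reviewer_findings)
    total = sum(counts.values())
    if total < min_reviews:
        return {"status": "insufficient_reviews", "reviews": total}
    top_label, top_count = counts.most_common(1)[0]
    return {
        "status": top_label,
        "agreement": top_count / total,
        "reviews": total,
    }

# Example: most reviewers could not find other sources reporting the claim,
# so the post surfaces to fact-checkers as "not_corroborated".
print(collective_assessment(["not_corroborated", "not_corroborated", "unsure"]))
```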
Facebook is piloting the new program in the United States over the coming months. It plans to evaluate how it’s working through its own research, along with help from academics and third-party fact-checking partners.