The social media giant says it's going to rely heavily on third-party fact-checkers and on users flagging fake news when they come across it. To guard against biased censorship, every fact-checking organization working with Facebook has signed on to the International Fact-Checking Network's code of principles at Poynter, an independent global media institute.
Now, when a user tries to share a fake news story, an alert will pop up letting that user know that the third-party fact-checkers have disputed its accuracy. If the user decides to post anyway, a red warning will appear at the bottom of the post, flagging it as disputed. Users can also click on the flag to learn more about why it was disputed.
People will also be able to report fake news to Facebook and to fellow users, in the same way they can already flag images. Facebook will then send that user-generated data to the small team of fact-checkers from FactCheck.org, PolitiFact, and The Washington Post to help them better track where fake news is coming from.
"It's important to us that the stories you see on Facebook are authentic and meaningful," Facebook's News Feed VP, Adam Mosseri, wrote in the announcement. "We believe in giving people a voice and that we cannot become arbiters of truth ourselves, so we're approaching this problem carefully. We'll learn from these tests, and iterate and extend them over time."
Thumbnail photo via Pixabay