Facebook exec says it has ‘a responsibility’ to fight fake news
After a barrage of criticism over fake news stories on Facebook, CEO Mark Zuckerberg said that over 99 percent of content on the site was authentic. Zuckerberg has since backed off that sentiment slightly, admitting that fake news is indeed a major issue for the company. At Harvard’s Campaign Managers Conference this week, the company’s vice president of communications and public policy had more to say on the topic.
“For so long, we had resisted having standards about whether something’s newsworthy because we did not consider ourselves a service that was predominantly for the distribution of news,” explained Facebook’s Elliot Schrage. “And that was wrong.”
Schrage’s comments came during a panel discussion about the role of media during the 2016 US presidential election. “Until this election, our focus was on helping people share,” he said. “This election forced us to question whether we have a role in assessing the validity of content people share. And I have to tell you all — that’s a pretty damn scary role to play.”
Of course, policing content raises issues of censorship, and Facebook doesn't yet know how it should proceed. The company has already announced plans to give users easier ways to report hoaxes, develop better detection before links even hit the News Feed and cut ad revenue to “misleading, illegal and deceptive” sites. According to a recent BuzzFeed report, Facebook employees have also unofficially taken on the task of battling fake news.
“We have a responsibility here,” Schrage said. “I think we recognize that. This has been a learning for us.”
Schrage explained that Facebook isn’t interested in hiring human editors to decide what hits the News Feed. The company already moved away from having employees choose trending topics in favor of an algorithm-based approach, though fake stories still pop up there even after the switch. An earlier report claimed editors were knowingly suppressing conservative links, an allegation Facebook denied.
So how does Schrage propose Facebook alleviate the problem? First, he said the tools that let users report fake news are “not well-done” and need an overhaul. He also hinted at solutions aimed at changing user behavior rather than pulling content shared from certain sites. Schrage called it a “think before you share” program, and it sounds a bit like an awareness campaign that a large portion of the site’s more than a billion users could simply ignore.
“We’re in the business of giving users the power to share,” Schrage said. “Part of that is helping them share thoughtfully and responsibly, and consume thoughtfully and responsibly.” As Vox notes, passing the responsibility to users mirrors Twitter’s approach to abuse, and that course of action hasn’t done much to reassure people using the service.