As the right catches election-fraud fever, Facebook denies its role in spreading lies
Kevin Roose of The New York Times uses CrowdTangle to keep a running tab on which news outlets get the most likes, shares, and comments on their Facebook posts. In 2020 his list has been consistently dominated by right-wing outlets (think Breitbart and Newsmax) that have a habit of spreading half-truths or full-on misinformation. Right now some of the most engaged Facebook pages are busy disseminating the narrative that the election was rigged against Donald Trump.
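For readers curious about what a tally like Roose’s involves, here is a minimal, hypothetical sketch in Python, not his actual pipeline: it assumes you already have post-level records exported from a tool like CrowdTangle, and the field names (page, likes, shares, comments) are stand-ins rather than the real export schema. It simply sums interactions per page and prints a leaderboard.

```python
# Hypothetical sketch: rank pages by total engagement (likes + shares + comments)
# from already-exported post records. Field names are assumptions, not
# CrowdTangle's actual schema, and the numbers are invented.
from collections import defaultdict

posts = [
    {"page": "Example Outlet A", "likes": 120_000, "shares": 45_000, "comments": 30_000},
    {"page": "Example Outlet B", "likes": 80_000, "shares": 20_000, "comments": 15_000},
    {"page": "Example Outlet A", "likes": 60_000, "shares": 25_000, "comments": 10_000},
]

# Sum all interactions for each page.
totals = defaultdict(int)
for post in posts:
    totals[post["page"]] += post["likes"] + post["shares"] + post["comments"]

# Print a simple leaderboard, highest total engagement first.
for rank, (page, engagement) in enumerate(
    sorted(totals.items(), key=lambda item: item[1], reverse=True), start=1
):
    print(f"{rank}. {page}: {engagement:,} interactions")
```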
Facebook VP of Analytics and Chief Marketing Officer Alex Schultz wrote in a Tuesday blog post that this “engagement” data isn’t a good measurement of what people are really seeing on Facebook. At first blush, that seems odd given that CrowdTangle’s home page describes it as a “tool from Facebook to help follow, analyze, and report on what’s happening across social media.”
“Most of the content people see there, even in an election season, is not about politics,” Schultz writes. “In fact, based on our analysis, political content makes up about 6% of what you see on Facebook.”
Schultz also points out that Roose tracks posts with links to articles on the web, not all posts. He says that if you look at data for all posts, pages from Joe Biden and Occupy Democrats appear in the top ten (#9 and #6, respectively). Even so, the number one page is still Donald Trump’s, and the rest of the list remains dominated by right-wing pages.
Schultz says that “reach”—that is, the number of impressions—is a much better stat than engagement for showing what most people are seeing in their news feeds. Facebook provides some limited impressions data for its ads, but none for regular posts. The company treats reach as proprietary information, which makes it tough for anyone on the outside to gauge what that data would actually reveal about Facebook.
Reach may show how many times a piece of content passed before users’ eyeballs, but engagement tells its own story: reach counts passive scrolling through the news feed, while engagement captures the moments when users care enough to take action on a post. In a political sense, that’s meaningful.
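To make that distinction concrete, the illustration below (with invented figures, not real Facebook data) divides interactions by impressions to get an engagement rate: a post can reach far fewer people yet provoke action from a much larger share of those who see it.

```python
# Illustrative only: the figures are invented to show how reach and
# engagement can diverge; this is not real Facebook data.
posts = {
    "benign meme":       {"reach": 5_000_000, "engagement": 40_000},
    "voter-fraud rumor": {"reach":   800_000, "engagement": 120_000},
}

for name, stats in posts.items():
    # Engagement rate = interactions divided by impressions.
    rate = stats["engagement"] / stats["reach"]
    print(f"{name}: reach={stats['reach']:,}, "
          f"engagement={stats['engagement']:,}, rate={rate:.1%}")

# The meme is seen by far more people, but the rumor prompts action
# from a much larger fraction of the people who see it.
```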
A glaring problem with both the engagement and reach data is that neither is publicly available for content posted within private Facebook Groups, which have quickly grown in popularity in the last two years.
The New York Times’s Davey Alba reports that according to CrowdTangle data, only about 2.5% of engagement (likes, shares, and comments) for posts claiming “Biden lost Pennsylvania” happened within public view on Facebook, while the rest occurred within Groups, beyond the view of the public and researchers. Facebook’s Schultz doesn’t directly address that issue in his post.
In a last-ditch effort to assert that Facebook wasn’t overrun with misinformation this political season, Schultz tells us that after Biden was declared the winner on Saturday, Americans began using way more “heart” emoji on political posts, while “angry reactions were closer to the baseline.”
The real point of Schultz’s blog post was not to nitpick about social analytics. Facebook is trying to undercut Roose’s reporting in the hopes of downplaying its major role in the dissemination of right-wing misinformation about the presidential election. At the moment Facebook is providing a sprawling distribution channel for hundreds of unverified, baseless reports of voter fraud.
Some of these come directly from the page of Donald Trump, which according to CrowdTangle had the second-highest engagement of any page for the full month of October, behind only controversial evangelical leader Jerry Falwell Jr.
Others come from Trump enablers. On Monday, two big names on the right—Trump’s personal attorney Rudy Giuliani and former Florida attorney general Pam Bondi—tweeted the false claim that Real Clear Politics had “rescinded” its projection that Biden won the presidency. Even though RCP pointed out that it had projected no winner, the tweets lent the claim a veneer of credibility. A YouTube channel called “Next News Network” then packaged the claim into a video, which was shared 900,000 times on Facebook.
Facebook was concerned enough about disinformation in its ad system to ban new political ads in the week before and the week after the election. One week after election night, it looks like the real danger may lie in regular posts, many of which are spreading behind closed doors in Groups.