After Texas Shooting, YouTube Is Once Again A Conspiracy Breeding Ground
On Sunday, November 5, 2017, a man in Sutherland Springs, Texas, opened fire in a local Baptist church and killed 26 people. Facts are still being discovered, and a full picture of the tragedy, and the gunman’s motive, is far from complete. But if you turn to places like YouTube, you’ll find a conspiracy-laden dystopia of alternative facts at your fingertips.
Currently, if you type “texas shooting” into YouTube’s search bar, the first autocomplete suggestion is “texas shooting antifa”–which produces a flurry of videos falsely tying the gunman, Devin Patrick Kelley, to the antifascist movement. Similar videos show up describing the massacre as a “false flag,” a term used by conspiracy theorists to describe a staged event meant to distract from a larger truth.
Many of these videos have already amassed tens of thousands of views, even though YouTube recently vowed to crack down on extremist content by making it less prominent in search results and recommendations.
In a statement to Fast Company, YouTube wrote:
“We’re continuing to invest in new features and changes to YouTube search that provide authoritative results when people come to YouTube looking for news. So far this year we have introduced new features that promote verified news sources when a major news event happens. These sources are presented on the YouTube homepage, under ‘Breaking News,’ and featured in search results, with the label ‘Top News’. Additionally we’ve been rolling out algorithmic changes to YouTube search during breaking news events. There is still more work to do, but we’re making progress.”
The cycle of events follows a pattern: At a time when trust in the news media is at an all-time low, YouTube has become one of the world’s most popular breeding grounds for conspiracy theories. Last week, for instance, dozens of videos on the site discussed an alleged “Antifa Civil War”–supposedly planned for November 4–in which far-left activists would wage a violent uprising against their ideological opponents. This event, of course, did not happen–but if you searched “antifa” in YouTube’s search bar last week, the first results were “antifa november 4,” “antifa violence,” and “antifa civil war.”
Similarly, after the deadly attack in Las Vegas last month, numerous videos–some amassing hundreds of thousands of views–were scattered across YouTube’s search results, peddling false narratives about the shooter, the attack, and even the victims who survived the carnage. Such conspiracies have led to real-world consequences: victims of the shooting now face an onslaught of threats and harassment at the hands of conspiracy theorists who insist the whole thing was staged, as the Guardian reported last month.
When asked about the antifa civil war results, a YouTube spokesperson provided this statement:
“We have strict policies that prohibit certain types of content, including hate speech, promoting violence, harassment and more. We remove content flagged by our community that violates those policies. We also terminate the accounts of users who repeatedly violate our Guidelines.”
The problem isn’t limited to YouTube; it’s an issue facing Google’s search engine as well: One of its first recommended terms after typing “Devin Kelley” is “Devin Kelley antifa.” (Thankfully, the first result that comes up is a Snopes article.) Shortly after Sunday’s church massacre, the site was also pointing users to conspiratorial tweets from conservative activists via its “Popular on Twitter” feature.
24 hours after mass shooting in Texas, Google is still auto-completing to inflammatory fake news @sundarpichai pic.twitter.com/5wO705neni
— Miguel Helft (@mhelft) November 6, 2017
Despite Google’s claims that it’s working to fix its search results and bring more trustworthy videos to the front, misinformation still proliferates within hours of any catastrophe. YouTube boasts over 1.5 billion monthly users, and many of them likely use the site to find news. If Google doesn’t act aggressively–or if it implements changes too slowly–the platform runs the risk of being practically taken over by hoaxes and conspiracies.
Whatever happens, it’s clear the tech giant can no longer hide behind the long-standing Silicon Valley rationale that it’s a neutral platform that values all ideas equally.