YouTube Tries To Reduce Spread of ‘Borderline Content,’ Misinformation


January 25, 2019


YouTube is taking steps to reduce the spread of content that promotes conspiracy theories and other fringe content. 

In a blog post Friday, the company said it will work to reduce recommendations of such videos, though it will not remove them from the platform.

“[YouTube will be] taking a closer look at how we can reduce the spread of content that comes close to — but doesn’t quite cross the line — of violating our Community Guidelines,” the blog post says. “To that end, we’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways — such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.”

YouTube says the shift will affect only around 1% of all videos on the platform, though with more than 300 hours of video uploaded every minute, that may still represent a significant amount of content.

The company will use a combination of machine learning and human reviewers to make the change, which will roll out gradually, beginning in the U.S.

In the blog post, YouTube says the change “strikes a balance between maintaining a platform for free speech and living up to our responsibility to users.”

The algorithm tweak follows reports from news outlets such as BuzzFeed News and The Atlantic that attempted to track how YouTube's recommendation algorithm keeps people watching. Their reporting showed that users who watched benign clips from popular news or entertainment programs would subsequently be fed recommendations for videos that ultimately led to fringe or conspiracy content.

MediaPost.com: Search Marketing Daily
