A New Zealand shooting video hit YouTube every second this weekend

Christine Fisher, @cfisherwrites

March 18, 2019
In the 24 hours after the mass shooting in New Zealand on Friday, YouTube raced to remove videos that were uploaded as fast as one per second, reports The Washington Post. While the company will not say how many videos it removed, it joined Facebook, Twitter and Reddit in a desperate attempt to remove graphic footage from the shooter’s head-mounted camera.

The speed at which the videos were uploaded forced YouTube to take unprecedented measures. Under standard protocol, YouTube's software flags troublesome content, which human moderators then review. But because that system was inundated, YouTube let the AI software both flag and remove content it suspected was problematic. As Neal Mohan, YouTube's chief product officer, told The Washington Post, the trade-off was that non-problematic content got swept up and deleted, too.
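To make the trade-off concrete, here is a minimal sketch of a flag-then-review pipeline of the general kind described above. This is purely illustrative, not YouTube's actual system; the threshold values, the `Upload` type, and the `moderate` function are all hypothetical. The point it shows is that skipping human review and acting directly on classifier scores removes content faster at the cost of deleting false positives.

```python
# Illustrative sketch only -- not YouTube's actual pipeline. It models the
# trade-off described above: routing flagged uploads to human review is
# accurate but slow, while auto-removing on the classifier score alone is
# fast but sweeps up false positives.

from dataclasses import dataclass


@dataclass
class Upload:
    video_id: str
    score: float  # hypothetical classifier confidence that the content violates policy


REVIEW_THRESHOLD = 0.5       # hypothetical: flag for human review above this score
AUTO_REMOVE_THRESHOLD = 0.8  # hypothetical: remove without review above this score


def moderate(upload: Upload, review_queue_overloaded: bool) -> str:
    """Return the action taken for one upload."""
    if upload.score < REVIEW_THRESHOLD:
        return "publish"
    if review_queue_overloaded and upload.score >= AUTO_REMOVE_THRESHOLD:
        # Emergency mode: the classifier both flags and removes.
        # Some legitimate videos (false positives) get deleted too.
        return "auto-remove"
    return "queue-for-human-review"


if __name__ == "__main__":
    for u in [Upload("a", 0.3), Upload("b", 0.85), Upload("c", 0.6)]:
        print(u.video_id, moderate(u, review_queue_overloaded=True))
```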

When that wasn't enough, YouTube also disabled the option to search for "recent uploads." Both that search feature and human review of flagged content remain disabled. As an added challenge, many of the videos were altered in ways that made them hard for YouTube's AI to recognize. And while YouTube tries to direct users to authoritative news sources during crises, for hours after the attack, footage could be found simply by searching "New Zealand."
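The article doesn't describe YouTube's matching internals, but the evasion problem is easy to see with exact hashing: altering even one byte of a file, as any re-encode, crop, or filter will, produces a completely different digest, which is why matching modified copies requires perceptual, feature-based techniques rather than exact fingerprints. A minimal demonstration of the exact-hash failure:

```python
# Demonstrates why exact-hash matching fails against altered re-uploads:
# flipping a single byte changes the SHA-256 digest entirely, so catching
# modified copies requires perceptual/feature-based matching instead.
import hashlib

original = b"...video bytes..."          # stand-in for real video data
altered = bytearray(original)
altered[0] ^= 0x01                       # a one-byte change, e.g. from re-encoding

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(bytes(altered)).hexdigest())  # completely different digest
```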

YouTube has been working for years to improve its system for flagging problematic content. In 2017, Google announced it would hire 10,000 YouTube content moderators. At that time, its AI could help take down 70 percent of violent, extremist content within eight hours of upload. But as we saw after the Parkland shooting last year, even the company's human moderation still needs work. Unfortunately, this is an ongoing issue, as mass shootings and extremist content continue to spread around the globe. For the time being, neither Facebook, YouTube, Twitter nor Reddit can offer a true solution.
