Facebook moderators say company is asking them to ‘risk our lives’

Karissa Bell, @karissabe

November 18, 2020

In an open letter published Wednesday, a group of Facebook moderators say the company is putting them and their families at risk by asking them to return to the office in the midst of the pandemic. The content reviewers say that while workers with a doctor’s note can be excused from going to the office, those with high-risk family members don’t get the same opportunity.

“In several offices, multiple COVID cases have occurred on the floor,” the letter states. “Workers have asked Facebook leadership, and the leadership of your outsourcing firms like Accenture and CPL, to take urgent steps to protect us and value our work. You refused. We are publishing this letter because we are left with no choice.”

According to the letter-writers, Facebook is pushing moderators to go back to the office because the company’s AI-based moderation is “years away” from being truly effective.

Without informing the public, Facebook undertook a massive live experiment in heavily automated content moderation. Management told moderators that we should no longer see certain varieties of toxic content coming up in the review tool from which we work — such as graphic violence or child abuse, for example.

The AI wasn’t up to the job. Important speech got swept into the maw of the Facebook filter—and risky content, like self-harm, stayed up. 

The lesson is clear. Facebook’s algorithms are years away from achieving the necessary level of sophistication to moderate content automatically.

The letter also brings up several issues that predate the coronavirus pandemic, like the lack of mental healthcare for moderators as well as their status as contractors rather than full-time employees. Among the moderators’ demands of Facebook and the contracted companies that employ them: hazard pay, more flexibility to work from home and access to better mental healthcare.

In a statement, a Facebook spokesperson disputed some of the claims made in the letter. “We appreciate the valuable work content reviewers do and we prioritize their health and safety,” the spokesperson said. “While we believe in having an open internal dialogue, these discussions need to be honest. The majority of these 15,000 global content reviewers have been working from home and will continue to do so for the duration of the pandemic. All of them have access to health care and confidential wellbeing resources from their first day of employment, and Facebook has exceeded health guidance on keeping facilities safe for any in-office work.”

The letter underscores how the coronavirus pandemic has complicated Facebook’s massive content moderation operation. The company warned in March, at the start of the pandemic, that it would rely more heavily on automation as many of its human moderators were unable to work from home. Though the company has repeatedly touted the gains made by its AI systems, its reliance on automated moderation has resulted in a number of issues. Meanwhile, Facebook has said it depends on a combination of human moderators and automated systems.

At the same time, the contract workers who do the bulk of the social network’s content moderation have long said the company doesn’t do enough to protect them. Moderators, many of whom spend their days reviewing misinformation, graphic violence and other egregious content, have criticized the company for low wages and inadequate mental healthcare. In May, Facebook paid $52 million to settle a class action lawsuit on behalf of moderators who said they developed PTSD as the result of their work. 

Updated with more details on Facebook’s moderation practices.

Engadget
