Facebook will hire 3,000 moderators to tackle livestreamed violence
The social network gets serious about the safety of its community.
Following a spate of suicides and murders that were streamed or hosted on Facebook for hours before being taken down, Mark Zuckerberg has announced that the company will hire an additional 3,000 people for its global community operations team over the next year. That will bring the department to 7,500, with the added staff dedicated to reviewing “the millions of reports we get every week, and improv(ing) the process for doing it quickly.”
Zuckerberg wrote that these reviewers will “help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation,” and that the social network will continue working with law enforcement and local community groups who “are in the best position to help someone if they need it.”
In addition, Facebook will make it simpler for members to report problems and speed up the process for its reviewers to determine which posts violate community standards. The company previously opened up access to its suicide-prevention tools to all its users, and developed an AI system to identify potentially suicidal people.
One of the biggest criticisms of Facebook over the recent incidents has been how slowly it addressed the problematic content on its video platform. Zuckerberg appears to acknowledge that issue in his post, saying, “If we’re going to build a safe community, we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down.”
Hiring more people over the course of a year is a slow fix, but it should hopefully make for speedier responses to such situations in the future.