Leaked document indicates Facebook may be underreporting images of child abuse
Moderators have raised concerns about how they are trained to determine someone’s age.


A training document used by Facebook’s content moderators raises questions about whether the social network is underreporting images of potential child sexual abuse, The New York Times reports. The document reportedly tells moderators to “err on the side of an adult” when assessing images, a practice that moderators have taken issue with but company executives have defended.
At issue is how Facebook moderators should handle images in which the age of the subject is not immediately obvious. That decision can have significant implications, as suspected child abuse imagery is reported to the National Center for Missing and Exploited Children (NCMEC), which refers images to law enforcement. Images that depict adults, on the other hand, may be removed from Facebook if they violate its rules, but aren’t reported to outside authorities.
But, as The NYT points out, there isn’t a reliable way to determine age based on a photograph. Moderators are reportedly trained to use a more than 50-year-old method to identify “the progressive phases of puberty,” but the methodology “was not designed to determine someone’s age.” And since Facebook’s guidelines instruct moderators to assume that photos they aren’t sure about depict adults, moderators suspect many images of children may be slipping through.
This is further complicated by the fact that Facebook’s contract moderators, who work for outside firms and don’t get the same benefits as full-time employees, may have only a few seconds to make a determination, and may be penalized for making the wrong call.
Facebook, which reports more child sexual abuse material to NCMEC than any other company, says erring on the side of adults is meant to protect users’ privacy and to avoid false reports that may hinder authorities’ ability to investigate actual cases of abuse. The company’s head of safety, Antigone Davis, told the paper that making false reports could also expose Facebook to legal liability. Notably, not every company shares Facebook’s philosophy on this issue. Apple, Snap and TikTok all reportedly take “the opposite approach” and report images when they are unsure of an age.