Facebook whistleblower hearing: ‘Facebook and big tech are facing a big tobacco moment’
‘Facebook can change, but it’s clearly not going to do so on its own.’
The Facebook whistleblower who has provided a trove of internal documents to Congress and the Securities and Exchange Commission is testifying about research she says proves the social network has repeatedly lied about its platform. The documents were the basis for The Wall Street Journal’s reporting on Facebook’s controversial rules for celebrities, and the disastrous effect of Instagram on some teens’ mental health.
“Facebook and big tech are facing their big tobacco moment,” committee chairman Sen. Richard Blumenthal said at the start of the hearing. “Facebook knows its products can be addictive and toxic to children. They value their profit more than the pain that they cause to children and their families.”
In her opening statement, Frances Haugen, the former Facebook product manager turned whistleblower, said that the company has ignored much of its own research and is “buying its profits with our safety.” She urged Congress to adopt new regulations to limit the company’s power. “The choices being made inside of Facebook are disastrous for our children, our public safety, our privacy and for our democracy,” Haugen said. “And that is why we must demand Facebook make changes.”
She said Facebook’s unwillingness to make data available to anyone outside of its own research teams has helped the company mislead the public. “The company intentionally hides vital information from the public, from the US government, and from governments around the world,” Haugen said. “The documents I have provided to Congress prove that Facebook has repeatedly misled the public about what its own research reveals about the safety of children, the efficacy of its artificial intelligence systems, and its role in spreading divisive and extreme messages.”
She also said that Congress should not be swayed by Facebook’s insistence on “false choices,” and that simply reforming privacy laws or Section 230 would not go far enough. “We can afford nothing less than full transparency,” Haugen said. “Facebook wants you to believe that the problems we’re talking about are unsolvable… Facebook can change, but it’s clearly not going to do so on its own.”
Haugen’s appearance comes days after Facebook sent its head of safety, Antigone Davis, to testify in front of the same committee. She and other executives have repeatedly tried to downplay the company’s research, with Davis saying that the documents “were not bombshell research.” In Tuesday’s hearing, some senators called out Mark Zuckerberg, saying that they should be hearing from him instead. “Rather than taking personal responsibility, showing leadership, Mark Zuckerberg is going sailing,” Blumenthal said, in an apparent reference to a recent Facebook post from the CEO.
Even though he wasn’t in attendance, Zuckerberg’s decisions came up throughout the hearing. Haugen said that several documents she uncovered showed that the Facebook founder “chose metrics defined by Facebook like meaningful social interactions over changes that would have significantly decreased misinformation and other inciting content.”
She said that last April, Zuckerberg was “directly presented with a list of soft interventions” Facebook could take that would make its platform “less viral, less twitchy,” and that he opted not to make the changes because they would have hurt the platform’s “meaningful social interactions” metric. She added that even when Facebook does intervene to slow down its platform, the company’s AI has trouble detecting the relevant content.
“When rioting began in the United States in the summer of last year, they turned off downstream MSI [meaningful social interactions] only when they detected content was health content, which is probably COVID, and civic content,” Haugen said. “But Facebook’s own algorithms are bad at finding this content.”
In a statement following the hearing, Facebook’s Director of Policy Communications, Lena Pietsch, tried to downplay Haugen’s knowledge of the company, writing that she “worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives — and testified more than six times to not working on the subject matter in question.”
“We don’t agree with her characterization of the many issues she testified about. Despite all this, we agree on one thing: it’s time to begin to create standard rules for the internet. It’s been 25 years since the rules for the internet have been updated, and instead of expecting the industry to make societal decisions that belong to legislators, it’s time for Congress to act.”