What Facebook should change, according to its whistleblower
Frances Haugen said Facebook should move to chronological feeds and open up more of its research.
The whistleblower behind “bombshell” disclosures that have rocked Facebook in recent weeks spent much of Tuesday’s three-hour hearing explaining to Congress how Facebook could fix itself.
While the hearing was far from the first time a Facebook critic had briefed lawmakers, Frances Haugen’s insider knowledge and expertise in algorithm design made her particularly effective. Her background on the company’s civic integrity team meant she was intimately familiar with some of the biggest problems on Facebook.
During the hearing, Haugen spoke in detail about Facebook’s algorithms and other internal systems that have hampered its efforts to slow misinformation and other problematic content. She also praised the company’s researchers, calling them “heroes,” and said Facebook should be required to make their work public.
Remove algorithmic ranking and go back to chronological feeds
One of the most notable aspects of Haugen’s testimony was her expertise, which gives her a nuanced understanding of how algorithms work and the often unintended consequences of using them.
“I hope we will discuss as to whether there is such a thing as a safe algorithm,” Sen. Richard Blumenthal said at the start of the hearing. While Haugen never addressed that question directly, she did weigh in on the ranking algorithms that power the feeds in Facebook and Instagram. She noted that Facebook’s own research has found that “engagement-based ranking on Instagram can lead children from very innocuous topics like healthy recipes… to anorexia-promoting content over a very short period of time.”
She also said that Facebook’s AI-based moderation tools were much less effective than what the company has publicly portrayed. “We’ve seen from repeated documents within my disclosures that Facebook’s AI systems only catch a very tiny minority of offending content,” Haugen said. “Best case scenario, in the case of something like hate speech, at most they will ever get to 10 to 20%.”
To address this, Haugen said that Facebook could move to a chronological feed where posts are ordered by recency, rather than what is most likely to get engagement. “I’m a strong proponent of chronological ranking, or ordering by time with a little bit of spam demotion, because I think we don’t want computers deciding what we focus on,” Haugen said.
She noted that Facebook would likely resist such a plan because engagement-based ranking keeps people posting and commenting more, which is better for its business. “I’ve spent most of my career working on systems like engagement-based ranking,” Haugen said. “When I come to you and say these things, I’m basically damning 10 years of my own work.”
Reform Section 230
In a similar vein, Haugen said that Section 230 — the 1996 law that protects companies from being liable for what their users say and do on their platforms — should be reformed “to make Facebook responsible for the consequences of their intentional ranking decisions.” She said that such a law would likely “get rid of engagement-based ranking” because it would become too big of a liability for the company.
At the same time, she cautioned lawmakers not to let Facebook “trick” them into believing that changing Section 230 alone would be enough to address the scope of its problems. She also noted that using the law to police Facebook’s algorithms could be easier than trying to address specific types of content. “User generated content is something that companies have less control over; they have 100% control over their algorithms,” Haugen said.
The focus on Section 230 is significant because lawmakers from both parties have proposed various changes to the law. During the hearing, Blumenthal indicated that he too supported “narrowing this sweeping immunity when platforms’ algorithms amplify illegal conduct.” Sen. Amy Klobuchar has also proposed ending Section 230 protections for vaccine and health misinformation. Meanwhile, Republicans have tried to eliminate Section 230 for very different reasons.
Slow down virality
Haugen also suggested that Facebook should slow down its platform with “soft interventions” that add small bits of friction. She pointed to Twitter’s “read before sharing” prompts as the kind of measure that can reduce the spread of misinformation.
“Small actions like that friction don’t require picking good ideas and bad ideas,” she said. “They just make the platform less twitchy, less reactive. And Facebook’s internal research says that each one of those small actions dramatically reduces misinformation, hate speech and violence-inciting content on the platform.”
Facebook has taken these steps in the past. Notably, it applied these “break glass” measures in the days after the 2020 presidential election, though it rolled some of them back the following month. The company implemented similar changes again, less than a month later, in the aftermath of the January 6th insurrection.
Haugen said that Facebook has mischaracterized these changes as being harmful to free speech, when in fact the company is concerned because it “wanted that growth back.” During the hearing, she said that Mark Zuckerberg had been personally briefed on just how impactful changes like this could be. But, she said, he prioritized the platform’s growth “over changes that would have significantly decreased misinformation and other inciting content.”
Open Facebook’s research to people outside the company
Access to Facebook’s data has become a hot-button issue in recent weeks as outside researchers have complained that the company is stifling independent research. Haugen said the social network should work toward making its own internal research available to the public.
She proposed a set period of time, perhaps as long as 18 months, during which Facebook could keep its research under wraps before being required to make it accessible to outside researchers.
“I believe in collaboration with academics and other researchers that we can develop privacy-conscious ways of exposing radically more data than is available today,” Haugen said. “It is important for our ability to understand how algorithms work, how Facebook shapes the information we get to see, that we have these data sets be publicly available for scrutiny.”
She went on to say that Facebook’s researchers are among its “biggest heroes” because “they are boldly asking real questions and willing to say awkward truths.” She said it was “unacceptable” that the company has been “throwing them under the bus” in its effort to downplay her disclosures.
A dedicated ‘oversight body’
Besides internal changes, Haugen also said that there should be a dedicated “oversight body” with the power to oversee social media platforms. She said that such a group within an agency like the Federal Trade Commission could provide “a regulatory home where someone like me could do a tour of duty after working at a place like this.”
“Right now, the only people in the world who are trained to analyze these experiments, to understand what’s happening inside of Facebook, are people who grew up inside of Facebook or Pinterest or another social media company,” she said.
Importantly, this “oversight body” would be separate from the Facebook-created Oversight Board, which advises Facebook on specific content decisions. While Facebook has said the creation of the Oversight Board is proof it’s trying to self-regulate, Haugen wrote in prepared remarks that the Oversight Board “is as blind as the public” when it comes to truly knowing what happens inside of the company.
It’s also worth noting that Haugen said she was opposed to efforts to break up Facebook. She said that separating Facebook and Instagram would likely result in more advertisers flocking to Instagram, which could deplete Facebook’s resources for making changes to improve its platform.
What’s next
While it’s unclear which, if any, of Haugen’s recommendations Congress will act on, her disclosures have already caught the attention of regulators. In addition to providing documents to Congress, she has also given documents to the Securities and Exchange Commission. She has alleged that Zuckerberg and other executives have “misled investors and the public about its role perpetuating misinformation and violent extremism relating to the 2020 election and January 6th insurrection,” according to SEC filings published by 60 Minutes.
Meanwhile, Facebook has continued to push back on Haugen’s claims. A week after an executive told lawmakers that “this is not bombshell research,” the company tried to discredit Haugen more directly. In a statement, Facebook Director of Policy Communications Lena Pietsch said Haugen “worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives. We don’t agree with her characterization of the many issues she testified about.” Pietsch added that “it’s time to begin to create standard rules for the internet.”
In an appearance on CNN following the hearing, Facebook VP Monika Bickert referred to Haugen’s disclosures as “stolen documents” and said the company’s research had been “mischaracterized.” Later that night, Zuckerberg publicly weighed in for the first time since The Wall Street Journal began publishing stories based on Haugen’s disclosures. (He did once refer to earlier coverage of the scandals, complaining that a news article had mistakenly described his hydrofoil as an “electric surfboard.”) In his first substantive statement, he said “many of the claims don’t make any sense,” and that “the argument that we deliberately push content that makes people angry for profit is deeply illogical.”
It could still get more difficult for Facebook to counter Haugen, though, particularly if new documents become public. Her letter to the SEC suggests that Facebook knew much more about QAnon and violent extremism on its platform than it let on, as Vice reported earlier. Haugen may also appear in front of lawmakers in other countries. European lawmakers, many of whom have expressed concerns similar to those of their US counterparts, have indicated they want to talk to Haugen and conduct new investigations of their own.