Looking back, “delay, deny and deflect” has always been part of Facebook’s DNA
As important as the revelations in last week’s New York Times’s Facebook bombshell are, they’re more meaningful–and less surprising–when taken in context. Facebook’s move-fast-and-break-things management style, and its comfort level with various shades of subterfuge, go way back in the history of the company. Facebook’s whole business model, after all, rests on a con game: Facebook provides a “free” social network, and says as little as possible about the personal data it harvests from users as payment. That data is necessary to power Facebook’s enormously lucrative advertising machine. In classic corporate double-think, the company’s executives see this as a “virtuous cycle.”
When you hold these facts in mind, the events, decisions, actions, and inactions described in the Times piece make a lot more sense. In the wake of the realization that Russian trolls had turned Facebook into a vast disinformation platform, Facebook’s executives, we learn, were far more interested in “managing” the reaction to that bad news than in quickly stopping the infestation and informing all stakeholders.
According to the story, Facebook’s COO Sheryl Sandberg and CEO Mark Zuckerberg ignored warnings about the Russian troll invasion and then ordered the suppression of details about it. The two remained unaware of the Russian interference for as much as a year after their own security people detected evidence of it. When security chief Alex Stamos finally got the attention of the executives, Sandberg’s response was to scold him for looking into it without permission (she said it left the company legally exposed). She later flew into a rage when Stamos went into detail about the infestation with members of the board without telling her first. At one point, Sandberg decided to leave up the pages of known Russian hackers that published divisive far-right content, because she feared angering the conservative establishment and tipping off Facebook users that they’d been reading and sharing fake content.
She worked through U.S. Senate Minority Leader Chuck Schumer (D-NY) to urge Senator Mark Warner (D-VA) to slow down his Senate Intelligence Committee’s aggressive investigation of Facebook’s role in the 2016 election meddling. (Warner’s press secretary gave me a quick “no comment” when I asked her to confirm that Schumer had asked Warner to back off.) We also learned that Zuckerberg was absent from (or distracted from) meetings where key decisions were made on how to contain (rather than communicate) the true extent of the Russian infestation and its effects.
On numerous occasions over the last year and a half, Zuckerberg and Sandberg have contritely said that Facebook was “slow to get ahead of” the Russian hijacking of its network. That in itself suggests either dishonesty or negligence. When Facebook first found evidence of coordinated Russian meddling in spring 2016, the company had “no policy on disinformation or any resources dedicated to searching for it,” the New York Times reports. But the Russians had used Facebook in this way before. Facebook knew that the Russian government had run a disinformation campaign on its social network to undermine Ukrainian president Petro Poroshenko in spring 2015.
For most of its history, Facebook has employed skillful public relations strategists to “manage” news of controversial changes to the social network. The company has repeatedly made important user-facing changes without asking its users. Its spokespeople and executives have spoken fervently of the supposed user benefits of such changes (and not the privacy implications), when the true purpose of the changes was to make Facebook a more efficient data collection apparatus or a more addictive social platform. Occasionally a new feature has benefited both users and advertisers, but the advertising business always seems to come first. Facebook often has responded to user backlash by apologizing and making small concessions, but rarely by totally reversing a decision. Only during the past year, under pressure from regulators, has Facebook begun paring back some of its finer audience-targeting capabilities.
So we shouldn’t be too surprised to learn that Sandberg and others tried to keep information about the extent of the company’s Russian troll problem from the public and U.S. Congress, or that they hired an opposition research firm to deflect attention from Facebook by planting negative stories about the company’s critics and rivals. Such tactics are part of Facebook’s culture. Transparency doesn’t come naturally to it.
Nor should it be surprising that Facebook’s executives have become more aggressive with these tactics since 2016.
More than any other single event, the Cambridge Analytica scandal made Facebook users aware that their data is being vacuumed up for all kinds of purposes, some of them harmful. Most people believe Cambridge Analytica used Facebook user data to help find votes for Donald Trump during the 2016 presidential election. (The Facebook data models were not used, my sources tell me, but that hardly matters now.) To many, the idea that their personal data could be used without their knowledge or permission to support a political cause they might not agree with was offensive. In a wider sense, Facebook allowed its platform to become a key enabler of the tribalism and polarization that have poisoned the national conversation. Because politically toxic content gets more user engagement, Facebook ends up profiting from it. As a result, people and lawmakers now have a much better understanding of Facebook’s “surveillance capitalism” business model, and Zuckerberg’s and Sandberg’s smooth “we connect the world” patter has begun to sound hollow.
The company’s deepest fear is losing users. Without them, the personal data in Facebook’s “social graph” stagnates and loses its power to target ads. And indeed, Facebook’s user growth has slowed. Young people are deleting the Facebook app from their phones in record numbers. Facebook executives likely feared that if information about the disastrous Russian troll infestation were released too quickly or in the wrong way, consumers would begin to see Facebook as a harmful vice and leave the network in droves. No doubt Facebook also feared new government regulations, or the growing bipartisan calls to break up the company, which now collects the personal data of 2.2 billion people around the world.
Facebook has denied as much of the New York Times story as it could within the confines of believability. It didn’t deny the specifics of the story, but rather took issue with the story’s general suggestions about the intent of the executives’ decisions at the time they were made. Facebook denied things, in other words, that can’t be proven without being inside the heads of the people in question. It also denied a few things that weren’t even claimed by the story. By and large, the New York Times story was fabulously reported and remained factually intact by the end of the week.
As much as those of us in the tech media like to focus on the human side of tech companies–the creativity and innovation, the personalities–we’re reminded from time to time that tech companies are just businesses: amoral and profit-driven at the end of the day. The Times’s story about Facebook was an essential reminder of that uncomfortable truth.