Mozilla urges WhatsApp to combat misinformation ahead of global elections

The foundation says Meta’s election-related efforts focus overwhelmingly on Facebook even though WhatsApp has nearly as many users.


In 2024, four billion people — about half the world’s population — in 64 countries, including large democracies like the US and India, will head to the polls. Social media companies like Meta, YouTube and TikTok have promised to protect the integrity of those elections, at least as far as discourse and factual claims made on their platforms are concerned. Missing from the conversation, however, is the closed messaging app WhatsApp, which now rivals public social media platforms in both scope and reach. That absence has researchers from the non-profit Mozilla worried.

“Almost 90% of the safety interventions pledged by Meta ahead of these elections are focused on Facebook and Instagram,” Odanga Madung, a senior researcher at Mozilla focused on elections and platform integrity, told Engadget. “Why has Meta not publicly committed to a public road map of exactly how it’s going to protect elections within [WhatsApp]?”

Over the last ten years, WhatsApp, which Meta (then Facebook) bought for $19 billion in 2014, has become the default way for most of the world outside the US to communicate. In 2020, WhatsApp announced that it had more than two billion users around the world — a scale that dwarfs every other social or messaging app except Facebook itself.

Despite that scale, Meta’s election-related safety measures have mostly focused on Facebook. Mozilla’s analysis found that Facebook has made 95 election-related policy announcements since 2016, the year the social network came under scrutiny for helping spread fake news and foster extreme political sentiments, while WhatsApp has made only 14. By comparison, Google and YouTube made 35 and 27 announcements each, while X and TikTok had 34 and 21 announcements respectively. “From what we can tell from its public announcements, Meta’s election efforts seem to overwhelmingly prioritize Facebook,” wrote Madung in the report.

Mozilla is now calling on Meta to make major changes to how WhatsApp functions during polling days and in the months before and after a country’s elections. The changes include adding disinformation labels to viral content (“Highly forwarded: please verify” instead of the current “forwarded many times”), restricting the broadcast and Communities features that let people blast messages to hundreds of people at the same time, and nudging people to “pause and reflect” before they forward anything. More than 16,000 people have signed Mozilla’s pledge asking WhatsApp to slow the spread of political disinformation, a company spokesperson told Engadget.

WhatsApp first started adding friction to its service after dozens of people were killed in India, the company’s largest market, in a series of lynchings sparked by misinformation that went viral on the platform. The measures included limiting the number of people and groups that users could forward a piece of content to, and marking forwarded messages with a “forwarded” label — the idea being that people might treat forwarded content with greater skepticism.

“Someone in Kenya or Nigeria or India using WhatsApp for the first time is not going to think about the meaning of the ‘forwarded’ label in the context of misinformation,” Madung said. “In fact, it might have the opposite effect — that something has been highly forwarded, so it must be credible. For many communities, social proof is an important factor in establishing the credibility of something.”

The idea of asking people to pause and reflect came from a feature Twitter once implemented that prompted people to actually read an article before retweeting it if they hadn’t opened it first. Twitter said that the prompt led to a 40% increase in people opening articles before retweeting them.

And asking WhatsApp to temporarily disable its broadcast and Communities features arose from concerns over their potential to blast messages, forwarded or otherwise, to thousands of people at once. “They’re trying to turn this into the next big social media platform,” Madung said. “But without the consideration for the rollout of safety features.”

“WhatsApp is one of the only technology companies to intentionally constrain sharing by introducing forwarding limits and labeling messages that have been forwarded many times,” a WhatsApp spokesperson told Engadget. “We’ve built new tools to empower users to seek accurate information while protecting them from unwanted contact, which we detail on our website.”

Mozilla’s demands came out of research around platforms and elections that the organization did in Brazil, India and Liberia. The first two are among WhatsApp’s largest markets, while most of the population of Liberia lives in rural areas with low internet penetration, making traditional online fact-checking nearly impossible. Across all three countries, Mozilla found political parties using WhatsApp’s broadcast feature heavily to “micro-target” voters with propaganda and, in some cases, hate speech.

WhatsApp’s encrypted nature also makes it impossible for researchers to monitor what is circulating within the platform’s ecosystem — a limitation that isn’t stopping some of them from trying. In 2022, two Rutgers professors, Kiran Garimella and Simon Chandrachud, visited the offices of political parties in India and managed to convince officials to add them to 500 WhatsApp groups that the parties ran. The data they gathered formed the basis of an award-winning paper called “What circulates on Partisan WhatsApp in India?” Although the findings were surprising — Garimella and Chandrachud found that misinformation and hate speech did not, in fact, make up a majority of the content in these groups — the authors cautioned that their sample size was small, and that they may have been deliberately excluded from groups where hate speech and political misinformation flowed freely.

“Encryption is a red herring to prevent accountability on the platform,” Madung said. “In an electoral context, the problems are not necessarily with the content purely. It’s about the fact that a small group of people can end up significantly influencing groups of people with ease. These apps have removed the friction of the transmission of information through society.”

 
