How WhatsApp is making it more expensive to spread misinformation

We always have at least some reason to worry about the spread of misinformation, but we worry more about misinformation during a public health crisis. We are generally not well informed on public health issues even in good times, and so the emergence of a new disease to which the human race has no natural immunity presents an irresistible target for bad actors.

For example, if for whatever reason you are opposed to new 5G cellular networks, you could go on social networks and make a lot of posts suggesting that 5G networks are making the spread of the virus worse. Or you could say that 5G itself is causing COVID-19. Or you could say that the pandemic itself is a hoax, and that talk of a virus is intended to cover up the installation of 5G equipment. And if you said it often enough, and your posts got enough traction, then eventually the fringe press would write up your claims, and the misinformation would rapidly move into the mainstream.

In the United Kingdom last month, in the days after the government ordered citizens to remain in their homes, this is more or less exactly what happened. Some people began setting cell towers on fire in an utterly misguided effort to fight back against 5G. Writing in the Guardian, Jim Waterson and Alex Hern talked to fact-checkers about the situation:

They cite the rapid growth of neighbourhood social media groups, a failure by networks to promote scientific evidence about 5G, and a terrified population looking to make sense of a world turned upside down. [...]
Tom Phillips, the editor of the factchecking organisation Full Fact, said it warned last summer about the growing prevalence of 5G health claims. But in recent weeks debunked claims about 5G had been transformed, potentially aided by the creation of new local Facebook and WhatsApp groups to help support neighbours during the pandemic. Google Trends data suggests British interest in 5G theories exploded in the final days of March, shortly after the lockdown was imposed.

Let’s stipulate that fringe theories like these don’t exist only on social networks — and that, as the piece argues, telecoms should be doing a much better job at explaining to people what 5G is and isn’t. (Here’s a good overview from my colleague Chaim Gartenberg.)

But it’s clear that, as usual, social networks are amplifying some of these theories and helping them gain a foothold in the popular imagination. If you’re Facebook, you can throw a bunch of fact-checkers and content moderators at the issue to remove viral posts and attempt to deny other fringe voices undue algorithmic promotion. But if the subject is Facebook-owned WhatsApp, the solution is murkier.

WhatsApp, after all, uses end-to-end encryption. In practice, this means WhatsApp itself can’t peer into the contents of your messages. There are obvious privacy benefits to an app like this, particularly in a world where far-right authoritarianism is on the rise. Will Cathcart, who runs WhatsApp, told me this week that WhatsApp’s commitment to privacy feels even more urgent in a pandemic-stricken world where nearly all of our communication is mediated digitally. (As an aside, the entire story of the recent Zoom backlash is that the product’s design enabled far too many strangers to interrupt your call.)

“Part of what WhatsApp is trying to do is make what you used to do face to face possible,” Cathcart told me. “Part of that is privacy.”

If we were talking face to face, he told me over Zoom, we probably wouldn’t worry too much about someone spying on us. On a digital call, though, spying becomes a much bigger concern.
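To make the end-to-end idea concrete: WhatsApp’s actual cryptography is the Signal protocol, with authenticated public-key key exchange, but the core property — the relaying server only ever sees ciphertext it cannot decrypt — can be illustrated with a toy one-time-pad sketch (illustrative only, not WhatsApp’s real implementation):

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """XOR each byte of data with the key; XOR is its own inverse,
    so the same function both encrypts and decrypts."""
    return bytes(k ^ b for k, b in zip(key, data))

message = b"see you at noon"
# The key lives only on the two phones; the server never holds it.
key = secrets.token_bytes(len(message))

# What the server relays: random-looking bytes it cannot decrypt.
ciphertext = xor_cipher(key, message)

# Only the recipient, holding the key, recovers the plaintext.
assert xor_cipher(key, ciphertext) == message
```

The point of the sketch is in the middle line: everything the intermediary handles is ciphertext, so content moderation of the kind Facebook applies to its News Feed simply isn’t available to WhatsApp.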
