In 2024, voters in 64 countries that together hold roughly half the world's population, including the US and India, head to the polls. Social media giants like Meta, YouTube, and TikTok have pledged to safeguard these elections. However, WhatsApp, Meta's hugely popular messaging app, is largely absent from that conversation, a gap that worries Mozilla researchers.
Meta's safety measures for these elections focus primarily on Facebook and Instagram, which account for about 90% of its election-related interventions, according to Odanga Madung, a senior researcher at Mozilla. He questions why Meta has not publicly committed to protecting elections on WhatsApp as well.
WhatsApp, bought by Meta (then known as Facebook) for $19 billion in 2014, has become the primary communication tool for much of the world outside the US. In 2020, the company announced that WhatsApp had more than two billion users worldwide, more than any other social or messaging app except Facebook itself.
Despite WhatsApp's size, Meta's election-related safety measures have mainly targeted Facebook. Mozilla's analysis found that Facebook has made 95 election-related policy announcements since 2016, compared with just 14 for WhatsApp. Google and YouTube made 35 and 27 announcements respectively, while X and TikTok made 34 and 21. As Madung noted in his report, Meta's election efforts appear to prioritize Facebook above all else.
Mozilla is urging Meta to change how WhatsApp operates during election periods. Its suggestions include adding disinformation labels to viral content, restricting the app's broadcast and community features, and nudging users to pause and reflect before forwarding messages. More than 16,000 people have signed Mozilla's pledge asking WhatsApp to curb the spread of political disinformation.
WhatsApp began adding friction to its service after misinformation on the platform led to several lynchings in India, its largest market. This included limiting forwarding and adding “forwarded” labels to messages. However, Madung argues that new WhatsApp users in Kenya, Nigeria, or India may not understand the implications of the “forwarded” label.
The idea of prompting users to think before forwarding came from a Twitter feature that encouraged users to read an article before retweeting it. This led to a 40% increase in people opening articles before retweeting them.
Mozilla's demands stem from research conducted in Brazil, India, and Liberia, where WhatsApp's broadcast feature was used heavily to target voters with propaganda and, at times, hate speech. WhatsApp's encryption makes it difficult for researchers to monitor content on the platform, but some are trying: in 2022, two Rutgers professors joined 500 WhatsApp groups run by Indian political parties and wrote an award-winning paper based on the data they collected.
Madung argues that encryption is a distraction from the real issue: these apps make it easy for a small group of people to influence very large ones, because they have stripped away the friction that once slowed the transmission of information through society.