People Are Sharing Misinformation About WhatsApp’s Privacy Policy

Michael Reynolds / Getty Images

Hours after WhatsApp announced a new privacy policy to the nearly 2 billion people around the world who use it, the rumors flew fast and furious.

“Do not accept WhatsApp’s new policy,” said one of the messages that went viral on the platform. “Once you do this, your WhatsApp account will be linked to your Facebook account and Zuckerberg can see all of your chats.”

“In a few months, WhatsApp will be releasing a new version that will show ads based on your chats,” said another. “Do not accept the new policy!”

In the days that followed, thousands of similar messages went viral on WhatsApp, Facebook’s instant messaging app. Cheered on by celebrities like Tesla CEO Elon Musk and whistleblower Edward Snowden, millions of people rushed to download WhatsApp alternatives like Signal and Telegram.

There was only one problem: The 4,000-word policy indicated that the new changes applied only when WhatsApp was being used to chat with businesses, not to private conversations with friends and family.

No, the new terms wouldn’t allow Facebook to read your WhatsApp chats, the company told anyone who asked. Top executives posted long Twitter threads and gave interviews to major publications in India, the company’s largest market. WhatsApp spent millions buying front-page ads in major newspapers and posted graphics debunking the rumors on its website with a large “Share to WhatsApp” button, hoping to inject some truth into the stream of misinformation circulating about its platform. The company also encouraged Facebook employees to share those infographics, according to posts on Workplace, the company’s internal message board.

“There has been a lot of misinformation and confusion, so we are working to provide accurate information about how WhatsApp protects people’s personal conversations,” a WhatsApp spokesman told BuzzFeed News. “We are using our Status feature to communicate directly with people on WhatsApp and posting accurate information in dozens of languages on social media and on our website. Of course, we have made these resources available to the people who work at our company as well, so they can answer questions directly from friends and family if they wish.”

None of this worked.

“There has been a lot of misinformation causing concern, and we want to help everyone understand our principles and the facts,” WhatsApp wrote in a blog post last week announcing that the company would delay the new privacy policy by three months. “We’re also going to do a lot more to clear up the misinformation about how privacy and security work on WhatsApp,” the company wrote.

Thanks to everyone who has reached out. We’re still working to counter confusion by communicating directly with @WhatsApp users. No one will have their account locked or deleted on February 8, and we are moving our business plans back until after May – https://t.co/H3DeSS0QfO

— WhatsApp via Twitter, 7:42 p.m., January 15, 2021

For years, rumors and hoaxes spread over WhatsApp have fueled a misinformation crisis in some of the world’s most populous countries, such as Brazil and India, where the app is the main way most people talk to one another. Now that crisis has reached the company itself.

“Trust in platforms is [at a] low point,” Claire Wardle, cofounder and director of First Draft, a nonprofit that investigates misinformation, told BuzzFeed News. “We have had years of people becoming increasingly concerned about the power of tech companies, especially how much data they are gathering about us. So when privacy policies are changed, people are rightly concerned about what it means.”

Wardle said people were concerned that WhatsApp would link their behavior on the app to data on their Facebook accounts.

“Facebook and WhatsApp have a huge lack of trust,” said Pratik Sinha, founder of Alt News, a fact-checking platform in India. “Once you have that, any type of misinformation attributed to you is easily consumed.”

What doesn’t help, both Sinha and Wardle added, is the lack of understanding among ordinary people of how technology and privacy work. “Misinformation thrives on confusion,” said Wardle, “so people saw the policy changes, jumped to conclusions, and it was not surprising that many people believed the rumor.”

These patterns of misinformation, which have thrived on WhatsApp for years, have often caused harm. In 2013, a video that allegedly showed the lynching of two young men in Muzaffarnagar, a city in northern India, went viral, sparking riots between Hindu and Muslim communities in which dozens of people died. A police investigation found the video was over two years old and hadn’t even been shot in India. In Brazil, fake news flooded the platform and was used to favor far-right candidate Jair Bolsonaro, who won the country’s 2018 presidential election.

The company didn’t seriously address its misinformation problem until 2018, however, when rumors about child kidnappers that swept the platform sparked a series of violent lynchings across India. In a statement released at the time, India’s IT ministry warned WhatsApp of legal action, saying the company would be “treated as an abettor” if it did not resolve the problem. The warning sent WhatsApp into crisis mode: it flew top executives from the company’s headquarters in Menlo Park, California, to New Delhi to meet with government officials and journalists, and ran high-profile misinformation awareness campaigns.

Sam Panthaky / Getty Images

A protest against mob lynchings in India in July 2018. That year, dozens of people were lynched across the country because of WhatsApp rumors, prompting both the Indian authorities and WhatsApp to look for solutions.

The company also built new features into the app to counter misinformation directly for the first time, such as labeling forwarded messages and limiting the number of people or groups a message can be forwarded to, in order to slow the spread of viral content. Last August, people in a handful of countries gained the ability to upload the text of a forwarded message to Google to check whether it was false. That feature is not yet available to WhatsApp users in India.

The company has also been working, since 2019, on a tool that would let users search the web for images they receive on the app with a single tap, which would help people fact-check more easily. But almost two years later, there is no sign of that feature, although a text version is available in more than a dozen countries that do not yet include India.

“We are still working on the search function,” a WhatsApp spokesman told BuzzFeed News.

WhatsApp said the company wanted to be clearer about its new privacy policy. “We want to reiterate that this update does not expand our ability to share data with Facebook. Our goal is to provide transparency and new options for messaging businesses so they can serve their customers and grow,” the spokesman said. “WhatsApp always protects personal messages with end-to-end encryption, so that neither WhatsApp nor Facebook can see them. We are working to correct any misinformation and remain available to answer any questions.”

This week, the company placed a status message, WhatsApp’s equivalent of a Facebook Story, at the top of people’s status area. Tapping the status revealed a series of messages from the company debunking the rumors.

BuzzFeed News screenshots

“WhatsApp doesn’t share your contacts with Facebook,” said the first. Two more status updates made it clear that WhatsApp can’t see people’s location and can’t read or listen to encrypted personal conversations. “We are committed to your privacy,” said the last message.

According to internal communications seen by BuzzFeed News, employees had several questions for Facebook CEO Mark Zuckerberg ahead of the company’s weekly Q&A on Thursday. Some wanted to know whether the growing shift to Signal and Telegram was affecting WhatsApp’s usage and growth metrics. Others wanted the CEO to address whether Facebook uses metadata from WhatsApp to serve ads.

“Do you think we could have done a better job of clearly explaining [the new privacy policy] to users?” someone asked.

“The public is angry about the changes to WhatsApp’s privacy policy,” commented another person. “There is so much distrust of FB that we should be more careful about it.”

Zuckerberg replied that he didn’t think the company handled the changes well.

“The short answer is no, I don’t think we handled it as well as we should have,” he said. “And I think the team has already gone through everything around that and has a number of lessons to make sure we do a better job going forward, not just with the WhatsApp terms of service. You know, we have other terms-of-service updates coming for different apps and services, and we have to make sure we do better on those, too. That way we minimize the amount of misinformation and the amount of confusion that gets generated.”

Ryan Mac contributed reporting to this story.
