WhatsApp and Facebook’s ultimatum to users reveals a privacy disaster


There’s been a lot of confusion and outrage on social media over an in-app notification informing WhatsApp users that the messaging app’s privacy terms had changed. Users have no choice but to accept the new terms or lose access to the app on February 8. Because privacy terms in the world of ad tracking are so lengthy, convoluted, and complex, many have assumed that the additional collected information puts their privacy and the security of encrypted personal chats at risk.

In reality, the WhatsApp privacy changes affect only a small sliver of users’ WhatsApp Business interactions, extending tracking capabilities that are already in wide use. The change in terms is less a sign that encrypted privacy on WhatsApp is ending and more a signal of how Facebook is responding to the many U.S. antitrust inquiries ahead of it in 2021: the social media giant will continue to shift attention to markets outside the United States, where it is far more popular.

Because of Apple’s new data disclosure requirements, WhatsApp informed users last week that certain data points, such as the user’s profile status, login activity, contact list, purchases, and financial information, may be shared with businesses and the third parties they use. In 2018, Facebook launched “WhatsApp Business,” which lets users chat with companies to ask support questions about orders and inventory and, in some cases, make purchases directly through the app. As of summer 2019, 50 million businesses were using WhatsApp Business to communicate with their customers. Because companies pay for each message they send to users, WhatsApp Business has given Facebook a clear new revenue stream.

To put your mind at ease: The changes to privacy terms announced last week apply only to people interacting with companies on WhatsApp Business (as of 2019, about 40 million people per month viewed a business on the app, a small fraction of WhatsApp’s 2 billion users). This is the good news. Your private chats with friends and family overseas are still as protected as they ever were.

The bad news is that for those who worry about WhatsApp sharing data with Facebook, little is changing, and that’s a problem in itself. Unless you were one of the lucky few who managed to opt out of WhatsApp sharing your data with Facebook in 2016 when the company gave you the option, you were always being tracked in this way. The biggest difference is that even now, despite the privacy backlash of the last few years, there’s no new option to use WhatsApp without sending data to Facebook. Be tracked, or delete the app.

The fact that the new privacy notice is largely a continuation of the status quo did not, however, stem the outrage against Facebook.

This frustration came largely from WhatsApp users in India, briefly causing “#WhatsApp” to start trending on Twitter at the beginning of last week. But users and reporters globally also slammed Facebook for its lack of transparency. The notification didn’t fully explain what the update really means, and as users discovered that they didn’t have an option to opt out of the data collection, some began to delete their accounts. People around the world have grown weary of Facebook, interpreting a relatively banal terms change as apocalyptic.

HOW WHATSAPP FITS INTO THE DATA ECONOMY

It’s no surprise that people have been confused and outraged by the WhatsApp notification, because it is genuinely difficult to understand. Opaque third-party ad-tech companies and the ever-growing suite of Facebook products make it hard to track down the realities of data-sharing, even for experts who understand the jargon.

Shoshana Wodinsky, a reporter at Gizmodo and ad tech expert, described advertising technology to me as a system of Escher-style interconnected pipes folding in on themselves infinitely. Shoshana and I spent way too much time reading the Facebook terms, WhatsApp terms, and WhatsApp for Business terms, and diving into the marketing claims and documentation of third-party data processors, but it’s still relatively unclear exactly how this newly permissioned shareable data will be used. Facebook’s stated intent is to “improve” and “market” its solutions to brands, which leaves it room to do all sorts of shady things.


Will this data sharing improve Facebook’s tracking of you? Well, sort of. WhatsApp Business offers a platform to companies that allows them to chat with you directly as an individual. The change might mean that the brands using WhatsApp Business can use what they learn about you as a customer, including information from your chats with them, status updates, payment history, contact list, and profile photo, and connect it to Facebook’s normal ad targeting.

WhatsApp Business chat conversations may also be in the mix for data mining. Business chats, unlike private messages, are only encrypted in transit, meaning the business and any third parties they work with get to decide what to do with that information once they get it. It’s likely that Facebook will then allow those brands to retarget you so that their own ads for promotions will follow you across Facebook and Instagram. It’s unclear whether Facebook will reuse these WhatsApp Business insights and sell them to other ad buyers, such as political campaigns. But importantly, there’s little stopping it from using your data in this way. Facebook can change its terms whenever it wishes, as it did in 2016 and is sure to do again.
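To make that distinction concrete, here is a minimal, hypothetical sketch in Python (using the third-party cryptography package, with Fernet standing in for real messaging crypto; WhatsApp itself uses the Signal protocol). The point is simply who holds the decryption key: with end-to-end encryption only the two chat endpoints do, so a relay server sees only ciphertext; with transport-only encryption the server holds the key too, so whoever runs it can read and store the plaintext.

```python
# Illustrative only: Fernet is a stand-in for real messaging crypto.
from cryptography.fernet import Fernet

# End-to-end encryption: only the two endpoints share the key.
e2e_key = Fernet.generate_key()
sender, recipient = Fernet(e2e_key), Fernet(e2e_key)
ciphertext = sender.encrypt(b"private family chat")
# The relay (Facebook's server) only ever forwards bytes it cannot decrypt.
print(recipient.decrypt(ciphertext))  # b'private family chat'

# Transport-only encryption: the key protects the hop to the server,
# but the server (the business and its vendors) receives the plaintext.
transport_key = Fernet.generate_key()
client, business_server = Fernet(transport_key), Fernet(transport_key)
ciphertext = client.encrypt(b"order #123 support question")
plaintext = business_server.decrypt(ciphertext)
# Once decrypted server-side, nothing technical prevents logging, mining,
# or sharing the message with third-party processors.
print(plaintext)
```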

If brands do want to use your newly collected information to target you on Facebook, there are a lot of ways this can go wrong. Businesses may now be able to use information such as your transaction history and profile photo to send ads that are “more relevant.” Modern targeting systems in this space often rely on a field of artificial intelligence called deep learning, in which many behavioral features correlate strongly with race, gender, or socioeconomic class. If WhatsApp Business allows brands to use your profile photo or chats for targeting, the skin color in your profile photo or the nuances of your chat style may let them inadvertently infer sensitive information about you, resulting in a different, and potentially harmful, experience for people of different genders or races.

For instance, if AI mined the chats for some kind of “customer quality score,” it might inadvertently deprioritize support requests from minorities. Or, it might make you more likely to see ads for predatory products such as high-interest loans or gambling sites. I can’t know for sure if this kind of data mining of chats or profile photos is happening, but legally speaking, outside of the EU and U.K., there’s little standing in Facebook’s way.
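The mechanism behind this kind of proxy discrimination is easy to sketch. The toy Python example below uses entirely synthetic data and has nothing to do with Facebook’s actual models: a hypothetical “customer quality” model never sees the protected attribute, but a stylistic chat signal correlated with group membership creeps in as a proxy, so the scores, and therefore the ad or support treatment, end up differing by group anyway.

```python
# Hypothetical sketch of proxy discrimination on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000
group = rng.integers(0, 2, n)                    # protected attribute (never a model input)
chat_style = rng.normal(loc=group * 1.0, size=n) # proxy feature correlated with group
spend = rng.normal(loc=1.0, size=n)              # legitimate business signal

# "High-value customer" label, driven mostly by spend but partly by chat style
y = (spend + 0.3 * chat_style + rng.normal(0, 1, n) > 1.0).astype(int)

X = np.column_stack([spend, chat_style])         # group itself is excluded
model = LogisticRegression().fit(X, y)
scores = model.predict_proba(X)[:, 1]

print("mean score, group 0:", scores[group == 0].mean())
print("mean score, group 1:", scores[group == 1].mean())
# The gap between the two means shows how a proxy feature can reintroduce
# group differences the model was never explicitly given.
```

Nothing in the sketch requires malice; the disparity falls out of ordinary training on correlated features, which is exactly why mined chats or photos are risky targeting inputs.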

A PUSH TOWARD AN INCREASINGLY GLOBAL FACEBOOK

Beyond the technical details of how exactly the terms change will affect users, the decision to improve data tracking for WhatsApp Business in particular indicates that Facebook is devoting more and more energy to products that are mostly popular outside the U.S.

That may be because Facebook seems to know it will have a tough time regaining trust from Western audiences. Given the Facebook-fueled assault on the U.S. Capitol building this Wednesday, Facebook’s potential harm to society is pretty hard to ignore. In addition, Facebook is now under investigation for its acquisitions of WhatsApp and Instagram, even as it rushes to integrate the two applications more deeply into the parent company’s code base.


Because of the investigation, last week’s move to change terms is a bold one, according to Roger McNamee, Elevation Partners cofounder and an early Facebook and Google investor. He says that when you’re the subject of an antitrust investigation, “the normal response is to be contrite, promise never to do it again, and spend your time trying to minimize the appearance of harm. The last thing you want to do is add incremental harms. This is not how an antitrust defendant is supposed to act.”

The timing and audacity of the WhatsApp changes may be signals that Facebook believes its time of explosive revenue growth in the U.S. is drawing to a close. Facebook revenues in India, where WhatsApp is dominant, have skyrocketed in recent years, while its user base in the U.S. and Canada has begun to decline. Facebook claims it needed to update the terms only so that it could store messages on behalf of businesses, which is itself a signal that the company will prioritize WhatsApp’s B2B relationships. It most certainly indicates that Facebook intends to focus its future efforts on audiences abroad, where it enjoys a draconian capture of all things internet in developing markets. In some of these markets, there is no real choice but to use Facebook if you want to get online.

Facebook has an uphill battle ahead of it in 2021. Even while Indian users have been frustrated by the terms change, WhatsApp is so deeply embedded that many have felt little choice but to accept it. But as the public becomes better informed about the many potential harms of AI-enabled targeting, Facebook will be forced to focus on parts of the world where its brand faces less privacy backlash.

This could present a strategy to remain profitable despite looming consequences in the United States. Notably, users in the EU and U.K. aren’t affected by this terms update, because the sweeping privacy regulation known as the GDPR shields them from this kind of ad targeting. While the WhatsApp terms update may not mean much has changed, it’s clear that in most places, Facebook has far too much power to set the terms, and its users have no alternative but to opt in. If this is merely a continuation of the status quo, it’s clear that in order to protect our privacy and civil liberties, the status quo needs to change.


Liz O’Sullivan is a cofounder and VP of Responsible AI at Arthur, the AI monitoring company. She also serves as Technology Director for STOP (The Surveillance Technology Oversight Project), where she works on New York policy to curb the technology that threatens our civil liberties, especially among marginalized communities. She is an active member of the Campaign to Stop Killer Robots, where she advocates on their behalf at the UN. (fastcompany)
