Facebook's decision to begin harvesting data from its popular WhatsApp messaging service provoked a social media uproar on Thursday, and prompted leading privacy advocates to prepare a federal complaint accusing the tech titan of violating US law.
On Thursday morning, WhatsApp, which for years has dined out on its reputation for privacy and security, announced that it would begin sharing user phone numbers with its Menlo Park-based parent company in an effort "to improve your Facebook ads and products experiences."
Consumer privacy advocates denounced the move as a betrayal of WhatsApp's one billion users—users who had been assured by the two companies that "nothing would change" about the messaging service's privacy practices after Facebook snapped up the startup for a whopping $19 billion in 2014.
"WhatsApp users should be shocked and upset," Claire Gartland, Consumer Protection Counsel at the Electronic Privacy Information Center, a leading US consumer advocacy group, told Motherboard. "WhatsApp obtained one billion users by promising that it would protect user privacy. Both Facebook and WhatsApp made very public promises that the companies would maintain a separation. Those were the key selling points of the deal."
Other consumer advocacy groups are likely to join the complaint, which is expected to be filed with the FTC as early as Monday.
In its updated terms of service, which run more than 7,000 words, WhatsApp informed its users that they have 30 days to opt out of sharing their data with Facebook. Privacy advocates say that the vast majority of WhatsApp's users will neither read the terms of service nor take steps to opt out of the program.
Here's the big picture. Facebook's users may think that they're enjoying a "free" service, according to privacy advocates, but the truth is that if you're on Facebook, you are the product, and your data is little more than a commodity to be sold to advertisers.
This is not Facebook's first attempt to push the boundaries of online privacy. The tech titan has repeatedly tested the limits of user privacy in an effort to build a giant database of information about its members, in order to target them with digital advertising. That, after all, is Facebook's business—selling advertising.
As Facebook founder Mark Zuckerberg famously declared in 2007: "There is no opting out of advertising."
In 2010, Facebook was busted by a Harvard Business School professor for sending personal information to online advertising companies without user consent. The company blamed a "loophole" in its policies and changed its practices, but not before some users quit the service in disgust.
In 2012, the social network conducted a research experiment on so-called "emotional contagion" by tweaking the news feeds of nearly 700,000 users, without informing them. That particular stunt ignited a privacy furor, and prompted Facebook to acknowledge that it should have "handled things differently."
"Facebook's entire business model is premised on monetizing your user data," said Gartland. "The key thing is that Facebook must give users the choice to opt-in when they're going to make privacy changes like this. Opt-out is not good enough."
Gartland said that the FTC should finally step in and crack down on Facebook, given the company's long history of privacy abuses.
"People shouldn't be resigned to the status-quo where they just accept these shocking invasions of privacy," Gartland told Motherboard. "If the FTC allows this to keep happening, we need to question that agency's role as the cop-on-the-beat."
What infuriates many privacy advocates is that WhatsApp and Facebook went out of their way to tout the privacy and security features of the messaging service at the time of the 2014 acquisition. At the time, Facebook and WhatsApp executives insisted that the social networking giant would keep the messaging service at arms-length.
Those claims, it now appears, were little more than a "bait-and-switch" routine designed to lure millions of people to use the service under false pretenses, according to Jeff Chester, Executive Director of the Center for Digital Democracy, a DC-based consumer protection group.
"Facebook and WhatsApp lied to us," Chester told Motherboard flatly. "And they think nothing is wrong. That shows you that the people who run Facebook have an ethics problem."
In a phone interview with Motherboard, Facebook spokesperson Matt Steinfeld, who is head of communications for WhatsApp, insisted that the social-networking giant is providing users fair notice of privacy changes and giving them a clear opt-out option if they choose not to participate in the Facebook integration.
There are, in fact, clear ways for WhatsApp users to opt out of the program, as my colleague Nicholas Deleon described yesterday.
"WhatsApp complies with applicable laws," Steinfeld added. "As always, we consider our obligations when designing updates like this."