If Facebook Actually Wants to Be Transparent, It Should Talk to Journalists

A Facebook executive has been complaining about negative press coverage, yet the company declines to engage with journalists.

About a year ago Facebook launched a new security and privacy feature for Facebook Messenger: end-to-end encrypted chats.

This was a big deal. Before it was implemented, all chats, GIFs, and emoji sent through Facebook Messenger could be seen by Facebook, and, in turn, by law enforcement agents armed with a warrant. But some criticized the social media giant because the feature—called Secret Chats—was optional and not enabled by default for all users, unlike on WhatsApp (a Facebook company).


At the time, I reached out to Facebook in an attempt to clear up what I thought was the key question here: Why can WhatsApp users have end-to-end encryption by default while Facebook Messenger users can’t? A Facebook PR person was quick to respond, and he set up a phone interview with Tony Leach, Facebook Messenger's product manager. As a result, we published a nuanced story that explained why encrypting Messenger is more difficult than encrypting WhatsApp—because we were able to speak to a person who actually worked directly on the feature we were reporting on. Most importantly, that person spoke to us on the record, allowing us to publish direct quotes from the interview.

Fast-forward to this week, when a new Facebook program to combat the spread of revenge porn was announced in the Australian press and was quickly picked up by media all over the world. The pilot program was bound to be controversial. In essence, to stop abusers from uploading explicit pictures without their subject’s consent, Facebook is asking Australians to voluntarily submit their own nudes, so the social network can generate a digital fingerprint of the picture and then stop anyone else from uploading and spreading it.


When we heard about this program, we had a lot of questions: Is Facebook storing these pictures? If so, for how long? How does Facebook make sure people don’t abuse the system to trick it into censoring non-abusive pictures? My colleague Louise Matsakis reached out to Facebook with these questions, and a Facebook representative gave her answers “on background,” meaning we were allowed to paraphrase, but not directly quote the answers.


One of the spokesperson’s responses, it turned out, was wrong. Facebook initially told us the system was designed to blur pictures before showing them to a human team of reviewers. The spokesperson told us this was the case twice, then admitted it was wrong when Louise followed up. Mistakes happen, but in this instance Facebook was wrong about a basic and crucial part of a controversial program.

Meanwhile, Facebook Chief Security Officer Alex Stamos, a very well-respected information security expert, wrote on Twitter that the media and public at large had misunderstood the intentions and the methods behind the anti-revenge porn program.

“Our greater infosec/privacy community, including the media, has trouble talking about imperfect solutions to serious problems,” Stamos tweeted. “I think this is a good demo of why we need to continue to talk about these problems publicly to build understanding and trust.”

Stamos had similar complaints earlier this year when multiple media outlets covered news of Russian ads and propaganda on Facebook.

“I am seeing a ton of coverage of our recent issues driven by stereotypes of our employees and attacks against fantasy, strawman tech [companies],” Stamos tweeted. “My suggestion for journalists is to try to talk to people who have actually had to solve these problems and live with the consequences.”



Facebook head of global safety Antigone Davis gave a statement to Australia’s ABC for its initial story, but Facebook didn’t make any of its experts available to journalists for follow-up questions about how the program would actually work. None of the follow-up articles published about the program in the days after the news broke had on-the-record interviews with Facebook employees. Before we published our story about Facebook’s human review process of nude photos, I asked Stamos directly for an interview about the new program. He said he would ask Facebook for permission to talk to me (I previously profiled Stamos when he was head of Yahoo!’s security), but I was eventually told Facebook was preparing a blog post to address the new program.

The Daily Beast executive editor Noah Shachtman, along with other journalists, responded to Stamos’s earlier tweets: If Facebook wants to quell confusion about its product, it should actually talk to journalists.

Facebook is better than many huge tech companies in that it does a pretty good job of actually responding to journalist inquiries (many companies simply ignore requests). But too many of these conversations are off the record, on background, or run through spokespeople rather than the people who actually work on the projects themselves. Direct responses to questions are rare—instead we’re given broad statements on background. In many cases, answers to simple questions—are the nude images blurred or not, for instance?—are filtered to the point where the information Facebook gives journalists is not true. There are exceptions, of course: A Facebook spokesperson recently spoke on the record to my colleague Jordan Pearson. And this summer, Facebook invited me and half a dozen other journalists to talk to Stamos during the security conference Black Hat in Las Vegas, ahead of a series of announcements.


But Facebook’s preferred method of interacting with the press is through blog posts and carefully orchestrated livestreams, like the one Sheryl Sandberg recently did with Axios. Blogs and white papers posted by Facebook on its own website are not dialogues with the press; they are simply one-way press releases.

One out of four humans on earth has a Facebook account; Russian hackers appear to have abused the platform to push hateful, divisive propaganda that might have swayed a major election; and sometimes it seems like the social media giant knows more about you than you know about yourself.

It is crucially important that the public understands how the company works. If Stamos and other Facebook executives would like to foster transparency and ensure that the intentions of its projects aren’t misunderstood, they should start by letting the people who work on those projects speak to journalists, on the record.

When I raised all these points to a Facebook spokesperson, he said that “we are committed to communicating openly and clearly, and we always welcome feedback.”

Got a tip? You can contact this reporter securely on Signal at +1 917 257 1382, via OTR chat, or by email.
