What Facebook Is Getting Wrong in the Fight Against Fake News
Brooke Binkowski spends her days in search of the misinformation and propaganda that have infected digital media. We caught up with her to talk Facebook and fact-checking.
Portraits by Ariana Drehsler
Brooke Binkowski realizes she can sound crazy. “But the world sounds crazy,” she told me. And in her line of work, avoiding what’s crazy isn’t the point.
That work has put her near the center of some of the weirder flash points igniting American culture in recent years: Binkowski spends her days wading through the fever swamps in search of the misinformation and propaganda that have infected digital media. Formerly the managing editor of the OG fact-checking site Snopes, she now holds the same title at TruthOrFiction.com, a smaller outlet that similarly debunks myths.
As Facebook faced enormous backlash for the spread of fake news leading up to the 2016 US presidential election, it enlisted Snopes and other fact-checking outfits as part of a supposed counteroffensive. But Binkowski soured on the deal early on, and she thinks the resulting tension contributed to her being fired this past summer from Snopes. (Snopes founder and CEO David Mikkelson disputes this account.) Still, Snopes didn’t renew its agreement with Facebook at the end of last year. Now Binkowski is one of the few current or former fact-checking partners of the tech giant to speak out against its efforts, which she sees as sorely lacking. Facebook has consistently defended this initiative, adding, in a comment to VICE, that it has introduced new features to improve data-sharing and broaden fact-checkers’ impact since Binkowski left Snopes last year.
From her home in San Diego, Binkowski sees the stakes of the info war in nearby Tijuana, where the asylum seekers known as “The Caravan” remain in limbo after a journey across Central America whose heavy publicity was riddled with misinformation. The real-world implications don’t end there: Parents of Sandy Hook victims are pursuing a defamation suit against Infowars huckster Alex Jones—a case on which Binkowski is consulting as an expert witness—for claiming the school shooting was a hoax. “I love talking shit to people who lie on the internet,” she said. “I’m pretty much born for this.”
I caught up with Binkowski by phone to talk Facebook, fact-checking, and how fake news has changed since she joined Snopes in late 2015.
VICE: Am I wrong to think those were simpler times?
Brooke Binkowski: We just hadn’t figured out how bad it was yet. But looking back, all the signs were there that rot was setting in. I think that the disinformation we’re seeing now—all the fake news shit—that’s been sort of bubbling under the surface for at least a decade. What was happening was it was getting tested out on less-mainstream sites, like Tumblr. I know for a fact [fake news] was on LiveJournal. It’s been around for a while and just went septic in the mainstream in 2015 and 2016.
Was there just a larger quantity of fake news in 2015–16 or did you notice qualitative changes?
It started to get very racialized around April 2016. One of the first stories I debunked was that UPS was flying Muslims in on its planes in the dead of night into the United States for—I don’t know—some kind of Muslim takeover. That kind of stuff went on steroids, and then a lot of border stuff was coming up. It became very obvious that there were people who were trying to latch onto and widen the preexisting racial and socioeconomic divides of the United States.
How do you think fact-checking has an impact? You have all these people who believe these things or are more likely to believe these things about the caravan or whatnot. Then there are people who read Snopes. Do those two groups of people overlap much?
I really do think fact-checking has an impact. It’s not immediately clear what the impact is, which is why studies have come out saying people dig their heels in when presented with a fact-check. Nobody wants to be told they’re fucking wrong. I don’t like it. But I—and I know I’m not abnormal—will think about it, and then quietly go back and be like, Sorry, yeah, I was totally wrong. I think that we’re seeing a lot of that mechanism.
People want to make sense of the world. That’s human nature. And I think we need to remember that in the absence of actual news, people will still tell stories to each other and to themselves to make sense of the world. If you feed them garbage, then garbage is going to come out.
Facebook or Twitter would also say, well, at our core we allow people to share stories that help them make sense of the world.
Of course they’re going to say that. That’s how they make their money. Facebook and Twitter are great; I love mass communication. But their frickin’ rich white-boy Silicon Valley bullshit of, Oh, we’re just going to let people talk, privileges rich white-boy Silicon Valley racist assholes.
But you were one of the fact-checkers who tried to work with those people—or at least, Snopes was.
I was insistent that we are not taking money from these people. We will not be their bitches. I couldn’t think of a more elegant way to put it. And [Mikkelson] took their money anyway.
What was your end of the agreement in terms of fact-checking?
They gave us this algorithm-generated list of stories with their headlines and URLs, and then we had fields where we could put in whether it was true, false, a mixture. And then we’d put the URL to our own debunking. That was it. It was very low-key. We thought that the added visibility would be good. We also thought, OK, Facebook is going to use this to improve their algorithms and figure out where they’re going to go next. And nothing came of it.
Did they ever communicate with you on what they wanted you to do?
They tried to at first, and I was like, No. We are working with you, not for you. I started really beating the drum for us to pull out of the partnership in mid-2017. First of all, it was obvious that it wasn’t working. Second of all, [the genocide in] Myanmar was disgusting. And [Facebook has] been almost entirely culpable for the disinformation crisis that helped justify this massacre—this ethnic cleansing. It’s clear they don’t give a shit.
If I could play devil’s advocate: If Facebook throws money at a news organization that allows them to employ a fact-checker that they wouldn’t employ otherwise—it seems like there’s some upside to that, no?
The truth is we didn’t need [the money]. My issue is not with the people who took it. My issue is with the mechanics of it all. If Facebook was really acting in good faith, they’d put it into a foundation and not use it to make the marionettes dance, which they do.
What do you mean by that?
We talked several times about what kind of news Facebook was sending us to debunk. It was very easy, [very] debunkable, and it was all sort of left-wing junk news—like anti-vaxxer shit. So they obviously have some kind of algorithm [to determine what to send to] us. But what’s the algorithm? I kept asking. Never got an answer.
Did you have any sense of how big the problem was on Facebook or to what extent they were taking it seriously?
They didn’t share shit with us. I felt that we were crisis PR: They could point to us and say, Look, we’re doing something about it. We hired Snopes. They also [included] The Weekly Standard and [considered including] The Daily Caller in their fact-checking teams, because they didn’t want to be perceived as left-wing fact-checker friendly. I was like, You guys don’t know how this fucking works, do you? You should not be doing this. You need to hire people internally.
They’re reacting to conservative criticism the exact same way a legacy media company might react.
Their reaction has been very telling. That’s another reason I’ve gone on this offensive. I’m broke as shit—always. I don’t have a lot of personal power. But what I really have right now is a megaphone. I have a voice. And they’re very sensitive to public opinion. So I’m just going to keep kicking them in the teeth publicly as long as I can, because they’re fucking up.
So you think the power lies with them?
One hundred percent. They’ve been in denial about being a media company, not just for legal reasons, but also because they can tell themselves that media may be prone to being swayed one way or the other, while tech is morally neutral: it’s all in the way people use it. That’s obviously not true. It never was.
You’ve tweeted before that you grew up with this idealistic belief that the internet could help bring people together. Do you still think you’re an idealist?
If I wasn’t an idealist, I wouldn’t be so mad all the time. I don’t believe there has to be one way of changing things. But I also think that we have been prevented by distraction and by outside forces from changing in the ways that we need to.
I always try to connect the decline of journalism to politicians’ having more power over media right now: Trump can say all this shit because journalism is weaker. And that power dynamic probably plays out with people in San Diego, too.
Probably. I don’t really talk to a lot of people. I can’t really talk to anybody about my job. I mean, I can, but people just kind of end up looking at me like, Sure.
Sometimes when I log on, or even if I turn on cable news, it’s just, Wow, we are five steps down the rabbit hole already.
It’s so dystopian, isn’t it? I’ve been smoking a lot of pot. My coping skills are yoga, astrology, and weed.
It’s the trifecta.
We are in a dystopian novel from 1986. We just need aliens. Where are the aliens?
This conversation has been edited for length and clarity.