Facebook is still spreading conspiracies 48 hours after Texas shooting

Digital platforms have pledged to do a better job of preventing fake news, but they’re still singing the same apologetic songs.

More than 48 hours after Devin Kelley shot and killed 26 people at a Texas church, fake news was still swirling around on social media — much like what happened in the wake of the mass shooting in Las Vegas last month.

These false articles and speciously labeled videos, offered without evidence, describe Kelley as a radical leftist sympathizer or a militant atheist. Although digital platforms pledged to do a better job of preventing misleading information from spreading after events like Charlottesville and the Las Vegas massacre, they’re still offering the same apologies.

On YouTube, videos among the top search results for “who is Devin Kelley” wrongly characterize the gunman as a disaffected leftist or an anti-Christian terrorist. Searching for “devin kelley antifa” on Twitter yields a long list of tweets about Kelley’s wholly invented sympathies for antifa, a loosely organized radical leftist political movement, alongside articles debunking Kelley conspiracy theories.

Egged on by Infowars’ Alex Jones and Mike Cernovich, fake news articles discussing Kelley reverberated throughout conservative Facebook groups, according to posts viewed through the analytics tool CrowdTangle. And people searching on Google for information about Kelley were nudged toward more claims of antifa affiliation in autocomplete and top search results, although articles from more reputable news organizations have since buried many of those links.

What we do know about Kelley is that he was a 26-year-old San Antonio resident who was discharged from the Air Force in 2014 after being convicted in a court-martial for assaulting his wife and breaking his infant stepson’s skull. Although a domestic abuse conviction should have kept Kelley from purchasing a gun, an apparent Air Force procedural error after his court-martial allowed him to purchase an assault rifle, which he used to kill at least 26 people at the church, including children as young as 5 years old.

The motive, according to initial reports, is likely related to a long-running family dispute. Prior to the attack, Kelley sent threatening text messages to his mother-in-law, who attended the church but wasn’t present on Sunday. After two citizens pursued him in a high-speed chase, Kelley shot himself in his car several miles from the church.

Google admitted in a statement to VICE News that its autocomplete “system did not work as intended,” and Facebook said in its own statement that “we recognize that accurate information is critical during a crisis.” Twitter directed VICE News to a blog post that claims the company is “working to ensure we are surfacing the highest-quality and most relevant content and context first.” A YouTube spokesperson said that “there is still more work to do, but we’re making progress.”

Their full statements after Texas, however, follow roughly the same playbook they used after Charlottesville and Las Vegas: Announce that they’re improving the algorithms that surface such content, highlight experiments to promote “real” news content, or mention plans to hire a few thousand more people to monitor content on the platform.

Fundamentally, the problem that these companies face is that what most effectively keeps their costs down — using algorithms instead of people to curate content — isn’t up to the task of preventing misinformation, especially in a crisis.

READ: Facebook, Twitter, and Google failed to stop fake news after Vegas shooting