My favorite piece of fetid, rotting Facebook repost garbage is this one, summarized by Snopes as: “Drinking four glasses of water at the beginning of each day will cure various diseases.” Those diseases range from “fatness” to “womb cancer,” and the proof is found in a reference to an unnamed “Japanese medical society” and the vaguest assurance of “scientific testing.”
The “activates internal organs” kernel is still propagating nearly a year after the Snopes takedown and the original email source traces all the way back to 2004. It’s a classic bad rumor, morphing and reigniting cyclically over time in defiance of factual takedowns. I’d call it a zombie, but that implies a way we might actually kill it forever. There isn’t.
We can reasonably assume that Facebook is the chief social media vector for these bad rumors, both because it fosters clustering of close, trusted real-life connections and because its interface is practically built for moving rumors and image macros across sharing cascades.
It should be no surprise that Facebook is concerned about—or at least interested in—its role in the propagation of bad information. To that end, researchers at the company have just released a paper, to be presented at the International AAAI Conference on Weblogs and Social Media in June, describing the relative movements of true and false rumors across the network. The main takeaway is more hopeful than you’d expect: true rumors disseminate further and wider, but not by much.
The Facebook researchers used Snopes as their information baseline, allowing them, first, to identify a rumor as a rumor in the first place (a rumor being anything covered by Snopes) and, second, to quickly categorize those rumors as true or false (based on Snopes’s typically immaculate research and judgment). With the rumors identified, Facebook was then able to trace them across its networks. The map below shows the spread of a (false) rumor consisting of a photographed Cabela’s receipt purporting to show an additional tax levied as part of the Affordable Care Act.
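The two-step method above, match a post to a known Snopes entry, then inherit that entry’s verdict, can be sketched in a few lines. This is a toy illustration, not the researchers’ actual pipeline; the database entries and the crude keyword-overlap matcher are both invented for the example.

```python
# Toy sketch of the Snopes-as-baseline labeling step. The "database" and
# the matching heuristic are made up for illustration; the real study
# matched posts to Snopes articles far more carefully.
SNOPES_DB = {
    "four glasses of water cures disease": "false",
    "cabela's receipt shows aca surcharge": "false",
    "some rumor that checks out": "true",
}

def label_post(post_text):
    """Return 'true'/'false' if the post matches a known rumor, else None."""
    tokens = set(post_text.lower().split())
    for rumor, verdict in SNOPES_DB.items():
        keywords = set(rumor.split())
        # Crude overlap test stands in for real fuzzy matching.
        if len(keywords & tokens) >= len(keywords) // 2:
            return verdict
    return None

print(label_post("Drink four glasses of water each morning, cures disease!"))  # false
print(label_post("completely unrelated cat photo"))  # None
```

Posts that match nothing in the database simply aren’t rumors for the study’s purposes, which is exactly the selection caveat raised later on.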
It’s interesting to note that if a rumor makes it past the initial burst, it’s likely to make it way past. If a stupid rumor initially finds a stupid network, the road is wide open; if not, it’s done. Below is a close-up of one branch. The red dots mark where commenters on the posted bad rumor link to the Snopes takedown. This tends to cauterize the information wound, but not always.
The paper explains: “Individuals propagating rumors may attempt to disrupt their role in propagation or disassociate themselves from the rumor if they learn that it is false or otherwise experience social conflict as the result of propagating it.” So, Facebook users tend to act to reduce their exposure to self-clowning. Makes sense.
The result of that self-policing is that true rumors were the most viral and elicited the largest cascades. The numbers show that true rumors resulted in an average of 163 shares per upload, while false rumors averaged 108 shares per upload. Consider also that, of the rumors examined by Snopes, 62 percent were false, so the slant toward truth takes on even a bit more weight.
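A bit of back-of-the-envelope arithmetic shows why that 62 percent figure adds weight. The per-upload numbers (163 vs. 108) come from the paper as reported above; the prevalence weighting below is my own illustration, not a calculation that appears in the study.

```python
# Reported figures: 163 shares per true-rumor upload, 108 per false-rumor
# upload, and 62% of Snopes-covered rumors are false. The weighting below
# is illustrative arithmetic, not a result from the paper.
true_shares, false_shares = 163, 108
true_frac, false_frac = 0.38, 0.62

per_upload_ratio = true_shares / false_shares
print(round(per_upload_ratio, 2))  # ~1.51: true rumors earn ~1.5x the shares per upload

# Despite being a minority of the rumor pool, truths claim a near-even
# split of total shares:
true_share_of_all = (true_frac * true_shares) / (
    true_frac * true_shares + false_frac * false_shares
)
print(round(true_share_of_all, 2))  # ~0.48 of all rumor shares go to true rumors
```

In other words, true rumors make up only 38 percent of the Snopes-covered pool but pull in nearly half of all the shares, which is the sense in which the minority status strengthens the finding.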
That said, the findings here should be taken with a couple shakes of salt. Consider first that the results are based only on rumors that made it onto Snopes in the first place, and thus had a ready-made rebuttal. The authors also note that their study’s conclusions are only observational, and don’t account for other explanations of increased or decreased shareability. Perhaps the false rumors were more likely to come with crappy images (anecdotally, they are) or to be more poorly written (again, yep) or something else less obvious.
Controlling for that and everything else that might make a difference seems nigh impossible in any study. Twisting things further is the observation that comment links to Snopes confirming a rumor as true also led to deletions, albeit not at the same rate. The suggestion is that users delete shares once they realize they're stale.
The takeaway is that you might as well fight back against bad information on Facebook. Those red dots above count as something positive for a more accurate internet, and all it takes is a copied-and-pasted URL. The catch is always that you might have already sealed off that crowd of people posting bad rumors from your own personal network. I know I have. It might make Facebook easier to look at and cut down on some time-consuming comment battles, but as for the sum total of true information, it certainly doesn't help.