On February 21, TASS, a Russian state news agency, reported that five Ukrainian soldiers had crossed the border into Russia in two armored personnel carriers (APCs). According to the story, Russian forces destroyed both vehicles and killed the five Ukrainians. Russia later released helmet-cam footage from one of the supposed Ukrainians, as well as photos and videos of one of the destroyed APCs. It’s one of many Russian reports of alleged Ukrainian aggression, like the shelling of schools in Donetsk and Luhansk, that Russia has used to justify its military action in the region.
Soon after the footage hit the internet, sleuths picked it apart. One Twitter user used the video file’s metadata and satellite imagery to geolocate the images and figured out they had all been filmed in the exact same location where Russia had previously claimed a shell destroyed a border post.
The skirmish took place miles from where Russia said it had, and deep inside occupied territory in eastern Ukraine, not Russia. The destroyed APC was a BTR-70M, a variant Ukraine doesn’t operate, painted over to make it look Ukrainian.
There are dozens of stories like this. But as Russia floods Telegram, TikTok, and its own state-controlled media with stories of Ukrainian aggression, people on the internet are using open-source intelligence tools that have proliferated in recent years to debunk Russia’s claims. Internet sleuths are debunking the Kremlin’s disinformation and justification for war in real time.
Amid all this, Eliot Higgins and Bellingcat are collecting the data, fact checking it, archiving footage, and amplifying the messages online. Higgins and Bellingcat are old hands at this. They’ve been tracking conflict online and sifting through the morass of multiple sources and bad information for eight years now.
They’ve gotten good at it. “It used to be days or weeks until we had fact checks,” Higgins told Motherboard over the phone. “Now we’re getting it within an hour. That helps with the rapid news cycle. The question of whether these will be authentic or not is being answered very quickly. We didn’t have that back in 2014.”
Higgins told Motherboard that eight years of building a group of people dedicated to sorting through images and videos of war on the internet had sped up how quickly people can learn the truth behind what they see online. “There’s already a network and community,” he said. “We’ve existed for a long time and are familiar with open-source investigation.”
According to Higgins, a lot of the disinformation out of Russia has been easy to debunk. On Feb. 18, the heads of the Donetsk and Luhansk People's Republics announced emergency evacuations of their breakaway republics, citing sudden Ukrainian aggression. Metadata in the videos revealed they’d been filmed two days earlier, suggesting the emergency evacuation had been planned in advance.
“They basically manufactured a refugee crisis so they could put them in camps across the Russian border, so they could then be filmed by Russian state media to show evidence of this refugee crisis that they were claiming is part of this genocide that is going on,” Higgins said.
Moscow then used this, and other reasons, as part of its justification for the escalated invasion of Ukraine. “It’s like a parody of a NATO intervention,” Higgins said. “It’s trolling NATO, manufacturing a humanitarian crisis.”
He said it’s been odd to watch Russia’s narratives be so thoroughly destroyed so quickly. “Somehow, even though they’ve been doing this for years and they’ve seen how open-source investigators pick apart this kind of evidence, they’ve learned no lessons from it. In fact, they’ve even become worse,” he said. “I’m triple checking, just because it’s so ridiculous. It’s like they’re creating forgeries and then giving away the blueprint of the forgery along with the forgery itself.”
The most outlandish piece of propaganda Higgins saw involved Polish saboteurs. On Feb. 18, a Telegram channel associated with the People's Militia of Donetsk People’s Republic published a video it claimed showed Polish-speaking saboteurs targeting chlorine tanks. “It wasn’t just that it had the entire Adobe metadata in it,” he said. “It included the names of the two files that had been edited together, the source of the audio in one file that was used to overlay things into the file exactly at the moment the explosions occurred.”
Higgins said the video contained “every single tiny little detail about how that file had been faked, basically. It was there in the metadata for anyone to look at. It blows your mind that they can do that. It might be that they don’t realize Telegram doesn’t strip metadata like a lot of the other social media platforms. It’s the laziest, dumbest disinformation I’ve seen in forever.”
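The edit history Higgins describes lives in an XMP packet, a block of plain XML that Adobe tools embed in exported files, recording source filenames and editing steps in `xmpMM:History` entries. If a platform doesn’t strip it, a crude byte scan is enough to surface it; the following is a rough sketch, not a full XMP parser:

```python
def find_xmp_packet(data: bytes):
    """Return the raw embedded XMP packet (XML bytes), or None if absent.

    XMP packets are delimited by <?xpacket begin ... ?> and
    <?xpacket end ... ?> processing instructions, so a plain byte
    search finds them without parsing the container format.
    """
    start = data.find(b"<?xpacket begin")
    if start == -1:
        return None
    end_marker = data.find(b"<?xpacket end", start)
    if end_marker == -1:
        return None
    close = data.find(b"?>", end_marker)
    return data[start:close + 2] if close != -1 else None
```

Once extracted, searching the packet for `xmpMM:History` or `stEvt:softwareAgent` shows which tools touched the file and when, which is exactly the kind of detail that gave the “saboteur” video away.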
Another wild war story involved Russia claiming Ukrainian military forces had hit a kindergarten with artillery. Quickly, people gathered video of the school, geolocated it, analyzed photos of the impact crater, and determined that the artillery shell that struck the school had come from a location occupied by Russian-backed separatists.
War is complicated, horrifying, and messy. For hundreds of years, people had to rely on secondhand information filtered through reporters, eyewitnesses, and state propaganda. Now, eyewitnesses can speak directly to the public via social media. Satellites and cameras gather information every moment of the day, much of it available to the public.
The power to determine truth from fiction on the battlefield, more than ever, is in the hands of ordinary people. “This is exactly the same sort of material that was coming out of Syria and Libya in 2011 and 2012, but no one was really taking notice of it because there’s kind of a lack of trust in that material because it wasn’t being verified,” Higgins said. “But now you have communities of people on the internet geolocating videos and verifying videos that are coming out of conflict zones.”
Western intelligence spent the last few weeks telling the world that an invasion was imminent. After 9/11, WMDs, and Afghanistan, the public doesn’t have much faith in America’s intelligence community. “The thing is, they didn’t present any evidence to support what they were saying,” Higgins said of U.S. intelligence in the runup to the escalation in Ukraine. “For us, as investigators, it’s not worth too much beyond watching out for certain things that may be occurring and seeing if they’re consistent with the claims that have been made by the White House and intelligence officials.”
The rapid dissemination of information and real-time debunking of propaganda raise ethical concerns. Higgins and others are dealing with life-and-death topics, and open-source intelligence (OSINT) investigators often uncover evidence of real atrocities as well as troop movements and other sensitive information. Because of this, the OSINT community must constantly police itself. Sometimes publishing the truth can get people killed.
The ethics of this budding form of journalism are currently being studied by the Stanley Center, a nonprofit that works to prevent nuclear war and other atrocities. Its recent report on the subject, The Gray Spectrum, outlines the ethical considerations OSINT investigators should weigh before they send a tweet.
“These are powerful tools,” Ben Loehrke, Program Officer for Nuclear Weapons at the Stanley Center, told Motherboard. “The OSINT community can break news, hold governments to account, and pierce attempts at secrecy. They have been remarkably effective at preempting and displacing Russian disinformation about the situation in Ukraine…but if you ask around, most OSINT analysts will have a few stories of times they felt uneasy about whether publishing something was the right call. That kind of influence carries ethical responsibilities.”
Higgins said he’s deeply aware of the ethical pitfalls of this work. He said that Bellingcat is currently archiving every public video and photo of the war they can find. “We’re thinking about making some of that public,” he said. “But we want to make sure that the stuff we make public doesn’t allow the individuals who share them online to be identified. We know that Russia has really put the boot down on people who share information.”
“When you geolocate videos you have to make sure you’re not also geolocating the person who filmed it,” he said. It’s a tricky and complicated process, sorting out truth from lies on the battlefield while doing your best to protect sources. It’s life and death stuff playing out online, and it’s happening faster and faster every day.