The UK is in the middle of a general election that will decide its next prime minister. It's been a divisive and controversial election cycle, with incumbent Conservative Boris Johnson facing off against the Labour Party's Jeremy Corbyn.
The last thing any election cycle needs in the age of rampant misinformation is deepfake videos of candidates saying things they've never said. But that's exactly what one artist has put out into the world, though he claims he did it in an effort to inform the public about the dangers of deepfakes.
Digital artist Bill Posters is back, following his deepfakes of Facebook CEO Mark Zuckerberg and celebrities, with a set of face-swapped videos that make Corbyn and Johnson look and sound like they're endorsing each other. They're made in partnership with a group called Future Advocacy, which claims to be working on ways to crowdsource solutions to malicious deepfakes.
The deepfakes of Johnson and Corbyn look pretty realistic, and for an ignorant American who has spent almost no time listening to or watching either of these politicians, they could be them, for all I know. Except that they admit, at the end of each of the videos, to being fakes.
"Twitter has just banned political advertising because they can’t control it, the government should force Facebook and Google to do the same and ban the practice of micro-targeted advertising completely," Posters said in a press release. But some experts say that banning political ads can disadvantage upstart candidates and actually proliferate more biased media, by forcing people to turn to cable news sources.
I asked Posters why he continues to make deepfakes if he's so concerned about deepfakes sparking a misinformation crisis. He and Future Advocacy say they are the first and only ones currently putting deepfakes of these politicians into the world during an election cycle.
In certain states in the U.S., including California and Texas, recent laws have made it illegal to create and spread damaging deepfakes of candidates during an election season. But attempts at legislating deepfakes, experts say, risk stifling free speech online.
More importantly, studies show most deepfakes aren't even made for political reasons, especially in the U.S. and UK. The victims of malicious deepfakes are still, by and large, women in nonconsensual porn, a very real and ongoing problem that doesn't get nearly as much attention as the still largely hypothetical use of political deepfakes.
But Posters believes these deepfakes are worth making. After all, Motherboard's coverage of the deepfake he made of Zuckerberg announcing world domination forced Facebook to take a stand on what types of misinformation and satire would stay or be banned on the platform, and the one Posters made of Kim Kardashian raised interesting copyright questions.
"My interest in the technologies used in the creation of deepfakes is to subvert them for social and artistic purposes," Posters said. "There is still so little public understanding today about how personal data is used by powerful technologies... I'll probably stop exploring the issues with these types of technologies when elected officials do their jobs and create adequate laws that safeguard people's privacy and the integrity of democratic processes."
This article originally appeared on VICE US.