​Screenshot from the fake video

Hacked News Channel and Deepfake of Zelenskyy Surrendering Is Causing Chaos Online

Broadcast news outlet Ukraine 24 was allegedly hacked on Wednesday, and a faked video of President Volodymyr Zelenskyy is spreading on social media.

Ukrainian television news outlet Ukraine 24 said in a Facebook post that its live broadcast and website were hacked on Wednesday, with a chyron falsely claiming Ukraine had surrendered.

Ukraine 24 posted a warning on Facebook that its broadcast and website were hacked. Adding to the chaos, a deepfake video of President Zelenskyy appearing to tell Ukrainians to surrender began to go viral online at the same time. Zelenskyy himself has since posted a video to say Ukraine will not surrender to Russia.


“The running line of the ‘Ukraine 24’ TV channel and the ‘Today’ website were hacked by enemy hackers and broadcast Zelenskyy's message about alleged ‘capitulation’❗️❗️❗️ THIS IS FAKE! FAKE!” the Ukraine 24 post said.

In a fake video that started spreading on social media on Wednesday, following the Ukraine 24 hack, Zelenskyy appeared to stand at a podium and address the Ukrainian “defenders,” telling them to lay down their arms and return to their families. According to an archived version of the Ukraine 24 website from Wednesday, as well as screenshots shared by journalists on Twitter, a transcription of the deepfaked video's message was visible:

Dear Ukrainians! Dear defenders! Being president was not so easy. I have to make difficult decisions. At first I decided to return Donbas. It's time to look in the eye. It didn't work out. It only got worse. Much worse. There is no more tomorrow. At least in me. And now I decide to say goodbye to you. I advise you to lay down your arms and return to your families. You should not die in this war. I advise you to live, and I'm going to do the same.

The Ukraine 24 website is still down, and no one has claimed responsibility for the alleged hack or the fake video. 

On March 2, the official Facebook account for the land forces of Ukraine posted a warning about deepfaked videos: “Imagine seeing Vladimir Zelenskyy on TV making a surrender statement. You see it, you hear it - so it’s true. But this is not the truth! This is deepfake technology,” it wrote. “This will not be a real video, but created through machine learning algorithms. Videos made through such technologies are almost impossible to distinguish from real ones.” 


The Defense Intelligence of Ukraine has also been warning about deepfaked disinformation since early March, claiming that Russia might try to use AI-generated fake video in its favor.

The video spreading around social media today is actually pretty easy to distinguish from the real thing: Zelenskyy's head looks like it's been pasted onto a still photo of his body standing at a podium. His face moves in a fairly convincing way (it blinks, after all, something deepfake detection experts used to claim wasn't possible for AI to replicate), but it doesn't look natural enough to fool anyone watching for more than a passing glance. The video is paired with an AI version of his voice, which sounds stilted and mechanical.

After the alleged hack, Ukraine 24 posted a (real) video of Zelenskyy speaking to a phone camera to deny the claims made in the hack and assert that Ukraine was not, in fact, surrendering.

"Regarding the last childish provocation, that I’m offering to lay down arms. I can offer to lay down arms only to the militaries of the Russian Federation and to return home. And we are at home already. We defend our land, our children, our families. And we absolutely are not going to lay down our arms. Till our victory," Zelenskyy said in his message, according to a translation by VICE.

U.S. Sen. Marco Rubio also posted Zelenskyy's debunking video on Twitter, suggesting that the Ukrainian president was addressing the deepfake video; Rubio has advocated for stronger internet protections against deepfakes in the past.


On Wednesday afternoon, Facebook’s head of security and policy, Nathaniel Gleicher, tweeted that the company’s platforms were identifying and removing the video wherever it cropped up.

Since the advent of deepfake technology for casual use in late 2017, people have worried that malicious deepfakes would be used for political aims in the U.S.: to start a war, deceive the public, or spread fake news faster than the already rampant disinformation pushed through more rudimentary means like text and fake photos on social media. Earlier this month, people online tried to claim that Russia had deepfaked Russian President Vladimir Putin into videos of meetings, which wasn’t true.

In almost five years, a deepfaked president has yet to trick any countries into launching nukes, but amid near-constant disinformation spreading about Russia’s war against Ukraine, a news outlet hack and a shitty AI-generated video just add to the pile.