A fake video purporting to show Ukrainian President Volodymyr Zelenskyy telling his soldiers to surrender made the rounds on social media earlier this week before being scrubbed from several platforms.
In a video posted to Telegram, the real Zelenskyy rejected claims that he had called on Ukrainians to surrender, saying he is still in Ukraine and defending it from Russian invaders.
Nonetheless, the fake video still reached a substantial audience. The Ukrainian TV network Ukraine 24 said hackers placed the video on its website and inserted Zelenskyy's fake message into the scrolling news ticker at the bottom of the screen. The message also spread across Russian social media sites like VK, a platform similar to Facebook used predominantly by Russian speakers.
Fortunately, the fake Zelenskyy video was poorly made and easy to spot. But that won’t always be the case.
For years, security professionals around the world have warned about the nefarious use of deepfakes, video and audio manipulated with machine learning, often to deceive people into believing public figures have said or done things they never actually did.
Unfortunately, deepfakes aren’t going away. In fact, the technologies used to create them are being refined and becoming more widely accessible, which has inevitably led to more widespread use.
For example, the same artificial intelligence technology that allowed someone to make the fake Zelenskyy video is also being used to improve dubbing of foreign-language films.
And the face-swapping technique used in some deepfakes to superimpose someone's talking head on another person's body has already become widely popular thanks to augmented reality, or "AR," powering "face swap" features on apps like Snapchat.
So-called face-swap apps are so easy to build that tutorials on making them are widely available online.