AI-generated synthetic media is being used in a political ad campaign—not to disrupt the election, but to save it.
MIT Technology Review
by Karen Hao, September 29, 2020
The news: Two political ads will air on social media today, featuring deepfake versions of Russian president Vladimir Putin and North Korean leader Kim Jong-un. Both deepfake leaders deliver the same message: America doesn’t need any election interference from them; it will ruin its democracy by itself.
What are they for? Yes, the ads sound creepy, but they’re meant for a good cause. They’re part of a campaign from the nonpartisan advocacy group RepresentUs to protect voting rights during the upcoming US presidential election, amid President Trump’s attacks on mail-in voting and suggestions that he may refuse a peaceful transition. The goal is to shock Americans into understanding the fragility of democracy and to provoke them into taking action, including checking their voter registration and volunteering to work at the polls. It flips the script on the typical narrative of political deepfakes, which experts often worry could be abused to confuse voters and disrupt elections.
How they were made: RepresentUs worked with the creative agency Mischief at No Fixed Address, which came up with the idea of using dictators to deliver the message. The team filmed two actors with the right face shapes and authentic accents reciting the script, then worked with a deepfake artist who used an open-source algorithm to swap in Putin’s and Kim’s faces. A post-production crew cleaned up the algorithm’s leftover artifacts to make the videos look more realistic. All in all, the process took only 10 days. Attempting the equivalent with CGI likely would have taken months, the team says, and could have been prohibitively expensive.
Are we ready? The ads were supposed to broadcast on Fox, CNN, and MSNBC in their Washington, DC, markets, but the stations pulled them at the last minute. A spokesperson for the campaign said they were still waiting on an explanation. The ads include a disclaimer at the end, stating: “The footage is not real, but the threat is.” But given the sensitive nature of using deepfakes in a political context, it’s possible the networks felt the American public just wasn’t ready.