Computer scientists are warning about a development that threatens to turn fake news into something indistinguishable from the real thing: deepfakes. But what exactly are deepfakes?
What are deepfakes?
Deepfakes — a compound of “deep learning” and, well, “fake” — are videos that use AI to swap faces, add images and adjust audio, creating false evidence that people said or did things they really didn’t.
It’s the consequence of technology that has progressed quickly and become accessible to everyone. Face swapping, once achievable only in a professional FX suite, can now be done by anyone with a smartphone. And although deepfakes are a bit more complicated, they can still be created with simple computer programs. The technique is an offshoot of facial-recognition technology, which is exploding in use: a deepfake program looks for the features two faces have in common and stitches one onto the other.
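To make the “find faces and stitch them together” idea concrete, here is a minimal Python sketch of a naive face swap using OpenCV’s Haar-cascade face detector. The file names are placeholders, and this is only an illustration of the simple cut-and-blend approach; real deepfake tools instead train deep neural networks (autoencoders or GANs) on many images of each face.

```python
# Naive face swap: detect the largest face in each image, then blend the
# source face over the target face region. Illustrative only; not a deepfake.
import cv2
import numpy as np

def largest_face(image, cascade):
    """Return the bounding box (x, y, w, h) of the largest detected face."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        raise ValueError("no face found")
    return max(faces, key=lambda box: box[2] * box[3])

def naive_face_swap(src_path, dst_path, out_path):
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    src = cv2.imread(src_path)
    dst = cv2.imread(dst_path)

    sx, sy, sw, sh = largest_face(src, cascade)
    dx, dy, dw, dh = largest_face(dst, cascade)

    # Resize the source face to fit the target face region, then use
    # seamlessClone to blend it in so the seam is less visible.
    face = cv2.resize(src[sy:sy + sh, sx:sx + sw], (dw, dh))
    mask = 255 * np.ones(face.shape[:2], dtype=np.uint8)
    center = (dx + dw // 2, dy + dh // 2)
    result = cv2.seamlessClone(face, dst, mask, center, cv2.NORMAL_CLONE)
    cv2.imwrite(out_path, result)

# Placeholder file names for illustration.
naive_face_swap("source.jpg", "target.jpg", "swapped.jpg")
```

Even this crude swap hints at why the technology is hard to police: the building blocks — face detection, resizing, blending — are freely available in standard libraries.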
The technology has already been used to create fake celebrity porn, adding mainstream actresses’ faces to triple-X scenes. Others have created fake videos of speeches by President Trump, Russian President Vladimir Putin and Hillary Clinton, using real news footage combined with video from “Saturday Night Live” sketches.
In April, BuzzFeed published a video showing former President Obama apparently calling President Trump names, including “dips–t.” Halfway through the video, director Jordan Peele breaks in to announce that it’s a gag, one he created to warn people about the impending danger of deepfake news.
Peele said the video took 56 hours to make with the assistance of a professional video editor using readily available software.
“It may sound basic, but how we move forward in the age of information is going to be the difference between whether we survive or whether we become some kind of f—ed-up dystopia,” said Peele.
What can be done about deepfakes?
Disconcertingly, the experts aren’t sure yet. CNN reports that tech experts and academics are consulting with social-media companies about how to respond to the threat.
First Amendment experts say that banning the technology isn’t the answer. “From a civil liberties perspective, I am concerned that the response to this innovation will be censorial and end up punishing and discouraging protected speech,” David Greene, the civil liberties director at the Electronic Frontier Foundation, told Mashable. “It would be a bad idea, and likely unconstitutional, for example, to criminalize the technology.”
But politicians warn that there isn’t much time to come up with a strategy to combat deepfakes. “The idea that someone could put another person’s face on an individual’s body, that would be like a home run for anyone who wants to interfere in a political process,” Sen. Mark Warner (D-VA) told CBS in March. “This is now going to be the new reality, surely by 2020, but potentially even as early as this year.”