- Deepfake artist Hao Li, who created a Putin deepfake showcased at MIT Technology Review's EmTech conference, told CNBC on Friday that "perfectly real" manipulated videos are just six to 12 months away.
- Li had previously said that he expected "virtually undetectable" deepfakes to be "a few years" away.
- When asked for clarification on his timeline, Li told CNBC that recent developments, including the emergence of the wildly popular Chinese app Zao, had led him to "recalibrate" his timeline.
- Visit Business Insider's homepage for more stories.
A deepfake pioneer said in an interview with CNBC on Friday that "perfectly real" digitally manipulated videos are just six to 12 months away from being accessible to everyday people.
"It's still very easy you can tell from the naked eye most of the deepfakes," Hao Li, an associate professor of computer science at the University of Southern California, said on CNBC's Power Lunch. "But there also are examples that are really, really convincing."
He continued: "Soon, it's going to get to the point where there is no way that we can actually detect [deepfakes] anymore, so we have to look at other types of solutions."
Li created a deepfake of Russian President Vladimir Putin, which was showcased at MIT Technology Review's EmTech conference this week. Li said the video was intended to show the current state of deepfake technology, which is developing more rapidly than he had expected. He told MIT Technology Review at the time that "perfect and virtually undetectable" deepfakes were "a few years" away.
When CNBC asked for clarification on his timeline in an email after his interview this week, Li said that recent developments, including the emergence of the wildly popular Chinese app Zao, had led him to "recalibrate" his timeline.
"In some ways, we already know how to do it," he said in an email to CNBC. "[It's] only a matter of training with more data and implementing it."
Advances in artificial intelligence are making deepfakes more believable, and it's now harder to distinguish real videos from doctored ones. This has raised alarms about the spread of misinformation, especially as the 2020 presidential election approaches.