How Deepfake Lips is a Game Changer for Dubbing
When watching a foreign film that has been dubbed, there is a certain feeling of detachment between the audience and the actor. The mismatch can be distracting, as the actor’s mouth movements are completely out of sync with what they appear to be saying. This is the problem a company called Flawless hopes to solve with an AI-powered tool that matches the actor’s mouth movements to the language the film is being dubbed into.
When we think of deepfakes, we often think of manipulating the entire face, but Flawless’ technology focuses on just a single element: the mouth. Dubbing is becoming increasingly important for Hollywood as media companies target consumers worldwide with their own streaming services.
Netflix, for instance, already dubs in 31 languages and aims to bring more foreign-language originals to English-speaking audiences. A tool that can manipulate an actor’s mouth to match the dubbed language is therefore a game-changer.
Flawless co-founder Nick Lynes puts it this way: “When someone’s watching this dubbed footage, they’re not jolted out of the performance by a jarring word or a mistimed mouth movement. It’s all about retaining the performance and retaining the original style.”
The technology still has some perfecting to do, as it isn’t yet 100% flawless. It works best when the camera has a clear view of the actor’s mouth and the actor remains mostly still. But the technology is still relatively new, and it should grow more capable of handling complex scenes.