Bad lip-syncing in dubbing and subtitles can put off audiences and hurt the box office takings of foreign films. AI may be about to change all that.

Start-up Flawless AI, co-founded by film director Scott Mann, has a tool that it says can accurately recreate lip sync in dubbing without altering the performance of the actors. The tool studies how actors move their mouths and swaps the movements out according to the dubbed words in different languages, making it seem like Tom Hanks can speak Japanese or Jack Nicholson is fluent in French.

Mann was inspired to come up with the tool when he saw how dubbing affected the narrative cohesion of his 2015 film "Heist", starring Robert De Niro.

"I hate dubbing as it stands," he told Reuters. "You have to change so many things to try and catch sync. You're changing words filmmakers and performers have thought about so deeply. They're thrown out to find a different word that fits, but it never really does."

Mann decided to do something about this conspicuous mismatch. After some research, he discovered a white paper by Christian Theobalt of the Max Planck Institute for Informatics.