Larian Studios has shared a video of motion capture try-outs for Dragon Commander, showing how the facial movements of real people talking are mapped onto in-game characters.
The video is part of an article by Swen Vincke about the process they went through to get where they are now with spoken dialogs.
Nowadays, bringing dialogs to life is no longer a matter of voice recordings alone; it also requires perfect lip synchronization and fitting non-verbal communication for the virtual speakers. The latter is certainly the case when you’re dealing with high-quality character artwork, which is more or less the norm these days.
While cool for players, this requirement is pretty uncool for poor and not-so-poor independent developers, and I’ve already talked extensively about the problems it has been causing us during the development of Dragon Commander. Because there’s a shitload of choice and consequence going on between game turns, the dialog asset requirements in that game are pretty steep, and when we did our initial research on how much it would cost to animate all the dialogs, we came up with numbers that were bananas (between half a million and one million US$ for a single language!).
For a long time, actually, we thought we’d have to resort to plan B: recording and animating only the opening lines, with no animation or voice for the rest of the conversation.
It’s been almost a year since I first wrote about this particular problem here, and I’m really glad to say that we finally cracked it (just in time). It’s amazing what a courageous lead animator who refuses to admit defeat can do if you give him a few good programmers and a bunch of cool cameras.