Alpha Protocol - From Script to Digital Vision
A fascinating blog update at the IGN blog for Alpha Protocol gives some insight into the process of moving from a written script to the finished, voice-acted game experience. A flowchart of the interaction with a character named Grigori is provided (although it's too small to read any spoiler detail) and seems to show five different outcomes across a large number of conversation nodes and/or actions. Whilst you can't draw hard conclusions from a flowchart you can't even read, it would seem to nicely refute the famous Sony tester comment that the game isn't RPG enough.
Here's an introductory snip:
Mike Thorton walks into a dry cleaner's shop; he hears muffled yelling and calls out, "Hello?"
"In the back." He follows the voice to the source to see a man tied to a chair, tape over his mouth, along with another man - an informant for Thorton in the Taipei hub, Steven Heck - walking toward him with a huge bottle of bleach. Clearly the man tied to the chair is about to be tortured for incredibly important information; yet Thorton needs information of his own from Heck - but he's (obviously) preoccupied. Alas, Thorton needs to say something, and what he says could affect how Heck looks at him from then on. Does Thorton hesitate, offering to come back later? Does he instead offer to help, possibly gaining some reputation points with Heck? Or does Thorton go the professional route and get right to business, thus possibly angering his would-be informant?
The three choices - or Stances, as they are referred to in AP - will come up in every major cinematic in the game. Each choice is paired with a timer, so the player has to decide quickly to keep the conversation moving in the direction they think best. Yet how do we, as the developer, incorporate what can be a spiderweb of choices and reactive callbacks into the game with relative ease?
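To make the mechanic concrete, here is a minimal sketch of how a timed, stance-based dialogue node might be modeled. This is purely illustrative: the stance names, reputation values, and sample lines are guesses based on the scene described above, not data or code from Alpha Protocol.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class DialogueNode:
    line: str                                       # NPC line shown at this node
    choices: Dict[str, "DialogueNode"] = field(default_factory=dict)
    timeout_stance: str = "hesitate"                # used if the timer expires

# How each stance shifts Heck's opinion of Thorton (made-up values).
REP_EFFECTS = {"help": +1, "professional": -1, "hesitate": 0}
reputation = {"Heck": 0}

def choose(node: DialogueNode, picked: Optional[str]) -> DialogueNode:
    """Follow the edge for the chosen stance; fall back if the timer ran out."""
    stance = picked if picked in node.choices else node.timeout_stance
    reputation["Heck"] += REP_EFFECTS.get(stance, 0)
    return node.choices[stance]

# A tiny three-way branch mirroring the bleach scene.
later = DialogueNode("Heck shrugs: 'Suit yourself.'")
helping = DialogueNode("Heck grins: 'Grab a bottle.'")
business = DialogueNode("Heck scowls: 'Kind of busy here.'")
intro = DialogueNode(
    "Heck walks toward you with the bleach.",
    choices={"hesitate": later, "help": helping, "professional": business},
)
```

Offering to help nudges Heck's reputation up, while passing `None` (the timer expiring before the player picks) falls through to the default "hesitate" branch - which is how a spiderweb of choices and reactive callbacks can reduce to a graph walk plus a handful of side effects.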
Information about Alpha Protocol
Platform: PC, Xbox 360, PS3