I’m still playing Star Wars: The Force Unleashed II. Are you? As usual, LucasArts delivers a really great-sounding game, worthy of belonging to the saga. Here is an interview I had with two of the minds behind the sound of this fantastic game: Lead Sound Designer Brian Tibbetts and Technical Sound Designer Damian Kastbauer (implementation).
DS: How did you get involved in “Star Wars: The Force Unleashed II”? How long did it take to develop all of the sound work for the game?
BRT: I was hired in 2007 as a Senior Sound Designer for The Force Unleashed and worked under David Collins, who was the Lead. As I mostly worked in my auxiliary cubicle in the middle of the development team and producers during that project, I worked closely with every discipline and got a lot of face-time with the team and executives. When TFU2 began manifesting, I was the natural choice for Lead, especially as David had been promoted to a supervisory role. TFU2 was a greatly accelerated project and took the same amount of time you would expect from any triple-A game.
DS: Can you explain how you collaborated with producers, designers, and other members of the team regarding sound?
BRT: Collaboration and asset and dependency tracking are always challenging on any project. This is especially true for audio, as we are at the end of a long assembly line of dependencies. Like on TFU1, I chose to have my office in the main area of game development and always had an open-door policy regarding communication with other disciplines. There were many meetings regarding asset changes, and in general communication at LucasArts is good between disciplines. I’ve always stressed that we should work together as much as possible, and there were many moments of myself and the sound designers working directly with designers, artists, and producers at their desks or ours. So much of the magic that happens on a game is between people working together, and the spirit of problem solving is pervasive here. Face-to-face discussions and collaborations are always better than phone calls and emails!
DS: What were some of the sound challenges in developing the second installment of “The Force Unleashed”?
BRT: Our greatest challenges on this project involved keeping up with the massive amount of iteration due to the accelerated 9-month schedule. There are many different ways to integrate our audio assets, including scripting or placing sound emitters directly inside environment art, and our work was unfortunately blown out many times. Toward the end of the project, a couple of brilliant engineers helped me build an e-mail messaging system that would notify us seconds after any audio reference had been changed or removed. This helped a lot, especially as the responsible parties didn’t realize or intend to blow us out and were more than happy to help resolve the situation. By the time this tool was built, though, we had already had to re-author and re-integrate excessively, which is always frustrating. For the next projects, we are investigating the creation of a more solid layer of abstraction for audio which will make such scenarios highly unlikely or impossible.
DS: How loyal were you guys regarding the legacy of sounds and classic sonic icons established by the Star Wars franchise? What did you do to stay creative while developing new things without losing the original concepts of Star Wars’ sounds or being too influenced?
BRT: We’ve discussed this at length within our department over the years with much enthusiasm. In general, we try to make as many new or updated assets as possible, but with certain classic ‘assets’ like a TIE fighter or lightsaber, using anything but the classic sounds simply doesn’t work. We’ve experimented, though! That being said, Ben Burtt and others at Skysound have created higher-resolution, modernized versions of the classic sounds for the prequels, Clone Wars, and simply for the Skysound archives. We work closely with Skysound and are always open to finding a new version of the classic iconic sounds to add variety and/or increase resolution. But the leads generally encourage the audio team to get as creative as possible and not rely on the classic sounds. In regards to making new sounds that maintain congruity, we have a deep understanding of Ben’s tricks and effects and have done a good job of creating new assets with that same feel. But we also like to get crazy and freak out on interactivity, in addition to using Sound Toys, GRM Tools, etc. at the content level. That’s always fun!
DS: Did you use sounds from previous Star Wars games? How much new material was needed?
BRT: Yes, we used many of the sounds from The Force Unleashed I. In general though, all of the character sounds, scripting, boss sounds, and environmental ambiences were brand new for this title. It was mainly physics sounds, movers (like doors and elevators), and some of the Force powers that were legacy, although we updated many Force powers to be higher resolution and surround-sound. Overall I’d say 70% of the content on this title was brand new, designed by myself, Aaron Brown, Tom Bible, Erik Foreman, David Collins, Julian Kwasneski, and Damian Kastbauer.
DS: Star Wars games always come with amazing stories. This game is no exception, especially in the cinematics. Could you tell us about your approach to the sound of these cinematics?
BRT: The Jedi Master of Cinematics here is David Collins. He loves working on them, did a fantastic job on TFU1, and brings his passion for all things Star Wars to each and every one. He was also the main voice director for them all and worked closely with us during the Foley tracking at Skysound. So I guess my approach was to ask David nicely if he would rock them out once again, and to do all that I could to empower him to delve in as deeply as possible. It seems to have paid off, as they turned out wonderfully!
DS: How was the work on the weapons and powers? How did you handle the different sizes and performances of the blasters, lightsabers, big spaceships, special powers, etc.?
BRT: Some of the weapons and Force powers were legacy assets from TFU1 or the films, but even so, Wwise gave us increased flexibility with regards to how we could further extend the existing content. For the most part, Tom Bible owned the weapons and Force powers and Aaron Brown owned the big spaceships and other special events, which were usually scripted. I helped out a bit, as did Erik Foreman, but for the most part I simply served in a supervisory role for these sounds and tried my best to drive the vision of the title as conveyed to me by Project Lead Julio Torres. By the end of the title, the larger events and special scripted areas of gameplay had undergone much iteration. We continued to tune it all up until the very end in an attempt to make it all sound as cool as possible.
DS: And what about the big robots and the creatures? Any special approach to the sound of these elements?
BRT: Tom Bible, our character-authoring master, had this to say: “In general I was trying to hit the extremely high bar that Ben Burtt had set with all his character vocals and imbue them not only with a really unique sound and personality, but also get a great ‘performance’ out of them as well. For the Flame Thrower and Carbonite droids, I went back to the approach I take to creating bass lines in electronic music and applied it to their vocal sounds. I used a Nord Modular vocoder and bass synth and ran various vocal performances through them to create the deep, resonant, metallic sound that’s in the game.”
DS: And what about dialogue? How did you deal with this huge amount of voices? Also, could you talk about the processing applied to the different characters?
BRT: Dealing with the huge amount of voice files in the midst of an accelerated development cycle was challenging. Not only did we have all of the main characters, scripted events, and “In Your Ear” dialogue, but we also created a brand new A.I. Voice system which allowed the NPCs to have conversations with each other and uniquely react to situations based on a variety of gameplay data. The system turned out great but essentially quintupled the amount of voice files that we needed to record, manage, and implement. This A.I. Voice system alone accounted for 15,000 files in English. It was a gigantic amount of work and a collaborative effort among numerous disciplines, including the A.I. Engineers. Not only did David Collins voice-direct most of it (along with LucasArts Audio Director Darragh O’Farrell), but he helped compose the scripts for each character type as well. Editing, mastering, and processing all of this was a monumental effort involving both our internal team and external contractors. As for specific sound design-y processing, we typically treat intelligible spoken words (in English or FIGS) as “Dialogue” while we treat all other sounds coming from a “mouth” as “Sound Effects”. We have standard tricks for such things as Stormtrooper processing, but unique sounds from droids are always unique, utilizing whatever weird sound design techniques the sound designer chooses.
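Reactive NPC dialogue systems like the one described are commonly built as “bark managers”: each gameplay event maps to a pool of voice-line variations, and a cooldown keeps NPCs from repeating themselves. The sketch below is purely illustrative of that general pattern; the class, file names, and cooldown scheme are invented for the example and are not LucasArts’ actual system.

```python
import random

class BarkManager:
    """Picks an NPC voice line for a gameplay event, avoiding immediate repeats."""

    def __init__(self, lines_by_event, cooldown=3):
        # lines_by_event: {event_name: [voice_file, ...]}
        self.lines = lines_by_event
        self.cooldown = cooldown      # how many recent picks are held out of the pool
        self.recently_played = []     # most recent picks, newest last

    def pick(self, event):
        pool = self.lines.get(event, [])
        fresh = [line for line in pool if line not in self.recently_played]
        if not fresh:                 # every variation on cooldown: allow repeats
            fresh = pool
        if not fresh:
            return None               # no lines authored for this event
        choice = random.choice(fresh)
        self.recently_played.append(choice)
        self.recently_played = self.recently_played[-self.cooldown:]
        return choice

# Hypothetical event and file names for illustration only.
mgr = BarkManager({"enemy_spotted": ["st_spot_01.wav", "st_spot_02.wav", "st_spot_03.wav"]})
line = mgr.pick("enemy_spotted")
```

In a real game this lookup would also be keyed on character type and gameplay state, which is how a modest script per archetype fans out into the thousands of recorded files mentioned above.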
DS: What’s new in terms of implementation systems and methods? What were the new challenges in the second installment?
Damian Kastbauer: For the first Force Unleashed there was a perfect storm of technology, including multiple physics simulations, behavioral animation, and new game and audio engines tooled for cross-platform development. Getting all of the systems to talk to each other was the greatest challenge. Moving into TFU2, it became more about building upon and keeping pace with the dramatic amount of detail that would be added to make gameplay more fluid and the player experience pay off with sound. So, the challenge implementation-wise was to move the quality and satisfaction of the interactivity forward while smoothing out the authoring pipeline to allow for rapid iteration toward greater quality. Some of the biggest successes came in the form of visual scripting tools, which helped us wire sound for custom scenarios; an interactive state-based mix system; and soundbank management that allowed us to cram 20-30% more sound into each level, in addition to the use of the Wwise audio middleware tools and functionality. The team of audio engineers, led by Neil Wakefield, really enabled the design side to dream big. Their support and unfailing commitment to great sound was one of the single greatest assets during development, and it led to a very detailed representation of sound in the Star Wars universe.
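A state-based mix, in general terms, can be thought of as a table of per-bus gain snapshots selected by game state: entering combat ducks ambience and brings up music, a cinematic ducks everything under dialogue, and so on. The following is a minimal, engine-agnostic sketch of that idea; the state names, bus names, and dB values are invented for illustration and are not the actual TFU2 mix.

```python
# Each game state maps to a snapshot of bus gains (in dB). Changing state
# retargets every bus at once. Values here are made up for the example.
MIX_STATES = {
    "exploration": {"music": -6.0, "ambience": 0.0,   "dialogue": 0.0, "sfx": -3.0},
    "combat":      {"music": 0.0,  "ambience": -9.0,  "dialogue": 0.0, "sfx": 0.0},
    "cinematic":   {"music": -3.0, "ambience": -12.0, "dialogue": 3.0, "sfx": -6.0},
}

class StateMixer:
    """Selects a per-bus gain snapshot based on the current game state."""

    def __init__(self, states, initial):
        self.states = states
        self.gains = dict(states[initial])   # current per-bus gains in dB

    def set_state(self, name):
        # A real engine would interpolate toward the new snapshot over time;
        # here we snap instantly for clarity.
        self.gains = dict(self.states[name])

    def bus_gain(self, bus):
        return self.gains[bus]

mixer = StateMixer(MIX_STATES, "exploration")
mixer.set_state("combat")   # entering combat ducks ambience and boosts music
```

Driving the mix from game state like this is what lets a single set of authored levels stay intelligible across very different gameplay moments without hand-mixing every scene.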
DS: Could you explain what kind of tools you used for implementation and your experience working with them? What about the physics system? Interactive mixing?
Damian Kastbauer: LucasArts moved from an internally developed audio engine on TFU1 to Audiokinetic’s Wwise for audio on TFU2. The value of having the Wwise toolset from the beginning, versus the simultaneous development of a proprietary toolset on TFU1, was immediately apparent in how quickly sound was up and running in the game. When you’re working with great content made by the sound designers, the best you can hope for is to have the sounds play back appropriately in order to sell the moment and allow the intention of the sound designer to show through. With that in mind, working within the audio tools becomes an artistic pursuit to unlock the potential of every wav file in the game, whether that means setting appropriate pitch-randomization properties or developing a switching system that will allow you to convey the differences in a given physics impact or explosion. The Wwise toolset was consistently up for the task at hand, and frequently presented more than one way to solve the various playback problems that spring up. Aaron Brown worked extensively with the mixing functionality of Wwise and established the network of busses and ducking rules that governed the eventual playback of sound in the game. While everyone took their turn contributing to the mix, having one person responsible for managing the mixer at the end of the project really helped everything coalesce when it was time to ship.
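The switching and pitch-randomization ideas mentioned above boil down to a simple pattern: one logical “impact” event resolves to a concrete sample based on switch values (surface material, intensity), with a small random pitch offset so repeated hits don’t sound identical. Here is an engine-agnostic sketch of that pattern; the surface names, velocity threshold, file names, and pitch range are all invented for illustration, not taken from TFU2 or the Wwise data.

```python
import random

# One logical impact event fans out to different sample pools depending on
# a surface switch and an intensity bucket derived from impact velocity.
IMPACT_BANK = {
    ("metal", "soft"): ["imp_metal_soft_01.wav", "imp_metal_soft_02.wav"],
    ("metal", "hard"): ["imp_metal_hard_01.wav", "imp_metal_hard_02.wav"],
    ("stone", "soft"): ["imp_stone_soft_01.wav"],
    ("stone", "hard"): ["imp_stone_hard_01.wav", "imp_stone_hard_02.wav"],
}

def post_impact(surface, velocity, pitch_spread_cents=200):
    """Resolve a physics impact to a concrete sample plus a randomized pitch offset."""
    bucket = "hard" if velocity > 5.0 else "soft"   # illustrative threshold
    variations = IMPACT_BANK[(surface, bucket)]
    sample = random.choice(variations)
    # A small random pitch offset (in cents) keeps repeats from sounding identical.
    pitch = random.uniform(-pitch_spread_cents, pitch_spread_cents)
    return sample, pitch

sample, pitch = post_impact("metal", 7.2)
```

In middleware like Wwise the same decision is authored as data (switch groups and randomized properties on containers) rather than code, which is what lets sound designers tune it without engineering support.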
DS: What are your favorite tools to work with when designing sounds in the studio/field?
BRT: Though we all have different preferences and toys, I personally adore the Sound Toys suite of plug-ins as well as GRM Tools. Of course, we all use Serato’s Pitch ’n Time, Altiverb, and many of the Waves plug-ins often, but my favorite plug-in is probably GRM’s RESON. I created most of the hologram and forcefield sounds on TFU2 using this bad boy. As for DAWs, we all use Pro Tools HD on Macs.
DS: Was there any unique or unusual field source utilized for this title?
BRT: Actually, yes! I’ve lived on an organic dairy farm up in Marin for a few years now, and there are all sorts of amazing sounds happening all the time, which means I will run outside with my Zoom H2 at all hours of the day or night. Right next to our house is the special pen for the sick cows. One cow was particularly vocal one night during TFU2, and we ended up using her screams as one of the layers of the Gorog boss. For this same creature, we also recorded some really huge and heavy chains up at Skysound, the same chains used in the classic film “A Christmas Carol”. Our Skysound contractor Erik Foreman and I randomly saw them in a recording booth and *had* to track them. Super heavy, and my back hurt for a few days!
DS: Finally, I’d like to know more about your work as a sound team at LucasArts. For instance, how do you typically work and collaborate together?
BRT: We are all good friends here at LucasArts and share a deep passion for audio. We are constantly striving to learn as much as possible about the sounds, tips, and tricks of the Star Wars world, but we are also huge geeks about sound, gaming, film, and perception in general. One of our department heroes is Walter Murch, whom we had an opportunity to meet recently. It was fascinating, and we discussed it at length afterwards. We eat lunch together, have a weekly meeting to discuss current projects, and also hang out offsite from time to time. It’s a great experience to be surrounded by like minds of such great talent. We collaborate often on trailers, cinematics, and Foley sessions, and also try to get out into the field together as much as possible to gather new source. And of course, we try to get out to Skysound whenever possible to hang with our friends up there and drool. It’s a wonderful team, and it has been a deeply inspirational experience to be a part of it. :D