Many thanks to Brad Dyck for contributing this interview. You can follow Brad on Twitter @Brad_Dyck
Gordon Durity is the Executive Audio Director of the EA Audioworks team, which supports the audio development of the upcoming Need for Speed release available on November 3rd, 2015 for PS4 and Xbox One (PC due in the spring of 2016). I’d like to extend my thanks to Gordon for sitting down to chat with me.
Brad Dyck: Could you describe some of the responsibilities you deal with day to day?
Gordon Durity: I look at all the titles that I’m in charge of – all of the sports games, Need for Speed, Plants vs. Zombies and mobile products – just to keep track of where everything’s going as far as audio content and quality. I do R&D as well, looking at where our technology is headed, what’s out there competitively, what we’re building in-house, what we need to build for emerging platforms, and what we need to refactor to make things work better. Because we’re a central team, I spend time with the senior leaders of the titles we service, whether it’s FIFA, Madden or Need for Speed, just to make sure that we’re completely aligned with our dev partners.
I do strategic planning as well, which basically means looking at the coming title plans and seeing what resources we have. I look at who we bring on as far as contracted help in all areas of the game, and at the talent we have that’s growing. I do a lot with the local schools and with any potential candidates that come across our table and look promising. We do interviewing, we check out demo reels, we go to various shows, conventions, GDCs and all that kind of stuff to keep tabs on where the talent is out there.
The other thing I do is keep in touch with all of the other senior audio people at other companies. It’s a small, small community – we’re not at each other’s throats, we’re trying to move the whole industry forward – so I regularly consult with Sony, Microsoft and folks at other companies just to see where things are at and to make sure that, professionally, the whole idea of audio for games is moving along and growing with the rest of the industry.
BD: Things like loudness standards…
GD: Yes, loudness standards, mix standards, how things fold down on the consoles – basically the consumer experience at the end of the day. We’re trying to have professional standards like they have in television or film. Whether it’s a Sony product, Lionsgate or Warner Bros., when you put in that Blu-ray disc or watch that TV show, you generally expect a certain standard of audio as far as fidelity, loudness, and how things play back in your speaker set-up. With games it’s still a bit of a crapshoot, so if we can all align on some sort of professional standards it just makes it a better experience for the consumer.
BD: Do you feel like you’ve achieved that?
GD: There’s great buy-in. There are professional loudness standards that have been adopted this generation that Microsoft’s bought into – it’s in their development recommendations for audio. Sony has been sort of leading the charge; Garry Taylor from Sony Europe has really been pushing it forward. We’ve done stuff at GDC and other conferences where EA, Sony, Microsoft and Activision have all bought into certain standards. It’s in the actual platform documentation. It reads like a technical requirement – it’s not enforced, but it’s a best practice, for sure.
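For readers curious what checking a mix against a loudness target actually involves, here is a minimal sketch using the open-source pyloudnorm library (an ITU-R BS.1770-style meter). The -24 LUFS target is illustrative only, not a figure quoted from Gordon or from any platform's documentation.

```python
# A minimal loudness check, assuming the third-party pyloudnorm and soundfile
# packages are installed (pip install pyloudnorm soundfile).
import soundfile as sf
import pyloudnorm as pyln

TARGET_LUFS = -24.0  # illustrative target only, not any platform's official number


def check_and_normalize(path):
    """Measure a mix's integrated loudness (ITU-R BS.1770-style metering)
    and return a gain-adjusted copy sitting at the illustrative target."""
    data, rate = sf.read(path)                    # samples as float, plus sample rate
    meter = pyln.Meter(rate)                      # BS.1770 K-weighted meter
    loudness = meter.integrated_loudness(data)    # integrated loudness in LUFS
    print(f"{path}: {loudness:.1f} LUFS (target {TARGET_LUFS} LUFS)")
    return pyln.normalize.loudness(data, loudness, TARGET_LUFS)
```

This is only the offline, mastering-style version of the idea; in-game mixing to a standard is obviously done inside the engine rather than on exported files.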
BD: Did you have a lot of those kinds of discussions with first-party companies when the next-gen consoles were about to come out?
GD: Yep, we did, and with our tech group here as well. EA has an in-house technical audio team that’s pretty significant, so they deal a lot with first party to talk about the audio specs that are coming down the pipe and what specifically we would like to see. The first-party console makers will poll all of the major players, all the companies in the marketplace, and they’ll take those suggestions into account as well, so we’ve had direct feedback with Sony and Microsoft as far as what we would like to see happening in the box before it ships. And then ongoing with whatever SDK updates need to happen. So absolutely, we do consult with them and they consult with the community.
BD: Could you tell me about some of your earliest memories doing sound design for games?
GD: This is going to go way back, but I started off doing sound in games – not for a game company – back in the Commodore 64 days. My brother and I splurged on a Commodore 64 and we managed to win a shoebox full of software from a local company, Wizard Computers in Vancouver. In there was a disk with Ultrabasic and this weird little cartridge thing where you could go in, actually hack into games and put your own soundtrack into them. We ended up breaking into some EA games, actually (laughs), using PEEK and POKE. I’d go in there using the little SID chip on the Commodore 64 and was able to hack my own soundtracks into video games, which was kind of cool at the time.
But after that I really didn’t do stuff in video games for quite a while. I was mostly in music and film doing post-production, composition and sound design. I graduated from SFU in Interactive MIDI systems, and some of the people I met there became the tech department at EA Canada in the late ’90s. So through connections I was able to support a couple of their needs on contract. For instance, I knew FM and MIDI really well, so I was able to go in and work on a SNES title. That was amusing – it was almost like using a MOD tracker type of thing, which is like MIDI, and there was a tiny, tiny 11K, 8-bit sample. Then the rest of it was all a real-time FM synthesis sort of thing.
That was fun because part of my degree was in that, but it really wasn’t until the PS1 days that I got back into it. I was doing a lot of film post and thought, “The beeps and squeaks are kind of fun, but I don’t really want to jump into this until there’s something closer to Red Book audio.” Then the PS1 came along and it was cool to be able to do some stuff with audio manipulation, so I jumped in there and started getting back into games at Radical Entertainment. I’ve been sort of in it since then, so 20…25 years? It’s getting on in time but yeah, it started off there and I’ve worked with every console after that, as well as PC.
BD: Do you still have time to do hands-on sound design or composing?
GD: At EA I will contribute something on most titles I work with. That could be mixing, tuning, sound design or other post-production types of things. There is always something new to learn creatively.
BD: Is there a reason that the company is pushing for more titles to be on Frostbite?
GD: Yeah, so I think the reason – and this has been spoken about by our senior leaders in the company – is scale. EA is a large company with a lot of developers all over the world, and being on the same technical platform is a huge boost in being able to share resources and scale. If you need some help or expertise to come in at any given moment, you can take somebody who has done Need for Speed, for instance, and then leverage them to do work on the engines in Battlefield, Hardline or whatever it is, and do it very, very simply. It’s the same tech platform and the same understanding. They can come in, get up to speed very easily and contribute to a title in a much more efficient way.
Also, if we are all on the same platform we can sort of push in the same direction as far as getting things built, instead of how it was in the past. When you have multiple platforms within the company you’re always at odds with keeping multiple technical bases up and running, so there’s a technical deficit there. If you’re running two or three different systems then you have to support those systems and you have to improve them, instead of working towards one system where all of the effort goes into making that as good as it can be. So it’s both things – shareable tech as well as shareable experts.
BD: Is there any danger in trying to do too many things with one engine or is that not too much of an issue?
GD: Well, it depends. There are certain specializations that happen when you have a custom engine, but EA doesn’t make one type of game, it makes a vast variety. Because we’re making everything from a sports title to a shooter, we’re trying to create the tool to be able to create all those titles, and the tool is getting more generalized as well. So it’s similar to Unreal: even though the engine was made for Gears of War, people are doing more than just a straight-up shooter with the Unreal engine these days, so it just opens up the flexibility a bit more.
BD: Is there any possibility of bringing back some of the old systems you’ve used, like RUMR for SSX?
GD: Yeah, all things are possible. We have the tech base for some of the systems we’ve used on older games and that can all get written in. Some of that technology can come across more easily than others depending on what gets integrated into the new system. Some things have already come across right away. The speech system for sports integrated very, very easily – we shipped it in PGA and that was a good first experiment, which is being developed even more. The engine system for Need for Speed came from the actual NFS engine and that got integrated into Frostbite quite nicely. Now it’s called Octane and it has done amazing things on the latest Need for Speed. So yeah, a lot of older tech can get bundled up and brought in; it’s just a question of how it’s done, what the overhead is to bring it over, and what the use case is for that tech.
BD: Are there any games, EA or otherwise, that you think stand out when it comes to audio?
GD: Battlefield is awesome sounding. Battlefield is one of the best-sounding shooters, if not the best-sounding shooter. When I listen to it I hear all the care and work that goes into actually making those characters sing. One of my pet peeves with all video games – whether it’s a shooter, a sports game, a driving game or whatever – is that you always want to perfect what I call the 80%. So when you’re playing a game, what do you hear 80% of the time? If it’s a shooter, it’s the character. You’re this guy with all this gear running around talking, doing stuff. The physicality, the feet, the Foley, that’s there all the time. And I would say a lot of the time that’s some of the weakest parts of the sound playback. It gets repetitive, the feet sound like Pinocchio, you can pick up these weird loops – but Battlefield really nails the natural feel and sound of that character.
When you put a backpack on there should be a change in sound because they’ve got more weight on their body. The footsteps should be heavier. When you get more exhausted it shouldn’t sound as though you’ve just jumped fresh out of a plane. If you’ve been running and doing stuff there should be more panting, there should be a different pace. The gait of the character should change when you’re standing up and sitting down – there are all sorts of little nuances that we take for granted in film, for instance, because that’s such a continuously modulated, natural sound that’s happening. But in a game everything has to be purposefully calculated for – whether it’s procedural or just sample tagging, someone has to think it through and make it feel as realistic as possible – so I think in Battlefield they’ve done that really well with a lot of the Foley on the character.
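As a toy example of the "sample tagging" approach Gordon contrasts with procedural audio, a footstep picker driven by character state might look something like the sketch below. Every tag, threshold and parameter here is hypothetical, not taken from any EA system.

```python
import random

# Hypothetical bank of recorded footstep variations, keyed by a state tag.
FOOTSTEP_VARIANTS = {
    "gravel_light_normal":   ["gravel_l_01.wav", "gravel_l_02.wav", "gravel_l_03.wav"],
    "gravel_heavy_normal":   ["gravel_h_01.wav", "gravel_h_02.wav"],
    "gravel_heavy_dragging": ["gravel_hd_01.wav", "gravel_hd_02.wav"],
}


def pick_footstep(surface, load_kg, fatigue):
    """Choose a footstep sample from character state. Every tag and threshold
    here is made up for illustration; a real game drives this from far more
    parameters (gait, stance, speed, surface wetness, and so on)."""
    weight = "heavy" if load_kg > 20 else "light"
    pace = "dragging" if fatigue > 0.7 else "normal"
    tag = f"{surface}_{weight}_{pace}"
    # Randomize among the recorded takes to avoid the repetitive 'Pinocchio feet'.
    return random.choice(FOOTSTEP_VARIANTS[tag])


# Example: a loaded-down, exhausted soldier on gravel.
print(pick_footstep("gravel", load_kg=35, fatigue=0.9))
```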
FIFA is amazing from a sports game perspective. It’s hard to distinguish it from an actual FIFA match that you’d hear in a home theatre, for instance. Getting authentic crowds from the region it’s in, getting the right announcers and the right cadences – there’s a lot of work that goes into mining that data, so that’s another one that I think completely nails it… there are a few out there.
BD: Did you have a chance to play Star Wars: Battlefront?
GD: I played Battlefront way back in the early days. I haven’t played the latest beta but I’ve seen the evolution of Battlefront. I know the work that the team at DICE has done with Skywalker Sound, and the amount of detail they’ve gone through to keep it authentic is incredible. There’s a really tight, close relationship with LucasArts and Lucasfilm to make sure that you do not stray out of that universe. The Star Wars universe is like the FIFA football universe on the sports game side, where fans are so anal, so focused on the detail, that just getting close doesn’t count. The announcer on a sports game has to say the right thing at the right time in the right way, otherwise… you can nail it 99.9% of the time, but that one time it doesn’t work? Ah! It’s the uncanny valley of sound. They’d rather hear something that’s so wrong and comical it doesn’t make sense, but when you’re trying to simulate something it’s got to be dead on or else it takes people out of it. They’re not forgiving at all.
In the Star Wars universe – now that’s an awesome sound gig, but it’s also a terrifying sound gig, because the sounds are part of a whole generation. It’s part of the DNA of modern sci-fi, so the authenticity of how those creatures and weapons sound, how you modulate the lightsaber, and how the TIE and X-Wing fighters all sound has to be super authentic. Plus you have the added overhead of making it a real-time playable experience. In a movie when something’s flying you can do all the stuff, you’ve got all the whooshes that are just baked in, but if you’re sitting in an X-Wing fighter you’ve got to start it, you’ve got to hover – what does that sound like? Two minutes in, when you’re hovering and moving around, how are you going to change the sound? In the movies it’s just steady-state sounds and they’re doing all the action, but in a game you’ve got to constantly figure out, “If this thing was real, how would it really work and still stay authentic to what it’s perceived to be in the movies?”
BD: You have to invent new sounds based on old ones.
GD: Yeah, and you also have to think about the behaviour. In the movies you don’t spend any time going in – he fires up the X-Wing fighter, sits back, he throttles it up, and by that point you’d be bored. But in a game you’re sitting there firing up this machine, turning it on, moving around, talking to people, and when you’re spending 15 minutes in that thing it’s got to feel like a real, responsive vehicle – which you don’t have to do in the movies because it’s just action and cut scenes.
BD: Do you think EA has any plans for developing VR?
GD: Yes, the official word on VR is there’s research going on. VR, AR, and what’s the new one? MR. Mixed reality. That’s like Magic Leap – that’s where virtual objects actually interact with real objects in the world.
BD: HoloLens type stuff.
GD: Yeah, and beyond. So that stuff absolutely is at play. It will happen. The official stance from EA is that they’re taking a measured approach to looking at what that is. Research is going on; it’s just that there’s nothing immediately put out there and developed in any way, shape or form as far as I know. Beyond that, from a sonic point of view – we’re absolutely supporting Dolby Atmos in our engine. We do lots of multichannel-to-binaural encoding, so that’s in stuff that we’ve shipped already, but we’re also looking at real-time targeting: how do we target the user’s unique head and HRTF responses to their ear shape and face? Nobody takes that into account, but it matters a huge amount when you’re looking at getting the right sort of binaural response for a particular listener. Also, what about head tracking? That’s a major thing. You can have stuff going on in the world, but if you don’t tie sound directionality into head tracking then you’re not really going to get a sense of being in that world.
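As a rough illustration of the two ideas Gordon raises – choosing an HRTF response for a listener and tying directionality to head tracking – here is a minimal Python sketch. The hrir_bank structure and the horizontal-plane-only maths are assumptions for clarity; this is not EA's pipeline, just the general shape of head-tracked binaural rendering.

```python
import numpy as np
from scipy.signal import fftconvolve


def head_relative_azimuth(source_dir_world, head_yaw_rad):
    """Rotate a world-space source direction into head space using the tracked
    head yaw, returning an azimuth in degrees. Horizontal plane only; a real
    system would use a full orientation quaternion and include elevation."""
    x, y = source_dir_world
    c, s = np.cos(-head_yaw_rad), np.sin(-head_yaw_rad)
    hx, hy = c * x - s * y, s * x + c * y
    return float(np.degrees(np.arctan2(hy, hx)))


def render_binaural(mono, source_dir_world, head_yaw_rad, hrir_bank):
    """Convolve a mono source with the nearest measured HRIR pair.
    `hrir_bank` is a hypothetical dict of azimuth (deg) -> (left, right)
    impulse responses; any HRTF dataset could be adapted to this shape."""
    az = head_relative_azimuth(source_dir_world, head_yaw_rad)
    nearest = min(hrir_bank, key=lambda a: abs((a - az + 180) % 360 - 180))
    hrir_l, hrir_r = hrir_bank[nearest]
    left = fftconvolve(mono, hrir_l)
    right = fftconvolve(mono, hrir_r)
    return np.stack([left, right])  # stereo buffer, one row per ear
```

A real renderer would interpolate between measured responses, handle elevation and distance, and – the point Gordon keeps returning to – ideally personalize those impulse responses to the individual listener's head and ears.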
BD: How’s the progress on HRTF? Is it something that devs are able to use today?
GD: We’ve been doing it for a while. We’ve developed our own HRTF response algorithm that was actually optimized a couple of years ago for mobile. We wanted to make sure it was very fast, very efficient, and it works. It’s still an early version. Now we’re looking at incorporating height into the mix, so as we roll out with VR, Atmos etc. we want to get a sense of verticality going on. I’ve used and researched every single algorithm, API and method out there for HRTF for about 15 years now; it’s one of my pet things to do. I know what’s good, what’s not good, what’s real, what’s kind of fluff, so it’s exciting to see things move forward. But there’s the age-old problem – I always ask people: so how are you targeting the HRTF impulse response to that unique listener’s head and ear shapes? Without that, you can get a generalized effect but you can’t get a customized effect.
From a games perspective, if you’re doing something where the user has to spend a lot of time going through tests, you’re going to lose people. Because how many people even want to sit there and check the gamma on their screen? So with audio it’s a little different: how many people want to sit there and go through a listening test to get measurements so you can target the right impulse response for them? It becomes an accessibility thing. People want to put the game in and have the magic happen as quickly as possible, so I’m looking at various ways we can do that – I’m not going to say what they are, but they’re in R&D – so we can get that unique user head and ear shape in a painless and entertaining way.
BD: What do we have to look forward to with the new Need for Speed game?
GD: The new Need for Speed game has some of the most authentic engine modelling I’ve ever heard, anywhere, any time. The process and the technology being used is pretty advanced. So much of the game is being able to tweak and tune your car in crazy ways. The fact that you can take engine parts from one car, attach them to engine parts of another car and get a new hybrid-sounding engine is amazing. So in the game you can actually do that. You can take a Ferrari engine, put on some Acura blowers and an exhaust from a Toyota Camry, and that combination creates a unique sound that is literally a blend of all of those things. So that’s a pretty amazing tool right there.
Apart from that, I think the experience is just more visceral. Here’s the thing: it’s literally Need for Speed – it feels as though you’re driving really fast. The graphics are beautiful, and with the sound that goes along with it, I think it all ties together to make a pretty amazing experience for a driving game.
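To make the "blend of all of those things" idea concrete, here is a deliberately simplified sketch of mixing per-component engine loops – an intake from one car, an exhaust from another – into a hybrid sound. The layer and gain names are hypothetical, and the actual Need for Speed engine tech Gordon describes is far more sophisticated than a weighted sum of loops.

```python
import numpy as np


def blend_engine_layers(layers, gains):
    """Mix per-component engine loops (say, intake from one car, exhaust from
    another) into one hybrid loop. `layers` is a hypothetical dict of mono
    loops at the same sample rate; `gains` weights each component."""
    length = min(len(loop) for loop in layers.values())
    out = np.zeros(length, dtype=np.float64)
    for name, loop in layers.items():
        out += gains.get(name, 1.0) * np.asarray(loop[:length], dtype=np.float64)
    peak = np.max(np.abs(out))
    return out / peak if peak > 0 else out  # normalize to avoid clipping
```

In practice the blending would happen per RPM band and in real time rather than as a one-shot mix of loops, but the notion of the hybrid sound being a weighted combination of its donor parts carries over.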
BD: Going back to what you said about it not being good enough to get 99% there and how people who are focused on the details will make their voices heard – I think that also applies a lot to Need for Speed.
GD: Absolutely, and in the tuner and car world the engines are number one – you hear the engines all the time, so absolutely. The accuracy of the engines, how that technology creates them and the way the engines are recorded is very, very authentic, and as I said, the fact that you can tune these engines and hear the actual changes is amazing. Apart from that, there’s the rest of the car performance, the world, how the shifting works and all that stuff, so a lot of detail has gone into making each car. And then with the upgrades, that’s the magical thing: you’ve got this whole toolbox of things you can do to these cars to customize them. You can get cars that literally could never be put together in the real world, unless you were extremely wealthy, I suppose. But you can actually experience what it would be like to create these totally customized hybrid cars, race them and get a sense of the feel and sound of them.