Let’s get started with the Erik Aadahl Special. Here is Designing Sound’s exclusive interview with Erik, talking about some aspects of his career, his personal thoughts, his main tools, and more.
Designing Sound: To start, please give us an introduction to your career and how you got involved with sound design.
Erik Aadahl: As a kid, I played piano and started composing music using MIDI. This was in the days of the original Sound Blaster audio card for PCs. I did some music scores for friends and myself through film school at USC, and wound up working at the sound department my junior year. I remember the vibe there at the time was so awesome–great camaraderie with the eclectic group of funky sound weirdos, heaps of old vintage gear and ancient 16mm magnetic tape dubbers, floor-to-ceiling reel-to-reel analog beasts that lined the machine room. We had something called the “Machine Room Olympics”, where students could show off their skills at bouncing plastic film cores into trim bins. Those were the last days of analog, as ProTools quickly took over the department. The USC School of Cinematic Arts is now completely remodeled and unrecognizable.
Some of the titans of sound design, like Ben Burtt and Gary Rydstrom, also came out of the USC film program. That in itself was a huge inspiration to me.
I graduated in ’98, and a lucky connection from the film school got me a job as an assistant sound editor working on NYPD Blue and a variety of made-for-TV movies, loading dialogue on the night shift. I learned how to operate a Fairlight editing system on the job, and a year later was sound effects editor on the first seasons of Law & Order: SVU and Family Guy. On the Dune miniseries I met Jay Wilkinson, who introduced me to the feature film world and the sound department at Fox Studios. A few years later I went freelance. Since then, I’ve worked at almost every studio here in Los Angeles on one project or another.
DS: Did you have a mentor early on? What do you consider your main piece of learning?
EA: Absolutely. Mentors are so important. My first mentor was Tim Nielsen in film school. He was my T.A. in sound class. After that, Andrew Ellerd showed me the ropes in the television world: how to lay out sound effects in the sessions, how to deliver to the mixing stage, how to print cue sheets for the re-recording mixers — the basics. Jay Wilkinson showed me how to edit in the film world, and Craig Berkey taught me artistic simplicity, which I think was a critical lesson. Sound effects recordist John Fasal, who recorded the jets on Top Gun, taught me field recording on almost every movie I’ve worked on. And Mike Hopkins and Ethan van der Ryn taught me to dump everything I’d learned right out the window. You have to know the rules before you can break them!
Music is also a big influence. The bands Radiohead and Boards of Canada are probably my biggest sonic inspirations. Some of their abstract and evocative sounds have very directly inspired my film work.
But there’s nothing as vital to learning as just going through it. Finishing a movie can feel like going to war and back; it’s mentally and physically exhausting at times. There’s no substitute for experience, the weeks and years sweating in the trenches. Making mistakes is often the best way to learn, and I’ve made my share of those. There isn’t a movie that goes by where I don’t learn something new, and part of the fun is challenging oneself to find that “new” thing to explore.
DS: How has your relationship been with the directors of the movies you’ve worked on? Do you have any special stories about a director? Generally, do you think directors should give more importance to sound in a film?
EA: Every director is different, with their own tastes and style of working. The two most involved directors I’ve ever worked with when it comes to sound are Michael Bay and Terrence Malick. Ironically, their films represent polar opposites in the cinema world. But what they share is an understanding of how critical sound is to film. I think Michael Bay puts it best when he says that sound is 50% of the experience.
As Frank Herbert put it, “the beginning is a very delicate time.” When you first start on a movie, you have to hit the ground running and gain the confidence of the filmmaker.
I remember the first time I met Michael Bay on Transformers, he grilled me on what my credits were. I’m pretty young for what I do, and I could tell he was testing me. The way he works with sound, he doesn’t give a ton of direction. He likes you to make it as kick-ass as possible, and then present it to him for his reaction.
Typically, the very first week of working on a film I will design the sound effects of a sequence, do a mix, and give it to the picture editor to integrate into his AVID cut, muting his old temp effects.
The first Transformers scene I hit was the “Witwicky House” scene, where the Autobots visit Sam’s suburban home and cause havoc. You can imagine my relief when Michael listened to my first design pass and loved it. If you nail it in the beginning and gain that confidence, the rest is downhill. After that, we pretty much had free creative rein, with the occasional “make those explosions more varied” or “is that all you got?”. We’d present scenes about once a week, until we started mixing, at which point Michael would let us do our thing and show up for reel playbacks in the Cary Grant Theater at Sony Studios.
Terrence Malick was a whole other experience. I first met him while working with Skip Lievsay and Craig Berkey on The New World. The first hour, he talked about his sound philosophy. Our work should be “simple but bold, like a Japanese painting; three brushstrokes”. Like the “ocean”, the sounds should constantly change and evolve, as if the audience member were “slowly accelerating on an inclined plane” like a “train on tracks”. It was up to us to interpret that philosophy into real terms. We found that instead of doing hard cuts on scene changes with our sound, we would use 40- or 50-foot fades from one background atmosphere to the next.
When we hit the final mix stage and he heard our finished sounds, he noted that the pacing of his film had changed dramatically. So he took our track, went back to the cutting room, and re-tuned his picture cut to our sounds. His is a very holistic approach to filmmaking.
I’ve worked with Bryan Singer several times, and he has a different approach. Bryan doesn’t know what he wants until he hears it, and even then he’s always questioning until the very end. He’s a very careful listener; I’ve witnessed him snap at people on the mixing stage for creaking a chair or opening a door during a playback (I was the culprit at least once) because it pulls him out of his concentration. He wants to be firmly lost in the experience of his film.
On I, Robot, Alex Proyas had another approach entirely. He wanted tons of options to wade through and pick and choose. For the NS-5 robot, we went through maybe 100 options before he decided on a bubbly sound I made for upper-torso movements, and a synthetic whine Craig Berkey made for the leg motors.
Directors who are really involved make our jobs so much more fun. When they think about sound as integral to the filmmaking process, ideally at the script level, it elevates the experience for us as artists, and for audience members too.
DS: When and how do you typically find your inspiration? Any influences?
EA: Ideas come in all sorts of ways. I’ve had epiphanies after a drink. I’ve been to Burning Man several times and that has always yielded a new way of looking at the world. I’ve woken up in the middle of the night, grabbed a notepad and filled it with ideas to use the next morning. A lot of great and unexpected ideas come from field recording, serendipitous accidents that I never would have thought of otherwise. I’ve spent weeks making little progress on something, taken a break, and then come back and completely re-done all my work in one hour, and that’s made it to the final mix.
The best is to have lots of time up front, before even starting your sound editing. If the schedule can afford it, I love to take a few weeks designing without even looking at picture; recording and manipulating and processing, building a library of fresh design. One weird sound might give you inspiration for another. Then, when you finally put those abstract sound ideas up against the image, a whole new alchemy takes place that can lead to the next level.
I think the most important thing is to never decide that I’m done. If you stumble on something better later, don’t be afraid to scrap the old stuff and go with the new. The track should be a living organism, alive and evolving to the very end. That’s the fun of it.
DS: What is the best advice you could give to a sound designer (both beginners and experienced)?
EA: I saw an interview years ago with the great Ray Charles. He told a story about a young musician asking him “how many tracks do you use for a song?” His answer was amazing: “I don’t care if you use 70 tracks, or 1 track… how does it SOUND?”
The most common mistake for sound designers who are starting out is over-complicating the track, using too many layers. This is often done out of insecurity. I’ve found that every time you add another layer to a sound, you often strip the first sound of its power, essence and simplicity. The most potent sound is the single perfect sound played against silence. I think of it like painting: you start mixing too many colors of paint, and it all turns brown. But if you keep it simple, it stays potent and raw and bold.
Also, I find working early and quickly to be really important. Don’t waste too much time, even if you’ve got lots of it. Get a sketch of a sequence done quickly, with the big building blocks in shape, and then go in with a detail brush and finesse it. The worst is wasting hours on one little detail and missing the big picture of the scene.
Another technique I use is “pre-visualization”. Or maybe “pre-auralization” is a better term. I like to make the sound and rhythm with my mouth, and then match that. It sounds silly, but it works. At the very least, it will lead to more ideas. And worst case scenario, you can simply record your voice and use that!
DS: We’ve all had bad sound design moments. You know, when you’re creating a sound or a sequence, or mixing, and it just isn’t working. How do you find a solution? Do you have a special method for those moments?
EA: There’s no question we all have those moments where a sound or a scene just isn’t working, and there can be a million reasons. Maybe there’s too much going on, maybe there’s not enough going on, maybe the sound is wrong, or maybe the picture simply doesn’t look good and it’s dragging the sound down with it. The first important step is figuring out why it’s not working. That makes finding a solution easier.
On the design end, if something isn’t working it’s usually because it’s the wrong sound. The solution is simple: put in the right sound! The trick is finding the “right” sound. It’s part thought process, part gut-level intuition. The way I usually try to get the “right” sound is to find the right thing and record it. This is my most basic strategy; to think of a thing to record that is the perfect source. I don’t always find it right away, but inevitably, even if it’s the wrong sound, it will give me more ideas and put me down the right path.
I recently had to design some wicked witch broom flying sounds for an upcoming movie. I started with jet recordings, which seems like an obvious analog; both brooms and jets slice through the air, dopplering past us in similar ways. But they sounded too “jet-like”, and any additional processing I’d apply to them just made them even more treated-feeling and less organic and real. So I thought, what is physical that I can record that has speed and doppler and sounds real? The question bugged me for days until I heard on the radio that we finally got some snow in Wrightwood, which is just outside of Los Angeles. And it hit me: skiing. So we took a day off and hit the slopes with our recording rigs, and that became the sound of the flying brooms. This sort of thing happens on every movie, where you hit a snag and serendipity presents a solution.
Some people feel that recording sounds fresh for a movie is cumbersome and time-consuming and should only be done in moderation for certain sounds. They’ll fall back on library for the bulk of it. I feel the exact opposite: recording everything fresh is far more efficient–you get the right sound, with the right performance, tailored specifically for that movie, and wind up cutting your editing time in half because you’re not sifting through tons of the wrong stuff and trying to force it to work.
If the “right” sound I need is a completely synthetic one, the only limit is imagination and the ability to reproduce it.
The bottom-line is to never be afraid of destroying your work and starting over. If you have the time, that is.
So that’s the design end. Then there’s the mix.
A very common problem is poorly timed rhythms. Maybe the percussion in the music is competing or “flamming” with my robot footsteps. A simple sync-slip of a drum hit might solve that. Many problems can be solved by simply lining sounds up, clearing room between the sounds and improving the dynamics.
In my experience the most common problem is that there’s Too Much Sound, or as we frequently call it on mixing stages: “TMS”. I’ll be the first to start removing sounds to clean out the clutter. When there are a lot of sounds going on at once, we tend to lose definition and singularity, and that’s a really common problem on the final mix stage when all the elements finally come together. So I constantly ask myself “how can I simplify this?” and find the meat of the moment. If an explosion doesn’t have any punch, maybe there’s something cluttering up the attack. Perhaps the whoosh leading into it needs to cut out earlier to give a beat of silence, clearing a hole for the upcoming hit. Maybe we can afford to drop all of the backgrounds during the car chase because all they’re doing is adding a “wash” that’s destroying our clarity and precision. Maybe we don’t need to hear the motorcycle engine revving once the gun starts shooting. Subtracting sounds to make a track sound better is just as much an art as making the sounds in the first place.
DS: What are your favorite tools in the studio?
EA: My favorite tool is my recording rig. About 90% of the time I use a Sound Devices 722 with a Neumann 191 MS stereo shotgun microphone. That mic has an extended frequency range, making it great for 192 kHz recording, which gives me more flexibility to pitch, twist and manipulate the recordings later.
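As a quick illustration of why that headroom matters (a minimal, hypothetical sketch, not Erik’s actual workflow): a 192 kHz recording captures content up toward 96 kHz, so you can slow it down two octaves and the ultrasonic material falls back into the audible band instead of leaving a gap. The Python snippet below sketches that kind of “varispeed” pitch-down; the file names are made up.

```python
# Hypothetical varispeed pitch-down sketch (not from the interview):
# keep the samples, declare a lower playback rate, and the recording
# plays back slower and lower in pitch.
import soundfile as sf

data, sr = sf.read("field_recording_192k.wav")   # assumed source, sr == 192000

# A quarter of the original rate = 4x slower = two octaves down.
# Content recorded near 96 kHz now sits around 24 kHz, still audible.
sf.write("pitched_down_two_octaves.wav", data, sr // 4)
```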
When I design or edit, I monitor on a Meyer Sound HD-1 speaker setup, which gives me accuracy and confidence in what I’m hearing. I can’t stress enough the importance of trusting what you hear as accurate, because too many near-field 5.1 setups have unnatural frequency response curves.
I do my editing in ProTools. My favorite plugins are from Waves, GRM Tools, Sound Toys, Native Instruments and U&I Software. My sounds are cataloged and given metadata using the Soundminer search engine. I have an Ensoniq KT-76 keyboard for playing samples or taking breaks to play a little music.
But my favorite tools are sound props, which I collect on every movie. My room is littered with instruments like Tibetan singing bowls, aluminum tuning rods, a gong, wood blocks, magnets, kazoos, a bagpipe, bull-roarers, wind chimes, a theremin and anything I’ve bought on eBay over the years to record. Part of the fun of this job is searching for any unique or weird tool that can be performed like an instrument. Last week we found a great Tesla-style electrical conductor on the internet, which we’ll use for zapping sparks and energy on an upcoming super hero movie.
DS: You’ve worked on many films across different genres. What is your favorite genre to work on for sound? Are you interested in any specific challenges for your next projects?
EA: I’ve done a number of science fiction films, so I’d be a masochist not to enjoy working in that genre. I’m a huge fan of Arthur C. Clarke, Isaac Asimov, Orson Scott Card and Frank Herbert, and was so inspired by them as a kid that until college I expected to wind up working in a science field. So I guess through science fiction I can sort of vicariously live that out. Sci-fi films give such freedom to unleash from reality and get really wild and abstract.
But I think I’d go nuts if that was all I ever did. I feel really lucky to be able to switch genres, bouncing between realistic and unrealistic films. I recently finished up work on a Terrence Malick drama that was incredibly refreshing and emotionally gratifying.
One thing I’ve been thinking about a lot is technology to three-dimensionally sculpt sound effects, the way a sculptor shapes clay. On Transformers: Revenge of the Fallen I developed a hardware technique that took a big step in that direction, and I’d like to continue evolving it.
I’d also love to get the chance to get out of Hollywood a little more and work internationally. We live and work in a bit of a bubble here, and I’m curious about how our side of the art is practiced overseas. A few months ago I spent some time traveling through Cambodia and Vietnam, recording every day. I wouldn’t mind at all doing more of that.
DS: Everything changes by the day, and there is constant evolution in every aspect of technology, art, etc. Talking about film, what do you think the next step will be? What do you think is lacking?
EA: Obviously 3-D is a big leap and it’s clear audiences are embracing it. I heard Randy Thom recently mention that sound has been 3-D for years, and he couldn’t be more right. But on the other hand, we’ve had the same standard for way too long, and the advent of 3-D technology on the picture end is a great opportunity for us in the sound world to ride on those coattails and push for the next standard.
For me, that means several things. First, higher quality playback in theaters; we’re only recently breaking the 48 kHz glass ceiling with Digital Cinema, but few if any mixing stages are generating 96 kHz stems and printmasters.
Second, channel count. 5.1 is the standard. Two surround channels are barely adequate in this day and age. The imaging is nowhere close to as precise as it could be. Give me a ceiling channel too. And I can’t describe how irritating it is to go to theaters and hear my surround material coming out of the sides of some theaters. I recently checked out a demo of a sound mixing and exhibition product called “Ionosound”, utilizing over one hundred speakers and claiming to spatially pinpoint sounds precisely in three dimensions within a theater. The imaging isn’t perfect, but it’s a big step in the right direction and I’d love to see us adopt it, or a version of it.
DS: What are your favorite films for sound?
EA: Well, there’s the classics: The Conversation, Apocalypse Now, Star Wars, Tron. David Lynch consistently blows me away with all of his movies. Recent design goodies are Jurassic Park and Lord of the Rings. I’m a big fan of Master and Commander, which I think is one of the tastiest sound editing and mix jobs in recent years. I was also really blown away by Hurt Locker this year, which used sound exactly the way it’s supposed to; to put you right there in the moment. Kudos to District 9 also.
DS: And what about video games? Do you play video games? Would you like to work in game audio some day?
EA: I’ve been playing video games since the Commodore 64. I dedicated years of intense practice to the Mario Kart games by Nintendo. These days, I play mostly racing games on the PS3, like Dirt 2 and Motorstorm.
My friend John Fasal has been recording vehicles for games for years, but I’d never worked on one until recently, designing robots and guns for Front Mission Evolved. Games have a whole other layer of complexity when it comes to the mix engine, which was designed by Double Helix Games audio director J.P. Walton. It was fantastic fun and I would work on a game again in a heartbeat.
DS: Finally… what is coming next in your career?
EA: I recently finished working on The Tree of Life for Terrence Malick. These days I’ve been having a blast on Shrek 4 with Ethan Van der Ryn, and I look forward to a couple more Dreamworks Animation films on the horizon. Next year we’ll get started on Transformers 3. We don’t know what it’s about, but we’re already collecting sounds.
Will Dearborn says
This was absolutely fantastic. Love this guy. He’s worked on so many of my favorite audio movies and that definitely isn’t a coincidence. Keep up the good work!
I really hope the hint about his upcoming work with broomsticks means Harry Potter 7!
Will Dearborn says
I’d also love to know what he thinks of 7.1 surround (which I have at home and love for the near-360 sound field), and also maybe about the quality of surround sound in modern games. One of the highest-grossing games of last year (and ever), Modern Warfare 2, had a 4.1 track, and it drove me batty that there was no center channel for dialogue. It threw the whole thing off. Okay, lastly, what kind of audio setup (for home theatre playback) does he have at home?
Erik Aadahl says
Hi Will, thanks. Sorry, it’s not Harry Potter :) I try to listen to movies in theaters–my favorites are the Arclight black-box cinemas here in Hollywood–but my home system is a humble Onkyo HT-S9100THX 7.1 which does pretty good. It’s no match for my Meyer HD-1s in the studio though. Having a center channel is definitely important, and a room with dead acoustics doesn’t hurt either. Cheers!
Jim Stout says
Thanks for checking in Erik! Awesome stuff!!
RichardDevine says
Hello Erik, awesome interview. Huge fan of your work! :)
Will Dearborn says
Thanks very much for answering my questions Erik. I have a similar Onkyo setup, exciting. :D
pepe lopata says
Hi Erik, how do you go from sound editor to designing a major Hollywood film in just 4 years? 4 years is a really short time period. Was it your MPSE awards that opened doors for you, or was it the old adage of being in the right place at the right time? Describing your brief history really shows that you met the right people along the way. I find your career ascension rather inspiring. Good work.
Ryan says
Dear Erik,
Great article. I’ve been following your work ever since Transformers. I’ve also been a huge fan of Ethan van der Ryn since Lord of the Rings.
I work as a resident recordist/mixer at a studio in L.A. and I do everything at 96K and I personally hate mixing at 48K. When I read that you record at 192 that blew me away.
A few questions, if I may:
How and when do you downsample to send it to the mixing stage?
Do you premix at 192, 96 or 48?
Do you ever record ambiences with the Holophone or any other surround mic, or do you always use your Neumann?
Jadarrian Edmond says
I admire your work and the passion you have for sound design. I’m a video production major at the Art Institute of Atlanta and am currently working on a sound design presentation about you and your work. I’ve followed your work for a while, and I know you will continue to surprise me.
Tom Blazukas says
Great inspiration for beginners like myself. This article made my day. Hope our paths will cross one day. Keep up the good work!
T