The sound design of The Matrix is one of the most celebrated of recent years: awesome work by our August featured sound designer, Dane A. Davis. I have therefore decided to split into several parts the information about the sound of The Matrix (1999), the famous science-fiction action film written and directed by Larry and Andy Wachowski.
For this part, I found two amazing articles at Filmsound.org with useful information about the sound of The Matrix. The first is an interview in which Dane A. Davis talks about the film’s sound design:
When did you get involved with “The Matrix”?
About three years ago, when almost all the same crew, including cinematographer Bill Pope, picture editor Zach Staenberg and composer Don Davis, made the first Wachowski brothers film, “Bound,” produced by De Laurentiis Entertainment. Even back then, the Wachowski brothers had a finished script of “The Matrix,” which we read, and while discussing the project we developed an overall concept for its sound design. They shot the film in nine months in Sydney, Australia. I went there in July on their last day of shooting, and worked on the movie until a week before its release. I created the sound effects on an Avid Pro Tools system here at Danetracks, Inc. with the help of effects editor Julia Evershade and of Eric Lindeman, who created many of the gun and helicopter effects. It’s actually a Digidesign Pro Tools system (with many plug-ins), although the two companies merged a few years ago. I also used MetaSynth and SoundHack extensively on this movie. Then John Reitz, Gregg Rudloff and Dave Campbell mixed the effects, dialogue and music tracks on a digital Neve console in the brand-new Stage 6 on the Warner Bros. lot. The entire final mix was a magless, tapeless, drive-based digital mix from original recordings to final printmasters.
Most importantly, there are a lot of fight scenes in “The Matrix” and I made a point of creating composite body-hit and whoosh sounds that had never been heard before, using meat hits and animal vocal sounds as sources. They also evolve from one combat to the next, becoming increasingly animalistic and powerful. I don’t want to reveal the source for them in any greater detail, but I think it resulted in a sound effects experience that audiences really believed.
How did you help the Wachowski brothers realize their futuristic vision?
As filmmakers, Andy and Larry are primarily concerned with storytelling through the characters, and although we knew a film like this offered a great potential for creative sound work, we never wanted the soundtrack to call attention to itself beyond the point that is being made in the story. We wanted the totalitarian computer the rebels battle against to present only enough detail in the reality they encounter to keep them believing in it. Both the visual effects and the sound had to support these shifting levels of credibility, so that when you went to the artificial world of the future there was a deadly, almost mundane detail to the sound effects which contrasts with the more contrived and deliberate ‘real’ world occupied by the rebels in their underground spy-ship headquarters, the Nebuchadnezzar.
One of the unifying concepts of the movie is that everything is motivated by electricity which results in a lot of sparking and zapping in the future scenes. For example, the Neb is driven by electromagnetic propellers so I rented a six-foot Jacob’s Ladder and ran 60,000 volts through it to create the basis for the sound of its engines.
Time shifting plays a major role in the story. How did you use sound effects to emphasize this?
We played a lot of games with the speed of the effects. There is a scene where the police have a tremendous shootout with the rebels in the lobby of a government building and we employed many different time rates for the sound effects to key the audience into how fast Neo’s brain is working as the bullets are flying all around. Off-screen, gunshots would pick up in speed as the visuals went from slow motion to normal, and the bullet ‘fly-bys’ would accelerate as they zoomed across the surround speakers. The idea was to play off the mental aspect of the scene rather than just the physical violence, so at different times different elements would be emphasized. Sometimes, you would only hear the marble columns being smashed by automatic fire and individual chunks of stone flying through the air, while in other instances you could merely hear the guns themselves. It was a tricky scene and we felt that it worked out really well.
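The “different time rates” Davis describes come down to varispeed playback: resampling a recording so it plays back faster or slower (and correspondingly higher or lower in pitch). Here is a minimal sketch of that idea in Python with NumPy; the function name, rates and test tone are illustrative assumptions, not anything from the interview or from the film’s actual toolchain.

```python
import numpy as np

def varispeed(samples: np.ndarray, rate: float) -> np.ndarray:
    """Resample a mono signal so it plays back `rate` times faster.

    rate > 1.0 speeds the sound up (shorter, higher-pitched);
    rate < 1.0 slows it down (longer, lower-pitched).
    Uses simple linear interpolation between source samples.
    """
    n_out = int(len(samples) / rate)
    # Positions in the source signal for each output sample
    positions = np.arange(n_out) * rate
    return np.interp(positions, np.arange(len(samples)), samples)

# A one-second 440 Hz tone at 48 kHz as a stand-in for an effect
# that shifts between slow motion and normal speed:
sr = 48_000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440 * t)

half_speed = varispeed(tone, 0.5)   # twice as long, an octave lower
double     = varispeed(tone, 2.0)   # half as long, an octave higher
```

A gradual acceleration like the off-screen gunshots picking up speed would simply feed a rate that increases over the course of the clip instead of a constant one.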
Which scene was the most challenging for you?
There is a key moment toward the end where Trinity kisses Neo while a battle is raging around them and the problem was maintaining the romantic intensity without losing the dramatic tension of the background conflict. We tried a lot of different ways to keep the sound of the laser beams and metal rending and banging from stepping on the feeling of the kiss, and in the end, we came up with the idea of transitioning the full-on attack into a surreal, deep metallic booming like cannons in the distance while occasionally bringing some mid-range frequencies back in when Trinity pauses in the kiss. The scene was built very carefully in terms of where all these resonating metal hits are positioned throughout the action and it let the intimacy of that crucial kiss build while the battle continued.
Which sequence do you think would have been most different if someone else had been the sound designer?
There is a scene where Neo is being encased by what we called a ‘mercury mirror’ as the computer tries to take him over, and the sounds of his own screams being digitized from his perspective were extremely time-consuming. I don’t think anyone else would have done it just the same way. There are also some evil creatures with mechanical tentacles called ‘Squiddies’ and we created at least 15 raw digital effects tracks for each of them, many involving techniques such as in-line pitch shifting and sequenced samples of screams, screeching bearings and ratchets among other things, to give the feeling of individual terror as they are burning their way into the rebels’ ship.
The second article contains information provided by the sound team of The Matrix, including Dane A. Davis, effects mixer Gregg Rudloff, dialogue mixer John Reitz and music mixer Dave Campbell. Some highlights:
‘Basically we wanted to create all of the sounds for the movie from scratch in order to give it a very unique quality, but we were also dealing with a lot of genres that we really wanted to transcend; martial arts scenes, gun battles, and so on,’ says sound designer-supervising sound editor, Dane Davis, who started full-time work on The Matrix project in July of 1998. This was about a week and a half before the completion of principal photography, which, although this is a Warner Brothers film, largely took place on the Twentieth Century Fox lot in Sydney, Australia. Thereafter Davis used his own Pro Tools-based Danetracks facility in Hollywood, while the mix took place on a Neve and Fairlight-equipped Warners sound stage in Burbank.
‘Pro Tools was used for recording, editing, processing and manipulating all of the sound in the movie–the music, the dialogue, everything–and, aside from some mag stems for one of the temp mixes, tape was never used for any of the post work. That kept everything flexible and efficient, and I also think it added a lot to the clarity.’
‘Another program that I used was MetaSynth, and that really defined the sound quality of a lot of things, giving them an extremely clean and distinct timbre while doing digital processing. I used it on anything that had to feel digital, not wanting to get grainy in an ugly way–except for five or six sounds in the movie that did have to be grainy in an ugly way. In some cases I had to create an audio file and import it to MetaSynth, export it back to Pro Tools and then let it continue with its linear progression.’
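MetaSynth itself is an image-based spectral synthesizer, but the “extremely clean and distinct timbre” Davis mentions is characteristic of spectral-domain processing in general: transform a sound into its frequency components, discard or reshape some of them, and resynthesize. As a rough flavor of that idea (not MetaSynth’s actual algorithm), here is a sketch that keeps only the strongest frequency bins of a noisy signal:

```python
import numpy as np

def spectral_clean(signal: np.ndarray, keep: int = 8) -> np.ndarray:
    """Keep only the `keep` strongest frequency bins and resynthesize.

    A crude, illustrative stand-in for spectral-domain tools: weak
    components vanish, leaving a clean, distinct timbre.
    """
    spectrum = np.fft.rfft(signal)
    strongest = np.argsort(np.abs(spectrum))[-keep:]
    cleaned = np.zeros_like(spectrum)
    cleaned[strongest] = spectrum[strongest]
    return np.fft.irfft(cleaned, n=len(signal))

# A tone buried in noise comes out with the noise floor stripped away:
sr = 48_000
t = np.arange(sr) / sr
rng = np.random.default_rng(0)
noisy = np.sin(2 * np.pi * 440 * t) + 0.3 * rng.standard_normal(sr)
clean = spectral_clean(noisy)
```

Round-tripping material between Pro Tools and a spectral tool, as Davis describes, follows the same export-process-import pattern.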
Configured for dialogue, music and effects, the all-digital, all-automated DFC console is set up in four tiers, with each fader capable of up to an 8-track predub. ‘We did a 6-track mix, so all of my predubs were in 6-track form,’ explains Rudloff. ‘The six channels consisted of left-centre-right, a left surround, a right surround and the sub information. I wasn’t using the faders of each tier; just one layer had the 6-track predub, but I was using multiple layers for other things. Depending on how you set it up and what you’re using the signals and routing paths for, the board can provide up to 500 paths.’
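The six channels Rudloff names are the familiar 5.1 arrangement. As a small illustrative model (the channel ordering here follows a common SMPTE-style convention, which is an assumption; the interview only names the channels, and the fly-by panner is entirely hypothetical), a predub can be thought of as one buffer per named channel:

```python
import numpy as np

# The six channels of the predub Rudloff describes: three screen
# channels, the sub (LFE), and two surrounds.
CHANNELS = ("L", "C", "R", "LFE", "Ls", "Rs")  # assumed ordering

def empty_predub(seconds: float, sr: int = 48_000) -> dict[str, np.ndarray]:
    """One silent buffer per channel, ready to receive mixed elements."""
    n = int(seconds * sr)
    return {name: np.zeros(n) for name in CHANNELS}

def pan_fly_by(predub, signal, start_frac: float) -> None:
    """Crude front-to-back 'fly-by': crossfade a signal from the screen
    channels into the surrounds over its own duration."""
    n = len(signal)
    fade = np.linspace(0.0, 1.0, n)          # 0 = front, 1 = rear
    offset = int(start_frac * len(predub["L"]))
    sl = slice(offset, offset + n)
    predub["L"][sl]  += signal * (1 - fade)
    predub["R"][sl]  += signal * (1 - fade)
    predub["Ls"][sl] += signal * fade
    predub["Rs"][sl] += signal * fade

mix = empty_predub(1.0)
whoosh = np.hanning(4800)                     # 0.1 s placeholder envelope
pan_fly_by(mix, whoosh, start_frac=0.2)
```

This is only a sketch of the routing idea; a real dub stage handles panning, levels and bussing on the console itself.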
It is an approach that Gregg Rudloff refers to as ‘see a bear, hear a bear’. ‘Sometimes that makes a really big difference,’ Davis continues. ‘I don’t ever use synthesisers, whether we’re talking software or hardware, and even though I have tons of them, unless the thing on the screen is a synthesiser, and I apply that same principle to creatures such as the robots in this movie. I didn’t want them to make a sound that seemed like it was being made for the benefit of humans, and, while that’s a guiding principle in all of my work, in this movie it was a law. If a sound makes the audience think about somebody creating that sound, then it’s the wrong sound.’