Guest Contribution by Pierce O’Toole
Writer/Director Pierce O’Toole shares his thoughts on music and sound design, and how they play into his creative process.
As a writer and director, my biggest concern on any project is the story. Every project has a story that you are trying to tell. When I approach sound, the lens I view it through – or the speaker I hear it through, I guess – is one of story. While this is true of every element of the filmmaking process, sound is unlike any of the others because it’s the only element that follows me through the entire process.
When I begin writing, music is very important. At first, it’s just something atmospheric or energetic, like The Album Leaf or Daft Punk. As I get further along in the writing process, I get a better sense of the story and the tone. At this point, the music has to match; if it doesn’t, it can make the writing harder. I build playlists that I listen to on repeat. I’ve had several roommates who hate me for this, especially when the playlist is less than ten songs. I never tire of the music, no matter how many times I listen to it, because that music helps put me in the world of the story. I’m not just listening to the music; I’m absorbing it.
Finding and removing noise (image display from iZotope’s RX 2 Advanced)
As a sound designer, many different thoughts come to mind when considering a topic such as noise: everything from using generated noise, like white noise, in the design of sound effects, to a technical discussion of different types of dither algorithms. But as I kept thinking about noise, one slightly different sense of the word kept coming back to mind. Like something that just won’t be attenuated no matter how hard you try, this question kept creeping back into the forefront of my mind:
How does a sound designer get their “signal” heard through the ever-increasing amount of “noise” that surrounds us (and our intended audience)?
Jad Abumrad at PopTech 2010 – Camden, Maine (Kris Krüg/PopTech via Flickr, used under Creative Commons License)
I recently had the chance to chat with Jad Abumrad, creator and co-host of WNYC’s Radiolab. Each episode of Radiolab explores ideas in science, technology, and the universe at large through a seamless blend of expert interviews, sound design, and music. With co-host Robert Krulwich, Jad has covered topics such as sleep, colors, cities, and loops, just to name a few. Recently, Radiolab has taken to the stage, touring around the United States and adding a visual element to the show’s already imagery-rich storytelling. Jad and I talked about noise, sound’s ability to create powerful mental images, and how all of that translates into a live show.
Designing Sound: I’ll start off by asking you about noise. When I say the word “noise”, what does that make you think? What does it mean to you?
Jad Abumrad: Honestly, the first thing I think is a particular style of experimental music which is loud and abusive and cacophonous and hurtful, but which I very sparingly employ in scoring the show. I’m thinking Merzbow and the whole “musical pain posse” that sort of tumbled out of him. I always like the idea that those stabs and bursts of noise could kind of catch someone off guard, almost like an idea that sort of hits you in the face before you’re ready for it. There’s something about the storytelling we do where I want those ideas to have that kind of impact. So I think about that kind of music.
Guest Contribution by Steven Klein
There are many reasons for the conflicting viewpoints and misinformation surrounding studio design and acoustics. This article will examine the components contributing to the confusion and offer some advice on how to avoid common pitfalls.
Chaos (Merriam-Webster Dictionary): the inherent unpredictability in the behavior of a complex natural system.
We must first realize that science is challenged by chaos. My thesis is that talent supersedes everything. Because talent is immensely abstract, and because it is unknowable where it fits into any analysis, chaos is exposed. Talented people will work in the worst conditions and get great results. The untalented can work in the greatest environments and never produce. The conclusion about what works is therefore ambivalent.
The reality that great music and production can come from adverse conditions leads people away from the true, objective science of acoustics and physics.
Photo by flickr user Carbon Arc, and used under Creative Commons license.
As dynamics month comes to a close, I thought it would be fun to talk about the evolution of film sound media and how they impact dynamics. Since the widespread introduction of sync sound to film in the early 20th century, the technologies involved have changed dramatically. From experiments with wax cylinders and phonographs to magnetic tape and Dolby Digital, each evolution in sound technology has improved fidelity and dynamic range, giving sound designers greater power to create artificial worlds and engage the audience.