The Philips Pavilion, based on hyperbolic paraboloids that Iannis Xenakis originally used in his musical piece Metastaseis
“Sound is a spatial event, a material phenomenon and an auditive experience rolled into one. It can be described using the vectors of distance, direction and location. Within architecture, every built space can modify, position, reflect or reverberate the sounds that occur there. Sound embraces and transcends the spaces in which it occurs, opening up a consummate context for the listener: the acoustic source and its surroundings unite into a unique auditory experience.”
The spatial metaphor
Over the years, the relationship and analogy between music/sound art/sound design and architecture has been explored in several respects. Just as architecture works with solid materials, visual spaces, geometry, abstract realities, or social contexts, it also works with aural realities, the sonic dimension. When it comes to space, sound can be valued in an architectural process, just as architecture is also sonic.
When it comes to music, there has been debate about the validity of the analogy between musical space and architectural space. Sound design relates to both concepts in its own way, since it doesn't rely on a fixed language as some music does, and it remains open to the contexts in which it evolves or is developed, such as a film. Space is immensely important in sound design, both in the visual/outer spaces projected by a particular audiovisual medium and in the inner, abstract, or invisible dimensions of a piece such as a film or a video game. This introduces the possibility of creating architecture with aural elements, just as the visual aspect creates its own spaces and objects.
A few months ago I came across a Twitter post made by Stephan Schütze (a recent Designing Sound contributor) that continues to resonate with me (no pun intended) and I wanted to share it with anyone in the sound design community that has yet to hear these sounds.
As a side note, Stephan’s tweet was unrelated to his Designing Sound contribution (which can be found here) that he wrote for our monthly theme dedicated to Vehicles.
The original Twitter post was for an article entitled:
NASA Probes Record Sounds In Space – And It’s Terrifying.
I was immediately enthralled as soon as I heard the sounds. Contrary to my previous beliefs, outer space actually does produce sound, and the sounds are quite remarkable.
If you made it to the Designing Sound mixer we held during the AES conference in New York last year, you may have met Neil Benezra. Neil is a Brooklyn-based sound designer and mixer, and he's just shown up on the cover of the latest issue of CineMontage (the Motion Picture Editors Guild journal). We're always happy to see members of our community being recognized. Why not go give it a read? ;)
Image by: Marcio Eduardo Rodrigues
Listening is the most important skill a sound designer has, and yet, it's probably the one that's the most ephemeral and difficult to nail down. What is listening? Are we born with this skill, or is it something that we can learn? Listening is the process that takes the information that we hear and makes meaning from that sound. To listen requires a conscious effort, and it's this effort that you can learn how to train. Some blind people have learned to listen so well that they can echolocate: we have a remarkable ability to hear all kinds of things in our environment that most of us simply miss.
You’re reading this article, so you’re someone who is probably already listening to sound more than your friends. Maybe you’ve gone to a movie and stepped out with your friend afterwards and said “wow, what great sound in that film” and your friend gives you a blank stare and says they didn’t notice. But we can always improve our ability to listen, to focus our attention on sound. I’d like to take you through some exercises I do with my students when I teach sound design so you can build your listening skills.
Blind Man’s Buff, by Eugène Pierre François Giraud
Silence! Be quiet! Because listening is active, because the birds have already left but their sound still reverberates. Silent, all ears that listen, stunned by the noise that is gone but still lingers. The soundtrack? Our life! The life of changes, transitions, mutations and mysteries; the one able to peer into the recesses of the deepest realities, responsible for questioning the apparent manifestations of the abstract and the concrete to reach unexpected territories of consciousness. These are the realities of sound phenomena, the challenges of searching for a continuous vibration, a pure sonic experience.
Let the mind travel around 2,500 years back: we're here in the Pythagorean School, waiting for the teacher to lead us into the most unlikely truths of the cosmic harmony. Our eyes are eager, the heart rumbles, and a curtain, the veil of listening, can be seen on the horizon. Suddenly, a voice is heard; the teaching begins. The eyes, still expectant, cry for the face of the talking master, who is not (and will not be) on the retina. The curtain is still there and is the only visual reference for the sounds being heard. The voices possibly emerge from the cloisters of the mind, or perhaps from the very shadows on the curtain, where the teacher continues his mission.
Silence! Be quiet! Because the sound is active, the akousma has emerged and the sonic code is already running through the mazes of the passions and the cusps of thinking. Slowly and without seeing, the oral reality becomes symphony, opening the doors to an intimate universe: the acousmatic. The teaching behind the curtain now makes sense, and invisibility brings a message to the cochlea, impatient because of its blindness. Over time it calms; the world of sound is clear, and the government of tongue and thought becomes possible, and with them also the desires and the scars of those memories that, despite being absent, still strike the listener's soul.
And so, behind the curtain, sitting in silence, the initiation begins.
Exercising listening in a public outdoor space.
Sound designers by nature have an inherent curiosity towards sound. We explore the way sounds work every time we approach a project. With each new opportunity to design a sound, we ask ourselves questions such as: What object/event produced the sound(s)? Where is the sound source located in relation to the listener, and just as importantly, how does (or how will) the sound impact an audience’s emotional state when heard?
It goes without saying that the sheer act of producing our own sonic work, and of critically listening to and dissecting the works of others (as Berrak Nil Boya explored and expanded on in her recent post), will inherently make us stronger and better critical listeners. But along with these practices, it is invaluable to also step away from evaluating completed, produced works and critically listen to some alternate sound sources, and in some potentially new ways. Just like exercising a muscle, the more angles from which you can target your critical listening "muscle", the stronger and more well-rounded it becomes.
The question then must be, other than by evaluating an already existing game or film’s audio as it was intended, how, and what, can we listen to in order to hone our listening abilities?
This post looks to add to this conversation by offering a few exercises I’ve picked up and augmented over the years and still use to this day. Once again, just like any exercise routine, training your critical listening is an on-going responsibility for any sound designer (though vitally important early in your career, continued practice is essential to maintain a high level of critical listening fitness).
Back around the time I was first starting out, I remember opening up a demo of Cubase VST (on my trusty PowerMac 6400) and taking a look through the various menus. Everything seemed pretty standard, but something in particular caught my eye: a menu item labeled "Ears Only". Curious, I clicked on it, only to have my monitor go completely blank. After a few seconds of panic, thinking I had broken everything, I realized that Steinberg had programmed a mode that completely disabled the monitor and forced you to just listen. At first, this option seemed like a strange addition. Why, when I'm creating sound, would I not be listening to what I'm doing? Listening while working with audio seemed like a no-brainer. However, after gaining a little more experience, this "just listen" mode began to make a lot more sense.
There’s a short and interesting post on the Sound Reflections blog by La Cosa Preziosa, and it ties nicely into our theme this month:
One of the benefits of our tight-knit recording community is the availability of dialogue and exchange on the subject and techniques of recording. What do you use and how do you use it? What tips have you got? Any questions? There is certainly no shortage of websites dedicated to the subject and forums to air our views in — the first being Twitter, of course! Recording chat is plentiful among us recordists. But what about the other end of the recording process — the listening?
Head here to continue reading.
Indiewire has published a guest post by Dolby Institute’s director Glenn Kiser in which he talks to filmmakers about the importance of sound design from the beginning of production.
Making a movie is a never-ending series of compromises, and nothing is as good as the original concept you had in mind. But if you’re really lucky, there’s a moment of alchemy that can happen in the editing room when you put the right piece of music or the right sound effect into the cut. Suddenly something magical happens, and the thing comes to life. You forget about the perfect location you couldn’t secure and the cold your lead actor had on the day you shot the emotional scene. It stops being a maddening litany of disappointments and becomes a movie.
Guest Contribution by David Nichols
An engine is, in essence, an air pump. Air comes in, gets mixed with fuel, goes bang, and leaves again. When talking about ways to make more power, the most obvious is to make a bigger bang. However, gasoline burns best at a very specific ratio of air to fuel, roughly 14:1 by mass. So, if you want to make a bigger bang, every extra part of fuel needs roughly 14 extra parts of air to go with it.
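The arithmetic above can be sketched in a few lines. This is a back-of-the-envelope illustration (not from the original article); the 14:1 figure is the rounded stoichiometric ratio for gasoline the author mentions:

```python
# Back-of-the-envelope sketch of the air/fuel arithmetic described above.
# Gasoline's ideal (stoichiometric) air-to-fuel ratio is roughly 14:1 by mass.
AFR = 14.0  # parts of air per part of fuel, by mass

def air_required(fuel_mass_g: float) -> float:
    """Mass of air (grams) needed to burn the given mass of fuel at the ideal ratio."""
    return fuel_mass_g * AFR

# To burn 10 g more fuel per cycle, the engine must inhale 140 g more air:
print(air_required(10.0))  # 140.0
```

The point of the exercise: fuel is easy to add, so the air side of the ratio is the real bottleneck, which is what the rest of the article addresses.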
When trying to get more air, one solution is to use a bigger engine. More, larger cylinders mean the pump can inhale a bigger breath, which means more fuel and more power. However, this so-called "natural aspiration", or NA for short, is limited by atmospheric pressure. Just like drinking through a straw, an engine's intake works by creating low pressure, which atmospheric pressure then fills in. So, another way to get more air into an engine is to pressurize it, or use "forced induction."
There are a few different methods of forced induction, but today I want to talk about one in particular: turbocharging. A turbocharger is a turbine connected to the exhaust gas leaving the engine on one side, which drives an impeller on the other side to compress intake air. The more and faster exhaust gas comes out of the exhaust, the more and faster the intake side compresses air. When the pressure generated by the impeller is greater than atmospheric pressure, the system is making "boost", and the amount of boost can be measured in PSI.
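Since boost is defined as pressure above atmospheric, the measurement is a simple subtraction. A minimal sketch, assuming sea-level atmospheric pressure of about 14.7 psi (a standard figure, not stated in the article):

```python
# Boost is intake pressure above atmospheric: absolute manifold pressure
# minus atmospheric pressure (~14.7 psi at sea level).
ATMOSPHERIC_PSI = 14.7

def boost_psi(manifold_abs_psi: float) -> float:
    """Boost in PSI. Zero or below means the engine is not on boost."""
    return manifold_abs_psi - ATMOSPHERIC_PSI

# A turbo compressing intake air to 22 psi absolute is making ~7.3 psi of boost:
print(round(boost_psi(22.0), 1))  # 7.3
```

This is also why a boost gauge can read negative (vacuum) at idle: the absolute manifold pressure is below atmospheric until the turbine spools up.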