Guest Contribution by Graham Gatheral
Let’s be honest: code can be daunting. All those words and numbers and operators and punctuation marks… For a start, there’s no GUI. How are you supposed to make anything without a GUI?!
Well, as we’ll see later, we can make a GUI-based synth in SuperCollider with just a few dozen lines of code! But let’s put GUIs to one side for now, because SuperCollider’s real power is in its ability to produce flexible and complex dynamic systems directly from code, and without too much trepidation… My aim here is to introduce an audio synthesis programming language to an audience that is, for the most part, more comfortable working with a GUI. So I’ll start off with some simple code examples and then move on to how SuperCollider can use game-code parametric data to drive synthesis ‘patches’ in real time.
If you don’t have SuperCollider already, download an installer here:
Regarding platforms, I’m on Windows 7, but the code should run fine on a Mac as well.
Warning: The code examples below were written for demonstration purposes and have not been fully tested. Please be careful not to expose your ears to loud sounds (particularly when using the metal impacts tuner) as stable behaviour cannot be guaranteed. This is especially critical if using headphones!
A Quick Introduction
SuperCollider consists of three components:
- an object oriented programming language
- a language interpreter
- a real-time sound synthesis server
When code is executed, it is interpreted and sent to the server, whereupon the sound is generated.
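To make that concrete, here is a minimal sketch of the workflow, assuming a default SuperCollider installation: boot the server, then hand it a tiny synthesis function to play. (The specific frequency and amplitude values are just illustrative.)

```supercollider
// Boot the real-time synthesis server (watch the post window for confirmation).
s.boot;

// A simple patch: a 440 Hz sine oscillator at low amplitude,
// duplicated to both output channels with ! 2.
{ SinOsc.ar(440, 0, 0.1) ! 2 }.play;

// Stop all sound at any time with Ctrl+. (Cmd+. on a Mac).
```

Place the cursor on a line and press Ctrl+Enter (Cmd+Enter on a Mac) to execute it; the interpreter evaluates the code and the server produces the sound.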
SuperCollider has had an Integrated Development Environment (IDE) since version 3.6, which is great because now you have everything you need in one place:
- Code editor (where you write your code!)
- Help browser
- Post window (shows the outcome of your code, including any errors)
- Document browser [not shown below]
Guest contribution by Chris Didlick
We are Box of Toys Audio, a music and sound design company with studios in London and Stockholm. We provide audio services for commercials, branding, trailers and all manner of projects. With the ongoing innovation and expansion of digital media, we are sometimes offered new and uncharted avenues for creativity, so when we were asked to work on the new Madefire Motion Book platform we embraced the challenge readily.
Madefire is an iOS app that has been optimised for the iPad and iPhone, emulating the traditional graphic novel format with the addition of motion, interactivity and audio. What’s more, Madefire is releasing, in phases, free development tools that can be used by independent artists to create and publish their own stories on the platform. With Moving Brands CEO Ben Wolstenholme and comic book legends Dave Gibbons and Liam Sharp involved in the creation of the app, we jumped at the chance to create the audio for the first three in-house story releases, namely “Treatment”, “Captain Stone is Missing…” and “Mono”. Not only were we creating the audio for the narratives, we were also constructing an SFX library for use within the development tool. It was therefore important that the audio enhanced each story while being effective for future titles.
Garry Taylor, the Audio Director of Sony Computer Entertainment Europe (SCEE), has started a new blog called Blessed Are The Noisemakers where he is posting his various conference talks and presentations. ‘All In The Mix – The Importance of Real-Time Mixing In Video Games’ is the first fruit of the new blog, alongside an interesting post on loudness standards. An excerpt of the former is below:
Today I’m going to be talking about audio mixing; what’s the purpose of mixing over and above getting the levels right, and why a good mix is so important. I’m also going to talk about the stages a mix engineer will go through when working in linear media, and what lessons we can learn, and what techniques we can use from the linear world when working with interactive material.
So, here’s what I’m going to talk about today. Firstly, a short introduction. Secondly, I’ll ask “what is mixing?” What’s the purpose of it, and what’s to be gained by good mixing practices?
In Part One we took a look at some of the fundamentals involved in orchestrating the sounds of destruction. We continue with another physics system design presented at last year’s Austin Game Developers Conference, and then take a brief look at where these techniques may be headed.
UNLEASH THE KRAKEN
In Star Wars: The Force Unleashed we were working with two physics middleware packages: Havok Physics, and Pixelux’s Digital Molecular Matter (DMM). In addition to the simulation data that each provided, we also needed to manage the relationship between the two. While Havok has become a popular choice for runtime physics simulations, the use of DMM spoke to the core of materials, giving each object physical properties that enabled, in addition to collisions, physically modeled dynamic fracturing and bending. In some ways tackling the sound for both systems was a monumental undertaking, but there was enough overlap to make the process more pleasure than pain.
Before jumping into the fray, I just wanted to take a moment to echo a couple of things that were touched on in the companion to this article; specifically, that collaboration and iteration are the cornerstones of a quality production when it comes to systems design. Collaboration, because the stakeholders involved usually include people across all disciplines: from programmers to sound designers, modelers to texture artists, build engineers to game designers. Iteration, because the initial vision is always an approximation at best, and until things get moving it’s difficult to know what eventual shape things will take.
While simultaneously reining in and letting loose the flow of creativity ebbing and flowing across the development team, there is nothing more important than the support of your colleagues. Leveraging the specialties of different people helps to bring new ideas to situations in need of a solution. Your greatest asset as a team member is to recognize and respect the uniqueness of your co-workers and stay open to the constantly shifting requirements of the game. Good listening and better communication will improve the productivity of meetings, and reinforce the fundamental desire of everyone: to craft the best player experience possible.