As before, many of these posts will be philosophical in nature. Some will contradict previous postings. These are not intended as truths or assertions; they’re merely thoughts…ideas. Think of this as stream of consciousness over a wide span…Please bear with us as we traverse the abstract canals of audio musings.
This year I took part in Audio Game Jam 2, a game jam with the goal of raising awareness of the accessibility issues visually impaired people experience when playing video games.
If you haven’t heard of audio games, these are games which are played mostly or solely through audio. There are lots of audio games across many genres: narrative adventures, flight simulators, RPGs, RTS games, or even GTA-style games.
My goal for this jam was to inform myself better about the topic and try to make something that is accessible. In many of the projects I’ve been involved in so far, accessibility was often neglected, and I want to change that.
To prepare for the jam I watched a fair number of audio game let’s plays on YouTube. What really surprised me was the speed at which screen readers or text-to-speech tools read explanations and instructions to the player in many audio games.
This is an extreme example:
https://www.youtube.com/watch?v=AJ0Xu7TeHmY
My immediate reaction was: Wouldn’t it be possible to create something that doesn’t rely on screen readers?
With all the new technologies for spatial audio, HRTF processing, and physics-based reflection systems for game engines that take into account the geometry and materials of a 3D space, maybe there’s a lot of untapped potential when it comes to designing audio-only games that take place in a 3D space?
Then I watched this really informative talk on accessibility through audio by Adriane Kuzminski, only to find out that it’s just not that simple.
https://youtu.be/n6kANg1K3nE?list=PLVEo4bPIUOsmhxWT181OPVq9Z1P8Qjf19
One important thing (among others) that I overlooked: in a 3D space, the player not only has to be aware of their location, but also of the right direction in order to progress.
For my game I decided to keep it really simple and use a specific sound cue to guide the player: a bird that you have to follow through a forest. This is explained by another character in the game through voice-over, so I got away without any screen reader.
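To make the beacon idea concrete, here’s a minimal sketch of how such a “follow the bird” cue could be driven. This is my own illustration, not the code from the jam game: the function names, the linear volume falloff, and the 90° pan range are all assumptions; a real engine would hand this off to its spatializer instead.

```python
import math

def beacon_cue(player_pos, player_facing_deg, beacon_pos, max_dist=50.0):
    """Compute (volume, pan) for a guiding sound cue like the bird.

    Illustrative sketch: volume falls off linearly with distance,
    pan goes from -1 to 1 based on the angle between the player's
    facing direction and the beacon (sign convention is arbitrary here).
    """
    dx = beacon_pos[0] - player_pos[0]
    dy = beacon_pos[1] - player_pos[1]
    dist = math.hypot(dx, dy)

    # Quieter the farther away the bird is; silent beyond max_dist.
    volume = max(0.0, 1.0 - dist / max_dist)

    # Relative angle of the beacon, wrapped into [-180, 180).
    angle_to_beacon = math.degrees(math.atan2(dy, dx))
    relative = (angle_to_beacon - player_facing_deg + 180.0) % 360.0 - 180.0

    # Map +/-90 degrees (and beyond) onto full pan.
    pan = max(-1.0, min(1.0, relative / 90.0))
    return volume, pan
```

Even this toy version shows the direction problem from the talk: stereo panning alone can’t distinguish “in front” from “behind”, which is exactly where HRTF processing or an extra cue (e.g. a muffled filter for sounds behind the player) would come in.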
But of course this mechanic is limiting with regard to which types of games such a guide system works well with, because you have to find a way to somehow integrate this element into the story (if there is one).
Also, the more complex the game or the game world gets, the more information the player has to be made aware of, and voice-over is often a budgeting issue as well.
So I guess you can’t completely take screen readers out of the equation?
I want to explore audio games further, and over the past days I’ve been theorycrafting whether it might be possible to build a mechanic like the Witcher Sense ability in the Witcher games, which lets you analyze your surroundings for clues and interactable items, but one that does not rely on visuals and conveys the same information through sound only.
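One way I could imagine a sound-only version working is as an echolocation-style pulse: the player triggers a scan, and every interactable within range “answers” with a cue whose delay encodes distance and whose volume encodes proximity. This is pure speculation on my part, not how the Witcher games implement it; all names and numbers below are made up for illustration.

```python
import math

def sense_pulse(player_pos, objects, max_range=20.0, pulse_speed=10.0):
    """Sketch of a sound-only 'sense' scan (speculative design, not an
    actual Witcher mechanic).

    objects: dict mapping object name -> (x, y) position.
    Returns a list of cues, one per object in range, ordered by the
    delay at which each cue should play after the pulse.
    """
    cues = []
    for name, pos in objects.items():
        dist = math.hypot(pos[0] - player_pos[0], pos[1] - player_pos[1])
        if dist <= max_range:
            cues.append({
                "object": name,
                "delay_s": dist / pulse_speed,        # farther = answers later
                "volume": 1.0 - dist / max_range,     # closer = louder
            })
    cues.sort(key=lambda c: c["delay_s"])
    return cues
```

In a real game each object type would also need a distinct timbre (a chest sounds different from a clue), otherwise the delays alone tell the player *where* things are but not *what* they are.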
I’m still not sure.
What I’m sure about is that accessibility is important, and that it’s also our responsibility as game sound designers to make other team members aware of it when it’s neglected.
For every sound or system I create I should not only ask myself how it could help make the game better, but also how it could potentially help make it more accessible.
Further Resources on Audio Games and Accessibility through Audio: