Producing Interactive Audio:
Thoughts, Tools, and Techniques

By Mark Steven Miller
Published in Game Developer Magazine,
October 1997


The Rules of Interactive Sound Design

Creating an Adaptive Audio Engine

Case Study: The GEX Project

Only the Beginning

Tools of the Trade

Getting More Info about the IASIG

Times have changed in the game development industry. All of the stupid money has fled to Internet commerce applications, and companies in the game world are left having to make their way as viable businesses. There is no market tolerance left for "junk," or as it used to be referred to by a producer at Sega some years ago, "library titles." The "B" and "C" titles that filled out your publishing profile aren't viable any more - in fact, they can be downright deadly. Every game made from now on has to have a legitimate shot at becoming a hit; otherwise, it's just not worth making. This philosophy has inevitably trickled down to audio departments, where the focus now is squarely on quality.

A Rant on Interactivity

What exactly does it mean to produce interactive audio? For some time, interactive audio has suffered from an identity crisis. The term has come to mean less and less. In my own jaded way, I always imagined the adjective "interactive" modifying a noun such as "media" or "audio." When a person says that they are "doing interactive" in reference to audio, it usually means the individual has stumbled into a job working on a CD-ROM or web site and is desperately trying to figure out how to make 8-bit, 22kHz audio not sound as if it's being played back across two tin cans connected by string. To this person, "interactive audio" simply means sound for nontraditional media: CD-ROMs, console games, kiosks, and web sites. To me, the term means something entirely different.

If you're describing audio as "interactive," you're implying more than just linear playback. Interactive audio should be constructed in such a way that the user can affect its performance in real time during playback. I'm talking about reactive, responsive audio, coming from audio drivers that are "aware" of what's happening and can respond by changing the music appropriately. Spooling a two-minute pop song in an unchanging, endless loop during a real-time strategy game is not interactive audio. Perhaps the term "audio for interactive media" would be more appropriate.

Imagine instead audio that is an interwoven part of a 3D, free-roaming world. As you explore the world, the sound smoothly accompanies you, rising as you encounter danger, falling away as you explore in peace. It sounds triumphant when you succeed, and distant and mournful when you fail. And all of this happens with the precision and emotional impact of a great film score. In a user-driven world such as a game, you have no linear timeline by which to synchronize the changes in the music, as you do in a movie. The audio entirely depends upon the unpredictable input of the user. How can you make this work?

The answer lies in the nature of an interactive 3D world and is made possible by new tools and technologies. The 3D game world is open-ended, a database of terrain, objects, animations, behaviors, and their various relationships. Therefore, the music must also become a database of musical ideas, sounds, instruments, and relationships, imbued with awareness of the other objects in the world and programmed with responsive behaviors of its own.
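To make this idea concrete, here is a minimal sketch (my own illustration, not code from any particular engine) of a music "object" that is aware of game events and responds with behaviors of its own. Game events retarget a musical intensity level and select new musical material, and the score eases toward the target each frame so the music swells and fades rather than cutting abruptly. All class, event, and cue names are hypothetical.

```python
# Minimal sketch of a game-state-aware music director. Events map to a
# target "intensity" (0.0-1.0) and a musical cue; the score moves
# smoothly toward the target instead of jumping. Hypothetical names.

class MusicDirector:
    # Each game event nudges the score toward a target intensity and
    # may swap in different musical material (instruments, motifs).
    EVENT_TARGETS = {
        "explore": (0.2, "ambient_theme"),
        "danger":  (0.8, "combat_theme"),
        "victory": (0.6, "triumph_theme"),
        "defeat":  (0.1, "mournful_theme"),
    }

    def __init__(self):
        self.intensity = 0.0        # current musical intensity
        self.target = 0.0           # intensity we are easing toward
        self.cue = "ambient_theme"  # currently active musical material

    def on_event(self, event):
        """React to a game event by retargeting the score."""
        target, cue = self.EVENT_TARGETS[event]
        self.target = target
        self.cue = cue  # a real engine would crossfade or re-orchestrate

    def update(self, dt, rate=0.5):
        """Called every frame: ease intensity toward the target so the
        music rises and falls rather than switching abruptly."""
        step = rate * dt
        if abs(self.target - self.intensity) <= step:
            self.intensity = self.target
        else:
            self.intensity += step if self.target > self.intensity else -step
        return self.intensity
```

In a real engine the intensity value would drive layering, tempo, or orchestration choices rather than a single number, but the core idea is the same: the music is a database of material plus behaviors, queried and updated every frame just like any other object in the world.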