
By Daniel Bernstein
October 1997




This article originally appeared in the October 1997 issue of Game Developer magazine




Creating an Interactive Audio Environment

Adaptive Music

The nonlinear medium of computer gaming can lead a player down an enormous number of pathways to just as many resolutions. From the standpoint of music composition, this means that a single piece may resolve in any of countless ways. Event-driven music engines (or adaptive audio engines) allow the music to change along with game state changes. Event-driven music isn't composed for linear playback; instead, it's written so that a given music sequence (ranging in length from one note to several minutes of music) can transition into one or more other sequences at any point in time. An event-driven music engine must contain two essential components:

  • Control logic - a collection of commands and scripts that controls the flow of music depending on the game state.
  • Segments - pieces of audio that can be arranged horizontally (end-to-end) or vertically (layered) according to the control logic.
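The two components can be sketched in a few lines of code. This is a hypothetical illustration, not an engine from the article: the layer names, segment names, and random selection policy are all invented for the example. Each call to the control logic stacks one segment per layer (vertical arrangement) and successive calls string those stacks end-to-end (horizontal arrangement).

```python
import random

# Segments: short audio clips, grouped by the layer they occupy.
# (Invented names -- in a real engine these would reference audio data.)
SEGMENTS = {
    "melody": ["mel_a", "mel_b", "mel_c"],
    "bass":   ["bass_a", "bass_b"],
}

def next_bar():
    """Control logic: pick one segment per layer (a vertical stack).
    Calling this repeatedly yields the horizontal sequence of bars."""
    return {layer: random.choice(choices) for layer, choices in SEGMENTS.items()}
```

A real engine would replace `random.choice` with scripted rules driven by the game state, but the data shape -- segments addressable by layer and position -- stays the same.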

In Kesmai's Multiplayer Battletech, the control logic determined the selection of segments within a game state and the selection of sets of segments at game state changes. The control logic could thus construct melodies and bass lines out of one- to two-measure segments following a musical pattern. At a game state change, a transition segment was played and a whole different set of segments was selected. However, the transition segment was played only after the current set of segments finished, so as not to interrupt the flow of the music. I selected game states, and tracked game state changes, based on the player's health relative to the opponent's. Overall, I composed 220 one- to two-measure segments that could all be arranged algorithmically by the control logic. The result was a soundtrack closely coupled with the game-playing experience.
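The deferred-transition rule described above can be sketched as follows. This is a simplified, hypothetical reconstruction (the state names, segment names, and class layout are invented): a game state change only queues a request, and the engine honors it -- by inserting a transition segment -- at the next musical boundary, when the current set of segments has finished.

```python
class MusicEngine:
    def __init__(self, segment_sets):
        self.segment_sets = segment_sets   # game state -> list of segment names
        self.state = "neutral"
        self.pending_state = None          # state change waiting for a boundary
        self.playlist = []                 # segments queued for playback

    def on_game_state_change(self, new_state):
        # Don't interrupt mid-phrase: just remember the request.
        self.pending_state = new_state

    def on_segment_set_finished(self):
        # Only at a musical boundary do we honor the pending change,
        # playing a transition segment before the new set begins.
        if self.pending_state is not None:
            self.playlist.append(
                "transition_%s_to_%s" % (self.state, self.pending_state))
            self.state = self.pending_state
            self.pending_state = None
        self.playlist.extend(self.segment_sets[self.state])
```

The key design choice is that `on_game_state_change` never touches the playlist directly; all musical decisions happen at boundaries, which keeps phrases intact.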

Tips and Techniques

  1. Music comes first. No matter how closely your music follows the game play and how interactive it is, if it doesn't gel as a musical composition, you're better off writing a linear score. Explore every possible transition from one game state to the next, and check that the music reacts the way you intended. Make sure that you write transition sequences and that the engine is intelligent enough not to change game states mid-measure or mid-phrase.
  2. Decouple segments horizontally and vertically. Compose your music so that different segments may be combined end-to-end (horizontally), as well as on top of each other (vertically). This way, you can combine different melody lines with bass lines, use different ornamentation, and so on.
  3. Don't give away too much information. A musical cue that was meant only to highlight a game state change can sometimes say too much. For example, if an upward chord progression always signifies to a player that a starship is on his tail, the score has become an early warning system. When working on game state changes, make sure your event-driven music isn't warning the player before the game means to.
  4. Define a series of translation tables to track game state changes. For example, in Multiplayer Battletech, a game state change from "winning" to "advantage" implies a losing trend. The music reacts to this change by selecting a different set of segments than it would if the change occurred from "advantage" to "winning." By composing in a nonlinear fashion, and by having the music react to the player's actions directly and indirectly, we introduce a new level of interactivity. Emotionally, the soundtrack carries the person seamlessly along with the action in much the same way as the static, linear media of film. In this fashion, music becomes the gateway to the player's emotional response to the game.
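Tip 4's translation table can be sketched as a lookup keyed on the (previous state, new state) pair, so the direction of the change matters. The state names follow the Multiplayer Battletech example in the text; the segment-set names and the fallback are invented for illustration.

```python
# Translation table: (previous_state, new_state) -> segment set to select.
# The same pair of states in the opposite direction maps differently,
# because "winning" -> "advantage" implies a losing trend while
# "advantage" -> "winning" implies the reverse.
TRANSLATION_TABLE = {
    ("winning", "advantage"): "losing_trend_set",
    ("advantage", "winning"): "winning_trend_set",
    ("advantage", "losing"):  "danger_set",
}

def select_segment_set(previous_state, new_state):
    # Fall back to a neutral set for any untabulated transition.
    return TRANSLATION_TABLE.get((previous_state, new_state), "neutral_set")
```

Because the table is keyed on the pair rather than the destination state alone, the music can react to the trend of the match, not just its current standing.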

Total Immersion through Sound

As game designers and audio producers, we should be constantly aware of the impact that a well thought-out audio environment can have on the product. It can make a graphically simple and uneventful scene become awe-inspiring. Effective use of an audio object vocabulary can enhance the impact a character may have on the game player. Ambient sounds, in all of their variety, can transform a game scene from a virtual one to a believable one. Surreal textures and atmospheric gestures can generate emotional responses in a player as varied as the soundscapes themselves. As games become more and more complex and graphically spectacular, we must not overlook the role of audio in enhancing and completing that feeling of total immersion.

Daniel Bernstein manages Monolith Productions' Audio/ Video department in Kirkland, Washington. He has been the audio producer, sound designer, and composer on a number of successful commercial titles, most recently for Blood. When not listening to feedback loops, he enjoys spending time with his wife basking in the beautiful Seattle sunshine. He can be reached at [email protected]. Please, no e-mails after 10PM.


