Bumpers, Bells, and Beats: The Dynamic World of Interactive Audio with David Thiel

The world of game audio presents unique challenges and opportunities, and few individuals have navigated this space with as much depth and insight as David Thiel. In an online guest lecture, Thiel shared his extensive experience in interactive audio, covering its evolution, principles, and creative approaches. With a career spanning over four decades, his work has influenced interactive entertainment, from early arcade machines to modern gaming environments.


Interactive Audio vs Linear Audio

Thiel began by distinguishing between interactive audio—used in games—and linear audio, which is typical of film and television. Unlike linear audio, where sounds are meticulously timed to match fixed visuals, game audio must be dynamic. It adapts in real time based on player interactions, requiring a more flexible and responsive approach to composition and sound design.

One key challenge he highlighted is unpredictability. In a film, every sound effect and piece of music is placed with precise timing. In contrast, game audio must account for numerous possibilities—players might trigger events in different sequences or at varying speeds. Game sound designers must therefore think beyond static cues, ensuring that every sound conveys meaning while enhancing immersion. Thiel illustrated this with an analogy: imagine doing audio post-production on a movie where you know which events will occur but have no idea when, or in what order, they will happen. He expanded on this by likening game audio to composing for an orchestra in which each instrument plays independently in response to player input, making real-time adaptation essential.

Making Game Audio Meaningful

One of Thiel’s core principles is that game audio should always provide useful information to the player. Sounds should not just be aesthetically pleasing but should also enhance gameplay. For example, when a player fires a weapon in a game, the sound can communicate crucial details such as the type of gun, whether it is running out of ammunition, or if the shot has hit a target. He elaborated on this concept by breaking down the sound design process for firearms: A shotgun blast should feel weighty and reverberate differently in an open space versus a confined corridor, while an energy weapon should have an otherworldly charge-up sound. Additionally, missed shots and ricochets can provide players with subtle cues about their accuracy, reinforcing the importance of audio feedback.
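The idea that a single shot should carry several pieces of information can be sketched in code. This is a hypothetical illustration, not Thiel's implementation: the cue names, the low-ammo threshold, and the `ShotContext` fields are all invented for the example.

```python
# Hypothetical sketch: choosing which sound cues to layer for one shot,
# so the sound itself tells the player what happened (weapon type, ammo
# state, hit or miss). All names and thresholds here are illustrative.
from dataclasses import dataclass

@dataclass
class ShotContext:
    weapon: str        # e.g. "shotgun", "energy"
    ammo_left: int
    hit_target: bool

def pick_gunshot_cue(ctx: ShotContext) -> list[str]:
    """Return the list of sound-cue names to layer for a single shot."""
    cues = [f"{ctx.weapon}_fire"]          # base report identifies the weapon
    if ctx.ammo_left <= 3:
        cues.append("low_ammo_click")      # subtle warning layered on top
    cues.append("impact_flesh" if ctx.hit_target else "ricochet")
    return cues

print(pick_gunshot_cue(ShotContext("shotgun", 2, False)))
# → ['shotgun_fire', 'low_ammo_click', 'ricochet']
```

The point of the sketch is that the cue list is derived from game state at the moment of firing, rather than being a single fixed sample.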

Another fundamental aspect is variation. If a game reuses the same audio sample repeatedly, players may quickly lose their sense of engagement. Thiel demonstrated how game audio can introduce subtle variations based on contextual factors, such as the shooter’s position, the remaining ammunition in a weapon, or environmental influences. He provided an example from Borderlands 2, where he spent over 1,000 hours playing and noted how the game’s procedural gun system extended to audio, ensuring that weapons sounded unique based on their make and function. Each gun has a different reload sound sequence, creating deeper immersion and ensuring that players can distinguish between weapons purely through audio cues.

Additionally, Thiel discussed the importance of environmental sounds in enhancing game immersion. He explained how in Winter Games (1985), all sound effects were synthesised in real-time, yet they managed to convey the distinct feel of ice skating. By manipulating pitch and timbre, the sound team created convincing audio cues that responded dynamically to skater movements.

The Role of Music in Interactive Audio

Music in games also requires a different approach compared to linear media. Thiel recounted his early experiences in the 1980s, where hardware constraints required music to be generated in real-time using algorithms. Though modern hardware allows for pre-recorded music with high production values, he highlighted the benefits of runtime-generated music, such as the ability to synchronise musical cues dynamically with gameplay.

A particularly engaging example was his work on pinball machines. On a Monday Night Football pinball machine, musical motifs and drum beats were triggered by player actions, enhancing gameplay feedback. When the player scored a touchdown, a celebratory fanfare played, rising in pitch with each successive score, reinforcing the excitement. Similarly, synchronised drum fills were used when the ball passed through a spinner, making player actions musically rewarding. Another notable example was from Torpedo Alley, where a cowbell layer was introduced when players entered a time-limited game mode, ensuring they knew they had a short window to act. The cowbell was musically integrated, but also acted as a warning that the mode would soon expire, influencing player behaviour.

Thiel also explored how interactive music could adapt to player performance. He referenced a pinball machine where successfully hitting targets would cause the background music to shift in pitch, making victories more rewarding. Each successive successful shot raised the key of the soundtrack, creating a musical escalation that heightened player excitement.
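Raising the key per successful shot amounts to transposing the backing track by semitones, where each semitone corresponds to a pitch ratio of 2^(1/12). The cap of seven semitones below is an invented detail for the sketch, not something stated in the lecture:

```python
# Hypothetical sketch of musical escalation: each successful shot raises
# the backing track by one semitone (pitch ratio 2**(semitones/12)),
# capped so the music never climbs out of a sensible range.
def escalation_ratio(successful_shots: int, max_semitones: int = 7) -> float:
    """Playback-rate ratio for the music after N successful shots."""
    semitones = min(successful_shots, max_semitones)
    return 2 ** (semitones / 12)

print(round(escalation_ratio(2), 4))  # → 1.1225 (a whole tone up)
```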

Challenges in Speech and Sound Effects

Thiel also touched on the complexities of speech and sound effects in games. While modern storage capacities allow for extensive voice recordings, game dialogue must be carefully managed to maintain clarity and engagement. He shared insights into ‘speech wrangling’—the process of organising, editing, and integrating thousands of voice clips in a way that is useful to game developers.

Sound effects, meanwhile, are not simply lifted from libraries but are often layered and modified to enhance realism. Thiel illustrated this with an explosion sound effect: Rather than using a single sample, he combined elements such as a low-end impact, a sharp transient, and a synthesised decay to create a more impactful and informative effect. He also explained how the iconic sound of the Ark of the Covenant in the Indiana Jones pinball machine was created using a manipulated orchestral harp sound called ‘Psycho Drone’—an example of how concept sometimes takes precedence over traditional realism in sound design.
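The layering idea—summing a low-end impact, a sharp transient, and a decaying tail rather than playing one library sample—can be sketched with simple sine tones. The frequencies, durations, and gains below are arbitrary stand-ins, and the linear decay envelope is a deliberate simplification:

```python
# Hypothetical sketch of sound layering: an "explosion" built by summing
# three independently shaped layers. All parameters are illustrative.
import math

SAMPLE_RATE = 8000  # deliberately low for a compact illustration

def tone(freq: float, dur: float, gain: float) -> list[float]:
    """A sine burst with a linear decay envelope."""
    n = int(SAMPLE_RATE * dur)
    return [gain * math.sin(2 * math.pi * freq * i / SAMPLE_RATE)
            * (1 - i / n)                      # fade out over the duration
            for i in range(n)]

def mix(*layers: list[float]) -> list[float]:
    """Sum layers sample by sample, padding shorter layers with silence."""
    length = max(map(len, layers))
    return [sum(l[i] for l in layers if i < len(l)) for i in range(length)]

explosion = mix(
    tone(60, 0.8, 0.9),     # low-end impact
    tone(2500, 0.05, 0.6),  # sharp transient "crack"
    tone(200, 1.2, 0.4),    # synthesised decay tail
)
```

Because each layer has its own level and length, any one of them can be varied or swapped to make the effect more informative without rebuilding the whole sound.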

Additionally, Thiel described how synchronised sound cues could be used to communicate time-sensitive objectives. In a pinball machine, for example, the sound of a looping crowd chant helped signal an urgent task. Players needed to hit a specific target before the chant faded, using sound as a direct gameplay indicator.

Mixing and Mastering for Different Environments

A crucial part of game audio design is ensuring that sounds are balanced correctly in different playback environments. Thiel noted the differences between public spaces (such as arcades or casinos) and private listening setups (home gaming, mobile devices). In noisy public settings, audio cues must be clear and loud, often using aggressive mixing techniques such as ducking (reducing background volume when key sounds play). In contrast, home environments allow for more subtle layering, offering richer detail and depth.
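Ducking is commonly implemented as a gain envelope on the music bus that is pulled down while a priority cue plays and released afterwards. A minimal control-rate sketch, with the duck depth and smoothing rates chosen arbitrarily for illustration:

```python
# Hypothetical sketch of ducking: while a priority cue (dialogue, alert)
# is active, the music-bus gain glides toward a reduced level, then glides
# back to full when the cue ends. Rates and depth are illustrative.
def duck_gain(priority_active: bool, current_gain: float,
              duck_to: float = 0.3, attack: float = 0.5,
              release: float = 0.1) -> float:
    """One control-rate step of a simple ducker (gains in 0..1)."""
    target = duck_to if priority_active else 1.0
    rate = attack if priority_active else release
    return current_gain + (target - current_gain) * rate

gain = 1.0
for _ in range(10):              # while dialogue plays...
    gain = duck_gain(True, gain)  # ...music sinks toward duck_to
```

In a noisy arcade the `duck_to` level would be set aggressively low; in a home mix, a gentler depth and slower release preserve the layering the paragraph above describes.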

The Passion Behind Game Audio

Thiel concluded his talk with reflections on the industry’s approach to audio. Despite the immense progress in gaming technology, he observed that audio still receives a smaller share of development resources compared to graphics. However, for those passionate about sound, game audio remains a deeply rewarding field that requires creativity, problem-solving, and technical expertise.

This lecture provided a fascinating exploration of interactive audio, offering both historical perspectives and practical insights. Whether you are an aspiring game audio designer or simply interested in the intricacies of interactive sound, Thiel’s knowledge and experience shed light on the challenges and artistry of game audio creation.