Welcome back to our five-part discussion of some of the best techniques that video game composers can use to enhance tension and promote suspenseful gameplay. These articles are based on the presentation I gave at this year’s Game Developers Conference in San Francisco, entitled “Homefront to God of War: Using Music to Build Suspense.” If you haven’t read our previous discussion of Ominous Ambiences in part one of this series, please go check that article out.
Are you back? Good! Let’s continue!
We’ve already talked about how to create an edgy, ominous atmosphere. By priming players with an assortment of quietly unnerving sounds, we can carefully nurture their suspense and anxiety until they’re perfectly ready for…
The Jarring Jolt technique
This is the second technique we’ll be discussing in our five-part article series on the role of music in building suspense. Like the Ominous Ambience (which we discussed in part one), the Jarring Jolt also owes a debt to the expert work of sound designers. In fact, the Ominous Ambience and the Jarring Jolt are fairly interdependent; neither works particularly well without the other.
Interactive music is always a hot topic in the game audio community, and newcomers to game music composition can easily become confused by the structure and process of creating non-linear music for games. To address this issue, I produced four videos that introduce aspiring video game composers to some of the most popular tactics and procedures that game audio experts commonly use to structure musical interactivity in games. Over the next four articles, I’ll be sharing these videos with you, along with some supplemental information and accompanying musical examples for easy reference. Hopefully these videos can answer some of the top questions about interactive music composition. Music interactivity can be awesome, but it can also seem very abstract and mysterious when we’re first learning about it. Let’s work together to make the process feel a bit more concrete and understandable!
Last week, it was my honor and pleasure to give a presentation at the Game Developers Conference in San Francisco. My talk was entitled “From Total War to Assassin’s Creed: Music for Mobile Games.” It focused on the most effective methods for composing and implementing music in portable games. The talk was structured for the benefit of video game composers and game audio pros, and as part of the presentation, I played short excerpts of music that I composed for several of my top mobile and handheld video game projects. Now that GDC is over, I thought I’d provide streaming links to some of the complete music tracks that I featured during my presentation, in case attendees were curious about the complete pieces of music. So, without further ado, here are tracks from my GDC 2016 talk!
Assassin’s Creed Liberation
Assassin’s Creed Liberation was released by Ubisoft for the PlayStation Vita, delivering an immersive experience from the popular Assassin’s Creed franchise. The game was designed specifically for a portable system, and as such, all aspects of the design, including the music, were adjusted to cater to a portable gaming experience.
I was tremendously honored to speak at the Audio Engineering Society’s convention last month, and I thought I’d share a video excerpt from my speech, which was entitled “Effective Interactive Music Systems: The Nuts and Bolts of Dynamic Musical Content.” Many thanks to Steve Martz and Bob Lee at the Audio Engineering Society for organizing an outstanding event!
More about the AES:
The Audio Engineering Society is the only professional society devoted exclusively to audio technology. Founded in the United States in 1948, the AES has grown to become an international organization that unites audio engineers, creative artists, scientists and students worldwide by promoting advances in audio and disseminating new knowledge and research. Currently, over 14,000 members are affiliated with more than 75 AES professional sections and more than 95 AES student sections around the world. Conventions, which include scientific presentations, student activities, workshops, and exhibitions, are held annually both in the US and Europe. Additional conferences and regional summits are held periodically throughout Latin America, Asia, Europe, and North America.
Effective Interactive Music Systems: The Nuts and Bolts of Dynamic Musical Content
Interactive methodologies have profoundly impacted the way that music is recorded, mixed and integrated in video games. From horizontal resequencing and vertical layering techniques for the interactive implementation of music recordings, to MIDI and generative systems for the manipulation of music data, the structure of game music poses serious challenges both for the composer and for the game audio engineer. This talk will examine the procedures for designing interactive music models and implementing them effectively into video games. The talk will include comparisons between additive and interchange systems in vertical layering, the lessons that can be learned from conventional stem mixing, the use of markers for switching between segments, and how to disassemble a traditionally composed piece of music for use within an interactive system.
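To make the distinction between additive and interchange systems in vertical layering a bit more concrete, here’s a minimal sketch in Python. This is not code from the talk; the layer names, the integer intensity scale, and the functions themselves are hypothetical illustrations — in a real game, per-layer volumes would typically be driven through audio middleware rather than computed like this.

```python
def additive_layers(intensity, layers=("pads", "percussion", "strings", "brass")):
    """Additive system: each step up in intensity ADDS a layer on top of
    the ones already playing, so the mix grows cumulatively."""
    count = max(0, min(intensity, len(layers)))  # clamp to the valid range
    return set(layers[:count])

def interchange_layers(intensity, variants=("calm_melody", "tense_melody", "combat_melody")):
    """Interchange system: one slot in the mix SWAPS between mutually
    exclusive variants, so exactly one variant plays at a time."""
    index = max(0, min(intensity, len(variants) - 1))  # clamp to the valid range
    return {variants[index]}
```

The key behavioral difference: in the additive sketch, every lower-intensity mix is a subset of every higher-intensity mix, while in the interchange sketch the active set always has exactly one member.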
Here’s another installment of a four-part series of videos I produced as a supplement to my book, A Composer’s Guide to Game Music. This video focuses on the Horizontal Resequencing model employed in the Speed Racer video game, providing some visual illustration for this interactive music composition technique. The video demonstrates concepts that are explored in depth in my book, beginning on page 188.
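For readers who haven’t yet watched the video, horizontal resequencing can be sketched in a few lines of Python: the score is divided into short segments, and at each segment boundary the system chooses the next segment based on the current game state. This sketch is not taken from the book or the Speed Racer implementation — the segment names, states, and transition stinger are hypothetical.

```python
import random

# Hypothetical pools of interchangeable segments for each game state.
SEGMENTS = {
    "explore": ["explore_A", "explore_B", "explore_C"],
    "chase":   ["chase_A", "chase_B"],
}

# Optional transition pieces that bridge a change of state.
TRANSITIONS = {("explore", "chase"): "stinger_into_chase"}

def next_segment(current_state, new_state, rng=random):
    """Called at a segment boundary: returns the next segment to queue.
    A transition stinger bridges a state change when one is defined;
    otherwise a segment is drawn from the pool for the new state."""
    if new_state != current_state and (current_state, new_state) in TRANSITIONS:
        return TRANSITIONS[(current_state, new_state)]
    return rng.choice(SEGMENTS[new_state])
```

Because choices happen only at segment boundaries, the music always finishes a musically coherent phrase before responding to gameplay — the central trade-off of horizontal resequencing.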