Interactive Music for the Video Game Composer

As a speaker in the audio track of the Game Developers Conference this year, I enjoyed taking in a number of GDC audio sessions, including a couple of presentations that focused on the future of interactive music in games.  I've explored this topic at length in my book (A Composer's Guide to Game Music), and it was great to see that the game audio community continues to push boundaries and innovate in this area! Interactive music is a worthwhile subject for discussion, and it will only become more important as dynamic music systems grow more prevalent in game projects.  With that in mind, in this blog I'd like to share my personal takeaways from two sessions that described very different approaches to musical interactivity. After that, we'll discuss one of my experiences with interactive music on the video game Spore Hero from Electronic Arts.

Musical Intelligence

Baldur Baldursson (pictured left) is the audio director for Icelandic game development studio CCP Games, the developer responsible for the EVE Online MMORPG.  Together with Professor Kjartan Olafsson of the Iceland Academy of Arts, Baldursson presented a talk at GDC 2016 on a new system to provide “Intelligent Music For Games.”

Baldursson began the presentation by explaining why an intelligent music system for games can be a necessity.  “We basically want an intelligent music system because we can’t (or maybe shouldn’t really) precompose all of the elements,” Baldursson explains. He describes the conundrum of creating a musical score for a game whose story is still fluid and changeable, and then asserts,  “I think we should find ways of making this better.”

Baldursson shared during the course of the talk that the problem of adapting a linear art form (music) to a nonlinear format (interactivity) forces game audio professionals to look for technological solutions to artistic problems. A dynamic music system can best address the issue when it retains the ability to “create the material according to the atmosphere of the game,” Baldursson says.  “It should evolve according to the actual progression of the game in real time.  It should be possible to control various parameters of music simultaneously.”

With these goals in mind, Baldursson teamed up with Professor Olafsson (pictured right), who had designed an interactive music system as part of his thesis at the Sibelius Academy in Finland in 1988.  Olafsson's music system, dubbed CALMUS (for CALculated MUSic), is the product of over twenty years of research and experimentation.  “The idea,” says Olafsson, “was to make a composing program that could use musical elements and material (as we are doing with pencil and paper), and use this new tool to make experimental music.”  To this end, Olafsson studied other interactive music systems, such as the one developed by the team at Electronic Arts for the Spore video game (described in greater detail in this Rolling Stone article).  After these studies and experiments, Olafsson refined the CALMUS system until “in the end we had this program that could compose music for orchestra, for chamber music, electronic music, and now today we are focusing on realtime composing.”

In order to fully exploit the capabilities of the CALMUS system for realtime interactive composition, a game composer must first use the CALMUS application (pictured left) to define the set of parameters the system will use to determine musical events such as themes, harmonies, scales, melodies, textures, and tempos.  “The algorithms we are using, they take care of putting together tones for harmonies and for melodies,” Olafsson says, “and the artificial intelligence system makes it possible for the system to go on, to compose music by itself, but as we are doing it, the composer using the system is defining the framework.”
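To make that division of labor concrete, here is a minimal, purely illustrative sketch in Python (all names here are invented for illustration; CALMUS itself is a standalone application, and none of this code comes from the GDC talk). The composer defines a framework of parameters, and an algorithm composes notes within it:

```python
import random

# The composer's framework: the algorithm may only choose notes
# that satisfy these constraints.
FRAMEWORK = {
    "scale": [0, 2, 4, 7, 9],   # pentatonic pitch classes
    "root": 60,                  # MIDI middle C
    "phrase_length": 8,          # notes per generated phrase
    "tempo_bpm": 96,
}

def generate_phrase(framework, rng):
    """Compose a phrase by choosing scale degrees within the framework."""
    notes = []
    for _ in range(framework["phrase_length"]):
        degree = rng.choice(framework["scale"])
        octave = rng.choice([0, 12])  # stay within a two-octave range
        notes.append(framework["root"] + degree + octave)
    return notes

rng = random.Random(42)
print(generate_phrase(FRAMEWORK, rng))
```

However simplistic, the shape matches what Olafsson describes: the algorithm "composes by itself," but every note it emits falls inside the framework the human composer defined.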

As a part of ongoing testing of the CALMUS software, Olafsson designed a performance-art experiment called “Calmus Waves,” in which the movements of dancers were motion-tracked (using the accelerometers, gyroscopes and compasses built into iPhones strapped to the dancers' bodies).  During the performance, this data defined the parameters for musical events (such as melodies and harmonies) in the CALMUS software.  While the dancers moved, the software recorded their movements and translated them into musical algorithms.
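The core idea of driving musical parameters from sensor data can be sketched in a few lines. This is not the actual "Calmus Waves" code, just an assumed mapping of my own devising: accelerometer magnitude becomes an intensity value, which in turn sets tempo and dynamics:

```python
import math

def motion_to_params(accel_xyz, min_tempo=60, max_tempo=160):
    """Map an accelerometer vector's magnitude to tempo and dynamic level."""
    magnitude = math.sqrt(sum(a * a for a in accel_xyz))
    # Clamp to a 0..2g working range, then normalize to 0..1.
    intensity = min(magnitude, 2.0) / 2.0
    tempo = min_tempo + intensity * (max_tempo - min_tempo)
    dynamic = "forte" if intensity > 0.6 else "piano"
    return {"tempo_bpm": round(tempo), "dynamic": dynamic}

# A nearly still dancer yields slow, quiet parameters...
print(motion_to_params((0.1, 0.0, 0.1)))
# ...while vigorous movement pushes tempo and dynamics upward.
print(motion_to_params((1.5, 1.0, 0.8)))
```

Any real system would smooth the sensor stream over time and use many more parameters, but the mapping principle is the same.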

These algorithms were then instantaneously translated into sheet music appearing on computer tablets for the musicians who were executing the music live during the dance performance.  The result was a reversal of the normal course of creative inception — instead of the dancers moving in accordance with the shapes and activity of the music, the musicians were playing in accordance with the shapes and activity of the dance.  Here’s a video that shows this performance in action:

Uses of this generative music technology for video games are still in the experimental stages.  During the GDC talk, Baldur Baldursson described how his audio team integrated the CALMUS system into prototypes of the EVE Online game, using the audio middleware Wwise from Audiokinetic.  “CALMUS feeds the MIDI events into Wwise,” Baldursson explains, “which hosts the instruments. Currently the system runs outside Wwise, but ideally we're going to have it as a plugin so that we can use it with other games we're making.”

Here’s a video showing the prototype of the CALMUS system operating within EVE Online:

The “Intelligent Music for Games” talk was a fascinating exploration of a MIDI-based generative music system.  The entire talk is available for viewing by subscribers to the GDC Vault.

Musical Precognition

In linear media (such as films), the narrative is written ahead of time. With the story fully conceived, the musical score can emotionally prepare the listener for events that have not yet occurred. But how do we achieve the same awesome results in nonlinear media, when the story is fluid, and there is no way to predict future events?  In the talk, “Precognitive Interactive Music: Yes You Can!,” Microsoft’s senior technical audio director Robert Ridihalgh is joined by former Microsoft audio director Paul Lipson. Together, they explore ways to structure music so that it reacts so quickly to in-game changes that it seems to anticipate what’s to come.

“We've come up with a modular approach to build the musical content on the fly,” says Ridihalgh (pictured left).  This modular approach to interactive music has been dubbed Project Hindemith (named for the famous German composer Paul Hindemith).  The system converts linear music into interactive components in several ways.  First, a set of linear loops is composed.  The instrumentation in these loops is then broken down into several submixes, each representing a different level of energy and excitement.  The music system switches between these submixes as the player progresses and activates trigger points corresponding to the different submix recordings. This collection of loops and triggers results in music that reacts to gameplay in emotionally satisfying ways.  While these ideas are hardly new, Project Hindemith introduces novelty in the way the system handles musical transitions.
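The loop-and-submix structure described above is a common interactive music pattern, and a bare-bones version of the switching logic might look like this (a generic sketch with invented names, not code from Project Hindemith itself):

```python
# One set of looping stems, pre-mixed at several energy levels; gameplay
# triggers raise or lower which submix is audible.
SUBMIXES = ["ambient", "tension", "combat", "climax"]  # low -> high energy

class EnergyMusicSystem:
    def __init__(self, submixes):
        self.submixes = submixes
        self.level = 0  # start at the lowest-energy submix

    def on_trigger(self, delta):
        """A gameplay trigger shifts the energy level, clamped to range."""
        self.level = max(0, min(len(self.submixes) - 1, self.level + delta))

    @property
    def active_submix(self):
        return self.submixes[self.level]

music = EnergyMusicSystem(SUBMIXES)
music.on_trigger(+1)  # player enters a dangerous area
music.on_trigger(+1)  # enemies spotted
print(music.active_submix)  # "combat"
```

In practice the switch would be a beat-synchronized crossfade handled by audio middleware, but the state machine is this simple at its core.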

The ultimate goal of the Hindemith system is to build drama in a satisfying way towards a predetermined “event” – which can be any in-game occurrence that the development team would like to carry extra emotional weight.  To make this work, the Hindemith system focuses special attention on the moments occurring shortly before the player would encounter and activate the “event.”

In preparing music for the Hindemith system, the video game composer is asked to pore over the musical composition, looking for short, declarative segments that may include musical flourishes or escalations.  The composer is not asked to create these short musical segments from scratch, but to isolate them from music that was composed in a traditionally linear way.

“Content is extracted from pre-existing cues,” says Lipson (pictured right).  “Write a beautiful piece of music.  Write a linear idea.  And then pull from that the content that you need that can then be stitched together.”

Copied and isolated from the larger composition, these short chunks of music are now referred to as “Hindebits” in the system.  To prepare these short “Hindebits” for use in the interactive music matrix, the composer processes the music chunks into many variations, each one representing a small change in tempo and/or pitch.  The Hindemith system then calculates how many of these Hindebits will be required to bridge the gap between the player’s current position and the position of the event trigger (towards which these Hindebits are emotionally building).  The system is able to string the Hindebits together by calculating their length and extrapolating the remaining time before the player triggers the event.
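That scheduling step can be illustrated with a small sketch. The names and the greedy strategy here are my own invention (the talk did not reveal the actual algorithm), but the task is as described: given the estimated time before the player triggers the event, string together Hindebit variants that fill the gap:

```python
# Durations (in seconds) of the available Hindebit variants.
HINDEBIT_LENGTHS = [4.0, 3.0, 2.0, 1.5, 1.0]

def plan_bridge(time_to_event, lengths):
    """Greedily choose Hindebit durations summing to at most time_to_event."""
    plan, remaining = [], time_to_event
    for length in sorted(lengths, reverse=True):
        # Use the longest variants first, then fill with shorter ones.
        while length <= remaining:
            plan.append(length)
            remaining -= length
    return plan, remaining  # leftover slack is absorbed by the final swell

plan, slack = plan_bridge(9.5, HINDEBIT_LENGTHS)
print(plan, slack)  # [4.0, 4.0, 1.5] 0.0
```

A production system would also re-plan continuously as the player speeds up or slows down, and would vary which Hindebits it picks to fight repetition, as Ridihalgh notes below.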

“It allows us to react to changes in the amount of time that we’re anticipating it’s going to take a player to get to a particular event,” says Ridihalgh.  “And on top of that, because it’s dynamically changing and it’s built up of these bits, we’re able to fight repetition because it’s different each time.”

The system also allows for the creation and utilization of Hindebits that would be activated if the player were to retreat from the event, rather than triggering it.  In this way, the musical drama can swell and recede in accordance with the player’s decisions, and without requiring the composer to create tons of short musical snippets from scratch.  “Let the composers create their vision and then be able to take that vision and put it into a system like this,” says Ridihalgh.  “This is really a holistic system we’ve come up with.  There’s a guidance spec for composers, and a guidance spec for developers… it does work.  It can really add to gameplay and how the music supports the entire story of the game.”

While Project Hindemith is still in its early stages and hasn’t been fully implemented in a released game, Lipson and Ridihalgh did mention in their presentation that a simplified version of Project Hindemith was used to enable musical interactivity for the Sunset Overdrive video game (pictured left).  We may be able to hear some of this interactivity in the way the music builds during boss battles – so here is a video containing all of Sunset Overdrive’s boss fights.  It’s a pretty long video, but I particularly noticed the interactive musical transitions during the first boss fight, against a creature called “Fizzie.”

The “Precognitive Interactive Music: Yes You Can!” talk offered some interesting new ideas pertaining to musical interactivity for game audio implementation.  The entire talk is available for viewing by subscribers to the GDC Vault.

An Example: Spore Hero

We’ve been talking about some very early prototypes of interactive music systems that haven’t yet debuted in released titles, so let’s take a quick look at an example of how musical interactivity works in practice. I’ve worked on quite a few projects featuring interactive music systems, and one of the more interesting was the Spore Hero video game from Electronic Arts.  The Spore Hero soundtrack featured multiple systems of musical interactivity, as well as special musical minigames that tied the music directly to the gameplay mechanics.  One of my favorite interactive music moments comes right at the beginning, in the game’s opening menu system. It’s a simple interactive music system, and the music happens to be the main theme of the game. While simple, the system is quite satisfying in its execution, and it shows that musical interactivity can be straightforward and streamlined while still responding pleasingly to the choices of players.

The Spore Hero main menu system has three basic musical components — the Main Menu music, the Battle Menu music, and the Sporepedia Menu music.  All three tracks present very different executions of the same melodic content from the Spore Hero Main Theme, with different instrumentation and atmosphere.  For instance, the Main Menu is by far the most dramatic in its orchestral treatment, while the Sporepedia Menu is quietly ambient and the Battle Menu is jaunty and primitive.  Transitions from one track to another are seamless because all three play simultaneously in a Vertical Layering system (which we’ve discussed in previous blogs). The shifts in emotion and intensity between the menus are directly attributable to the emotive swells and dips emanating from the interactive music system.  Here’s a video that shows this system in action:
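The Vertical Layering behavior just described can be sketched in a few lines. This is a generic illustration of the technique, not the actual Spore Hero implementation: every arrangement runs from the same shared playhead, and menu changes adjust volumes rather than restarting any track:

```python
class VerticalLayers:
    """All layers play in sync from one playhead; only their volumes differ."""

    def __init__(self, layer_names):
        self.volumes = {name: 0.0 for name in layer_names}

    def focus(self, name):
        """Bring one layer to full volume and silence the others."""
        for layer in self.volumes:
            self.volumes[layer] = 1.0 if layer == name else 0.0

menus = VerticalLayers(["main_menu", "battle_menu", "sporepedia_menu"])
menus.focus("main_menu")    # dramatic orchestral arrangement
menus.focus("battle_menu")  # seamless shift: same bar, new arrangement
print(menus.volumes)
```

Because the layers never fall out of sync, the switch can happen at any moment without a musical seam; a real implementation would ramp the volumes over a short crossfade rather than snapping them instantly.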

On a related note, since Disney’s The Jungle Book movie is currently riding high as the number one box office blockbuster, I thought I’d share this video: the same Spore Hero Main Theme that I composed for the game was subsequently featured prominently in an official trailer in which director Jon Favreau discusses his popular Jungle Book movie.

I think this use of the music I originally composed for Spore Hero’s interactive score, now heard in the movie trailer for Disney’s The Jungle Book, also demonstrates that interactive music can be written with the kind of emotional contours associated with linear composition for film soundtracks:


While the systems described in the presentations from this year’s Game Developers Conference are prototypes, they offer an interesting glimpse into what ambitious game audio teams will be implementing for musical interactivity in the future.  I hope you’ve enjoyed this blog that explored a few interactive music systems, and please let me know what you think in the comments section below!


Winifred Phillips is an award-winning video game music composer.  Her credits include five of the most famous and popular franchises in video gaming: Assassin’s Creed, LittleBigPlanet, Total War, God of War, and The Sims.  She is the author of the award-winning bestseller A COMPOSER’S GUIDE TO GAME MUSIC, published by the Massachusetts Institute of Technology Press. Follow her on Twitter @winphillips.

Comedic Sound for the Game Music Composer

In this week’s blog, I’d like to explore the role that comedy can play in a video game, and how we as game composers can use some of the techniques from comedic sound design to our best advantage.  Along the way, we’ll be looking at an interesting essay article by pop culture critic Christopher Gates, a presentation by game sound designer Luca Fusi at the December 2015 Vancouver Sound Design Meetup, and an interview with film sound designer Chris Scarabosio.

I’ll also be sharing some of my experiences applying comedic sound design techniques during music composition for the video game The Maw – an award-winning and very funny game developed by Twisted Pixel Games.  To the left, you can see that I’m working hard to give The Maw its proper dose of comedic wackiness… but more on that later.

First, let’s get a broad perspective on the role of comedy in gaming.


From Total War to Assassin’s Creed: Music from my GDC Talk

Last week, it was my honor and pleasure to give a presentation at the Game Developers Conference in San Francisco. My talk was entitled “From Total War to Assassin’s Creed: Music for Mobile Games,” and focused on the most effective methods for composing and implementing music in portable gaming.  The talk was structured for the benefit of video game composers and game audio pros, and as part of the presentation I played short excerpts of music that I composed for several of my top mobile and handheld video game projects. Now that GDC is over, I thought I’d provide streaming links to some of the complete music tracks featured during my presentation, in case attendees were curious about the complete pieces. So, without further ado, here are tracks from my GDC 2016 talk!

Assassin’s Creed Liberation

The Assassin’s Creed Liberation game was released by Ubisoft for the PlayStation Vita, and delivered an immersive experience from the popular Assassin’s Creed franchise. The game was designed specifically for a portable system, and as such, all aspects of the design were adjusted to cater specifically to a portable gaming experience, including the music.

Game composer Winifred Phillips speaking about the music of Assassin's Creed Liberation at GDC 2016


Video Game Music Concert Tours

This week I thought we’d check in with some of the top orchestral video game music concert tours currently underway.  We’ll take a look at reviews of 2015 performances from the respective tours, along with video from some of the most recent concerts.

The Legend of Zelda: Symphony of the Goddesses

Originating as a simple four-minute overture performed at a Nintendo press event in 2011, Symphony of the Goddesses kicked off as a full-fledged concert tour in January 2012 and currently has 33 dates scheduled for 2016 that will take the popular tour all around the world.  The concert’s program lineup focuses exclusively on famous music from the Legend of Zelda games.  In a review of the September 25th 2015 performance at the Providence Performing Arts Center in Rhode Island, Broadway World critic Andria Tieman wrote, “Overall, this was a night of fantastic music, excellent people-watching and a fun, visual performance. This is something that Zelda fans should certainly seek out.” Here’s a video clip from the Oct. 30th 2015 broadcast of the Late Show with Stephen Colbert, in which the Symphony of the Goddesses tour performed their Legend of Zelda Medley:


Strategies in Audio & Music for Portable Games

I’ll be talking about effective music composition for mobile and portable gaming platforms during my talk, “From Total War to Assassin’s Creed: Music for Mobile Games,” which will take place on March 16th at the upcoming Game Developers Conference at the Moscone Center in San Francisco.  With that in mind, I thought I’d use this blog entry to share some resources that explore current strategies and trends in regards to sound and music for mobile – resources that could be useful to the video game composer and sound designer.

While my talk at GDC will focus specifically on music composition and implementation for handheld devices, the resources that follow in this blog address the more general technical issues facing audio pros who create sound assets for a mobile gaming environment.  I’ve included links to the original articles, as well as a summation of some of the points that I found particularly interesting:


VR Audio: Past, Present & Future

In this blog, I thought we might take a quick look at the development of the three-dimensional audio technologies that promise to be a vital part of music and sound for a virtual reality video game experience. Starting from its earliest incarnations, we’ll follow 3D audio through the fits and starts of its tumultuous history.  We’ll trace its development to the current state of affairs, and we’ll even try to imagine what may be coming in the future!  But first, let’s start at the beginning:

3D Audio of the Past

In the 1930s, English engineer and inventor Alan Blumlein invented a recording process that used a pair of coincident microphones (i.e. placed closely together to capture a sound source).  Blumlein’s intent was to accurately capture the directional position of the sounds being recorded, thus conveying spatial relationships in a more faithful way.  In reality, Blumlein had invented what we now call stereo, but the inventor himself referred to his technique as “binaural sound.”  As we know, stereo has been an extremely successful format, but the fully realized concept of “binaural sound” would not come to fruition until much later.


Can Game Music and Sound Combat VR Sickness?

Virtual Reality Sickness: the nightmare of VR developers everywhere.  We all know the symptoms.  Nausea.  Headache.  Sweating. Pallor.  Disorientation. Together, these symptoms are a perfect recipe for disaster. No one wants their game to make players feel like they’ve been spinning on a demon-possessed merry-go-round.  So, how do we keep this affliction from destroying the brand new VR industry before it even gets a chance to get off the ground?

In response to this possible VR apocalypse, the top manufacturers have taken big steps to improve their popular devices.  Oculus improved the display on its famous Rift device, Valve introduced a motion-tracking system that helps us orient ourselves and not get nauseous when wearing the Vive, and PlayStation VR incorporated a wider field of view designed to make players feel more comfortable. Even with these efforts, players are still reporting motion sickness symptoms, and the creators of the VR systems have responded by pointing the finger of blame at game developers.  So, if the developers of VR games have to solve the problem, then how can the music and sound folks help? Can game music and sound combat VR sickness?
