Video game music systems at GDC 2017: tools and tips for composers

Photo of video game composer Winifred Phillips, working in her music production studio on the music of the SimAnimals video game.

By video game composer Winifred Phillips | Contact | Follow

Welcome back to this three-article series bringing together the ideas discussed in five different GDC 2017 audio talks about interactive music!  These five speakers explored discoveries they'd made while creating interactivity in the music of their own game projects.  We're looking at these ideas side by side to broaden our viewpoint and gain a sense of the "bigger picture" when it comes to leading-edge thinking on music interactivity in games.  Throughout the series, we've been looking at the five interactive music systems discussed in those GDC 2017 presentations.

In the first article, we examined the basic nature of these interactive systems.  In the second article, we contemplated why those systems were used, discussing the inherent pros and cons of each in turn.  So now, let's get into the nitty-gritty of tools and tips for working with such interactive music systems.  If you haven't read parts one and two of this series, please go do so now and then come back:

  1. Video game music systems at GDC 2017: what are composers using?
  2. Video game music systems at GDC 2017: pros and cons for composers

Ready?  Great!  Here we go!

Tools and tips

To implement his interactive music system for the ocean exploration game ABZU, sound designer Steve Green relied on the popular Wwise middleware application to facilitate transitions from one piece of music to another.  "This idea of changing music on the fly rather than bringing instruments in and out was a good example of how to do ABZU," Green says.  To implement these transitions, Green relied primarily on simple horizontal resequencing, with some limited vertical layering to ease transitions.  At times, synchronization points were placed (usually at the beginning of each measure) to enable the music system to switch from one track to another.  Sometimes, though, this approach was too abrupt, and a dedicated transitional piece was needed.  "Transitionary pieces are basically to help two tracks that are just not going to flow well together," Green explains.

For instance, when the player is traversing a particular portion of a level (i.e. a stage), the audio engine keeps track of the player's progress.  "We used a parameter that would gauge how far along you are in this stage of the level," Green says.  "Once you reach the 90% range, (the audio engine) would call a transition."  This musical transition would play simultaneously with the previous track, allowing the previous track to fade out and facilitating either a gentler culmination into silence or a more graceful segue to another piece of music.
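To make that idea a little more concrete, here's a minimal sketch of a progress-driven transition trigger.  It isn't Green's actual implementation: the set_music_parameter() and post_music_event() helpers are placeholders standing in for whatever middleware calls (Wwise RTPCs and events, for example) a real project would use.

```python
TRANSITION_THRESHOLD = 0.90  # Green cites "the 90% range"

def set_music_parameter(name: str, value: float) -> None:
    """Stand-in for a middleware RTPC call (an assumption, not a real API)."""
    print(f"[music] {name} = {value:.2f}")

def post_music_event(event_name: str) -> None:
    """Stand-in for posting a middleware event (an assumption)."""
    print(f"[music] event: {event_name}")

class LevelMusicController:
    """Tracks stage progress and fires a single transition near the stage's end."""
    def __init__(self) -> None:
        self.transition_called = False

    def update(self, player_distance: float, stage_length: float) -> None:
        progress = min(player_distance / stage_length, 1.0)
        # Expose progress to the music system so it can react continuously.
        set_music_parameter("StageProgress", progress)
        # Once the player nears the end of the stage, trigger the transition
        # piece that overlaps the current track and lets it fade out.
        if progress >= TRANSITION_THRESHOLD and not self.transition_called:
            post_music_event("Play_Stage_Transition")
            self.transition_called = True

controller = LevelMusicController()
controller.update(player_distance=450.0, stage_length=500.0)  # 90% reached, transition fires
```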


 

During educator Leonard J. Paul's GDC presentation, he discussed his work on the audio team of the Vessel platforming game, but he also shared insights into several other projects from his repertoire, along with lots of general advice that's worth considering.  "When working in games, of course, you've got to think about code," Paul says. "You've got to think about RAM – how much space this is all gonna fit in – streams, if you're streaming.  You've got to think about what your RAM window size is and how fast you can get information in and out – how many streams you can do at once, what kind of compression you're going to use," Paul continues, listing areas of technical concern for the game audio expert to think about when working on a project like Vessel.  "Because if you don't," he warns, "then you're going to run into some issues."
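To show what that kind of budgeting can look like in practice, here's a small back-of-the-envelope sketch in the spirit of Paul's advice.  Every number in it (the 30 MB budget, the compression ratios, the asset lengths) is a made-up placeholder rather than a figure from Vessel or any other project.

```python
BYTES_PER_SAMPLE = 2                   # 16-bit PCM
SAMPLE_RATE = 48_000                   # Hz
RAM_BUDGET_BYTES = 30 * 1024 * 1024    # e.g. a 30 MB audio budget (placeholder)

def asset_footprint(seconds: float, channels: int, compression_ratio: float) -> int:
    """Approximate in-memory size of one audio asset after compression."""
    raw = seconds * SAMPLE_RATE * channels * BYTES_PER_SAMPLE
    return int(raw / compression_ratio)

# (name, length in seconds, channel count, assumed compression ratio)
assets = [
    ("ui_sounds", 5.0, 1, 4.0),
    ("ambience_loop", 60.0, 2, 8.0),
    ("combat_stingers", 20.0, 2, 8.0),
]

total = sum(asset_footprint(sec, ch, ratio) for _, sec, ch, ratio in assets)
print(f"resident audio: {total / 1024 / 1024:.1f} MB of "
      f"{RAM_BUDGET_BYTES / 1024 / 1024:.0f} MB budget "
      f"({'OK' if total <= RAM_BUDGET_BYTES else 'OVER'})")
```

A spreadsheet does the same job, of course; the point is simply to track these numbers from the start rather than discovering them at the end.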

Paul also mentions some of the better-known software tools available to audio pros, including Wwise, FMOD Studio, Fabric, and Pure Data, and offers a number of general words of advice.  "Technical tips – use spreadsheets, be organized, use tools to refine your process, embrace prototyping," Paul urges. "Do audio sketches and get this stuff done early."

 

My perspective on sketches and prototypes

In his presentation, Leonard J. Paul brought up a good point about the importance of sketches and prototypes in the workflow of a video game composer.  Most game music composers have gone through this type of iterative process.  For instance, for the SimAnimals game from Electronic Arts, I composed lots of sketches and prototypes before settling on the game's ultimate style.  The audio team had many differing ideas about what the music should sound like, so the initial brainstorming process involved quite a bit of trial and error.  It was a game about heartwarming animal relationships, so should the music feel old-fashioned and lyrical?  On the other hand, the game focused on strategy mechanics, so should the music feel more puzzle-like and intellectual?  Also, the game included darker underpinnings associated with suffering and dissolution, so should the music stick with a light and airy feel, or should it weave dark and sinister textures into the mix?  I tried tons of permutations on all these ideas until I hit upon the right balance of elements, and in the end my music for SimAnimals took me in unexpectedly challenging directions.  The final style couldn't have been achieved without the trial and error that characterized my earliest compositional efforts.  As an example, I'm including here two videos showing the difference between the brightly cheerful main theme music of the game and one of several grim and dissonant variations on the same melodic theme:


 

For Final Fantasy XV, audio programmer Sho Iwamoto created an audio engine he dubbed MAGI (Music API for Gaming Interaction).  While the engine accomplished tasks similar to the functionality of applications such as Wwise and FMOD, the MAGI system also included some creative and original specializations.  "There are a lot of time signature variations or changes in JRPG music, so I designed MAGI to support them," Iwamoto says.  Among its other capabilities, MAGI can adjust to changing tempo and time signature values across the length of a composition, allowing flexibility in the placement of synchronization points within the track.  This capability was born out of both a spirit of innovation and pure necessity.  Because Iwamoto was hired by Square Enix just two years earlier, much of the music of Final Fantasy XV had already been composed and recorded before he joined the team.  "Many of these (compositions) were not planned to be interactive," Iwamoto admits.  Because of this, the musical compositions included variable tempos, dramatic pauses, and frequent shifts in time signature.
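As a rough illustration of the kind of bookkeeping such a system needs, here's a simplified tempo-and-meter map that computes bar-start times as candidate sync points.  This is only a sketch of the general idea, not Iwamoto's MAGI implementation, and the tempos and time signatures in the example are invented.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    bars: int            # how many bars this segment lasts
    bpm: float           # tempo in beats per minute
    beats_per_bar: int   # time signature numerator (beat = one metronome click)

def bar_start_times(segments: list[Segment]) -> list[float]:
    """Return the start time (in seconds) of every bar across all segments."""
    times, t = [], 0.0
    for seg in segments:
        seconds_per_bar = seg.beats_per_bar * 60.0 / seg.bpm
        for _ in range(seg.bars):
            times.append(round(t, 3))
            t += seconds_per_bar
    return times

# Example: four bars of 4/4 at 120 BPM, then two bars of 3/4 at 90 BPM.
candidate_sync_points = bar_start_times([Segment(4, 120.0, 4), Segment(2, 90.0, 3)])
print(candidate_sync_points)
```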

In order to make these tracks interactive, Iwamoto used the MAGI system to insert custom synchronization points into each composition so that the transitions would be musically pleasing.  "All you have to do is just move or delete (sync points) when you think the transitions are not musical, or add the sync points when you think the transition is musical," Iwamoto comments.  "Sometimes these sync points can be hard to find, and you may have to wait sixteen bars or more to (find a workable) transition."  This became a frequent issue during combat music, so Iwamoto devised a strategy he called the pre-end section, so named because it is always used as a preparation for the ending of the combat track.  "The pre-end section is designed to have more constant and dense sync points," Iwamoto explains.  "That makes for really quick transitions."  Existing as a separate music file, the pre-end section would always be in the same style as the main body of the composition, so the music could switch from the main body to the pre-end section smoothly.  Designed to be more consistent in terms of tempo, key and time signature, the pre-end section could accommodate lots of sync points, allowing the music to transition to a finishing flourish with almost instantaneous precision when combat ends.  Despite its seeming complexity, Iwamoto meant for this system within the MAGI engine to be intuitive by design.  "We used very simple and basic approaches," Iwamoto says.  "I wanted to make (MAGI) very easy and versatile."
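Here's a loose sketch of the pre-end idea as I understand it from the talk: a separate section with far denser sync points lets the ending flourish land almost immediately.  The sync-point spacings, function names, and decision logic below are illustrative assumptions, not the actual MAGI code.

```python
def next_sync_point(playhead: float, sync_points: list[float]) -> float:
    """First sync point at or after the current playback position."""
    return min((t for t in sync_points if t >= playhead), default=sync_points[-1])

main_body_syncs = [0.0, 16.0, 32.0, 48.0]        # sparse: roughly one per phrase
pre_end_syncs   = [i * 0.5 for i in range(129)]  # dense: one every half second

def on_victory(playhead: float, in_pre_end: bool) -> str:
    """Decide where the music can next jump when combat is wrapping up."""
    if not in_pre_end:
        # Still in the main body: hop to the stylistically matching pre-end file
        # at the next (possibly distant) sync point.
        return f"switch to pre-end section at {next_sync_point(playhead, main_body_syncs)}s"
    # Already in the dense pre-end section: the ending flourish lands almost at once.
    return f"play ending flourish at {next_sync_point(playhead, pre_end_syncs)}s"

print(on_victory(20.3, in_pre_end=False))
print(on_victory(20.3, in_pre_end=True))
```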


 

For audio director Becky Allen, simple and basic approaches are important for very different reasons.  "We are a mobile studio and we have to be very mindful of our size because of the devices we're played on," Allen says, describing her strategy for the MIDI music system of Plants vs. Zombies: Heroes.  "We're always thinking about size, and ways to be smart."  Part of that strategy involved a plan for limiting the amount of memory required for the MIDI files and their associated library of instrument sounds.  At first, those limitations were meant to be sensible but not extreme.  "80 megs," Allen says. "40 for music, 40 for sound effects and VO."  But then the audio development process hit a snag when another mobile game came along and made a big splash in the marketplace.  "Along came Clash Royale, and that was a quick game, it loaded quickly," Allen says.  "We had some pressure to come down to 30 megabytes."

In the end, the team held to 60 megabytes, but even this required sacrifices.  "We switched some things from stereo to mono but not everything. We streamed all the WAV music and we streamed larger WAV assets."  After making these and other modifications, the newly streamlined music system was ready to show how interactive it could be.  For instance, by associating the player's relative health with an RTPC (real-time parameter control) in Wwise, the music system could adjust MIDI data on the fly in accordance with the player's success or failure.  "If you're up by five points all the MIDI music went up a whole step, and if you're up by ten points all the MIDI music went up two whole steps. And it worked, it was awesome," Allen enthuses.  "Utilizing your tools and an open mind and a flexible mind and a flexible team," Allen says, "you can make these changes all the way along through the process."
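A tiny sketch can show how that score-driven transposition might work.  The +5 and +10 point thresholds come straight from Allen's quote, but the helper functions and the idea of shifting raw MIDI note numbers directly are stand-ins for the real Wwise RTPC and MIDI hookup.

```python
def transposition_for_lead(point_lead: int) -> int:
    """Map the player's point lead to a transposition in semitones."""
    if point_lead >= 10:
        return 4    # up two whole steps
    if point_lead >= 5:
        return 2    # up one whole step
    return 0

def apply_transposition(midi_notes: list[int], semitones: int) -> list[int]:
    """Shift every MIDI note number; the real system drives this via an RTPC."""
    return [note + semitones for note in midi_notes]

theme = [60, 64, 67, 72]                                        # C major arpeggio
print(apply_transposition(theme, transposition_for_lead(7)))    # up a whole step
print(apply_transposition(theme, transposition_for_lead(12)))   # up two whole steps
```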


 

If flexibility is the ultimate ambition of an interactive music system, then a procedural music system could be considered the definitive realization of that goal.  While procedural music can be viewed as the height of interactivity, Paul Weir had built such systems for games before, and his viewpoint on the technology is much more down-to-earth.  "Essentially they're glorified random file players with a little bit of logic stuck on top," Weir admits.  Together with the development team for the space exploration game No Man's Sky, Weir set out to see what procedural music could bring to the project.  First, Weir hired the drum-driven electronica band 65daysofstatic, then asked them to start creating music for the science fiction game.  "We were kind of really adamant that as a band, I did not want to interfere creatively with their process," Weir says.  "So we always said, write an album, just write us an album, and we're not going to start telling you how to do that, because you know how to do that – because you're the band.  So we let them go off and write a very traditional album, but in the knowledge that we were going to come back to it later and just rip it all apart."

Using Pulse (his self-made procedural music generation tool), Weir set about disassembling the elements of the submitted music so that it could be incorporated into the procedural system.  He quickly learned that it would require more than just a retooling of the original content.  "So it wasn't a question of saying just give us the stems and we'll do it," Weir says. "It was like, no no no, go right back and do us more performances, take out bits, give us more drum loops, perform new guitar riffs, create new stuff – almost like kind of remixing the original tracks."  With these musical fragments, the procedural system could then assemble the library of elements into new combinations in accordance with the player's locale and activity.  For instance, the music might be triggered by such in-game variables as proximity to buildings and time spent walking, drawing on what amounted to a large library of musical 'soundscape sets' that alternated randomly depending on how long any previous soundscape set had been playing.  "In the game you don't get the album," Weir insists.  "The game soundtrack is bits of what appear on the album. It's lots of bits that aren't on the album, but it feels relatively cohesive."
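Here's a loose sketch in the spirit of Weir's "glorified random file player with a little bit of logic" description.  The in-game variables (proximity to buildings, how long the current set has been playing) come from the talk; the set names, fragment names, and timing threshold are illustrative assumptions rather than details of the Pulse system.

```python
import random

# Hypothetical soundscape sets; the fragment names are purely illustrative.
SOUNDSCAPE_SETS = {
    "wilderness": ["wild_pad_01", "wild_guitar_03", "wild_drums_02"],
    "settlement": ["town_pad_02", "town_riff_01"],
}

MIN_SET_DURATION = 120.0  # seconds a set plays before it may be swapped (assumed value)

def choose_soundscape(current_set: str, seconds_in_current_set: float,
                      near_building: bool) -> str:
    """Keep or swap the active soundscape set based on simple in-game variables."""
    if seconds_in_current_set < MIN_SET_DURATION:
        return current_set                       # let the current set keep playing
    return "settlement" if near_building else "wilderness"

def next_fragment(set_name: str) -> str:
    """The 'random file player' part: pull an arbitrary fragment from the set."""
    return random.choice(SOUNDSCAPE_SETS[set_name])

active = choose_soundscape("settlement", seconds_in_current_set=150.0, near_building=False)
print(active, next_fragment(active))
```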


 

Conclusion

So, there we have it!  We’ve compared the viewpoints of five game audio pros discussing the interactive music systems of their projects during their presentations at GDC 2017.  Interactive music is a fascinating subject, ranging from the simplest of solutions to the most complex and intricate of designs.  With ingenuity and patience, these audio developers have introduced their own creativity into the implementation of music in games.  Their strategies and solutions can help us to broaden our minds when we’re thinking about ways to make our own music interactive!  I hope you’ve enjoyed this three-article series, and please feel free to leave your comments below!

 

Winifred Phillips is an award-winning video game music composer whose most recent projects are the triple-A first person shooter Homefront: The Revolution and the Dragon Front VR game for Oculus Rift. Her credits include games in five of the most famous and popular franchises in gaming: Assassin's Creed, LittleBigPlanet, Total War, God of War, and The Sims. She is the author of the award-winning bestseller A COMPOSER'S GUIDE TO GAME MUSIC, published by the MIT Press. As a VR game music expert, she writes frequently on the future of music in virtual reality games. Follow her on Twitter @winphillips.

Video game music systems at GDC 2017: pros and cons for composers

Video game composer Winifred Phillips, pictured in her music production studio working on the music of LittleBigPlanet 2 Cross Controller

By Winifred Phillips | Contact | Follow

Welcome back to our three-article series dedicated to collecting and exploring the ideas that were discussed in five different GDC 2017 audio talks about interactive music!  These five speakers shared ideas they'd developed in the process of creating interactivity in the music of their own game projects.  We're looking at these ideas side by side to cultivate a sense of the "bigger picture" when it comes to leading-edge thinking on music interactivity in games.  In the first article, we looked at the basic nature of the five interactive music systems discussed in those GDC 2017 presentations.

If you haven’t read part one of this article series, please go do that now and come back.

Okay, so let’s now contemplate some simple but important questions: why were those systems used?  What was attractive about each interactive music strategy, and what were the challenges inherent in using those systems?

Continue reading

VR Game Composer: Music Beyond the Virtual

Welcome to the third installment in our series on the fascinating possibilities created by virtual reality motion tracking, and how the immersive nature of VR may serve to inspire us as video game composers and afford us new and innovative tools for music creation.  As modern composers, we work with a lot of technological tools, as I can attest from the studio equipment that I rely on daily (pictured left). Many of these tools communicate with each other by virtue of the Musical Instrument Digital Interface protocol, commonly known as MIDI – a technical standard that allows music devices and software to interact.

In order for a VR music application to control and manipulate external devices, the software must be able to communicate by way of the MIDI protocol – and that's an exciting development in the field of music creation in VR!

This series of articles focuses on what VR means for music composers and performers. In previous installments, we’ve had some fun exploring new ways to play air guitar and air drums, and we’ve looked at top VR applications that provide standalone virtual instruments and music creation tools.  Now we’ll be talking about the most potentially useful application of VR for video game music composers – the ability to control our existing music production tools from within a VR environment.

We’ll explore three applications that employ MIDI to connect music creation in VR to our existing music production tools. But first, let’s take a look at another, much older gesture-controlled instrument that in ways is quite reminiscent of these motion-tracking music applications for VR:

Continue reading

VR Game Composer: Music Inside the Machine

Welcome to part two of our ongoing exploration of some interesting possibilities created by the motion tracking capabilities of VR, and how this might alter our creative process as video game composers.

In part one we discussed how motion tracking lets us be awesome air guitarists and drummers inside the virtual space.  In this article, we’ll be taking a look at how the same technology will allow us to make interesting music using more serious tools that are incorporated directly inside the VR environment – musical instruments that exist entirely within the VR ‘machine.’

Our discussion to follow will concentrate on three software applications: Soundscape, Carillon, and Lyra.  Later, in the third article of this ongoing series, we’ll take a look at applications that allow our VR user interfaces to harness the power of MIDI to control some of the top music devices and software that we use in our external production studios. But first, let’s look at the ways that VR apps can function as fully-featured musical instruments, all on their own!

Soundscape

Let’s start with something simple – a step sequencer with a sound bank and signal processing tools, built for the mobile virtual reality experience of the Samsung Gear VR.

I got a chance to demo the Samsung Gear VR during the Audio Engineering Society Convention in NYC last year, and while it doesn't offer the best or most mind-blowing experience in VR (such as what we can experience from products like the famous Oculus Rift), it does achieve a satisfying level of immersion. Plus, it's great fun!  The Soundscape VR app was built for Samsung Gear VR by developer Sander Sneek of the Netherlands.  It's a simple app designed to enable users to create dance loops using three instruments from a built-in electro sound library, a pentatonic step sequencer that enables the user to create rhythm and tone patterns within the loops, and a collection of audio signal processing effects that let the user warp and mold the sounds as the loops progress, adding variety to the performance.

Continue reading

VR Game Composer: Music in the Air

Since I've been working recently on music for a Virtual Reality project (more info in the coming months), I've been thinking a lot about VR technology and its effect on the creative process.  Certainly, VR is going to be a great environment in which to be creative and perform tasks and skills with enhanced focus, according to this article from the VR site SingularityHub.  I've written in this blog before about the role that music and sound will play in the Virtual Reality gaming experience.  It's clear that music will have an impact on the way in which we experience VR, not only during gaming experiences, but also when using the tools of VR to create and be productive.  With that in mind, let's consider if the opposite statement may also be true – will VR impact the way in which we experience music, not only as listeners, but also as video game composers?

Simple VR technologies like the popular Google Cardboard headset can be a lot of fun – as I personally experienced recently (photo to the left).  However, they offer only rudimentary visual immersion, omitting some of the most compelling aspects of the VR experience.  When motion tracking (beyond simple head movement) is added to the mix, the potential of VR explodes.  Over the next three articles, we'll be exploring some interesting possibilities created by the motion tracking capabilities of VR, and how this might alter our creative process.  In the first article, we'll have some fun exploring new ways to play air guitar and air drums in the VR environment. In the second article, we'll take a look at ways to control virtual instruments and sound modules that are folded into the VR software.  And finally, in the third article we'll explore the ways in which VR motion tracking is allowing us to immersively control our existing real-world instruments using MIDI. But first, let's take a look at the early days of VR musical technology!

Continue reading

Video Game Music Production Tips from GDC 2016

I was pleased to give a talk about composing music for games at the 2016 Game Developers Conference (pictured left).  GDC took place this past March in San Francisco – it was an honor to be a part of the audio track again this year, which offered a wealth of awesome educational sessions for game audio practitioners.  So much fun to see the other talks and learn about what's new and exciting in the field of game audio!  In this blog, I want to share some info that I thought was really interesting from two talks that pertained to the audio production side of game development: composer Laura Karpman's talk about "Composing Virtually, Sounding Real" and audio director Garry Taylor's talk on "Audio Mastering for Interactive Entertainment."  Both sessions had some very good info for video game composers who may be looking to improve the quality of their recordings.  Along the way, I'll also be sharing a few of my own personal viewpoints on these music production topics, and I'll include some examples from one of my own projects, the Ultimate Trailers album for West One Music, to illustrate ideas that we'll be discussing.  So let's get started!

Continue reading

The Great MIDI Comeback?

I recently read a great article by Bernard Rodrigue of Audiokinetic in Develop Magazine, heralding the return of MIDI to the field of video game music.  It was a very well-written article, filled with optimism about the potential of MIDI to bring new musical capabilities to interactive video game scores, particularly in light of the memory and CPU resources of modern games consoles.

It also reminded me strongly of another article I read, from 2010.

Four years ago, Microsoft Sound Supervisor West Latta wrote for Shockwave-Sound.com that “we may see a sort of return to a hybrid approach to composing, using samples and some form of MIDI-like control data… the next Xbox or Playstation could, in fact, yield enough RAM and CPU power to load a robust (and highly compressed) orchestral sample library.”

So, it seems that the game audio sector has been anticipating a return to MIDI for a while now (I wrote at length about the history and possible future of MIDI in my book, A Composer's Guide to Game Music).  The question is – has the current generation of video game consoles evolved to the point that a quality orchestral sample library could be loaded and driven by MIDI within a modern video game?  So far, I haven't come across an answer to this question, and it's a very intriguing mystery.

Certainly, the availability of an orchestral sample library in a MIDI-based interactive video game score would depend on factors that are not all hinged to the technical specs of the hardware.  Would the development teams be willing to devote that amount of memory to a quality orchestral sample library?  As games continue to participate in a visual arms race, development teams devote available hardware horsepower to pixels and polygons… so, would the music team be able to get a big enough slice of that pie to make a high-quality orchestral MIDI score possible?

I’m keeping my eyes open for developments in this area. Certainly, the return of MIDI could be a game changer for composers of interactive music, but only if the musical standards remain high, both in terms of the music compositions and the quality of the instruments used within them. Let me know in the comments if you’ve heard any news about the great MIDI comeback!