Video game music systems at GDC 2017: tools and tips for composers

Photo of video game composer Winifred Phillips, working in her music production studio on the music of the SimAnimals video game.

By video game composer Winifred Phillips | Contact | Follow

Welcome back to this three-article series bringing together the ideas discussed in five different GDC 2017 audio talks about interactive music!  These five speakers explored discoveries they’d made while creating interactivity in the music of their own game projects.  We’re looking at these ideas side-by-side to broaden our viewpoint and gain a sense of the “bigger picture” when it comes to leading-edge thinking about music interactivity in games. Throughout this series, we’ve been looking at the five interactive music systems discussed in those GDC 2017 presentations.

In the first article, we examined the basic nature of these interactive systems. In the second article, we contemplated why those systems were used, weighing the inherent pros and cons of each system in turn.  So now, let’s get into the nitty-gritty of tools and tips for working with such interactive music systems.  If you haven’t read parts one and two of this series, please go do so now and then come back:

  1. Video game music systems at GDC 2017: what are composers using?
  2. Video game music systems at GDC 2017: pros and cons for composers

Ready?  Great!  Here we go!

Tools and tips

Illustration of a GDC 2017 presentation, from the article by Winifred Phillips (video game composer).

To implement his interactive music system for the ocean exploration game ABZU, sound designer Steve Green relied on the popular Wwise middleware application to facilitate transitions from one piece of music to another.  “This idea of changing music on the fly rather than bringing instruments in and out was a good example of how to do ABZU,” Green says.  To implement these transitions, Green relied primarily on simple horizontal resequencing, with some limited vertical layering to ease transitions.  At times, synchronization points were placed (usually at the beginning of each measure) to enable the music system to switch from one track to another.  Sometimes this approach was too abrupt, so transitional pieces were written to smooth things over.  “Transitionary pieces are basically to help two tracks that are just not going to flow well together,” Green explains.

From the article by game composer Winifred Phillips - an illustration of the game ABZU.
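
To make the idea of measure-aligned switching concrete, here’s a minimal Python sketch of the general logic.  It’s purely illustrative, with hypothetical class and function names; it isn’t Wwise’s API or ABZU’s actual code.

```python
# Minimal sketch of deferring a track switch until the next synchronization
# point at a measure boundary (hypothetical names, not the Wwise API).

def next_sync_point(position_sec, tempo_bpm, beats_per_measure=4):
    """Return the time (in seconds) of the next measure boundary."""
    measure_len = (60.0 / tempo_bpm) * beats_per_measure
    measures_elapsed = int(position_sec // measure_len)
    return (measures_elapsed + 1) * measure_len

class HorizontalSequencer:
    def __init__(self, tempo_bpm):
        self.tempo_bpm = tempo_bpm
        self.current_track = None
        self.pending_track = None
        self.switch_time = None

    def request_switch(self, new_track, position_sec):
        # Queue the switch; it only takes effect at the next measure boundary.
        self.pending_track = new_track
        self.switch_time = next_sync_point(position_sec, self.tempo_bpm)

    def update(self, position_sec):
        # Called every tick with the current playback position.
        if self.pending_track and position_sec >= self.switch_time:
            self.current_track = self.pending_track
            self.pending_track = None
            self.switch_time = None

# Usage: a switch requested 10.3 seconds into a 120 BPM cue lands at 12.0 seconds.
seq = HorizontalSequencer(tempo_bpm=120)
seq.current_track = "exploration_a"
seq.request_switch("exploration_b", position_sec=10.3)
seq.update(position_sec=12.0)
print(seq.current_track)  # -> "exploration_b"
```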

For instance, when the player is traversing a particular portion of a level (i.e. a stage), the audio engine keeps track of the player’s progress.  “We used a parameter that would gauge how far along you are in this stage of the level,” Green says.  “Once you reach the 90% range, (the audio engine) would call a transition.”  This musical transition would play simultaneously with the previous track, allowing the previous track to fade out and facilitating either a gentler culmination into silence or a more graceful segue into another piece of music.
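
Here’s a small Python sketch of that kind of progress-driven transition logic.  Apart from the 90% threshold Green mentions, the names and values are my own assumptions for illustration, not ABZU’s implementation.

```python
# Illustrative sketch: a level-progress parameter calls a transition cue near
# the end of a stage while the previous track fades out underneath it.

class StageMusicController:
    def __init__(self, threshold=0.90, fade_out_sec=4.0):
        self.threshold = threshold          # trigger the transition at 90% progress
        self.fade_out_sec = fade_out_sec
        self.transition_called = False

    def on_progress(self, progress):
        """progress runs 0.0..1.0 as the player traverses the stage."""
        if progress >= self.threshold and not self.transition_called:
            self.transition_called = True
            self.play_transition_piece()               # starts alongside the current track
            self.fade_out_current(self.fade_out_sec)   # previous track fades beneath it

    def play_transition_piece(self):
        print("transition cue started")

    def fade_out_current(self, seconds):
        print(f"current track fading out over {seconds} seconds")

controller = StageMusicController()
for p in (0.5, 0.85, 0.92):   # nothing happens until the 90% range is reached
    controller.on_progress(p)
```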


 

Illustration of a GDC 2017 presentation, from the article by Winifred Phillips (video game composer).

During educator Leonard J. Paul’s GDC presentation, he discussed his work on the audio team of the platforming game Vessel, but he also shared insights from several other projects in his repertoire, along with plenty of general advice that’s worth considering.  “When working in games, of course, you’ve got to think about code,” Paul says. “You’ve got to think about RAM – how much space this is all gonna fit in – streams, if you’re streaming.  You’ve got to think about what your RAM window size is and how fast you can get information in and out – how many streams you can do at once, what kind of compression you’re going to use,” Paul continues, listing areas of technical concern for the game audio expert to think about when working on a project like Vessel.  “Because if you don’t,” he warns, “then you’re going to run into some issues.”

From the article by game composer Winifred Phillips - an illustration of the game Vessel.
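
As a rough illustration of the kind of accounting Paul is describing, here’s a back-of-the-envelope Python sketch.  The asset list, compression ratio, and budget figures are invented examples, not Vessel’s actual numbers.

```python
# Back-of-the-envelope audio budgeting: resident RAM footprint vs. streamed assets.
# All figures here are hypothetical examples.

def wav_size_mb(seconds, sample_rate=44100, channels=2, bytes_per_sample=2):
    """Uncompressed PCM size in megabytes."""
    return seconds * sample_rate * channels * bytes_per_sample / (1024 * 1024)

def fits_in_budget(assets, ram_budget_mb, max_streams, compression_ratio=0.25):
    """assets: list of (name, seconds, streamed). True if the plan fits the budget."""
    resident_mb = sum(wav_size_mb(secs) * compression_ratio
                      for _, secs, streamed in assets if not streamed)
    stream_count = sum(1 for _, _, streamed in assets if streamed)
    return resident_mb <= ram_budget_mb and stream_count <= max_streams

plan = [("ambience_loop", 60, True),    # long loops: stream from disk
        ("music_layer_a", 90, True),
        ("footsteps",      2, False),   # short one-shots: keep resident in RAM
        ("ui_clicks",      1, False)]
print(fits_in_budget(plan, ram_budget_mb=8, max_streams=4))  # -> True
```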

Paul also mentions some of the best-known software tools available to audio pros, including Wwise, FMOD Studio, Fabric, and Pure Data, and offers some general words of advice.  “Technical tips – use spreadsheets, be organized, use tools to refine your process, embrace prototyping,” Paul urges. “Do audio sketches and get this stuff done early.”

 

My perspective on sketches and prototypes

Video game music composer Winifred Phillips, pictured in her music production studio working on the music of the SimAnimals video game.

In his presentation, Leonard J. Paul brought up a good point about the importance of sketches and prototypes in the workflow of a video game composer.  Most game music composers have gone through this kind of iterative process.  For instance, for the SimAnimals game from Electronic Arts, I composed lots of sketches and prototypes before settling on the game’s ultimate musical style.  The audio team had many differing ideas about what the music should sound like, so the initial brainstorming involved quite a bit of trial and error.  It was a game about heartwarming animal relationships, so should the music feel old-fashioned and lyrical?  On the other hand, the game focused on strategy mechanics, so should the music feel more puzzle-like and intellectual?  The game also included darker underpinnings associated with suffering and dissolution, so should the music stick with a light and airy feel, or should it weave dark and sinister textures into the mix?  I tried tons of permutations on all these ideas until I hit upon the right balance of elements, and in the end my music for SimAnimals took me in unexpectedly challenging directions.  That style couldn’t have been achieved without the trial and error that characterized my earliest efforts at composition. As an example, I’m including here two videos showing the difference between the brightly cheerful main theme music of the game and one of several grim and dissonant variations on the same melodic theme:


 

Illustration of a GDC 2017 presentation, from the article by Winifred Phillips (video game composer).

For Final Fantasy XV, audio programmer Sho Iwamoto created an audio engine he dubbed MAGI (Music API for Gaming Interaction).  While the engine accomplished tasks similar to the functionality of applications such as Wwise and FMOD, the MAGI system also included some creative and original specializations.  “There are a lot of time signature variations or changes in JRPG music, so I designed MAGI to support them,” Iwamoto says.  Among its other capabilities, MAGI can adjust to changing tempo and time signature values across the length of a composition, allowing flexibility in the placement of synchronization points within the track.  This capability was born of both a spirit of innovation and pure necessity.  Because Iwamoto had been hired by Square Enix just two years earlier, much of the music of Final Fantasy XV had already been composed and recorded before he joined the team.  “Many of these (compositions) were not planned to be interactive,” Iwamoto admits.  As a result, the musical compositions included variable tempos, dramatic pauses, and frequent shifts in time signature.
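
To visualize how sync points can follow shifting tempos and meters, here’s a small Python sketch that converts a tempo/time-signature map into bar-start times.  It’s a generic illustration under my own assumptions, not the actual MAGI engine.

```python
# Sketch: compute the start time of every bar across a piece whose tempo and
# time signature keep changing, so sync points can be placed on bar lines.

def bar_start_times(sections):
    """sections: list of (bars, tempo_bpm, beats_per_bar) tuples, in order.
    Returns the start time in seconds of each bar across the whole piece."""
    times, t = [], 0.0
    for bars, bpm, beats_per_bar in sections:
        bar_len = beats_per_bar * 60.0 / bpm
        for _ in range(bars):
            times.append(round(t, 3))
            t += bar_len
    return times

# Example: four bars of 4/4 at 120 BPM, then two bars of 7/8 (3.5 beats) at 140 BPM.
print(bar_start_times([(4, 120, 4), (2, 140, 3.5)]))
# -> [0.0, 2.0, 4.0, 6.0, 8.0, 9.5]
```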

From the article by game composer Winifred Phillips - an illustration of the game Final Fantasy XV.

To make these tracks interactive, Iwamoto used the MAGI system to insert custom synchronization points into each composition so that the transitions would be musically pleasing.  “All you have to do is just move or delete (sync points) when you think the transitions are not musical, or add the sync points when you think the transition is musical,” Iwamoto comments.  “Sometimes these sync points can be hard to find, and you may have to wait sixteen bars or more to (find a workable) transition.”  This became a frequent issue with combat music, so Iwamoto devised a strategy he called the pre-end section, so named because it always serves as preparation for the ending of the combat track.  “The pre-end section is designed to have more constant and dense sync points,” Iwamoto explains.  “That makes for really quick transitions.”  Existing as a separate music file, the pre-end section would always be in the same style as the main body of the composition, so the music could switch from the main body to the pre-end section smoothly.  Designed to be more consistent in tempo, key and time signature, the pre-end section could accommodate lots of sync points, allowing the music to transition to a finishing flourish almost instantaneously when combat ends.  Despite its seeming complexity, Iwamoto meant for this system within the MAGI engine to be intuitive by design.  “We used very simple and basic approaches,” Iwamoto says.  “I wanted to make (MAGI) very easy and versatile.”
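
Here’s a simple Python sketch of the pre-end idea: a separate section with densely spaced sync points so the ending flourish can land almost immediately.  The timings and structure are invented for illustration and don’t represent Square Enix’s code.

```python
# Hypothetical illustration: combat music ends by jumping to a "pre-end" section
# whose sync points are much denser than those of the main combat body.

import bisect

def next_sync(sync_points, position):
    """Return the first sync point at or after the current playback position."""
    i = bisect.bisect_left(sync_points, position)
    return sync_points[i] if i < len(sync_points) else None

# Main combat body: musically workable sync points may be many bars apart.
main_body_syncs = [0.0, 32.0, 64.0, 96.0]
# Pre-end section: consistent tempo and key, with a sync point every two seconds.
pre_end_syncs = [i * 2.0 for i in range(16)]

def on_combat_ending(position_in_main):
    # Jump to the pre-end section at the next available main-body sync point...
    jump_at = next_sync(main_body_syncs, position_in_main)
    print(f"switch to pre-end section at {jump_at}s of the main body")

def on_combat_over(position_in_pre_end):
    # ...then the ending flourish can trigger at the next dense sync point.
    end_at = next_sync(pre_end_syncs, position_in_pre_end)
    print(f"play ending flourish at {end_at}s of the pre-end section")

on_combat_ending(40.0)   # -> waits until 64.0s in the main body
on_combat_over(5.3)      # -> the flourish lands at 6.0s, less than a second away
```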


 

Illustration of a GDC 2017 presentation, from the article by Winifred Phillips (video game composer).

For audio director Becky Allen, simple and basic approaches are important for very different reasons.  “We are a mobile studio and we have to be very mindful of our size because of the devices we’re played on,” Allen says, describing her strategy for the MIDI music system of Plants vs. Zombies: Heroes.  “We’re always thinking about size, and ways to be smart.”  Part of that strategy involved limiting the amount of memory required for the MIDI files and their associated library of instrument sounds.  At first, those limits were meant to be sensible but not extreme.  “80 megs,” Allen says. “40 for music, 40 for sound effects and VO.”  But then the audio development process hit a snag when another mobile game came along and made a big splash in the marketplace.  “Along came Clash Royale, and that was a quick game, it loaded quickly,” Allen says.  “We had some pressure to come down to 30 megabytes.”

From the article by game composer Winifred Phillips - an illustration of the game Plants vs. Zombies: Heroes.

In the end, the team held to 60 megabytes, but even this required sacrifices.  “We switched some things from stereo to mono but not everything. We streamed all the WAV music and we streamed larger WAV assets.” After making these and other modifications, the newly streamlined music system was ready to show how interactive it could be.  For instance, by associating the player’s relative health with an RTPC (real-time parameter control) in Wwise, the music system could adjust MIDI data on the fly in accordance with the player’s success or failure.  “If you’re up by five points all the MIDI music went up a whole step, and if you’re up by ten points all the MIDI music went up two whole steps. And it worked, it was awesome,” Allen enthuses.  “Utilizing your tools and an open mind and a flexible mind and a flexible team,” Allen says, “you can make these changes all the way along through the process.”
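
As a quick illustration of that mapping, here’s a tiny Python sketch of score-lead-driven MIDI transposition.  It mirrors the whole-step behavior Allen describes, but the function and note values are my own hypothetical example, not the game’s actual Wwise/MIDI setup.

```python
# Hypothetical sketch: transpose MIDI note numbers based on the player's point lead.

def transpose_semitones(point_lead):
    """Up by five points -> up a whole step (2 semitones); up by ten -> two whole steps."""
    if point_lead >= 10:
        return 4
    if point_lead >= 5:
        return 2
    return 0

def apply_transposition(midi_notes, point_lead):
    offset = transpose_semitones(point_lead)
    return [note + offset for note in midi_notes]

phrase = [60, 64, 67]                   # C major triad as MIDI note numbers
print(apply_transposition(phrase, 6))   # -> [62, 66, 69], up a whole step
print(apply_transposition(phrase, 12))  # -> [64, 68, 71], up two whole steps
```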


 

Illustration of a GDC 2017 presentation, from the article by Winifred Phillips (video game composer).

If flexibility is the ultimate ambition of an interactive music system, then a procedural music system could be considered the definitive realization of that goal.  While procedural music can be viewed as the height of interactivity, Paul Weir had built such systems for games before, and his view of the technology is much more down-to-earth. “Essentially they’re glorified random file players with a little bit of logic stuck on top,” Weir admits.  Together with the development team for the space exploration game No Man’s Sky, Weir set out to see what procedural music could bring to the project.  First, Weir hired the drum-driven electronica band 65daysofstatic and asked them to start creating music for the science fiction game.  “We were kind of really adamant that as a band, I did not want to interfere creatively with their process,” Weir says.  “So we always said, write an album, just write us an album, and we’re not going to start telling you how to do that, because you know how to do that – because you’re the band.  So we let them go off and write a very traditional album, but in the knowledge that we were going to come back to it later and just rip it all apart.”

From the article by game composer Winifred Phillips - an illustration of the game No Man's Sky.

Using Pulse (his self-made procedural music generation tool), Weir set about disassembling the elements of the submitted music so that it could be incorporated into the procedural system.  He quickly learned that it would require more than just a retooling of the original content.  “So it wasn’t a question of saying just give us the stems and we’ll do it,” Weir says. “It was like, no no no, go right back and do us more performances, take out bits, give us more drum loops, perform new guitar riffs, create new stuff – almost like kind of remixing the original tracks.”  With these musical fragments, the procedural system could then perform its calculations and assemble the library of elements into new combinations in accordance with the player’s locale and activity. For instance, the music might be triggered by such in-game variables as proximity to buildings and time spent walking, drawing on what amounted to a large library of musical ‘soundscape sets’ that alternated randomly depending on how long any previous soundscape set had been playing.  “In the game you don’t get the album,” Weir insists.  “The game soundtrack is bits of what appear on the album. It’s lots of bits that aren’t on the album, but it feels relatively cohesive.”
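
Here’s a minimal Python sketch of that kind of soundscape-set selection, assuming a hypothetical library and game variables.  It illustrates the general logic rather than Paul Weir’s actual Pulse tool.

```python
# Hypothetical sketch: pick the next "soundscape set" from a library based on
# in-game state and how long the current set has already been playing.

import random

SOUNDSCAPE_SETS = {
    "settlement": ["settlement_a", "settlement_b"],
    "wilderness": ["wilds_a", "wilds_b", "wilds_c"],
}

def choose_soundscape(near_building, current_set, seconds_playing, min_play_time=120.0):
    """Other variables (time spent walking, and so on) could feed in the same way."""
    if seconds_playing < min_play_time:
        return current_set                 # let the current set keep playing
    category = "settlement" if near_building else "wilderness"
    candidates = [s for s in SOUNDSCAPE_SETS[category] if s != current_set]
    return random.choice(candidates)

print(choose_soundscape(near_building=False, current_set="wilds_a",
                        seconds_playing=180.0))   # -> "wilds_b" or "wilds_c"
```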


 

Conclusion

So, there we have it!  We’ve compared the viewpoints of five game audio pros discussing the interactive music systems of their projects during their presentations at GDC 2017.  Interactive music is a fascinating subject, ranging from the simplest of solutions to the most complex and intricate of designs.  With ingenuity and patience, these audio developers have introduced their own creativity into the implementation of music in games.  Their strategies and solutions can help us to broaden our minds when we’re thinking about ways to make our own music interactive!  I hope you’ve enjoyed this three-article series, and please feel free to leave your comments below!

 

Photo of video game composer Winifred Phillips in her music production studio.

Winifred Phillips is an award-winning video game music composer whose most recent projects are the triple-A first person shooter Homefront: The Revolution and the Dragon Front VR game for Oculus Rift. Her credits include games in five of the most famous and popular franchises in gaming: Assassin’s Creed, LittleBigPlanet, Total War, God of War, and The Sims. She is the author of the award-winning bestseller A COMPOSER’S GUIDE TO GAME MUSIC, published by the MIT Press. As a VR game music expert, she writes frequently on the future of music in virtual reality games. Follow her on Twitter @winphillips.

Video game music systems at GDC 2017: pros and cons for composers

Video game composer Winifred Phillips, pictured in her music production studio working on the music of LittleBigPlanet 2 Cross Controller

By Winifred Phillips | Contact | Follow

Welcome back to our three-article series dedicated to collecting and exploring the ideas discussed in five different GDC 2017 audio talks about interactive music!  These five speakers shared ideas they’d developed in the process of creating interactivity in the music of their own game projects.  We’re looking at these ideas side-by-side to cultivate a sense of the “bigger picture” when it comes to leading-edge thinking about music interactivity in games. In the first article, we looked at the basic nature of the five interactive music systems discussed in those GDC 2017 presentations.

If you haven’t read part one of this article series, please go do that now and come back.

Okay, so let’s now contemplate some simple but important questions: why were those systems used?  What was attractive about each interactive music strategy, and what were the challenges inherent in using those systems?

Continue reading

Video game music systems at GDC 2017: what are composers using?

By video game music composer Winifred Phillips | Contact | Follow

Video game composer Winifred Phillips, presenting at the Game Developers Conference 2017.

The 2017 Game Developers Conference could be described as a densely-packed deep-dive exploration of the state-of-the-art tools and methodologies used in modern game development.  This description held especially true for the game audio track, wherein top experts in the field offered a plethora of viewpoints and advice on the awesome technical and artistic challenges of creating great sound for games. I’ve given GDC talks for the past three years now (see photo), and every year I’m amazed at the breadth and diversity of the problem-solving approaches discussed by my fellow GDC presenters.  Often I’ll emerge from the conference with the impression that we game audio folks are all “doing it our own way,” using widely divergent strategies and tools.

This year, I thought I’d write three articles to collect and explore the ideas that were discussed in five different GDC audio talks.  During their presentations, these five speakers all shared their thoughts on best practices and methods for instilling interactivity in modern game music.  By absorbing these ideas side-by-side, I thought we might gain a sense of the “bigger picture” when it comes to the current leading-edge thinking for music interactivity in games. In the first article, we’ll look at the basic nature of these interactive systems.  We’ll devote the second article to the pros and cons of each system, and in the third article we’ll look at tools and tips shared by these music interactivity experts. Along the way, I’ll also be sharing my thoughts on the subject, and we’ll take a look at musical examples from some of my own projects that demonstrate a few of the ideas explored in these GDC talks.

So, let’s begin with the most obvious question.  What kind of interactive music systems are game audio folks using lately?

Continue reading

GDC 2017: How video game composers can use music to build suspense

Winifred Phillips, video game composer, giving a talk as part of the Game Developers Conference 2016 in San Francisco.

By Winifred Phillips | Contact | Follow

The Game Developers Conference is coming up soon!  Last year I presented a talk on music for mobile games (pictured above), and I’m pleased that this year I’ll be presenting the talk “‘Homefront’ to ‘God of War’: Using Music to Build Suspense” (Wednesday, March 1st at 11am in room 3006 West Hall, Moscone Center, San Francisco).  In my talk I’ll be focusing on practical applications of techniques for video game composers and game audio folks, using my own experiences as concrete examples for exploration.  Along the way, I’ll be discussing some very compelling scholarly research on the relationship between suspense, gameplay and musical expression.  In preparing my GDC 2017 presentation I did a lot of reading and studying about the nature of suspense in video games, the importance of suspense in gameplay design, and the role that video game music plays in regulating and elevating suspense.  There will be lots of ground to cover in my presentation!  That being said, the targeted focus of my presentation precluded me from incorporating some very interesting extra research into the importance of suspense in a more general sense… why human beings need suspense, and what purpose it serves in our lives.  I also couldn’t find the space to include everything I’d encountered regarding suspense as an element in the gaming experience.  It occurred to me that some of this could be very useful to us in our work as game makers, so I’d like to share some of these extra ideas in this article.

Continue reading

Montreal International Game Summit 2014

Just came back from a fantastic experience speaking at the Montreal International Game Summit 2014!

Montreal is a beautiful city, and that’s reflected in the fantastic rainbow-tinted windows of the convention center where the summit was held – the Palais des congrès de Montréal.


The weather was relatively warm while I was there, but I spent most of my time at the summit… although I did enjoy the city views from the enormous walls of windows.



This year’s summit was more vibrant than ever, and the fun began in the wide hallways where attendees could test their video game trivia knowledge by taking part in “The Game Masters” quiz show.  I wasn’t brave enough to compete, but I had to get a picture of the set.

The show floor was very exciting this year, with a lot of the activity centering around the two Oculus Rift stations.  My attention, though, was caught by two things.  First — the AudioKinetic booth, where the Wwise middleware was on display.


And second, this big green guy who was hulking inside the Ubisoft booth.  He looks brutish, but don’t let that fool you — he’s a real charmer.


Here’s the big schedule of sessions that was posted at the event.  My speech was towards the end of the second day of the summit, right before the MIGS Brain Dump (which is kind of similar to a GDC rant).


My talk was titled, “Music, the Brain, and the Three Levels of Immersion.”  It was a great audience!


I had a wonderful time sharing some ideas about the role that music can play in helping gamers achieve immersion. I’d first explored these ideas in my book, A Composer’s Guide to Game Music, and it was such a joy to revisit them with such an enthusiastic audience!


I’ll be posting a video excerpt from my talk soon.  It was wonderful to speak at MIGS 2014, and thanks to all the creative and inspiring people I met this year in Montreal – it was a tremendous pleasure!

LittleBigPlanet 3 – Hollywood Music in Media Awards


Hey, everyone!  After my blog post yesterday about winning the Hollywood Music in Media Award, I’ve received a bunch of questions about LittleBigPlanet 3 and the Hollywood Music in Media Awards program – so I thought I’d post some info that explains everything in a bit more detail.  It’s a little easier to do this in third person, so here goes – I hope this helps!

On November 4th, game composer Winifred Phillips received a 2014 Hollywood Music in Media Award (HMMA) in the category of “Best Song in a Video Game” for music she composed for the LittleBigPlanet 3 video game (developed by Sumo Digital Ltd. and published by Sony Computer Entertainment, LLC).

As one of the composers on the LittleBigPlanet™ 3 music composer team, Phillips was recognized for her song, “LittleBigPlanet 3 Ziggurat Theme.”

Info about LittleBigPlanet 3:

Sony Computer Entertainment Europe announced the news about this award on November 6th via the official LittleBigPlanet Twitter feed.

The critically acclaimed and best-selling PlayStation® franchise LittleBigPlanet™ makes its debut on PlayStation®4 with LittleBigPlanet™ 3. Sackboy™ is back, this time with playable new friends – Toggle, OddSock and Swoop – each with their own unique abilities and personalities.  This handcrafted adventure is set to revolutionize the way gamers Play, Create and Share in the world of LittleBigPlanet.

Sumo Digital Ltd, the developer of LittleBigPlanet 3, has forged a reputation as a world-class, multiple award-winning independent game development studio. The company has grown exponentially over 11 years, from 15 to 270 people spread across its head office in Sheffield, UK and a dedicated art studio in Pune, India.  Sumo Digital is one of the UK’s leading game development studios.

Info about the Hollywood Music in Media Awards:

The Hollywood Music in Media Awards ceremony was held on November 4th 2014 at 7pm at the Fonda Theater (6126 Hollywood Boulevard, Hollywood).  The Hollywood Music in Media Awards recognize and honor the creation of music for film, TV, and video games, the talented individuals responsible for licensing it, and musicians both mainstream and independent from around the globe. The HMMAs are co-branded with the Billboard/Hollywood Reporter Film & TV Music Conference. The HMMA advisory board, selections committee and voters include members of the National Academy of Recording Arts and Sciences, the Oscar and Emmy organizations, the Society of Composers and Lyricists, and the Guild of Music Supervisors.

Additional info about Winifred Phillips (the LittleBigPlanet franchise and the HMMAs):

Phillips’ award-winning track, “LittleBigPlanet 3 Ziggurat Theme,” from LittleBigPlanet™3, is a highly interactive musical work, written as a complex classical fugue, and incorporating an organic, world-music influenced instrumental arrangement in support of a women’s choir.  Phillips has received two previous Hollywood Music in Media Awards – in 2012 for Assassin’s Creed Liberation (Ubisoft®) and in 2010 for the Legend of the Guardians (Warner Bros. Interactive Entertainment).  Phillips is one of the composers on the LittleBigPlanet music composer team, and has created tracks for six games in the series, including LittleBigPlanet 2, LittleBigPlanet 2 Toy Story, LittleBigPlanet Cross Controller, LittleBigPlanet PS Vita, LittleBigPlanet Karting, and now LittleBigPlanet 3.  

Phillips’ work as a composer for the LittleBigPlanet game series has earned her previous awards nominations from the Game Audio Network Guild Awards, the Hollywood Music in Media Awards, the NAViGaTR Awards and the D.I.C.E. Interactive Achievement Awards.  Phillips works with award-winning music producer Winnie Waldron for all her projects, including those in the LittleBigPlanet franchise.  Phillips is also the author of the book A COMPOSER’S GUIDE TO GAME MUSIC, published in 2014 by the Massachusetts Institute of Technology Press.  

Hey, Big Spender! (Games Versus Movies)


Since GameSoundCon is starting up tomorrow, I thought I’d direct your attention to an article written by GameSoundCon founder Brian Schmidt about the difference between the money raked in by the video game industry and by the motion picture industry.  While it has been reported that games bring in more money than films, according to Schmidt’s article the figures for the game industry are distorted by the inclusion of hardware sales.  In fact, because film tickets are generally much cheaper than games, a blockbuster film must sell tickets to many more people in order to take in the same amount of money that a console game could earn through far fewer sales.

Reading this article on the GameSoundCon site, I found myself thinking about the idea of premium purchases.  What kind of psychological conditions need to exist in order for a customer to become a big spender — i.e. to opt to spend more money?  With a console video game, we are clearly looking at a premium purchase — these games can cost 50 dollars or more.  Does the willingness to spend reflect the depth and diversity of the experience?  Games typically outlast films in terms of their long-term entertainment value. Is this the reason why top-tier console games are able to sustain their premium pricing?

The motion picture industry has made attempts to introduce premium pricing into its business model.  From luxurious theaters with reclining seats, to motion simulators with weather effects and smell-o-vision, to 3D formats, motion picture companies have repeatedly urged movie-goers to part with larger sums in exchange for enhanced experiences, but success has been limited or short-lived.  Console video games, however, have successfully charged premium prices for many years.

What I find interesting, though, is what happens when these two entertainment juggernauts start reducing their prices.  While movie theaters dug in their heels for many years and refused to offer discounts, the National Association of Theatre Owners currently has an initiative underway to offer discount tickets in selected locations on off-nights.  While experimental and limited in scope, the trial period should be revealing in terms of whether discounts will lure movie-goers back to the theaters more frequently.  In the world of video games, however, the discount experiment is fully underway in the form of the iTunes App Store, the Xbox Live Indie Store, the PlayStation Network Minis Store, Google Play, the Facebook App Center, and many other online retailers that offer games at drastically reduced prices.  If the movie industry hopes that discounted tickets will lure more people into theaters, then I wonder — have discounted games captured more casual gamers and turned them into frequent players and purchasers?


In 2010, Reuters reported that free games had successfully lured players into gaming and converted them into paying customers.  By 2014, however, that optimism had waned, as an industry analyst at the NPD Group warned that PC gamers, accustomed to receiving discounts, were now expecting all games to be very inexpensive.  Currently, Xbox Live Gold members enjoy steep discounts through the “Deals With Gold” program, and PlayStation Plus members get their games at up to 75% off.

However, Gartner’s forecast for worldwide gaming revenues over the coming two years estimates that mobile, console and PC games will all see dramatic increases in their earnings. This seems to be good news for gaming — discounts on some game products may not have taken the luster away from the big-ticket games.  Our industry currently enjoys the benefit of a wider array of offerings that can be priced accordingly, whereas the motion picture industry continues to be saddled with a fairly uniform pricing structure that has been difficult to challenge and adjust.