Arrangement for Vertical Layers Pt. 2: A Game Composer’s Guide

Welcome back to my three-part blog series on the art of arrangement for dynamic music systems in games! In this series of articles, I’m discussing the techniques of arrangement as they pertain to interactive game music by exploring examples from the music I composed for games in the LittleBigPlanet franchise.  In part one of this series, we went over the role of the arranger, the importance of an interesting and creative arrangement, and the relationship between arranging for traditional linear music and non-linear interactive music. We also reviewed arranging techniques that apply to melody, and how these should (or should not) be applied in an interactive composition.  If you haven’t read part one, please click here to read that entry first, and then return here to continue reading part two. Okay, are you back now? Ready? Here we go!


Game Music Middleware, Part 4: Elias


Welcome back to my blog series offering tutorial resources that explore game music middleware for the game music composer. I initially planned to write two blog entries on the most popular audio middleware solutions (Wwise and FMOD), but since I started this series, I’ve been hearing buzz about other middleware solutions, so I thought it best to expand the series to incorporate other interesting approaches to music implementation in games.  This entry will focus on a brand new middleware application called Elias, developed by Elias Software.  While not as famous as Wwise or FMOD, this new application offers some intriguing new possibilities for the creation of interactive music in games.

If you’d like to read the first three blog entries in this series, you can find them here:

Game Music Middleware, Part 1: Wwise

Game Music Middleware, Part 2: FMOD

Game Music Middleware, Part 3: Fabric


Elias stands for Elastic Lightweight Integrated Audio System.  It is developed by Kristofer Eng and Philip Bennefall for Microsoft Windows, with a Unity plugin for consoles, mobile devices and browser-based games.  What makes Elias interesting is the philosophy of its design.  Instead of designing a general audio middleware tool with some music capabilities, Eng and Bennefall decided to bypass the sound design arena completely and create a middleware tool specifically outfitted for the game music composer. The middleware comes with an authoring tool called Elias Composer’s Studio that “helps the composer to structure and manage the various themes in the game and bridges the gap between the composer and level designer to ease the music integration process.”

Here’s the introductory video for Elias, produced by Elias Software:

The interactive music system of the Elias middleware application seems to favor a Vertical Layering (or vertical re-orchestration) approach, with a potentially huge number of music layers that can play in many combinations.  The system includes flexible options for layer triggering, including the ability to randomize the activation of layers to keep the listening experience unpredictable during gameplay.
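Elias Software hasn’t published its runtime internals here, so purely as an illustration of the concept, here’s a minimal sketch in C++ (with hypothetical layer names and probabilities of my own, not the actual Elias API) of how randomized layer activation can work: each layer re-rolls its on/off state at every bar boundary, so the combination of active layers keeps shifting as the music plays.

```cpp
#include <cstdio>
#include <random>
#include <string>
#include <vector>

// Hypothetical sketch of randomized vertical-layer activation (not the
// actual Elias API). Each layer re-rolls its on/off state at every bar
// boundary, so the active combination keeps changing during gameplay.
struct Layer {
    std::string name;
    double activationChance;  // probability the layer plays in a given bar
    bool active = false;
};

int main() {
    std::mt19937 rng{std::random_device{}()};
    std::uniform_real_distribution<double> roll(0.0, 1.0);

    std::vector<Layer> layers = {
        {"Drums", 0.9}, {"Bass", 0.8}, {"Strings", 0.5}, {"Choir", 0.3},
    };

    for (int bar = 1; bar <= 4; ++bar) {  // four bars of simulated gameplay
        std::printf("Bar %d:", bar);
        for (Layer& layer : layers) {
            layer.active = roll(rng) < layer.activationChance;
            if (layer.active) std::printf(" %s", layer.name.c_str());
        }
        std::printf("\n");
    }
}
```

In a real middleware tool the re-roll would be driven by the musical timeline and smoothed with fades, but the principle is the same.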

Elias has produced a series of four tutorial videos for the Composer’s Studio authoring tool.  Here’s the first of the four tutorials:

There’s also a two-part series of tutorials about Elias produced by Dale Crowley, the founder of the game audio services company Gryphondale Studios.  Here’s the first of the two videos:

As a middleware application designed specifically to address the top needs of game music composers, Elias is certainly intriguing!  The software has so far been used in only one published game – Gauntlet, which is the latest entry in the awesome video game franchise first developed by Atari Games for arcade cabinets in 1985.  This newest entry in the franchise was developed by Arrowhead Game Studios for Windows PCs.  We can hear the Elias middleware solution in action in this gameplay video from Gauntlet:

The music of Gauntlet was composed by Erasmus Talbot.  More of his music from Gauntlet is available on his SoundCloud page.

Elias Software recently demonstrated its Elias middleware application on the expo floor of the Nordic Game 2015 conference in Malmö, Sweden (May 20-22, 2015).  Here’s a look at Elias’ booth from the expo:

(Photo: the Elias Software booth at Nordic Game 2015)

Since Elias is a brand new application, I’ll be curious to see how widely it is accepted by the game audio community.  A middleware solution that focuses solely on music is definitely a unique approach!  If audio directors and audio programmers embrace Elias, then it may have the potential to give composers better tools and an easier workflow in the creation of interactive music for games.

MIDI in Wwise for the Game Music Composer: Peggle Blast


In a previous blog post, we took a look at a few tutorial resources for the latest version of the Wwise audio middleware.  One of the newest innovations in the Wwise software package is a fairly robust MIDI system, which lets music creators and implementers tap into the extensive adaptive possibilities of the MIDI format from within the Wwise application.  Last month, during the Game Developers Conference at the Moscone Center in San Francisco, some members of the PopCap audio development team presented a thorough, step-by-step explanation of the benefits of this MIDI capability for one of their latest projects, Peggle Blast.  Since my talk during the Audio Bootcamp at GDC focused on interactive music and MIDI (with an eye on the role of MIDI in both the history and future of game audio development), I thought that we could all benefit from a summation of some of the ideas discussed during the Peggle Blast talk, particularly as they relate to dynamic MIDI music in Wwise.  In this blog, I’ve tried to convey some of the most important takeaways from this GDC presentation.


“Peggle Blast: Big Concepts, Small Project” was presented on Thursday, March 5th by three members of the PopCap audio team: technical sound designer RJ Mattingly, audio lead Jaclyn Shumate, and senior audio director Guy Whitmore.  The presentation began with a quote from Igor Stravinsky:

The more constraints one imposes, the more one frees oneself, and the arbitrariness of the constraint serves only to maintain the precision of the execution.

This idea became a running theme throughout the presentation, as the three audio pros detailed the constraints under which they worked, including:

  1. A 5 MB memory limit for all audio assets
  2. Limited CPU
  3. A 2.5 MB memory allocation for the music elements

These constraints were a result of the mobile platforms (iOS and Android) for which Peggle Blast had been built.  For this reason, the music team focused their attention on sounds that could convey lots of emotion while also maintaining a very small file size.  Early experiments with tracks built around a music box instrument led the team to realize that they still needed to replicate the musical experience of the full-fledged console versions of the game.  A simple music-box score was too unsatisfying, particularly for players who were familiar with the music from the previous installments in the franchise.  With that in mind, the team concentrated on very short orchestral samples taken from the previous orchestral session recordings for Peggle 2.  Let’s take a look at a video from those orchestral sessions:

Using those orchestral session recordings, the audio team created custom sample banks that were tailored specifically to the needs of Peggle Blast, focusing on lots of very short instrument articulations and performance techniques including:

  1. pizzicato
  2. marcato
  3. staccato
  4. mallets

A few instruments (including a synth pad and some orchestral strings) were edited to loop so that extended note performances became possible, but the large majority of instruments remained brief, punctuated sounds that did not loop.  These short sounds were arranged into sample banks in which one or two note samples would be used per octave of instrument range, and note tracking would transpose the sample to fill in the rest of the octave.  The sample banks consisted of a single layer of sound, which meant that the instruments did not adjust their character depending on dynamics/velocity.  In order to make the samples more musically pleasing, the built-in digital signal processing capability of Wwise was employed by way of a real-time reverb bus that allowed these short sounds to have more extended and natural-sounding decay times.
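To illustrate the note-tracking idea with a quick sketch of my own (not PopCap’s actual code): repitching a sample by n semitones multiplies its playback rate by 2^(n/12), which is how one or two recorded notes can cover a whole octave.

```cpp
#include <cmath>
#include <cstdio>

// Note tracking: one sample recorded at a known pitch is repitched to cover
// neighboring notes. Shifting by n semitones multiplies the playback rate
// by 2^(n/12). (My own illustration, not PopCap's actual code.)
double playbackRate(int sampledMidiNote, int targetMidiNote) {
    return std::pow(2.0, (targetMidiNote - sampledMidiNote) / 12.0);
}

int main() {
    // A pizzicato sample recorded at C4 (MIDI note 60), stretched to cover
    // the octave above it:
    for (int note = 60; note < 72; ++note)
        std::printf("MIDI %d -> playback rate %.4f\n",
                    note, playbackRate(60, note));
}
```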


The audio team worked with a beta version of Wwise 2014 during development of Peggle Blast, which allowed them to implement their MIDI score into the Unity game engine.  Composer Guy Whitmore wrote the music as whimsically pleasant, non-melodic patterns structured into a series of chunks.  These chunks could be triggered according to the adaptive system in Peggle Blast, wherein the music went through key changes (invariably following the circle of fifths) in reaction to the player’s progress.  To better see how this works, let’s watch an example of some gameplay from Peggle Blast:

As you can see, very little in the way of a foreground melody existed in this game.  In place of a melody, foreground musical tones would be emitted when the Peggle ball hit pegs during its descent from the top of the screen.  These tones followed a predetermined scale, and the system chose which type of scale to trigger (major, natural minor, harmonic minor, or Mixolydian) depending on the key in which the music was currently playing.  Information about the key was dropped into the music using markers that indicated where key changes took place, so that the Peggle ball would always trigger the correct type of scale at any given time.  The MIDI system did not have to store unique MIDI data for scales in every key, but would instead calculate the key transpositions for each of the scale types, based on the current key of the music that was playing.
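As a rough illustration of that design (my own reconstruction in C++, not PopCap’s actual implementation), each scale type only needs to be stored once as a set of intervals; the current key root, updated at each marker, transposes the whole table:

```cpp
#include <cstdio>
#include <map>
#include <string>
#include <vector>

// Sketch of key-aware scale selection for the peg-hit tones (my own
// reconstruction, not PopCap's actual code): one interval pattern per scale
// type, transposed by the current key, so no per-key MIDI data is required.
const std::map<std::string, std::vector<int>> kScales = {
    {"major",          {0, 2, 4, 5, 7, 9, 11}},
    {"natural minor",  {0, 2, 3, 5, 7, 8, 10}},
    {"harmonic minor", {0, 2, 3, 5, 7, 8, 11}},
    {"mixolydian",     {0, 2, 4, 5, 7, 9, 10}},
};

// MIDI note for the nth consecutive peg hit, rising through the scale.
int pegHitNote(int keyRootMidi, const std::string& scale, int hitIndex) {
    const std::vector<int>& steps = kScales.at(scale);
    const int size = static_cast<int>(steps.size());
    return keyRootMidi + 12 * (hitIndex / size) + steps[hitIndex % size];
}

int main() {
    int keyRoot = 60;  // start in C
    for (int hit = 0; hit < 8; ++hit)
        std::printf("%d ", pegHitNote(keyRoot, "major", hit));
    std::printf("\n");

    keyRoot = 60 + (keyRoot + 7) % 12;  // key change up a fifth (C to G)
    for (int hit = 0; hit < 8; ++hit)
        std::printf("%d ", pegHitNote(keyRoot, "mixolydian", hit));
    std::printf("\n");
}
```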

The presentation ended with an emphasis on the memory savings and flexibility afforded by MIDI, and the advantages that MIDI presents to game composers and audio teams.  It was a very interesting presentation!  If you have access to the GDC Vault, you can watch a video of the entire presentation online.  Otherwise, there are plenty of other resources on the music of Peggle Blast, and I’ve included a few below:

Inside the Music of Peggle Blast – An Interview with Audio Director Guy Whitmore

Peggle Blast!  Peg Hits and the Music System, by RJ Mattingly

Real-Time Synthesis for Sound Creation in Peggle Blast, by Jaclyn Shumate

PopCap’s Guy Whitmore Talks Musical Trials And Triumphs On Peggle Blast


Interactive Game Music of LittleBigPlanet 3 (Concepts from my GDC Talk)

LittleBigPlanet 3 and Beyond: Taking Your Score to Vertical Extremes

I was honored to be selected by the Game Developers Conference Advisory Board to present two talks during this year’s GDC in San Francisco earlier this month.  On Friday, March 6th, I presented a talk on the music system of the LittleBigPlanet franchise.  Entitled “LittleBigPlanet 3 and Beyond: Taking Your Score to Vertical Extremes,” the talk explored the Vertical Layering music system that has been employed in all of the LittleBigPlanet games (the soundtrack for LittleBigPlanet 3 is available here).  I’ve been on the LittleBigPlanet music composition team for six of the franchise’s games so far, and my talk used many examples from musical compositions I created for all six of those projects.

After my talk, several audience members let me know that the section of my presentation covering the music system for the Pod menu of LittleBigPlanet 3 was particularly interesting – so I thought I’d share the concepts and examples from that part of my presentation in this blog.


That’s me, giving my GDC speech on the interactive music system of the LittleBigPlanet franchise.  Here I’m just starting the section about the Pod menu music.

The audio team at Media Molecule conceived the dynamic music system for the LittleBigPlanet franchise.  According to the franchise’s music design brief, all interactive tracks in LittleBigPlanet games must be arranged in a vertical layering system.  I discussed this type of interactive music in a blog I published last year, but I’ll recap the system briefly here as well.  In a vertical layering music system, the music is not captured in a single audio recording.  Instead, several audio recordings play in sync with one another.  Each layer features unique musical content, and each represents a certain percentage of the entire musical composition.  Played all together, the layers embody the full mix of the complete composition.  Played separately, we hear submixes that are still satisfying and entertaining in their own right.  The music system can play all the layers either together or separately, or can combine the layers into different sets that represent a portion of the whole mix.

When implemented in gameplay, layers are often activated when the player moves into a new area.  This helps the music to feel responsive to the player’s actions.  The music seems to acknowledge the player’s progress throughout the game.  It’s important to think about the way in which individual layers may be activated, and the functions that the layers may be called upon to serve during the course of the game.
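Here’s a bare-bones sketch of that idea in C++ (my own illustration with hypothetical layer names, not Media Molecule’s actual system): all the stems advance in lockstep, and entering a new area simply raises or lowers the gain of individual stems, so any subset still forms a coherent submix.

```cpp
#include <algorithm>
#include <cstdio>
#include <string>
#include <vector>

// Vertical layering sketch: every stem starts at time zero and stays
// sample-locked with the others; "activating" a layer only changes its gain.
// (My own illustration; the layer names are hypothetical.)
struct Stem {
    std::string name;
    float gain = 0.0f;  // 0 = silent, 1 = full level; playback never stops
};

void enterArea(std::vector<Stem>& stems,
               const std::vector<std::string>& activeLayers) {
    for (Stem& stem : stems) {
        const bool active = std::find(activeLayers.begin(), activeLayers.end(),
                                      stem.name) != activeLayers.end();
        stem.gain = active ? 1.0f : 0.0f;  // in practice, a smooth fade
    }
}

int main() {
    std::vector<Stem> stems = {{"Rhythm"}, {"Harmony"}, {"Melody"}, {"Texture"}};
    enterArea(stems, {"Rhythm", "Harmony"});            // early area: sparse mix
    enterArea(stems, {"Rhythm", "Harmony", "Melody"});  // later area: fuller mix
    for (const Stem& stem : stems)
        std::printf("%s gain: %.0f\n", stem.name.c_str(), stem.gain);
}
```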


In LittleBigPlanet 3, the initial menu system for the game is called “The Pod.”  The music for the Pod is arranged in vertical layers that are activated and deactivated according to where the player is in the menu hierarchy.  All the layers can be played simultaneously, and they play in multiple combinations; however, each individual layer is also associated with a specific portion of the menu system, and is activated when the player enters that particular part of the menu.

Let’s take a quick tour through the layers of the Pod menu music.  You’ll find SoundCloud players with short excerpts of each layer embedded below – just click the Play buttons to listen.  The first layer of the Pod menu music is associated with the Main Menu, and it features some floaty, science-fiction-inspired textures and effects:

The next layer is associated with a menu labeled “My Levels,” and the music for that layer is very different.  Now, woodwinds are accompanied by a gentle harp, combining to create a homey and down-to-earth mood:

Moving on to the music layer for the “Play” menu, we find that the instrumentation now features an ethereal choir and shimmering bells, expressing a much more celestial atmosphere:

Now let’s listen to the “Adventure” menu layer, in which plucked strings and bells combine to deliver a prominent melody line:

Finally, in the music layer associated with the “Community” and “Popit” menus, we hear a quirky mix of synths and effects that hearken back to menu music from previous games in the LittleBigPlanet franchise:

As the player navigates the Pod menu system, these various music layers are activated to correspond with the player’s location within the menu hierarchy.  This sort of dynamic music triggering lies at the very heart of the Vertical Layering interactive music mechanism.


Every layer in a Vertical Layering composition can have a very distinct musical identity.  When that layer is turned off, the entire mix changes in a noticeable way.  The mix can be changed subtly…


… or it can be altered radically, with large scale activations or deactivations of layers.  Even with these kinds of dramatic changes, the musical composition retains its identity.  The same piece of music continues to play, and the player is conscious of continuing to hear the same musical composition, even though it has just altered in reaction to the circumstances of gameplay and the player’s progress.

In the Pod menu music system, the layers would change in reaction to the player’s menu navigation, which could be either slow and leisurely or brisk and purposeful.  Layer activations and deactivations would occur with smooth crossfade transitions as the player moved from one menu to another.
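For the technically curious, here’s a small sketch of the general crossfading technique (my own illustration in C++, not the actual LittleBigPlanet 3 code): an equal-power curve keeps the overall loudness steady while one layer fades out and another fades in.

```cpp
#include <cmath>
#include <cstdio>

const double kPi = 3.14159265358979323846;

// Equal-power crossfade gains (my own illustration of the general technique).
// t runs from 0 (outgoing layer only) to 1 (incoming layer only); the
// cos/sin pair keeps the summed power constant across the fade.
void crossfadeGains(double t, double& outgoing, double& incoming) {
    outgoing = std::cos(t * kPi / 2.0);
    incoming = std::sin(t * kPi / 2.0);
}

int main() {
    for (int step = 0; step <= 4; ++step) {
        const double t = step / 4.0;
        double oldGain = 0.0, newGain = 0.0;
        crossfadeGains(t, oldGain, newGain);
        std::printf("t=%.2f  outgoing=%.3f  incoming=%.3f\n",
                    t, oldGain, newGain);
    }
}
```

Now let’s take a look at a video showing some navigation through the Pod menu system, so we can hear how these musical layers behaved during actual gameplay: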

As you can see, triggering unique musical layers for different portions of the menu system helps to define them.  I hope you found this explanation of the Pod music interesting!  If you attended GDC but missed my talk on the interactive music of LittleBigPlanet, you’ll be able to find the entire presentation posted as a video in the GDC Vault in just a few weeks.  In the meantime, please feel free to add any comments or questions below!

Game Music Middleware, Part 1: Wwise



The use of third-party audio middleware in game development is a slow-growth trend that will doubtless become more influential in the future, so I thought I’d devote my next two blog entries to some recent video tutorials produced by a few intrepid game audio pros who have stepped forward to help the community.

This first blog is devoted to Wwise, and the tutorials come to us courtesy of Michael Kamper, Senior Audio Developer at Telltale Games.  With over 16 years of experience in audio production, Michael has served as Audio Director for The Bureau: XCOM Declassified, the BioShock 2 DLC, and BioShock 2, among others.  Michael has also enjoyed a successful career as a feature film sound designer for such movies as Mission: Impossible III, The Day After Tomorrow, Legally Blonde, and many more.  His experience in television includes sound design for Xena: Warrior Princess, Hercules: The Legendary Journeys and Profiler.

In the following two-part video tutorial, Michael generously details his Wwise workflow during music implementation for The Bureau: XCOM Declassified:

Wwise Interactive Music Demo – The Bureau – Part 1 – Switches


Wwise Interactive Music Demo – The Bureau – Part 2 – Segments/RTPCs
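If you’re curious what this looks like from the game-code side, here’s a minimal sketch using string-based calls from the Wwise SDK’s C++ API.  The switch group, switch states, and RTPC name below are hypothetical placeholders of my own, not the actual names used in The Bureau:

```cpp
// Hypothetical game-side Wwise calls; the names "Music_State", "Combat",
// "Explore", and "Threat_Level" are placeholders, not names from The Bureau.
#include <AK/SoundEngine/Common/AkSoundEngine.h>

void onCombatStateChanged(bool inCombat, AkGameObjectID musicObject) {
    // A Switch selects which interactive music Segment plays next.
    AK::SoundEngine::SetSwitch("Music_State",
                               inCombat ? "Combat" : "Explore",
                               musicObject);
}

void onThreatLevelChanged(float threat01, AkGameObjectID musicObject) {
    // An RTPC continuously drives a music parameter, such as layer volume.
    AK::SoundEngine::SetRTPCValue("Threat_Level", threat01 * 100.0f,
                                  musicObject);
}
```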


A Composer’s Guide to Game Music – Horizontal Resequencing, Part 1


Here’s another installment of a four-part series of videos I produced as a supplement to my book, A Composer’s Guide to Game Music. This video focuses on the Horizontal Resequencing model employed in the Speed Racer video game, providing some visual illustration for this interactive music composition technique. The video demonstrates concepts that are explored in depth in my book, beginning on page 188.
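For readers who haven’t encountered the term, horizontal resequencing cuts the score into self-contained segments and lets gameplay decide which segment plays when the current one ends.  Here’s a minimal sketch of that model in C++ (my own illustration with hypothetical segment names, not the actual Speed Racer implementation):

```cpp
#include <cstdio>
#include <deque>
#include <string>

// Horizontal resequencing sketch (my own illustration): the music is cut
// into self-contained segments, and gameplay queues which segment should
// play next when the current one reaches its end.
class Resequencer {
public:
    void queue(const std::string& segment) { pending_.push_back(segment); }

    // Called by the audio engine when the current segment finishes.
    void onSegmentEnd() {
        if (pending_.empty()) {
            std::printf("Looping current segment\n");
            return;
        }
        std::printf("Now playing: %s\n", pending_.front().c_str());
        pending_.pop_front();
    }

private:
    std::deque<std::string> pending_;
};

int main() {
    Resequencer music;
    music.queue("verse_A");         // player cruising
    music.queue("transition_1");    // a rival car approaches
    music.queue("chorus_intense");  // battle begins
    for (int i = 0; i < 4; ++i) music.onSegmentEnd();
}
```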

Music in the Manual: FMOD Studio Vs. Wwise


A few days ago, I downloaded and installed the latest version of a software package entitled FMOD Studio and was pleasantly surprised to discover that an oversight had been corrected. It’s not unusual for software updates to correct problems or provide additional functionality, but this update was especially satisfying for me. The makers of FMOD Studio had added the “Music” section to the software manual.

A brief explanation: FMOD Studio is a software application designed by Firelight Technologies to enable game audio professionals to incorporate sound into video games. The application focuses solely on audio, and is used in conjunction with game software. In essence, FMOD Studio is folded into the larger construct of a game’s operational code, giving the overall game the ability to do more sophisticated things with the audio side of its presentation.

When FMOD Studio was initially released in August of 2012, the manual did not include information about the music capabilities of the software. Admittedly, the majority of FMOD Studio users are sound designers whose interests tend to focus on the tools for triggering sound effects and creating environmental atmospheres. That being said, many composers also use the portions of the FMOD Studio application that are specifically designed to enable the assignment of interactive behaviors to music tracks. It was a bit puzzling that the manual didn’t describe those music tools.

One of the biggest competitors to FMOD Studio is the Wwise software from Audiokinetic.  Wwise offers much of the same functionality as FMOD Studio, and one of the things I really like about working with it is its documentation.  Audiokinetic put a lot of thought and energy into the Wwise Fundamentals Approach document and the expansive tutorial handbook, Project Adventure.  Both of these documents discuss the music features of the Wwise software, offering step-by-step guidance for the creation of interactive music systems within the Wwise application.  This is why the omission of any discussion of the music tools from the FMOD manual was so perplexing.

It’s true that many of the music features of the FMOD Studio software are also useful in sound design applications, and some are similar in their function to tools described in the sound design portions of the manual.  Firelight Technologies may have assumed that those portions of the manual would be sufficient for all users, including composers.  However, composers are specialists, and their priorities do not match those of their sound design colleagues; when using the FMOD Studio tools, composers’ needs are sharply different from those driving the rest of the audio development community.  Audiokinetic understood this from the start, but Firelight Technologies seemed to be following a philosophy that hearkened back to the early days of the game industry.

In those days, the audio side of a game was often created and implemented by a single person.  This jack-of-all-trades would create all the sound effects, voice-overs and music.  Nowadays, the audio field is populated by scores of specialists.  It makes sense for Firelight Technologies to acknowledge specialists such as composers in its software documentation, and I’m very glad to see that they’ve now done so.  If you’d like to learn more about FMOD Studio, you can see a general overview of the application in this YouTube video: