Arrangement for Vertical Layers Pt. 2: A Game Composer’s Guide

Welcome back to my three-part blog series on the art of arrangement for dynamic music systems in games! In this series of articles, I'm discussing the techniques of arrangement as they pertain to interactive game music by exploring examples from the music I composed for video games from the LittleBigPlanet franchise. In part one of this series, we went over the role of the arranger, the importance of an interesting and creative arrangement, and the relationship between arranging for traditional linear and non-linear interactive music. We also reviewed arranging techniques that apply to melody, and how these should (or should not) be applied in an interactive composition. If you haven't read part one, please click here to read that entry first, and then return here to continue reading part two. Okay, are you back now? Ready? Here we go!

The countermelody

A countermelody is a second melody playing simultaneously with the main melodic line. Usually, the countermelody is perceived as subordinate to the foreground melody of the composition, but when we give it lots of love and attention, the countermelody can be as memorable and pleasing as the main melody of the piece. Not only does this make the overall composition stronger, but it works beautifully with the Vertical Layering construct of the LittleBigPlanet dynamic music system. When we create a countermelody for an interactive track, it's always best for us to keep in mind that this melody should be strong enough to stand on its own… because it may have to do just that.

As we discussed in part one of this blog series, a melody in Vertical Layers often needs to stand in near (or even complete) isolation. When all the other layers are deactivated, the main melody has to be strong enough to carry the track forward and continue to entertain the gamer. This may also be true for the countermelody, if it has been assigned to a layer separate from the one the main melody occupies. Separating the melody and countermelody can be tremendously useful for this interactive music system, because then two of the layers contain unique, simultaneous foreground melodic content that can either coexist or be played separately. Of course, it's also possible to assign the melody and countermelody to the same layer, and we can decide to do that if it seems that these melody lines are too interrelated to sever. Let's now explore a few arrangement concepts that work very effectively for countermelodies. Then we'll wrestle with some arrangement techniques for countermelodies that can be questionable within the Vertical Layering system.

Effective technique: voice leading / step-wise movement

As we discussed in the first part of this blog series, step-wise movement (voice leading) involves the creation of a sequence of notes that proceed smoothly and give the impression of logical development. The satisfaction that the listener experiences from hearing such natural, sensible movement in a melody arises from a long history of established conventions and practices in music theory. These include the many musical scales that have been formalized and named over the years: this long list will give us an idea of how many of these musical scales exist. The more familiar the musical scale, the more comfortable it will feel to the listener. Melodies typically follow the step-wise movement established by scales, and leaps within these melodies are designed to feel compatible with them.

In the previous blog we discussed how step-wise movement can make an isolated foreground melody feel satisfying on its own. This same principle applies to a countermelody that occupies its own layer within the Vertical Layering system.  When that layer is played by itself, it should also feel rewarding and enjoyable.  Step-wise movement will help the countermelody to continue to make aesthetic sense, even when detached from the rest of the arrangement and played on its own.
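To make this concrete, here's a rough sketch (in Python, with hypothetical note values rather than notes from any actual track) of how we might measure whether a line is predominantly step-wise:

```python
# Sketch: check how "step-wise" a melodic line is, using MIDI note numbers.
# The note values below are hypothetical, not taken from any actual track.

def stepwise_ratio(midi_notes):
    """Fraction of melodic intervals that are a step (2 semitones or less)."""
    intervals = [abs(b - a) for a, b in zip(midi_notes, midi_notes[1:])]
    if not intervals:
        return 1.0
    steps = sum(1 for i in intervals if i <= 2)
    return steps / len(intervals)

# A mostly step-wise countermelody versus a leapy one:
smooth = [60, 62, 64, 62, 60, 59, 60]   # C D E D C B C
leapy  = [60, 72, 65, 77, 58, 70]       # wide, unpredictable leaps

print(stepwise_ratio(smooth))  # 1.0 -- every interval is a step
print(stepwise_ratio(leapy))   # 0.0 -- every interval is a leap
```

A line that scores high on a measure like this will tend to feel coherent even when its layer plays alone.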

In this example from the “LittleBigPlanet 3 Stitchem Manor” track, we’ll first hear the layer containing the melody (which is carried by the violin section).  Then we’ll hear the countermelody (performed by woodwinds).  Finally, we’ll listen to both the melody and countermelody layers playing together.

As we can see from the above example, both the melody and countermelody follow step-wise movement and the same scale and harmonic structure, so that the shape and movement of the melodic lines remains pleasing (both separately and combined).  This example also demonstrates another arrangement technique that works well within this interactive system:

Effective technique: polythematic / polyrhythmic motion

One of the best ways for us to help the countermelody stand on its own as a viable melody in its own right is to distinguish it clearly from the main melody. We can emphasize contrast with two techniques that pertain to the ways in which melodies move.

Polythematic music is sometimes defined as music structured around multiple melodies, but the term can also refer to the ways in which these melodies interact. Focusing on contrasting opposites in musical structure, the polythematic approach emphasizes contrary rhythms, textures, movements, and energy levels. If the main melody on top is sustained and mellifluous, the countermelody underneath will feature short punctuated notes and energetic leaps. The emphasis is always on finding the greatest contrast between one melody and the other. This arrangement technique helps to clearly differentiate the two melodies, giving them unique musical identities.

Polyrhythmic music is often defined as the juxtaposition of two contrasting time signatures (for instance, a jig rhythm set against a traditional common-time march). These combinations result in awesome rhythmic complexities which arise from the inconsistent downbeats characteristic of the two dissimilar time signatures. However, polyrhythmic music can also be defined as music that emphasizes syncopation and rhythmic complexity, particularly when there is an overall lack of a strong downbeat. In describing this type of music in his book, What to Listen for in Music, the famous American composer Aaron Copland points out the incredibly clever 16th century madrigals that created complex rhythms without a clear time signature present. "The result makes for an unprecedented interweaving of independent rhythms," writes Copland. "The effect is anything but primitive." For an example of this technique, listen to "Faulte d'argent" by Renaissance composer Josquin des Prez:

Creating rhythmic contrast between a melody and countermelody is a great way to differentiate them, while also infusing the piece with energy and momentum.
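As a quick arithmetic sketch of why juxtaposed meters feel so complex: counting in eighth notes, the downbeats of a 6/8 jig and a 4/4 march only realign at the least common multiple of their bar lengths.

```python
from math import lcm  # available in Python 3.9+

# Sketch: when do the downbeats of two juxtaposed meters line up again?
# Counting in eighth notes: a 6/8 jig bar holds 6, a 4/4 march bar holds 8.

jig_bar, march_bar = 6, 8
realign = lcm(jig_bar, march_bar)
print(f"Downbeats coincide every {realign} eighth notes "
      f"({realign // jig_bar} jig bars, {realign // march_bar} march bars)")
```

Here the shared downbeat only recurs every 24 eighth notes, which is exactly why the combination keeps the ear off balance in between.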

Effective technique: polyphonic motion (contrary and oblique)

Polyphonic (or contrapuntal) music is defined as "many-voiced" – a composition that includes multiple independent melody lines. A countermelody is, by nature, a polyphonic construct. However, there are degrees of effectiveness among polyphonic techniques when used in a Vertical Layering system. Let's look at the two techniques that work best:

  • Contrary motion (in which two melodic lines move in opposite directions)
  • Oblique motion (in which one melodic line moves while the other does not)

These two types of polyphonic construction work well because they cause the melodic lines to diverge and display unique characteristics which set them apart from each other.  We’ll be talking about the other two types of polyphonic motion later in this article.

Effective technique: frequency range isolation

In the first installment of this blog series on arranging for Vertical Layers, we discussed the advantage of placing the main melody in a range that allows it to sing clearly through the mix, without interference.  We can achieve this more easily if we place the pitches of the melody in a frequency range that is not currently occupied, and use an instrument with a tone color that isn’t already represented in the rest of the arrangement.  While this is important for differentiating the main melody from the accompaniment, it becomes doubly important when attempting to create contrast between the primary melodic line and its countermelody.

In this example from LittleBigPlanet 2 Victoria’s Lab, I composed the main melody as a set of contrapuntal lines carried by a female choir:

The countermelody was also carried by a female vocal, so to make sure that the melody and countermelody didn’t clash, I wrote the countermelody to be much higher.  In this way, I made sure that it would occupy its own frequency range:

Let’s listen to that excerpt from the LittleBigPlanet 2 Victoria’s Lab track, with all layers now playing:

Effective technique: the bassline countermelody

Sometimes this approach may seem less than obvious, but in conducive circumstances, the bassline of an arrangement can serve as an effective countermelody. The viability of this option relies on the logical step-wise motion of the bassline, and its ability to leap forward in the mix with enough dynamism to capture the listener's attention. For instance, in the LittleBigPlanet Cross Controller game, the HenOMorph Encounter track includes a bassline countermelody, which contrasts against a high-pitched electronic synth melody. This works partially because the main melody has simplified at this point, allowing the bassline countermelody to shine through. Here's an excerpt of the simple main melody:

Here’s the bassline countermelody:

Now, here’s what it sounds like when all layers are playing:

Questionable technique: antiphonal style

So now we move on to the countermelody arranging techniques that can be problematic for an interactive composition in Vertical Layers. We'll start with antiphonal style – otherwise known as either antecedent/consequent or (more simply put) question/answer style. This is the melodic structure in which a melody is split into phrases: one of these phrases creates a questioning tension, while the subsequent phrase releases that tension with a sense of welcome resolution. The question phrase is performed by one instrument or instrumental section, and the answer phrase is performed by another instrument or section. This is sometimes also known as "call and response."

Antiphonal style can work in Vertical Layers if the antecedent and consequent are kept within the bounds of a single layer.  While the phrases feel separate and distinct, and are performed by different instruments, the melody that they create still gives the impression of a single melody line, not a countermelody.  We might be tempted to treat them as separate entities and split them into two layers, but that would likely create dissatisfying “half melodies” when the layers are played in isolation.  We’d have a layer full of “questions” and another layer full of “answers.”

Questionable technique: polyphonic motion (similar and parallel)

This problematic arranging approach is a direct echo of the "harmonized melody" problem (which we covered in part one of this blog series). The "many-voiced" nature of a polyphonic composition allows for several types of motion between the melody lines. We've already talked about contrary and oblique motion, which work well for countermelodies, but there are two other types which don't function as well in a Vertical Layering system:

  • Similar motion (in which two melodic lines move up and down together, but not with an identical interval between them)
  • Parallel motion (in which the two melodic lines move up and down together while keeping the same interval between them)

These two types of polyphonic construction essentially function as "harmonized melody." This kind of similar or parallel movement in two melodies renders them too alike for the purposes of a melody and countermelody in Vertical Layers. We should always seek contrast when creating two melodies that will coexist simultaneously in the same piece.
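For those who like to see these distinctions spelled out, here's a small sketch that classifies all four types of polyphonic motion from before-and-after pitches. (Chromatic intervals are used here for simplicity; strictly speaking, parallel motion is usually judged by diatonic interval.)

```python
# Sketch: classify the motion between two voices across one melodic move,
# given MIDI pitches before and after for each voice.

def motion_type(v1_from, v1_to, v2_from, v2_to):
    d1, d2 = v1_to - v1_from, v2_to - v2_from
    if d1 == 0 or d2 == 0:
        return "oblique"     # one voice holds still while the other moves
    if (d1 > 0) != (d2 > 0):
        return "contrary"    # voices move in opposite directions
    if d1 == d2:
        return "parallel"    # same direction, interval between voices kept
    return "similar"         # same direction, interval between voices changes

print(motion_type(60, 62, 67, 65))  # contrary
print(motion_type(60, 62, 67, 67))  # oblique
print(motion_type(60, 62, 67, 69))  # parallel
print(motion_type(60, 62, 67, 72))  # similar
```

Contrary and oblique results are the ones we want between melody and countermelody; parallel and similar results signal the "too alike" problem described above.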

Questionable technique: homorhythmic texture

In homophonic movement, all pitches move together in parallel, ascending and descending as one unit.  This concept usually implies homorhythmic texture, in which all the notes are performing identical rhythms.  However, it is possible for the notes of a melody and countermelody to have contrasting motion, but an identical rhythmic structure.  This would give us a homorhythmic texture between the melody and countermelody.

While the notes can be very different, the identical rhythms performed by both the melody and countermelody will still instill a sense of sameness. For this reason, it may be best for us to avoid a homorhythmic structure between the melody and countermelody in a Vertical Layering composition. We want these two melodic lines to make distinctly dissimilar impressions on the listener when played in isolation, and identical rhythms work against that goal.

Questionable technique: imitation

A countermelody that uses the imitation technique will repeat the exact notes of the main melody, but will do so in the manner of an echo: a delayed repetition of the melodic theme.  The most popular and recognizable example of imitation is the “Row, Row, Row Your Boat” song we all learned as kids. The first voice begins the melody, and then after the passage of a couple of measures, the second voice begins the same melody.
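In programmatic terms, imitation is simply a time-shifted copy of the melody. Here's a minimal sketch (the note events are hypothetical, and the two-measure delay is counted as 8 beats in 4/4):

```python
# Sketch: imitation ("Row, Row, Row Your Boat" style) as a delayed copy.
# Events are (start_beat, midi_pitch) pairs; the second voice enters two
# measures later, singing the identical notes.

def imitate(melody_events, delay_beats):
    return [(start + delay_beats, pitch) for start, pitch in melody_events]

theme = [(0, 60), (1, 60), (2, 60), (3, 62), (4, 64)]  # hypothetical opening
second_voice = imitate(theme, delay_beats=8)

print(second_voice)  # [(8, 60), (9, 60), (10, 60), (11, 62), (12, 64)]
```

The sketch makes the problem obvious: apart from the time offset, the two voices carry byte-for-byte identical content.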


This approach never works for a melody and countermelody in Vertical Layering, because the two melodic lines are identical. Even though they occur at different times, when heard in isolation they will sound and feel indistinguishable from each other.


So, we’ve completed our exploration of arrangement techniques for countermelodies that are effective within a Vertical Layering structure. In the third and final installment of this blog series, I’ll be talking about arrangement techniques for the harmonic support, including chord progressions, voicings and inversions.  I hope you found this blog to be helpful! I’d love to hear what you think in the comments!


Winifred Phillips is an award-winning game music composer. Her most well-known projects include such famous and popular games as Assassin's Creed Liberation, God of War, the LittleBigPlanet franchise, and many others. She is the author of the award-winning bestseller A COMPOSER'S GUIDE TO GAME MUSIC, published by the Massachusetts Institute of Technology Press. Follow her on Twitter @winphillips.

Arrangement for Vertical Layers Pt. 1: A Game Composer’s Guide


The art of arrangement.

This week, I’m beginning a three-part blog series on the art of arrangement for dynamic music systems in games. I’ll be exploring the techniques of arrangement as they relate to interactive game music by discussing examples from the music I composed for video games from the blockbuster LittleBigPlanet franchise.

Arrangement for interactivity is a complex subject, so I thought we should begin by developing a basic understanding of what arrangement is, and then move on to the reasons why it’s especially important in interactive music.

When we talk about music, we often discuss the "melody" and the "accompaniment." The melody typically occupies the forefront of our attention, while the accompaniment supplies a chord structure and a series of musical events that embellish and support the foreground content. Both melody and accompaniment are expressed by musical instruments chosen to complement each other when combined, or contrast with each other in the most desirable way. While the underlying melody and chord structure of a piece of music can be considered the essential skeleton of the piece, the creativity with which these structural elements are expressed by musical instruments is the true flesh-and-blood of the composition. This creative expression of music through musical instrumentation is known as the arrangement.


Melody and chord structure: the skeletons of music, waiting to be fleshed out by the arrangement.

Big George Webley

Sound on Sound magazine has a great article written by George Webley about the art of traditional song arrangement. “Big George” was a well known writer, radio personality, and composer best known for the television theme music for Ricky Gervais’s The Office. His article for Sound on Sound is a great educational resource for traditional linear arrangement. According to Big George, “The line between composing or producing a tune and arranging it is a very thin one. If you’re either the producer or the composer, arrangement goes with the territory.”

Interactive music: the LittleBigPlanet model

Traditional arrangement is a perfect expression of artistic control. As the composer/arranger, we decide on the instruments that will perform the composition and then write the notes that will be played by those instruments. When the arrangement is complete, the piece of music reaches a fixed state in which the only alterations take the form of subtle performance nuances by the musicians.

For interactive game music, we have to accept that the music will never reach a perfectly fixed state, but will instead exist in constant flux. By creating the music in component parts rather than as a contiguous whole, we enable the game’s programming to manipulate the arrangement, activating and deactivating instruments, thinning and thickening the mix, adding and removing structural elements such as melody and harmony… the possibilities for variation are enormous.

Photo of Winifred Phillips taken in the LittleBigPlanet 3 booth at the Electronic Entertainment Expo 2014.

Six simultaneous recordings.

In this blog series, we'll be using the music of LittleBigPlanet as our primary example of an interactive music system. In LittleBigPlanet, music is written in an interactive model known as Vertical Layering. The interactive music in the LittleBigPlanet console games is divided into six audio layers playing simultaneously. For those of us not familiar with Vertical Layering, I provided an exploration of that topic in my previous blog, "Interactive Game Music of LittleBigPlanet 3 (Concepts from my GDC Talk)." Here's a quote from that blog:


In a vertical layering music system, the music is not captured in a single audio recording. Instead, several audio recordings play in sync with one another. Each layer of musical sound features unique content. Each of the layers represents a certain percentage of the entire musical composition. Played all together, we hear the full mix embodying the entire musical composition. Played separately, we hear submixes that are still satisfying and entertaining for their own sake. The music system can play all the layers either together or separately, or can combine the layers into different sets that represent a portion of the whole mix.

When implemented into gameplay, layers are often activated when the player moves into a new area. This helps the music to feel responsive to the player’s actions. The music seems to acknowledge the player’s progress throughout the game. It’s important to think about the way in which individual layers may be activated, and the functions that the layers may be called upon to serve during the course of the game.
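As a rough sketch of the idea (the layer names here are hypothetical, not taken from any actual LittleBigPlanet track), a Vertical Layering system boils down to a set of always-synchronized stems with on/off states:

```python
# Sketch of a six-layer vertical music system (layer names are hypothetical).
# Each layer is an always-running stem; the game toggles its audibility.

class VerticalMix:
    def __init__(self, layer_names):
        self.active = {name: False for name in layer_names}

    def activate(self, name):
        self.active[name] = True      # e.g. the player enters a new area

    def deactivate(self, name):
        self.active[name] = False     # e.g. the player retreats or idles

    def current_mix(self):
        """The submix the player hears right now."""
        return [name for name, on in self.active.items() if on]

mix = VerticalMix(["melody", "countermelody", "bass",
                   "percussion", "pads", "ornaments"])
mix.activate("melody")
mix.activate("bass")
print(mix.current_mix())  # ['melody', 'bass']
```

The composer's job, then, is to make sure every submix that `current_mix` could return still sounds like intentional music.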

To make this abstract concept a bit more concrete, let’s watch a video from the LittleBigPlanet franchise that breaks down the six layer music system and shows how it works. Here’s the “Escape from San Crispin” level of the LittleBigPlanet Cross Controller game. Notice how the music layers are played separately in the beginning of the video, and then their activations and deactivations are indicated at the bottom of the screen throughout gameplay.

For a more elaborate discussion of the six layer system, we can watch this video I produced to explore the music system of the LittleBigPlanet Toy Story video game:

With this interactive music system in mind, let’s now turn to a discussion of the relationship between the fundamentals of arrangement and the Vertical Layering system. We’ll begin with one of the most important elements of any piece of music:

The melody

Those of us who write music will instinctively understand the construction of a satisfying melody, but the strength of the melody can waver in the context of an interactive arrangement. Let’s first go over a few arrangement concepts that are effective when developing a melody for arranging within an interactive framework. Then we’ll address a few common arrangement techniques that can be questionable within this interactive music system.

Effective technique: voice leading / step-wise movement

In arranging the instruments of an interactive composition based in Vertical Layers, we have to make basic decisions about which musical elements will reside in which layers. When we assign the main melody to a layer, we’ll need to make sure that it feels satisfying when heard in isolation. The music system can (and probably will) play the melody layer by itself. In the case of the LittleBigPlanet music system, this means that only one-sixth of the piece will be playing.


The conductor of an orchestra.

Imagine yourself as a conductor, turning to five-sixths of the orchestra and waving them to silence. The remaining players might suddenly feel very exposed and unsupported, unless the music had been structured to accommodate this possibility.

For a melody layer, the graceful and aesthetically-pleasing shape of the musical movement becomes paramount. A composer might feel comfortable creating a melody with surprising leaps and unorthodox movement, confident that the rest of the arrangement will support these jumps… but remove the rest of the arrangement, and the melody becomes awkward and strange. Step-wise movement, otherwise known as voice leading, is the art of structuring a succession of notes so that they flow from one to another with an impression of natural progression. This sensation of “logical movement” stems from our human experience with melody, which includes the note sequences we’re accustomed to hearing, and the scales that reside at the heart of popular music today.

One of the best explanations of this concept can be found in Bobby McFerrin's tremendously entertaining presentation at the event "Notes & Neurons: In Search of the Common Chorus" at the World Science Festival in 2009. Bobby McFerrin is a pop vocalist best known for his 1988 hit song, "Don't Worry, Be Happy." In this awesome video, McFerrin demonstrates how the pentatonic scale has become so ingrained in our collective psyche that when given the beginning of a note sequence, we need little prompting to know how to complete it. The notes just feel natural, so we know what should come next.
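The pentatonic scale McFerrin demonstrates is just a five-note interval pattern, which can be sketched in a few lines:

```python
# Sketch: the major pentatonic scale, built from its interval pattern
# (in semitones above the tonic).

PENTATONIC_STEPS = [0, 2, 4, 7, 9]  # do re mi sol la

def major_pentatonic(tonic_midi):
    return [tonic_midi + step for step in PENTATONIC_STEPS]

print(major_pentatonic(60))  # [60, 62, 64, 67, 69] -- C D E G A
```

Melodies built from a pattern this familiar carry their own sense of inevitability, which is exactly what an isolated layer needs.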

When considering a melody for use in Vertical Layers, we should always try to incorporate logical step-wise movement. This will help to ensure that the melody still works, even when the arrangement drastically changes due to the interactive nature of the music system.

Effective technique: instrumental motion

In Vertical Layering for a music system such as the one employed by the LittleBigPlanet franchise, the musical content has to be fairly busy in order to be flexible as an interactive construct. For LittleBigPlanet games, each individual layer needs to have enough content to stand on its own, and yet still work when all the layers are played simultaneously. This allows the music to be very interactive during gameplay: the layers can all be played separately, or they can be played in lots of different combinations. In fact, if the game fully utilized every available combination, then gamers would hear the music layers combined in 63 completely different ways, from singular layers to combinations of 2, 3, 4, 5 and 6. With all these layers designed to have enough foreground content to stand alone, the result of this design can be a music mix that hovers on the edge of sounding excessively busy. How can the all-important foreground melody leap to the top of the mix if the arrangement tends to be quite active?
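Counting every nonempty subset of six layers (singles through the full six-layer mix) gives 2 to the 6th power minus 1 combinations, which we can verify directly:

```python
from math import comb  # available in Python 3.8+

# Counting the possible layer combinations in a six-layer system:
# every nonempty subset of the six layers is a distinct submix.

total = sum(comb(6, k) for k in range(1, 7))
print(total)  # 63, i.e. 2**6 - 1
```

Each of those combinations is a mix the composer has implicitly signed off on, which is why the arrangement choices in this series matter so much.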

Author Vince Corozine

This can be addressed by a technique I described in my book (A Composer’s Guide to Game Music) as “compositional dynamics.” Techniques such as arpeggios, glissandi, runs, trills and other types of ornamentation serve to enliven a melody. The resulting energy creates a sense of movement that helps the melodic theme to leap out of the mix. According to Vince Corozine, the author of Arranging Music for the Real World (Mel Bay Publications), “instrumental motion, with its active design, captures more of the listener’s attention.” (p. 30)

A good example of this can be found in the LittleBigPlanet 3 Ziggurat Theme, which I composed and arranged for the LittleBigPlanet 3 video game. In this piece of music, the melody is arranged as a polyphonic construct according to the tenets of the classical fugue. Because of this, the melody exhibits constant motion and activity, which helps it to leap out of the mix. Here’s a short excerpt of the layer containing the melody:

And here’s the full mix at that specific point in the composition, showing how energetic activity can help a melody leap out of a busy arrangement:

Effective technique: frequency range isolation

As we know, Equalization (EQ) is the process whereby a mixing engineer adjusts the perceived volume level of audio frequencies in sonic sources in order to produce a result that is ideal for the recording in which the sound resides. These adjustments take place on frequency ranges – some high, some middle, and some low. To create a good mix, all the musical elements should occupy a portion of the frequency spread that isn’t already cluttered with content. There should be a balanced and satisfying combination of high, middle and low sounds that are clearly perceived within their respective frequency ranges.

A graphic equalizer can adjust frequencies, from low to high.

When we’re arranging a composition, frequency ranges become important when considering what instruments should carry what parts. In the case of the melody, it’s helpful if the assigned instruments occupy a frequency range that’s been isolated and dedicated to the melody – a frequency range that isn’t already busy. Musical content that resides in the same frequency range as the melody may tend to clash and compete with it, and this can prevent the melody from being perceived clearly within the full mix.
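To see how register separation plays out in actual frequencies, here's a sketch using the standard equal-temperament conversion from MIDI note number to Hz (the example registers are hypothetical):

```python
# Sketch: where a melody's pitches sit in the frequency spectrum.
# Standard equal-temperament conversion from MIDI note to Hz (A4 = 440 Hz).

def midi_to_hz(note):
    return 440.0 * 2 ** ((note - 69) / 12)

# Hypothetical registers: a cello-range accompaniment vs. a flute-range melody.
low_line = [36, 43, 48]   # C2 G2 C3
melody   = [84, 86, 88]   # C6 D6 E6

print([round(midi_to_hz(n), 1) for n in low_line])  # [65.4, 98.0, 130.8]
print([round(midi_to_hz(n), 1) for n in melody])    # [1046.5, 1174.7, 1318.5]
```

With roughly a decade of spectrum between the two lines, neither fights the other for the same EQ real estate.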

Questionable technique: doubling

In traditional arrangement, doubling is the reinforcement of a particular instrument part by virtue of its exact replication in another section of the ensemble. For instance, the flute section may double the violins when they are carrying a melody line, and this doubling may serve to brighten the sound, give clarity to the articulations and help the melody to soar brightly over the rest of the arrangement.

Identical doubling can be problematic.

While doubling within a single interactive layer is a viable technique that can work within Vertical Layering, it becomes highly ineffective when the doubling instruments are separated into multiple layers. The power of the Vertical Layering system lies in its ability to introduce unique content whenever its layers are activated. If two layers contain nearly identical content (such as a violin layer and a flute layer that both carry an identical melody), then the activation of those layers loses all of its dramatic impact.

In this case, if we intend to have certain instruments doubling each other within the arrangement, it makes sense for us to combine those instruments in the same interactive layer.

Questionable technique: harmonized melody

A useful way for us to add interest and color to a melody is the arrangement technique of adding a harmony line. The harmony supports and embellishes the melody. It’s typically constructed in three possible ways:

  • A line moving in a constant parallel interval such as a diatonic third,
  • A line with parallel motion but a variable interval that uses the available notes of the underlying chord structure,
  • A parallel harmony line that uses the tones of the pentatonic scale in the current key signature of the piece.

This approach can work within Vertical Layers if the melody and its accompanying harmony line are grouped in the same layer. However, separating the melody and harmony lines into different layers is problematic for this interactive system. Because the pitches of the harmony line will necessarily move up and down in very similar ways with an identical rhythmic structure, it will bear a strong resemblance to the original melody. This can result in two layers that have strikingly similar content. It's much better to create a full-blown countermelody (which is a topic we'll be discussing in part two of this blog series).
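As a sketch of the first construction above, a harmony line a constant diatonic third below the melody, here's one way to compute it within a single scale (the melody fragment is hypothetical, and every note is assumed to belong to the scale):

```python
# Sketch: harmonize a melody a diatonic third below, within C major.
# Assumes every melody note belongs to the scale.

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # pitch classes of C major

def scale_degree(midi_note, scale=C_MAJOR):
    octave, pc = divmod(midi_note, 12)
    return octave * 7 + scale.index(pc)

def from_degree(degree, scale=C_MAJOR):
    octave, step = divmod(degree, 7)
    return octave * 12 + scale[step]

def harmonize_in_thirds(melody):
    """Harmony line two scale steps (a diatonic third) below the melody."""
    return [from_degree(scale_degree(n) - 2) for n in melody]

melody = [64, 65, 67, 69, 67]          # E F G A G (hypothetical)
print(harmonize_in_thirds(melody))     # [60, 62, 64, 65, 64] -> C D E F E
```

Notice that the output rises and falls exactly in step with the input, which is precisely why such a line belongs in the same layer as its melody.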

Questionable technique: the full homophonic tutti

All together now…

Now we come to the one arrangement technique that never works in Vertical Layering: the full homophonic tutti. Literally meaning “all together” in Italian, the full tutti is a section of a piece of music in which all instruments of the ensemble play together. This would be perfectly fine for a Vertical Layering composition, except when that tutti is constructed homophonically, with identical rhythms and faithful parallel motion. A section like this might even be arranged as a full unison passage, with all instruments playing the same notes (although some may be arranged in higher or lower octaves according to their natural ranges).

Having the whole ensemble come together to state a theme can be a powerful technique within a traditionally arranged piece of music. However, as we've learned from the sections about harmonized melody and doubling, whenever we use an arrangement technique that creates similar or identical content between layers, we degrade the interactive utility of the piece of music we're arranging. If the layers sound similar (or the same), they'll make no impact when they're activated or deactivated in the game.


So, now we’ve completed this discussion of some arrangement techniques for melodies that work well within the structure of a Vertical Layering composition. In the next installment of this blog series, I’ll be talking about techniques for effective countermelodies in Vertical Layers, and then later we’ll explore voicing techniques for effective harmonic support. I hope you’ve enjoyed this blog! Please let me know what you think in the comments!



Simultaneous Genres for the Game Music Composer


Since the Grammy nominating period is underway, I’ve been thinking a lot about my work on the popular LittleBigPlanet video game franchise.  I recently submitted a couple of tracks from the LittleBigPlanet 3 soundtrack for consideration (LittleBigPlanet 3 The Ziggurat Theme and LittleBigPlanet 3 The Pod), which brought to mind some of the creative processes that went into structuring the interactive music for the LittleBigPlanet games. In my blog today I’d like to share with you a fun technique that’s actually one of my favorite aspects of composing music in this interactive system.  I’ve been a part of the music composition team for six LittleBigPlanet games, and over the course of those six projects, I’ve been asked to execute this particular technique a lot.  It’s a great musical trick that can only be pulled off when you’re composing in a Vertical Layering system.  Since the LittleBigPlanet music system is one of the most complex examples of Vertical Layering, it really makes for ideal conditions in which to execute this technique, which is…

Composing in Two Simultaneous Genres

We’ll recall that Vertical Layering is the process by which a single piece of music is recorded into separate yet simultaneous audio recordings that each embody a percentage of the whole composition.  This allows the music to be disassembled and reassembled into different instrument combinations during gameplay.
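
To make those mechanics concrete, here’s a rough sketch (in Python, which of course is not what the LittleBigPlanet engine actually uses) of how a Vertical Layering mixer behaves: every layer shares one playhead, and the game simply raises or lowers each layer’s gain. The class name, layer names and the simple additive mix are my own illustrative assumptions, not the actual game code:

```python
# Hypothetical sketch of a Vertical Layering mixer: synchronized stems
# share one playhead, and the game toggles each stem's gain. The layer
# names and additive mixing are illustrative assumptions.

class VerticalLayerMixer:
    def __init__(self, layer_names):
        # Every layer always plays in sync; only its gain (0.0-1.0) changes.
        self.gains = {name: 0.0 for name in layer_names}

    def activate(self, *names):
        for name in names:
            self.gains[name] = 1.0

    def deactivate(self, *names):
        for name in names:
            self.gains[name] = 0.0

    def mix(self, stems_at_playhead):
        # stems_at_playhead: {layer_name: sample value} at the shared playhead.
        # Because all stems stay time-aligned, layers can drop in or out
        # without ever interrupting the music.
        return sum(self.gains[n] * s for n, s in stems_at_playhead.items())

mixer = VerticalLayerMixer(["melody", "countermelody", "bass", "drums"])
mixer.activate("melody", "bass")
sample = mixer.mix({"melody": 0.5, "countermelody": 0.3, "bass": 0.2, "drums": 0.1})
# Only the melody and bass stems contribute; the inactive layers stay silent.
```

The key point the sketch illustrates is that nothing is ever cut or restarted – deactivated layers keep “playing” silently, so reactivating one is seamless.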

Last year I produced an instructional video that goes into the process in more depth:

Vertical Layering gives us the chance to write one track in two simultaneous musical genres. In traditional music composition, if we want to combine two genres of music in one track we can attempt to pull together a creative fusion, in which the styles are mixed together to create a result that isn’t quite one genre, and isn’t quite the other. Fusions can be exciting and original, but that’s not what we’re talking about here. The musical interactivity of Vertical Layering lets us keep the two genres distinct, and still incorporate them into the same piece of music.  The track can switch up which layers are playing, and it’ll be in one musical genre one moment, and then become another genre at the drop of a hat. It’s very cool, and a lot of fun for a composer – although it can also be hard for us to wrap our heads around, especially at first.
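
One way to picture the trick is to imagine the layers grouped by genre, so that the music system can flip the whole track’s style simply by swapping which group is active. This little Python sketch is purely illustrative – the genre names and layer names are my assumptions, not anything from an actual LittleBigPlanet track:

```python
# Hypothetical sketch: grouping Vertical Layering stems by genre, so a
# track can jump between two distinct styles by swapping the active
# group. Genre and layer names are illustrative assumptions.

GENRE_GROUPS = {
    "whimsical": ["celesta", "pizzicato", "glockenspiel"],
    "rock": ["drums", "electric_guitar", "bass"],
}

def layers_for(genre):
    # Activating one group while silencing the other flips the whole
    # track's style mid-phrase, without any musical seam.
    active = set(GENRE_GROUPS[genre])
    return {
        layer: (layer in active)
        for group in GENRE_GROUPS.values()
        for layer in group
    }

state = layers_for("rock")
# Only the rock layers are flagged active; the whimsical layers are muted.
```

Because both groups belong to the same synchronized composition, the switch can happen at any moment of gameplay, which is exactly what makes the technique feel so seamless.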

Let’s take a look at three examples of this technique in action.  We’ll start with a couple of tracks from LittleBigPlanet 2, and then a more recent track from the latest game in the franchise – LittleBigPlanet 3.

LittleBigPlanet 2 Victoria’s Lab

In the “Victoria’s Lab” level from LittleBigPlanet 2, our world-famous hero, Sackboy, must do his best to navigate a perilous steampunk bakery, using cupcakes as weapons against evil robots made of teacups.  All these wacky elements come together to create the typically whimsical awesomeness that makes LittleBigPlanet the lovable franchise it is.  I composed the Victoria’s Lab music for LittleBigPlanet 2. Here’s a music video that includes the complete track, along with action from the Victoria’s Lab level of the game:

The Victoria’s Lab music aptly demonstrates the “two simultaneous musical genres” approach.  For instance, it can switch from a whimsical lollipop style to a gritty orchestral/rock hybrid at any time. Here’s the whimsical lollipop:

And here’s the orchestral/rock hybrid:

It’s like the music has a case of multiple personalities, and the audio team can use this to add distinctive character to locations and situations within the level – some areas benefiting from the cuteness of the whimsical style, others from the toughness of the rock. In order to make this happen, we as game composers have to keep the two styles balanced in our minds: compose them both separately, test how they work together, adjust the instrumental performances and fundamental organization so that the two styles can coexist in a way that makes musical sense, and then test the layers in various configurations until they all work well – both when played together and when played alone.

LittleBigPlanet 2 Eve’s Asylum

Now, while the Victoria’s Lab example presents a fairly extreme contrast in music styles, the music from the Eve’s Asylum level of LittleBigPlanet 2 shows off this technique in an even more dramatic way. The Eve’s Asylum level is set inside a giant tree, where a lady with an apple for a head runs a highly-spiritual insane asylum. The music for this level is structured around two very distinct musical genres that are assigned to specific tasks.

The sparkling, surreal New Age music style works to enhance gameplay during relaxed exploration, and it also highlights the natural beauty of the giant tree. Here’s a taste of that:

On the flip side of the coin, the Boogie-Woogie style pays tribute to the Andrews Sisters and the age of swing, and the high-energy rhythms provide support for combat and perilous situations. Let’s listen to a little of that:

Okay, now here’s what it sounds like when the Vertical Layering music system transitions from one musical genre to the other in the Eve’s Asylum level of LittleBigPlanet 2:

What’s great about this technique is that it allows the music to morph into something completely different in a perfectly seamless way, without ever making the player overtly conscious of the transition, and without creating any artificial sense of demarcation where one style ends and another begins. The music is simply interacting with the gameplay, changing in a logical way as the player’s circumstances change. Now, let’s look at one more example of this technique, this time from LittleBigPlanet 3.

LittleBigPlanet 3 The Ziggurat Theme

In the Ziggurat level, Sackboy explores a gigantic sanctuary that’s full of both grandly spiritual architecture and playfully eccentric machines. As a setting that already had a built-in duality, it seemed clear that the music should also have a similar sense of division – so I composed this Vertical Layering composition in two musical styles. The first was a traditionally designed Baroque-style fugue – a multi-voiced counterpoint composition built around the repetition and development of a single melodic theme. Here’s a snippet of that Baroque-style fugue:

The second style was a quirky World Fusion in which log drums, upright bass and assorted percussion instruments worked together to have some fun with African, Latin, Polynesian and Jazz rhythms. Here’s an excerpt of those groovy world beats:

So, the music is essentially coming from the opposite ends of the cultural spectrum – a very strict and refined musical form on one side, and a very groovy and uninhibited style on the other. Now, watch how the music system added layers during this gameplay sequence in the Ziggurat level of LittleBigPlanet 3:

Vertical Layering is a tremendously flexible composition technique that allows a game composer to incorporate two simultaneous musical genres into a single track. We can use the two distinctly-different genres separately, and then combine them to create dramatically different musical effects.  It’s a fun technique, and I hope that you’ll give it a try in your own work.  Let me know in the comments if you’ve ever tried to combine two musical genres using Vertical Layering, or if you’re planning to try it in the future!



Power to the Players: Music for User-Created Levels

This week, I’d like to touch upon an aspect of the LittleBigPlanet music system that sets it apart from most other games – and that is the way in which the game gives players the power to directly manipulate the music content.


Every piece of music in a LittleBigPlanet game is also a collectible prize that players can obtain and then use in levels that they build themselves using the game’s creation tools. For this reason, when composing for a LittleBigPlanet game, the members of the music composition team have to keep in mind that there’s no way to predict how the user community will use the music. Certainly, the players will be sharing their user-created levels across the entire community – there are over 9 million levels so far – and that knowledge tends to put everything in a whole new light.

As I’ve mentioned in previous blogs, the music of the LittleBigPlanet franchise for consoles is structured using a Vertical Layering system comprised of six layers – six simultaneous audio recordings that play in synch with each other and each represent a percentage of the whole composition. This allows the music to be disassembled and reassembled by the game engine according to what’s happening during the course of play.  That means that each music composition is fragmented into six parts.  So, I have to ask myself – when players are using one of the interactive tracks I’ve composed for a LittleBigPlanet game, will users play only one layer out of the six? That thought tends to make me scrutinize every layer pretty intently.

On the other hand, will players just set every layer as active, at full volume, all the time? Again, that’s a thought that puts me on high alert, leading me to turn a hyper critical eye on each composition before I make that final submission to the developers.


When we create interactive music for most projects, we can trust that the audio team at the development studio will work to implement the music in the most advantageous way, with the most satisfying musical results – but players tend to make their decisions based on what seems like fun at the time.

Even so, I’m always excited to hear how players have implemented my music into their games.  Here are some of the best examples of ingenuity and artistry from a few of the top LittleBigPlanet level creators:

LittleBigPlanet 3 The Ziggurat Theme

In the Ziggurat level, Sackboy wanders through an impressive sanctuary characterized by imposing architecture and lots of glittering glass, with outdoor sections blanketed by softly falling snow.  I was asked to create music for this area, which was structured as a central hub from which Sackboy could embark on adventures and accept missions.  The music I composed included six layers – Choir, Harp, Bells, Bass, Jazz Drums and Percussion.  Here is a short 12-second excerpt taken from each of the six layers at the exact same moment in the composition:

In the Ziggurat level created by the development team at Sumo Digital, Sackboy repeatedly visits a central hub area, and the layers of the music are triggered in different configurations depending on when Sackboy visits.  The layers don’t change noticeably while Sackboy is exploring the level, but when he returns to the same level later, the music will have changed its layer configuration. Here’s a brief example of how that worked:

In the awesome user-created level Fuga Ad Infinitum (designed by Aratiatia), the Ziggurat Theme music is used with a very different triggering strategy.  The layers are turned on and off depending on the actions of Sackboy as he runs and flies through a mythologically-inspired environment, causing the music to fluidly change its character while Sackboy explores.  Because of this fundamentally different method of music triggering, The Ziggurat Theme has a unique tone and atmosphere in Fuga Ad Infinitum.  Here’s a gameplay video that shows how the music was triggered in the Fuga Ad Infinitum game:

The user Aratiatia created a mesmerizingly beautiful level, lacing the layers of The Ziggurat Theme throughout with thoughtfully designed trigger points that supported the action of the game very well.
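
Triggering strategies like these can be surprisingly simple to model. Here’s a hypothetical Python sketch of the hub-revisit approach, in which each return to the hub advances to the next layer configuration; the layer names and rotation order are my own illustrative assumptions, not Sumo Digital’s actual trigger logic:

```python
# Hypothetical sketch of a hub-revisit triggering strategy: each time
# the player re-enters the hub, the next layer configuration is chosen,
# so the music feels fresh on every visit. Layer names and rotation
# order are illustrative assumptions.

import itertools

VISIT_CONFIGS = itertools.cycle([
    {"choir", "harp"},
    {"choir", "bells", "bass"},
    {"harp", "bells", "jazz_drums"},
    {"choir", "harp", "bells", "bass", "jazz_drums", "percussion"},
])

def configuration_for_next_visit():
    # Called once each time Sackboy re-enters the hub; the active set
    # then stays fixed while the player explores, as described above.
    return next(VISIT_CONFIGS)
```

An action-driven strategy like the one in Fuga Ad Infinitum would instead call into the layer system continuously, toggling layers in response to the player’s moment-to-moment behavior rather than once per visit.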

LittleBigPlanet 2 Toy Story

Sometimes an interactive track can come across differently with very small changes in implementation.  As an example – the LittleBigPlanet 2 Toy Story game was a self-contained adventure in the world of the famous and popular Toy Story movies.  I wrote an interactive western bluegrass track for gameplay sequences that included cowboy romps with Woody and his pals.  The details regarding the composition of each layer in this bluegrass Vertical Layering composition are explored in one of the tutorial videos I produced to supplement my book, A Composer’s Guide to Game Music:

During the LittleBigPlanet 2 Toy Story game, the interactive music would be used for both low-energy cinematics and high-energy gameplay.  Here’s a brief video showing how the music was implemented in the LittleBigPlanet 2 Toy Story game:

Now, here’s the same music used in an incredibly clever LittleBigPlanet 2 user-created game called Paper World 2 by Adell22.  In this implementation of the music, Adell22 chose not to use the melody layer, opting instead for the bluegrass rhythm and energy to give the vehicular gameplay its momentum:

The drastically different gameplay circumstances, combined with the different mix of layers in the music, help this track to come across distinctively and support the action of the Paper World 2 user-created game.

LittleBigPlanet 2 Victoria’s Lab

I’ve blogged before about the music I composed for the Victoria’s Lab level of LittleBigPlanet 2 – I mention it here as an illustration of how a Vertical Layering composition can change depending on the implementation.  The music of Victoria’s Lab includes both whimsical and dark layers which can be played together or separately.  Here’s a 15-second excerpt of the full mix of Victoria’s Lab, to remind us of how all six layers sound when played together.

In a user-created level for the LittleBigPlanet 2 game, the user Acanimate chose to implement only the drums, guitars and strings of the Victoria’s Lab music (in other words, the dark and serious layers) in this exciting and perilous level called Sprocketz.

As a contrast, in this section of another user-created level called Sweets Fantasy by the user White Rabbit, only the light and comical layers of the Victoria’s Lab music were used, with the following result:

I’m always inspired by what the LittleBigPlanet user community does with the interactive music written for the franchise.  It’s a privilege to create music that will become part of user-created levels, and fascinating to see how the players choose to implement the interactive components of the LittleBigPlanet music system.  Their choices sometimes reveal hidden utility in the music created for the franchise, and looking at their choices can help us better understand the creative possibilities inherent in Vertical Layering.



Social Media for the Game Music Composer


Candace Walker, recruiting manager at Naughty Dog studios.


At the recent Game Developers Conference Europe (August 3-4, 2015), top recruiting manager Candace Walker of Naughty Dog gave a presentation entitled “Career Bootcamp: The Benefits of Building an Online Presence and How To Do It.”  While her talk was not aimed at freelancers such as game composers, some of her strategies and recommendations are worth considering.

I explored some of these social media approaches in my book, A Composer’s Guide to Game Music (chapter 14, page 246), but Candace adds a new perspective to the topic from her vantage as a recruiter. I’ll be exploring some of the best highlights from her talk in this blog.

But first, let’s watch a short video created by bestselling author Erik Qualman (What Happens in Vegas Stays on YouTube).  This video focuses on the power of social media, in case any of us were unsure of what impact it might have on our professional lives:

The Goals of Social Presence

Candace Walker began her talk at GDC Europe by emphasizing four important guiding considerations that should shape our online efforts in the realm of social media.

  1. What is our goal?  What are we trying to achieve?  As game audio freelancers, we want our potential clients to be aware of our availability and (hopefully) our awesome skills as game composers!  We may also want to reach out to the game audio community at large, contributing to the overall body of knowledge and/or making friends and contacts.  Whatever our ultimate purpose with regard to social media, we should always define our goals specifically and keep them at the forefront of our online efforts.
  2. Who is our audience?  For game composers, the online audience may be composed of potential clients, fellow composers, game press, game music fans, etc.  Different messages are meant to reach different audiences, and we need to keep this in mind.
  3. Does our intended message have value for its audience?  Social media has parallels with consumer culture, in that an online audience is investing something of worth in order to obtain something valuable. In this case, the investment takes the form of time, and the valuable return may be educational or entertaining content.  With any social media message, we need to evaluate the inherent value of our content.  Will our audience think it’s valuable enough?  Will our message be worth their time?
  4. Does our intended message have the potential to incite conflict? This one is a tricky issue for us to ponder.  If we’re simply reaching out to potential clients, the issue of unexpected conflict shouldn’t be particularly problematic.  However, if we’re discussing the craft of game audio in social media and we suddenly stumble across a contentious topic that starts ruffling feathers, we need to take a breath and consider the possible ramifications. In this case, Candace advises us to take a step back and favor the cautious approach.

At this point, Candace continued her presentation by taking her audience on a tour of the most famous and popular social media platforms.



Candace tells us that having a YouTube channel and producing videos can be useful for the game industry professional with expertise to share.  YouTube tutorials and educational videos are fantastic ways to spread knowledge.  As game composers, we can avail ourselves of this avenue of social media outreach by producing educational videos that explore important skills, or tutorial videos that explain the use of vital game audio tools.



According to Candace, this social media platform is growing in usefulness to game industry recruiters.  Pinterest allows a user to set up a “pinboard” of relevant links that fall within a single subject of interest.



Using Facebook as our conduit for professional outreach is entirely possible, Candace assures us.  However, we have to be clear about our purpose on Facebook.  If we’re on Facebook in a professional capacity, then we have to refrain from sharing too many personal posts.  Candace warns us against diluting our message with day-to-day observations and pet peeves.  Our initial goals for our social media presence should help us make decisions about what to post.



This social media platform is Candace’s #1 tool for finding new talent. According to Candace, LinkedIn has the potential to put us on the radar of our industry colleagues, and can deliver vital information about our services to potential clients.  In her presentation, Candace advises that we complete our LinkedIn profiles as thoroughly as possible, including all the relevant information about our experience in the industry and our skills.  An added side benefit is the ability of the LinkedIn site to reformat the content of a user’s profile page into a serviceable résumé that we can then use to woo potential clients.


Candace ended her presentation by recommending the social media strategies of several of her colleagues at Naughty Dog.  Here are some of the links she provided:

Twitter:  @jack_dillon, @cgyrling

Facebook: Glauco Longhi, John Sweeney

LinkedIn: Kurt Margenau, Jason Gregory

YouTube: Glauco Longhi, Richard “Pipes” Piper.



“Feel-Good Game Sound” for the Game Music Composer

How can we define “feel-good game sound”? That’s the question that sound designer Joonas Turner attempted to answer with his recent GDC Europe talk entitled, “Oh My! That Sound Made the Game Feel Better!”  Joonas’ talk was a part of the Independent Games Summit portion of GDC Europe, which took place in Cologne, Germany, on Monday, August 3rd, 2015.

While much of Joonas’ talk focused on issues that would chiefly concern sound designers, there were several interesting points for game composers to consider.  I’ll be exploring those ideas in this blog.

Joonas is a video game sound designer and voice actor working within the E-Studio professional recording studio in Helsinki, Finland.  His game credits include Angry Birds Transformers, Broforce, and Nuclear Throne.  After briefly introducing himself, Joonas launched into his talk about creating an aural environment that “feels good” and also makes the game “feel good” to the player. He began by identifying an important consideration that should guide our efforts right from the start.

Consider design first


Joonas Turner, sound designer at E-Studio.

In his talk, Joonas urges us to first consider the overall atmosphere of the game and the main focus of the player.  Ideally, the player should be able to concentrate on gameplay to the exclusion of any distractions.  The sound of a game should complement the gameplay and deliver as much information to the player as possible.  If done perfectly, a player should be able to avoid consulting the graphical user interface in favor of the sonic cues that are delivering the same information.  In this way, the player gets to keep attention completely pinned on the playing field, staying on top of the action at hand.

Clearly, sound effects are designed to serve this purpose, and Joonas discusses a strategy for maximizing the utility of sound effects as conveyors of information… but can music also serve this purpose?  Can music deliver similar information to the player?  I think that music can do this in various ways, by using shifts in mood, or carefully-composed stingers, or other interactive techniques.  By way of these methods, music can let the player know when their health is deteriorating, or when they’re out of ammo.  Music can signal the appearance of new enemies or the successful completion of objectives.  In fact, I think that music can be as informative as sound design.
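
To picture how this might work under the hood, here’s a tiny hypothetical sketch of an event-to-cue mapping. The event names and musical responses are my own illustrative assumptions, not any particular game’s implementation:

```python
# Hypothetical sketch: music delivering gameplay information by mapping
# game events to musical responses (mood shifts or stingers). The event
# names and cue behaviors are illustrative assumptions.

MUSIC_CUES = {
    "health_low": ("shift_mood", "tense"),        # gradual layer change
    "ammo_empty": ("stinger", "warning_hit"),     # short one-shot phrase
    "enemy_spawn": ("shift_mood", "combat"),
    "objective_done": ("stinger", "victory_flourish"),
}

def respond_to(event):
    # Returns the kind of musical response and the cue to trigger, so
    # the player hears the information without checking the UI.
    return MUSIC_CUES.get(event, ("none", None))
```

In a real score, the “shift_mood” responses would typically be handled by the interactive layering system, while the stingers would be short pre-composed phrases triggered as one-shots.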

Music, sound design and voice-over: perfect together

As his GDC Europe talk proceeds, Joonas reminds us to think about how the music, sound design and voice-over will fit together within the overall frequency spectrum.  It’s important to make sure that these elements will complement each other, with frequency ranges that spread evenly across the spectrum, rather than piling up together at the low or high end.  With this in mind, Joonas suggests that the sound designer and composer should be brought together as early as possible to agree on a strategy for how these sonic elements will fit together in the game.


(Here’s where Joonas brought up the first of two controversial ideas he presented during his talk.  While I’m not sure I agree with these ideas, I think the viewpoints he expresses are probably shared amongst other sound designers in the game industry, and therefore could use some more open discussion in the game audio community.)

While composers for video games always want to create the best and most awesome music for their projects, Joonas believes that this desire is not always conducive to a good final result.  He suggests that the soundtrack albums for video games are often more exciting and musically pleasing than the actual music from the game.  With this in mind, Joonas thinks that composers should save their best efforts for the soundtrack, while structuring the actual in-game music to be simpler and less aesthetically interesting.  In this way, the music can fit more comfortably into the overall aural design.

Your sonic brand

At this point in his presentation, Joonas urges the attendees to find aural styles that will be unique to their games.  He tells the audience to avoid using a tired sonic signature in every game, such as the famous brassy “bwah” tone that became pervasively popular after its use in the movie Inception.  If you are wondering what that sounds like, just hit the button below (courtesy of web developer Dave Pedu).

In 2012, Gregory Porter (an avid movie lover and creator of YouTube videos about the movies) created a fun video illustrating just how pervasive the infamous Inception “bwah” had actually become:

In my book, A Composer’s Guide to Game Music, I discuss the concept of creating a unique sonic identity for a game in the chapter about the “Roles and Functions of Music in Games.”  In the book, I call this idea “sonic branding” (Chapter 6, page 112), wherein the composer writes such a distinctive musical motif or creates such a memorable musical atmosphere that the score becomes a part of the game’s brand.

Be Consistent

When recording music or sound design for a project, Joonas tells us that it’s important to remain consistent with our gear choices.  If a certain microphone has been used for a certain group of character voices, then that microphone should continue to be used for that purpose across the whole project.  Likewise, the same digital signal processing applications or hardware (compression, limiting, saturation, etc) should be used across the entire game, so that the aural texture remains consistent.  Carrying Joonas’ idea into the world of game music, we would find ourselves sticking with the same instrument and vocal microphones, and favoring the same reverb and signal processing settings throughout the musical score for a game.  This would ensure that the music maintained a unified texture and quality from the beginning of the game to the end.

Shorter is better


In his talk, Joonas shares his personal experience with sound effects designed to indicate a successful action – a button press that causes something to happen.  Joonas tells us that for these sounds, shorter is definitely better.  The most successful sounds feature a quick, crisp entrance followed by a swift release. A short sound designed in this way will be satisfying to trigger, and won’t become tiresome after countless repetitions.

For the composer, the closest analogy to this sort of sound effect is the musical stinger designed to be triggered when the player performs a certain action.  In order to adhere to Joonas’ philosophy, we’d compose these stingers to have assertive entrances and quick resolves, so that they would be fun for the player even when repeated many times.
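
Translated into envelope terms, a stinger following Joonas’ “shorter is better” advice might look something like this hypothetical sketch. The attack, hold and release times are illustrative assumptions, not values from any actual game:

```python
# Hypothetical sketch of a short stinger amplitude envelope: a quick,
# crisp entrance, a brief sustain, and a swift release, so the cue
# stays satisfying over countless repetitions. All timing values are
# illustrative assumptions.

def stinger_envelope(t, attack=0.01, hold=0.15, release=0.2):
    """Amplitude (0.0-1.0) of a short stinger at time t, in seconds."""
    if t < 0:
        return 0.0
    if t < attack:                      # crisp, assertive entrance
        return t / attack
    if t < attack + hold:               # brief sustain at full level
        return 1.0
    if t < attack + hold + release:     # swift resolve
        return 1.0 - (t - attack - hold) / release
    return 0.0                          # silent after roughly 0.36 seconds
```

The whole gesture lasts about a third of a second, which is the sort of brevity Joonas argues keeps a frequently-triggered sound from becoming tiresome.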

To clip or not to clip…

(This is the second of the two controversial ideas Joonas presented in his talk. Again, while I don’t necessarily agree with this, I think it’s an idea that hasn’t been expressed often and may need further discussion.)


A volume unit (VU) meter registering some high audio levels.

The common wisdom amongst audio engineers is to avoid overloading the mix.  Such overloads can produce clipping and create distortion, which deteriorates the overall sound quality of the game.  However, Joonas suggests that for intense moments during gameplay, some clipping and distortion may actually enhance the sensation of anxiety and frenetic energy that such moments seek to elicit.  According to Joonas, this enhancement can actually be a desirable outcome, and the sound designer should therefore not be afraid of such overloads and clipping during intense moments in a game.

How would this idea relate to music?  Well, we’ve probably all heard examples of successful pop music that embraces sonic overload.  Lead vocalists sometimes scream into microphones to produce overloads, or a wailing guitar riff may be recorded with lots of overload artifacts.  As a deliberate effect placed carefully for the sake of drama, such brief moments of overload can add edginess to contemporary musical genres.  However, we’ve all likely heard other examples of overloads that seem more the product of high decibel levels than of any deliberate processing. It’s important to differentiate a deliberate effect from an accidental one.  In music at least, we always want to control the final outcome of the mix, including the presence or absence of overload distortion.
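
For those of us who want that edginess while still controlling the outcome, the usual approach is to make the distortion deliberate – for example, a soft-clip waveshaper applied only where the mix calls for grit. Here’s a minimal hypothetical sketch; the drive amount is an arbitrary assumption:

```python
# Hypothetical sketch of deliberate, controlled clipping: a soft-clip
# (tanh) waveshaper bends peaks smoothly toward +/-1.0, adding
# distortion harmonics while keeping the final output level bounded.
# The drive value is an illustrative assumption.

import math

def soft_clip(sample, drive=4.0):
    # Higher drive pushes the signal harder into the curve, producing
    # more audible grit, but the output can never exceed +/-1.0.
    return math.tanh(drive * sample)
```

The point of the sketch is the contrast with accidental overload: here the amount of distortion is a mix decision we make, not a byproduct of levels getting out of hand.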


Joonas wound up his talk by urging attendees to always give priority to the elements in the sound mix that are most important.  That would be a good guiding principle for music mixing as well.  Joonas is an interesting thinker in the area of game sound design.  He can be followed at his Twitter account, @KissaKolme.  Please feel free to comment below about anything you’ve read in this blog, and let me know how you feel about the ideas we’ve discussed.  I’d love to read your thoughts!



A Composer’s Guide to Game Music, now in Japanese!



A Composer’s Guide to Game Music by Winifred Phillips, now on sale in Japanese!  Published by O’Reilly Japan.

I’m excited to share that my book, A Composer’s Guide to Game Music, was released today in Japan in its newly-published Japanese-language edition!  O’Reilly Japan has published the Japanese softcover of my book in Japan under the title, “Game Sound Production Guide: Composer Techniques for Interactive Music.”


Side-by-side, these are the covers of the two editions of the book. In Japanese, A Composer’s Guide to Game Music is titled “Game sound production guide – composer techniques for interactive music,” by Winifred Phillips.

I’m very excited that the Japanese language edition of my book has already hit #1 on the “Most Wished For” list on Amazon Japan!

The Amazon Japan "Most Wished For" list.


Coincidentally, the English-language version of A Composer’s Guide to Game Music is now #1 on the Kindle Top Rated list, too!

The Kindle “Top Rated” list on Amazon.


O’Reilly Japan is located in Tokyo, and is dedicated to translating books about technological innovation for Japanese readers.  They are a division of O’Reilly Media, a California publishing company that acts as “a chronicler and catalyst of leading-edge development, homing in on the technology trends that really matter and galvanizing their adoption by amplifying ‘faint signals’ from the alpha geeks who are creating the future.  O’Reilly publishes definitive books on computer technologies for developers, administrators, and users. Bestselling series include the legendary ‘animal books,’ Missing Manuals, Hacks, and Head First.”


From what I’ve gathered, my book – A Composer’s Guide to Game Music – is the first English language book about game music to be translated into Japanese and sold in Japan.  There are a few other books available in Japan on the subject – but they were all originally written in Japanese.  These include a book exploring game sound by the audio hardware designer and sound developer Shiomi Toshiyuki, a text on creating sound for games with the CRI ADX2 middleware by Uchida Tomoya, and a book on producing game music and sound design by the artist “polymoog” of the dance music duo ELEKTEL (pictured below, from left to right).


I’m tremendously excited about the Japanese edition of my book, and my excitement comes in large part from the venerable tradition of outstanding music in Japanese games.  From the most celebrated classic scores of such top game composers as Koji Kondo (Super Mario Bros.) and Nobuo Uematsu (Final Fantasy), to the excellent modern scores of such popular composers as Masato Kouda (Monster Hunter) and Yoko Shimomura (Kingdom Hearts), Japanese video game composers have set the creative bar very high.  I’m incredibly honored that my book will be read by both established and aspiring game composers in Japan!  I hope they’ll find some helpful information in my book, and I’m excited to contribute to the ongoing conversation about game music in the Japanese development community.

I’ve always loved Japanese game music.  In 2008, I participated in a compilation album in which successful game composers created cover versions of celebrated video game songs from classic games.  The album was called “Best of the Best: A Tribute to Game Music.”  I chose the music by Koji Kondo from Super Mario Bros., and recorded an a cappella vocal version.  It’s currently available for sale from the Sumthing Else Music Works record label, and can also be downloaded on iTunes.  You can hear the track on YouTube here:

If you’d like to learn more about the rich legacy of game music composition in Japan, you can watch an awesome free documentary series produced by the Red Bull Music Academy, entitled “Diggin’ in the Carts: A Documentary Series About Japanese Video Game Music.”  The series interviews famous game composers of Japan, which means that the interviews and narration are both in Japanese (with English subtitles).  Here’s an episode that focuses on modern accomplishments by Japanese game composers:

