Game Music Middleware, Part 5: psai


This is a continuation of my blog series on the top audio middleware options for game music composers, this time focusing on the psai Interactive Music Engine for games, developed by Periscope Studio, an audio/music production house.  Initially created as a proprietary middleware solution for Periscope’s in-house musicians, the software is now available commercially to game composers.  In this blog I’ll take a quick look at psai and provide some tutorial resources that further explore the utility of this audio middleware.  If you’d like to read the first four blog entries in this series on middleware for the game composer, you can find them here:

Game Music Middleware, Part 1: Wwise

Game Music Middleware, Part 2: FMOD

Game Music Middleware, Part 3: Fabric

Game Music Middleware, Part 4: Elias

What is psai?

The name “psai” is an acronym for “Periscope Studio Audio Intelligence,” and its lowercase appearance is intentional.  Like the Elias middleware (explored in a previous installment of this blog series), psai aims to provide a specialized environment tailored to the needs of game composers.  The developers at Periscope Studio claim that psai’s “ease of use is unrivaled,” primarily because the middleware was “designed by videogame composers, who found that the approaches of conventional game audio middleware to interactive music were too complicated and not flexible enough.”  The psai music engine was originally released for PC games, with a version for the popular Unity engine following in January 2015.

psai graphical user interface

Both Elias and psai offer intuitive graphical user interfaces designed to ease the workflow of a game composer. However, unlike Elias, which focuses exclusively on a vertical layering approach to musical interactivity, the psai middleware is structured entirely around horizontal re-sequencing, with no support for vertical layering.  As I described in my book, A Composer’s Guide to Game Music, “the fundamental idea behind horizontal re-sequencing is that when composed carefully and according to certain rules, the sequence of a musical composition can be rearranged.” (Chapter 11, page 188).

Music for the psai middleware is composed in what Periscope describes as a “snippets” format, in which short chunks of music are arranged into groups that can then be triggered semi-randomly by the middleware.  The overall musical composition is called a “theme,” and the snippets represent short sections of that theme.  The snippets are assigned numbers that best represent degrees of emotional intensity (from most intense to most relaxed), and these intensity numbers help determine which of the snippets will be triggered at any given time.  Other property assignments include whether a snippet is designated as an introductory or ending segment, or whether the snippet is bundled into a “middle” group with a particular intensity designation.  Periscope cautions, “The more Middle Segments you provide, the more diversified your Theme will be. The more Middle Segments you provide for a Theme, the less repetition will occur. For a highly dynamic soundtrack make sure to provide a proper number of Segments across different levels of intensity.”
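
To make the snippet model concrete, here’s a minimal sketch of how intensity-based, semi-random snippet selection might work.  This is purely illustrative Python; the names, values, and selection logic are my own assumptions, not psai’s actual API or data format:

```python
import random

# Hypothetical snippet pool for one "theme" (illustrative only; this is
# not psai's real data format). Each snippet is a short chunk of music
# tagged with a role and an emotional-intensity value.
SNIPPETS = [
    {"name": "intro_a",   "role": "intro",  "intensity": 0.5},
    {"name": "mid_calm",  "role": "middle", "intensity": 0.2},
    {"name": "mid_tense", "role": "middle", "intensity": 0.8},
    {"name": "mid_peak",  "role": "middle", "intensity": 1.0},
    {"name": "end_a",     "role": "end",    "intensity": 0.3},
]

def pick_middle_snippet(current_intensity, last_played=None, tolerance=0.25):
    """Semi-randomly pick a middle segment near the requested intensity,
    skipping the last-played snippet to reduce audible repetition."""
    pool = [s for s in SNIPPETS
            if s["role"] == "middle"
            and s["name"] != last_played
            and abs(s["intensity"] - current_intensity) <= tolerance]
    if not pool:
        # Nothing within tolerance: fall back to the closest middle segment.
        pool = [min((s for s in SNIPPETS if s["role"] == "middle"),
                    key=lambda s: abs(s["intensity"] - current_intensity))]
    return random.choice(pool)

# As gameplay intensity rises, the engine drifts toward tenser snippets.
print(pick_middle_snippet(0.2)["name"])   # -> "mid_calm"
print(pick_middle_snippet(0.9)["name"])   # -> "mid_tense" or "mid_peak"
```

Periscope’s caution about providing plenty of Middle Segments maps directly onto this sketch: the bigger the pool at each intensity level, the less often any single snippet repeats.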

Here’s an introductory tutorial video produced by Periscope for the psai Interactive Music Engine for videogames:

Because psai only supports horizontal re-sequencing, it’s not as flexible as better-known tools such as Wwise or FMOD, which can support projects that alternate between horizontal and vertical interactivity models.  However, psai’s ease of use may prove alluring for composers who have already planned to implement a horizontal re-sequencing structure for musical interactivity.  The utility of the psai middleware also seems to depend on snippets that are quite short, as demonstrated by the tutorial video above.  Such short chunks can limit a composer’s ability to develop melodic content (as is sometimes the case in a horizontal re-sequencing model).  It would be helpful if Periscope could demonstrate psai using longer snippets, which might give us a better sense of how musical ideas can be developed within the confines of their dynamic music system.  One can imagine awesome potential for creativity with this system, if the structure can be adapted to allow for more development of musical ideas over time.

The psai middleware has been used successfully in a handful of game projects, including Black Mirror III, Lost Chronicles of Zerzura, Legends of Pegasus, Mount & Blade II – Bannerlord, and The Devil’s Men.  Here’s some gameplay video that demonstrates the music system of Legends of Pegasus:

And here is some gameplay video that demonstrates the music system of Mount & Blade II – Bannerlord:


Winifred Phillips is an award-winning video game music composer whose most recent project is the triple-A first-person shooter Homefront: The Revolution. Her credits include five of the most famous and popular franchises in video gaming: Assassin’s Creed, LittleBigPlanet, Total War, God of War, and The Sims. She is the author of the award-winning bestseller A COMPOSER’S GUIDE TO GAME MUSIC, published by the Massachusetts Institute of Technology Press. As a VR game music expert, she writes frequently on the future of music in virtual reality video games. Follow her on Twitter @winphillips.

Interactive Game Music of LittleBigPlanet 3 (Concepts from my GDC Talk)

LittleBigPlanet 3 and Beyond: Taking Your Score to Vertical Extremes

I was honored to be selected by the Game Developers Conference Advisory Board to present two talks during this year’s GDC in San Francisco earlier this month.  On Friday March 6th I presented a talk on the music system of the LittleBigPlanet franchise.  Entitled “LittleBigPlanet 3 and Beyond: Taking Your Score to Vertical Extremes,” the talk explored the Vertical Layering music system that has been employed in all of the LittleBigPlanet games (the soundtrack for LittleBigPlanet 3 is available here).  I’ve been on the LittleBigPlanet music composition team for six of their games so far, and my talk used many examples from musical compositions I created for all six of those projects.

After my talk, several audience members let me know that the section of my presentation covering the music system for the Pod menu of LittleBigPlanet 3 was particularly interesting – so I thought I’d share the concepts and examples from that part of my presentation in this blog.


That’s me, giving my GDC speech on the interactive music system of the LittleBigPlanet franchise.  Here I’m just starting the section about the Pod menu music.

The audio team at Media Molecule conceived the dynamic music system for the LittleBigPlanet franchise.  According to the franchise’s music design brief, all interactive tracks in LittleBigPlanet games must be arranged in a vertical layering system.  I discussed this type of interactive music in a blog I published last year, but I’ll recap the system briefly here as well.  In a vertical layering music system, the music is not captured in a single audio recording.  Instead, several audio recordings play in sync with one another.  Each layer of musical sound features unique content, and each represents a certain percentage of the entire musical composition.  Played all together, the layers deliver the full mix of the entire musical composition.  Played separately, they form submixes that are still satisfying and entertaining in their own right.  The music system can play all the layers either together or separately, or can combine the layers into different sets that represent a portion of the whole mix.

When implemented into gameplay, layers are often activated when the player moves into a new area.  This helps the music to feel responsive to the player’s actions.  The music seems to acknowledge the player’s progress throughout the game.  It’s important to think about the way in which individual layers may be activated, and the functions that the layers may be called upon to serve during the course of the game.
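
For readers who like to see the mechanics spelled out, here’s a toy Python model of the bookkeeping behind a vertical layering system.  It’s a sketch of the general technique, not Media Molecule’s actual code; the key idea is that every layer shares one play head, so interactivity is purely a matter of per-layer gain:

```python
class VerticalLayerMix:
    """Toy model of a vertical-layering music system: every layer shares
    one play head (so the layers stay in sync), and interactivity is
    simply a matter of which layers are audible at any moment."""

    def __init__(self, layer_names):
        self.position = 0.0                   # shared play head, in seconds
        self.gains = {name: 0.0 for name in layer_names}

    def activate(self, name, gain=1.0):
        self.gains[name] = gain               # layer joins the audible mix

    def deactivate(self, name):
        self.gains[name] = 0.0                # layer is muted, not stopped

    def audible_layers(self):
        return [n for n, g in self.gains.items() if g > 0.0]

# Hypothetical layer names, loosely echoing the Pod layers described below:
mix = VerticalLayerMix(["textures", "woodwinds", "choir", "plucks", "synths"])
mix.activate("textures")         # the base layer for the starting area
mix.activate("woodwinds")        # the player has moved into a new area
print(mix.audible_layers())      # -> ['textures', 'woodwinds']
```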


In LittleBigPlanet 3, the initial menu system for the game is called “The Pod.”  The music for the Pod is arranged in vertical layers that are activated and deactivated according to where the player is in the menu hierarchy.  All the layers can be played simultaneously, and they play in multiple combinations… however, each of the individual layers is also associated with a specific portion of the menu system, and is activated when the player enters that particular part of the menu.

Let’s take a quick tour through the layers of the Pod menu music.  I’ve embedded some short musical excerpts of each layer.  You’ll find the SoundCloud players for each layer embedded below – just click the Play buttons to listen to each excerpt.  The first layer of the Pod menu music is associated with the Main Menu, and it features some floaty, science-fiction-inspired textures and effects:

The next layer is associated with a menu labeled “My Levels,” and the music for that layer is very different.  Now, woodwinds are accompanied by a gentle harp, combining to create a homey and down-to-earth mood:

Moving on to the music layer for the “Play” menu, we find that the instrumentation now features an ethereal choir and shimmering bells, expressing a much more celestial atmosphere:

Now let’s listen to the “Adventure” menu layer, in which plucked strings and bells combine to deliver a prominent melody line:

Finally, in the music layer associated with the “Community” and “Popit” menus, we hear a quirky mix of synths and effects that hearken back to menu music from previous games in the LittleBigPlanet franchise:

As the player navigates the Pod menu system, these various music layers are activated to correspond with the player’s location within the menu hierarchy.  This sort of dynamic music triggering lies at the very heart of the Vertical Layering interactive music mechanism.


Every layer in a Vertical Layering composition can have a very distinct musical identity.  When that layer is turned off, the entire mix changes in a noticeable way.  The mix can be changed subtly…


… or it can be altered radically, with large scale activations or deactivations of layers.  Even with these kinds of dramatic changes, the musical composition retains its identity.  The same piece of music continues to play, and the player is conscious of continuing to hear the same musical composition, even though it has just altered in reaction to the circumstances of gameplay and the player’s progress.

In the Pod menu music system, the layers would change in reaction to the player’s menu navigation, which could be either slow and leisurely or brisk and purposeful.  Layer activations and deactivations would occur with smooth crossfade transitions as the player moved from one menu to another.  Now let’s take a look at a video showing some navigation through the Pod menu system, so we can hear how these musical layers behaved during actual gameplay:
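
To illustrate those crossfade transitions, here’s a small Python sketch of per-frame gain smoothing driven by a menu-to-layer map.  The menu names come from the descriptions above, but the layer shorthand and the fade logic are my own assumptions about how such a system could be built, not Media Molecule’s implementation:

```python
# Menu-to-layer map for a Pod-style menu (layer names are my own shorthand
# for the instrumentation described above; illustrative sketch only).
MENU_LAYER = {
    "main":      "sci_fi_textures",
    "my_levels": "woodwinds_harp",
    "play":      "choir_bells",
    "adventure": "plucked_strings",
    "community": "quirky_synths",
}

def crossfade_step(gains, target_layer, dt, fade_time=1.5):
    """Nudge each layer's gain toward its target by one frame's worth of
    fade. The audio itself keeps playing in sync; only the gains change."""
    step = dt / fade_time
    for layer in gains:
        target = 1.0 if layer == target_layer else 0.0
        delta = max(-step, min(step, target - gains[layer]))
        gains[layer] += delta

gains = {layer: 0.0 for layer in MENU_LAYER.values()}
gains[MENU_LAYER["main"]] = 1.0          # start on the Main Menu layer

# The player opens the "Play" menu; run a few frames of smooth crossfade:
for _ in range(30):
    crossfade_step(gains, MENU_LAYER["play"], dt=1.0 / 60)
```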

As you can see, triggering unique musical layers for different portions of the menu system helps to define them.  I hope you found this explanation of the Pod music to be interesting!  If you attended GDC but missed my talk on the interactive music of LittleBigPlanet, you’ll be able to find the entire presentation posted as a video in the GDC Vault in just a few weeks.  In the meantime, please feel free to add any comments or questions below!

GDC Flash Forward and IASIG Recommended Sessions

GDC Flash Forward


I’m happy to announce that I’ve been invited to participate in this year’s GDC Flash Forward!

This will be the fourth annual GDC Flash Forward event, which this year will kick off the main conference sessions taking place from Wednesday March 4th – Friday March 6th.  Like a big “coming attractions” show, the Flash Forward allows attendees to get a first look at sessions that have been selected as especially interesting or noteworthy by the GDC Advisory Board.  Of the more than 400 lectures, panels, tutorials and roundtables that take place during GDC Week, the GDC Advisory Board selects around 70 sessions for the Flash Forward, so I’m very pleased to have been asked to take part this year!


During the Flash Forward event at 9:30am on Wednesday March 4th, each speaker will have 30 to 45 seconds to present an enticing preview of their presentation, along with a video clip showing some of the sights that await their session attendees.  I’ll be presenting a preview of my talk, “LittleBigPlanet 3 and Beyond: Taking Your Score to Vertical Extremes,” which will take place on Friday March 6th at 10am in room 3006 West Hall.


Here’s a little more about the Flash Forward, from the official press release:

This year the hour-long session will be headlined by industry veterans Brenda Romero (Romero Games, UCSC) and Laura Fryer (Oculus VR), and they’ll be presenting their own informal take on the state of the industry before participating in what always proves to be a fun, fast-paced event that highlights some of the best GDC 2015 talks.

Flash Forward presenters are hand-picked by the GDC Advisory Board, ensuring that the session will feature an eclectic mix of speakers that represents the full breadth of the conference. Those selected will have the chance to grab attendees’ attention by taking the stage for a brief period of time — 30-45 seconds, tops — to present a rapid-fire overview of what their session is and why it’s worth checking out.

This year’s Flash Forward should be very exciting, and I’m honored to be a part of it!  If you’re attending the Game Developers Conference this year, be sure to go to the Flash Forward!  It’s sure to be a lot of fun!

IASIG Recommended Sessions

I’m also very pleased and proud that my session, “LittleBigPlanet 3 and Beyond: Taking Your Score to Vertical Extremes,” was selected by the Interactive Audio Special Interest Group (IASIG) as a Recommended Session for GDC 2015!


Here’s more about the IASIG, from their official site:

The Interactive Audio Special Interest Group (IASIG) exists to allow developers of audio software, hardware, and content to freely exchange ideas about “interactive audio”. The goal of the group is to improve the performance of interactive applications by influencing hardware and software design, as well as leveraging the combined skills of the audio community to make better tools.  The IASIG has been influential in the development of audio standards, features, and APIs for Microsoft Windows and other platforms, and has helped numerous hardware companies define their directions for the future.

I’m so honored that out of the 46 sessions in the GDC Audio Track, the Interactive Audio Special Interest Group selected my presentation as one of their 7 recommended talks!  Here’s the whole list of IASIG Recommendations:

  • Making Full Use of Orchestral Colors in Interactive Music (Wednesday 11:00-12:00, West 3002) - Jim Fowler (SCE Worldwide Studios)
  • Creating an Interactive Musical Experience for Fantasia: Music Evolved (Wednesday 14:00-15:00, West 3006) - Jeff Allen and Devon Newsom (Harmonix Music Systems)
  • BioShock Infinite: Scoring in the Sky, a Postmortem (Wednesday 17:00-18:00, West 3002) - Garry Schyman (Garry Schyman Productions)
  • Peggle Blast: Big Concepts, Small Project (Thursday 10:00-11:00, West 3006) - RJ Mattingly, Jaclyn Shumate, and Guy Whitmore (PopCap)
  • Inspiring Player Creativity in Disney Fantasia: Music Evolved (Thursday 14:00-14:30, West 3020) - Jonathan Mintz (Harmonix Music Systems)
  • LittleBigPlanet 3 and Beyond: Taking Your Score to Vertical Extremes (Friday 10:00-11:00, West 3006) - Winifred Phillips (Generations Productions LLC)
  • Where Does the Game End and the Instrument Begin? (Friday 13:30-14:30, West 3006) - Matt Boch (Harmonix Music Systems), Jon Moldover (Smule Inc.), Nick Bonardi (Ubisoft), David Young (Smule Inc.), Brian Schmidt (Brian Schmidt Studios)

GDC Audio Bootcamp


The Game Developers Conference is nearly here!  It’ll be a fantastic week of learning and inspiration from March 2nd – March 6th.  On Tuesday March 3rd from 10am – 6pm, the GDC Audio Track will be hosting the ever-popular GDC Audio Bootcamp, and I’m honored to be an Audio Bootcamp speaker this year!

This will be the 14th year of the GDC Audio Bootcamp, and I’m delighted to join the 9 other speakers presenting this year:

  • Michael Csurics, Voice Director/Writer, The Brightskull Entertainment Group
  • Damian Kastbauer, Technical Audio Lead, PopCap Games
  • Mark Kilborn, Audio Director, Raven Software
  • Richard Ludlow, Audio Director, Hexany Audio
  • Peter McConnell, Composer, Little Big Note Music
  • Daniel Olsén, Audio, Independent
  • Winifred Phillips, Composer, Generations Productions LLC
  • Brian Schmidt, Founder, Brian Schmidt Studios
  • Scott Selfon, Principal Software Engineering Lead, Microsoft
  • Jay Weinland, Head of Audio, Bungie Studios

We’ll all be talking about creative, technical and logistical concerns as they pertain to game sound.  My talk will be from 11:15am to 12:15pm, and I’ll be focusing on “Advanced Composition Techniques for Adaptive Systems.”


Here’s a description of my Audio Bootcamp talk:

Interactive music technologies have swept across the video game industry, changing the way that game music is composed, recorded, and implemented. Horizontal Resequencing and Vertical Layering have changed the way that music is integrated in the audio file format, while MIDI, MOD and generative models have changed the landscape of music data in games.  With all these changes, how do the game composer, audio director, sound designer and audio engineer address these unique challenges?  This talk will present an overview of today’s interactive music techniques, including numerous strategies for the deployment of successful interactive music structures in modern games. Included in the talk: Vertical Layering in additive and interchange systems, how resequencing methods benefit from the use of digital markers, and how traditionally linear music can be integrated into an interactive music system.
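
As a taste of one of those topics, here’s a small Python sketch of the “digital markers” idea: markers flag musically sensible transition points, and a resequencing engine defers any requested jump until the next marker arrives.  The marker positions and function names are my own illustration, not code from the talk:

```python
# Illustrative markers for one piece of music (my own example): each
# marker is a bar line where a jump to a new section sounds musical.
MARKERS = [0.0, 4.0, 8.0, 12.0, 16.0]    # transition points, in seconds

def next_transition_point(playhead, markers=MARKERS):
    """Return the first marker at or after the current play position."""
    return next((m for m in markers if m >= playhead), markers[-1])

# A transition requested mid-phrase at 9.3 seconds is deferred to the
# 12-second marker, so the new section enters on a sensible boundary.
print(next_transition_point(9.3))   # -> 12.0
```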

Right after my Bootcamp presentation, all the Audio Bootcamp presenters and attendees will head off to the ever-popular Lunchtime Surgeries.  No, the attendees won’t actually be able to crack open the minds of the presenters and see what’s going on in there, but as a metaphor, it does represent the core philosophy of this lively event.  The Lunchtime Surgeries offer attendees a chance to sit with the presenters at large roundtables and ask lots of questions.  It’s one of the most popular portions of the bootcamp, and I’ll be looking forward to it!


If you’ll be attending the GDC Audio Track, then I highly recommend the Audio Bootcamp on Tuesday, March 3rd.  Hope to see you there!