VRDC 2017 takeaways: VR music for the game composer

The Game Developers Conference is always an awesome opportunity for game audio experts to learn and share experiences.  I’ve given presentations at GDC for a few years now, and I’m always excited to hear about what’s new and notable in game audio.  This year, the hot topic was virtual reality.  In fact, the subject received its own dedicated sub-conference that ran concurrently with the main GDC show.  The VRDC (Virtual Reality Developers Conference) didn’t focus particularly on the audio and music side of VR, but there were a couple of notable talks on that subject.  In this article, let’s take a look at some of the more intriguing VR game music takeaways from those two talks.  Along the way, I’ll also share some of my related experience as the composer of the music of the Dragon Front VR game for the Oculus Rift.

Inside and outside

The talks we’ll be discussing in this article are entitled “Audio Adventures in VR Worlds” and “The Sound Design of Star Wars: Battlefront VR.”  Here’s a common issue that popped up in both talks:

Where should video game music be in a VR game?  Should it feel like it exists inside the VR world, weaving itself into the immersive 3D atmosphere surrounding the player?  Or should it feel like it’s somehow outside the VR environment, coasting on top of the experience and conveyed directly to the player?  The former approach suggests a spacious and expansive musical soundscape, while the latter would feel much closer and more personal.  Is one of these approaches more effective in VR than the other?  Which choice is best?

These two concepts share a lot in common with the traditional categories of diegetic and non-diegetic music in entertainment media.  Diegetic music exists inside the fictional world, perceived by the characters within it, whereas non-diegetic music is inaudible to the characters and only exists for the benefit of the audience.  VR presents an interesting twist to this usually straightforward dichotomy.  When the entertainment experience is doing everything in its power to make us forget that we’re an audience, to the point where we achieve a sense of complete presence within the fictional world… what role does non-diegetic music play then?  If we can now consider ourselves as characters in the story, how do we hear music that story characters aren’t supposed to hear?

“VR goes beyond picture sync. It’s about sync of the world,” says music producer Joe Thwaites of Sony Interactive Entertainment Europe.  In his talk about the music and sound of the game PlayStation VR Worlds, Thwaites explores the relationship between music and the VR environment. “The congruency between audio and visuals is key in maintaining that idea of believability,” Thwaites asserts, “which in turn makes immersiveness, and in turn makes presence.”  In virtual reality development, the term ‘presence’ denotes the sensation of actually existing inside the virtual environment.  According to Thwaites, a strong, believable relationship between the aural and visual worlds can contribute to a more satisfying VR experience.

Music inside the world

As an example, Thwaites describes an interactive music implementation that he integrated into the ‘Ocean Descent’ section of PlayStation VR Worlds.  In this portion of the game, Thwaites pulled the otherwise non-diegetic musical score more fully into the immersive world by creating the illusion that in-game objects were reacting to the musical notes.  “There’s a part called The Jellyfish Cave, where you descend into this sea of jellyfish,” Thwaites describes.  “You get this 2D music,” he adds, “which bypasses the 3D audio plugin, so it goes straight to your ears.”  In other words, the music is a traditional stereo mix whose output is fed directly to the player’s headphones without any spatial positioning in the virtual world.  “Then, as you look around, these jellyfish light up as you look directly at them,” Thwaites goes on, “and they emit a tone in 3D in space so the music tone stays where it is in the world.”  So, these tones are attached to specific jellyfish in the virtual world, spatially positioned to emanate from those locations, as if special portions of the non-diegetic score had suddenly leapt into the VR world and taken up residence there. “And that has this really nice effect of creating this really immersive and magical moment which is really unique to VR,” Thwaites remarks.
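Thwaites doesn’t detail the implementation, but the behavior he describes could be sketched roughly as follows.  This is a hypothetical Python outline (not Sony’s actual code): the 2D music bed is assumed to be routed straight to the headphones elsewhere, and here we only decide which jellyfish should sound their spatialized tone based on the player’s gaze.

```python
import math

def _normalize(v):
    mag = math.sqrt(sum(c * c for c in v))
    return tuple(c / mag for c in v)

def is_gazed_at(head_pos, gaze_dir, target_pos, threshold=0.95):
    """True when the gaze direction points (nearly) straight at the target.
    The threshold is an illustrative value, not from the game."""
    to_target = _normalize(tuple(t - h for t, h in zip(target_pos, head_pos)))
    dot = sum(a * b for a, b in zip(_normalize(gaze_dir), to_target))
    return dot >= threshold

def update_jellyfish_tones(head_pos, gaze_dir, jellyfish):
    """Return tone-trigger events for each gazed-at jellyfish, each one
    positioned as a 3D emitter at that jellyfish's world location."""
    return [
        {"tone": j["tone"], "emit_at": j["pos"], "spatialized": True}
        for j in jellyfish
        if is_gazed_at(head_pos, gaze_dir, j["pos"])
    ]
```

The key point the sketch captures is the split Thwaites describes: one stream bypasses spatialization entirely, while the gaze-triggered tones stay anchored to world positions.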

So this method served to help non-diegetic music feel more natural within the VR environment.  But what happens when pure non-diegetic music is an absolute necessity?

Music outside the world

In the game Star Wars Battlefront Rogue One X-Wing VR Mission, the audio team at Criterion Games was tasked with creating an authentic audio experience in a virtual reality environment dedicated to the eternally famous and popular Star Wars franchise.  In this case, according to audio lead Jay Steen, pure non-diegetic music was a must.  “Non-diegetic means not from a source in the scene. This is how most movies and flatscreen games handle the music. So the music plays through the direct out straight to the player’s ears and we were worried from what we’d heard about non-diegetic music that it would distract from immersion,” Steen confesses. “But we actually found the opposite. Maybe that’s because you can’t have a Star Wars story without the music. You don’t feel like you’re in Star Wars until the music kicks in.”  According to Steen, the non-diegetic music worked in this circumstance because the audio team was careful to avoid repetition in the musical score.  “We didn’t reuse or loop cues that much, and due to the linear structure of the mission we could kind of get away with this,” Steen points out. “We think that helps to not break immersion.”
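The linear, non-looping structure Steen mentions can be modeled as a simple one-shot cue timeline.  This is a minimal hypothetical sketch (not Criterion’s system, and the cue names and times are invented): each cue fires once in mission order, and nothing ever loops back, so no cue is heard twice.

```python
def current_cue(cue_list, mission_time):
    """cue_list: [(start_seconds, cue_name), ...] sorted by start time.
    Returns the cue active at mission_time; cues play once, in order,
    with no looping."""
    active = None
    for start, name in cue_list:
        if start <= mission_time:
            active = name
        else:
            break
    return active
```

Because the mission timeline only moves forward, repetition is avoided structurally rather than by any runtime randomization.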

My perspective on using non-diegetic music in VR:

Sometimes non-diegetic music can be introduced into a VR game and then quickly transformed into diegetic music within the immersive environment in order to enhance player presence.  In my musical score for the Dragon Front game for Oculus Rift, I composed a dramatic choral track for the opening main theme of the game.  During the game’s initial logo sequence, the music is channeled directly to the player’s ears without any spatial positioning.  However, this changes as soon as the player fully enters the initial environment (wherein the player navigates menus and prepares to enter matches).  Once the logo sequence has completed, the music makes a quick transition, from a full-bodied direct stereo mix to the player’s headphones, to a spatially localized narrow mix located to the player’s lower right.  Upon turning, players see that the music is now coming from a battered radio, which the player is free to turn on and off.  The music is now fully diegetic, existing inside the game’s fictional world.  Here’s a video showing this sequence in action:
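A handoff like this is commonly implemented as an equal-power crossfade between the direct stereo bus and the spatialized emitter.  The gain math could be sketched like so; this is an illustrative sketch only (the duration is a guess, not the game’s actual transition time, and the actual audio routing is assumed to happen elsewhere):

```python
import math

def diegetic_handoff_gains(t, duration=2.0):
    """Equal-power crossfade: the direct 2D stereo bus fades out while the
    3D radio emitter fades in, keeping perceived loudness roughly constant.
    t = seconds since the logo sequence ended."""
    x = min(max(t / duration, 0.0), 1.0)
    return {
        "stereo_bus": math.cos(x * math.pi / 2),  # fades 1 -> 0
        "radio_3d": math.sin(x * math.pi / 2),    # fades 0 -> 1
    }
```

The equal-power (cosine/sine) curve is the usual choice here because the two gains squared always sum to one, avoiding the mid-transition dip a linear crossfade produces.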

Music inside and outside

While non-diegetic music can be tricky in VR, sometimes it’s an important part of the overall aesthetic.  Plus, there can be ways to integrate non-diegetic music into the spatial environment.  Joe Thwaites of Sony Europe describes an interesting combination of diegetic and non-diegetic music that was integrated into the ‘VR Luge’ section of the PlayStation VR Worlds game.  In this gameplay sequence, players ride feet-first on a luge that’s racing downhill amidst heavy vehicle traffic.  The experience was designed to be a heart-stopping thrill ride.  “So one of the experiments we did around the synchronization of the world was using a combination of diegetic and non-diegetic music to build tension as you zoomed down the hill,” Thwaites describes. “We used 3D car radios to introduce elements of percussion into the 2D soundtrack that was playing.”  In the musical score for this sequence, the non-diegetic music presented a purely percussive rhythm, but as the player passed by other cars, the music would change.  “So as you passed a car with a radio playing, an element of that 3D music would transition from the car into the 2D soundtrack.”  In this way, the in-game radio music would briefly become a part of the game’s non-diegetic score, while still conveying spatial positioning inside the 3D world.
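One plausible way to drive that car-to-soundtrack transition is a proximity-based blend: as the luge nears a car, the radio’s percussion stem migrates from the 3D emitter into the 2D mix, then migrates back as the car recedes.  This is a hypothetical sketch, not Sony’s implementation, and the capture radius is an invented value:

```python
import math

def radio_stem_blend(player_pos, car_pos, capture_radius=10.0):
    """How much of a car radio's percussion stem is folded into the 2D
    soundtrack, based on distance.  Beyond capture_radius the stem stays
    fully spatialized at the car; at zero distance it sits fully in the
    2D soundtrack."""
    dist = math.dist(player_pos, car_pos)
    into_2d = max(0.0, 1.0 - dist / capture_radius)
    return {"gain_2d": into_2d, "gain_3d": 1.0 - into_2d}
```

Driving the blend from distance (rather than a timed crossfade) means the effect scales naturally with the player’s speed past each car.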

So in these examples from PlayStation VR Worlds and Star Wars Battlefront Rogue One X-Wing VR Mission, we see that audio teams grapple constantly with the contrasting natures of diegetic and non-diegetic music.  While it seems as though non-diegetic music has been relegated to a very traditional, non-spatially localized delivery, this may not always be the case.  Jay Steen of Criterion Games spent some time considering the possibility of delivering the non-diegetic music of his Star Wars game with a more enveloping spatial texture.  “We did do a quick experiment on it, and we found that it’s like having an orchestra sitting around you,” Steen says. “We didn’t want to evoke you sitting in the middle of an orchestral recording.  We just wanted it to sound like the movie.”  That being said, Steen doesn’t rule out the possibility of a more spatially interesting mix for music in the future, including the use of ambisonic recordings for non-diegetic musical scores. “Ambisonic recordings of orchestras for example,” Steen speculates, “I think there’s something fun there. We haven’t experimented with it any more than that, but yeah, definitely, we’d want to try.”


So this concludes our look at two presentations from GDC 2017 that focused on issues that complicate music creation and implementation in virtual reality.  I hope you’ve found this interesting, and please feel free to leave a comment in the space below!


Winifred Phillips is an award-winning video game music composer whose most recent project is the triple-A first person shooter Homefront: The Revolution. Her credits include games in five of the most famous and popular franchises in gaming: Assassin’s Creed, LittleBigPlanet, Total War, God of War, and The Sims. She is the author of the award-winning bestseller A COMPOSER’S GUIDE TO GAME MUSIC, published by the MIT Press. As a VR game music expert, she writes frequently on the future of music in virtual reality games. Follow her on Twitter @winphillips.

Video game composers can make you smarter! (The music of Dragon Front) Pt. 3

Winifred Phillips, video game music composer, pictured at the GDC 2016 display for the Dragon Front virtual reality game.

By Winifred Phillips | Contact | Follow

Welcome to the third (and final) article in this three-part discussion of how video game composers (like us) can make strategy gamers smarter!  We’ve been exploring the best ways that the music of game composers can help strategy gamers concentrate better and make sounder tactical decisions. During this discussion, I’ve shared my personal perspective as the composer for the popular Dragon Front strategy game for VR.

In part one, we discussed the concept of ‘music-message congruency,’ so if you haven’t read that article yet, you can read it here.  In part two, we explored the meaning of ‘cognition-enhancing tempo’ – you can read that article here.  Please make sure to read both those articles first and then come back.

Are you back?  Awesome!  Let’s launch into a discussion of the third technique for increasing the smarts of strategy gamers!

Tension-regulating affect

In psychology, the term ‘affect’ refers to emotion, particularly in terms of the way in which such emotional content is displayed.  Whether by visual or aural means, an emotion cannot be shared without some kind of ‘affect’ that serves as its mode of communication from one person to another.  When we’re happy, we smile.  When we’re angry, we frown.

Continue reading

Video Game Music Composer: Music and Sound in VR Headphones (Part Two)

My work as a video game composer has lately included some projects for virtual reality games (more info on that in the coming months), and as a result I’ve been thinking a lot about the awesome potential of VR, and writing lots of articles on the subject.  Earlier this month I began a two-part article that focuses on the experience of the end user, and the gear with which they’ll be enjoying our video game music and audio content (you can read part one here). So, let’s now continue our discussion about the new generation of headphones designed specifically for VR!

In this article, we’ll be discussing two headphone models:

  • Entrim 4D
  • Plantronics RIG 4VR

So let’s get underway!

Entrim 4D headphones

This March at the famous SXSW convention in Austin, Samsung showed off a piece of experimental technology promising to bring a new dimension of immersion to virtual reality.  It’s designed specifically to complement their popular Samsung Gear VR device, and it works by virtue of electrodes that send electrical signals right into the wearer’s head!  As if virtual reality itself weren’t futuristic enough, now we’re talking about a device that zaps us to make the VR feel more real!  It’s called Entrim 4D.  We’re talking about it here because (among other things) Entrim 4D is a pair of audio headphones built specifically for VR.

Continue reading

Game Music Middleware, Part 4: Elias


Welcome back to my blog series that offers tutorial resources exploring game music middleware for the game music composer. I initially planned to write two blog entries on the most popular audio middleware solutions (Wwise and FMOD), but since I started this blog series, I’ve been hearing buzz about other middleware solutions, so I thought it best to expand the series to incorporate other interesting approaches to music implementation in games.  This blog will focus on a brand new middleware application called Elias, developed by Elias Software.  While not as famous as Wwise or FMOD, this new application offers some intriguing new possibilities for the creation of interactive music in games.

If you’d like to read the first three blog entries in this series, you can find them here:

Game Music Middleware, Part 1: Wwise

Game Music Middleware, Part 2: FMOD

Game Music Middleware, Part 3: Fabric


Elias stands for Elastic Lightweight Integrated Audio System.  It is developed by Kristofer Eng and Philip Bennefall for Microsoft Windows, with a Unity plugin for consoles, mobile devices and browser-based games.  What makes Elias interesting is the philosophy of its design.  Instead of designing a general audio middleware tool with some music capabilities, Eng and Bennefall decided to bypass the sound design arena completely and create a middleware tool specifically outfitted for the game music composer. The middleware comes with an authoring tool called Elias Composer’s Studio that “helps the composer to structure and manage the various themes in the game and bridges the gap between the composer and level designer to ease the music integration process.”

Here’s the introductory video for Elias, produced by Elias Software:

The interactive music system of the Elias middleware application seems to favor a Vertical Layering (or vertical re-orchestration) approach with a potentially huge number of music layers able to play in lots of combinations.  The system includes flexible options for layer triggering, including the ability to randomize the activation of the layers to keep the listening experience unpredictable during gameplay.
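In outline, a vertical-layering system with randomized activation might look like the sketch below.  This is a hypothetical Python illustration of the general technique, not Elias’s actual API or data model; the layer fields (`min_intensity`, `always_on`, `chance`) are invented for the example:

```python
import random

def choose_layers(layers, intensity, rng=None):
    """Vertical layering: pick which music stems sound at the current
    gameplay intensity.  Optional layers activate at random, so the same
    intensity level can still produce a different mix on each visit."""
    rng = rng or random.Random()
    eligible = [l for l in layers if l["min_intensity"] <= intensity]
    required = [l["name"] for l in eligible if l.get("always_on")]
    optional = [l["name"] for l in eligible
                if not l.get("always_on") and rng.random() < l.get("chance", 0.5)]
    return required + optional
```

Even this toy version shows why the combinatorics matter: with a handful of optional stems per intensity level, the number of distinct mixes grows quickly, which is what keeps the listening experience unpredictable during gameplay.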

Elias has produced a series of four tutorial videos for the Composer’s Studio authoring tool.  Here’s the first of the four tutorials:

There’s also a two-part series of tutorials about Elias produced by Dale Crowley, the founder of the game audio services company Gryphondale Studios.  Here’s the first of the two videos:

As a middleware application designed specifically to address the top needs of game music composers, Elias is certainly intriguing!  The software has so far been used in only one published game – Gauntlet, which is the latest entry in the awesome video game franchise first developed by Atari Games for arcade cabinets in 1985.  This newest entry in the franchise was developed by Arrowhead Game Studios for Windows PCs.  We can hear the Elias middleware solution in action in this gameplay video from Gauntlet:

The music of Gauntlet was composed by Erasmus Talbot.  More of his music from Gauntlet is available on his SoundCloud page.

Elias Software recently demonstrated its Elias middleware application on the expo floor of the Nordic Game 2015 conference in Malmö, Sweden (May 20-22, 2015).  Here’s a look at Elias’ booth from the expo:


Since Elias is a brand new application, I’ll be curious to see how widely it is accepted by the game audio community.  A middleware solution that focuses solely on music is definitely a unique approach!  If audio directors and audio programmers embrace Elias, then it may have the potential to give composers better tools and an easier workflow in the creation of interactive music for games.

Interview about Game Music on The Note Show!


I’m excited to share that I’ve been interviewed about my career as a game music composer and my book, A Composer’s Guide to Game Music, for the newest episode of The Note Show!

The Note Show is a terrific podcast that focuses on interviews with professionals in creative fields.  I’m very proud to have been included! Famous guests on The Note Show have included Hugo and Nebula award-winning sci-fi author David Brin, actress Kristina Anapau of the HBO series True Blood, video game designer Al Lowe (Leisure Suit Larry), actress Lisa Jakub (Mrs. Doubtfire, Independence Day), and Steven Long Mitchell and Craig Van Sickle, creators of the NBC series The Pretender.

This is my second time being interviewed on The Note Show, and I’m so glad to have been invited back!

In this interview, I talk about my work on the LittleBigPlanet and Assassin’s Creed franchises, my latest project (Total War Battles: Kingdom), how composing music for a mobile game differs from composing for consoles or PC, and how my life has changed with the publication of my book, A Composer’s Guide to Game Music.

In the podcast, we also talk about the National Indie Excellence Book Award that my book recently won, as well as the importance of optimism for an aspiring game composer.


You can listen to the entire interview here:



Here’s some official info from the creators of The Note Show:

The Creative Professional Podcast – Music & Arts Interviews

The Note Show is a creative journey where host Joshua Note returns to chat life and art with creative people across the world. We interview musicians, artists, comic book creators, novelists, directors, actors and anyone creative and bring you new people and experiences every week!  The Note Show is a Podcast for and featuring Creative Professionals from all walks of life. As long as it’s creative, it’s here on The Note Show.

The show’s host, Joshua Note, is a terrific interviewer who is also the author of a children’s book due for release in 2015.  In addition, Joshua studied classical composition and orchestration at Leeds College of Music and Leeds University, and in 2012 he produced a for-television animated series and worked on several projects for television and cinema.

Joshua Note, host of The Note Show

In his role as the host of The Note Show, Joshua asks intelligent questions about what it means to be a creative person in modern times, and his interviews are always fascinating!  My thanks to Joshua and the staff of The Note Show – I had a great time!