VRDC 2017 takeaways: VR music for the game composer

The Game Developers Conference is always an awesome opportunity for game audio experts to learn and share experiences.  I’ve given presentations at GDC for a few years now, and I’m always excited to hear about what’s new and notable in game audio.  This year, the hot topic was virtual reality.  In fact, the subject received its own dedicated sub-conference that took place concurrently with the main GDC show.  The VRDC (Virtual Reality Developers Conference) didn’t focus particularly on the audio and music side of VR, but there were a couple of notable talks on that subject.  In this article, let’s take a look at some of the more intriguing VR game music takeaways from those two talks.  Along the way, I’ll also share some of my related experience as the composer of the music of the Dragon Front VR game for the Oculus Rift (pictured above).

Inside and outside

The talks we’ll be discussing in this article are entitled “Audio Adventures in VR Worlds” and “The Sound Design of Star Wars: Battlefront VR.”  Here’s a common issue that popped up in both talks:

Where should video game music be in a VR game?  Should it feel like it exists inside the VR world, weaving itself into the immersive 3D atmosphere surrounding the player?  Or should it feel like it’s somehow outside of the VR environment and is instead coasting on top of the experience, being conveyed directly to the player?  The former approach suggests a spacious and expansive musical soundscape, and the latter would feel much closer and more personal.  Is one of these approaches more effective in VR than the other?  Which choice is best?

These two concepts share a lot in common with the traditional categories of diegetic and non-diegetic music in entertainment media.  Diegetic music exists inside the fictional world, perceived by the characters within it, whereas non-diegetic music is inaudible to the characters and only exists for the benefit of the audience.  VR presents an interesting twist to this usually straightforward dichotomy.  When the entertainment experience is doing everything in its power to make us forget that we’re an audience, to the point where we achieve a sense of complete presence within the fictional world… what role does non-diegetic music play then?  If we can now consider ourselves as characters in the story, how do we hear music that story characters aren’t supposed to hear?

“VR goes beyond picture sync. It’s about sync of the world,” says music producer Joe Thwaites of Sony Interactive Entertainment Europe.  In his talk about the music and sound of the game PlayStation VR Worlds, Thwaites explores the relationship between music and the VR environment. “The congruency between audio and visuals is key in maintaining that idea of believability,” Thwaites asserts, “which in turn makes immersiveness, and in turn makes presence.”  In virtual reality development, the term ‘presence’ denotes the sensation of actually existing inside the virtual environment.  According to Thwaites, a strong believable relationship between the aural and visual worlds can contribute to a more satisfying VR experience.

Music inside the world

As an example, Thwaites describes an interactive music implementation that he integrated into the ‘Ocean Descent’ section of PlayStation VR Worlds.  In this portion of the game, Thwaites pulled the otherwise non-diegetic musical score more fully into the immersive world by creating an illusion that the in-game objects were reacting to the musical notes.  “There’s a part called The Jellyfish Cave, where you descend into this sea of jellyfish,” Thwaites describes.  “You get this 2D music,” he adds, “which bypasses the 3D audio plugin, so it goes straight to your ears.”  In other words, the music is recorded in a traditional stereo mix and the output is fed directly to the player’s headphones without any spatial positioning in the virtual world.  “Then, as you look around, these jellyfish light up as you look directly at them,” Thwaites goes on, “and they emit a tone in 3D in space so the music tone stays where it is in the world.”  So, these tones have been attached to specific jellyfish in the virtual world, spatially positioned to emanate from those locations, as if special portions of the non-diegetic score have suddenly leapt into the VR world and taken up residence there. “And that has this really nice effect of creating this really immersive and magical moment which is really unique to VR,” Thwaites remarks.
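To make the mechanics concrete, here’s a small, engine-agnostic sketch in Python of the gaze-triggered logic Thwaites describes: when the player’s gaze falls within a narrow cone around a jellyfish, a tone is spawned at that jellyfish’s fixed world position. This is purely illustrative — the function names, the cone threshold, and the `spawn_tone_at` callback are my assumptions, not code from PlayStation VR Worlds.

```python
import math

GAZE_THRESHOLD = 0.98  # cosine of the cone angle that counts as "looking at it"

def normalize(v):
    """Return the unit-length version of a 3D vector."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def is_gazed_at(head_pos, gaze_dir, target_pos, threshold=GAZE_THRESHOLD):
    """True if the gaze direction points (nearly) straight at the target."""
    to_target = normalize(tuple(t - h for t, h in zip(target_pos, head_pos)))
    gaze = normalize(gaze_dir)
    dot = sum(a * b for a, b in zip(gaze, to_target))
    return dot >= threshold

def update_jellyfish(head_pos, gaze_dir, jellyfish_positions, spawn_tone_at):
    """Spawn a spatialized musical tone at each jellyfish the player looks at.

    The tone is anchored to the jellyfish's world position, so it stays
    fixed in 3D space even as the player's head continues to move.
    """
    for pos in jellyfish_positions:
        if is_gazed_at(head_pos, gaze_dir, pos):
            spawn_tone_at(pos)
```

In a real engine the `spawn_tone_at` callback would hand the position to the 3D audio plugin so the tone is spatialized, while the underlying stereo score keeps playing straight to the headphones.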

So this method served to help non-diegetic music feel more natural within the VR environment.  But what happens when pure non-diegetic music is an absolute necessity?

Music outside the world

In the game Star Wars Battlefront Rogue One X-Wing VR Mission, the audio team at Criterion Games were tasked with creating an authentic audio experience in a virtual reality environment dedicated to the eternally famous and popular Star Wars franchise.  In this case, according to audio lead Jay Steen, pure non-diegetic music was a must.  “Non-diegetic means not from a source in the scene. This is how most movies and flatscreen games handle the music. So the music plays through the direct out straight to the player’s ears and we were worried from what we’d heard about non-diegetic music that it would distract from immersion,” Steen confesses. “But we actually found the opposite. Maybe that’s because you can’t have a Star Wars story without the music. You don’t feel like you’re in Star Wars until the music kicks in.”  According to Steen, the non-diegetic music worked in this circumstance because the audio team was careful to avoid repetition in the musical score.  “We didn’t reuse or loop cues that much, and due to the linear structure of the mission we could kind of get away with this,” Steen points out. “We think that helps to not break immersion.”

My perspective on using non-diegetic music in VR:

Sometimes non-diegetic music can be introduced into a VR game, and then quickly transformed into diegetic music within the immersive environment in order to enhance player presence.  In my musical score for the Dragon Front game for Oculus Rift, I composed a dramatic choral track for the opening main theme of the game.  During the game’s initial logo sequence, the music is channeled directly to the player’s ears without any spatial positioning.  However, this changes as soon as the player fully enters the initial environment (wherein the player navigates menus and prepares to enter matches).  Once the logo sequence has completed, the music makes a quick transition, from a full-bodied direct stereo mix to the player’s headphones, to a spatially localized narrow mix located to the player’s lower right.  Upon turning, players see that the music is now coming from a battered radio, which the player is free to turn on and off.  The music is now fully diegetic, existing inside the game’s fictional world.  Here’s a video showing this sequence in action:
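A handoff like this — the same cue sliding from a direct stereo feed into a spatialized emitter — usually comes down to a crossfade between two buses. Here’s a minimal Python sketch of an equal-power crossfade curve; it’s a generic illustration of the technique, not the actual Dragon Front implementation, and in practice the fade would be authored in audio middleware rather than hand-rolled.

```python
import math

def crossfade_gains(t, duration):
    """Equal-power crossfade at time t (seconds) over the given duration.

    Returns (direct_gain, spatial_gain): the direct '2D' stereo bus fades
    out as the spatialized emitter (e.g. the in-world radio) fades in.
    Because the gains trace a quarter circle (cos/sin), the summed power
    gain**2 + gain**2 stays constant, avoiding a mid-fade dip in loudness.
    """
    progress = min(max(t / duration, 0.0), 1.0)
    theta = progress * math.pi / 2.0
    return math.cos(theta), math.sin(theta)
```

At t = 0 the music is fully direct; at t = duration it is fully spatialized; midway, both gains sit near 0.707. A plain linear crossfade would sum to a noticeable volume drop at the midpoint, which is why equal-power curves are the usual choice for this kind of transition.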

Music inside and outside

While non-diegetic music can be tricky in VR, sometimes it’s an important part of the overall aesthetic.  Plus, there can be ways to integrate non-diegetic music into the spatial environment.  Joe Thwaites of Sony Europe describes an interesting combination of diegetic and non-diegetic music that was integrated into the ‘VR Luge’ section of the PlayStation VR Worlds game.  In this gameplay sequence, players ride feet-first on a luge that’s racing downhill amidst heavy vehicle traffic.  The experience was designed to be a heart-stopping thrill ride.  “So one of the experiments we did around the synchronization of the world was using a combination of diegetic and non-diegetic music to build tension as you zoomed down the hill,” Thwaites describes. “We used 3D car radios to introduce elements of percussion into the 2D soundtrack that was playing.”  In the musical score for this sequence, the non-diegetic music presented a purely percussive rhythm, but as the player passed by other cars, the music would change.  “So as you passed a car with a radio playing, an element of that 3D music would transition from the car into the 2D soundtrack.”  In this way, the in-game radio music would briefly become a part of the game’s non-diegetic score, while still conveying spatial positioning inside the 3D world.
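One way such a handoff might be parameterized is by proximity: the fraction of a percussion stem routed into the 2D soundtrack grows as the car carrying the radio approaches. The sketch below is a hypothetical illustration — the function name, the flat 2D-position simplification, and the distance range are my assumptions, not details from the actual VR Luge implementation.

```python
def percussion_blend(player_pos, car_pos, near=5.0, far=25.0):
    """Return the fraction (0..1) of a percussion stem routed to the 2D mix.

    Beyond `far` metres the stem plays only from the car's 3D radio (0.0);
    within `near` metres it sits fully in the 2D soundtrack (1.0);
    in between, it ramps linearly as the luge closes the distance.
    Positions are (x, z) pairs on the road plane for simplicity.
    """
    dx = player_pos[0] - car_pos[0]
    dz = player_pos[1] - car_pos[1]
    distance = (dx * dx + dz * dz) ** 0.5
    if distance <= near:
        return 1.0
    if distance >= far:
        return 0.0
    return (far - distance) / (far - near)
```

The complement (1.0 minus this value) would drive the gain on the car’s spatialized radio emitter, so the stem is always audible somewhere — it simply migrates between the world and the score as the player whips past.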

So in these examples from PlayStation VR Worlds and Star Wars Battlefront Rogue One X-Wing VR Mission, we see that audio teams grapple constantly with the contrasting natures of diegetic and non-diegetic music.  While it seems as though non-diegetic music has been relegated to a very traditional, non-spatially localized delivery, this may not always be the case.  Jay Steen of Criterion Games spent some time considering the possibility of delivering the non-diegetic music of his Star Wars game with a more enveloping spatial texture.  “We did do a quick experiment on it, and we found that it’s like having an orchestra sitting around you,” Steen says. “We didn’t want to evoke you sitting in the middle of an orchestral recording.  We just wanted it to sound like the movie.”  That being said, Steen doesn’t rule out the possibility of a more spatially-interesting mix for music in the future, including the use of ambisonic recordings for non-diegetic musical scores. “Ambisonic recordings of orchestras for example,” Steen speculates, “I think there’s something fun there. We haven’t experimented with it any more than that, but yeah, definitely, we’d want to try.”

Conclusion

So this concludes our look at two presentations from GDC 2017 that focused on issues that complicate music creation and implementation in virtual reality.  I hope you’ve found this interesting, and please feel free to leave a comment in the space below!

 

Winifred Phillips is an award-winning video game music composer whose most recent project is the triple-A first person shooter Homefront: The Revolution. Her credits include games in five of the most famous and popular franchises in gaming: Assassin’s Creed, LittleBigPlanet, Total War, God of War, and The Sims. She is the author of the award-winning bestseller A COMPOSER’S GUIDE TO GAME MUSIC, published by the MIT Press. As a VR game music expert, she writes frequently on the future of music in virtual reality games. Follow her on Twitter @winphillips.

Resources For Video Game Music Composers

Video game music composer Winifred Phillips, at work in her music production studio.

By Winifred Phillips | Contact | Follow

I’m pleased to announce that my book, A Composer’s Guide to Game Music, is now available in its new paperback edition! I’m excited that my book has done well enough to merit a paperback release, and I’m looking forward to getting to know a lot of new readers!  The paperback is much lighter and more portable than the hardcover.  Here’s a view of the front and back covers of the new paperback edition of my book (click the image for a bigger version if you’d like to read the back cover):

award-winning video game music composer Winifred Phillips' book, A Composer's Guide to Game Music, is now available in paperback.

As you might expect, many aspiring game composers read my book, and I’m honored that my book is a part of their hunt for the best resources to help them succeed in this very competitive business.  When I’m not working in my music studio, I like to keep up with all the great new developments in the game audio field, and I share a lot of what I learn in these articles. Keeping in mind how many of my readers are aspiring composers, I’ve made a point of devoting an article once a year to gathering the top online guidance currently available for newcomers to the game music profession.  In previous years I’ve focused solely on recommendations gleaned from the writings of game audio pros, but this time I’d like to expand that focus to include other types of resources that could be helpful.  Along the way, we’ll be taking a look at some nuggets of wisdom that have appeared on these sites.  So, let’s get started!

Continue reading

Video game music systems at GDC 2017: tools and tips for composers

Photo of video game composer Winifred Phillips, working in her music production studio on the music of the SimAnimals video game.

By video game composer Winifred Phillips | Contact | Follow

Welcome back to this three-article series that’s bringing together the ideas that were discussed in five different GDC 2017 audio talks about interactive music!  These five speakers explored discoveries they’d made while creating interactivity in the music of their own game projects.  We’re looking at these ideas side-by-side to broaden our viewpoint and gain a sense of the “bigger picture” when it comes to the leading-edge thinking for music interactivity in games. We’ve been looking at five interactive music systems discussed in these five GDC 2017 presentations:

In the first article, we examined the basic nature of these interactive systems. In the second article, we contemplated why those systems were used, with some of the inherent pros and cons of each system discussed in turn.  So now, let’s get into the nitty gritty of tools and tips for working with such interactive music systems.  If you haven’t read parts one and two of this series, please go do so now and then come back:

  1. Video game music systems at GDC 2017: what are composers using?
  2. Video game music systems at GDC 2017: pros and cons for composers

Ready?  Great!  Here we go!

Continue reading

Video game music systems at GDC 2017: pros and cons for composers

Video game composer Winifred Phillips, pictured in her music production studio working on the music of LittleBigPlanet 2 Cross Controller

By Winifred Phillips | Contact | Follow

Welcome back to our three-article series dedicated to collecting and exploring the ideas that were discussed in five different GDC 2017 audio talks about interactive music!  These five speakers shared ideas they’d developed in the process of creating interactivity in the music of their own game projects.  We’re looking at these ideas side-by-side to cultivate a sense of the “bigger picture” when it comes to the leading-edge thinking for music interactivity in games. In the first article, we looked at the basic nature of five interactive music systems discussed in these five GDC 2017 presentations:

If you haven’t read part one of this article series, please go do that now and come back.

Okay, so let’s now contemplate some simple but important questions: why were those systems used?  What was attractive about each interactive music strategy, and what were the challenges inherent in using those systems?

Continue reading

Video game music systems at GDC 2017: what are composers using?

By video game music composer Winifred Phillips | Contact | Follow

The 2017 Game Developers Conference could be described as a densely-packed deep-dive exploration of the state-of-the-art tools and methodologies used in modern game development.  This description held especially true for the game audio track, wherein top experts in the field offered a plethora of viewpoints and advice on the awesome technical and artistic challenges of creating great sound for games. I’ve given GDC talks for the past three years now (see photo), and every year I’m amazed at the breadth and diversity of the problem-solving approaches discussed by my fellow GDC presenters.  Often I’ll emerge from the conference with the impression that we game audio folks are all “doing it our own way,” using widely divergent strategies and tools.

This year, I thought I’d write three articles to collect and explore the ideas that were discussed in five different GDC audio talks.  During their presentations, these five speakers all shared their thoughts on best practices and methods for instilling interactivity in modern game music.  By absorbing these ideas side-by-side, I thought we might gain a sense of the “bigger picture” when it comes to the current leading-edge thinking for music interactivity in games. In the first article, we’ll look at the basic nature of these interactive systems.  We’ll devote the second article to the pros and cons of each system, and in the third article we’ll look at tools and tips shared by these music interactivity experts. Along the way, I’ll also be sharing my thoughts on the subject, and we’ll take a look at musical examples from some of my own projects that demonstrate a few ideas explored in these GDC talks:

So, let’s begin with the most obvious question.  What kind of interactive music systems are game audio folks using lately?

Continue reading

Variation for the video game composer: the music of Little Lords of Twilight

Pictured: video game music composer Winifred Phillips at the BKOM booth during GDC 2017.

Since one of my most recent projects, Little Lords of Twilight, became available worldwide earlier this year and was recently greenlit on the famous Steam platform, I thought I’d write this article to share some of my creative and technical process in composing the music for this game. In particular, this project presents a great opportunity to look at how compositional variation (as we understand it from music theory) can be useful for the structure of interactive music.

Developed by BKOM Studios, Little Lords of Twilight won a Best in Play Award at GDC 2017, a Best Designed Mobile App Platinum Award from the BMA Awards, a Communicator Award for Best Mobile App, and has appeared on numerous “Best of” lists, including those published by PocketGamer, Explore Gadgets, and GameInOnline.  As a player-versus-player turn-based strategy game, Little Lords of Twilight offers a unique gameplay mechanic influenced by the in-game passage of time.  Day and night cycles dramatically alter your character’s appearance and abilities. Depending on whether it is currently day or night in the game, your character will have access to a completely different complement of awesome skills and spells to wield on the battlefield.

Continue reading

Composing video game music to build suspense, part 4: drones of dread

Winifred Phillips, video game music composer, at work in her studio on the music of the original God of War.

By Winifred Phillips | Contact | Follow

Welcome to the fourth installment of my five-part article series discussing music composition techniques that heighten tension and suspense for video game projects.  These articles are based on the presentation I gave at this year’s Game Developers Conference in San Francisco, entitled Homefront to God of War: Using Music to Build Suspense.  If you haven’t read the previous three articles, you’ll find them here:

Before we move on to the next music composition technique in our suspense-building arsenal, I’d like to briefly revisit a video game project we discussed in our last article: the popular Dragon Front VR game for the Oculus Rift, developed by High Voltage Software.

Continue reading