Composing video game music for Virtual Reality: Diegetic versus Non-diegetic

By Winifred Phillips | Contact | Follow

So happy you’ve joined us!  I’m videogame composer Winifred Phillips.  Welcome back to our four-part discussion of the role that music plays in Virtual Reality video games! These articles are based on the presentation I gave at this year’s Game Developers Conference in San Francisco.  My talk was entitled Music in Virtual Reality (I’ve included the official description of my talk at the end of this article). If you haven’t read the previous two articles, you’ll find them here:

During my GDC presentation, I focused on three important questions for VR video game composers:

  • Do we compose our music in 3D or 2D?
  • Do we structure our music to be Diegetic or Non-Diegetic?
  • Do we focus our music on enhancing player Comfort or Performance?

While attempting to answer these questions during my GDC talk, I discussed my work on four of my own VR game projects – the Bebylon: Battle Royale arena combat game from Kite & Lightning, the Dragon Front strategy game from High Voltage Software, the Fail Factory comedy game from Armature Studio, and the Scraper: First Strike shooter/RPG from Labrodex Inc.

In these articles, I’ve been sharing the discussions and conclusions that formed the basis of my GDC talk, including numerous examples from these four VR game projects.  So now let’s look at the second of our three questions:

Do we structure our music to be Diegetic or Non-Diegetic?

Before we launch into this discussion, let’s revisit one of the examples from the previous article.  You’ll remember that we took a look at the Main Theme music I composed for the popular Dragon Front VR strategy game, in order to examine how music can best transition from a traditionally 2D stereo delivery to a 3D positional implementation.  So in this case, the big victorious anthem that I composed for Dragon Front makes its first appearance as a bombastic stereo mix directly piped into the player’s headphones, and then transitions smoothly to a spatially positioned environmental sound issuing from a small in-game radio.  Just as a reminder, let’s take another look at that:

In this example, we see how the Dragon Front theme music starts as traditional underscore (that is, a non-diegetic score), but then moves into the VR space and becomes a diegetic score – one that is understood to be present in the game world. And that brings us to the second of the three core debates at the heart of music in VR: should music in VR be diegetic or non-diegetic?

It’s a thorny issue. As we know, musical underscore is absolutely vital in gaming – it creates momentum, motivates players and adds emotional texture to the story and the characters. However, in VR, the idea of presence becomes paramount. We want players to feel like they are inside the fiction of an awesome VR world. So, when the non-diegetic music starts playing, we worry that players might stop and wonder, ‘where’s this music coming from? Why am I hearing it?’

The obvious solution is to make all of the music in the game diegetic – somehow, in this VR world, all music comes from in-game sources that players can see in the environment around them. Here’s an example from one of my VR projects – Bebylon: Battle Royale, from developers Kite & Lightning.

Bebylon is a great example of a completely diegetic score in VR. The whole premise hinges on immortal babies battling it out in over-the-top arena fights in a futuristic setting. Music during gameplay is represented by a group of in-game baby musicians, so the music originates from that source, and we’re able to see this happening in the VR world. So, let’s take a look at that:

Bebylon: Battle Royale proves that it’s possible to get away with a completely diegetic score, but we’d need really specific circumstances to justify it. Most games won’t be able to make this approach work. So, what then? I’ve found that there are three strategies to ease non-diegetic music into VR:

  • Keep it subtle and gradual,
  • Keep it dry and warm, and
  • Keep it both inside and outside the VR world.

So let’s start with the first strategy – subtle and gradual.

We’ve already discussed this technique in the first article in this series, when we took a look at the ambient music for Scraper, a first-person VR shooter set inside colossal skyscrapers in a futuristic city. Exploring the massive buildings in the Scraper fictional universe requires a musical soundtrack to set the tone, but introducing it so that it feels natural in VR is a challenge.

In order to address this problem, I composed the ambient music in Scraper so that it would come and go in subtle, gradual ways. As a technique for music implementation in VR, this can be an effective approach. Let’s take another look at what that was like in Scraper:

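As a rough illustration of the idea (this is my own sketch, not Scraper’s actual implementation, and the function and parameter names are hypothetical), a “subtle and gradual” entrance can be modeled as a slow equal-power fade applied to the music bus over many seconds:

```python
import math

def fade_gain(elapsed_s, fade_in_s=20.0):
    """Equal-power fade-in gain for an ambient music bus.

    A long fade time (tens of seconds) keeps the music's entrance
    below the threshold where players consciously notice it starting.
    """
    t = max(0.0, min(1.0, elapsed_s / fade_in_s))
    return math.sin(t * math.pi / 2.0)  # 0.0 = silent, 1.0 = full level

# Sampling the envelope once per second across a 20-second fade:
envelope = [fade_gain(s) for s in range(0, 21)]
```

The same curve, run in reverse, lets the music withdraw just as unobtrusively when a scene winds down.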
While this technique works well for the ambient music, it wasn’t an option for combat. Battles in Scraper are pretty intense – the music begins with a bang and keeps on whaling away until the room is cleared of enemies. At the beginning of the project, we’d decided on a stereo music mix rather than spatialization – considering how important audio cues are to expert first-person-shooter players, we didn’t want a spatialized score to introduce any confusion. My job at that point was to figure out a way to separate the stereo music mix from the VR world so that the player wouldn’t wonder where the music was coming from.

From here, I started thinking about proximity effect – it’s a term relating to microphone recording. You’ll notice proximity effect when someone speaks into a mike while leaning very close to it. The voice starts sounding really bassy and warm in tone, and the mike picks up a lot of the dry source signal, with less of the room acoustics coming through. When you listen with headphones to a recording with lots of proximity effect, it tends to feel like it’s inside your head. I thought – great! If the music is in our heads, we’re not going to be looking around, wondering where it’s coming from.

I recorded the music for Scraper with fairly dry acoustics, and when I mixed the music, I focused on keeping the tone warm and bassy, with a solid low end and some rich mids in the EQ spectrum. Here’s an example of how that worked in combat sequences of the Scraper VR game:
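To make the “warm and bassy” idea concrete, here’s a hedged sketch of one standard way to get that tilt in a mix: a biquad low-shelf boost, using the widely published Audio EQ Cookbook coefficient formulas. This is my own illustration (the corner frequency and boost amount are assumptions), not the actual signal chain used on Scraper:

```python
import math

def low_shelf_coeffs(fs=48000.0, f0=250.0, gain_db=6.0, slope=1.0):
    """Biquad low-shelf coefficients (Audio EQ Cookbook formulas).

    Boosting gain_db below roughly f0 Hz thickens the low end,
    producing the 'warm, bassy' character described above.
    Returns (b0, b1, b2, a0, a1, a2).
    """
    A = 10.0 ** (gain_db / 40.0)
    w0 = 2.0 * math.pi * f0 / fs
    cw, sw = math.cos(w0), math.sin(w0)
    alpha = sw / 2.0 * math.sqrt((A + 1.0 / A) * (1.0 / slope - 1.0) + 2.0)
    b0 = A * ((A + 1) - (A - 1) * cw + 2 * math.sqrt(A) * alpha)
    b1 = 2 * A * ((A - 1) - (A + 1) * cw)
    b2 = A * ((A + 1) - (A - 1) * cw - 2 * math.sqrt(A) * alpha)
    a0 = (A + 1) + (A - 1) * cw + 2 * math.sqrt(A) * alpha
    a1 = -2 * ((A - 1) + (A + 1) * cw)
    a2 = (A + 1) + (A - 1) * cw - 2 * math.sqrt(A) * alpha
    return b0, b1, b2, a0, a1, a2

def gain_at(coeffs, z):
    """Filter magnitude at a point on the unit circle (z=1: DC, z=-1: Nyquist)."""
    b0, b1, b2, a0, a1, a2 = coeffs
    return abs((b0 + b1 / z + b2 / z ** 2) / (a0 + a1 / z + a2 / z ** 2))
```

With a 6 dB shelf, the filter roughly doubles the level of the lows while leaving the top of the spectrum untouched – the EQ-side half of the proximity-effect illusion (the other half being the dry recording itself).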

I also recorded the music of Fail Factory with dry acoustics and a warm, bassy mix – this effect is especially prevalent during the Fail Factory tutorial. The instructor zips around on a hovercraft while offering tips and guidelines, so having the music in a dry, warm mix allows it to feel closer to the player, and more separated from the spatialized sounds of the instructor. Let’s check that out:

So now let’s look at another approach, which I’ve called ‘Inside and Outside.’ If music is 3D – if it’s spatialized – we’re more likely to think it actually exists inside the fictional world. If music is 2D – if it’s a direct stereo mix – we’ll be more likely to accept it as non-diegetic, as outside the experience.
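In engine terms, that hand-off can be a crossfade between a head-locked 2D stereo bus and a spatialized 3D emitter playing the same cue. Here’s a minimal sketch of the gain math – the names are hypothetical, and a real project would route this through audio middleware rather than hand-rolled code:

```python
import math

def inside_outside_gains(blend):
    """Equal-power crossfade between a non-diegetic 2D music bus
    and a diegetic, spatialized 3D emitter.

    blend = 0.0 -> pure 2D underscore (outside the world)
    blend = 1.0 -> pure 3D emitter   (inside the world)
    Equal-power curves keep perceived loudness roughly constant
    as the music moves between the two states.
    """
    blend = max(0.0, min(1.0, blend))
    gain_2d = math.cos(blend * math.pi / 2.0)
    gain_3d = math.sin(blend * math.pi / 2.0)
    return gain_2d, gain_3d
```

Sweeping the blend value from 0 to 1 over a few seconds gives the kind of smooth non-diegetic-to-diegetic transition described here.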

Remember the example I showed earlier from Dragon Front – when the main theme music of the game transitioned into a spatialized music source coming from inside the VR space? This is an example of music making the jump from non-diegetic to diegetic, and that can help the player accept the presence of music as a part of the VR game. Watch how players can look around in the Dragon Front hub area, locate the source of the music, and actually turn it off if they want to:

So we’ve now discussed the second of the three important questions for video game composers creating music for VR games:

  • Do we compose our music in 3D or 2D?
  • Do we structure our music to be Diegetic or Non-Diegetic?
  • Do we focus our music on enhancing player Comfort or Performance?

We’ve contemplated what role our music should play in the VR experience – whether it should be considered a part of the fictional world or an outside commentary that shapes the player’s emotional experience.  Both roles are valid, but the choice between them is especially meaningful within the context of VR.  The next article will focus on the third of the three questions: whether music in VR should enhance player comfort or player performance.  Thanks for reading, and please feel free to leave your comments in the space below!

Music in Virtual Reality

This lecture presented ideas for creating a musical score that complements an immersive VR experience. Composer Winifred Phillips shared tips from several of her VR projects. Beginning with a historical overview of positional audio technologies, Phillips addressed several important problems facing composers in VR.

Topics included 3D versus 2D music implementation, and the role of spatialized audio in a musical score for VR. The use of diegetic and non-diegetic music was explored, including methods that blur the distinction between the two categories.

The discussion also included an examination of the VIMS phenomenon (Visually Induced Motion Sickness), and the role of music in alleviating its symptoms.  Phillips’ talk offered techniques for composers and audio directors looking to utilize music in the most advantageous way within a VR project.

Takeaway

Through examples from several VR games, Phillips provided an analysis of music composition strategies that help music integrate successfully in a VR environment. The talk included concrete examples and practical advice that audience members can apply to their own games.

Intended Audience

This session provided composers and audio directors with strategies for designing music for VR. It included an overview of the history of positional sound and the VIMS problem (useful knowledge for designers).

The talk was intended to be approachable for all levels (advanced composers may better appreciate the specific composition techniques discussed).

Winifred Phillips is an award-winning video game music composer whose most recent projects are the triple-A first person shooter Homefront: The Revolution and the Dragon Front VR game for Oculus Rift. Her credits include games in five of the most famous and popular franchises in gaming: Assassin’s Creed, LittleBigPlanet, Total War, God of War, and The Sims. She is the author of the award-winning bestseller A COMPOSER’S GUIDE TO GAME MUSIC, published by the MIT Press. As a VR game music expert, she writes frequently on the future of music in virtual reality games. Follow her on Twitter @winphillips.

Composing video game music for Virtual Reality: 3D versus 2D

Welcome!  I’m videogame composer Winifred Phillips, and this is the continuation of our four-part discussion of the role that music can play in Virtual Reality video games.  These articles are based on the presentation I gave at this year’s Game Developers Conference in San Francisco, entitled Music in Virtual Reality (I’ve included the official description of my talk at the end of this article).  If you missed the first article exploring the history and significance of positional audio, please go check that article out first.

Are you back?  Great!  Let’s continue!

During my GDC talk, I addressed three questions which are important to video game music composers working in VR:

  • Do we compose our music in 3D or 2D?
  • Do we structure our music to be Diegetic or Non-Diegetic?
  • Do we focus our music on enhancing player Comfort or Performance?

Continue reading

Composing video game music for Virtual Reality: The role of music in VR

By Winifred Phillips | Contact | Follow

Hey everybody!  I’m video game composer Winifred Phillips.  At this year’s Game Developers Conference in San Francisco, I was pleased to give a presentation entitled Music in Virtual Reality (I’ve included the official description of my talk at the end of this article). While I’ve enjoyed discussing the role of music in virtual reality in previous articles that I’ve posted here, the talk I gave at GDC gave me the opportunity to pull a lot of those ideas together and present a more concentrated exploration of the practice of music composition for VR games.  It occurred to me that such a focused discussion might be interesting to share in this forum as well. So, with that in mind, I’m excited to begin a four-part article series based on my GDC 2018 presentation!

Continue reading

Video Game Composers: The Tech of Music in Virtual Reality (GDC 2018)

By Winifred Phillips | Contact | Follow

The Game Developers Conference is almost here! I’m looking forward to giving my presentation soon on “Music in Virtual Reality” (Thursday, March 22nd at 3pm in room 3002 West Hall, Moscone Center, San Francisco).  Over the course of the last two years, I’ve composed a lot of music for virtual reality projects, some of which have already hit retail, and some of which will be getting released very soon!  As a result, I’ve spent a lot of time thinking about what role music should play in a virtual reality game. During my GDC talk in March, I’ll be taking my audience through my experiences composing music for four very different VR games – the Bebylon: Battle Royale game from Kite & Lightning, the Dragon Front strategy game from High Voltage Software, the Fail Factory comedy game from Armature Studio, and the Scraper: First Strike RPG-Shooter hybrid from Labrodex Inc.  In preparing my GDC presentation, I made sure my talk addressed some of the most important creative and technical hurdles facing video game composers working in VR.  However, time constraints ensured that some interesting info ended up ‘on the cutting room floor,’ so to speak.  So, I’ve written two articles that explore some of the best topics that didn’t make it into my GDC presentation.

My previous article focused on some abstract, creative concerns facing video game music composers and audio folks working in VR.  In this article, we’ll be turning our attention to more concrete technical issues.  Ready?  Let’s go.

New Binaural Developments

VR games currently focus on binaural audio to immerse players in the awesome soundscapes of their virtual worlds.  As we know, binaural recording techniques use two microphones, often embedded in the artificial ears of a dummy head (pictured right).  Through these recording techniques and/or binaural encoding technologies, game audio teams can plunge VR players into convincing aural worlds where sounds are spatially localized in a way that conforms with real world expectations.  The technology of binaural sound continually improves, and recently the expert developers of the Oculus Rift VR headset have refined the quality of their VR sound with two significant upgrades.
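To give a feel for what binaural spatialization is doing under the hood, here’s a small illustrative sketch (my own, not Oculus’s actual code) of the interaural time difference – one of the localization cues binaural rendering reproduces – using the classic Woodworth spherical-head approximation:

```python
import math

def itd_seconds(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
    """Interaural time difference via the Woodworth spherical-head model.

    ITD = (r / c) * (theta + sin(theta)) for a source at azimuth theta,
    where 0 degrees is straight ahead and 90 degrees is directly to one
    side. The brain uses this sub-millisecond arrival-time difference
    between the ears to localize sounds laterally.
    """
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

# A source directly to one side arrives roughly 0.66 ms earlier
# at the near ear than at the far ear (for an average head size).
```

Real binaural engines go much further – full HRTF filtering captures level and spectral differences as well as timing – but this delay cue is a good intuition for why spatialized sound feels anchored in the world.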

Continue reading

Video Game Composers: The Art of Music in Virtual Reality (GDC 2018)

By Winifred Phillips | Contact | Follow

Once again, the Game Developers Conference is almost upon us!  GDC 2018 promises to be an awesome event, chock full of great opportunities for us to learn and grow as video game music composers.  I always look forward to the comprehensive sessions on offer in the popular GDC audio track, and for the past few years I’ve been honored to be selected as a GDC speaker.  Last year I presented a talk that explored how I built suspense and tension through music I composed for such games as God of War and Homefront: The Revolution.  This year, I’m tremendously excited that I’ll be presenting the talk, “Music in Virtual Reality.” The subject matter is very close to my heart!  Throughout 2016 and 2017, I composed music for many virtual reality projects, some of which have hit retail over the past year, and some of which will be released very soon.  I’ve learned a lot about the process of composing music for a VR experience, and I’ve given a lot of thought to what makes music for VR unique.  During my GDC talk in March, I’ll be taking my audience through my experiences composing music for four very different VR games – the Bebylon: Battle Royale arena combat game from Kite & Lightning, the Dragon Front strategy game from High Voltage Software, the Fail Factory comedy game from Armature Studio, and the Scraper: First Strike Shooter/RPG from Labrodex Inc.  I’ll talk about some of the top problems that came up, the solutions that were tried, and the lessons that were learned.  Virtual Reality is a brave new world for game music composers, and there will be a lot of ground for me to cover in my presentation!

In preparing my talk for GDC, I kept my focus squarely on composition techniques for VR music creation, while making sure to supply an overview of the technologies that would help place these techniques in context.  With these considerations in mind, I had to prioritize the information I intended to offer, and some interesting topics simply wouldn’t fit within the time constraints of my GDC presentation.  With that in mind, I thought it would be worthwhile to include some of these extra materials in a couple of articles that would precede my talk in March.  In this article, I’ll explore some theoretical ideas from experts in the field of VR, and I’ll include some of my own musings about creative directions we might pursue with VR music composition.  In the next article, I’ll talk about some practical considerations relating to the technology of VR music.

Continue reading

Video game music systems at GDC 2017: tools and tips for composers

By video game composer Winifred Phillips | Contact | Follow

Welcome back to this three article series that’s bringing together the ideas that were discussed in five different GDC 2017 audio talks about interactive music!  These five speakers explored discoveries they’d made while creating interactivity in the music of their own game projects.  We’re looking at these ideas side-by-side to broaden our viewpoint and gain a sense of the “bigger picture” when it comes to the leading-edge thinking for music interactivity in games. We’ve been looking at five interactive music systems discussed in these five GDC 2017 presentations:

In the first article, we examined the basic nature of these interactive systems. In the second article, we contemplated why those systems were used, with some of the inherent pros and cons of each system discussed in turn.  So now, let’s get into the nitty gritty of tools and tips for working with such interactive music systems.  If you haven’t read parts one and two of this series, please go do so now and then come back:

  1. Video game music systems at GDC 2017: what are composers using?
  2. Video game music systems at GDC 2017: pros and cons for composers

Ready?  Great!  Here we go!

Continue reading

Video game music systems at GDC 2017: pros and cons for composers

By Winifred Phillips | Contact | Follow

Welcome back to our three article series dedicated to collecting and exploring the ideas that were discussed in five different GDC 2017 audio talks about interactive music!  These five speakers shared ideas they’d developed in the process of creating interactivity in the music of their own game projects.  We’re looking at these ideas side-by-side to cultivate a sense of the “bigger picture” when it comes to the leading-edge thinking for music interactivity in games. In the first article, we looked at the basic nature of five interactive music systems discussed in these five GDC 2017 presentations:

If you haven’t read part one of this article series, please go do that now and come back.

Okay, so let’s now contemplate some simple but important questions: why were those systems used?  What was attractive about each interactive music strategy, and what were the challenges inherent in using those systems?

Continue reading