Composing video game music for Virtual Reality: Diegetic versus Non-diegetic

By Winifred Phillips | Contact | Follow

So happy you’ve joined us!  I’m videogame composer Winifred Phillips.  Welcome back to our four-part discussion of the role that music plays in Virtual Reality video games! These articles are based on the presentation I gave at this year’s Game Developers Conference in San Francisco.  My talk was entitled Music in Virtual Reality (I’ve included the official description of my talk at the end of this article). If you haven’t read the previous two articles, you’ll find them here:

During my GDC presentation, I focused on three important questions for VR video game composers:

  • Do we compose our music in 3D or 2D?
  • Do we structure our music to be Diegetic or Non-Diegetic?
  • Do we focus our music on enhancing player Comfort or Performance?

While attempting to answer these questions during my GDC talk, I discussed my work on four of my own VR game projects – the Bebylon: Battle Royale arena combat game from Kite & Lightning, the Dragon Front strategy game from High Voltage Software, the Fail Factory comedy game from Armature Studio, and the Scraper: First Strike shooter/RPG from Labrodex Inc.

In these articles, I’ve been sharing the discussions and conclusions that formed the basis of my GDC talk, including numerous examples from these four VR game projects.  So now let’s look at the second of our three questions:

Do we structure our music to be Diegetic or Non-Diegetic?

Before we launch into this discussion, let’s revisit one of the examples from the previous article.  You’ll remember that we took a look at the Main Theme music I composed for the popular Dragon Front VR strategy game, in order to examine how music can best transition from a traditionally 2D stereo delivery to a 3D positional implementation.  So in this case, the big victorious anthem that I composed for Dragon Front makes its first appearance as a bombastic stereo mix directly piped into the player’s headphones, and then transitions smoothly to a spatially positioned environmental sound issuing from a small in-game radio.  Just as a reminder, let’s take another look at that:

In this example, we see how the Dragon Front theme music starts as traditional underscore (that is, a non-diegetic score), but then moves into the VR space and becomes a diegetic score – one that is understood to be present in the game world. And that brings us to the second of the three core debates at the heart of music in VR: should music in VR be diegetic or non-diegetic?
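
As an aside for the implementation-minded: here’s a rough idea of how a handoff like that can be wired up. This is a minimal, engine-agnostic sketch (in plain Python, purely for illustration), not the actual Dragon Front implementation. The same cue feeds both a head-locked 2D stereo path and a 3D emitter attached to the in-game radio, and an equal-power crossfade moves the energy from one to the other; the set_2d_gain and set_3d_gain hooks are hypothetical stand-ins for whatever your audio middleware exposes.

    import math

    class DiegeticHandoff:
        """Crossfade a music cue from a head-locked 2D stereo bus to a
        3D positional emitter (for example, an in-game radio).

        Hypothetical sketch: set_2d_gain and set_3d_gain stand in for
        whatever the audio middleware exposes (bus volumes, RTPCs, etc.)."""

        def __init__(self, set_2d_gain, set_3d_gain, fade_seconds=4.0):
            self.set_2d_gain = set_2d_gain
            self.set_3d_gain = set_3d_gain
            self.fade_seconds = fade_seconds
            self.elapsed = 0.0
            self.active = False

        def begin(self):
            # Both paths play the same cue in sync; we only move gain between them.
            self.elapsed = 0.0
            self.active = True

        def update(self, dt):
            if not self.active:
                return
            self.elapsed = min(self.elapsed + dt, self.fade_seconds)
            t = self.elapsed / self.fade_seconds            # 0 -> 1 over the fade
            # Equal-power crossfade keeps the perceived loudness roughly constant.
            self.set_2d_gain(math.cos(t * math.pi / 2.0))   # stereo underscore fades out
            self.set_3d_gain(math.sin(t * math.pi / 2.0))   # spatialized radio fades in
            if t >= 1.0:
                self.active = False

    # Example: drive the fade from a 60 fps game loop (prints the gain curve).
    if __name__ == "__main__":
        handoff = DiegeticHandoff(
            set_2d_gain=lambda g: print(f"2D bus gain -> {g:.2f}"),
            set_3d_gain=lambda g: print(f"radio gain  -> {g:.2f}"),
            fade_seconds=2.0,
        )
        handoff.begin()
        for _ in range(150):
            handoff.update(1.0 / 60.0)

The crucial detail is that both playback paths stay locked to the same musical timeline, so the listener hears one continuous piece of music that simply seems to move into the room. Now, back to the diegetic versus non-diegetic question itself.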

It’s a thorny issue. As we know, musical underscore is absolutely vital in gaming – it creates momentum, motivates players and adds emotional texture to the story and the characters. However, in VR, the idea of presence becomes paramount. We want players to feel like they are inside the fiction of an awesome VR world. So, when the non-diegetic music starts playing, we worry that players might stop and wonder, ‘where’s this music coming from? Why am I hearing it?’

The obvious solution is to make all of the music in the game diegetic – somehow, in this VR world, all music comes from in-game sources that players can see in the environment around them. Here’s an example from one of my VR projects – Bebylon: Battle Royale, from developers Kite & Lightning.

Bebylon is a great example of a completely diegetic score in VR. The whole premise hinges on immortal babies battling it out in over-the-top arena fights in a futuristic setting. Music during gameplay is represented by a group of in-game baby musicians, so the music originates from that source, and we’re able to see this happening in the VR world. So, let’s take a look at that:

Bebylon: Battle Royale proves that it’s possible to get away with a completely diegetic score, but we’d need really specific circumstances to justify it. Most games won’t be able to make this approach work. So, what then? I’ve found that there are three strategies to ease non-diegetic music into VR:

  • Keep it subtle and gradual,
  • Keep it dry and warm, and
  • Keep it both inside and outside the VR world.

So let’s start with the first strategy – subtle and gradual.

We’ve already discussed this technique in the first article in this series, when we took a look at the ambient music for Scraper, a first-person VR shooter set inside colossal skyscrapers in a futuristic city. Exploring the massive buildings in the Scraper fictional universe requires a musical soundtrack to set the tone, but introducing it so that it feels natural in VR is a challenge.

In order to address this problem, I composed the ambient music in Scraper so that it would come and go in subtle, gradual ways. As a technique for music implementation in VR, this can be an effective approach. Let’s take another look at what that was like in Scraper:
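
For anyone curious what ‘subtle and gradual’ can mean at the implementation level, here’s a small sketch of the general idea: an ambient music layer whose gain follows a long attack-and-release envelope driven by a simple game-state flag. The class name, the hooks and the timings are illustrative assumptions, not Scraper’s actual values.

    class AmbientMusicEnvelope:
        """Fade an ambient music layer in and out so gradually that the player
        never registers a hard start or stop.

        Hypothetical sketch: should_play is whatever condition the game uses
        (the player entered an exploration area, combat has ended, and so on)."""

        def __init__(self, set_gain, fade_in_seconds=12.0, fade_out_seconds=8.0):
            self.set_gain = set_gain
            self.fade_in_rate = 1.0 / fade_in_seconds    # gain change per second
            self.fade_out_rate = 1.0 / fade_out_seconds
            self.gain = 0.0

        def update(self, dt, should_play):
            # Move the gain toward 1.0 or 0.0 at a deliberately slow rate.
            if should_play:
                self.gain = min(1.0, self.gain + self.fade_in_rate * dt)
            else:
                self.gain = max(0.0, self.gain - self.fade_out_rate * dt)
            self.set_gain(self.gain)

    # Example: the ambient layer eases in over 12 seconds once exploration begins.
    if __name__ == "__main__":
        envelope = AmbientMusicEnvelope(set_gain=lambda g: None)
        for frame in range(60 * 15):
            envelope.update(1.0 / 60.0, should_play=True)
        print(f"gain after 15 seconds: {envelope.gain:.2f}")   # 1.00

Because the fade times are measured in many seconds rather than fractions of a second, the music never announces its own entrances or exits.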

While this technique works well for the ambient music, it wasn’t an option for combat. Battles in Scraper are pretty intense – the music begins with a bang and keeps on whaling away until the room is cleared of enemies. At the beginning of the project, we’d decided on a stereo music mix rather than spatialization – considering how important audio cues are to expert first-person-shooter players, we didn’t want a spatialized score to introduce any confusion. My job at that point was to figure out a way to delineate the stereo music mix from the VR world so that the player wouldn’t wonder where the music was coming from.

From here, I started thinking about proximity effect – it’s a term relating to microphone recording. You’ll notice proximity effect when someone speaks into a mike while leaning very close to it. The voice starts sounding really bassy and warm in tone, and the mike picks up a lot of the dry source signal, with less of the room acoustics coming through. When you listen with headphones to a recording with lots of proximity effect, it tends to feel like it’s inside your head. I thought – great! If the music is in our heads, we’re not going to be looking around, wondering where it’s coming from.

I recorded the music for Scraper with fairly dry acoustics, and when I mixed the music, I focused on keeping the tone warm and bassy, with a solid low end and some rich mids in the EQ spectrum. Here’s an example of how that worked in combat sequences of the Scraper VR game:
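
To make that ‘warm and bassy’ treatment a little more concrete, here’s a small sketch that boosts the low end of a stereo music stem with a low-shelf filter built from the standard biquad coefficients in the RBJ Audio EQ Cookbook. This isn’t the actual Scraper mix chain, and the function names, cutoff and gain values are arbitrary examples; it simply shows one way to emphasize a solid low end on a dry, reverb-free signal in code.

    import numpy as np
    from scipy.signal import lfilter

    def low_shelf_coeffs(sample_rate, cutoff_hz, gain_db, slope=1.0):
        """Low-shelf biquad coefficients from the RBJ Audio EQ Cookbook."""
        a_lin = 10.0 ** (gain_db / 40.0)
        w0 = 2.0 * np.pi * cutoff_hz / sample_rate
        alpha = np.sin(w0) / 2.0 * np.sqrt((a_lin + 1.0 / a_lin) * (1.0 / slope - 1.0) + 2.0)
        cos_w0, sqrt_a = np.cos(w0), np.sqrt(a_lin)

        b0 = a_lin * ((a_lin + 1.0) - (a_lin - 1.0) * cos_w0 + 2.0 * sqrt_a * alpha)
        b1 = 2.0 * a_lin * ((a_lin - 1.0) - (a_lin + 1.0) * cos_w0)
        b2 = a_lin * ((a_lin + 1.0) - (a_lin - 1.0) * cos_w0 - 2.0 * sqrt_a * alpha)
        a0 = (a_lin + 1.0) + (a_lin - 1.0) * cos_w0 + 2.0 * sqrt_a * alpha
        a1 = -2.0 * ((a_lin - 1.0) + (a_lin + 1.0) * cos_w0)
        a2 = (a_lin + 1.0) + (a_lin - 1.0) * cos_w0 - 2.0 * sqrt_a * alpha
        return np.array([b0, b1, b2]) / a0, np.array([1.0, a1 / a0, a2 / a0])

    def warm_up(stereo, sample_rate=48000):
        """Boost the shelf below ~200 Hz by about 4 dB on a (num_samples, 2) array."""
        b, a = low_shelf_coeffs(sample_rate, cutoff_hz=200.0, gain_db=4.0)
        return np.stack([lfilter(b, a, stereo[:, ch]) for ch in range(2)], axis=1)

    # Example: one second of placeholder stereo noise standing in for a music stem.
    if __name__ == "__main__":
        music = np.random.default_rng(0).standard_normal((48000, 2)) * 0.1
        warmed = warm_up(music)
        print(warmed.shape)   # (48000, 2)

In a real session this kind of shaping would happen in the DAW or the game’s audio engine rather than in a script, but the principle is the same: boost the lows, keep the signal dry, and the mix sits inside the listener’s head rather than out in the room.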

I also recorded the music of Fail Factory with dry acoustics and a warm, bassy mix – this effect is especially prevalent during the Fail Factory tutorial. The instructor zips around on a hovercraft while offering tips and guidelines, so having the music in a dry, warm mix allows it to feel closer to the player, and more separated from the spatialized sounds from the instructor. Let’s check that out:

So now let’s look at another approach, which I’ve called ‘Inside and Outside.’ If music is 3D – if it’s spatialized – we’re more likely to think it actually exists inside the fictional world. If music is 2D – if it’s a direct stereo mix – we’ll be more likely to accept it as non-diegetic, as outside the experience.

Remember the example I showed earlier from Dragon Front – when the main theme music of the game transitioned into a spatialized music source coming from inside the VR space? This is an example of music making the jump from non-diegetic to diegetic, and that can help the player accept the presence of music as a part of the VR game. Watch how players can look around in the Dragon Front hub area, locate the source of the music, and actually turn it off if they want to:
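
And here’s a tiny sketch of that ‘find it and turn it off’ behavior: a music emitter that exists at a position in the world, rolls off with distance, and exposes a toggle the player can trigger by interacting with the radio. The class name and the linear attenuation curve are hypothetical choices for illustration, not details taken from Dragon Front.

    import math
    from dataclasses import dataclass

    @dataclass
    class DiegeticMusicSource:
        """A music emitter that exists inside the game world: it has a position,
        rolls off with distance, and the player can switch it off."""
        position: tuple                 # (x, y, z) world position of the radio
        max_distance: float = 20.0      # beyond this, the music is inaudible
        playing: bool = True

        def toggle(self):
            # Called when the player interacts with the radio.
            self.playing = not self.playing

        def gain_for_listener(self, listener_position):
            if not self.playing:
                return 0.0
            dx, dy, dz = (p - q for p, q in zip(self.position, listener_position))
            distance = math.sqrt(dx * dx + dy * dy + dz * dz)
            # Simple linear rolloff; real engines typically offer log/inverse curves.
            return max(0.0, 1.0 - distance / self.max_distance)

    # Example: the player walks up to the radio, then switches it off.
    if __name__ == "__main__":
        radio = DiegeticMusicSource(position=(10.0, 0.0, 0.0))
        print(radio.gain_for_listener((0.0, 0.0, 0.0)))   # 0.5 at ten meters
        radio.toggle()
        print(radio.gain_for_listener((0.0, 0.0, 0.0)))   # 0.0 once switched off

Exposing that toggle is a small thing, but it reinforces the fiction: the music belongs to the world, and the player has the same control over it that they’d have over a real radio.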

So we’ve now discussed the second of the three important questions for video game composers creating music for VR games:

  • Do we compose our music in 3D or 2D?
  • Do we structure our music to be Diegetic or Non-Diegetic?
  • Do we focus our music on enhancing player Comfort or Performance?

We’ve contemplated what role our music should play in the VR experience – whether it should be considered a part of the fictional world or an outside commentary that shapes the player’s emotional experience.  Both roles are valid, but the choice between them is especially meaningful within the context of VR.  The next article will focus on the third of the three questions: whether music in VR should enhance player comfort or player performance.  Thanks for reading, and please feel free to leave your comments in the space below!

Music in Virtual Reality

This lecture presented ideas for creating a musical score that complements an immersive VR experience. Composer Winifred Phillips shared tips from several of her VR projects. Beginning with a historical overview of positional audio technologies, Phillips addressed several important problems facing composers in VR.

Topics included 3D versus 2D music implementation and the role of spatialized audio in a musical score for VR. The use of diegetic and non-diegetic music was explored, including methods that blur the distinction between the two categories.

The discussion also included an examination of the VIMS phenomenon (Visually Induced Motion Sickness), and the role of music in alleviating its symptoms.  Phillips’ talk offered techniques for composers and audio directors looking to utilize music in the most advantageous way within a VR project.

Takeaway

Through examples from several VR games, Phillips provided an analysis of music composition strategies that help music integrate successfully in a VR environment. The talk included concrete examples and practical advice that audience members can apply to their own games.

Intended Audience

This session provided composers and audio directors with strategies for designing music for VR. It included an overview of the history of positional sound and the VIMS problem (useful knowledge for designers).

The talk was intended to be approachable for all levels (advanced composers may better appreciate the specific composition techniques discussed).

Winifred Phillips is an award-winning video game music composer whose most recent projects are the triple-A first-person shooter Homefront: The Revolution and the Dragon Front VR game for Oculus Rift. Her credits include games in five of the most famous and popular franchises in gaming: Assassin’s Creed, LittleBigPlanet, Total War, God of War, and The Sims. She is the author of the award-winning bestseller A COMPOSER’S GUIDE TO GAME MUSIC, published by the MIT Press. As a VR game music expert, she writes frequently on the future of music in virtual reality games. Follow her on Twitter @winphillips.

Composing video game music for Virtual Reality: 3D versus 2D

Welcome!  I’m videogame composer Winifred Phillips, and this is the continuation of our four-part discussion of the role that music can play in Virtual Reality video games.  These articles are based on the presentation I gave at this year’s Game Developers Conference in San Francisco, entitled Music in Virtual Reality (I’ve included the official description of my talk at the end of this article).  If you missed the first article exploring the history and significance of positional audio, please go check that article out first.

Are you back?  Great!  Let’s continue!

During my GDC talk, I addressed three questions which are important to video game music composers working in VR:

  • Do we compose our music in 3D or 2D?
  • Do we structure our music to be Diegetic or Non-Diegetic?
  • Do we focus our music on enhancing player Comfort or Performance?

Continue reading

Composing video game music for Virtual Reality: The role of music in VR

By Winifred Phillips | Contact | Follow

Hey everybody!  I’m video game composer Winifred Phillips.  At this year’s Game Developers Conference in San Francisco, I was pleased to give a presentation entitled Music in Virtual Reality (I’ve included the official description of my talk at the end of this article). While I’ve enjoyed discussing the role of music in virtual reality in previous articles that I’ve posted here, my GDC talk gave me the opportunity to pull a lot of those ideas together and present a more concentrated exploration of the practice of music composition for VR games.  It occurred to me that such a focused discussion might be interesting to share in this forum as well. So, with that in mind, I’m excited to begin a four-part article series based on my GDC 2018 presentation!

Continue reading

Video Game Composers: The Art of Music in Virtual Reality (GDC 2018)

By Winifred Phillips | Contact | Follow

Once again, the Game Developers Conference is almost upon us!  GDC 2018 promises to be an awesome event, chock-full of great opportunities for us to learn and grow as video game music composers.  I always look forward to the comprehensive sessions on offer in the popular GDC audio track, and for the past few years I’ve been honored to be selected as a GDC speaker.  Last year I presented a talk that explored how I built suspense and tension through music I composed for such games as God of War and Homefront: The Revolution.  This year, I’m tremendously excited that I’ll be presenting the talk, “Music in Virtual Reality.” The subject matter is very close to my heart!  Throughout 2016 and 2017, I composed music for many virtual reality projects, some of which have hit retail over the past year, and some of which will be released very soon.  I’ve learned a lot about the process of composing music for a VR experience, and I’ve given a lot of thought to what makes music for VR unique.  During my GDC talk in March, I’ll be taking my audience through my experiences composing music for four very different VR games – the Bebylon: Battle Royale arena combat game from Kite & Lightning, the Dragon Front strategy game from High Voltage Software, the Fail Factory comedy game from Armature Studio, and the Scraper: First Strike shooter/RPG from Labrodex Inc.  I’ll talk about some of the top problems that came up, the solutions that were tried, and the lessons that were learned.  Virtual Reality is a brave new world for game music composers, and there will be a lot of ground for me to cover in my presentation!

In preparing my talk for GDC, I kept my focus squarely on composition techniques for VR music creation, while making sure to supply an overview of the technologies that would help place these techniques in context.  With these considerations in mind, I had to prioritize the information I intended to offer, and some interesting topics simply wouldn’t fit within the time constraints of my GDC presentation.  So I thought it would be worthwhile to include some of these extra materials in a couple of articles that would precede my talk in March.  In this article, I’ll explore some theoretical ideas from experts in the field of VR, and I’ll include some of my own musings about creative directions we might pursue with VR music composition.  In the next article, I’ll talk about some practical considerations relating to the technology of VR music.

Continue reading

GDC 2017: How video game composers can use music to build suspense

By Winifred Phillips | Contact | Follow

The Game Developers Conference is coming up soon!  Last year I presented a talk on music for mobile games (pictured above), and I’m pleased that this year I’ll be presenting the talk, “‘Homefront’ to ‘God of War’: Using Music to Build Suspense” (Wednesday, March 1st at 11am in room 3006 West Hall, Moscone Center, San Francisco).  In my talk I’ll be focusing on practical applications of techniques for video game composers and game audio folks, using my own experiences as concrete examples for exploration.  Along the way, I’ll be discussing some very compelling scholarly research on the relationship between suspense, gameplay and musical expression.  In preparing my GDC 2017 presentation, I did a lot of reading and studying about the nature of suspense in video games, the importance of suspense in gameplay design, and the role that video game music plays in regulating and elevating suspense.  There will be lots of ground to cover in my presentation!  That being said, the targeted focus of my presentation precluded me from incorporating some very interesting extra research into the importance of suspense in a more general sense… why human beings need suspense, and what purpose it serves in our lives.  I also couldn’t find the space to include everything I’d encountered regarding suspense as an element in the gaming experience.  It occurred to me that some of this could be very useful to us in our work as game makers, so I’d like to share some of these extra ideas in this article.

Continue reading

Video game composers can make you smarter! (The music of Dragon Front) Pt. 3

By Winifred Phillips | Contact | Follow

Welcome to the third (and final) article in this three-part discussion of how video game composers (like us) can make strategy gamers smarter!  We’ve been exploring the best ways that the music of game composers can help strategy gamers concentrate better and make sounder tactical decisions. During this discussion, I’ve shared my personal perspective as the composer for the popular Dragon Front strategy game for VR.

In part one, we discussed the concept of ‘music-message congruency,’ so if you haven’t read that article yet, you can read it here.  In part two, we explored the meaning of ‘cognition-enhancing tempo’ – you can read that article here.  Please make sure to read both those articles first and then come back.

Are you back?  Awesome!  Let’s launch into a discussion of the third technique for increasing the smarts of strategy gamers!

Tension-regulating affect

In psychology, the term ‘affect’ refers to emotion, particularly in terms of the way in which such emotional content is displayed.  Whether by visual or aural means, an emotion cannot be shared without some kind of ‘affect’ that serves as its mode of communication from one person to another.  When we’re happy, we smile.  When we’re angry, we frown.

Continue reading

How I got my big break as a video game music composer

I had a wonderful time last week, speaking before a lively and enthusiastic audience at the Society of Composers & Lyricists seminar, “Inside the World of Game Music.”  Organized by Greg Pliska (board member of the SCL NY), the event was moderated by steering committee member Elizabeth Rose and attended by a diverse audience of composers and music professionals.  Also, steering committee member Tom Salta joined the discussion remotely from his studio via Skype.

Towards the beginning of the evening, I was asked how I got my first big break in the game industry.  While I’d related my “big break” experience in my book, A Composer’s Guide to Game Music, it was fun sharing those memories with such a great audience, and I’ve included a video clip from that portion of the seminar.

After the event, we all headed over to O’Flanagan’s Irish Pub for great networking and good times at the official NYC SCL/Game Audio Network Guild G.A.N.G. Hang.  I especially enjoyed sharing some stories and getting to know some great people there!  Thanks to everyone who attended the SCL NYC seminar!