Composing video game music for Virtual Reality: The role of music in VR


By Winifred Phillips | Contact | Follow

Hey everybody!  I’m video game composer Winifred Phillips.  At this year’s Game Developers Conference in San Francisco, I was pleased to give a presentation entitled Music in Virtual Reality (I’ve included the official description of my talk at the end of this article). While I’ve enjoyed discussing the role of music in virtual reality in previous articles that I’ve posted here, my GDC talk gave me the opportunity to pull a lot of those ideas together and present a more concentrated exploration of the practice of music composition for VR games.  It occurred to me that such a focused discussion might be interesting to share in this forum as well. So, with that in mind, I’m excited to begin a four-part article series based on my GDC 2018 presentation!

Virtual Reality is one of the newest platforms for gaming, and it’s an exciting time to be a video game composer. We’re entering a brand new phase in the history of game development. For the audio side of the equation, it can seem like the most alien frontier we’ve yet encountered, with lots of unique challenges. Since VR first burst onto the commercial scene with the Oculus Rift in 2016, I’ve had the pleasure of composing the music for a bunch of awesome VR games. When working in VR, I’ve noticed that there are some important issues for video game music composers to address. During my GDC talk, I concentrated on three of these top questions:

  • Do we compose our music in 3D or 2D?
  • Do we structure our music to be Diegetic or Non-Diegetic?
  • Do we focus our music on enhancing player Comfort or Performance?

During my GDC talk I presented examples from some of my recent VR projects.  These are four very different VR games: the Bebylon: Battle Royale arena combat game from Kite & Lightning, the Dragon Front strategy game from High Voltage Software, the Fail Factory comedy game from Armature Studio, and the Scraper: First Strike shooter/RPG game from Labrodex Inc.  During my presentation, I shared video excerpts from these games in order to demonstrate concepts related to the role of music in VR.  In these articles, I’ll be embedding those same video clips so that we can further discuss concrete executions of some fairly abstract concepts.

But first, let’s pause so we can ask ourselves an important question. What does Virtual Reality mean for us game audio folks? How is it different from traditional game audio, and how is it the same?

Virtual Reality is all about presence, about making players feel as if they exist inside the VR space. Everything in VR works in tandem to enhance that feeling of presence, including the audio content.  Let’s first take a look at one practical example of how music functions in a VR game, and then we’ll step back and look at the bigger picture.

One of my projects over this past year was the music for Scraper, a first-person VR shooter with RPG elements set inside colossal skyscrapers in a futuristic city. While playing Scraper, when we’re not shooting at enemies, we’re exploring the massive buildings. Ambient music sets the tone for this, but how to introduce it so that it feels natural in VR?  After all, if virtual reality is all about presence, about making players feel as if they exist in a real place — where is the music coming from?  Does it also exist in the VR world, and if it doesn’t, how do we introduce it so that it doesn’t disconcert the player and interfere with the realism of the VR experience?

At the beginning of music production for Scraper, I had a long meeting with the project director, and we were particularly concerned about this issue in connection with the ambient score. So we decided that the ambient music would come and go in subtle, gradual ways. This works really well in VR, because it shifts the player’s focus away from the music when it first begins playing.  Hopefully, by the time players notice the music, it will already have been playing long enough to integrate itself into the environment in an unobtrusive way.  To that end, I composed the ambient tracks in Scraper so that they gently float into existence and then build steadily. Here’s an example of what it was like when ambient music began playing during exploration gameplay in Scraper:

And here’s another video clip in which some time has elapsed, and the ambient music is now gently fading away during continued exploration in the Scraper VR game:
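Under the hood, a gradual entrance like the one in these clips typically comes down to a gain ramp applied by the audio engine or middleware. Here’s a minimal Python sketch of an equal-power fade-in curve — the function name, the 20-second duration, and the curve choice are my own illustrative assumptions, not the actual Scraper implementation:

```python
import math

def fade_in_gain(elapsed, fade_duration):
    """Equal-power fade-in gain for an ambient music cue.

    elapsed       -- seconds since the cue was triggered
    fade_duration -- seconds over which the music should emerge
    """
    if elapsed <= 0.0:
        return 0.0
    if elapsed >= fade_duration:
        return 1.0
    # A quarter-sine curve rises smoothly from 0 to 1, keeping the
    # perceived loudness growth even across the whole fade.
    return math.sin((elapsed / fade_duration) * (math.pi / 2.0))

# The cue drifts in over a (hypothetical) 20-second fade:
for t in (0.0, 5.0, 10.0, 20.0):
    print(round(fade_in_gain(t, 20.0), 3))  # 0.0, 0.383, 0.707, 1.0
```

An equal-power curve like this avoids the mid-fade dip in perceived loudness that a straight linear ramp can produce, which helps the music emerge without calling attention to itself.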

So we’ve taken a brief look at a simple example of music implementation in VR, but the issues surrounding audio content in Virtual Reality become increasingly complex as we further consider how audio differs in a VR space.

Generally speaking, in order for audio to behave in convincing ways within VR, all sounds should emanate faithfully from their positional sources in the fully 3-dimensional VR world. Music can be spatialized as well, although decisions about music spatialization are more complicated.  We’ll be exploring that question in more detail later in this article series, but for now let’s pause to think about what 3D sound means for us in VR.  While it might seem like audio spatialization in VR presents an exceptional challenge, it’s actually a discipline with a long history in game development – including some controversies. Since it’s useful to understand how we got to this point, let’s quickly review a bit of that history.

The whole discipline of positional audio began in the 1930s with English engineer and inventor Alan Blumlein’s invention of the famous stereo audio format. We can thank Blumlein for those two very familiar sound channels: left and right. But for the purposes of our discussion, it’s more interesting to consider what Alan Blumlein initially called his newly invented technology – because he didn’t name it ‘stereo.’ He called it ‘binaural sound.’

We now understand the concept of binaural sound as a two-channel recording technique that’s much more complex and immersive than Blumlein’s simple stereo technology, but for a long time, the two terms (binaural and stereo) were used interchangeably.

Meanwhile, the late 1960s brought another big leap forward in sound spatialization. We understand that Blumlein’s stereo format allows us to localize audio content across a horizontal axis from left to right – but what about vertical positioning? The ambisonic format makes that possible. Using specialized multi-channel microphones capturing audio from all directions, an ambisonic recording can be decoded into lots of ‘virtual’ audio channels that can be spread out and localized. Unfortunately, the ambisonic format failed to become popular with consumers, so it languished in obscurity for decades.
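To make the idea of ‘virtual channels’ a little more concrete, here’s a toy Python sketch of first-order ambisonics: a mono source is encoded into the four B-format channels (W, X, Y, Z), and any number of virtual speakers can then be decoded from those same four channels. The function names and the simple projection-style decode are illustrative assumptions on my part, not drawn from any particular ambisonic toolset:

```python
import math

def encode_first_order(sample, azimuth, elevation):
    """Encode a mono sample into B-format channels (W, X, Y, Z).

    Angles are in radians.  W is the omnidirectional pressure signal;
    X, Y and Z carry the front-back, left-right and up-down components.
    """
    w = sample * (1.0 / math.sqrt(2.0))
    x = sample * math.cos(azimuth) * math.cos(elevation)
    y = sample * math.sin(azimuth) * math.cos(elevation)
    z = sample * math.sin(elevation)
    return (w, x, y, z)

def decode_to_speaker(bformat, azimuth, elevation):
    """Basic decode: project the sound field onto one virtual speaker."""
    w, x, y, z = bformat
    return 0.5 * (math.sqrt(2.0) * w
                  + x * math.cos(azimuth) * math.cos(elevation)
                  + y * math.sin(azimuth) * math.cos(elevation)
                  + z * math.sin(elevation))

# A source directly overhead decodes at full strength to an upward-facing
# virtual speaker, and at reduced strength to a front-facing one:
overhead = encode_first_order(1.0, 0.0, math.pi / 2.0)
up = decode_to_speaker(overhead, 0.0, math.pi / 2.0)
front = decode_to_speaker(overhead, 0.0, 0.0)
```

Because direction lives in the encoded channels rather than in a fixed speaker layout, the same four-channel stream can be re-decoded for headphones or for any speaker array – or rotated on the fly to follow a VR player’s head movements.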

Decades earlier, though, Blumlein’s two-channel idea had already gotten a memorable demonstration. Remember, Blumlein’s stereo gives us two audio channels that correspond with our two ears… but to localize sound, we need more than just a pair of ears. We also need… a head.  So AT&T provided one, as part of a very successful exhibit at the 1933 World’s Fair in Chicago.  Author Cheryl R. Ganz described the exhibit in her book about the fair.

“In AT&T’s most popular attraction, Oscar, a mechanical man with microphone ears, sat in a glass room surrounded by visitors wearing head receivers.  Amazed, they heard exactly what Oscar heard.  Flies buzzing, footsteps, or whispers all seemed to surround each listener.”  Due to this amazing auditory phenomenon, the ‘Oscar’ display was “by far the most popular attraction in the exhibit.”

‘Oscar’ was, in fact, a crude precursor to the dummy head binaural microphone that would later emerge.  Even though it astounded visitors to the AT&T exhibit at the World’s Fair, the basic idea behind ‘Oscar’ didn’t really take off until 1975.  That’s when the microphone manufacturer Neumann released its fully realized version of the famous dummy head binaural microphone.

The principle of the dummy head binaural microphone is simple. Sound reaches the head, interacts with the shape of the skull, the ear lobes, and the ear canals, and finally arrives at the two microphones inside. The binaural dummy head does a great job of replicating how human beings perceive sound in real life. As sound passes through the built-in resonators and filters of our heads and outer ears, all this bouncing around provides loads of data to our brains. It’s how we get both horizontal and vertical information about where sounds are coming from. This filtering process is described by the Head Related Transfer Function (HRTF). The Verge multimedia magazine produced a video that does a great job of demonstrating the principles behind HRTF – let’s take a look at that:
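In digital terms, applying an HRTF means convolving a dry sound with a pair of measured impulse responses (HRIRs), one per ear. Here’s a toy Python sketch of that mechanism; the tiny three- and four-sample ‘HRIRs’ below are invented stand-ins, not real measured data (a real HRIR is hundreds of samples long and varies with source direction):

```python
def convolve(signal, impulse_response):
    """Direct-form convolution: each input sample triggers a scaled,
    delayed copy of the impulse response."""
    out = [0.0] * (len(signal) + len(impulse_response) - 1)
    for i, s in enumerate(signal):
        for j, h in enumerate(impulse_response):
            out[i + j] += s * h
    return out

# Invented stand-in HRIRs for a source off to the listener's left:
# the left ear hears it sooner and louder, while the right ear's
# arrival is delayed and shadowed by the head.
hrir_left  = [0.9, 0.3, 0.1]
hrir_right = [0.0, 0.0, 0.5, 0.2]

dry = [1.0, 0.0, 0.0, 0.0]          # a single click
left_ear  = convolve(dry, hrir_left)
right_ear = convolve(dry, hrir_right)
```

In practice the pair of HRIRs is chosen (or interpolated) per source direction, which is what lets a VR audio engine move a sound convincingly around the listener’s head.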

So, if we’ve had binaural audio since the 70s, why aren’t we listening to everything in binaural format now? Because binaural sound requires headphones to work – it won’t translate to a speaker system. Ambisonics will work with a speaker system… but consumers didn’t warm up to the format. So, the two technologies remained obscure, while everybody embraced surround sound for 3D audio.

An illustration for a discussion of the famous surround sound format -- in this article for video game composers, Winifred Phillips explores the function that music plays in a VR experience.Lots of speakers means great positional audio, right? Except surround sound doesn’t account for the vertical axis. No height. It’s not a sphere of sound – it’s more like a hula hoop.

As we all know, video games have included all of these positional audio technologies… and yet only surround sound gained any long-term traction. Which is weird. Gamers like wearing headphones, so binaural audio should have been a perfect fit. And actually, it was… in the 1990s. And then it dropped off a cliff.  Let’s take a look at why that happened:

Aureal Semiconductor blew everybody’s minds in the nineties by releasing the A3D sound middleware for their Vortex PC sound card chipsets. The A3D middleware enabled Head Related Transfer Functions for game audio, which essentially delivered a form of simulated binaural sound. Maximum PC magazine called it one of the 100 greatest PC innovations of all time. But then in 1998, Creative Labs slapped Aureal Semiconductor with a patent infringement lawsuit. The legal fees bankrupted Aureal; Creative Labs bought the company in September 2000, quietly buried the A3D technology… and the years went by.

So now we come to the thing that changed everything: Virtual Reality! VR is all about presence. The VR world has to surround us in a genuine, believable way – so positional audio becomes hugely important. Both the binaural and ambisonic formats are enjoying a renaissance right now, as game audio experts deploy these formats to spatialize their sound content for VR. We all understand the importance of good positional audio for the sound design content of the game… but what about the music?

Over my next three articles, I’ll be exploring the role of music in VR through the examination of three important questions for VR game music composers:

  • Do we compose our music in 3D or 2D?
  • Do we structure our music to be Diegetic or Non-Diegetic?
  • Do we focus our music on enhancing player Comfort or Performance?

The next article will focus on the difference between 3D and 2D audio strategies for music implementation in VR games.  In the meantime, please feel free to leave your comments in the space below!


Music in Virtual Reality (GDC 2018 Session)

This lecture presented ideas for creating a musical score that complements an immersive VR experience. Composer Winifred Phillips shared tips from several of her VR projects. Beginning with a historical overview of positional audio technologies, Phillips addressed several important problems facing composers in VR.

Topics included 3D versus 2D music implementation, and the role of spatialized audio in a musical score for VR. The use of diegetic and non-diegetic music was explored, including methods that blur the distinction between the two categories.

The discussion also included an examination of the VIMS phenomenon (Visually Induced Motion Sickness), and the role of music in alleviating its symptoms.  Phillips’ talk offered techniques for composers and audio directors looking to utilize music in the most advantageous way within a VR project.

Takeaway

Through examples from several VR games, Phillips provided an analysis of music composition strategies that help music integrate successfully in a VR environment. The talk included concrete examples and practical advice that audience members can apply to their own games.

Intended Audience

This session provided composers and audio directors with strategies for designing music for VR. It included an overview of the history of positional sound and the VIMS problem (useful knowledge for designers).

The talk was intended to be approachable for all levels (advanced composers may better appreciate the specific composition techniques discussed).


Winifred Phillips is an award-winning video game music composer whose most recent projects are the triple-A first-person shooter Homefront: The Revolution and the Dragon Front VR game for Oculus Rift. Her credits include games in five of the most famous and popular franchises in gaming: Assassin’s Creed, LittleBigPlanet, Total War, God of War, and The Sims. She is the author of the award-winning bestseller A COMPOSER’S GUIDE TO GAME MUSIC, published by the MIT Press. As a VR game music expert, she writes frequently on the future of music in virtual reality games. Follow her on Twitter @winphillips.


Video Game Composers: The Tech of Music in Virtual Reality (GDC 2018)


By Winifred Phillips | Contact | Follow

The Game Developers Conference is almost here! I’m looking forward to giving my presentation soon on “Music in Virtual Reality” (Thursday, March 22nd at 3pm in room 3002 West Hall, Moscone Center, San Francisco).  Over the course of the last two years, I’ve composed a lot of music for virtual reality projects, some of which have already hit retail, and some of which will be getting released very soon!  As a result, I’ve spent a lot of time thinking about what role music should play in a virtual reality game. During my GDC talk in March, I’ll be taking my audience through my experiences composing music for four very different VR games – the Bebylon: Battle Royale game from Kite & Lightning, the Dragon Front strategy game from High Voltage Software, the Fail Factory comedy game from Armature Studio, and the Scraper: First Strike RPG-Shooter hybrid from Labrodex Inc.  In preparing my GDC presentation, I made sure my talk addressed some of the most important creative and technical hurdles facing video game composers working in VR.  However, time constraints ensured that some interesting info ended up ‘on the cutting room floor,’ so to speak.  So, I’ve written two articles that explore some of the best topics that didn’t make it into my GDC presentation.

My previous article focused on some abstract, creative concerns facing video game music composers and audio folks working in VR.  In this article, we’ll be turning our attention to more concrete technical issues.  Ready?  Let’s go.

New Binaural Developments

VR games currently focus on binaural audio to immerse players in the awesome soundscapes of their virtual worlds.  As we know, binaural recording techniques use two microphones, often embedded in the artificial ears of a dummy head.  By virtue of binaural recording techniques and/or binaural encoding technologies, game audio teams can plunge VR players into convincing aural worlds where sounds are spatially localized in a way that conforms with real world expectations.  The technology of binaural sound continually improves, and recently the developers of the Oculus Rift VR headset refined the quality of their VR sound with two significant upgrades.

Continue reading

Video Game Composers: The Art of Music in Virtual Reality (GDC 2018)


By Winifred Phillips | Contact | Follow

Once again, the Game Developers Conference is almost upon us!  GDC 2018 promises to be an awesome event, chock full of great opportunities for us to learn and grow as video game music composers.  I always look forward to the comprehensive sessions on offer in the popular GDC audio track, and for the past few years I’ve been honored to be selected as a GDC speaker.  Last year I presented a talk that explored how I built suspense and tension through music I composed for such games as God of War and Homefront: The Revolution.  This year, I’m tremendously excited that I’ll be presenting the talk, “Music in Virtual Reality.” The subject matter is very close to my heart!  Throughout 2016 and 2017, I composed music for many virtual reality projects, some of which have hit retail over the past year, and some of which will be released very soon.  I’ve learned a lot about the process of composing music for a VR experience, and I’ve given a lot of thought to what makes music for VR unique.  During my GDC talk in March, I’ll be taking my audience through my experiences composing music for four very different VR games – the Bebylon: Battle Royale arena combat game from Kite & Lightning, the Dragon Front strategy game from High Voltage Software, the Fail Factory comedy game from Armature Studio, and the Scraper: First Strike Shooter/RPG from Labrodex Inc.  I’ll talk about some of the top problems that came up, the solutions that were tried, and the lessons that were learned.  Virtual Reality is a brave new world for game music composers, and there will be a lot of ground for me to cover in my presentation!

In preparing my talk for GDC, I kept my focus squarely on composition techniques for VR music creation, while making sure to supply an overview of the technologies that would help place these techniques in context.  With these considerations in mind, I had to prioritize the information I intended to offer, and some interesting topics simply wouldn’t fit within the time constraints of my GDC presentation.  So I thought it would be worthwhile to include some of these extra materials in a couple of articles preceding my talk in March.  In this article, I’ll explore some theoretical ideas from experts in the field of VR, and I’ll include some of my own musings about creative directions we might pursue with VR music composition.  In the next article, I’ll talk about some practical considerations relating to the technology of VR music.

Continue reading

VRDC 2017 takeaways: VR music for the game composer

The Game Developers Conference is always an awesome opportunity for game audio experts to learn and share experiences.  I’ve given presentations at GDC for a few years now, and I’m always excited to hear about what’s new and notable in game audio.  This year, the hot topic was virtual reality.  In fact, the subject received its own dedicated sub-conference that took place concurrently with the main GDC show.  The VRDC (Virtual Reality Developers Conference) didn’t focus particularly on the audio and music side of VR, but there were a couple of notable talks on that subject.  In this article, let’s take a look at some of the more intriguing VR game music takeaways from those two talks.  Along the way, I’ll also share some of my related experience as the composer of the music of the Dragon Front VR game for the Oculus Rift.

Inside and outside

The talks we’ll be discussing in this article are entitled “Audio Adventures in VR Worlds” and “The Sound Design of Star Wars: Battlefront VR.”  Here’s a common issue that popped up in both talks:

Where should video game music be in a VR game?  Should it feel like it exists inside the VR world, weaving itself into the immersive 3D atmosphere surrounding the player?  Or should it feel like it’s somehow outside of the VR environment and is instead coasting on top of the experience, being conveyed directly to the player?  The former approach suggests a spacious and expansive musical soundscape, and the latter would feel much closer and more personal.  Is one of these approaches more effective in VR than the other?  Which choice is best?

Continue reading

Video game music systems at GDC 2017: tools and tips for composers


By video game composer Winifred Phillips | Contact | Follow

Welcome back to this three-article series that’s bringing together the ideas that were discussed in five different GDC 2017 audio talks about interactive music!  These five speakers explored discoveries they’d made while creating interactivity in the music of their own game projects.  We’re looking at these ideas side-by-side to broaden our viewpoint and gain a sense of the “bigger picture” when it comes to the leading-edge thinking for music interactivity in games. We’ve been looking at five interactive music systems discussed in these five GDC 2017 presentations:

In the first article, we examined the basic nature of these interactive systems. In the second article, we contemplated why those systems were used, with some of the inherent pros and cons of each system discussed in turn.  So now, let’s get into the nitty gritty of tools and tips for working with such interactive music systems.  If you haven’t read parts one and two of this series, please go do so now and then come back:

  1. Video game music systems at GDC 2017: what are composers using?
  2. Video game music systems at GDC 2017: pros and cons for composers

Ready?  Great!  Here we go!

Continue reading

Video game music systems at GDC 2017: pros and cons for composers


By Winifred Phillips | Contact | Follow

Welcome back to our three-article series dedicated to collecting and exploring the ideas that were discussed in five different GDC 2017 audio talks about interactive music!  These five speakers shared ideas they’d developed in the process of creating interactivity in the music of their own game projects.  We’re looking at these ideas side-by-side to cultivate a sense of the “bigger picture” when it comes to the leading-edge thinking for music interactivity in games. In the first article, we looked at the basic nature of five interactive music systems discussed in these five GDC 2017 presentations:

If you haven’t read part one of this article series, please go do that now and come back.

Okay, so let’s now contemplate some simple but important questions: why were those systems used?  What was attractive about each interactive music strategy, and what were the challenges inherent in using those systems?

Continue reading

Video game music systems at GDC 2017: what are composers using?

By video game music composer Winifred Phillips | Contact | Follow

The 2017 Game Developers Conference could be described as a densely-packed deep-dive exploration of the state-of-the-art tools and methodologies used in modern game development.  This description held especially true for the game audio track, wherein top experts in the field offered a plethora of viewpoints and advice on the awesome technical and artistic challenges of creating great sound for games. I’ve given GDC talks for the past three years now, and every year I’m amazed at the breadth and diversity of the problem-solving approaches discussed by my fellow GDC presenters.  Often I’ll emerge from the conference with the impression that we game audio folks are all “doing it our own way,” using widely divergent strategies and tools.

This year, I thought I’d write three articles to collect and explore the ideas that were discussed in five different GDC audio talks.  During their presentations, these five speakers all shared their thoughts on best practices and methods for instilling interactivity in modern game music.  By absorbing these ideas side-by-side, I thought we might gain a sense of the “bigger picture” when it comes to the current leading-edge thinking for music interactivity in games. In the first article, we’ll look at the basic nature of these interactive systems.  We’ll devote the second article to the pros and cons of each system, and in the third article we’ll look at tools and tips shared by these music interactivity experts. Along the way, I’ll also be sharing my thoughts on the subject, and we’ll take a look at musical examples from some of my own projects that demonstrate a few ideas explored in these GDC talks:

So, let’s begin with the most obvious question.  What kind of interactive music systems are game audio folks using lately?

Continue reading