VR for the Game Music Composer: Audio for VR Platforms

Winifred Phillips, pictured here working in her music production studio on the music for the Scraper: First Strike game, developed for popular VR gaming platforms (PSVR, Oculus Rift, HTC Vive).

By Winifred Phillips | Contact | Follow

Hello there!  I’m video game music composer Winifred Phillips.  Lately, I’ve been very busy in my production studio composing music for a lot of awesome virtual reality games, including the upcoming Scraper: First Strike first-person VR shooter (pictured above), which comes out next Wednesday (November 21st) for the Oculus Rift, HTC Vive and Windows Mixed Reality devices, and will be released on December 18th for the PlayStation VR.  My work on this project has definitely stoked my interest in everything VR!  Since the game will be released very soon, here’s a trailer video released by the developers Labrodex Studios, featuring some of the music I composed for the game:

Scraper: First Strike is just one of a whole slew of VR games I’ve been working on over the past year.  Last year, when I was just starting to get really busy working with VR development teams, I wrote an article here that offered a bunch of informative resources connected to the field of VR audio.  The article I posted in 2017 took a general approach to the role that audio plays in Virtual Reality experiences.  Since we’re well into 2018, I thought we could benefit from expanding that topic to include the state-of-the-art in VR headset platforms.  Taking a look at the hardware platforms that are currently available should give us video game composers a better idea of the direction that VR audio is currently headed.

For one thing, VR is now broadly considered a part of a larger category that also includes AR (Augmented Reality) and MR (Mixed Reality) devices.  Those two categories are often considered synonymous, although that’s certainly debatable.  Since there’s no clear expert consensus at this point on what characteristics separate AR from MR, let’s just consider them as one category that we’ll call AR/MR for now.  In this article I’ll be focusing on resources that are specific to each of the competing platforms in VR and AR/MR.

Let’s get started!

Audio for VR and AR/MR devices

A wide variety of head-mounted devices now exist that can immerse us in imaginary worlds, or bring fantastic creatures to life in our living rooms.  While many of these devices share common underlying technologies with regard to audio creation and implementation, differing tools and techniques apply to each of them.  I’ve included links in the discussion below that may be helpful in understanding how these technologies differ.

When virtual acoustics meets actual acoustics

The newly-released Magic Leap One is an AR/MR device.  This means that it allows the wearer to see the real world, while superimposing digital images that seem to exist in reality, and not just within the device.  For instance, an AR/MR device can make us think that a miniature toy dinosaur is toddling across our coffee table.  With this in mind, creating audio for AR/MR becomes a little tricky.

For instance, let’s say that we want our tiny dinosaur to emit a ferociously-adorable little roar as he climbs on top of our coffee table books.  That sound won’t be convincing if it doesn’t seem to be happening inside our actual living room, with its unique acoustical properties.  The real-life room has to be mapped, and acoustic calculations have to be factored in.  This isn’t an issue when developing sound for virtual reality, since the sound sources exist within an environment that is itself entirely virtual.
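To give a feel for the kind of acoustic calculation involved, here’s a minimal Python sketch that estimates a mapped room’s reverberation time with the classic Sabine formula.  This is a generic illustration, not any platform’s actual API, and the room dimensions and absorption coefficients are hypothetical stand-ins for whatever a device’s environment scan might report.

```python
# Minimal sketch: estimating reverb time for a mapped real-world room.
# The dimensions and absorption coefficients below are hypothetical placeholders
# for data that an AR/MR device's room scan might provide.

def sabine_rt60(length_m, width_m, height_m, absorption):
    """Estimate RT60 (seconds) with Sabine's formula: RT60 = 0.161 * V / A."""
    volume = length_m * width_m * height_m
    surfaces = {
        "floor":   length_m * width_m,
        "ceiling": length_m * width_m,
        "walls":   2 * (length_m + width_m) * height_m,
    }
    # Total absorption in sabins: sum of (surface area * absorption coefficient).
    total_absorption = sum(area * absorption[name] for name, area in surfaces.items())
    return 0.161 * volume / total_absorption

# A hypothetical living room: carpeted floor, drywall ceiling and walls.
rt60 = sabine_rt60(5.0, 4.0, 2.5,
                   absorption={"floor": 0.3, "ceiling": 0.1, "walls": 0.08})
print(f"Estimated RT60: {rt60:.2f} s")  # roughly how long our dinosaur's roar should ring out
```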

It’s a fascinating problem, and one that the Magic Leap folks have considered seriously, using a system they’ve dubbed ‘Soundfield Audio’ to apply physics calculations that can produce appropriate acoustics based on the environment.  They’ve also patented a spatial audio technology that uses the wearer’s head movements to calculate the position of virtual sound sources.  Here’s a video that shows off a video game music visualization application for Magic Leap called Tónandi:
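On the head-tracking side, the core calculation is simple enough to sketch: given a virtual source’s position in world space and the wearer’s head position and orientation, we can work out the direction and distance at which the sound should be rendered.  Below is a generic Python sketch of that idea with made-up positions and a made-up yaw value; it is not Magic Leap’s implementation.

```python
import math

# Minimal sketch: recomputing a virtual sound source's direction as the wearer's
# head moves. World positions and the yaw value are hypothetical. In this toy
# coordinate frame, +z is 'forward' when the head yaw is zero.

def source_relative_to_head(source_xyz, head_xyz, head_yaw_deg):
    """Return (azimuth_deg, distance_m) of a source relative to the listener's head.
    Azimuth 0 = straight ahead of the listener."""
    dx = source_xyz[0] - head_xyz[0]
    dz = source_xyz[2] - head_xyz[2]
    distance = math.hypot(dx, dz)
    world_angle = math.degrees(math.atan2(dx, dz))            # bearing of the source in world space
    azimuth = (world_angle - head_yaw_deg + 180) % 360 - 180  # wrap into [-180, 180)
    return azimuth, distance

# Toy dinosaur roaring on the coffee table, about a meter ahead of the listener:
azimuth, dist = source_relative_to_head((0.4, 0.5, 1.0), (0.0, 1.6, 0.0), head_yaw_deg=15.0)
print(f"Render the roar at {azimuth:.1f} degrees azimuth, {dist:.2f} m away")
```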

 

The HoloLens is also an AR/MR device, and therefore faces many of the same issues as the Magic Leap One.  To address these, the HoloLens uses a spatial audio engine that calculates the position of sound-emitting sources and combines that with personalized Head-Related Transfer Functions, or HRTFs (a concept we discussed in an article from 2015).  These HRTFs help to localize all the aural components of the virtual soundscape.  In addition, the HoloLens creates a room model to match the user’s location so that sounds seem to reflect from real-life walls and travel convincingly to the player’s ears.  We should expect this technology to improve when Microsoft releases the next generation of HoloLens early next year.  Here’s a video produced by Engadget that goes into more detail about the audio experience delivered by the HoloLens:
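To give a rough feel for what HRTF-based localization is doing under the hood, here’s a minimal sketch that renders a mono sound binaurally by convolving it with a pair of head-related impulse responses.  This is generic DSP, not Microsoft’s Spatial Sound engine, and the impulse responses are fabricated placeholders; a real engine looks up measured (and ideally personalized) responses for the source’s direction.

```python
import numpy as np

# Minimal sketch of HRTF-style spatialization: convolve a mono signal with a
# pair of head-related impulse responses (HRIRs) to get a binaural stereo signal.
# The 'HRIRs' below are fabricated placeholders for illustration only.

sample_rate = 48000
t = np.arange(sample_rate) / sample_rate
mono_roar = 0.5 * np.sin(2 * np.pi * 220 * t) * np.exp(-3 * t)   # stand-in source sound

# Fake HRIRs: the right ear receives the sound slightly later and quieter than
# the left, mimicking a source located off to the listener's left.
hrir_left = np.zeros(128);  hrir_left[0] = 1.0
hrir_right = np.zeros(128); hrir_right[20] = 0.6   # ~0.4 ms interaural delay at 48 kHz

left = np.convolve(mono_roar, hrir_left)
right = np.convolve(mono_roar, hrir_right)
binaural = np.stack([left, right], axis=-1)   # 2-channel buffer ready for headphones
print(binaural.shape)
```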

 

Spatial sound for mixed reality

While we’re waiting for the next generation of HoloLens to be released, Microsoft has been keeping busy in the traditional VR space with its Windows Mixed Reality platform, which allows third-party equipment manufacturers to create VR headsets based on its existing VR reference designs and software.  While the Windows Mixed Reality platform shares common elements with the HoloLens, the VR devices under the Windows Mixed Reality banner offer standard VR experiences, without any AR/MR elements.  Both the HoloLens and the Windows Mixed Reality devices use the Spatial Sound software development kit for the design and implementation of positional audio.  This allows audio developers to create soundscapes for a large number of devices using the same tools.  While the convenience factor is certainly attractive, the HoloLens and Windows Mixed Reality offer very different experiences, so audio developers will certainly need to keep that in mind.  Here’s a short video that reviews the capabilities of the Spatial Sound SDK:

 

Positional audio inside the virtual machine

Now let’s move on to discuss what’s happening with the current VR devices.  As we know, unlike an AR/MR headset, a VR device cuts us off completely from the outside world and plunges us into an environment existing entirely within the machine.  There is currently a healthy and varied crop of VR devices from which to choose.  The two most popular and famous VR headsets are the Oculus Rift and the HTC Vive.  Both devices rely on positional audio technologies to deliver great aural experiences, and each company has worked diligently to improve the technology over time.  In June 2018, HTC introduced a new Software Development Kit (the Vive 3DSP SDK) for immersive audio on the Vive.  The new SDK allows for more sophisticated audio technologies such as higher-order ambisonics, higher-resolution audio, more refined spatial acoustics (pictured right), and HRTFs based on refined real-world models to improve the accuracy of positional audio.
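Ambisonics sounds more exotic than it is: encoding a mono source into first-order B-format just means weighting it by a few trigonometric functions of its direction.  Here’s a minimal sketch of that encoding step using the traditional W/X/Y/Z equations; it’s only the underlying math, not the Vive 3DSP SDK itself, and the tone, direction and sample rate are arbitrary example values.

```python
import math
import numpy as np

# Minimal sketch: encoding a mono signal into first-order ambisonics (B-format).
# Traditional FuMa-style equations; W carries the omnidirectional component,
# while X/Y/Z carry figure-of-eight components along the front/left/up axes.

def encode_first_order(mono, azimuth_deg, elevation_deg):
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    w = mono * (1.0 / math.sqrt(2.0))
    x = mono * math.cos(az) * math.cos(el)
    y = mono * math.sin(az) * math.cos(el)
    z = mono * math.sin(el)
    return np.stack([w, x, y, z], axis=-1)

# Hypothetical example: a one-second 440 Hz tone placed 30 degrees to the left, slightly above.
sr = 48000
tone = 0.3 * np.sin(2 * np.pi * 440 * np.arange(sr) / sr)
bformat = encode_first_order(tone, azimuth_deg=30.0, elevation_deg=10.0)
print(bformat.shape)   # (48000, 4): one channel each for W, X, Y, Z
```

The appeal for VR is that the encoded channels can be rotated to follow the listener’s head and then decoded to headphones or speakers without touching the source audio; higher-order ambisonics extends the same idea with more channels for sharper spatial resolution.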

Oculus has upgraded the Rift’s existing audio SDK to improve the positional accuracy of sounds emitting very close to the player (a technology they call Near-Field HRTF, for Head-Related Transfer Function).  They have also provided the option of implementing sounds that originate from large sources (such as an ocean, for instance, or a forest fire).  Using the Volumetric Sound Sources technology, large sound-emitting objects can project their aural content across an assigned radius consistent with their scale.  Here’s a video from the Oculus Connect 4 conference, demonstrating the Near-Field HRTF and Volumetric Sound Sources capabilities of the Oculus audio SDK:
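To make the volumetric idea a bit more concrete before we move on: a common way to approximate a volumetric source is to attenuate based on the distance to the surface of the emitting volume rather than to its center point.  Here’s a minimal sketch of that approximation for a spherical source; it’s a generic illustration with made-up values, not the Oculus SDK’s actual algorithm.

```python
import math

# Minimal sketch of a 'volumetric' sound source approximated as a sphere.
# Instead of measuring distance to the object's center point, we measure the
# distance to its surface, so a huge source (an ocean, a forest fire) keeps an
# appropriately strong presence even when its center is far away.

def volumetric_gain(listener_pos, source_center, source_radius, min_distance=1.0):
    dx = listener_pos[0] - source_center[0]
    dy = listener_pos[1] - source_center[1]
    dz = listener_pos[2] - source_center[2]
    center_distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    surface_distance = max(0.0, center_distance - source_radius)   # 0 when inside the volume
    return min_distance / max(surface_distance, min_distance)      # simple inverse-distance rolloff

listener, fire_center = (0.0, 0.0, 0.0), (100.0, 0.0, 0.0)
print(volumetric_gain(listener, fire_center, source_radius=30.0))  # ~0.014: attenuate to the sphere's surface
print(volumetric_gain(listener, fire_center, source_radius=0.0))   # 0.01: same source treated as a point
```

Note that this only addresses loudness; the apparent width of a large source is handled separately in real audio engines.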

 

The PlayStation VR, as the only console-specific VR device, does not share the same market with such devices as the Vive or the Rift and therefore is not faced with the same competitive pressures.  Nevertheless, improvements continue to be made to the PSVR’s technology.  The newest model of the PSVR (released late last year) is a revised version with small but valuable improvements.  Among the changes, Sony added built-in stereo headphones to the headset (pictured right), removing the need for players to hook up separate headphones in order to experience VR audio.

Standalone VR audio

Now let’s take a quick look at the standalone VR devices (i.e. those devices that don’t need to be hooked up to a computer or console, and don’t need a mobile phone installed in order to work).  These VR headsets offer untethered, cable-free virtual reality exploration, but they’re also usually a bit less powerful and full-featured.  The five best-known standalone headsets are the Oculus Go, the Oculus Quest, the Lenovo Mirage Solo, the HTC Vive Focus, and the Shadow VR.

The Oculus Go and Lenovo Mirage Solo both hit retail this May.  The HTC Vive Focus and the Shadow VR both became available to consumers just this month.  The Oculus Quest was recently announced and is expected to hit retail in spring 2019.  All five use a Qualcomm smartphone processor from the Snapdragon line, so in that respect they’ve essentially adopted the internal mechanism of a high-end mobile phone and simply incorporated it into their on-board hardware.  In fact, the Qualcomm Snapdragon 835 (used in the Lenovo, Vive Focus, Oculus Quest and Shadow VR devices) is the same chip that’s at the heart of the Samsung Galaxy S8, the Google Pixel 2, and many other smartphone models.  Since all five untethered VR devices use Snapdragon technology, developers can choose to avail themselves of the Qualcomm Snapdragon VR Software Development Kit, which includes a 3D Audio Plugin for Unity (designed to provide high-performance audio on Qualcomm Snapdragon devices).  Qualcomm also offers a suite of 3D Audio Tools for use in conjunction with a Digital Audio Workstation such as Pro Tools.  While these are by no means the only choices, they were developed by the company responsible for the Snapdragon processor, so it stands to reason that some Snapdragon-specific insights may have influenced the design and function of these tools.  Here’s a video interview with Hugo Swart, head of IoE Consumer Electronics at Qualcomm, discussing the virtual reality capabilities of the Snapdragon 835:

 

Audio for mobile VR

If you do a quick Amazon search for the phrase “Mobile VR Headsets,” you’ll see that there is now a dizzying plethora of headset models based around the “insert your mobile phone here” philosophy.  These headsets all rely on the processing technology of the phone inserted into them, and there are so many varying models that I won’t attempt to delve into that topic.  Generally speaking, if there is an SDK specific to a particular VR headset model, then it should be considered.  For instance, the Oculus Audio SDK makes sense for the Samsung Gear VR, the Oculus Go and the Oculus Quest, since all three are Oculus-designed VR systems.  Likewise, the new Resonance Audio SDK from Google is a good choice for any of the Google headsets (Daydream View, Google Cardboard, Lenovo Mirage Solo with Daydream).  Here’s a brief video produced by Google that demonstrates the Resonance Audio SDK:

 

Conclusion

That’s our discussion of where things currently stand with regard to VR platforms!  In the next article, we’ll be focusing on tips and tools for game audio folks working in VR.  I hope you enjoyed the article, and please let me know your thoughts in the comments section below!

 

Winifred Phillips is an award-winning video game music composer whose recent projects include the triple-A first-person shooter Homefront: The Revolution. Her latest video game credits also include numerous Virtual Reality games, including Scraper: First Strike, Bebylon: Battle Royale, Fail Factory, Dragon Front, and many more.  She has composed music for games in five of the most famous and popular franchises in gaming: Assassin’s Creed, LittleBigPlanet, Total War, God of War, and The Sims. She is the author of the award-winning bestseller A COMPOSER’S GUIDE TO GAME MUSIC, published by the MIT Press. As a VR game music expert, she writes frequently on the future of music in virtual reality games. Follow her on Twitter @winphillips.

VRDC 2017 takeaways: VR music for the game composer

The Game Developers Conference is always an awesome opportunity for game audio experts to learn and share experiences.  I’ve given presentations at GDC for a few years now, and I’m always excited to hear about what’s new and notable in game audio.  This year, the hot topic was virtual reality.  In fact, the subject received its own dedicated sub-conference that took place concurrently with the main GDC show.  The VRDC (Virtual Reality Developers Conference) didn’t focus particularly on the audio and music side of VR, but there were a couple of notable talks on that subject.  In this article, let’s take a look at some of the more intriguing VR game music takeaways from those two talks.  Along the way, I’ll also share some of my related experience as the composer of the music of the Dragon Front VR game for the Oculus Rift (pictured above).

Inside and outside

The talks we’ll be discussing in this article are entitled “Audio Adventures in VR Worlds” and “The Sound Design of Star Wars: Battlefront VR.”  Here’s a common issue that popped up in both talks:

Where should video game music be in a VR game?  Should it feel like it exists inside the VR world, weaving itself into the immersive 3D atmosphere surrounding the player?  Or should it feel like it’s somehow outside of the VR environment and is instead coasting on top of the experience, being conveyed directly to the player?  The former approach suggests a spacious and expansive musical soundscape, and the latter would feel much closer and more personal.  Is one of these approaches more effective in VR than the other?  Which choice is best?
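One way to make the contrast concrete: a head-locked (“outside”) music mix ignores the listener’s orientation entirely, while an in-world (“inside”) mix re-pans the music every frame as the head turns.  Here’s a minimal sketch of that difference using simple constant-power panning; a real VR title would use full HRTF spatialization, and nothing here reflects the specific implementations discussed in the two talks.

```python
import math

# Minimal sketch contrasting 'outside' (head-locked) music with 'inside'
# (world-positioned) music. Angles and positions are hypothetical.

def constant_power_pan(azimuth_deg):
    """Map an azimuth in [-90, 90] degrees to (left_gain, right_gain), constant power."""
    pan = max(-90.0, min(90.0, azimuth_deg)) / 90.0   # -1 (hard left) .. +1 (hard right)
    angle = (pan + 1.0) * math.pi / 4.0               # 0 .. pi/2
    return math.cos(angle), math.sin(angle)

music_azimuth_world = 40.0    # the 'band' sits 40 degrees to the player's right in the world

for head_yaw in (0.0, 40.0, 90.0):
    outside = constant_power_pan(0.0)                              # ignores head movement entirely
    inside = constant_power_pan(music_azimuth_world - head_yaw)    # shifts as the head turns
    print(f"head yaw {head_yaw:5.1f}  outside L/R = {outside[0]:.2f}/{outside[1]:.2f}  "
          f"inside L/R = {inside[0]:.2f}/{inside[1]:.2f}")
```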

Continue reading

Video game composers can make you smarter! (The music of Dragon Front) Pt. 2

Pictured: Winifred Phillips (video game music composer) in her studio working on the music of the Dragon Front virtual reality game.

By Winifred Phillips | Contact | Follow

Welcome back to our three-part discussion of how video game composers (such as ourselves) can make strategy gamers smarter!  In these articles, we’re looking at ways in which our music can enhance concentration and tactical decision-making for players engrossed in strategic gameplay.  Along the way, I’ve been sharing my personal experiences as the composer for the Dragon Front strategy game for virtual reality.  Over the course of these articles we’ll be covering three of the top concepts that pertain to the relationship between music and concentration.  In part one, we discussed the concept of ‘music-message congruency,’ so if you haven’t read that article yet, please go check it out and then come back.

Are you back now?  Good!  Let’s move on to the second big technique for increasing the smarts of strategy gamers!

Cognition-enhancing tempo

As video game composers, we create music in a wide variety of tempos designed to support the energy of play and the pacing of the game’s overall design.  From leisurely tracks that accompany unstructured exploration to frenetic pieces that support the most high-stakes combat, our music is planned with expert precision to shape the excitement level of players and keep them motivated as they progress.

Continue reading

Video game composers can make you smarter! (The music of Dragon Front) Pt. 1

Video game composer Winifred Phillips, pictured in her music studio working on the original score for the Dragon Front virtual reality game.

Can video game composers make you smarter?  Well, video gaming can be a pretty cerebral activity, requiring astute problem-solving skills and disciplined concentration in order to excel.  That’s especially true for any game built around strategic and/or tactical gameplay, such as real-time or turn-based strategy, tactical shooters, multiplayer online battle arenas (MOBAs), and online collectible card strategy games.  To succeed in these types of games, players must assess the current situation and formulate a plan that accounts for future developments and variables.  Without this kind of tactical forward thinking, a gamer has little chance to win.  So, can music enable gamers to think tactically, stay focused and make smart decisions?  Over the next three articles, I’ll try to answer that question, while exploring the role of music in enhancing the concentration of strategic/tactical gamers.

Along the way, we’ll take a look at some scholarly research on the subject and consult the opinions of experts, and I’ll share my experiences creating the music for the recently released Dragon Front strategy game from High Voltage Software.  We’ll check out some music tracks I composed for the popular Dragon Front game (pictured at the top of this article), and we’ll discuss methods for supporting and enhancing concentration for strategic/tactical game players.  But first, let’s take a closer look at the Dragon Front game.

Continue reading

VR Game Composer: Music Beyond the Virtual

Welcome to the third installment in our series on the fascinating possibilities created by virtual reality motion tracking, and how the immersive nature of VR may serve to inspire us as video game composers and afford us new and innovative tools for music creation.  As modern composers, we work with a lot of technological tools, as I can attest from the studio equipment that I rely on daily (pictured left). Many of these tools communicate with each other by virtue of the Musical Instrument Digital Interface protocol, commonly known as MIDI – a technical standard that allows music devices and software to interact.

In order for a VR music application to control and manipulate external devices, the software must be able to communicate by way of the MIDI protocol – and that’s an exciting development in the field of music creation in VR!
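For anyone curious what that MIDI communication looks like in code, here’s a minimal sketch that translates a hypothetical VR gesture event into a note-on/note-off pair, assuming the cross-platform mido library (with a backend such as python-rtmidi) is installed.  The gesture values, pitch mapping and port handling are placeholders of my own, not taken from any of the VR applications discussed here.

```python
import time
import mido   # assumes: pip install mido python-rtmidi

# Minimal sketch: turning a hypothetical VR gesture event into MIDI messages
# that an external synth or DAW could respond to.

def gesture_to_midi(outport, hand_height, strike_velocity):
    """Map a (made-up) gesture to a note: height chooses pitch, speed chooses velocity."""
    note = 48 + int(max(0.0, min(1.0, hand_height)) * 24)       # roughly a two-octave range
    velocity = int(max(0.0, min(1.0, strike_velocity)) * 127)
    outport.send(mido.Message('note_on', note=note, velocity=velocity))
    time.sleep(0.25)                                             # hold the note briefly
    outport.send(mido.Message('note_off', note=note, velocity=0))

print(mido.get_output_names())          # list the available MIDI destinations
with mido.open_output() as outport:     # open the default output port
    gesture_to_midi(outport, hand_height=0.7, strike_velocity=0.9)
```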

This series of articles focuses on what VR means for music composers and performers. In previous installments, we’ve had some fun exploring new ways to play air guitar and air drums, and we’ve looked at top VR applications that provide standalone virtual instruments and music creation tools.  Now we’ll be talking about the most potentially useful application of VR for video game music composers – the ability to control our existing music production tools from within a VR environment.

We’ll explore three applications that employ MIDI to connect music creation in VR to our existing music production tools. But first, let’s take a look at another, much older gesture-controlled instrument that is in some ways quite reminiscent of these motion-tracking music applications for VR:

Continue reading

VR Game Composer: Music Inside the Machine

Welcome to part two of our ongoing exploration of some interesting possibilities created by the motion tracking capabilities of VR, and how this might alter our creative process as video game composers.

In part one we discussed how motion tracking lets us be awesome air guitarists and drummers inside the virtual space.  In this article, we’ll be taking a look at how the same technology will allow us to make interesting music using more serious tools that are incorporated directly inside the VR environment – musical instruments that exist entirely within the VR ‘machine.’

Our discussion to follow will concentrate on three software applications: Soundscape, Carillon, and Lyra.  Later, in the third article of this ongoing series, we’ll take a look at applications that allow our VR user interfaces to harness the power of MIDI to control some of the top music devices and software that we use in our external production studios. But first, let’s look at the ways that VR apps can function as fully-featured musical instruments, all on their own!

Soundscape

Let’s start with something simple – a step sequencer with a sound bank and signal processing tools, built for the mobile virtual reality experience of the Samsung Gear VR.

I got a chance to demo the Samsung Gear VR during the Audio Engineering Society Convention in NYC last year, and while it doesn’t offer the best or most mind-blowing experience in VR (such as what we can experience from products like the famous Oculus Rift), it does achieve a satisfying level of immersion. Plus, it’s great fun!  The Soundscape VR app was built for the Samsung Gear VR by developer Sander Sneek of the Netherlands.  It’s a simple app designed to let users create dance loops using three instruments from a built-in electro sound library, a pentatonic step sequencer that lets the user create rhythm and tone patterns within the loops, and a collection of audio signal processing effects that let the user warp and mold the sounds as the loops progress, adding variety to the performance.
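To make the step-sequencer concept concrete, here’s a tiny sketch that renders one bar of a 16-step pentatonic pattern to a WAV file.  It’s a generic illustration of how such a sequencer works; the pattern, scale and tempo are made up, and it has nothing to do with Soundscape’s actual implementation.

```python
import wave
import struct
import math

# Minimal sketch of a pentatonic step sequencer: 16 steps, each either silent or
# holding one degree of a C-major pentatonic scale. Pattern and tempo are made up.

SR, BPM = 44100, 120
pentatonic_hz = [261.63, 293.66, 329.63, 392.00, 440.00]     # C, D, E, G, A
pattern = [0, None, 2, None, 4, None, 3, 1, 0, None, 2, 4, None, 3, None, 1]

step_seconds = 60.0 / BPM / 4.0            # sixteenth notes at 120 BPM
samples_per_step = int(SR * step_seconds)

samples = []
for degree in pattern:
    for n in range(samples_per_step):
        if degree is None:
            samples.append(0.0)            # a rest in the pattern
        else:
            env = 1.0 - n / samples_per_step   # simple decay envelope per step
            samples.append(0.4 * env * math.sin(2 * math.pi * pentatonic_hz[degree] * n / SR))

with wave.open("pentatonic_loop.wav", "w") as f:
    f.setnchannels(1); f.setsampwidth(2); f.setframerate(SR)
    f.writeframes(b"".join(struct.pack("<h", int(s * 32767)) for s in samples))
```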

Continue reading

VR Game Composer: Music in the Air

Since I’ve been working recently on music for a Virtual Reality project (more info in the coming months), I’ve been thinking a lot about VR technology and its effect on the creative process.  Certainly, VR is going to be a great environment in which to be creative and to perform tasks with enhanced focus, according to this article from the VR site SingularityHub.  I’ve written in this blog before about the role that music and sound will play in the Virtual Reality gaming experience.  It’s clear that music will have an impact on the way in which we experience VR, not only during gaming experiences, but also when using the tools of VR to create and be productive.  With that in mind, let’s consider whether the opposite statement may also be true – will VR impact the way in which we experience music, not only as listeners, but also as video game composers?

Simple VR technologies like the popular Google Cardboard headset can be a lot of fun – as I personally experienced recently (photo to the left).  However, they offer only rudimentary visual immersion, omitting some of the most compelling aspects of the VR experience.  When motion tracking (beyond simple head movement) is added to the mix, the potential of VR explodes.  Over the next three articles, we’ll be exploring some interesting possibilities created by the motion tracking capabilities of VR, and how this might alter our creative process.  In the first article, we’ll have some fun exploring new ways to play air guitar and air drums in the VR environment. In the second article, we’ll take a look at ways to control virtual instruments and sound modules that are folded into the VR software.  And finally, in the third article we’ll explore the ways in which VR motion tracking is allowing us to immersively control our existing real-world instruments using MIDI. But first, let’s take a look at the early days of VR musical technology!

Continue reading