Delighted you’re here! I’m videogame composer Winifred Phillips, and I’m happy to welcome you back to this four-part article series exploring the role of music in VR games! These articles are based on the presentation I gave at this year’s Game Developers Conference in San Francisco, entitled Music in Virtual Reality (I’ve included the official description of my talk at the end of this article). If you haven’t read the previous three articles, you’ll find them here:
- Part One: The Importance of Positional Audio
- Part Two: 2D Versus 3D Music in Virtual Reality
- Part Three: Diegetic Versus Non-Diegetic Music in Virtual Reality
During my GDC presentation, I focused on three important questions for VR game music composers:
- Do we compose our music in 3D or 2D?
- Do we structure our music to be Diegetic or Non-Diegetic?
- Do we focus our music on enhancing player Comfort or Performance?
In the course of exploring these questions during my GDC presentation, I discussed my work on four of my own VR game projects – the Bebylon: Battle Royale arena combat game from Kite & Lightning, the Dragon Front strategy game from High Voltage Software, the Fail Factory comedy game from Armature Studio, and the Scraper: First Strike shooter/RPG from Labrodex Inc.
In these articles, I’ve been sharing the discussions and conclusions that formed the basis of my talk, including the examples from these four VR game projects. So now let’s look at the last of our three questions for VR video game composers:
Do we focus our music on enhancing player comfort or performance?
Okay, so we’re now going to spend some time talking about one of the top issues in VR game design – player comfort versus player performance. It’s a tricky problem. By this time, we’ve probably all heard about VIMS – visually induced motion sickness. Unfortunately, it’s a pretty famous obstacle in VR game design, causing tons of consternation for game audio experts. So, can video game composers help? Can music be used to combat motion sickness?
Actually, yes. The US Centers for Disease Control and Prevention recommends the use of music to relieve motion sickness. Music has been shown to appreciably decrease symptoms of nausea for people in moving vehicles.
Going further, according to a study at the Toronto Rehabilitation Institute, music significantly reduces nausea induced by visual stimulus alone. The conclusion is – music can be an awesome treatment for VIMS – but it can’t be just any music.
According to this research, in order for music to effectively ease VIMS, it has to be considered ‘pleasant’ by listeners. Pleasant music is nice, friendly and agreeable. That’s a real problem for an intense futuristic action game like Scraper: First Strike – we don’t expect pleasant music when we’re mowing down enemies. According to studies published in the journals Memory & Cognition and the Journal of Marketing, when the mood of background music is incongruent with the situation in which we hear it, the music can actually impede our ability to absorb information. In the context of a game, that can potentially impact player performance.
When talking over style choices, the project director and I decided that it would be best to concentrate on uplifting heroic music for the Scraper: First Strike game. That style would keep the score feeling positive, but would still allow it to make sense during action sequences. Plus, positive, inspirational music can potentially alleviate some VIMS symptoms. Here’s an example from the Scraper VR game:
This technique also came into play in the Fail Factory project. When players are whisked off from one minigame to another, they travel through some twisty tunnels that might induce a touch of VIMS. However, the tunes during these journeys are probably the most pleasant in the game, as they’re designed to feel like really cheery elevator music. Here’s an example of that:
This cheery musical approach can have a palliative effect on the symptoms of VIMS, helping to reduce some nausea that might otherwise be experienced by players in the VR environment.
So now we’ve come to the conclusion of this four-article series, during which we’ve explored some of the issues facing game music composers creating music for virtual reality. As we discussed, 3D audio has had a long and complex history in game development, and the popular emergence of VR has breathed new life into the discipline.
Over the course of these four articles, we’ve talked about ways to combine stereo and positional music into a VR environment. We’ve explored how non-diegetic music can be effectively enfolded into virtual reality, and we’ve looked at ways in which music can help to alleviate the symptoms of visually induced motion sickness without impacting player performance.
I hope you’ll try some of the techniques we’ve discussed in these articles. Each VR game offers its own unique challenges, and music for VR can be a real trial-and-error process – but that’s what makes working in VR so great for video game composers! I hope you’ve enjoyed this four-article series, and please feel free to share your thoughts in the comments section below!
Music in Virtual Reality
This lecture will present ideas for creating a musical score that complements an immersive VR experience. Composer Winifred Phillips will share tips from several of her VR projects. Beginning with a historical overview of positional audio technologies, Phillips will address several important problems facing composers in VR.
Topics will include 3D versus 2D music implementation, and the role of spatialized audio in a musical score for VR. The use of diegetic and non-diegetic music will be explored, including methods that blur the distinction between the two categories.
The discussion will also include an examination of the VIMS phenomenon (Visually Induced Motion Sickness), and the role of music in alleviating its symptoms. Phillips’ talk will offer techniques for composers and audio directors looking to utilize music in the most advantageous way within a VR project.
Through examples from several VR games, Phillips will provide an analysis of music composition strategies that help music integrate successfully in a VR environment. The talk will include concrete examples and practical advice that audience members can apply to their own games.
This session will provide composers and audio directors with strategies for designing music for VR. It will include an overview of the history of positional sound and the VIMS problem (useful knowledge for designers).
The talk will be approachable for all levels (advanced composers may better appreciate the specific composition techniques discussed).
Winifred Phillips is an award-winning video game music composer whose most recent projects are the triple-A first person shooter Homefront: The Revolution and the Dragon Front VR game for Oculus Rift. Her credits include games in five of the most famous and popular franchises in gaming: Assassin’s Creed, LittleBigPlanet, Total War, God of War, and The Sims. She is the author of the award-winning bestseller A COMPOSER’S GUIDE TO GAME MUSIC, published by the MIT Press. As a VR game music expert, she writes frequently on the future of music in virtual reality games. Follow her on Twitter @winphillips.