When I’m not at work in my studio making music for games, I like to keep up with new developments in the field of interactive entertainment, and I’ll often share what I learn here in these articles. Virtual reality is an awesome subject of study for a video game composer, and several of my recent projects have been in the world of VR. Since I’m sure that most of us are curious about what’s coming next in virtual reality, I’ve decided to devote this article to a collection of educational resources. I’ve made a point of keeping our focus general here, with the intent of understanding the role of audio in VR and the best resources available to audio folks. As a component of the VR soundscape, our music must fit into the entire matrix of aural elements, so we’ll spend this article learning about what goes into making expert sound for a virtual reality experience. Let’s start with a few articles that discuss methods and techniques for VR audio practitioners.
The Game Developers Conference is always an awesome opportunity for game audio experts to learn and share experiences. I’ve given presentations at GDC for a few years now, and I’m always excited to hear about what’s new and notable in game audio. This year, the hot topic was virtual reality. In fact, the subject received its own dedicated sub-conference that took place concurrently with the main GDC show. The VRDC (Virtual Reality Developers Conference) didn’t focus particularly on the audio and music side of VR, but there were a couple of notable talks on that subject. In this article, let’s take a look at some of the more intriguing VR game music takeaways from those two talks. Along the way, I’ll also share some of my related experience as the composer of the music for Dragon Front, a VR game for the Oculus Rift (pictured above).
Where should video game music be in a VR game? Should it feel like it exists inside the VR world, weaving itself into the immersive 3D atmosphere surrounding the player? Or should it feel like it’s somehow outside of the VR environment, floating on top of the experience and conveyed directly to the player? The former approach suggests a spacious and expansive musical soundscape, and the latter would feel much closer and more personal. Is one of these approaches more effective in VR than the other? Which choice is best?
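Before we explore that question, it may help to see how differently the two approaches look at the audio-engine level. Here’s a minimal sketch using the Web Audio API (my own hypothetical illustration, not code from any particular VR title; the node settings and position values are assumptions for demonstration):

```typescript
// Hypothetical sketch (Web Audio API): the same music source routed
// two different ways. All node settings and positions are invented
// for illustration.

const ctx = new AudioContext();

// Approach 1: music anchored inside the VR world. An HRTF panner
// places the source at a fixed world position, so it recedes and
// shifts as the listener's head moves through the scene.
function attachWorldAnchoredMusic(source: AudioNode): PannerNode {
  const panner = new PannerNode(ctx, {
    panningModel: "HRTF",
    distanceModel: "inverse",
    positionX: 0,
    positionY: 2,
    positionZ: -5, // e.g. hovering above and ahead of the player
  });
  source.connect(panner).connect(ctx.destination);
  return panner;
}

// Approach 2: music "on top of" the experience. No spatialization;
// the stereo mix goes straight to the output and stays locked to
// the player's ears no matter where they look.
function attachHeadLockedMusic(source: AudioNode): GainNode {
  const gain = new GainNode(ctx, { gain: 0.8 });
  source.connect(gain).connect(ctx.destination);
  return gain;
}
```

Hybrid designs are also possible, of course: some music stems can pass through spatialization while others stay head-locked.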
I’m pleased to announce that my book, A Composer’s Guide to Game Music, is now available in its new paperback edition! I’m excited that my book has done well enough to merit a paperback release, and I’m looking forward to getting to know a lot of new readers! The paperback is much lighter and more portable than the hardcover. Here’s a view of the front and back covers of the new paperback edition of my book (click the image for a bigger version if you’d like to read the back cover):
As you might expect, many aspiring game composers read my book, and I’m honored that my book is a part of their hunt for the best resources to help them succeed in this very competitive business. When I’m not working in my music studio, I like to keep up with all the great new developments in the game audio field, and I share a lot of what I learn in these articles. Keeping in mind how many of my readers are aspiring composers, I’ve made a point of devoting an article once a year to gathering the top online guidance currently available for newcomers to the game music profession. In previous years I’ve focused solely on recommendations gleaned from the writings of game audio pros, but this time I’d like to expand that focus to include other types of resources that could be helpful. Along the way, we’ll be taking a look at some nuggets of wisdom that have appeared on these sites. So, let’s get started!
Welcome back to this three-article series that’s bringing together the ideas that were discussed in five different GDC 2017 audio talks about interactive music! These five speakers explored discoveries they’d made while creating interactivity in the music of their own game projects. We’re looking at these ideas side by side to broaden our viewpoint and gain a sense of the “bigger picture” when it comes to the leading-edge thinking for music interactivity in games. We’ve been looking at the five interactive music systems discussed in these GDC 2017 presentations:
In the first article, we examined the basic nature of these interactive systems. In the second article, we contemplated why those systems were used, with some of the inherent pros and cons of each system discussed in turn. So now, let’s get into the nitty-gritty of tools and tips for working with such interactive music systems. If you haven’t read parts one and two of this series, please go do so now and then come back:
Welcome back to our three-article series dedicated to collecting and exploring the ideas that were discussed in five different GDC 2017 audio talks about interactive music! These five speakers shared ideas they’d developed in the process of creating interactivity in the music of their own game projects. We’re looking at these ideas side by side to cultivate a sense of the “bigger picture” when it comes to the leading-edge thinking for music interactivity in games. In the first article, we looked at the basic nature of the five interactive music systems discussed in these GDC 2017 presentations:
Okay, so let’s now contemplate some simple but important questions: why were those systems used? What was attractive about each interactive music strategy, and what were the challenges inherent in using those systems?
The 2017 Game Developers Conference could be described as a densely packed deep-dive exploration of the state-of-the-art tools and methodologies used in modern game development. This description held especially true for the game audio track, wherein top experts in the field offered a plethora of viewpoints and advice on the awesome technical and artistic challenges of creating great sound for games. I’ve given GDC talks for the past three years now (see photo), and every year I’m amazed at the breadth and diversity of the problem-solving approaches discussed by my fellow GDC presenters. Often I’ll emerge from the conference with the impression that we game audio folks are all “doing it our own way,” using widely divergent strategies and tools.
This year, I thought I’d write three articles to collect and explore the ideas that were discussed in five different GDC audio talks. During their presentations, these five speakers all shared their thoughts on best practices and methods for instilling interactivity in modern game music. By absorbing these ideas side by side, I thought we might gain a sense of the “bigger picture” when it comes to the current leading-edge thinking for music interactivity in games. In the first article, we’ll look at the basic nature of these interactive systems. We’ll devote the second article to the pros and cons of each system, and in the third article we’ll look at tools and tips shared by these music interactivity experts. Along the way, I’ll also be sharing my thoughts on the subject, and we’ll take a look at musical examples from some of my own projects that demonstrate a few ideas explored in these GDC talks:
Since one of my most recent projects, Little Lords of Twilight, became available worldwide earlier this year and was recently greenlit on the famous Steam platform, I thought I’d write this article to share some of my creative and technical process in composing the music for this game. In particular, this project presents a great opportunity to look at how compositional variation (as we understand it from music theory) can be useful for the structure of interactive music.
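To preview the general idea in code-sketch form (this is my own hypothetical illustration, not the actual system built for Little Lords of Twilight), an interactive score can hold several pre-composed variations of a theme and select among them at runtime, so the music suits the game state without repeating itself verbatim. The asset names, intensity categories, and selection logic below are all assumptions for the sake of the example:

```typescript
// Hypothetical illustration of variation-driven interactive music:
// several pre-composed variations of one theme, selected at runtime
// so the score fits the game state without repeating verbatim.

type GameIntensity = "calm" | "tense" | "battle";

interface ThemeVariation {
  file: string;             // audio asset for this variation (names invented)
  intensity: GameIntensity; // the game state this variation suits
}

const variations: ThemeVariation[] = [
  { file: "theme_sparse.ogg",        intensity: "calm" },
  { file: "theme_ornamented.ogg",    intensity: "calm" },
  { file: "theme_minor_mode.ogg",    intensity: "tense" },
  { file: "theme_full_ensemble.ogg", intensity: "battle" },
];

let lastPlayed: string | null = null;

// Pick a variation matching the current state, avoiding an immediate
// repeat whenever more than one candidate exists.
function nextVariation(state: GameIntensity): ThemeVariation {
  const fresh = variations.filter(
    (v) => v.intensity === state && v.file !== lastPlayed,
  );
  const pool =
    fresh.length > 0 ? fresh : variations.filter((v) => v.intensity === state);
  const pick = pool[Math.floor(Math.random() * pool.length)];
  lastPlayed = pick.file;
  return pick;
}

// Example: a battle begins, so a battle-appropriate variation is queued.
console.log(nextVariation("battle").file);
```

Avoiding an immediate repeat is the simplest way compositional variation pays off in an interactive structure; richer systems could weight the choice by how recently each variation last played.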