The Game Developers Conference is always an awesome opportunity for game audio experts to learn and share experiences. I’ve given presentations at GDC for a few years now, and I’m always excited to hear about what’s new and notable in game audio. This year, the hot topic was virtual reality. In fact, the subject received its own dedicated sub-conference that took place concurrently with the main GDC show. The VRDC (Virtual Reality Developers Conference) didn’t focus particularly on the audio and music side of VR, but there were a couple of notable talks on that subject. In this article, let’s take a look at some of the more intriguing VR game music takeaways from those two talks. Along the way, I’ll also share some of my related experience as composer for the Dragon Front VR game for the Oculus Rift (pictured above).
Where should video game music be in a VR game? Should it feel like it exists inside the VR world, weaving itself into the immersive 3D atmosphere surrounding the player? Or should it feel like it’s somehow outside of the VR environment and is instead coasting on top of the experience, being conveyed directly to the player? The former approach suggests a spacious and expansive musical soundscape, and the latter would feel much closer and more personal. Is one of these approaches more effective in VR than the other? Which choice is best?
Welcome back to this three-article series that’s bringing together the ideas that were discussed in five different GDC 2017 audio talks about interactive music! These five speakers explored discoveries they’d made while creating interactivity in the music of their own game projects. We’re looking at these ideas side-by-side to broaden our viewpoint and gain a sense of the “bigger picture” when it comes to the leading-edge thinking for music interactivity in games. We’ve been looking at five interactive music systems discussed in these five GDC 2017 presentations:
In the first article, we examined the basic nature of these interactive systems. In the second article, we contemplated why those systems were used, with some of the inherent pros and cons of each system discussed in turn. So now, let’s get into the nitty-gritty of tools and tips for working with such interactive music systems. If you haven’t read parts one and two of this series, please go do so now and then come back:
Welcome back to our three-article series dedicated to collecting and exploring the ideas that were discussed in five different GDC 2017 audio talks about interactive music! These five speakers shared ideas they’d developed in the process of creating interactivity in the music of their own game projects. We’re looking at these ideas side-by-side to cultivate a sense of the “bigger picture” when it comes to the leading-edge thinking for music interactivity in games. In the first article, we looked at the basic nature of five interactive music systems discussed in these five GDC 2017 presentations:
Okay, so let’s now contemplate some simple but important questions: why were those systems used? What was attractive about each interactive music strategy, and what were the challenges inherent in using those systems?
By video game music composer Winifred Phillips
The 2017 Game Developers Conference could be described as a densely packed deep-dive exploration of the state-of-the-art tools and methodologies used in modern game development. This description held especially true for the game audio track, wherein top experts in the field offered a plethora of viewpoints and advice on the awesome technical and artistic challenges of creating great sound for games. I’ve given GDC talks for the past three years now (see photo), and every year I’m amazed at the breadth and diversity of the problem-solving approaches discussed by my fellow GDC presenters. Often I’ll emerge from the conference with the impression that we game audio folks are all “doing it our own way,” using widely divergent strategies and tools.
This year, I thought I’d write three articles to collect and explore the ideas that were discussed in five different GDC audio talks. During their presentations, these five speakers all shared their thoughts on best practices and methods for instilling interactivity in modern game music. By absorbing these ideas side-by-side, I thought we might gain a sense of the “bigger picture” when it comes to the current leading-edge thinking for music interactivity in games. In the first article, we’ll look at the basic nature of these interactive systems. We’ll devote the second article to the pros and cons of each system, and in the third article we’ll look at tools and tips shared by these music interactivity experts. Along the way, I’ll also be sharing my thoughts on the subject, and we’ll take a look at musical examples from some of my own projects that demonstrate a few ideas explored in these GDC talks:
The Game Developers Conference is coming up soon! Last year I presented a talk on music for mobile games (pictured above), and I’m pleased that this year I’ll be presenting the talk, “‘Homefront’ to ‘God of War’: Using Music to Build Suspense” (Wednesday, March 1st at 11am in room 3006 West Hall, Moscone Center, San Francisco). In my talk I’ll be focusing on practical applications of techniques for video game composers and game audio folks, using my own experiences as concrete examples for exploration. Along the way, I’ll be discussing some very compelling scholarly research on the relationship between suspense, gameplay and musical expression. In preparing my GDC 2017 presentation I did a lot of reading and studying about the nature of suspense in video games, the importance of suspense in gameplay design, and the role that video game music plays in regulating and elevating suspense. There will be lots of ground to cover in my presentation! That being said, the targeted focus of my presentation precluded me from incorporating some very interesting extra research into the importance of suspense in a more general sense… why human beings need suspense, and what purpose it serves in our lives. I also couldn’t find the space to include everything I’d encountered regarding suspense as an element in the gaming experience. It occurred to me that some of this could be very useful to us in our work as game makers, so I’d like to share some of these extra ideas in this article.
As a video game composer and author of the book A Composer’s Guide to Game Music, I’m frequently asked for advice on how a young composer can gain entry into this business. I dedicated a chapter of my book to this topic (Chapter 14: Acting Like a Business and Finding Work), so I’ve certainly thought a great deal about the issue. From my very first project (God of War) all the way to my most recent game (Homefront: The Revolution, pictured right), one thing has always been abundantly clear: landing gigs can be a complex journey. That’s especially true for newcomers, and there are no easy signposts pointing the way. While I tried to use my own experiences and insights to provide useful guidance in my book, I know that everyone’s experience is different, and multiple points of view can be very helpful. So in this article, I’ll be offering resources from articles and community discussions on how to face down the awesome challenges of breaking into the industry as a composer of music for games.
First, I’ll be sharing a video from my presentation at the Society of Composers and Lyricists seminar, in which I answered the question about how I got my start in the games industry. Then, we’ll be exploring highlights from a collection of online articles that offer helpful tips for how to break in and establish a career as a game composer. Finally, at the end of this article I’ll be including a full list of links for further reading and reference.
I was pleased to give a talk about composing music for games at the 2016 Game Developers Conference (pictured left). GDC took place this past March in San Francisco – it was an honor to be a part of the audio track again this year, which offered a wealth of awesome educational sessions for game audio practitioners. It was so much fun to see the other talks and learn about what’s new and exciting in the field of game audio! In this article, I want to share some info that I thought was really interesting from two talks that pertained to the audio production side of game development: composer Laura Karpman’s talk, “Composing Virtually, Sounding Real,” and audio director Garry Taylor’s talk, “Audio Mastering for Interactive Entertainment.” Both sessions had some very good info for video game composers who may be looking to improve the quality of their recordings. Along the way, I’ll also be sharing a few of my own personal viewpoints on these music production topics, and I’ll include some examples from one of my own projects, the Ultimate Trailers album for West One Music, to illustrate ideas that we’ll be discussing. So let’s get started!