The Game Developers Conference is always an awesome opportunity for game audio experts to learn and share experiences. I’ve given presentations at GDC for a few years now, and I’m always excited to hear about what’s new and notable in game audio. This year, the hot topic was virtual reality. In fact, the subject received its own dedicated sub-conference, the VRDC (Virtual Reality Developers Conference), which took place concurrently with the main GDC show. The VRDC didn’t focus particularly on the audio and music side of VR, but there were a couple of notable talks on that subject. In this article, let’s take a look at some of the more intriguing VR game music takeaways from those two talks. Along the way, I’ll also share some of my related experience as the composer of the music for Dragon Front, a VR game for the Oculus Rift (pictured above).
Where should video game music be in a VR game? Should it feel like it exists inside the VR world, weaving itself into the immersive 3D atmosphere surrounding the player? Or should it feel like it’s somehow outside of the VR environment and is instead coasting on top of the experience, being conveyed directly to the player? The former approach suggests a spacious and expansive musical soundscape, and the latter would feel much closer and more personal. Is one of these approaches more effective in VR than the other? Which choice is best?
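To make the two approaches concrete, here’s a minimal sketch of my own (not taken from either talk, and the function names are hypothetical) contrasting how an audio layer might pan a music emitter that is fixed in the VR world versus one that is “head-locked” and rides on top of the experience:

```python
import math

def world_locked_pan(source_angle_deg: float, head_yaw_deg: float) -> float:
    """Pan position (-1 = hard left, +1 = hard right) for a music source
    anchored in the world: it shifts as the listener turns their head."""
    relative = math.radians(source_angle_deg - head_yaw_deg)
    return math.sin(relative)

def head_locked_pan(pan: float = 0.0) -> float:
    """Pan for music conveyed directly to the player: constant no matter
    where they look, so it feels closer and more personal."""
    return pan

# A source straight ahead (0 deg) drifts toward the listener's left ear
# as they turn their head to the right; head-locked music never moves.
drifting = world_locked_pan(source_angle_deg=0, head_yaw_deg=90)
steady = head_locked_pan()
```

A real VR title would lean on its engine’s spatializer rather than hand-rolled panning, but the contrast is the same: world-locked music participates in the 3D soundscape, while head-locked music bypasses it.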
Welcome back to this three-article series that’s bringing together the ideas discussed in five different GDC 2017 audio talks about interactive music! These five speakers explored discoveries they’d made while creating interactivity in the music of their own game projects. We’re looking at these ideas side-by-side to broaden our viewpoint and gain a sense of the “bigger picture” when it comes to the leading-edge thinking on music interactivity in games. We’ve been looking at five interactive music systems discussed in these five GDC 2017 presentations:
In the first article, we examined the basic nature of these interactive systems. In the second article, we contemplated why those systems were used, with some of the inherent pros and cons of each system discussed in turn. So now, let’s get into the nitty-gritty of tools and tips for working with such interactive music systems. If you haven’t read parts one and two of this series, please go do so now and then come back:
Welcome back to our three-article series dedicated to collecting and exploring the ideas discussed in five different GDC 2017 audio talks about interactive music! These five speakers shared ideas they’d developed in the process of creating interactivity in the music of their own game projects. We’re looking at these ideas side-by-side to cultivate a sense of the “bigger picture” when it comes to the leading-edge thinking on music interactivity in games. In the first article, we looked at the basic nature of five interactive music systems discussed in these five GDC 2017 presentations:
Okay, so let’s now contemplate some simple but important questions: why were those systems used? What was attractive about each interactive music strategy, and what were the challenges inherent in using those systems?
By video game music composer Winifred Phillips
The 2017 Game Developers Conference could be described as a densely-packed deep-dive exploration of the state-of-the-art tools and methodologies used in modern game development. This description held especially true for the game audio track, wherein top experts in the field offered a plethora of viewpoints and advice on the awesome technical and artistic challenges of creating great sound for games. I’ve given GDC talks for the past three years now (see photo), and every year I’m amazed at the breadth and diversity of the problem-solving approaches discussed by my fellow GDC presenters. Often I’ll emerge from the conference with the impression that we game audio folks are all “doing it our own way,” using widely divergent strategies and tools.
This year, I thought I’d write three articles to collect and explore the ideas that were discussed in five different GDC audio talks. During their presentations, these five speakers all shared their thoughts on best practices and methods for instilling interactivity in modern game music. By absorbing these ideas side-by-side, I thought we might gain a sense of the “bigger picture” when it comes to the current leading-edge thinking for music interactivity in games. In the first article, we’ll look at the basic nature of these interactive systems. We’ll devote the second article to the pros and cons of each system, and in the third article we’ll look at tools and tips shared by these music interactivity experts. Along the way, I’ll also be sharing my thoughts on the subject, and we’ll take a look at musical examples from some of my own projects that demonstrate a few ideas explored in these GDC talks:
Welcome to the fifth and final installment of my five-part article series on music composition techniques for stimulating tension and suspense in video games. These articles are based on the presentation I gave this year at the popular Game Developers Conference in San Francisco, entitled Homefront to God of War: Using Music to Build Suspense. If you haven’t yet read the previous four articles, you’ll find them here:
Now that we’ve considered the power of Ominous Ambiences, Jarring Jolts, Creepy Clusters, and Drones of Dread, let’s take a look at the last item on our list of suspenseful music composition techniques – Semi Silence.
Welcome to the fourth installment of my five-part article series discussing music composition techniques that heighten tension and suspense for video game projects. These articles are based on the presentation I gave at this year’s Game Developers Conference in San Francisco, entitled Homefront to God of War: Using Music to Build Suspense. If you haven’t read the previous three articles, you’ll find them here:
Before we move on to the next music composition technique in our suspense-building arsenal, I’d like to briefly revisit a video game project we discussed in our last article: the popular Dragon Front VR game for the Oculus Rift, developed by High Voltage Software.
Welcome back to our five-part discussion of the role that video game music can play in enhancing tension and promoting suspenseful gameplay! These articles are based on the presentation I gave at this year’s Game Developers Conference in San Francisco, entitled Homefront to God of War: Using Music to Build Suspense. If you haven’t read the previous two articles, you’ll find them here:
So, now that we’ve discussed ominous atmospheres and jarring jolts, let’s look at the next technique in our arsenal:
The Creepy Cluster technique
As we know, tone clusters are collections of notes packed together to produce unnerving dissonant effects. While it might seem like any cat can walk across a piano and produce unpleasant clusters, well-executed dissonance is actually one of the trickiest techniques we can employ. It’s tremendously potent when used with expert precision.
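As a side note, the “packed together” quality of a cluster can be made concrete with a little equal-temperament arithmetic. This quick sketch (my own illustration, not from the talk) lists the frequencies of a run of adjacent semitones, the tightly spaced notes that give a tone cluster its dissonant bite:

```python
def midi_to_hz(note: int) -> float:
    """Equal-temperament frequency; A4 (MIDI note 69) = 440 Hz."""
    return 440.0 * 2 ** ((note - 69) / 12)

def chromatic_cluster(root: int, size: int) -> list[float]:
    """Frequencies of `size` adjacent semitones starting at `root` --
    neighboring notes close enough to beat against one another."""
    return [midi_to_hz(root + i) for i in range(size)]

# A four-note cluster built on middle C (MIDI 60): C4, C#4, D4, D#4.
cluster = chromatic_cluster(60, 4)
```

Each pair of neighboring notes sits only a semitone (about a 6% frequency difference) apart, which is why the combined sound beats and grinds rather than blending, and why placing and voicing those notes deliberately matters so much more than random cat-on-the-keyboard clusters.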