The Game Developers Conference is almost here! I’m looking forward to giving my presentation soon on “Music in Virtual Reality” (Thursday, March 22nd at 3pm in room 3002 West Hall, Moscone Center, San Francisco). Over the course of the last two years, I’ve composed a lot of music for virtual reality projects, some of which have already hit retail, and some of which will be released very soon! As a result, I’ve spent a lot of time thinking about what role music should play in a virtual reality game. During my GDC talk in March, I’ll be taking my audience through my experiences composing music for four very different VR games – the Bebylon: Battle Royale game from Kite & Lightning, the Dragon Front strategy game from High Voltage Software, the Fail Factory comedy game from Armature Studio, and the Scraper: First Strike RPG-Shooter hybrid from Labrodex Inc. In preparing my GDC presentation, I made sure my talk addressed some of the most important creative and technical hurdles facing video game composers working in VR. However, time constraints meant that some interesting material ended up ‘on the cutting room floor,’ so to speak. So, I’ve written two articles that explore some of the best topics that didn’t make it into my GDC presentation.
VR games currently focus on binaural audio to immerse players in the awesome soundscapes of their virtual worlds. As we know, binaural recording techniques use two microphones, often embedded in the artificial ears of a dummy head (pictured right). By virtue of binaural recording techniques and/or binaural encoding technologies, game audio teams can plunge VR players into convincing aural worlds where sounds are spatially localized in a way that conforms with real-world expectations. The technology of binaural sound continually improves, and recently the expert developers of the Oculus Rift VR headset have refined the quality of their VR sound with two significant upgrades.
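To get a feel for why binaural sound localizes so convincingly, it helps to look at one of the cues these systems model: the interaural time difference (ITD), the tiny delay between a sound reaching one ear and then the other. The sketch below is my own toy illustration (not the Oculus implementation or any shipping audio engine), using Woodworth’s classic spherical-head approximation, with an assumed head radius of 8.75 cm:

```python
import math

# Woodworth's approximation for the interaural time difference (ITD)
# of a distant source, assuming a spherical head:
#   ITD = (r / c) * (theta + sin(theta))
# where r is the head radius, c is the speed of sound, and theta is the
# azimuth of the source (0 = straight ahead, 90 degrees = directly to one side).

def interaural_time_difference(azimuth_deg, head_radius_m=0.0875,
                               speed_of_sound=343.0):
    """Return the arrival-time difference (in seconds) between the two ears."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

# A source directly ahead produces no delay; a source 90 degrees to one
# side produces the maximum delay, roughly two-thirds of a millisecond.
print(interaural_time_difference(0.0))          # prints 0.0
print(round(interaural_time_difference(90.0) * 1000, 2), "ms")  # ~0.66 ms
```

Real binaural encoders go much further than this single cue (level differences, spectral filtering from the outer ear, and full head-related transfer functions), but the sub-millisecond delays this formula produces are exactly the kind of detail the brain uses to place a sound in space.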
Welcome to the third installment of my four-part article series on the core principles of music interactivity, including video demonstrations and supplementary materials that take these abstract concepts and make them more concrete. In Part One of this series, we took a look at a simple example demonstrating the Horizontal Re-Sequencing model of musical interactivity, as it was used in the music I composed for the Speed Racer videogame from Warner Bros. Interactive. Part Two of this series looked at the more complex Horizontal Re-Sequencing music system of the Spore Hero game from Electronic Arts. So now let’s move on to another major music interactivity model used by video game composers – Vertical Layering.
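Before digging in, it may help to sketch the basic idea of Vertical Layering in code. In this model, a piece of music is delivered as several synchronized stems (layers) that all play at once, and the game fades layers in or out as gameplay changes. The example below is a hypothetical illustration of that mixing logic (the layer names and the intensity mapping are my own assumptions, not taken from any particular game):

```python
class VerticalLayeringMixer:
    """Toy model of Vertical Layering: all stems run in sync at all times,
    and a single 'intensity' value decides which stems are audible."""

    def __init__(self, layers):
        # layers: stem names ordered from the base layer to the topmost layer
        self.layers = list(layers)
        self.volumes = {name: 0.0 for name in self.layers}

    def set_intensity(self, intensity):
        """Map a 0.0-1.0 gameplay intensity to per-layer volumes.

        Layers fully below the intensity threshold play at full volume;
        the layer sitting at the threshold is faded proportionally, so
        transitions are smooth rather than abrupt.
        """
        intensity = max(0.0, min(1.0, intensity))
        position = intensity * len(self.layers)
        for i, name in enumerate(self.layers):
            # 1.0 for layers below the position, a fraction at the boundary
            self.volumes[name] = max(0.0, min(1.0, position - i))
        return self.volumes


mixer = VerticalLayeringMixer(["percussion", "strings", "brass", "choir"])
mixer.set_intensity(0.5)   # calm moment: only the lower stems are audible
mixer.set_intensity(1.0)   # full combat: every stem at full volume
```

Because every stem keeps playing in lockstep whether it is audible or not, the music never loses its place; only the mix changes. That synchrony is what distinguishes Vertical Layering from Horizontal Re-Sequencing, where the music moves between different sections over time.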
Last week, it was my honor and pleasure to give a presentation at the Game Developers Conference in San Francisco. My talk was entitled “From Total War to Assassin’s Creed: Music for Mobile Games.” It focused on the best and most effective methods for composing and implementing music in portable gaming, and it was structured for the benefit of video game composers and game audio pros. As part of the presentation, I played short excerpts of music that I composed for several of my top mobile and handheld video game projects. Now that GDC is over, I thought I’d provide streaming links to some of the complete music tracks that I featured during my presentation, in case attendees were curious about the complete pieces of music. So, without further ado, here are tracks from my GDC 2016 talk!
Assassin’s Creed Liberation
The Assassin’s Creed Liberation game was released by Ubisoft for the PlayStation Vita, and delivered an immersive experience from the popular Assassin’s Creed franchise. The game was designed specifically for a portable system, and as such, all aspects of the design were adjusted to cater specifically to a portable gaming experience, including the music.
This week I thought we’d check in with some of the top orchestral video game music concert tours currently underway. We’ll take a look at some reviews of 2015 performances from the respective tours, and we’ll also take a look at video from some of the most recent concert performances.
The Legend of Zelda: Symphony of the Goddesses
Originating as a simple four-minute overture performed at a Nintendo press event in 2011, Symphony of the Goddesses kicked off as a full-fledged concert tour in January 2012 and currently has 33 dates scheduled for 2016 that will take the popular tour all around the world. The concert’s program lineup focuses exclusively on famous music from the Legend of Zelda games. In a review of the September 25th 2015 performance at the Providence Performing Arts Center in Rhode Island, Broadway World critic Andria Tieman wrote, “Overall, this was a night of fantastic music, excellent people-watching and a fun, visual performance. This is something that Zelda fans should certainly seek out.” Here’s a video clip from the Oct. 30th 2015 broadcast of the Late Show with Stephen Colbert, in which the Symphony of the Goddesses tour performed their Legend of Zelda Medley:
Last week, I spoke at the Montreal International Game Summit. It was a fantastic experience, and I wanted to share a video excerpt of my speech with you! The speech was called, “Music, the Brain, and the Three Levels of Immersion.” I’m grateful to Clement Galiay and Nicolas Bertrand-Verge of the MIGS for the opportunity to speak at this great event! Also, I’d like to give a shout-out to Jean-Frederic Vachon for the tremendous support and encouragement for me to get involved in the MIGS — thanks, JF!!
More about the Montreal International Game Summit:
MIGS was founded in 2004 to meet the needs of the video game sector, which currently represents close to 9,000 workers in Quebec. Ten years later, its mission remains the same: developing the transfer of knowledge and expertise, increasing exposure for Quebec players abroad, and promoting exchanges and communications between stakeholders, making MIGS the East Coast’s leading professional-only event for the games industry.
Music, the Brain, and the Three Levels of Immersion
Music has the power to deepen player immersion through psychological effects documented in scientific research. This talk explored the influence of music on the brain, and how these effects can aid game designers in meeting the criteria necessary for the “Three Levels of Immersion.”
I’m very excited to share that my book, A Composer’s Guide to Game Music, has been reviewed by the nation’s leading writer on the subject of music for films and television, Jon Burlingame! As the most respected journalist in the field of music for visual media, Jon Burlingame writes regularly for Variety, and also contributes to The New York Times, Los Angeles Times, the Washington Post, Newsday, Emmy, Premiere and The Hollywood Reporter.
His review article about my book appeared in the Film Music Society features section. He described the book as a “beautifully organized, intelligently written book about music for games,” and said that “gamers as well as composers may be fascinated by her thorough analysis of what music works, and why, in various game genres.”
The Palais des congrès de Montréal convention center
If you’re attending the event this year, please feel free to say hi! It would be great to meet you! Also, I’ll be very happy to sign your copy of my book, A Composer’s Guide to Game Music, so please bring it along! Here’s the official description of my upcoming talk at the Montreal International Game Summit:
Music, the Brain, and the Three Levels of Immersion
Game Music Talk / Game Audio Track – 4pm November 11th – Room 522 – Palais des congrès de Montréal
Music has the power to deepen player immersion through psychological effects documented in scientific research. This talk will explore the influence of music on the brain, and how these effects can aid game designers in meeting the criteria necessary for the “Three Levels of Immersion.” According to research, these levels of immersion require specific mental states that music can help the player to achieve. Through a discussion of several scientific studies, the talk will investigate the power of music to alter time perception, deepen our appreciation of visual details, enhance our mental prowess, increase the intrinsic motivation of activities, change our understanding of plot, and enhance both our attention spans and our memory capacity. The talk will also explore the techniques of music composition and implementation that provide practical strategies for composers, audio teams and game designers to maximize the ability of game music to help players achieve total immersion.
Attendees will gain an understanding of the effects of music on the brain, and how music can alter the experience of the player through specific documented effects.
Study data will be discussed, including the “Three Levels of Immersion” from the Conference on Human Factors in Computing Systems (sponsored by the Special Interest Group on Computer-Human Interaction), as well as several research studies on the relationship between music and cognitive function.
Tips and strategies will be explored for the application of practical techniques to exploit the power of music to alter the mental state of the player, thus enabling deeper immersion in the gameplay experience.