On April 6th I was honored to give a lecture at the Thomas Jefferson Building of the Library of Congress in Washington DC (pictured right). As a video game composer, I’d been invited by the Music Division of the Library of Congress to deliver the concluding presentation of their first-ever event celebrating popular video game music. My lecture was the very first video game music composition lecture ever given at the Library of Congress, and I was humbled to accept the invitation and have it included in the 2018-2019 season of concerts and symposia from the Library of Congress.
My presentation covered many topics that I’ve written about in previous articles, including horizontal re-sequencing, vertical layering, and interactive MIDI-based composition. I explored the various roles that music has played in famous games from the earliest days of game design (like Frogger and Ballblazer), and discussed how music has been implemented in some of the awesome games of the modern era (like one of my own projects, Assassin’s Creed Liberation).
My lecture drew a full house in the Whittall Pavilion at the Library of Congress. The audience gave me both a warm welcome and lots of great questions following the conclusion of my lecture. Afterwards, the discussion continued during a book signing event that was kindly hosted by the Library of Congress shop, where I was pleased to sign copies of my book A Composer’s Guide to Game Music. I also got to talk personally with quite a few audience members. Such an engaging and insightful crowd! It was a pleasure getting to know these lovely people, and I really enjoyed the lively conversation – I had the best time!
Delighted you’re here! I’m very pleased to share that over the next two months I’ll be speaking at two fantastic events focusing on music in video games! My two presentations will explore the unique structure and character of video game music, and how it helps to better envelop players in the worlds that game designers have created. I thought that this article might be a good opportunity to delve into some of the ideas that form the basis of my two upcoming talks. First, I’d like to share some details about the presentations I’ll be giving.
The Library of Congress has invited me to speak this April as a part of their “Augmented Realities” video game music festival. My presentation, “The Interface Between Music Composition and Game Design,” will take place at the Library of Congress in Washington DC. I’m very excited to participate in this event, which will be the first of its kind hosted by the “Concerts from the Library” series at the Library of Congress! The “Augmented Realities” video game music festival will also include panels on video game music history and preservation presented by distinguished curators and archivists at the Library of Congress, a special documentary screening that explores the ChipTunes movement, and a live “game creation lab.” My presentation will be the concluding lecture of the festival, and I’m honored to speak at such an illustrious event! If you find yourself in the Washington DC area on April 6th 2019, you’re very welcome to come to my lecture at the Library of Congress! Tickets are free (first come, first served), and they’re available now via EventBrite.
But before my lecture at the Library of Congress, I’ll be making a trip to San Francisco for the famous Game Developers Conference that takes place this month. For the past few years I’ve been excited and honored to be selected as a Game Developers Conference speaker in the Game Audio track, and I’m happy to share that I’ll be speaking again this month in San Francisco at GDC 2019! My talk this year is entitled “How Music Enhances Virtual Presence.”
The Game Developers Conference is almost here! I’m looking forward to giving my presentation soon on “Music in Virtual Reality” (Thursday, March 22nd at 3pm in room 3002 West Hall, Moscone Center, San Francisco). Over the course of the last two years, I’ve composed a lot of music for virtual reality projects, some of which have already hit retail, and some of which will be getting released very soon! As a result, I’ve spent a lot of time thinking about what role music should play in a virtual reality game. During my GDC talk in March, I’ll be taking my audience through my experiences composing music for four very different VR games – the Bebylon: Battle Royale game from Kite & Lightning, the Dragon Front strategy game from High Voltage Software, the Fail Factory comedy game from Armature Studio, and the Scraper: First Strike RPG-Shooter hybrid from Labrodex Inc. In preparing my GDC presentation, I made sure my talk addressed some of the most important creative and technical hurdles facing video game composers working in VR. However, time constraints ensured that some interesting info ended up ‘on the cutting room floor,’ so to speak. So, I’ve written two articles that explore some of the best topics that didn’t make it into my GDC presentation.
VR games currently rely on binaural audio to immerse players in the awesome soundscapes of their virtual worlds. As we know, binaural recording techniques use two microphones, often embedded in the artificial ears of a dummy head (pictured right). By virtue of the popular binaural recording technique and/or binaural encoding technologies, game audio teams can plunge VR players into convincing aural worlds where sounds are spatially localized in a way that conforms with real-world expectations. The technology of binaural sound continually improves, and recently the developers of the Oculus Rift VR headset refined the quality of their VR sound with two significant upgrades.
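To make the idea of spatial localization a bit more concrete, here’s a small illustrative sketch of one of the cues that binaural audio reproduces: the interaural time difference (ITD), the tiny delay between a sound reaching our near ear and our far ear. This uses Woodworth’s classic spherical-head approximation; the head radius and setup here are generic textbook assumptions, not tied to any particular engine or headset.

```python
# Illustrative sketch: interaural time difference (ITD), one of the cues
# that binaural audio reproduces so sounds feel spatially localized.
# Woodworth's spherical-head approximation; head radius is a typical
# average value (an assumption, not from any specific VR toolkit).

import math

HEAD_RADIUS_M = 0.0875   # average human head radius, in meters
SPEED_OF_SOUND = 343.0   # meters per second at room temperature

def itd_seconds(azimuth_deg):
    """ITD for a distant source at the given azimuth
    (0 = straight ahead, 90 = directly to one side),
    per Woodworth's formula: ITD = (r / c) * (sin(theta) + theta)."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND) * (math.sin(theta) + theta)

# A source straight ahead arrives at both ears at the same moment,
# while a source directly to one side lags at the far ear by well
# under a millisecond -- a delay our brains readily use for localization.
print(f"{itd_seconds(0) * 1000:.2f} ms")
print(f"{itd_seconds(90) * 1000:.2f} ms")
```

Even this simplified model shows why binaural rendering is so effective: the delays involved are fractions of a millisecond, yet they carry a strong sense of direction.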
Welcome to the third installment of my four-part article series on the core principles of music interactivity, including video demonstrations and supplementary materials that take these abstract concepts and make them more concrete. In Part One of this series, we took a look at a simple example demonstrating the Horizontal Re-Sequencing model of musical interactivity, as it was used in the music I composed for the Speed Racer Videogame from Warner Bros. Interactive. Part Two of this series looked at the more complex Horizontal Re-Sequencing music system of the Spore Hero game from Electronic Arts. So now let’s move on to another major music interactivity model used by video game composers – Vertical Layering.
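Before diving in, here’s a minimal sketch of the basic idea behind Vertical Layering: all the musical stems play in lockstep, and the game raises or lowers the volume of individual layers as the gameplay intensity changes, rather than switching to a different track. The mixer class and intensity mapping below are hypothetical and purely illustrative, not drawn from any particular game engine or middleware.

```python
# Minimal sketch of Vertical Layering (hypothetical mixer, illustrative only).
# All stems start together and stay in sync; rising gameplay intensity
# fades additional layers in instead of restarting or replacing the cue.

from dataclasses import dataclass

@dataclass
class Layer:
    name: str
    volume: float = 0.0  # 0.0 = silent, 1.0 = full volume

class VerticalLayeringMix:
    """Keeps all layers running in lockstep; game state drives their volumes."""

    def __init__(self, layer_names):
        self.layers = [Layer(n) for n in layer_names]

    def set_intensity(self, intensity):
        """Map a 0..1 gameplay-intensity value onto the layer stack.

        Each layer fades in across its own slice of the intensity range,
        so rising tension adds instruments on top of what's already playing.
        """
        step = 1.0 / len(self.layers)
        for i, layer in enumerate(self.layers):
            # How far intensity has progressed into this layer's slice:
            progress = (intensity - i * step) / step
            layer.volume = max(0.0, min(1.0, progress))
        return {l.name: round(l.volume, 2) for l in self.layers}

mix = VerticalLayeringMix(["pads", "percussion", "melody", "brass"])
print(mix.set_intensity(0.0))   # calm exploration: everything faded out
print(mix.set_intensity(0.6))   # tension rising: lower layers up, brass still out
print(mix.set_intensity(1.0))   # full combat: the complete ensemble
```

Because the stems never stop or restart, transitions between intensity levels stay musically seamless, which is exactly the appeal of this model compared to hard track switches.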
Last week, it was my honor and pleasure to give a presentation at the Game Developers Conference in San Francisco. My talk was entitled “From Total War to Assassin’s Creed: Music for Mobile Games.” It focused on the best and most effective methods for composing and implementing music in portable gaming, and was structured for the benefit of video game composers and game audio pros. As part of the presentation, I played short excerpts of music that I composed for several of my top mobile and handheld video game projects. Now that GDC is over, I thought I’d provide streaming links to some of the complete music tracks that I featured during my presentation, in case attendees were curious about the full pieces. So, without further ado, here are tracks from my GDC 2016 talk!
Assassin’s Creed Liberation
The Assassin’s Creed Liberation game was released by Ubisoft for the PlayStation Vita, and delivered an immersive experience from the popular Assassin’s Creed franchise. The game was designed specifically for a portable system, and as such, all aspects of the design were adjusted to cater specifically to a portable gaming experience, including the music.
This week I thought we’d check in with some of the top orchestral video game music concert tours currently underway. We’ll take a look at some reviews of 2015 performances from the respective tours, and we’ll also take a look at video from some of the most recent concert performances.
The Legend of Zelda: Symphony of the Goddesses
Originating as a simple four-minute overture performed at a Nintendo press event in 2011, Symphony of the Goddesses kicked off as a full-fledged concert tour in January 2012 and currently has 33 dates scheduled for 2016 that will take the popular tour all around the world. The concert’s program lineup focuses exclusively on famous music from the Legend of Zelda games. In a review of the September 25th 2015 performance at the Providence Performing Arts Center in Rhode Island, Broadway World critic Andria Tieman wrote, “Overall, this was a night of fantastic music, excellent people-watching and a fun, visual performance. This is something that Zelda fans should certainly seek out.” Here’s a video clip from the Oct. 30th 2015 broadcast of the Late Show with Stephen Colbert, in which the Symphony of the Goddesses tour performed their Legend of Zelda Medley:
Last week, I spoke at the Montreal International Game Summit. It was a fantastic experience, and I wanted to share a video excerpt of my speech with you! The speech was called, “Music, the Brain, and the Three Levels of Immersion.” I’m grateful to Clement Galiay and Nicolas Bertrand-Verge of the MIGS for the opportunity to speak at this great event! Also, I’d like to give a shout-out to Jean-Frederic Vachon for the tremendous support and encouragement for me to get involved in the MIGS — thanks, JF!!
More about the Montreal International Game Summit:
MIGS was founded in 2004 to meet the needs of the video game sector, which currently represents close to 9,000 workers in Quebec. Ten years later, its mission remains the same: developing the transfer of knowledge and expertise, increasing exposure for Quebec players abroad, and promoting exchanges and communications between stakeholders. This makes MIGS the East Coast’s leading professional-only event for the games industry.
Music, the Brain, and the Three Levels of Immersion
Music has the power to deepen player immersion through psychological effects documented in scientific research. This talk explored the influence of music on the brain, and how these effects can aid game designers in meeting the criteria necessary for the “Three Levels of Immersion.”