  • Audio Design For Interactive Narrative VR Experiences

    [11.15.18]
    - Larry Chang

  • 3. Spatial Audio Placement in VR

    Now that we have created all the sound files, how should we place them in a VR environment? We want the player to feel like they are actually in that space, but at the same time we want to reinforce the clarity of key audio cues so the player won't miss them. In general, we separated the sounds into two groups: in the first group, audio is placed at fixed locations in the environment; in the second group, audio is placed at locations relative to the player.

    The first group includes our ambience, collision/interaction sounds, diegetic music, and some of the dialogue. Attaching these sounds to a specific object or a fixed place in the space helps build the audio environment around the player. Take our surrounding ambience as an example: we designed a five-speaker system for each room, attached a mono sound file to each speaker, and placed the speakers in the space to simulate how a 5.1 speaker setup is arranged in the real world.
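
    To make this concrete, here is a minimal Unity C# sketch of one such fixed "speaker": a hypothetical RoomSpeaker component that plays a mono ambience file at a fixed point in the room, fully spatialized so it stays put as the player moves around. The class and field names are illustrative, not the project's actual code.

        using UnityEngine;

        // Illustrative component: plays one mono channel of the room's ambience
        // bed from a fixed position, like one speaker of the five-speaker layout.
        public class RoomSpeaker : MonoBehaviour
        {
            public AudioClip monoAmbienceClip;   // one channel of the ambience recording

            void Start()
            {
                AudioSource source = gameObject.AddComponent<AudioSource>();
                source.clip = monoAmbienceClip;
                source.loop = true;
                source.spatialBlend = 1f;        // 1 = fully 3D, so the position in the room matters
                source.rolloffMode = AudioRolloffMode.Logarithmic;
                source.Play();
            }
        }

    Placing five of these objects around each room approximates the five-speaker arrangement described above.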

    The second group of sounds, however, moves to follow the player's location. This is sometimes called "head-locking" the audio. Examples include some of the dialogue, some of the non-diegetic music, and audio from the cutscenes. The purpose of this group is to make sure the player receives the full information in the audio content. Take the voiceover as an example: the doctor character's voice gives you orders about what to do throughout the experience. We keep that voice's audio source right behind the player's head at all times. This not only lets the player always receive clear information from the same orientation, but it also matches the setting of the story.
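
    As a rough sketch of that head-locked setup in Unity C#, one option is to parent the voiceover's audio source to the camera transform and offset it slightly behind the head. The component and field names here are hypothetical.

        using UnityEngine;

        // Illustrative component: keeps a voiceover source "head-locked" by
        // parenting it behind the player's head so it follows every movement.
        public class HeadLockedVoice : MonoBehaviour
        {
            public Transform playerHead;     // e.g. the VR camera transform
            public AudioClip voiceoverClip;

            void Start()
            {
                // Follow the head, parked a short distance behind it.
                transform.SetParent(playerHead, worldPositionStays: false);
                transform.localPosition = new Vector3(0f, 0f, -0.3f);   // ~30 cm behind

                AudioSource source = gameObject.AddComponent<AudioSource>();
                source.clip = voiceoverClip;
                source.spatialBlend = 1f;    // keep it 3D so it still reads as coming from behind
                source.Play();
            }
        }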

    4. Mixing in VR

    Now that we have all the sounds placed in VR, how are we going to mix them?

    We figured that it would be easier to manage all the audio sources by putting them into bigger groups (SFX, Music, Dialogue, and Ambience), assigning each group to its own channel in the Unity mixer, and mixing by group instead of by individual file. Of course, for the files grouped into one channel, I had already balanced their relative volumes in a DAW before bringing them into the game engine. That said, it is still very important to check your mix in VR, because audio that you balanced in the DAW will definitely sound different in the headset. It's not a one-time task but a back-and-forth process.
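
    As an illustration of that grouping in Unity C#, the sketch below routes a source into a mixer group and adjusts the group's volume at runtime. It assumes a mixer with a group named "Dialogue" and an exposed volume parameter named "DialogueVolume"; those names are placeholders, not the project's actual setup.

        using UnityEngine;
        using UnityEngine.Audio;

        // Illustrative routing: send a dialogue source to the Dialogue group
        // and mix by group rather than by individual file.
        public class MixerRouting : MonoBehaviour
        {
            public AudioMixer mainMixer;
            public AudioSource dialogueSource;

            void Start()
            {
                // FindMatchingGroups returns every group whose path matches the string.
                AudioMixerGroup dialogueGroup = mainMixer.FindMatchingGroups("Dialogue")[0];
                dialogueSource.outputAudioMixerGroup = dialogueGroup;
            }

            // Adjust the whole Dialogue group while checking the mix in the headset (value in dB).
            public void SetDialogueVolume(float volumeDb)
            {
                mainMixer.SetFloat("DialogueVolume", volumeDb);
            }
        }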

    5. Conclusion

    To sum up, working on collision and interaction sounds and putting them in the prototype as early as possible helps the development process for both designers and audio professionals. Understanding how to use audio to guide the players enriches interactive storytelling by conveying information to them and shaping their state of mind. Adopting the concept of signature music also lets the players identify with the characters and plot of the story and have a more memorable experience. Last but not least, a good spatial audio placement strategy and mixing technique not only improves the efficiency of the audio pipeline but also gives a more immersive result.
