Source Sound is collaborating with others and capturing 360 sound for VR environments.
By Jennifer Walden
Everyone wants it, but not everyone can make it. No, I’m not talking about money. I’m talking about virtual reality content.
Let’s say you want to shoot a short VR film. You’ve got a solid script, a cast of known actors, you’ve got a 360-degree camera and a pretty good idea of how to use it, but what about the sound? The camera has a built-in mic, but will that be enough coverage? Should the cast be mic’d as they would be for a traditional production? How will the production sound be handled in post?
Tim Gedemer, owner/sound supervisor at Source Sound in Woodland Hills, California, can help answer these questions. “In VR, we are audio directors,” he says. “Our services include advising clients at the script level on how they should be shooting their visuals to be optimal for sound.”
As audio directors, Source Sound walks their clients through every step of the process, from production to distribution. Starting with the recording on set, they manage all of the technical aspects of sound file management through production, and then guide their clients through the post sound process, both creatively and technically.
They recommend what technology should be used, how clients should be using it and what deals they need to make to sort out their distribution. “It really is a point-to-point service,” says Gedemer. “We decided early on that we needed to influence the entire process, so that is what we do.”
Two years ago, Dolby Labs referred Jaunt Studio to Source Sound for the studio’s first VR film gig. Gedemer explains that because of Source Sound’s experience with games and feature films, Dolby felt they would be a good match to handle Jaunt’s creative sound needs while Dolby worked with Jaunt on the technical challenges.
Jaunt’s Kaiju Fury! premiered at the 2015 Sundance Film Festival. The experience puts the viewer in the middle of an epic Godzilla-like monster battle. “They realized their film needed cinematic sound, so Dolby called us up and asked if we’d like to get involved. We said, ‘We’re really busy with projects, but show us the tech and maybe we’ll help.’ We were uninterested at first, figuring it was going to be gimmicky, but I went to San Francisco and I looked at their first test, and I was just shocked. I had never seen anything like that before in my life. I realized, in that first moment of putting on those goggles, that we needed to do this.”
Kaiju Fury! was just the start. Source Sound completed three more VR projects for Jaunt, all within a week. There was the horror VR short film called Black Mass, a battle sequence called The Mission and the Atmos VR mastering of Paul McCartney’s Live and Let Die in concert.
Gedemer admits, “It was just insane. No one had ever done anything like this and no one knew how to do it. We just said, ‘Okay, we’ll just stay up for a week, figure all of that out and get it done.’”
Adjusting The Workflow
At first, their Pro Tools-based post sound workflow was similar to a traditional production, says Gedemer, “because we didn’t know what we didn’t know. It was only when we got into creating the final mix that we realized we didn’t have the tools to do this.”
Specifically, how could they experience the full immersion of the 360-degree video and concurrently make adjustments to the mix? On that first project, there was no way to slave the VR picture playing back through the Oculus headgear to the sound playing back via Pro Tools. “We had to manually synchronize,” explains Gedemer. “Literally, I would watch the equirectangular video that we were working with in Pro Tools, and at the precise moment I would just press play on the laptop, playing back the VR video through the Oculus HMD to try and synchronize it that way. I admit I got pretty good at that, but it’s not really the way you want to be working!”
Since that time, Dolby has implemented timecode synchronization and a video player that will play back the VR video through the Oculus headset. Now the Source Sound team can pick up the Oculus and it will be synchronized to the Pro Tools session.
Working Together For VR
Over the last few years, Source Sound has been collaborating with tech companies like Dolby, Avid, Oculus, Google, YouTube and Nokia on developing audio-related VR tools, workflow solutions and spec standards that will eventually become available to the wider audio post industry.
“We have this holistic approach to how we want to work, both in virtual and augmented reality audio,” says Gedemer. “We’re working with many different companies, beta testing technology and advising on what they should be thinking about regarding VR sound — with a keen eye toward new product development.”
Since Kaiju Fury!, Source Sound has continued to create VR experiences with Jaunt. They have also worked with other VR content creators, including the Emblematic Group (founded by “the godmother of VR,” Nonny de la Peña), 30 Ninjas (founded by Doug Liman, director of The Bourne Identity and Edge of Tomorrow), Fusion Media, Mirada, Disney, Google, YouTube and many others.
Currently, Source Sound is working with Fusion Media on a project with NASA called Mars 2030, which takes a player to Mars as an astronaut and lets them experience what life might be like in a Mars habitat. NASA believes human exploration of Mars may be possible in the year 2030, so why not let people see and feel what it’s like?
The project has given Source Sound unprecedented access to the NASA facilities and engineers. One directive for Mars 2030 is to be as accurate as possible, with information on Mars coming directly from NASA’s Mars missions. For example, NASA collected information about the surface of Mars, such as the layout of all the rocks and the type of sand covering the surface. All of that data was loaded into the Unreal Engine, so when a player steps out of the habitat in the Mars 2030 experience and walks around, that surface is going to be the exact surface that is on Mars. “It’s not a facsimile,” says Gedemer. “That rock is actually there on Mars. So in order for us to be accurate from an audio perspective, there’s a lot that we have to do.”
In the experience the player gets to drive the Mars Rover. At NASA in Houston, there are multiple iterations of the rover that are being developed for this mission. They also have a special area that is set up like the Mars surface with a few craters and rocks.
For audio capture, Gedemer and sound effects recordist John Fasal headed to Houston with Sound Devices recorders and a slew of mic options. While the rover is too slow to do burnouts and donuts, Gedemer and Fasal were able to direct a certified astronaut driver and record the rover from every relevant angle. They captured sounds and ambiences from the various habitats on site. “There is a new prototype space suit that is designed for operation on Mars, and as such we will need to capture all the relevant sound associated with it,” says Gedemer. “We’ll be looking into helmet shape and size, communication systems, life support air flow, etc. when recreating this in the Unreal Engine.”
Another question the sound team needs to address is, “What does it sound like out on the surface of Mars?” Mars has an atmosphere, but the tricky thing is that a human can never actually walk around on the surface without wearing a suit. Sounds traveling through the Martian atmosphere will sound different from sounds traveling through Earth’s atmosphere, and additional special considerations need to be made for how the suit will impact sound reaching the astronaut’s ears.
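The article doesn’t spell out the physics, but a first-order way to see why Mars sounds different is the ideal-gas speed of sound. The sketch below is an illustration using textbook approximations (CO2-dominated atmosphere, mean surface temperature around 210 K), not data from the NASA project; the thin ~600 Pa atmosphere would additionally attenuate high frequencies far more than Earth’s air does.

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def speed_of_sound(gamma, molar_mass_kg, temp_k):
    """Ideal-gas speed of sound: c = sqrt(gamma * R * T / M)."""
    return math.sqrt(gamma * R * temp_k / molar_mass_kg)

# Mars: ~95% CO2 (gamma ~1.29, M ~0.0440 kg/mol), mean surface temp ~210 K
c_mars = speed_of_sound(1.29, 0.0440, 210)

# Earth: air (gamma ~1.40, M ~0.0290 kg/mol) at a mild ~288 K
c_earth = speed_of_sound(1.40, 0.0290, 288)

print(f"Mars: ~{c_mars:.0f} m/s, Earth: ~{c_earth:.0f} m/s")
```

The estimate lands near 230 m/s for Mars versus about 340 m/s on Earth, so the same event would arrive later and, through the thin CO2 atmosphere, noticeably duller.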
“Only certain sounds and/or frequencies will penetrate the suit, and if it is loud enough to penetrate the suit, what is it going to sound like to the astronaut?” asks Gedemer. “So we are trying to figure out some of these technical things along the way. We hope to present a paper on this at the upcoming AES Conference on Audio for Virtual and Augmented Reality.”
Another interesting project at Source Sound is the work they’re doing with Nokia to develop specialized audio technology for live broadcasts in VR. “We are currently the sole creative provider of spatial audio for Nokia’s VR broadcasting initiative,” reveals Gedemer. Source Sound has been embedded with the Nokia Ozo Live team at events where they have been demonstrating their technology. They were part of the official Ozo Camera Launches in Los Angeles and London. They captured and spatialized a Los Angeles Lakers basketball game at the Staples Center. And once again they teamed up with Nokia at their NAB event this past spring.
“We’ve been working with them very closely on the technology that they are developing for live capture and distribution of stereoscopic visual and spatial audio in VR. I can’t elaborate on any details, but we have some very cool things going on there.”
However, Gedemer does break down one requirement that separates a live VR broadcast from a cinematic VR experience, such as Invisible, the multi-episode VR series that Source Sound and Doug Liman’s 30 Ninjas are currently collaborating on.
For a live broadcast you want an accurate representation of the event, but for a cinematic experience the opposite is true. Accuracy is not the objective. A cinematic experience needs a highly curated soundtrack in order to tell the story.
Gedemer elaborates, “The basic premise is that, for VR broadcasts you need to have an accurate audio representation of camera location. There is the matter of proper perspective to attend to. If you have a multi-camera shoot, every time you change camera angles for the viewer, you change perspective, and the sound needs to follow. A traditional live environment has a stereo or 5.1 mix that stays the same no matter the camera angle; in our opinion, that approach is not adequate for true VR. We think Nokia is on the right track, and we are helping them perfect the finer points. To us that is truly exciting.”
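In spatial-audio terms, making the sound “follow” a camera or head turn is a sound-field rotation. Nokia’s actual pipeline isn’t public, so the sketch below is only an illustrative example using first-order Ambisonics (B-format), where a yaw turn is a simple rotation of the X and Y channels:

```python
import math

def rotate_bformat_yaw(w, x, y, z, yaw_rad):
    """Rotate one first-order Ambisonic (B-format) sample about the
    vertical axis so the sound field tracks a head/camera turn.
    W (omni) and Z (height) are unchanged by a pure yaw rotation."""
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    x_rot = x * c + y * s
    y_rot = -x * s + y * c
    return w, x_rot, y_rot, z

# A source hard left (azimuth +90 degrees) encodes as X=0, Y=1.
# Turning the head 90 degrees to the left should bring it to the front.
w, x, y, z = rotate_bformat_yaw(1.0, 0.0, 1.0, 0.0, math.radians(90))
```

After the rotation the source sits dead ahead (X near 1, Y near 0), which is exactly the perspective change Gedemer describes when a viewer, or a broadcast cut, swings to a new angle.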
Jennifer Walden is a New Jersey-based writer and audio engineer.