By Beth Marchant
Shooting immersive content is one thing, but posting it for an ever-changing set of players and headsets is a whole other multidimensional can of beans.
With early help from software companies that have developed off-the-shelf ways to tackle VR post — and global improvements to their storage and networking infrastructures — some facilities are diving into immersive content by adapting their existing post suites with a hybrid set of new tools. As with everything else in this business, it’s an ongoing challenge to stay one step ahead.
New York- and Los Angeles-based motion graphics and VFX post house The Molecule leapt into the VR space more than a year and a half ago when it fused The Foundry’s Nuke with the open-source panoramic photo-stitching software Hugin. Then CEO Chris Healer took the workflow one step further, developing an algorithm that rendered stereoscopic motion graphics spherically in Nuke.
Today, those developments have evolved into a robust pipeline that fuels The Molecule’s work for Conan O’Brien’s eponymous TBS talk show, The New York Times’s VR division and commercial work. “It’s basically eight or ten individual nodes inside Nuke that complete one step or another of the process,” says Healer. “Some of them overlap with Cara VR,” The Foundry’s recently launched VR plug-in for Nuke, “but all of it works really well for our artists. I talk to The Foundry from time to time and show them the tools, so there’s definitely an open conversation there about what we all need to move VR post forward.”
Collaborating with VR production companies like SuperSphere, Jaunt and Pixvana in Seattle, The Molecule is heading first where mass VR adoption seems likeliest. “The New York Times, for example, wants to have a presence at film festivals and new technology venues, and is trying to get out of the news-only business and into the entertainment-provider business. And the job for Conan was pretty wild — we had to create a one-off gag for Comic-Con that people would watch once and go away laughing to the next thing. It’s kind of a cool format.”
Healer’s team spent six weeks on the three-minute spot. “We had to shoot plates, model characters, animate them, composite it, build a game engine around it, compile it, get approval and iterate through that until we finished. We delivered 20 or so precise clips that fit into a game engine design, and I think it looks great.”
Healer says the VR content The Molecule is posting now is, like the Conan job, a slight variation on more typical recent VR productions. “I think that’s also what makes VR so exciting and challenging right now,” he says. “Everyone’s got a different idea about how to take it to the next level. And a lot of that is in anticipation of AR (augmented reality) and next-generation players/apps and headsets.
“The Steam store,” the premier place online to find virtual content, “has content that supports multiple headsets, but not all of them.” He believes that will soon gel into a more unified device driver structure, “so that it’s just VR, not Oculus VR or Vive VR. Once you get basic head tracking together, then there’s the whole next thing: Do you have a controller of some kind, are you tracking in positional space, do you need to do room set up? Do we want wands or joysticks or hand gestures, or will keyboards do fine? What is the thing that wins? Those hurdles should solidify in the next year or two. The key factor in any of that is killer content.”
The biggest challenge facing his facility, and anyone doing VR post right now, he says, is keeping pace with changing resolutions and standards. “It used to be that 4K or 4K stereo was a good deliverable and that would work,” says Healer. “Now everything is 8K or 10K, because there’s this idea that we also have to future-proof content and prepare for next-gen headsets. You end up with a lot of new variables, like frame rate and resolution. We’re working on a stereo commercial right now, and just getting the footage of one shot converted from only six cameras takes almost 3TB of disk space, and that’s just the raw footage.”
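Healer’s storage figure is easy to sanity-check with a rough back-of-the-envelope calculation. The sketch below is illustrative only; the bit depth, frame rate and take length are assumptions, not the commercial’s actual specs:

```python
# Back-of-envelope estimate of raw footage size for a multi-camera VR rig.
# All numbers here are illustrative assumptions, not figures from the article.

def raw_footage_gb(cameras, minutes, width, height, fps,
                   bits_per_pixel=10, bayer=True):
    """Rough uncompressed-raw size in gigabytes for one take across a rig."""
    pixels_per_frame = width * height
    # A Bayer-pattern sensor records one color sample per photosite;
    # debayered RGB would be roughly three times larger.
    samples = pixels_per_frame * (1 if bayer else 3)
    bytes_per_frame = samples * bits_per_pixel / 8
    frames = minutes * 60 * fps
    return cameras * frames * bytes_per_frame / 1e9

# Six 4K cameras rolling for ten minutes at 60 fps:
size = raw_footage_gb(cameras=6, minutes=10, width=4096, height=2160, fps=60)
print(f"{size:.0f} GB")  # prints "2389 GB" -- a few terabytes per shot
```

At these assumed settings a single six-camera shot already lands in the multi-terabyte range, which is consistent with the scale Healer describes.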
When every client suddenly wants to dip their toes into VR, how does a post facility respond? Healer thinks the onus is on production and post services to provide as many options as possible while using their expertise to blaze new paths. “It’s great that everyone wants to experiment in the space, and that puts a certain creative question in our field,” he says. “You have to seriously ask of every project now, does it really just need to be plain-old video? Or is there a game component or interactive component that involves video? We have to explore that. But that means you have to allocate more time in Unity (https://unity3d.com/) building out different concepts for how to present these stories.”
As the client projects get more creative, The Molecule is relying on traditional VFX processes like greenscreen, 3D tracking and shooting plates to solve VR-related problems. “These VFX techniques help us get around a lot of the production issues VR presents. If you’re shooting on a greenscreen, you don’t need a 360 lens, and that helps. You can shoot one person walking around on a stage and then just pan to follow them. That’s one piece of footage that you then composite into some other frame, as opposed to getting that person out there on the day, trying to get their performance right and then worrying about hiding all the other camera junk. Our expertise in VFX definitely gives us an advantage in VR post.”
From a post perspective, Healer still hopes most for new camera technology that would radically simplify the stitching process, allowing more time for concepting and innovative project development. “I just saw a prototype of a toric lens,” shaped like the donut-like torus that results from revolving a circle in three-dimensional space, “that films 360 minus a little patch, where the tripod is, in a single frame,” he says. “That would be huge for us. That would really change the workflow around, and while we’re doing a lot of CG stuff that has to be added to VR, stitching takes the most time. Obviously, I care most about post, but there are also lots of production issues around a new lens like that. You’d need a lot of light to make it work well.”
Local Hero Post
For longtime Scratch users Local Hero Post, in Santa Monica, the move to begin grading and compositing in Assimilate Scratch VR was a no-brainer. “We were one of the very first American companies to own a Scratch when it was $75,000 a license,” says founder and head of imaging Leandro Marini. “That was about 10 years ago and we’ve since done about 175 feature film DIs entirely in Scratch, and although we also now use a variety of tools, we still use it.”
Marini says he started seeing client demand for VR projects about two years ago, and he turned to Scratch VR. It lets users do traditional post the way editors and colorists are used to, “with all the same DI tools that let you do complicated paint-outs, visual effects and 50-layer-deep color corrections, Power Windows, in realtime on a VR sphere.”
New Deal Studios’ 2015 Sundance film Kaiju Fury was an early project, “when Scratch VR was first really user-friendly and working in realtime.” Now, Marini says, their VR workflow is “pretty robust. [It’s] currently the only system that I know of that can work in VR in realtime in multiple ways,” which include an equirectangular projection, giving you a YouTube 360-type of feel, and an Oculus headset view.
“You can attach the headset, put the Oculus on and grade and do visual effects in the headset,” he says. “To me, that’s the crux: you really have to be able to work inside the headset if you are going to grade and do VR for real. The difference between seeing a 360 video on a computer screen and seeing it from within a headset and being able to move your head around is huge. Those headsets have wildly different colors than a computer screen.”
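The equirectangular (lat-long) projection Marini describes unwraps the full viewing sphere into a flat rectangular frame, which is why it reads like YouTube 360 on a monitor. A minimal sketch of the mapping, with assumed frame dimensions:

```python
import math

def direction_to_equirect(x, y, z, width, height):
    """Map a unit view direction to (u, v) pixel coordinates in an
    equirectangular frame: longitude spans the width, latitude the height."""
    lon = math.atan2(x, z)                    # -pi..pi, 0 = straight ahead
    lat = math.asin(max(-1.0, min(1.0, y)))   # -pi/2..pi/2, 0 = horizon
    u = (lon / (2 * math.pi) + 0.5) * width
    v = (0.5 - lat / math.pi) * height
    return u, v

# Looking straight ahead lands in the center of a 4K-wide lat-long frame:
print(direction_to_equirect(0, 0, 1, 3840, 1920))  # -> (1920.0, 960.0)
```

Grading on this flattened frame is convenient, but as Marini notes, it is no substitute for checking the result inside a headset, where the projection is re-wrapped onto the sphere.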
The facility’s — and likely the industry’s — highest profile and biggest budget project to date is Invisible, a new VR scripted miniseries directed by Doug Liman and created by 30 Ninjas, the VR company he founded with Julina Tatlock. Invisible premiered in October on Samsung VR and the Jaunt app and will roll out in coming months in VR theaters nationwide. Written by Dallas Buyers Club screenwriter Melisa Wallack and produced by Jaunt and Condé Nast Entertainment, it is billed as the first virtual reality action-adventure series of its kind.
“Working on that was a pretty magical experience,” says Marini. “Even the producers and Liman himself had never seen anything like being able to do the grade, do VFX and do composite and stereo fixes in 3D virtual reality all with the headset on. That was our initial dilemma for this project, until we figured it out: do you make it look good for the headset, for the computer screen or for iPhones or Samsung phones? Everyone who worked on this understood that every VR project we do now is in anticipation of the future wave of VR headsets. All we knew was that about a third would probably see it on a Samsung Gear VR, another third would see it on a platform like YouTube 360 and the final third would see it on some other headset like Oculus Rift, HTC or Google’s new Daydream.”
How do you develop a grading workflow that fits all of the above? “This was a real tricky one,” admits Marini. “It’s a very dark and moody film and he wanted to make a family drama thriller within that context. A lot of it is dark hallways and shadows and people in silhouette, and we had to sort of learn the language a bit.”
Marini and his team began exclusively grading in the headset, but that was way too dark on computer monitors. “At the end of the day, we learned to dial it back a bit and make pretty conservative grades that worked on every platform so that it looked good everywhere. The effect of the headset is it’s a light that’s shining right into your eyeball, so it just looks a lot brighter. It had to still look moody inside the headset in a dark room, but not so moody that it vanishes on a laptop in a bright room. It was a balancing act.”
Local Hero also had to figure out how to juggle the new VR work with its regular DI workload. “We had to break off the VR services into a separate bay and room that is completely dedicated to it,” he explains. “We had to slice it off from the main pipeline because it needs around-the-clock custom attention. Very quickly we realized we needed to quarantine this workflow. One of our colorists here has become a VR expert, and he’s now the only one allowed to grade those projects.” The facility upgraded to a Silverdraft Demon workstation with specialized storage to meet the exponential demand for processing power and disk space.
Marini says Invisible, like the other VR work Local Hero has done before, is, in essence, a research project in these early days of immersive content. “There is no standard color space or headset or camera. And we’re still in the prototype phase of this. While we are in this phase, everything is an experiment. The experience of being in 3D space is interesting, but the quality of what you’re watching is still very, very low resolution. The color fidelity relative to what we’re used to in the theater and on 4K HDR televisions is like 1980s VHS quality. We’re still very far away from truly excellent VR.”
Scratch VR workflows in Invisible included a variety of complicated processes. “We did things like dimensionalizing 2D shots,” says Marini. “That’s complicated stuff. In 3D with the headset on we would take a shot that was in 2D, draw a rough roto mask around the person, create a 3D field, pull their nose forward, push their eyes back, push the sky back — all in a matter of seconds. That is next-level stuff for VR post.”
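The dimensionalizing Marini describes (roto the subject, assign it depth, then shift each eye’s view) can be caricatured in a few lines. This is a deliberately toy, single-scanline sketch, not Scratch VR’s actual method; real tools operate on full images and inpaint the holes the shift leaves behind:

```python
# Toy sketch of "dimensionalizing" a 2D frame: pixels inside a roto mask
# get a depth, and each eye's view shifts them horizontally by a disparity
# proportional to that depth. Names and numbers are illustrative.

def dimensionalize_row(row, mask, disparity, bg=0):
    """Return (left_eye, right_eye) rows: masked pixels shift +/- disparity.
    Vacated pixels are filled with bg; real pipelines inpaint them."""
    width = len(row)
    # Start with the subject removed; real tools reconstruct the background.
    base = [bg if mask[x] else row[x] for x in range(width)]
    left, right = list(base), list(base)
    for x in range(width):
        if mask[x]:
            if 0 <= x + disparity < width:
                left[x + disparity] = row[x]   # subject appears shifted right
            if 0 <= x - disparity < width:
                right[x - disparity] = row[x]  # ...and shifted left
    return left, right

row  = [0, 0, 5, 5, 0, 0]   # a bright "subject" at x = 2..3
mask = [0, 0, 1, 1, 0, 0]   # rough roto mask around the subject
left, right = dimensionalize_row(row, mask, disparity=1)
print(left, right)  # -> [0, 0, 0, 5, 5, 0] [0, 5, 5, 0, 0, 0]
```

The opposite horizontal offsets between the two eyes are what the brain reads as the subject sitting in front of the background, which is why a quick roto-and-shift can pull a flat element forward in seconds.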
Local Hero also used Scratch Web for reviews. “Moments after we finished a shot or sequence it was online and someone could put on a headset and watch it. That was hugely helpful. Doug was in London, Condé Nast in New York. Lexus was a sponsor of this, so their agency in New York was also involved. Jaunt is down the street from us here in Santa Monica. And there were three clients in the bay with us at all times.”
As such, there is no way to standardize a VR DI workflow, he says. “For Invisible, it was definitely all hands on deck and every day was a new challenge. It was 4K 60p stereo, so the amount of data we had to push — 4K 60p to both eyes — was unprecedented.” Strange stereo artifacts would appear for no apparent reason. “A bulge would suddenly show up on a wall and we’d have to go in there and figure out why and fix it. Do we warp it? Try something else? It was like that throughout the entire project: invent the workflow every day and fudge your way through. But that’s the nature of experimental technology.”
Will there be a watershed VR moment in the year ahead? “I think it all depends on the headsets, which are going to be like mobile phones,” he says. “Every six months there will be a new group of them that will be better and more powerful with higher resolution. I don’t think there will be a point in the future when everyone has a self-contained high-end headset. I think the more affordable headsets that you put your phone into, like Gear VR and Daydream, are the way most people will begin to experience VR. And we’re only 20 percent of the way there now. The whole idea of VR narrative content is completely unknown and it remains to be seen if audiences care and want it and will clamor for it. When they do, then we’ll develop a healthy VR content industry in Hollywood.”
Beth Marchant has been covering the production and post industry for 21 years. She was the founding editor-in-chief of Studio/monthly magazine and the co-editor of StudioDaily.com. She continues to write about the industry.