
Deluxe VFX

Craig Zerouni joins Deluxe VFX as head of technology

Deluxe has named Craig Zerouni as head of technology for Deluxe Visual Effects. In this role, he will focus on continuing to unify software development and systems architecture across Deluxe’s Method studios in Los Angeles, Vancouver, New York and India, and its Iloura studios in Sydney and Melbourne, as well as LA’s Deluxe VR.

Based in LA and reporting to president/GM of Deluxe VFX and VR Ed Ulbrich, Zerouni will lead VFX and VR R&D and software development teams and systems worldwide, working closely with technology teams across Deluxe’s Creative division.

Zerouni has been working in media technology and production for nearly three decades, joining Deluxe most recently from DreamWorks, where he was director of technology at its Bangalore, India-based facility, overseeing all technology. Prior to that he spent nine years at Digital Domain, where he was first head of R&D responsible for software strategy and teams in five locations across three countries, then senior director of technology overseeing software, systems, production technology, technical directors and media systems. He has also directed engineering, products and teams at software/tech companies Silicon Grail, Side Effects Software and Critical Path. In addition, he was co-founder of London-based computer animation company CFX.

Zerouni’s work has contributed to features including Tron: Legacy, Iron Man 3, Maleficent, X-Men: Days of Future Past, Ender’s Game and more than 400 commercials and TV IDs and titles. He is a member of BAFTA, ACM/SIGGRAPH, IEEE and the VES. He has served on the AMPAS Digital Imaging Technology Subcommittee and is the author of the technical reference book “Houdini on the Spot.”

Says Ulbrich on the new hire: “Our VFX work serves both the features world, which is increasingly global, and the advertising community, which is increasingly local. Behind the curtain at Method, Iloura, and Deluxe, in general, we have been working to integrate our studios to give clients the ability to tap into integrated global capacity, technology and talent anywhere in the world, while offering a high-quality local experience. Craig’s experience leading global technology organizations and distributed development teams, and building and integrating pipelines is right in line with our focus.”

Assimilate Scratch and Scratch VR Suite upgraded to V.8.6

Assimilate is now offering an open beta for Scratch 8.6 and the Scratch VR Suite 8.6, the latest versions of its realtime post tools and workflow for VR/360 and 2D/3D content, covering dailies through conform, grading, compositing and finishing. Expanded HDR functions are featured throughout the product line, including in Scratch VR, which now offers stitching capabilities.

Both open beta versions give pros the opportunity to actively use the full suite of Scratch and Scratch VR tools, while evaluating and submitting requests and recommendations for additional features or updates.

Scratch Web for cloud-based, realtime review and collaboration, and Scratch Play for immediate review and playback, are also included in the ecosystem updates. Both products support VR/360 and 2D/3D content.

The Scratch 8.6 and Scratch VR Suite 8.6 open betas are available now; current users of the Scratch VR Suite 8.5 and Scratch Finishing 8.5 can download them directly.

“V8.6 is a major update for both Scratch and the Scratch VR Suite with significant enhancements to the HDR and ACES workflows. We’ve added stitching to the VR toolset so that creators have a complete and streamlined end-to-end VR workflow,” says Jeff Edson, CEO at Assimilate. “The open Beta helps us to continue developing the best and most useful post production features and techniques all artists need to perfect their creativity in color grading and finishing. We act on all input, much of it immediately and some in regular updates.”

Here are some details of the update:

HDR
• PQ and HLG transfer functions are now an integral part of Scratch color management.
• Scopes automatically switch to HDR mode when needed, show levels on a nit scale and highlight any reference level you set.
• At the project level, define the HDR mastering metadata: color space, color primaries and white levels, luminance levels and more. The metadata is automatically included in the Video HDMI interface (AJA, BMD, Bluefish444) for display.
• In addition to static metadata, Scratch can calculate dynamic luminance metadata such as MaxCLL and MaxFALL (see the sketch after this list).
• HDR footage can be published directly to YouTube with HDR metadata.
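
For context, MaxCLL (maximum content light level) is the brightest single pixel anywhere in the program, while MaxFALL (maximum frame-average light level) is the highest per-frame average luminance. Here is a minimal Python sketch of how those two values can be derived from per-pixel luminance in nits; it illustrates what the metadata means and is not Assimilate's implementation.

```python
import numpy as np

def max_cll_and_fall(frames_nits):
    """Compute MaxCLL and MaxFALL from an iterable of frames.

    Each frame is a 2D numpy array of per-pixel luminance in nits
    (cd/m^2), i.e. already converted out of the PQ/HLG signal.
    """
    max_cll = 0.0   # brightest single pixel anywhere in the content
    max_fall = 0.0  # highest frame-average light level
    for frame in frames_nits:
        max_cll = max(max_cll, float(frame.max()))
        max_fall = max(max_fall, float(frame.mean()))
    return max_cll, max_fall

# Example: two synthetic 1080p frames with a single 1,000-nit highlight.
frames = [np.full((1080, 1920), 100.0), np.full((1080, 1920), 80.0)]
frames[0][0, 0] = 1000.0
print(max_cll_and_fall(frames))  # -> (1000.0, ~100.0)
```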

VR/360 – Scratch VR Suite
• 360 stitching functionality: load all your source media from your 360 cameras into Scratch VR and combine it into a single equirectangular image (a sketch of the equirectangular mapping follows this list). Support for camera stitch templates: AutoPano projects, Hugin and PTStitch scripts.
• Ambisonic Audio: Scratch VR can load, set and play back ambisonic audio files to complete the 360 immersive experience.
• Video with 360 sound can be published directly to YouTube 360.
• Additional overlay handles have been added to the existing 2D-to-equirectangular feature, making it easier to position 2D elements in a 360 scene.
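
As background on the stitching step, an equirectangular image simply maps longitude (yaw) and latitude (pitch) on the viewing sphere linearly onto x and y pixel coordinates, which is why stitched 360 footage can live in an ordinary rectangular frame. Below is a small sketch of that mapping under those assumptions; the function name is illustrative and this is not Assimilate's stitcher.

```python
import math

def direction_to_equirect(dx, dy, dz, width, height):
    """Map a unit 3D view direction to (x, y) pixel coordinates in an
    equirectangular image of size width x height.

    Longitude (yaw) spans [-pi, pi] across the width,
    latitude (pitch) spans [-pi/2, pi/2] across the height.
    """
    lon = math.atan2(dx, dz)                    # -pi .. pi
    lat = math.asin(max(-1.0, min(1.0, dy)))    # -pi/2 .. pi/2
    x = (lon / (2 * math.pi) + 0.5) * (width - 1)
    y = (0.5 - lat / math.pi) * (height - 1)
    return x, y

# Looking straight ahead lands in the centre of the frame.
print(direction_to_equirect(0.0, 0.0, 1.0, 3840, 1920))  # -> (1919.5, 959.5)
```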

DIT Reporting Function
• Create a report of all clips of either a timeline, a project or just a selection of shots.
• Reports include metadata, such as a thumbnail, clip-name, timecode, scene, take, comments and any metadata attached to a clip.
• Choose from predefined templates or create your own.


Rick & Morty co-creator Justin Roiland to keynote VRLA

Justin Roiland, co-creator of Rick & Morty from Cartoon Network’s Adult Swim, will be delivering VRLA’s Saturday keynote. The expo, which takes place April 14 and 15 at the LA Convention Center, will include demos, educational sessions, experimental work and presentations.

The exhibit floor will feature hardware and software developers, content creators and prototype technology that can only be seen at VRLA. Registration is currently open, with the business-focused two-day “Pro” pass at $299 and a one-day pass for Saturday priced at $40.

Roiland, who is also the newly minted founder of the VR studio Squanchtendo, aims to dive into the surreally funny possibilities of the medium in his keynote, remarking, “What does the future of VR hold? Will there be more wizard games? Are grandmas real? What is a wizard really? Are there wizard grandmas? How does this factor into VR? Please come to my incredible keynote address on the state of VR.”

VRLA is currently accepting applications for its Indie Zone, which offers complimentary exhibition space to small teams that have raised less than $500,000 in venture capital funding or generated less than that amount in revenue. Click here to apply.


One of Lenovo’s new mobile workstations is VR-ready

Lenovo launched three new mobile workstations at SolidWorks World 2017 — the ThinkPad P51s and P51, as well as the VR-ready ThinkPad P71.

ThinkPad P51s

The ThinkPad P51s features a new chassis, Intel’s seventh-generation Core i7 processors and the latest Nvidia Quadro workstation graphics, as well as a 4K UHD IPS display with optional IR camera. With all its new features, the ThinkPad P51s still boasts a lightweight, Ultrabook build, shaving off over half a pound from the previous generation. In fact, the P51s is the lightest and thinnest mobile ThinkPad. It also offers Intel Thunderbolt 3 technology with a docking solution, providing users ultra-fast connectivity and the ability to move massive files quickly.

Also new are the ThinkPad P51 — including a 4K IPS display with 100 percent color gamut and an X-Rite Pantone color calibrator — and the VR-ready ThinkPad P71. These mobile workstations are MIL-SPEC tested and offer a dual-fan cooling system to allow users to push their system harder for use in the field. These two new offerings feature 2400MHz DDR4 memory, along with massive storage. The ThinkPad P71 handles up to four storage devices. These two workstations also feature the latest Intel Xeon processors for mobile workstations and are ISV-certified.

Taking on VR
The VR-ready ThinkPad P71 (our main image) features Nvidia Pascal-based Quadro GPUs and comes equipped with full Oculus and HTC certifications, along with Nvidia’s VR-ready certification.

SuperSphere, a creative VR company, is using the P71. “To create high-quality work on the go, our company requires Lenovo’s industry-leading mobile workstations that allow us to put the performance of a tower in our backpacks,” says SuperSphere partner/director Jason Diamond. “Our company’s focus on VR requires us to travel to a number of locations, and the ThinkPad P71 lets us achieve the same level of work on location as we can in the office, with the same functionality.”

The Lenovo P51s will be available in March, starting at $1,049, while the P51 and P71 will be available in April, starting at $1,399 and $1,849, respectively.


Chris Hill & Sami Tahari

Imaginary Forces expands with EP Chris Hill and director of biz dev Sami Tahari

Imaginary Forces has added executive producer Chris Hill and director of business development Sami Tahari to its Los Angeles studio. The additions come at a time when the creative studio is looking to further expand its cross-platform presence with projects that mix VR/AR/360 with traditional, digital and social media.

Celebrating 20 years in business this year, the independently owned Imaginary Forces is a creative company specializing in brand strategy and visual storytelling encompassing many disciplines, including full-service design, production and post production. Being successful for that long in this business means they are regularly innovating and moving where the industry takes them. This led to the hiring of Hill and Tahari, whose diverse backgrounds will help strengthen the company’s long-standing relationships, as well as its continuous expansion into emerging markets.

Recent work of note includes main titles for Netflix’s beloved Stranger Things, the logo reveal for Michael Bay’s Transformers: The Last Knight and an immersive experience for the Empire State Building.

Hill’s diverse production experience includes commercials, experience design, entertainment marketing and branding for such clients as HBO Sports, Google, A&E and the Jacksonville Jaguars, among others. He joins Imaginary Forces after recently presiding over the broadcast division of marketing agency BPG.

Tahari brings extensive marketing, business and product development experience spanning the tech and entertainment spaces. His resume includes time at Lionsgate and Google, where he was an instrumental leader in the creative development and marketing of Google Glass.

“Imaginary Forces has a proven ability to use design and storytelling across any medium or industry,” adds Hill. “We can expand that ability to new markets, whether it’s emerging technologies, original content or sports franchises. When you consider, for example, the investment in massive screens and new technologies in stadiums across the country, it demands [that] same high level of brand strategy and visual storytelling.”

Our Main Image: L-R: Chris Hill and Sami Tahari.


Rise Above

Sundance 2017: VR for Good’s Rise Above 

By Elise Ballard

On January 22, during the Sundance Film Festival in Park City, the Oculus House had an event for their VR for Good initiative, described as “helping non-profits and rising filmmakers bring a variety of social missions to life.” Oculus awarded 10 non-profits a $40,000 grant and matched them with VR filmmakers to make a short film related to their community and cause.

One of the films, Rise Above, highlights a young girl’s recovery from sexual abuse and the support and therapy she received from New York City’s non-profit Womankind (formerly New York Asian Women’s Center).

Rise Above is a gorgeous film — shot on the Nokia Ozo camera — and really well done, especially in how it guides your eye to the storytelling going on in a VR/360 environment. I had the opportunity to interview the filmmakers, Ben Ross and Brittany Neff, about their experience. I was curious why they feel VR is one of the best mediums to create empathy and action for social impact. Check out their website.

Referencing the post process, Ross said he wore a headset the entire time as he worked with the editor in order to make sure it worked as a VR experience. All post production for the VR for Good films was done at Reel FX. In terms of tools, for stitching the footage they used a combination of the Ozo Creator software from Nokia, Autopano Video from Kolor and the Cara plug-in for Nuke. Reel FX finished all the shots in Nuke (again making major use of Cara) and Autodesk’s Flame for seam fixing and rig removal. TD Ryan Hartsell did the graphics work in After Effects, using the Mettle plug-in to help him place the graphics in 360 space and in 3D.

For more on the project and Reel FX’s involvement visit here.

Oculus’ VR for Good initiative will be exhibiting at other major film festivals throughout the year, and the films will be distributed by Facebook after the festival circuit.

Visit VR for Good here for more information, news and updates, and to stay connected (and apply!) to this inspiring and cutting-edge project.

Elise Ballard is a Los Angeles-based writer, author of “Epiphany: True Stories of Sudden Insight” and the director of development at Cognition and the Arc/k Project, a non-profit dedicated to preserving cultural heritage via virtual reality and digital media.


HPA Tech Retreat takes on VR/AR at Tech Retreat Extra

The long-standing HPA Tech Retreat is always a popular destination for tech-focused post pros, and while they have touched on virtual reality and augmented reality in the past, this year they are dedicating an entire day to the topic — February 20, the day before the official Retreat begins. TR-X (Tech Retreat Extra) will feature VR experts and storytellers sharing their knowledge and experiences. The traditional HPA Tech Retreat runs from February 21-24 in Indian Wells, California.

TR-X VR/AR is co-chaired by Lucas Wilson (Founder/Executive Producer at SuperSphereVR) and Marcie Jastrow (Senior VP, Immersive Media & Head of Technicolor Experience Center), who will lead a discussion focused on the changing VR/AR landscape in the context of rapidly growing integration into entertainment and applications.

Marcie Jastrow

Experts and creative panelists will tackle questions such as: What do you need to understand to enable VR in your environment? How do you adapt? What are the workflows? Storytellers, technologists and industry leaders will provide an overview of the technology and discuss how to harness emerging technologies in the service of the artistic vision. A series of diverse case studies and creative explorations — from NASA to the NFL — will examine how to engage the audience.

The TR-X program, along with the complete HPA Tech Retreat program, is available here. Additional sessions and speakers will be announced.

TR-X VR/AR Speakers and Panel Overview
Monday, February 20

Opening and Introductions
Seth Hallen, HPA President

Technical Introduction: 360/VR/AR/MR
Lucas Wilson

Panel Discussion: The VR/AR Market
Marcie Jastrow
David Moretti, Director of Corporate Development, Jaunt
Catherine Day, Head of VR/AR, Missing Pieces
Phil Lelyveld, VR/AR Initiative Program Lead, Entertainment Technology Center at USC

Acquisition Technology
Koji Gardiner, VP, Hardware, Jaunt

Live 360 Production Case Study
Andrew McGovern, VP of VR/AR Productions, Digital Domain

Live 360 Production Case Study
Michael Mansouri, Founder, Radiant Images

Interactive VR Production Case Study
Tim Dillon, Head of VR & Immersive Content, MPC Advertising USA

Immersive Audio Production Case Study
Kyle Schember, CEO, Subtractive

Panel Discussion: The Future
Alan Lasky, Director of Studio Product Development, 8i
Ben Grossmann, CEO, Magnopus
Scott Squires, CTO, Creative Director, Pixvana
Jen Dennis, EP of Branded Content, RSA
Moderator: Lucas Wilson

Panel Discussion: New Voices: Young Professionals in VR
Anne Jimkes, Sound Designer and Composer, Ecco VR
Jyotsna Kadimi, USC Graduate
Sho Schrock, Chapman University Student
Brian Handy, USC Student

TR-X also includes an ATSC 3.0 seminar, focusing on the next-generation television broadcast standard, which is nearing completion and offers a wide range of new content delivery options to the TV production community. This session will explore the expanding possibilities that the new standard provides in video, audio, interactivity and more. Presenters and panelists will also discuss the complex next-gen television distribution ecosystem that content must traverse, and the technologies that will bring the content to life in consumers’ homes.

Early registration is highly recommended for TR-X and the HPA Tech Retreat, which is a perennially sold-out event. Attendees can sign up for TR-X VR/AR, TR-X ATSC or the HPA Tech Retreat.

Main Image: Lucas Wilson.


Review: Mettle VR plug-ins for Adobe Premiere

By Barry Goch

I was very frustrated. I took a VR production class and bought an LG 360 camera, but I felt like I was missing something. Then it dawned on me — I wanted to have more control. I started editing 360 videos using the VR video viewing tools in Adobe Premiere Pro, but I was still lacking the control I desired. I wanted my audience to have a guided, immersive experience without having to be in a swivel chair to get the most out of my work. Then, like a bolt of lightning, it came to me — I needed to rotate the 360 video sphere. I needed to be able to reorient it to accomplish my vision, but how would I do that?

Rotate Sphere plug-in showing keyframing.

Mettle’s Skybox 360/VR Tools are exactly what I was looking for. The Rotate Sphere plug-in alone is worth the price of the entire plug-in package. With this one plug-in, you’re able to re-orient your 360 video without worrying about any technical issues — it gives you complete creative control to re-frame your 360 video — and it’s completely keyframable too! For example, I mounted my 360 camera on my ski helmet this winter and went down a ski run at Heavenly in Lake Tahoe. There are amazing views of the lake from this run, but I also needed to follow the skiers ahead of me. Plus, the angle of the slope changed and the angle to the subjects I was following changed as well. Since the camera was fixed, how could I guide the viewer? By using the Rotate Sphere plug-in from Mettle and keyframing the orientation of the shot as the slope/subject relationship changed relative to my position.

My second favorite plug-in is Project 2D. Without the Project 2D plug-in, when you add titles to your 360 videos they become warped and you have very little control over their appearance. In Project 2D, you create your title using the built-in titler in Premiere Pro, add it to the timeline, then apply the Project 2D Mettle Skybox plug-in. Now you have complete control over the scale, rotation of the titling element and the placement of the title within the 360 video sphere. You can also use the Project 2D plug-in to composite graphics or video into your 360 video environment.
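
Conceptually, placing a flat element into the sphere means choosing where the card sits (its yaw, pitch and angular size) and mapping each card pixel through a perspective projection onto the sphere, then into equirectangular coordinates. The sketch below illustrates that geometry with a naive forward warp; the function, its parameters and the sign conventions are assumptions for illustration, not how Mettle's plug-in is implemented (a real tool would do the inverse mapping with filtering).

```python
import numpy as np

def warp_card_into_equirect(equi, card, yaw_deg, pitch_deg, fov_deg):
    """Naively forward-warp a flat card (e.g. a rendered title) into an
    equirectangular frame, centred at (yaw, pitch) and spanning fov_deg
    degrees horizontally. Both images are (h, w, 3) arrays.
    """
    H, W = equi.shape[:2]
    ch, cw = card.shape[:2]
    half = np.tan(np.radians(fov_deg) / 2.0)
    yaw, pitch = np.radians(yaw_deg), np.radians(pitch_deg)

    # Card pixel -> point on a plane one unit in front of the viewer.
    u, v = np.meshgrid(np.linspace(-half, half, cw),
                       np.linspace(half * ch / cw, -half * ch / cw, ch))
    x, y, z = u, v, np.ones_like(u)

    # Tilt up/down (pitch about x), then pan (yaw about y).
    y, z = (y * np.cos(pitch) - z * np.sin(pitch),
            y * np.sin(pitch) + z * np.cos(pitch))
    x, z = (x * np.cos(yaw) + z * np.sin(yaw),
            -x * np.sin(yaw) + z * np.cos(yaw))

    # Direction -> equirectangular pixel coordinates.
    lon = np.arctan2(x, z)
    lat = np.arcsin(np.clip(y / np.sqrt(x * x + y * y + z * z), -1.0, 1.0))
    ex = ((lon / (2 * np.pi) + 0.5) * (W - 1)).astype(int)
    ey = ((0.5 - lat / np.pi) * (H - 1)).astype(int)

    out = equi.copy()
    out[ey, ex] = card          # splat each card pixel into the sphere
    return out

sphere = np.zeros((960, 1920, 3), dtype=np.uint8)
title = np.full((200, 800, 3), 255, dtype=np.uint8)
out = warp_card_into_equirect(sphere, title, yaw_deg=30, pitch_deg=-10, fov_deg=40)
```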

Mobius Zoom transition in action.

Rounding out the Skybox plug-in set are 360 video-aware plug-ins that every content creator needs. What do I mean by 360 video-aware? For example, when you apply a blur that is not 360 video-content-aware, it crosses the seam where the equirectangular video’s edges join together and makes the seam unseemly. With the Skybox Blur, Denoise, Glow and Sharpen plug-ins, you don’t have this problem. Just as the Rotate Sphere plug-in does the crazy math to rotate your 360 video without distortion or introducing artifacts, these plug-ins do the same.
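
The seam issue is easy to see in code: an ordinary filter treats the left and right edges of the frame as hard borders, but on the sphere they are the same meridian. One common fix — sketched below as an illustration, not Mettle's method, and assuming NumPy and SciPy are available — is to pad the frame by wrapping it horizontally before blurring, then crop the padding off. (The poles really need a latitude-dependent kernel, which this sketch ignores.)

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def seam_aware_blur(equi, sigma):
    """Blur an equirectangular frame without breaking the wrap seam.

    Pad by wrapping horizontally (the left and right edges meet on the
    sphere), blur, then crop the padding back off.
    """
    pad = int(4 * sigma) + 1
    wrapped = np.concatenate([equi[:, -pad:], equi, equi[:, :pad]], axis=1)
    blurred = gaussian_filter(wrapped, sigma=(sigma, sigma, 0), mode="reflect")
    return blurred[:, pad:-pad]

frame = np.random.rand(512, 1024, 3).astype(np.float32)
out = seam_aware_blur(frame, sigma=3.0)
print(out.shape)  # (512, 1024, 3)
```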

Transitioning between cuts in 360 video is an evolving art form. There is really no right or wrong way. Longer cuts, shorter cuts, dissolves and dips to black are some of the basic options. Now, Mettle is adding to our creative toolkit by applying their crazy math skills to transitions in 360 videos. Mettle started with their first pack of four transitions: Mobius Zoom, Random Blocks, Gradient Wipe and Iris Wipe. I used the Mobius Zoom to transition from the header card to the video and then the Iris Wipe with a soft edge to transition from one shot to the next in the linked video.

Check out this video, which uses Rotate Sphere, Project 2D, Mobius Zoom and Iris wipe effects.

New Plug-Ins
I’m pleased to be among the first to show you their second set of plug-ins specifically designed for 360 / VR video! Chroma Leaks, Light Leaks, Spherical Blurs and everyone’s favorite, Light Rays!

Mettle plug-ins work on both Mac and Windows platforms — on qualified systems — and in realtime. The Mettle plug-ins are also both mono- and stereo-aware.

The Skybox plug-in set for Adobe Premiere Pro is truly the answer I’ve been looking for since I started exploring 360 video. It’s changed the way I work and opened up a world of control that I had been wishing for. Try it for yourself by downloading a demo at www.mettle.com.


Barry Goch is currently a digital intermediate editor for Deluxe in Culver City, working on Autodesk Flame. He started his career as a camera tech for Panavision Hollywood and then transitioned to working as an offline Avid/FCP editor. His resume includes Passengers, Money Monster, Eye in the Sky and Game of Thrones. His latest endeavor is VR video.


China’s online video network Youku calls on Nokia’s Ozo ecosystem for VR  

One of the largest online video platforms in China, Youku, has chosen the Nokia Ozo VR ecosystem of technologies to create and distribute immersive VR content to more than 500 million monthly active users. The platform logs more than 1.1 billion daily views.

Youku will use the entire Ozo VR solution, which includes the Ozo Camera, Ozo Software Suite, Ozo Live and Ozo Player SDK, in the creation and distribution of content ranging from film and television to news and documentaries, as well as professional user-generated content featuring Youku’s top talent.

Youku will integrate Nokia’s Ozo Player SDK and Ozo Audio solutions into all its platforms, mobile apps and consumer offerings, enabling its enormous audience to enjoy 3D 360-degree VR. The Ozo Player SDK allows VR pros to create VR app experiences on most major platforms with a single, unified development interface.

Full-featured reference players are also included in the SDK for all supported platforms — including Oculus Desktop, Oculus Mobile/GearVR, HTC Vive and Google VR for Android and iOS. The multi-platform Ozo Player SDK is now available in a free version as well as a Pro tier with more features and larger deployment options.

 

VR Post: Hybrid workflows are key

By Beth Marchant

Shooting immersive content is one thing, but posting it for an ever-changing set of players and headsets is a whole other multidimensional can of beans.

With early help from software companies that have developed off-the-shelf ways to tackle VR post — and global improvements to their storage and networking infrastructures — some facilities are diving into immersive content by adapting their existing post suites with a hybrid set of new tools. As with everything else in this business, it’s an ongoing challenge to stay one step ahead.

Chris Healer

The Molecule
New York- and Los Angeles-based motion graphics and VFX post house The Molecule leapt into the VR space more than a year and a half ago when it fused The Foundry’s Nuke with the open-source panoramic photo-stitching software Hugin. Then, CEO Chris Healer took the workflow one step further. He developed an algorithm that rendered stereoscopic motion graphics spherically in Nuke.

Today, those developments have evolved into a robust pipeline that fuels The Molecule’s work for Conan O’Brien’s eponymous TBS talk show, The New York Times’s VR division and commercial work. “It’s basically eight or ten individual nodes inside Nuke that complete one step or another of the process,” says Healer. “Some of them overlap with Cara VR,” The Foundry’s recently launched VR plug-in for Nuke, “but all of it works really well for our artists. I talk to The Foundry from time to time and show them the tools, so there’s definitely an open conversation there about what we all need to move VR post forward.”

Collaborating with VR production companies like SuperSphere, Jaunt and Pixvana in Seattle, The Molecule is heading first where mass VR adoption seems likeliest. “The New York Times, for example, wants to have a presence at film festivals and new technology venues, and is trying to get out of the news-only business and into the entertainment-provider business. And the job for Conan was pretty wild — we had to create a one-off gag for Comic-Con that people would watch once and go away laughing to the next thing. It’s kind of a cool format.”

Healer’s team spent six weeks on the three-minute spot. “We had to shoot plates, model characters, animate them, composite it, build a game engine around it, compile it, get approval and iterate through that until we finished. We delivered 20 or so precise clips that fit into a game engine design, and I think it looks great.”

Healer says the VR content The Molecule is posting now is, like the Conan job, a slight variation on more typical recent VR productions. “I think that’s also what makes VR so exciting and challenging right now,” he says. “Everyone’s got a different idea about how to take it to the next level. And a lot of that is in anticipation of AR (augmented reality) and next-generation players/apps and headsets.”

‘Conan’

“The Steam store,” the premier place online to find virtual content, “has content that supports multiple headsets, but not all of them.” He believes that will soon gel into a more unified device driver structure, “so that it’s just VR, not Oculus VR or Vive VR. Once you get basic head tracking together, then there’s the whole next thing: Do you have a controller of some kind, are you tracking in positional space, do you need to do room setup? Do we want wands or joysticks or hand gestures, or will keyboards do fine? What is the thing that wins? Those hurdles should solidify in the next year or two. The key factor in any of that is killer content.”

The biggest challenge facing his facility, and anyone doing VR post right now, he says, is keeping pace with changing resolutions and standards. “It used to be that 4K or 4K stereo was a good deliverable and that would work,” says Healer. “Now everything is 8K or 10K, because there’s this idea that we also have to future-proof content and prepare for next-gen headsets. You end up with a lot of new variables, like frame rate and resolution. We’re working on a stereo commercial right now, and just getting the footage of one shot converted from only six cameras takes almost 3TB of disk space, and that’s just the raw footage.”
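
Those numbers are easy to sanity-check with back-of-the-envelope math. The figures below (per-camera resolution, bit depth, frame rate and take length) are assumptions for illustration, not values quoted by Healer, but they show how a single multi-camera take lands in the low terabytes.

```python
# Back-of-the-envelope storage estimate for a multi-camera 360 shoot.
# All parameters are illustrative assumptions, not figures from the article.
cameras       = 6
width, height = 4096, 2160     # per-camera resolution (assumed)
bits_per_px   = 30             # 10-bit RGB (assumed)
fps           = 60
shot_seconds  = 5 * 60         # a five-minute take (assumed)

bytes_total = cameras * width * height * bits_per_px / 8 * fps * shot_seconds
print(f"{bytes_total / 1e12:.2f} TB raw")   # ~3.6 TB for this one take
```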

When every client suddenly wants to dip their toes into VR, how does a post facility respond? Healer thinks the onus is on production and post services to provide as many options as possible while using their expertise to blaze new paths. “It’s great that everyone wants to experiment in the space, and that puts a certain creative question in our field,” he says. “You have to seriously ask of every project now, does it really just need to be plain-old video? Or is there a game component or interactive component that involves video? We have to explore that. But that means you have to allocate more time in Unity (https://unity3d.com/) building out different concepts for how to present these stories.”

As the client projects get more creative, The Molecule is relying on traditional VFX processes like greenscreen, 3D tracking and shooting plates to solve VR-related problems. “These VFX techniques help us get around a lot of the production issues VR presents. If you’re shooting on a greenscreen, you don’t need a 360 lens, and that helps. You can shoot one person walking around on a stage and then just pan to follow them. That’s one piece of footage that you then composite into some other frame, as opposed to getting that person out there on the day, trying to get their performance right and then worrying about hiding all the other camera junk. Our expertise in VFX definitely gives us an advantage in VR post.”

From a post perspective, Healer still hopes most for new camera technology that would radically simplify the stitching process, allowing more time for concepting and innovative project development. “I just saw a prototype of a toric lens,” shaped like the donut-like torus that results from revolving a circle in three-dimensional space, “that films 360 minus a little patch, where the tripod is, in a single frame,” he says. “That would be huge for us. That would really change the workflow around, and while we’re doing a lot of CG stuff that has to be added to VR, stitching takes the most time. Obviously, I care most about post, but there are also lots of production issues around a new lens like that. You’d need a lot of light to make it work well.”

Local Hero Post
For longtime Scratch users Local Hero Post, in Santa Monica, the move to begin grading and compositing in Assimilate Scratch VR was a no-brainer. “We were one of the very first American companies to own a Scratch when it was $75,000 a license,” says founder and head of imaging Leandro Marini. “That was about 10 years ago and we’ve since done about 175 feature film DIs entirely in Scratch, and although we also now use a variety of tools, we still use it.”

Leandro Marini

Marini says he started seeing client demand for VR projects about two years ago, and he turned to Scratch VR. He says it allows users to do traditional post the way editors and colorists are used to — “with all the same DI tools that let you do complicated paint-outs, visual effects and 50-layer-deep color corrections, Power Windows, in realtime on a VR sphere.”

New Deal Studios’ 2015 Sundance film Kaiju Fury was an early project, “when Scratch VR was first really user-friendly and working in realtime.” Now Marini says their VR workflow is “pretty robust. [It’s] currently the only system that I know of that can work in VR in realtime in multiple ways,” which include an equirectangular projection — which gives you a YouTube 360-type of feel — and an Oculus headset view.

“You can attach the headset, put the Oculus on and grade and do visual effects in the headset,” he says. “To me, that’s the crux: you really have to be able to work inside the headset if you are going to grade and do VR for real. The difference between seeing a 360 video on a computer screen and seeing it from within a headset and being able to move your head around is huge. Those headsets have wildly different colors than a computer screen.”

The facility’s — and likely the industry’s — highest profile and biggest budget project to date is Invisible, a new VR scripted miniseries directed by Doug Liman and created by 30 Ninjas, the VR company he founded with Julina Tatlock. Invisible premiered in October on Samsung VR and the Jaunt app and will roll out in coming months in VR theaters nationwide. Written by Dallas Buyers Club screenwriter Melisa Wallack and produced by Jaunt and Condé Nast Entertainment, it is billed as the first virtual reality action-adventure series of its kind.

‘Invisible’

“Working on that was a pretty magical experience,” says Marini. “Even the producers and Liman himself had never seen anything like being able to do the grade, do VFX and do composite and stereo fixes in 3D virtual reality all with the headset on. That was our initial dilemma for this project, until we figured it out: do you make it look good for the headset, for the computer screen or for iPhones or Samsung phones? Everyone who worked on this understood that every VR project we do now is in anticipation of the future wave of VR headsets. All we knew was that about a third would probably see it on a Samsung Gear VR, another third would see it on a platform like YouTube 360 and the final third would see it on some other headset like Oculus Rift, HTC or Google’s new Daydream.”

How do you develop a grading workflow that fits all of the above? “This was a real tricky one,” admits Marini. “It’s a very dark and moody film and he wanted to make a family drama thriller within that context. A lot of it is dark hallways and shadows and people in silhouette, and we had to sort of learn the language a bit.”

Marini and his team began exclusively grading in the headset, but that was way too dark on computer monitors. “At the end of the day, we learned to dial it back a bit and make pretty conservative grades that worked on every platform so that it looked good everywhere. The effect of the headset is it’s a light that’s shining right into your eyeball, so it just looks a lot brighter. It had to still look moody inside the headset in a dark room, but not so moody that it vanishes on a laptop in a bright room. It was a balancing act.”

Local Hero

Local Hero also had to figure out how to juggle the new VR work with its regular DI workload. “We had to break off the VR services into a separate bay and room that is completely dedicated to it,” he explains. “We had to slice it off from the main pipeline because it needs around-the-clock custom attention. Very quickly we realized we needed to quarantine this workflow. One of our colorists here has become a VR expert, and he’s now the only one allowed to grade those projects.” The facility upgraded to a Silverdraft Demon workstation with specialized storage to meet the exponential demand for processing power and disk space.

Marini says Invisible, like the other VR work Local Hero has done before it, is, in essence, a research project in these early days of immersive content. “There is no standard color space or headset or camera. And we’re still in the prototype phase of this. While we are in this phase, everything is an experiment. The experience of being in 3D space is interesting, but the quality of what you’re watching is still very, very low resolution. The color fidelity relative to what we’re used to in the theater and on 4K HDR televisions is like 1980s VHS quality. We’re still very far away from truly excellent VR.”

Scratch VR workflows in Invisible included a variety of complicated processes. “We did things like dimensionalizing 2D shots,” says Marini. “That’s complicated stuff. In 3D with the headset on we would take a shot that was in 2D, draw a rough roto mask around the person, create a 3D field, pull their nose forward, push their eyes back, push the sky back — all in a matter of seconds. That is next-level stuff for VR post.”
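
Dimensionalizing is essentially depth-based 2D-to-stereo conversion: a rough depth map (painted with the help of roto masks) drives a horizontal disparity for each pixel, so near objects shift more between the eyes than far ones. The sketch below shows the core idea in its crudest form, with hypothetical names and without the occlusion filling a real tool like Scratch would do.

```python
import numpy as np

def depth_to_stereo(frame, depth, max_disparity_px=12):
    """Crude 2D-to-stereo ('dimensionalizing') sketch: shift each pixel
    horizontally by a disparity proportional to its depth value.

    depth is a float map in [0, 1] where 1 is closest to camera (e.g.
    painted from a roto mask); nearer pixels get larger disparity.
    Real tools fill the resulting occlusion holes; this sketch does not.
    """
    h, w = frame.shape[:2]
    disparity = (depth * max_disparity_px).astype(int)
    xs = np.arange(w)[None, :].repeat(h, axis=0)
    rows = np.arange(h)[:, None].repeat(w, axis=1)
    lx = np.clip(xs + disparity, 0, w - 1)   # shift right in the left eye
    rx = np.clip(xs - disparity, 0, w - 1)   # shift left in the right eye
    left, right = np.zeros_like(frame), np.zeros_like(frame)
    left[rows, lx] = frame
    right[rows, rx] = frame
    return left, right

img = np.random.rand(540, 960, 3)
depth = np.zeros((540, 960)); depth[200:400, 300:600] = 1.0  # the "person"
L, R = depth_to_stereo(img, depth)
```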

Local Hero also used Scratch Web for reviews. “Moments after we finished a shot or sequence it was online and someone could put on a headset and watch it. That was hugely helpful. Doug was in London, Condé Nast in New York. Lexus was a sponsor of this, so their agency in New York was also involved. Jaunt is down the street from us here in Santa Monica. And there were three clients in the bay with us at all times.”

‘Invisible’

As such, there is no way to standardize a VR DI workflow, he says. “For Invisible, it was definitely all hands on deck and every day was a new challenge. It was 4K 60p stereo, so the amount of data we had to push — 4K 60p to both eyes — was unprecedented.” Strange stereo artifacts would appear for no apparent reason. “A bulge would suddenly show up on a wall and we’d have to go in there and figure out why and fix it. Do we warp it? Try something else? It was like that throughout the entire project: invent the workflow every day and fudge your way through. But that’s the nature of experimental technology.”

Will there be a watershed VR moment in the year ahead? “I think it all depends on the headsets, which are going to be like mobile phones,” he says. “Every six months there will be a new group of them that will be better and more powerful with higher resolution. I don’t think there will be a point in the future when everyone has a self-contained high-end headset. I think the more affordable headsets that you put your phone into, like Gear VR and Daydream, are the way most people will begin to experience VR. And we’re only 20 percent of the way there now. The whole idea of VR narrative content is completely unknown and it remains to be seen if audiences care and want it and will clamor for it. When they do, then we’ll develop a healthy VR content industry in Hollywood.”


Beth Marchant has been covering the production and post industry for 21 years. She was the founding editor-in-chief of Studio/monthly magazine and the co-editor of StudioDaily.com. She continues to write about the industry.