
HPA Tech Retreat takes on realities of virtual reality

By Tom Coughlin

The HPA Tech Retreat, run by the Hollywood Professional Association in association with SMPTE, began with an insightful one-day VR seminar — Integrating Virtual Reality/Augmented Reality into Entertainment Applications. Lucas Wilson from SuperSphere kicked off the sessions and helped with much of the organization of the seminar.

The seminar addressed virtual reality (VR), augmented reality (AR) and mixed reality (MR, a subset of AR where the real world and the digital world interact, like Pokémon Go). As in traditional planar video, 360-degree video still requires a director to tell a story and direct the eye to see what is meant to be seen. Successful VR requires understanding how people look at things and how they perceive reality, and using that understanding to help tell a story. One technique that may help is reinforcing the viewer’s gaze with color and sound that vary with what is being looked at — e.g., these may be different for the “good guy” and the “bad guy.”

VR workflows are quite different from traditional ones, with many elements changing with multiple-camera content. For instance, it is much more difficult to keep a camera crew out of the image, and providing proper illumination for all the cameras can be a challenge. The image below from Jaunt shows their 360-degree workflow, including the use of their cloud-based computational image service to stitch the images from the multiple cameras.
Snapchat is the biggest MR application, said Wilson, and Snapchat Stories could be the basis of future post tools.

Because stand-alone headsets (head-mounted displays, or HMDs) are expensive, most users of VR rely on smartphone-based displays. There are also venues that allow one or more people to experience VR, such as the IMAX center in Los Angeles. Activities such as VR viewing will be one of the big drivers for higher-resolution mobile device displays.

Tools that allow artists and directors to get fast feedback on their shots are still in development. But progress is being made, and today over 50 percent of VR is used for video viewing rather than games. Participants in a VR/AR market session, moderated by the Hollywood Reporter’s Carolyn Giardina and including Marcie Jastrow, David Moretti, Catherine Day and Phil Lelyveld, seemed to agree that the biggest immediate opportunity is probably with AR.

Koji Gardiner from Jaunt gave a great talk on their approach to VR. He discussed the various ways that 360-degree video can be captured and the processing required to create finished stitched video. For an array of cameras with some separation between them (no common axis point for the imaging cameras), there will be areas between camera images that need to be stitched together using common reference points, as well as blind spots near the cameras where no image is captured.

If there is a single axis for all of the cameras, then there are effectively no blind spots and no stitching is required, as shown in the image below. Covering the full 360-degree space, however, requires additional cameras located on that axis.

The Fraunhofer Institute in Germany has for several years been showing a 360-degree video camera that gives several cameras an effective single axis, as shown below. It does this using mirrors to reflect images into the individual cameras.

As the number of cameras is increased, the parallax between adjacent views shrinks, and the mathematical work to stitch the 360-degree images together is reduced.

Stitching
There are two approaches commonly used in VR stitching of multiple camera videos. The easier to implement is a geometric approach that uses known geometries and distances to objects. It requires limited computational resources, but it produces unavoidable ghosting artifacts at the seams between the separate images.

The Optical Flow approach synthesizes every pixel by computing correspondences between neighboring cameras. This approach eliminates the ghosting artifacts at the seams but has its own more subtle artifacts and requires significantly more processing capability. The Optical Flow approach requires computational capabilities far beyond those normally available to content creators. This has led to a growing market to upload multi-camera video streams to cloud services that process the stitching to create finished 360-degree videos.
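The trade-off is easy to see in a toy sketch. The snippet below (illustrative only, not Jaunt's pipeline; the strips and overlap width are made up) implements the geometric-style approach as a simple linear cross-fade over the overlap region between two camera images. It is cheap to compute, but any parallax misalignment inside the overlap shows up as the classic ghosting artifact:

```python
import numpy as np

def feather_blend(left, right, overlap):
    """Geometric-style stitch of two horizontally overlapping image strips.

    The last `overlap` columns of `left` and the first `overlap` columns
    of `right` show the same scene region; we cross-fade them linearly.
    If content is misaligned in the overlap, both copies show through
    faintly -- the ghosting that optical-flow stitching avoids.
    """
    alpha = np.linspace(1.0, 0.0, overlap)  # weight for the left image
    seam = left[:, -overlap:] * alpha + right[:, :overlap] * (1.0 - alpha)
    return np.hstack([left[:, :-overlap], seam, right[:, overlap:]])

# Two 4x6 gray strips sharing a 2-column overlap.
a = np.full((4, 6), 100.0)
b = np.full((4, 6), 200.0)
pano = feather_blend(a, b, 2)
print(pano.shape)  # (4, 10)
```

An optical-flow stitcher instead warps pixels along per-pixel correspondence vectors before blending, which removes the ghosting at a much higher computational cost.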

Files from the Jaunt One camera system are first downloaded and organized on a laptop computer and then uploaded to Jaunt’s cloud server to be processed and create the stitching to make a 360 video. Omni-directionally captured audio can also be uploaded and mixed ambisonically, resulting in advanced directionality in the audio tied to the VR video experience.
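A minimal sketch of why the ambisonic format suits head-tracked VR: a first-order B-format signal (W, X, Y, Z channels) can be re-oriented to follow the viewer's head with a simple rotation of the X/Y channels. This is illustrative only, not Jaunt's implementation, and sign conventions vary between ambisonic toolchains:

```python
import numpy as np

def rotate_b_format_yaw(w, x, y, z, yaw_rad):
    """Rotate a first-order ambisonic (B-format) sound field about the
    vertical axis. W (omni) and Z (height) are unchanged; the X and Y
    directional channels rotate like a 2D vector."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    return w, c * x - s * y, s * x + c * y, z

# A source encoded straight ahead (+X), rotated a quarter turn.
w2, x2, y2, z2 = rotate_b_format_yaw(1.0, 1.0, 0.0, 0.5, np.pi / 2)
```

Because the rotation is just channel arithmetic, a player can apply it per audio block as the headset reports new head orientations.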

Google and Facebook also have cloud-based resources for computational photography used for this sort of image stitching.

The Jaunt One 360-degree camera has a 1-inch 20MP rolling-shutter sensor with frame rates up to 60fps, a 3200 max ISO and 29dB SNR at ISO 800. It offers 10 stops per camera module with a custom 130-degree diagonal FOV, 4/2.9 optics and up to 16K resolution (8K per eye). At 60fps the Jaunt One produces 200GB per minute uncompressed, which can fill a 1TB SSD in five minutes. They are forced to use compression to be able to use currently affordable storage devices; compressed, the camera produces 11GB per minute, which can fill a 1TB SSD in 90 minutes.
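The storage arithmetic in those figures is easy to verify (treating 1TB as the decimal 1,000GB):

```python
# Jaunt One data rates as quoted in the text; 1TB treated as 1,000GB.
uncompressed_gb_per_min = 200
compressed_gb_per_min = 11
ssd_gb = 1000

minutes_uncompressed = ssd_gb / uncompressed_gb_per_min  # 5.0 minutes
minutes_compressed = ssd_gb / compressed_gb_per_min      # ~91 minutes
ratio = uncompressed_gb_per_min / compressed_gb_per_min  # ~18:1 compression
print(minutes_uncompressed, round(minutes_compressed), round(ratio, 1))
```

The compressed figure works out to roughly 91 minutes, matching the "90 minutes" quoted, and implies an approximately 18:1 compression ratio.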

The actual stitched image, laid out flat, looks like a distorted projection. But when viewed in a stereoscopic viewer it looks like a natural image of the world around the viewer, giving an immersive experience. At any point in time the viewer does not see all of the image, only the restricted portion they are looking at directly, as shown in the red box in the figure below.

The full 360-degree image can be quite high resolution, but unless special steps are taken, the resolution inside the portion of the scene being viewed at any moment will be much lower than the resolution of the overall scene.

The image below shows that for a 4K 360-degree video, the resolution in the field of view (FOV) may be only about 1K — a much lower resolution, and one quite perceptible to the human eye.
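The arithmetic behind that claim: an equirectangular frame spreads its full width over 360 degrees of longitude, so a headset with a roughly 90-degree horizontal field of view (an assumed, typical figure) sees only a quarter of the horizontal pixels:

```python
equirect_width = 3840  # width of a "4K" equirectangular frame
hfov_deg = 90          # assumed headset horizontal field of view

pixels_in_fov = equirect_width * hfov_deg / 360
print(pixels_in_fov)   # 960.0 -- roughly "1K" across the visible view
```
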

In order to provide a better viewing experience in the FOV, either the resolution of the entire view must be higher (e.g., the high-resolution Jaunt One version delivers 8K per eye and thus 16K total displayed resolution), or there must be a way to increase the resolution in the most significant FOV of a video so that, at least in that FOV, the resolution gives a greater feeling of reality.

Virtual reality, augmented reality and mixed reality create new ways of interacting with the world around us and will drive consumer technologies and the need for 360-degree video. New tools and stitching software, much of this cloud-based, will enable these workflows for folks who want to participate in this revolution in content. The role of a director is as important as ever as new methods are needed to tell stories and guide the viewer to engage in this story.

2017 Creative Storage Conference
You can learn more about the growth in VR content in professional video and how this will drive new digital storage demand and technologies to support the high data rates needed for captured content and cloud-based VR services at the 2017 Creative Storage Conference — taking place May 24, 2017 in Culver City.


Thomas M. Coughlin of Coughlin Associates is a storage analyst and consultant. He has over 30 years in the data storage industry and is the author of Digital Storage in Consumer Electronics: The Essential Guide.

VFX house Jamm adds Flame artist Mark Holden

Santa Monica-based visual effects boutique Jamm has added veteran Flame artist Mark Holden to its roster. Holden comes to Jamm with over 20 years of experience in post production, including stints in London and Los Angeles.

It didn’t take long for Holden to dive right in at Jamm; he worked on Space 150’s Buffalo Wild Wings Super Bowl campaign directed by the Snorri Bros. and starring Brett Favre. The Super Bowl teaser kicked off the pre-game.

Holden is known not only for his visual effects talent, but also for turning projects around under tight deadlines and offering his clients as many possible solutions within the post process. This has earned him work with leading agencies such as Fallon, Mother, Saatchi & Saatchi, Leo Burnett, 180, TBWA/Chiat/Day, Goodby Silverstein & Partners, Deutsch, David & Goliath, and Team One. He has worked with brands including Lexus, Activision, Adidas, Chevy, Geico, Grammys, Kia, Lyft, Pepsi, Southwest Airlines, StubHub, McDonald’s, Kellogg’s, Stella Artois, Silk, Heineken and Olay.



Craig Zerouni joins Deluxe VFX as head of technology

Deluxe has named Craig Zerouni as head of technology for Deluxe Visual Effects. In this role, he will focus on continuing to unify software development and systems architecture across Deluxe’s Method studios in Los Angeles, Vancouver, New York and India, and its Iloura studios in Sydney and Melbourne, as well as LA’s Deluxe VR.

Based in LA and reporting to president/GM of Deluxe VFX and VR Ed Ulbrich, Zerouni will lead VFX and VR R&D and software development teams and systems worldwide, working closely with technology teams across Deluxe’s Creative division.

Zerouni has been working in media technology and production for nearly three decades, joining Deluxe most recently from DreamWorks, where he was director of technology at its Bangalore, India-based facility, overseeing all technology. Prior to that he spent nine years at Digital Domain, where he was first head of R&D responsible for software strategy and teams in five locations across three countries, then senior director of technology overseeing software, systems, production technology, technical directors and media systems. He has also directed engineering, products and teams at software/tech companies Silicon Grail, Side Effects Software and Critical Path. In addition, he was co-founder of London-based computer animation company CFX.

Zerouni’s work has contributed to features including Tron: Legacy, Iron Man 3, Maleficent, X-Men: Days of Future Past, Ender’s Game and more than 400 commercials and TV IDs and titles. He is a member of BAFTA, ACM/SIGGRAPH, IEEE and the VES. He has served on the AMPAS Digital Imaging Technology Subcommittee and is the author of the technical reference book “Houdini on the Spot.”

Says Ulbrich on the new hire: “Our VFX work serves both the features world, which is increasingly global, and the advertising community, which is increasingly local. Behind the curtain at Method, Iloura, and Deluxe, in general, we have been working to integrate our studios to give clients the ability to tap into integrated global capacity, technology and talent anywhere in the world, while offering a high-quality local experience. Craig’s experience leading global technology organizations and distributed development teams, and building and integrating pipelines is right in line with our focus.”


Assimilate Scratch and Scratch VR Suite upgraded to V.8.6

Assimilate is now offering an open beta for Scratch 8.6 and the Scratch VR Suite 8.6, the latest versions of its realtime post tools and workflow for VR/360 and 2D/3D content, from dailies to conform, grading, compositing and finishing. Expanded HDR functions are featured throughout the product line, including in Scratch VR, which now offers stitching capabilities.

Both open beta versions give pros the opportunity to actively use the full suite of Scratch and Scratch VR tools, while evaluating and submitting requests and recommendations for additional features or updates.

Scratch Web for cloud-based, realtime review and collaboration, and Scratch Play for immediate review and playback, are also included in the ecosystem updates. Both products support VR/360 and 2D/3D content.

Current users of the Scratch VR Suite 8.5 and Scratch Finishing 8.5 can download the Scratch 8.6 open beta. Scratch 8.6 open beta and the Scratch VR Suite open beta are available now.

“V8.6 is a major update for both Scratch and the Scratch VR Suite with significant enhancements to the HDR and ACES workflows. We’ve added stitching to the VR toolset so that creators have a complete and streamlined end-to-end VR workflow,” says Jeff Edson, CEO at Assimilate. “The open Beta helps us to continue developing the best and most useful post production features and techniques all artists need to perfect their creativity in color grading and finishing. We act on all input, much of it immediately and some in regular updates.”

Here are some details of the update:

HDR
• PQ and HLG transfer functions are now an integral part of Scratch color management.
• Scopes automatically switch to HDR mode if needed, show levels on a nit scale and highlight any reference level that you set.
• At the project level, define the HDR mastering metadata: color space, color primaries and white levels, luminance levels and more. The metadata is automatically included in the Video HDMI interface (AJA, BMD, Bluefish444) for display.
• A function calculates content luminance metadata such as MaxCLL and MaxFALL directly from the footage.
• HDR footage can be published directly to YouTube with HDR metadata.
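As a rough sketch of what those two luminance values measure (simplified: real encoders derive them from the per-pixel maxRGB component per CTA-861.3; here per-pixel luminance in nits is taken as given):

```python
import numpy as np

def maxcll_maxfall(frames_nits):
    """MaxCLL: the brightest single pixel across all frames.
    MaxFALL: the highest per-frame average light level."""
    maxcll = max(float(f.max()) for f in frames_nits)
    maxfall = max(float(f.mean()) for f in frames_nits)
    return maxcll, maxfall

# Two toy 1x2 "frames" of per-pixel luminance in nits.
frames = [np.array([[100.0, 950.0]]), np.array([[400.0, 400.0]])]
cll, fall = maxcll_maxfall(frames)
print(cll, fall)  # 950.0 525.0
```
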

VR/360 – Scratch VR Suite
• 360 stitching functionality: load all your source media from your 360 cameras into Scratch VR and combine it into a single equirectangular image. Support for camera stitch templates: AutoPano projects, Hugin and PTStitch scripts.
• Ambisonic Audio: Scratch VR can load, set and playback ambisonic audio files to complete the 360 immersive experience.
• Video with 360 sound can be published directly to YouTube 360.
• Additional overlay handles extend the existing 2D-equirectangular feature for easier positioning of 2D elements in a 360 scene.

DIT Reporting Function
• Create a report of all clips of either a timeline, a project or just a selection of shots.
• Reports include metadata, such as a thumbnail, clip-name, timecode, scene, take, comments and any metadata attached to a clip.
• Choose from predefined templates or create your own.


Rick & Morty co-creator Justin Roiland to keynote VRLA

Justin Roiland, co-creator of Rick & Morty from Cartoon Network’s Adult Swim, will be delivering VRLA’s Saturday keynote. The expo, which takes place April 14 and 15 at the LA Convention Center, will include demos, educational sessions, experimental work and presentations.

The exhibit floor will feature hardware and software developers, content creators and prototype technology that can only be seen at VRLA. Registration is currently open, with the business-focused two-day “Pro” pass at $299 and a one-day pass for Saturday priced at $40.

Roiland, also the newly minted founder of the VR studio Squanchtendo, aims to dive into the surreally funny possibilities of the medium in his keynote, remarking, “What does the future of VR hold? Will there be more wizard games? Are grandmas real? What is a wizard really? Are there wizard grandmas? How does this factor into VR? Please come to my incredible keynote address on the state of VR.”

VRLA is currently accepting applications for its Indie Zone, which offers complimentary exhibition space to small teams who have raised less than $500,000 in venture capital funding or generated less than that amount in revenue. Click here to apply.


One of Lenovo’s new mobile workstations is VR-ready

Lenovo Workstations launched three new mobile workstations at Solidworks World 2017 — the Lenovo ThinkPad P51s and P51, as well as its VR-ready ThinkPad P71.

ThinkPad P51s

The ThinkPad P51s features a new chassis, Intel’s seventh-generation Core i7 processors and the latest Nvidia Quadro workstation graphics, as well as a 4K UHD IPS display with optional IR camera. With all its new features, the ThinkPad P51s still boasts a lightweight, Ultrabook build, shaving off over half a pound from the previous generation. In fact, the P51s is the lightest and thinnest mobile ThinkPad. It also offers Intel Thunderbolt 3 technology with a docking solution, providing users ultra-fast connectivity and the ability to move massive files quickly.

Also new are the ThinkPad P51 — including 4K IPS display with 100 percent color gamut and X-Rite Pantone color calibrator — and the VR-ready ThinkPad P71. These mobile workstations are MIL-SPEC tested and offer a dual-fan cooling system to allow users to push their system harder for use in the field. These two new offerings feature 2400MHz DDR4 memory, along with massive storage. The ThinkPad P71 handles up to four storage devices. These two workstations also feature the latest Intel Xeon processors for mobile workstations and are ISV-certified.

Taking on VR
The VR-ready ThinkPad P71 (our main image) features Nvidia Pascal-based Quadro GPUs and comes equipped with full Oculus and HTC certifications, along with Nvidia’s VR-ready certification.

SuperSphere, a creative VR company, is using the P71. “To create high-quality work on the go, our company requires Lenovo’s industry-leading mobile workstations that allow us to put the performance of a tower in our backpacks,” says SuperSphere partner/director Jason Diamond. “Our company’s focus on VR requires us to travel to a number of locations, and the ThinkPad P71 lets us achieve the same level of work on location as we can in the office, with the same functionality.”

The Lenovo P51s will be available in March, starting at $1,049, while the P51 and P71 will be available in April, starting at $1,399 and $1,849, respectively.


Chris Hill & Sami Tahari

Imaginary Forces expands with EP Chris Hill and director of biz dev Sami Tahari

Imaginary Forces has added executive producer Chris Hill and director of business development Sami Tahari to its Los Angeles studio. The additions come at a time when the creative studio is looking to further expand its cross-platform presence with projects that mix VR/AR/360 with traditional, digital and social media.

Celebrating 20 years in business this year, the independently owned Imaginary Forces is a creative company specializing in brand strategy and visual storytelling encompassing many disciplines, including full-service design, production and post production. Being successful for that long in this business means they are regularly innovating and moving where the industry takes them. This led to the hiring of Hill and Tahari, whose diverse backgrounds will help strengthen the company’s long-standing relationships, as well as its continuous expansion into emerging markets.

Recent work of note includes main titles for Netflix’s beloved Stranger Things, the logo reveal for Michael Bay’s Transformers: The Last Knight and an immersive experience for the Empire State Building.

Hill’s diverse production experience includes commercials, experience design, entertainment marketing and branding for such clients as HBO Sports, Google, A&E and the Jacksonville Jaguars, among others. He joins Imaginary Forces after recently presiding over the broadcast division of marketing agency BPG.

Tahari brings extensive marketing, business and product development experience spanning the tech and entertainment spaces. His resume includes time at Lionsgate and Google, where he was an instrumental leader in the creative development and marketing of Google Glass.

“Imaginary Forces has a proven ability to use design and storytelling across any medium or industry,” adds Hill. “We can expand that ability to new markets, whether it’s emerging technologies, original content or sports franchises. When you consider, for example, the investment in massive screens and new technologies in stadiums across the country, it demands [that] same high level of brand strategy and visual storytelling.”

Our Main Image: L-R: Chris Hill and Sami Tahari.


Rise Above

Sundance 2017: VR for Good’s Rise Above 

By Elise Ballard

On January 22, during the Sundance Film Festival in Park City, the Oculus House had an event for their VR for Good initiative, described as “helping non-profits and rising filmmakers bring a variety of social missions to life.” Oculus awarded 10 non-profits a $40,000 grant and matched them with VR filmmakers to make a short film related to their community and cause.

One of the films, Rise Above, highlights a young girl’s recovery from sexual abuse and the support and therapy she received from New York City’s non-profit Womankind (formerly New York Asian Women’s Center).

Rise Above is a gorgeous film — shot on the Nokia Ozo camera — and really well done, especially in how it guides your eye to the storytelling going on in a VR 360 environment. I had the opportunity to interview the filmmakers, Ben Ross and Brittany Neff, about their experience. I was curious why they feel VR is one of the best mediums for creating empathy and action for social impact. Check out their website.

Referencing the post process, Ross said he wore headsets the entire time as he worked with the editor in order to make sure it worked as a VR experience. All post production for VR for Good films was done at Reel FX. In terms of tools, for stitching the footage they used a combination of the Ozo Creator software from Nokia, Autopano Video from Kolor and the Cara plug-in for Nuke. Reel FX finished all the shots in Nuke (again making major use of Cara) and Autodesk’s Flame for seam fixing and rig removal. TD Ryan Hartsell did the graphics work in After Effects, using the Mettle plug-in to help him place the graphics in 360 space and in 3D.

For more on the project and Reel FX’s involvement visit here.

Oculus’ VR for Good initiative will be exhibiting at other major film festivals throughout the year, and the films will be distributed by Facebook after the festival circuit.

Visit VR for Good here for more information, news and updates, and to stay connected (and apply!) to this inspiring and cutting-edge project.

Elise Ballard is a Los Angeles-based writer and author of Epiphany, True Stories of Sudden Insight, and the director of development at Cognition and Arc/k Project, a non-profit dedicated to preserving cultural heritage via virtual reality and digital media.


HPA Tech Retreat takes on VR/AR at Tech Retreat Extra

The long-standing HPA Tech Retreat is always a popular destination for tech-focused post pros, and while they have touched on virtual reality and augmented reality in the past, this year they are dedicating an entire day to the topic — February 20, the day before the official Retreat begins. TR-X (Tech Retreat Extra) will feature VR experts and storytellers sharing their knowledge and experiences. The traditional HPA Tech Retreat runs from February 21-24 in Indian Wells, California.

TR-X VR/AR is co-chaired by Lucas Wilson (Founder/Executive Producer at SuperSphereVR) and Marcie Jastrow (Senior VP, Immersive Media & Head of Technicolor Experience Center), who will lead a discussion focused on the changing VR/AR landscape in the context of rapidly growing integration into entertainment and applications.

Marcie Jastrow

Experts and creative panelists will tackle questions such as: What do you need to understand to enable VR in your environment? How do you adapt? What are the workflows? Storytellers, technologists and industry leaders will provide an overview of the technology and discuss how to harness emerging technologies in the service of the artistic vision. A series of diverse case studies and creative explorations — from NASA to the NFL — will examine how to engage the audience.

The TR-X program, along with the complete HPA Tech Retreat program, is available here. Additional sessions and speakers will be announced.

TR-X VR/AR Speakers and Panel Overview
Monday, February 20

Opening and Introductions
Seth Hallen, HPA President

Technical Introduction: 360/VR/AR/MR
Lucas Wilson

Panel Discussion: The VR/AR Market
Marcie Jastrow
David Moretti, Director of Corporate Development, Jaunt
Catherine Day, Head of VR/AR, Missing Pieces
Phil Lelyveld, VR/AR Initiative Program Lead, Entertainment Technology Center at USC

Acquisition Technology
Koji Gardiner, VP, Hardware, Jaunt

Live 360 Production Case Study
Andrew McGovern, VP of VR/AR Productions, Digital Domain

Live 360 Production Case Study
Michael Mansouri, Founder, Radiant Images

Interactive VR Production Case Study
Tim Dillon, Head of VR & Immersive Content, MPC Advertising USA

Immersive Audio Production Case Study
Kyle Schember, CEO, Subtractive

Panel Discussion: The Future
Alan Lasky, Director of Studio Product Development, 8i
Ben Grossmann, CEO, Magnopus
Scott Squires, CTO, Creative Director, Pixvana
Moderator: Lucas Wilson
Jen Dennis, EP of Branded Content, RSA

Panel Discussion: New Voices: Young Professionals in VR
Anne Jimkes, Sound Designer and Composer, Ecco VR
Jyotsna Kadimi, USC Graduate
Sho Schrock, Chapman University Student
Brian Handy, USC Student

TR-X also includes an ATSC 3.0 seminar, focusing on the next-generation television broadcast standard, which is nearing completion and offers a wide range of new content delivery options to the TV production community. This session will explore the expanding possibilities that the new standard provides in video, audio, interactivity and more. Presenters and panelists will also discuss the complex next-gen television distribution ecosystem that content must traverse, and the technologies that will bring the content to life in consumers’ homes.

Early registration is highly recommended for TR-X and the HPA Tech Retreat, which is a perennially sold-out event. Attendees can sign up for TR-X VR/AR, TR-X ATSC or the HPA Tech Retreat.

Main Image: Lucas Wilson.

Review: Mettle VR plug-ins for Adobe Premiere

By Barry Goch

I was very frustrated. I took a VR production class and bought an LG 360 camera, but I felt like I was missing something. Then it dawned on me — I wanted to have more control. I started editing 360 videos using the VR video viewing tools in Adobe Premiere Pro, but I still lacked the control I desired. I wanted my audience to have a guided, immersive experience without having to be in a swivel chair to get the most out of my work. Then, like a bolt of lightning, it came to me — I needed to rotate the 360 video sphere. I needed to be able to reorient it to accomplish my vision, but how would I do that?

Rotate Sphere plug-in showing keyframing.

Mettle’s Skybox 360/VR Tools are exactly what I was looking for. The Rotate Sphere plug-in alone is worth the price of the entire plug-in package. With this one plug-in, you’re able to re-orient your 360 video without worrying about any technical issues — it gives you complete creative control to re-frame your 360 video — and it’s completely keyframable too! For example, I mounted my 360 camera on my ski helmet this winter and went down a ski run at Heavenly in Lake Tahoe. There are amazing views of the lake from this run, but I also needed to follow the skiers ahead of me. Plus, the angle of the slope changed and the angle to the subjects I was following changed as well. Since the camera was fixed, how could I guide the viewer? By using the Rotate Sphere plug-in from Mettle and keyframing the orientation of the shot as the slope/subject relationship changed relative to my position.
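Part of why a yaw re-orientation can run in realtime and stay keyframable: in an equirectangular frame, longitude maps linearly to the horizontal axis, so rotating the sphere about its vertical axis is just a circular shift of columns. A minimal sketch of that one axis (not Mettle's implementation; pitch and roll require a full spherical remap):

```python
import numpy as np

def yaw_equirect(frame, yaw_deg):
    """Re-orient an equirectangular frame about the vertical axis by
    circularly shifting columns; no resampling and no distortion."""
    width = frame.shape[1]
    shift = int(round(yaw_deg / 360.0 * width))
    return np.roll(frame, shift, axis=1)

frame = np.arange(8).reshape(1, 8)  # a toy 1x8 "panorama"
turned = yaw_equirect(frame, 90)    # quarter turn = shift of 2 columns
print(turned.tolist())              # [[6, 7, 0, 1, 2, 3, 4, 5]]
```

Keyframing the yaw then amounts to animating a single shift value over time.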

My second favorite plug-in is Project 2D. Without it, when you add titles to your 360 videos they become warped and you have very little control over their appearance. With Project 2D, you create your title using the built-in titler in Premiere Pro, add it to the timeline, then apply the Project 2D Mettle Skybox plug-in. Now you have complete control over the scale and rotation of the titling element and the placement of the title within the 360 video sphere. You can also use the Project 2D plug-in to composite graphics or video into your 360 video environment.

Mobius Zoom transition in action.

Rounding out the Skybox plug-in set are 360-video-aware plug-ins that every content creator needs. What do I mean by 360-video-aware? For example, when you apply a blur that is not 360-video-content-aware, it crosses the seam where the equirectangular video’s edges join together and makes the seam unseemly. With the Skybox Blur, Denoise, Glow and Sharpen plug-ins, you don’t have this problem. Just as the Rotate Sphere plug-in does the crazy math to rotate your 360 video without distortion or introducing artifacts, these plug-ins do the same.
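At the seam, what makes a filter "360-aware" is essentially boundary handling: pad the frame with wrap-around columns before filtering, so the left and right edges — which meet on the sphere — blur into each other instead of producing a visible join. A toy horizontal box blur on a mono frame (illustrative only, not the Skybox implementation, which also has to correct for pole distortion):

```python
import numpy as np

def seam_aware_hblur(img, k):
    """Horizontal box blur with wrap-around padding so the filter sees
    across the equirectangular seam instead of stopping at the edge."""
    pad = k // 2
    padded = np.pad(img, ((0, 0), (pad, pad)), mode="wrap")
    kernel = np.ones(k) / k
    return np.array([np.convolve(row, kernel, mode="valid") for row in padded])

img = np.tile(np.arange(4.0), (2, 1))  # each row: [0, 1, 2, 3]
out = seam_aware_hblur(img, 3)         # edge pixels blend with wrapped columns
```

With `mode="wrap"`, the leftmost output pixel averages in the rightmost source column, so the two edges match after blurring and the seam stays invisible.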

Transitioning between cuts in 360 video is an evolving art form. There is really no right or wrong way. Longer cuts, shorter cuts, dissolves and dips to black are some of the basic options. Now, Mettle is adding to our creative toolkit by applying their crazy math skills on transitions in 360 videos. Mettle started with their first pack of four transitions: Mobius Zoom, Random Blocks, Gradient Wipe and Iris Wipe. I used the Mobius Zoom to transition from the header card to the video and then the Iris Wipe with a soft edge to transition from one shot to the next in the linked video.

Check out this video, which uses the Rotate Sphere, Project 2D, Mobius Zoom and Iris Wipe effects.

New Plug-Ins
I’m pleased to be among the first to show you their second set of plug-ins specifically designed for 360 / VR video! Chroma Leaks, Light Leaks, Spherical Blurs and everyone’s favorite, Light Rays!

Mettle plug-ins work on both Mac and Windows platforms — on qualified systems — and in realtime. The Mettle plug-ins are also both mono- and stereo-aware.

The Skybox plug-in set for Adobe Premiere Pro is truly the answer I’ve been looking for since I started exploring 360 video. It’s changed the way I work and opened up a world of control that I had been wishing for. Try it for yourself by downloading a demo at www.mettle.com.


Barry Goch is currently a digital intermediate editor for Deluxe in Culver City, working on Autodesk Flame. He started his career as a camera tech for Panavision Hollywood. He then transitioned to an offline Avid/FCP editor. His resume includes Passengers, Money Monster, Eye in the Sky and Game of Thrones. His latest endeavor is VR video.