Category Archives: 360

Z Cam, Assimilate reduce price of S1 VR camera/Scratch VR bundle

The Z Cam S1 VR camera/WonderStitch/Assimilate Scratch VR Z bundle, an integrated VR production workflow offering, is now $3,999, down from $4,999.

The Z Cam S1/Scratch VR Z bundle provides acquisition via Z Cam’s S1 pro VR camera, stitching via the WonderStitch software and a streamlined VR post workflow via Assimilate’s realtime Scratch VR Z tools.

Here are some details:
When streaming live 360 from the Z Cam S1 through Scratch VR Z, users can take advantage of realtime features such as inserting and compositing graphics and text overlays, including animations, and keying for elements like greenscreen, all streaming live to Facebook Live 360.

Scratch VR Z can be used for live camera preview prior to shooting with the S1. During the shoot, Scratch VR Z handles dailies and data management, including metadata, connecting directly to the camera from the PC via a high-speed Ethernet port. Stitching of the imagery is done in Z Cam’s WonderStitch, now integrated into Scratch VR Z. From there the workflow moves into traditional editing, color grading, compositing, multichannel audio from the S1 (or adding external ambisonic sound), finishing, and publishing to all final online or standalone 360 platforms.

The Z Cam S1/Scratch VR Z bundle is available now.

Behind the Title: Light Sail VR’s Matthew Celia

NAME: Matthew Celia

COMPANY: LA’s Light Sail VR (@lightsailvr)

CAN YOU DESCRIBE YOUR COMPANY?
Light Sail VR is a virtual reality production company specializing in telling immersive narrative stories. We’ve built a strong branded content business over the last two years working with clients such as Google and GoPro, and studios like Paramount and ABC.

Whether it’s 360 video, cinematic VR or interactive media, we’ve built an end-to-end pipeline to go from script to final delivery. We’re now excited to be moving into creating original IP and more interactive content that fuses cinematic live-action film footage with game engine mechanics.

WHAT’S YOUR JOB TITLE?
Creative Director and Managing Partner

WHAT DOES THAT ENTAIL?
A lot! We’re a small boutique shop so we all wear many hats. First and foremost, I am a director and work hard to deliver a compelling story and emotional connection to the audience for each one of our pieces. Story first is our motto, and I try and approach every technical problem with a creative solution. Figuring out execution is a large part of that.

In addition to the production side, I also carry a lot of the technical responsibilities in post production, such as keeping our post pipeline humming and inventing new workflows. Most recently, I have been dabbling in programming interactive cinema using the Unity game engine.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I am in charge of washing the lettuce when we do our famous “Light Sail VR Sandwich Club” during lunch. Yes, you get fed for free if you work with us, and I make an amazing Italian sandwich.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Hard to say. I really like what I do. I like being on set and working with actors because VR is such a great medium for them to play in, and it’s exciting to collaborate with such creative and talented people.

National Park Service

WHAT’S YOUR LEAST FAVORITE?
Render times and computer crashes. My tech life is in constant beta. Price we pay for being on the bleeding edge, I guess!

WHAT IS YOUR FAVORITE TIME OF THE DAY?
I like the early morning because it is quiet, my brain is fresh, and I haven’t yet had 20 people asking something of me.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Probably the same, but at a large company. If I left the film business I’d probably teach. I love working with kids.

WHY DID YOU CHOOSE THIS PROFESSION?
I feel like I’ve wanted to be a filmmaker since I could walk. My parents like to drag out the home movies of me asking to look in my dad’s VHS video camera when I was 4. I spent most of high school in the theater and most people assumed I would be an actor. But senior year I fell in love with film when I shot and cut my first 16mm reversal stock on an old reel-to-reel editing machine. The process was incredibly fun and rewarding and I was hooked. I only recently discovered VR, but in many ways it feels like the right path for me because I think cinematic VR is the perfect intersection of filmmaking and theater.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
On the branded side, we just finished up two tourism videos. One, for the National Park Service, was a 360 tour of the Channel Islands with Jordan Fisher; the other was a 360 piece for Princess Cruises. VR is really great for showing people the world. The last few months of my life have been consumed by Light Sail VR’s first original project, Speak of the Devil.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
Speak of the Devil is at the top of that list. It’s the first live-action interactive project I’ve worked on, and it’s massive. Crafted using the GoPro Odyssey camera in partnership with Google Jump, it features over 50 unique locations, 13 different endings and is currently taking up about 80TB of storage (and counting). It is the largest project I’ve worked on to date, and we’ve done it all on a shoestring budget thanks to the gracious contributions of talented creative folks who believed in our vision.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My instant-read grill meat thermometer, my iPhone and my Philips Hue bulbs. Seriously, if you have a baby, it’s a lifesaver being able to whisper, “Hey, Siri, turn off the lights.”

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
I’m really active on several Facebook groups related to 360 video production. You can get a lot of advice and connect directly with vendors and software engineers. It’s a great community.

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
I tend to pop on some music when I’m doing repetitive mindless tasks, but when I have to be creative or solve a tough tech problem, the music is off so that I can focus. My favorite music to work to tends to be Dave Matthews Band live albums. They get into 20-minute long jams and it’s great.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
De-stressing is really hard when you own your own company. I like to go walking, but if that doesn’t work, I’ll try diving into some cooking for my family, which forces me to focus on something not work related. I tend to feel better after eating a really good meal.


Industry mainstay Click3X purchased by Industrial Color Studios

Established New York City post house Click3X has been bought by Industrial Color Studios. Click3X is a 25-year-old facility that specializes in new media formats such as VR, AR, CGI and live streaming. Industrial Color Studios is a visual content production company. Founded in 1992, Industrial Color’s services range from full image capture and e-commerce photography to production support and post services, including creative editorial, color grading and CG.

With offices in New York and LA, Industrial Color has developed its own proprietary systems to support online digital asset management for video editing and high-speed file transfers for its clients working in broadcast and print media. The company is an end-to-end visual content production provider, partnering with top brands, agencies and creative professionals to accelerate multi-channel creative content.

Click3X was founded in 1993 by Peter Corbett, co-founder of numerous companies specializing in both traditional and emerging forms of media. These include Media Circus (a digital production and web design company), IllusionFusion, Full Blue, ClickFire Media, Reason2Be, Sound Lounge and Heard City. A longtime member of the DGA, Corbett emigrated to the US from Australia to pursue a career as a commercial director and, shortly thereafter, segued into integrated and mixed media, becoming one of the first established film directors to do so.

Projects produced at Click3X have been honored with the industry’s top awards, including Cannes Lions, Clios, Andy Awards and others. Click3X also received the Crystal Apple Award, presented by the New York City Mayor’s Office of Media and Entertainment, in recognition of its contributions to the city’s media landscape.

Corbett will remain in place at Click3X and eventually the companies will share the ICS space on 6th Avenue in NYC.

“We’ve seen a growing need for video production capabilities and have been in the market for a partner that would not only enhance our video offering, but one that provided a truly integrated and complementary suite of services,” says Steve Kalalian, CEO of Industrial Color Studios. “And Click3X was the ideal fit. While the industry continues to evolve at lightning speed, I’ve long admired Click3X as a company that’s consistently been on the cutting edge of technology as it pertains to creative film, digital video and new media solutions. Our respective companies share a passion for creativity and innovation, and I’m incredibly excited to share this unique new offering with our clients.”

“When Steve and I first entered into talks to align on the state of our clients’ future, we were immediately on the same page,” says Corbett, president of Click3X. “We share a vision for creating compelling content in all formats. As complementary production providers, we will now have the exciting opportunity not only to collaborate on a robust and highly regarded client roster, but also to expand the company’s creative and new media capabilities, using over 200,000 square feet of state-of-the-art facilities in New York, Los Angeles and Philadelphia.”

The added capabilities Click3X gives Industrial Color in video production and new media mirror its growth in the field of e-commerce photography and image capture. The company recently opened a new 30,000-square-foot studio in downtown Los Angeles designed to produce high-volume, high-quality product photography for advertisers. That studio complements the company’s existing e-commerce photography hub in Philadelphia.

Main Image: (L-R) Peter Corbett and Steve Kalalian


Editing 360 video with Lenovo’s Explorer WMR headset

By Mike McCarthy

Microsoft has released its Windows Mixed Reality (WMR) platform as part of the Fall Creators Update to Windows 10. The platform supports a variety of immersive experiences, and there are now WMR headsets available from many familiar names in the hardware business. One of those is Lenovo, which kindly sent me its Explorer WMR headset to test on my ThinkPad P71, giving me a complete VR experience on its hardware.

On November 15, Microsoft released beta support for SteamVR on WMR devices. This allows WMR headsets to be used in applications built for SteamVR. For example, the newest release of Adobe Premiere Pro (CC 2018, or v12.0) uses SteamVR for 360 video preview.

My goal for this article was to see if I could preview my 360 videos in a Lenovo headset while editing in Premiere, especially now that I had new 360 footage from my GoPro Fusion camera. I also provide some comparisons to the Oculus Rift which I reviewed for postPerspective in October.

There are a number of advantages to the WMR options, including lower prices and hardware requirements, higher image resolution and simpler setup. Oculus and HTC’s VR-Ready requirements have always been a bit excessive for 360 video, because unlike true 3D VR there is no 3D rendering involved when playing back footage from a fixed perspective. But would it work? No one seemed to know if it would, but Lenovo was willing to let me try.

The first step is to get your installation of Windows 10 upgraded with the Fall Creators Update, which includes integrated support for Windows Mixed Reality headsets. Once it is installed, you can plug in the headset’s single USB 3.0 cable and HDMI cable, and Windows will automatically configure the device and its drivers for you. You will also need to install Valve’s Steam application and SteamVR, which adds support for VR content. The next step is to find Microsoft’s Windows Mixed Reality for SteamVR in the Steam store, which is a free installation. Once you confirm that the headset is functioning in WMR and then in SteamVR, open up Premiere Pro and test it out.

Working in Premiere Pro
Within Premiere Pro, preview and playback worked immediately in my existing immersive project. I watched footage captured with my Samsung Gear 360 and GoPro Fusion cameras. The files played, and the performance increase in the new version of the software is noticeable. My 4K and 5K 30fps content worked great, but my new 3Kp60 content only played when Mercury Playback was set to software-only, which disabled most of the new Immersive Video effects. In CUDA mode, I could hold down the right arrow and watch it progress in slow motion, but pressing the space bar caused the VR preview to freeze, even though it played fine on the laptop monitor. The 60p content played fine in the Rift, so this appears to be an issue specific to WMR. Hopefully, that will be addressed in a software update in the near future.

The motion controllers were visible in the interface and allow you to scrub the timeline, but I still had to use the space bar to start and stop playback. One issue that arose was that the mouse cursor is hidden when the display is snapped down into place over my eyes. I had to tip it up out of the way each time I wanted to make a change, instead of just peeking under it, which meant a lot of snapping the headset up and down.

I found the WMR experience to be slightly less solid than the Oculus system. It would occasionally lag on the tracking for a couple of frames, causing the image to visibly jump. This may be due to the integrated inside-out tracking used instead of dedicated external cameras. The boundary system is a visual distraction, so I would recommend disabling it if you are primarily using the headset for 360 video, since that doesn’t require moving much within your space. Setup on the WMR is easier, with lower system requirements and fewer ports needed. The resolution is higher than the Oculus Rift I had tested (1440×1440 per eye instead of 1080×1200), so I wanted to see how much of a difference that would make. The Explorer also has a narrower field of view (105 degrees instead of 110), which I wouldn’t expect to make a difference, but I think it did.

By my calculations, the increased resolution should allow you to resolve a 5K sphere, compared to about 3.9K from the Rift: 1440 pixels / 105 degrees × 360 ≈ 4,900, versus 1,200 pixels / 110 degrees × 360 ≈ 3,900. You will also want a pair of headphones or earbuds to plug into the headset so the audio tracks with your head (unlike your computer speakers, which are fixed).
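That back-of-the-envelope estimate can be sketched in a few lines. This is a rough model only: it assumes the stated per-eye pixel counts and approximate fields of view, and the 3.9K Rift figure works out from the panel’s 1,200-pixel dimension.

```python
# Rough effective sphere resolution a headset can resolve: panel pixels
# spanned per degree of field of view, extrapolated to a full 360 degrees.
def sphere_resolution(panel_pixels: int, fov_degrees: float) -> float:
    return panel_pixels / fov_degrees * 360

# Lenovo Explorer (WMR): 1440 px across a ~105-degree field of view
wmr = sphere_resolution(1440, 105)   # ~4937 px, roughly a 5K sphere

# Oculus Rift: 1200 px across a ~110-degree field of view
rift = sphere_resolution(1200, 110)  # ~3927 px, roughly a 3.9K sphere

print(round(wmr), round(rift))
```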

The Feel of the Headset
The headset is designed very differently from the Rift, and the display can be tipped up out of the way while the headband is still on. It is also way easier to put on and remove, but a bit less comfortable to keep on for longer periods of time. The headband has to be on tight enough to hold the display in front of your eyes, since it doesn’t rest on your face, and the cabling has to slide through a clip on the headband when you fold the display upward. And since you have to fold the display upward to use the mouse, it is a frequent annoyance. But between the motion controllers and the keyboard, you can navigate and playback while the headset is on.

Using the Microsoft WMR lobby interface was an interesting experience, but I’m not sure it will catch on. SteamVR’s lobby experience isn’t much better, but Steam does offer a lot more content for its users. I anticipate Steam will be the dominant software platform, based on the fact that most hardware vendors (HTC, Oculus, WMR) support it. The fact that Adobe chose SteamVR for its immersive preview experience is why these new WMR headsets work in Premiere Pro without any further software updates from Adobe. (Adobe doesn’t officially support this configuration yet, hence the “beta” designation in SteamVR, but aside from 60p playback, I was very happy.) Hopefully we will see further increased support and integration between the various hardware and software options in the future.

Summing Up
Currently, the Lenovo Explorer and the Oculus Rift are both priced the same at $399 — I say currently because prices have been fluctuating, so investigate thoroughly. So which one is better? Well, neither is a clear winner. Each has its own strengths. The Rift has more specific hardware requirements and lower total resolution. The Explorer requires Windows 10, but will work on a wider array of systems. The Rift is probably better for periods of extended use, while I would recommend the Explorer if you are going to be doing something that involves taking it on and off all the time (like tweaking effects settings in Adobe apps). Large fixed installations may offer a better user experience with the Rift or Vive on a powerful GPU, but most laptop users will probably have an easier time with the Explorer (no external camera to calibrate and fewer ports needed).


Mike McCarthy is an online editor/workflow consultant with 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.


Rogue takes us on VR/360 tour of Supermodel Closets

Rogue is a NYC-based creative boutique that specializes in high-end production and post for film, advertising and digital. Since its founding two years ago, executive creative director Alex MacLean and his team have produced a large body of work, providing color grading, finishing and visual effects for clients such as HBO, Vogue, Google, Vice, Fader and more. For the past three years, MacLean has also been at the forefront of VR/360 content for narratives and advertising.

MacLean recently wrapped up post production on four five-minute episodes of 360-degree tours of Supermodel Closets. The series is a project of Condé Nast Entertainment and Vogue for Vogue’s 125th anniversary. If you’re into fashion, this VR tour gives you a glimpse at what supermodels wear in their daily lives. Viewers can look up, down and all around to feel immersed in the closet of each model as she shows her favorite fashions and shares the stories behind her most prized pieces.


Tours include the closets of Lily Aldridge, Cindy Crawford, Kendall Jenner and Amber Valletta.

MacLean worked with director Julina Tatlock, who is a co-founder and CEO of 30 Ninjas, a digital entertainment company that develops, writes and produces VR, multi-platform and interactive content. Rogue and 30 Ninjas worked together to determine the best workflow for the series. “I always think it’s best practice to collaborate with the directors, DPs and/or production companies in advance of a VR shoot to sort out any technical issues and pre-plan the most efficient production process from shoot to edit, stitching through all the steps of post-production,” reports MacLean. “Foresight is everything; it saves a lot of time, money, and frustration for everyone, especially when working in VR, as well as 3D.”

According to MacLean, they worked with a new camera format, the YI Halo camera, which is designed for professional VR data acquisition. “I often turn to the Assimilate team to discuss the format issues because they always support the latest camera formats in their Scratch VR tools. This worked well again because I needed to define an efficient VR and 3D workflow that would accommodate the conforming, color grading, creating of visual effects and the finishing of a massive amount of data at 6.7K x 6.7K resolution.”


The Post
“The post production process began by downloading 30 Ninjas’ editorial, stitched footage from the cloud to ingest into our MacBook Pro workstations to do the conform at 6K x 6K,” explains MacLean. “Organized data management is a critical step in our workflow, and Scratch VR is a champ at that. We were simultaneously doing the post for more than one episode, as well as other projects within the studio, so data efficiency is key.”

“We then moved the conformed 6.7K x 6.7K raw footage to our HP Z840 workstations to do the color grading, visual effects, compositing and finishing. You really need powerful workstations when working at this resolution and with this much data,” reports MacLean. “Spherical VR/360 imagery requires focused concentration, and then we’re basically doing everything twice when working in 3D. For these episodes, and for all VR/360 projects, we create a lat/long that breaks out the left eye and right eye into two spherical images. We then replicate the work from one eye to the next, and color correct any variances. The result is seamless color grading.


“We’re essentially using the headset as a creative tool with Scratch VR, because we can work in realtime in an immersive environment and see the exact results of work in each step of the post process,” he continues. “This is especially useful when doing any additional compositing, such as clean-up for artifacts that may have been missed or adding or subtracting data. Working in realtime eases the stress and time of doing a new composite of 360 data for the left eye and right eye 3D.”
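The lat/long step MacLean describes, breaking a stereo sphere into separate left-eye and right-eye images, can be sketched like this. It is a simplified illustration that assumes a top/bottom (over/under) stereo layout; the actual Scratch VR tooling handles this internally, and layouts vary.

```python
# Split an over/under stereo lat/long (equirectangular) frame into
# left-eye and right-eye spherical images. Assumes the top half of the
# frame holds the left eye and the bottom half the right eye.
def split_over_under(rows):
    half = len(rows) // 2
    return rows[:half], rows[half:]

# Toy 4-row "frame": two rows per eye
frame = [[1, 1], [2, 2], [3, 3], [4, 4]]
left, right = split_over_under(frame)
print(left)   # [[1, 1], [2, 2]]
print(right)  # [[3, 3], [4, 4]]
```

Grading is then replicated across both halves, which is why any variance between the eyes has to be color corrected afterward.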

Playback of content in the studio is very important to MacLean and team, and he calls the choice of multiple headsets another piece of the VR/360 puzzle. “The VR/3D content can look different in each headset, so we need to determine a mid-point aesthetic look that displays well in each headset. We have our own playback black box that we use to preview the color grading and visual effects before committing to rendering. And then we do a final QC review of the content, and for these episodes we did so in Google Daydream (untethered), HTC Vive (tethered) and the Oculus Rift (tethered).”

MacLean sees rendering as one of their biggest challenges. “It’s really imperative to be diligent throughout all the internal and client reviews prior to rendering. It requires being very organized in your workflow from production through finishing, and a solid QC check. Content at 6K x 6K, VR/360 and 3D means extremely large files and numerous hours of rendering, so we want to restrict re-rendering as much as possible.”


Review: GoPro Fusion 360 camera

By Mike McCarthy

I finally got the opportunity to try out the GoPro Fusion camera I have had my eye on since the company first revealed it in April. The $700 camera uses two offset fish-eye lenses to shoot 360 video and stills, while recording ambisonic audio from four microphones in the waterproof unit. It can shoot a 5K video sphere at 30fps, or a 3K sphere at 60fps for higher motion content at reduced resolution. It records dual 190-degree fish-eye perspectives encoded in H.264 to separate MicroSD cards, with four tracks of audio. The rest of the magic comes in the form of GoPro’s newest application Fusion Studio.

Internally, the unit records dual 45Mb H.264 files to two separate MicroSD cards, with accompanying audio and metadata assets. This would be a logistical challenge to deal with manually: copying the cards into folders, sorting and syncing them, stitching them together and dealing with the audio. But with GoPro’s new Fusion Studio app, most of this is taken care of for you. Simply plug in the camera and it will automatically access the footage, letting you preview and select which parts of which clips you want processed into stitched 360 footage or flattened video files.

It also processes the multi-channel audio into ambisonic B-Format tracks, or standard stereo if desired. The app is a bit limited in user-control functionality, but what it does do it does very well. My main complaint is that I can’t find a way to manually set the output filename, but I can rename the exports in Windows once they have been rendered. Trying to process the same source file into multiple outputs is challenging for the same reason.

Setting   Recorded Resolution (per lens)   Processed Resolution (equirectangular)
5Kp30     2704×2624                        4992×2496
3Kp60     1568×1504                        2880×1440
Stills    3104×3000                        5760×2880

With the Samsung Gear 360, I researched five different ways to stitch the footage, because I wasn’t satisfied with the included app. Most of those will also work with Fusion footage, and you can read about those options here, but they aren’t really necessary when you have Fusion Studio.

You can choose between H.264, Cineform or ProRes, your equirectangular output resolution, and ambisonic or stereo audio. That gives you pretty much every option you should need to process your footage. There is also a “Beta” option to stabilize your footage, which, once I got used to it, I really liked. It should be thought of more as a “remove rotation” option, since it isn’t for stabilizing out sharp motions (which still leave motion blur) but for maintaining the viewer’s perspective even if the camera rotates in unexpected ways. Processing was about 6x runtime on my Lenovo ThinkPad P71 laptop, so a 10-minute clip took about an hour to stitch to 360.

The footage itself looks good, higher quality than my Gear 360’s, and the 60p material is much smoother, which is to be expected. While good VR experiences require 90fps to be rendered to the display to avoid motion sickness, that does not necessarily mean that 30fps content is a problem. When rendering the viewer’s perspective, the same frame can be sampled three times, shifting the image as they move their head, even from a single source frame. That said, 60p source content does give smoother results than the 30p footage I am used to watching in VR, but 60p did give me more issues during editorial: I had to disable CUDA acceleration in Adobe Premiere Pro to get Transmit to work with the WMR headset.

Once you have your footage processed in Fusion Studio, it can be edited in Premiere Pro — like any other 360 footage — but the audio can be handled a bit differently. Exporting as stereo will follow the usual workflow, but selecting ambisonic will give you a special spatially aware audio file. Premiere can use this in a 4-track multi-channel sequence to line up the spatial audio with the direction you are looking in VR, and if exported correctly, YouTube can do the same thing for your viewers.

In the Trees
Most GoPro products are intended for use capturing action moments and unusual situations in extreme environments (which is why they are waterproof and fairly resilient), so I wanted to study the camera in its “native habitat.” The most extreme thing I do these days is work on ropes courses, high up in trees or telephone poles. So I took the camera out to a ropes course that I help out with, curious to see how the recording at height would translate into the 360 video experience.

Ropes courses are usually challenging to photograph because of the scale involved. When you are zoomed out far enough to see an entire element, you can’t see any detail, and when you are zoomed in close enough to see faces, you get no sense of how high up the climbers are. 360 photography helps here because it is designed to be panned through when viewed flat. This allows you to give the viewer a better sense of the scale, while they can still see the details of the individual elements or people climbing. And in VR, you should get a better feel for the height involved.

I had the Fusion camera and Fusion Grip extendable tripod handle, as well as my Hero6 kit, which included an adhesive helmet mount. Since I was going to be working at heights and didn’t want to drop the camera, the first thing I did was rig up a tether system. A short piece of 2mm cord fit through a slot in the bottom of the center post and a triple fisherman knot made a secure loop. The cord fit out the bottom of the tripod when it was closed, allowing me to connect it to a shock-absorbing lanyard, which was clipped to my harness. This also allowed me to dangle the camera from a cord for a free-floating perspective. I also stuck the quick release base to my climbing helmet, and was ready to go.

I shot segments in both 30p and 60p, depending on how I had the camera mounted, using higher frame rates for the more dynamic shots. I was worried that the helmet mount would be too close, since GoPro recommends keeping the Fusion at least 20cm away from what it is filming, but the helmet wasn’t too bad. Another inch or two would shrink it significantly from the camera’s perspective, similar to my tripod issue with the Gear 360.

I always climbed up with the camera mounted on my helmet and then switched it to the Fusion Grip to record the guy climbing up behind me and my rappel. Hanging the camera from a cord, even 30 feet below me, worked much better than I expected. It put GoPro’s stabilization feature to the test, but it worked fantastically. With the camera rotating freely, the perspective is static, although you can see the seam lines constantly rotating around you. When I am holding the Fusion Grip, the extended pole is completely invisible to the camera, giving you what GoPro has dubbed “Angel View.” It is as if the viewer is floating freely next to the subject, especially when viewed in VR.

Because I have ways to view 360 video in VR, and because I don’t mind panning around on a flat screen, I am personally less excited about GoPro’s OverCapture functionality, but I recognize it is a useful feature that will greatly extend the use cases for this 360 camera. It is designed for people using the Fusion as a more flexible camera to produce flat content, rather than to produce VR content. I edited together a couple of OverCapture shots intercut with footage from my regular Hero6 to demonstrate how that would work.

Ambisonic Audio
The other new option that Fusion brings to the table is ambisonic audio. Editing ambisonics works in Premiere Pro using a four-track multi-channel sequence. The main workflow kink is that every time you import a new clip with ambisonic audio, you have to manually override the audio settings to set the channels to Adaptive with a single timeline clip. Turn on Monitor Ambisonics by right-clicking in the monitor panel, and match the Pan, Tilt and Roll values in the Panner-Ambisonics effect to the values in your VR Rotate Sphere effect (note that they are listed in a different order); your audio should then match the video perspective.

When exporting an MP4 in the audio panel, set Channels to 4.0 and check the Audio is Ambisonics box. From what I can see, the Fusion Studio conversion process compensates for changes in perspective, including “stabilization” when processing the raw recorded audio for Ambisonic exports, so you only have to match changes you make in your Premiere sequence.
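Matching the pan value amounts to rotating the recorded sound field, and for first-order B-format that rotation is a simple linear transform. Here is a minimal sketch of a yaw-only rotation; it assumes the traditional W/X/Y/Z channel ordering (real tools may use ACN/SN3D instead, and sign conventions vary).

```python
import math

# Rotate one first-order B-format sample (W, X, Y, Z) about the vertical
# axis by `yaw` radians. W (omni) and Z (vertical dipole) are unchanged
# by a yaw rotation; the horizontal dipoles X and Y rotate as a 2D vector.
def rotate_yaw(w, x, y, z, yaw):
    c, s = math.cos(yaw), math.sin(yaw)
    return w, c * x - s * y, s * x + c * y, z

# A source encoded straight ahead (all energy on X) rotated 90 degrees
# ends up entirely on the Y (left/right) dipole.
w, x, y, z = rotate_yaw(1.0, 1.0, 0.0, 0.0, math.pi / 2)
```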

While I could have intercut the footage at both settings together into a 5Kp60 timeline, I ended up creating two separate 360 videos. This also makes it clear to the viewer which shots were 5Kp30 and which were recorded at 3Kp60. Both are available on YouTube, and I recommend watching them in VR for the full effect. But be warned that they were recorded at heights of up to 80 feet, so they may be uncomfortable for some people to watch.

Summing Up
GoPro’s Fusion camera is not the first 360 camera on the market, but it brings more pixels and higher frame rates than most of its direct competitors and, more importantly, it has the software package to assist users in the transition to processing 360 video footage. It also supports ambisonic audio and offers the OverCapture functionality for generating more traditional flat GoPro content.

I found it to be easier to mount and shoot with than my earlier 360 camera experiences, and it is far easier to get the footage ready to edit and view using GoPro’s Fusion Studio program. The Stabilize feature totally changes how I shoot 360 videos, giving me much more flexibility in rotating the camera during movements. And most importantly, I am much happier with the resulting footage that I get when shooting with it.


Mike McCarthy is an online editor/workflow consultant with 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.


The Other Art Fair: Brands as benefactors for the arts

By Tom Westerlin

Last weekend, courtesy of Dell, I had the opportunity to attend The Other Art Fair, presented by Saatchi Art here in New York City. My role at Nice Shoes is creative director for VR/AR/360, and I was interested to see how the worlds of interactive and traditional art would intersect. I was also curious to see what role brands like Dell would play, as I feel that as we’ve transitioned from traditional advertising to branded content, brands have emerged as benefactors for the arts.

It was great to have so many artists represented that had created such high-quality work, and unlike other art shows I’ve attended, everything felt affordable and accessible. Art is often priced out for the average person and here was an opportunity to get to know artists, learn about their process and possibly walk away with a beautiful piece to bring into the home.

The curators and sponsors created a very welcoming, jovial atmosphere. Kids had an area where they could draw on the walls, and adults had access to a bar area and lounge where they could converse (I suppose adults could have drawn there as well, but some needed a drink or two to loosen up). The human body was also a canvas as there was an artist offering tattoos. Overall, the organizers created an infectious, creative vibe.

A variety of artists were represented. Traditional paintings, photography, collage, sculpture, neon and VR were all on display in the same space. Seeing VR and digital art amongst traditional art was very encouraging. I’ve encountered bits of this at other shows, but in those instances everything felt cordoned off. At The Other Art Fair, every medium felt as if it were being displayed on equal ground, and, in some cases, the lines between physical and digital art were blurred.

Samsung had framed displays that looked like physical paintings. Their high-quality monitors sat flat on the wall, framed and indistinguishable from physical art.

Dell’s 8K monitor looked amazing. It was such a high resolution and the pixel density was very tight. It looked perfect for displaying a high-resolution photo at 100%. I’d be curious to see how galleries take advantage of monitors like these. Traditionally, prints of photographs would be shown, but monitors like these offer up new potential for showcasing vivid texture, detail and composition.

Although I didn’t walk out with a painting that night, I did come away with the desire to keep my eye on a number of artists — in particular, Glen Gauthier, Paul Richard, Laura Noel and Beth Radford. They all stood out to me.

As the lines between art and advertising blur, there are always new opportunities for brands and artists to come together to create stunning content, and I expect many brands, agencies, and creative studios to engage these artists in the near future.


Behind the Title: Start VR Producer Ela Topcuoglu

NAME: Ela Topcuoglu

COMPANY: Start VR (@Start_VR)

CAN YOU DESCRIBE YOUR COMPANY?
Start VR is a full-service production studio (with offices in Sydney, Australia and Marina Del Rey, California) specializing in immersive and interactive cinematic entertainment. The studio brings together expertise in entertainment and technology, fusing feature film quality visuals with interactive content to create original and branded narrative experiences in VR.

WHAT’S YOUR JOB TITLE?
Development Executive and Producer

WHAT DOES THAT ENTAIL?
I am in charge of expanding Start VR’s business in North America. That entails developing strategic partnerships and increasing business development in the entertainment, film and technology sectors.

I am also responsible for finding partners for our original content slate as well as seeking existing IP that would fit perfectly in VR. I also develop relationships with brands and advertising agencies to create branded content. Beyond business development, I also help produce the projects that we move forward with.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
The title comes with the responsibility of convincing people to invest in something that is constantly evolving, which is the biggest challenge. My job also requires me to be very creative in developing a language native to this new medium. I have to wear many hats to ensure that we create the best experiences out there.

WHAT’S YOUR FAVORITE PART OF THE JOB?
My favorite part of the job is that I get to wear lots of different hats. Being in the emerging field of VR, every day is different. I don't have a traditional 9-to-5 office job and I am constantly moving and hustling to set up business meetings and stay updated on the latest industry trends.

Also, being in the ever-evolving technology field, I learn something new almost every day, which is essential to my professional growth.

WHAT’S YOUR LEAST FAVORITE?
Convincing people to invest in virtual reality and to see its incredible potential. That usually changes once they experience truly immersive VR, but regardless, selling the future is difficult.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
My favorite part of the day is the morning. I start my day with a much-needed shot of Nespresso, get caught up on emails, take a look at my schedule and take a quick breather before I jump right into the madness.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
If I wasn't working in VR, I would be investing my time in learning more about artificial intelligence (AI) and using that to advance medicine/health and education.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I loved entertaining people from a very young age, and I was always looking for an outlet to do that, so the entertainment business was the perfect fit. There is nothing like watching someone’s reaction to a great piece of content. Virtual reality is the ultimate entertainment outlet and I knew that I wanted to create experiences that left people with the same awe reaction that I had the moment I experienced it.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
I worked in the business and legal affairs department at Media Rights Capital and had the opportunity to work on amazing projects, including House of Cards, Baby Driver and Ozark.

Awake: First Contact

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
The project that I am most proud of to date is the project that I am currently producing at Start VR. It’s called Awake: First Contact. It was a project I read about and said, “I want to work on that.”

I am incredibly proud that I get to work on a virtual reality project that is pushing the boundaries of the medium both technically and creatively.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My phone, laptop and speakers.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Twitter, Facebook and LinkedIn

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
Yes, especially if I'm working on a pitch deck. It really keeps me in the moment. I usually listen to my favorite DJ mixes on SoundCloud. It depends on my vibe that day.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I have recently started surfing, so that is my outlet at the moment. I also meditate regularly. It’s also important for me to make sure that I am always learning something new and unrelated to my industry.


A Closer Look: VR solutions for production and post

By Alexandre Regeffe

Back in September, I traveled to Amsterdam to check out new tools relating to VR and 360 production and post. As a producer based in Paris, France, I have been working in the virtual reality part of the business for over two years. While IBC took place in September, the information I have to share is still quite relevant.

KanDao

I saw some very cool technology at the show regarding VR and 360 video, especially within the cinematic VR niche. And niche is the perfect word — I see the market slightly narrowing after the wave of hype that happened a couple of years ago. Personally, I don’t think the public has been reached yet, but pardon my French pessimism. Let’s take a look…

Cameras
One new range of products I found amazing was the Obsidian line of cameras from manufacturer KanDao. This Chinese brand has a smart product line of 3D/360 cameras. Starting with the Obsidian Go, they reach pro cinematic levels with the Obsidian R (for resolution, which is 8K per eye) and the Obsidian S (for speed, capturing at up to 120fps). They offer a small radial form factor with only six lenses to produce very smooth stereoscopy, with a very high resolution per eye, which is one of the keys to achieving a good feeling of immersion in an HMD.

KanDao's features are promising, including handling 6DoF with depth map generation. To me, this is the future of cinematic VR production — you will be able to have more freedom as a viewer, slightly translating your point of view to see behind objects with natural parallax distortion in realtime! Let me call it "extended" stereoscopic 360.

I can't speak about professional 360 cameras without also mentioning the Ozo from Nokia. Considered by users to be the first pro VR camera, the Ozo+ version launched this year with a new ISP and offers astonishing new features, especially when you transfer your shots into the Ozo Creator tool, now in version 2.1.

Nokia Ozo+

Powerful tools, like highlight and shadow recovery, haze removal, auto stabilization and better denoising, are there to improve the overall image quality. Another big thing on the Nokia booth was version 2.0 of the Ozo Live system. Yes, you can now webcast your live event in stereoscopic 360 at a 4K-per-eye resolution! And you can simply use a (boosted) laptop to do it! All the VR tools from Nokia are part of what they call Ozo Reality, an integrated ecosystem where you can create, deliver and experience cinematic VR.

VR Post
When you talk about VR post you have to talk about stitching — assembling all sources to obtain a 360 image. As a French-educated man, you know I have to complain somehow: I hate stitching. And I often yell at these guys who shoot at wrong camera positions. Spending hours (and money) dealing with seam lines is not my tasse de thé.

A few months before IBC, I found my saving grace: Mistika VR from SGO. Well known for their color grading tool Mistika Ultima (one of the finest in stereoscopic), SGO launched a stitching tool for 360 video. Fantastic results. Fantastic development team.

In this very intuitive tool, you can stitch sources from almost all existing cameras and rigs available on the market now, from the Samsung Gear 360 to Jaunt. With amazing optical flow algorithms, seam line fine adjustments, color matching and many other features, it is to me by far the best tool for outputting a clean, seamless equirectangular image. And the upcoming Mistika VR 3D for stitching stereoscopic sources is very promising. You know what? Thanks to Mistika VR, the stitching process could be fun. Even for me.
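Whatever the stitcher, the target canvas is the same: an equirectangular image, where longitude spans the width and latitude spans the height of a 2:1 frame. As a hedged illustration of that mapping (not SGO's actual algorithm, and with an illustrative axis convention), a minimal Python sketch converting a 3D view direction into equirectangular pixel coordinates:

```python
import math

def direction_to_equirect(dx, dy, dz, width, height):
    """Map a 3D view direction to equirectangular pixel coordinates.

    Longitude (yaw) spans the full image width and latitude (pitch)
    spans the height -- the 2:1 layout used for 360 video.
    Axis convention here (x forward, y left, z up) is illustrative.
    """
    lon = math.atan2(dy, dx)  # -pi .. pi
    lat = math.asin(dz / math.sqrt(dx * dx + dy * dy + dz * dz))  # -pi/2 .. pi/2
    u = (lon / math.pi + 1.0) * 0.5 * (width - 1)
    v = (0.5 - lat / math.pi) * (height - 1)
    return u, v

# A ray pointing straight ahead lands at the center of a 4096x2048 frame.
u, v = direction_to_equirect(1.0, 0.0, 0.0, 4096, 2048)
```

Stitching is essentially running this mapping in reverse for every output pixel, deciding which camera's lens sees that direction, and blending where lenses overlap, which is exactly where seam lines and optical flow come in.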

In general, optical flow is a huge improvement for stitching, and we can find this parameter in the Kandao Studio stitching tool (designed only for Obsidian cameras), for instance. When you’re happy with your stitch, you can then edit, color grade and maybe add VFX and interactivity in order to bring a really good experience to viewers.

Immersive video within Adobe Premiere.

Today, Adobe CC leads the editing scene with specific 360 tools, such as its contextual viewer. But the big hit was the acquisition of the Skybox plugin suite from Mettle, which will be integrated natively into the next Adobe CC version (for Premiere and After Effects).

With this set of tools you can easily manipulate your equirectangular sources, do tripod removal, sky replacements and all the invisible effects that were tricky to do without Skybox. You can then add contextual 360 effects like text, blur, transitions and greenscreen, in monoscopic and even stereoscopic mode, all while viewing your timeline directly in your Oculus Rift in realtime! And, incredibly, it works — I use these tools all day long.

So let’s talk about the Mettle team. Created by two artists back in 1992, they joined the VR movement three years ago with the Skybox suite. They understood they had to bring tech to creative people. As a result they made smart tools with very well-designed GUI. For instance, look at Mettle’s new Mantra creative toolset for After Effects and Premiere. It is incredible to work with because you get the power to create very artistic designs in 360 in Adobe CC. And if you’re a solid VFX tech, wait for their Volumatrix depth-related VR FX software tools. Working in collaboration with Facebook, Mettle will launch the next big tool to do VFX in 3D/360 environments using camera-generated depth maps. It will open new awesome possibilities for content creators.

You know, the current main issue in cinematic 360 is image quality. Of course, we could talk about resolution or pixels per eye, but I think we should focus on color grading. This task is very creative — bringing emotions to the viewers. For me, the best 360 color grading tool to achieve these goals with uncompromised quality is Scratch VR from Assimilate. Beautiful. Formidable. Scratch is a very powerful color grading system, always on top in terms of technology. Now that they've added VR capabilities, you can color grade your stereoscopic equirectangular sources as easily as normal sources. My favorite is the mask repeater function, which lets you naturally handle masks even across the back seam, which is almost impossible in other color grading tools. And you can also view your results directly in your HMD.

Scratch VR and ZCam collaboration.

At NAB 2017, they provided Scratch VR Z, an integrated workflow in collaboration with ZCam, the manufacturer of the S1 and S1 Pro. In this workflow you can, for instance, stitch sources directly into Scratch and do super high-quality color grading with realtime live streaming, along with logo insertion, greenscreen capabilities, layouts, etc. Crazy. For finishing, the Scratch VR output module is also very useful, enabling you to render your result in ProRes even on Windows, or in 10-bit H264, and many other formats.

Finishing and Distribution
So your cinematic VR experience is finished (you’ll notice I’ve skipped the sound part of the process, but since it’s not the part I work on I will not speak about this essential stage). But maybe you want to add some interactivity for a better user experience?

I visited IBC's Future Zone to talk with the Liquid Cinema team. What is it? Simply, it's a set of tools enabling you to enhance your cinematic VR experience. One important word is storytelling — with Liquid Cinema you can add an interactive layer to your story. The first tool needed is the authoring application, where you drop in your sources, which can be movies, stills, 360 and 2D material. Then create and enjoy.

For example, you can add graphic layers and enable the viewer's gaze function, create multibranching scenarios based on intelligent timelines, and play with forced perspective features so your viewer never misses an important thing… you must try it.

The second part of the suite is about VR distribution. As a content creator you want your experience to be on all existing platforms, HMDs, channels … not an easy feat, but with Liquid Cinema it’s possible. Their player is compatible with Samsung Gear VR, Oculus Rift, HTC Vive, iOS, Android, Daydream and more. It’s coming to Apple TV soon.

IglooVision

The third part of the suite is the management of your content. Liquid Cinema has a CMS tool, which is very simple, allows changes like geoblocking to be made easily, and provides useful analytics tools like heat maps. And you can use your Vimeo Pro account as a CDN if needed. Perfect.

Also in the Future Zone was the igloo from IglooVision. This is one of the best "social" ways to experience cinematic VR that I have ever seen. Enter this room with your friends and you can watch 360 all around while finishing your drink (try that with an HMD). Comfortable, isn't it? You can also use it as a "shared VR production suite" by connecting Adobe Premiere or your favorite tool directly to the system. Boom. You now have an immersive 360-degree monitor around you and your post production team.

So that was my journey into the VR stuff of IBC 2017. Of course, this is a non-exhaustive list of tools, with nothing about sound (which is very important in VR), but it’s my personal choice. Period.

One last thing: VR people. I have met a lot of enthusiastic, smart, interesting and happy women and men, helping content producers like me to push their creative limits. So thanks to all of them and see ya.


Paris-based Alexandre Regeffe is a 25-year veteran of TV and film. He is currently VR post production manager at Neotopy, a VR studio, as well as a VR effects specialist working on After Effects and the entire Adobe suite. His specialty is cinematic VR post workflows.

Tackling VR storytelling challenges with spatial audio

By Matthew Bobb

From virtual reality experiences for brands to top film franchises, VR is making a big splash in entertainment and evolving the way creators tell stories. But, as with any medium and its production, bringing a narrative to life is no easy feat, especially when it’s immersive. VR comes with its own set of challenges unique to the platform’s capacity to completely transport viewers into another world and replicate reality.

Making high-quality immersive experiences, especially for a film franchise, is extremely challenging. Creators must place the viewer into a storyline crafted by the studios and properly guide them through the experience in a way that allows them to fully grasp the narrative. One emerging strategy is to emphasize audio — specifically, 360 spatial audio. VR offers a sense of presence no other medium today can offer. Spatial audio offers an auditory presence that augments a VR experience, amplifying its emotional effects.

My background as audio director for VR experiences includes top film franchises such as Warner Bros. and New Line Cinema’s IT: Float — A Cinematic VR Experience, The Conjuring 2 — Experience Enfield VR 360, Annabelle: Creation VR — Bee’s Room, and the upcoming Greatest Showman VR experience for 20th Century Fox. In the emerging world of VR, I have seen production teams encounter numerous challenges that call for creative solutions. For some of the most critical storytelling moments, it’s crucial for creators to understand the power of spatial audio and its potential to solve some of the most prevalent challenges that arise in VR production.

Most content creators — even some of those involved in VR filmmaking — don’t fully know what 360 spatial audio is or how its implementation within VR can elevate an experience. With any new medium, there are early adopters who are passionate about the process. As the next wave of VR filmmakers emerge, they will need to be informed about the benefits of spatial audio.

Guiding Viewers
Spatial audio is an incredible tool that helps make a VR experience feel believable. It can present sound from several locations, which allows viewers to identify their position within a virtual space in relation to the surrounding environment. With the ability to provide location-based sound from any direction and distance, spatial audio can then be used to produce directional auditory cues that grasp the viewer’s attention and coerce them to look in a certain direction.
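That "sound from any direction and distance" has a concrete representation in the first-order ambisonics commonly used for 360 video. As an illustrative sketch only (real pipelines differ in channel ordering and normalization, e.g. AmbiX/SN3D versus FuMa), here is how a mono cue can be panned to a direction in B-format:

```python
import math

def encode_first_order(sample, azimuth_deg, elevation_deg):
    """Pan a mono sample to a direction as first-order ambisonics.

    Uses the classic B-format encoding equations (FuMa-style 1/sqrt(2)
    weighting on W). Real tools vary in ordering and normalization,
    so treat this as an illustrative sketch, not a production encoder.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    w = sample / math.sqrt(2.0)               # omnidirectional
    x = sample * math.cos(az) * math.cos(el)  # front-back
    y = sample * math.sin(az) * math.cos(el)  # left-right
    z = sample * math.sin(el)                 # up-down
    return w, x, y, z

# A footstep cue placed 90 degrees to the listener's left shows up
# almost entirely in the Y (left-right) channel.
w, x, y, z = encode_first_order(1.0, 90.0, 0.0)
```

Because the direction is baked into the channel ratios rather than tied to fixed speakers, the playback device can re-rotate the field as the viewer turns their head, which is what makes directional cues like these reliable attention-steering tools.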

VR is still unfamiliar territory for a lot of people, and the viewing process isn’t as straightforward as a 2D film or game, so dropping viewers into an experience can leave them feeling lost and overwhelmed. Inexperienced viewers are also more apprehensive and rarely move around or turn their heads while in a headset. Spatial audio cues prompting them to move or look in a specific direction are critical, steering them to instinctively react and move naturally. On Annabelle: Creation VR — Bee’s Room, viewers go into the experience knowing it’s from the horror genre and may be hesitant to look around. We strategically used audio cues, such as footsteps, slamming doors and a record player that mysteriously turns on and off, to encourage viewers to turn their head toward the sound and the chilling visuals that await.

Lacking Footage
Spatial audio can also be a solution for challenging scene transitions, or when there is a dearth of visuals to work with in a sequence. Well-crafted aural cues can paint a picture in a viewer’s mind without bombarding the experience with visuals that are often unnecessary.

A big challenge when creating VR experiences for beloved film franchises is the need for the VR production team to work in tandem with the film’s production team, making recording time extremely limited. When working on IT: Float, we were faced with the challenge of having a time constraint for shooting Pennywise the Clown. Consequently, there was not an abundance of footage of him to place in the promotional VR experience. Beyond a lack of footage, they also didn’t want to give away the notorious clown’s much-anticipated appearance before the film’s theatrical release. The solution to that production challenge was spatial audio. Pennywise’s voice was strategically used to lead the experience and guide viewers throughout the sewer tunnels, heightening the suspense while also providing the illusion that he was surrounding the viewer.

Avoiding Visual Overkill
Similar to film and video games, sound is half of the experience in VR. With the unique perspective the medium offers, creators no longer have to rely fully on a visually heavy narrative, which can overwhelm the viewer. Instead, audio can take on a bigger role in the production process and make the project a well-rounded sensory experience. In VR, it's important for creators to leverage sensory stimulation beyond visuals to guide viewers through a story and authentically replicate reality.

As VR storytellers, we are reimagining ways to immerse viewers in new worlds. It is crucial for us to leverage the power of audio to smooth out bumps in the road and deliver a vivid sense of physical presence unique to this medium.


Matthew Bobb is the CEO of the full-service audio company Spacewalk Sound. He is a spatial audio expert whose work can be seen in top VR experiences for major film franchises.