
Behind the Title: Light Sail VR’s Matthew Celia

NAME: Matthew Celia

COMPANY: LA’s Light Sail VR (@lightsailvr)

CAN YOU DESCRIBE YOUR COMPANY?
Light Sail VR is a virtual reality production company specializing in telling immersive narrative stories. We’ve built a strong branded content business over the last two years working with clients such as Google and GoPro, and studios like Paramount and ABC.

Whether it’s 360 video, cinematic VR or interactive media, we’ve built an end-to-end pipeline to go from script to final delivery. We’re now excited to be moving into creating original IP and more interactive content that fuses cinematic live-action film footage with game engine mechanics.

WHAT’S YOUR JOB TITLE?
Creative Director and Managing Partner

WHAT DOES THAT ENTAIL?
A lot! We’re a small boutique shop so we all wear many hats. First and foremost, I am a director and work hard to deliver a compelling story and emotional connection to the audience for each one of our pieces. Story first is our motto, and I try and approach every technical problem with a creative solution. Figuring out execution is a large part of that.

In addition to the production side, I also carry a lot of the technical responsibilities in post production, such as keeping our post pipeline humming and inventing new workflows. Most recently, I have been dabbling in programming interactive cinema using the Unity game engine.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I am in charge of washing the lettuce when we do our famous “Light Sail VR Sandwich Club” during lunch. Yes, you get fed for free if you work with us, and I make an amazing Italian sandwich.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Hard to say. I really like what I do. I like being on set and working with actors because VR is such a great medium for them to play in, and it’s exciting to collaborate with such creative and talented people.

National Park Service

WHAT’S YOUR LEAST FAVORITE?
Render times and computer crashes. My tech life is in constant beta. Price we pay for being on the bleeding edge, I guess!

WHAT IS YOUR FAVORITE TIME OF THE DAY?
I like the early morning because it is quiet, my brain is fresh, and I haven’t yet had 20 people asking something of me.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Probably the same, but at a large company. If I left the film business I’d probably teach. I love working with kids.

WHY DID YOU CHOOSE THIS PROFESSION?
I feel like I’ve wanted to be a filmmaker since I could walk. My parents like to drag out the home movies of me asking to look in my dad’s VHS video camera when I was 4. I spent most of high school in the theater and most people assumed I would be an actor. But senior year I fell in love with film when I shot and cut my first 16mm reversal stock on an old reel-to-reel editing machine. The process was incredibly fun and rewarding and I was hooked. I only recently discovered VR, but in many ways it feels like the right path for me because I think cinematic VR is the perfect intersection of filmmaking and theater.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
On the branded side, we just finished up two tourism videos. One, for the National Park Service, was a 360 tour of the Channel Islands with Jordan Fisher; the other was a 360 piece for Princess Cruises. VR is really great for showing people the world. The last few months of my life have been consumed by Light Sail VR’s first original project, Speak of the Devil.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
Speak of the Devil is at the top of that list. It’s the first live-action interactive project I’ve worked on and it’s massive. Crafted using the GoPro Odyssey camera in partnership with Google Jump, it features over 50 unique locations and 13 different endings, and is currently taking up about 80TB of storage (and counting). It is the largest project I’ve worked on to date, and we’ve done it all on a shoestring budget thanks to the gracious contributions of talented creative folks who believed in our vision.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My instant-read grill meat thermometer, my iPhone and my Philips Hue bulbs. Seriously, if you have a baby, it’s a lifesaver being able to whisper, “Hey Siri, turn off the lights.”

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
I’m really active on several Facebook groups related to 360 video production. You can get a lot of advice and connect directly with vendors and software engineers. It’s a great community.

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
I tend to pop on some music when I’m doing repetitive mindless tasks, but when I have to be creative or solve a tough tech problem, the music is off so that I can focus. My favorite music to work to tends to be Dave Matthews Band live albums. They get into 20-minute long jams and it’s great.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
De-stressing is really hard when you own your own company. I like to go walking, but if that doesn’t work, I’ll try diving into some cooking for my family, which forces me to focus on something not work related. I tend to feel better after eating a really good meal.

Editing 360 video with Lenovo’s Explorer WMR headset

By Mike McCarthy

Microsoft has released its Windows Mixed Reality (WMR) platform as part of the Fall Creators Update to Windows 10. The platform supports a variety of immersive experiences, and thankfully there are now WMR headsets available from a number of familiar names in the hardware business. One of those is Lenovo, which kindly sent me its Explorer WMR headset to test on my ThinkPad P71. This provided me with a complete VR experience on Lenovo’s hardware.

On November 15, Microsoft released beta support for SteamVR on WMR devices, which allows WMR headsets to be used in applications that are compatible with SteamVR. For example, the newest release of Adobe Premiere Pro (CC 2018, or v12.0) uses SteamVR for 360 video preview.

My goal for this article was to see if I could preview my 360 videos in a Lenovo headset while editing in Premiere, especially now that I had new 360 footage from my GoPro Fusion camera. I also provide some comparisons to the Oculus Rift, which I reviewed for postPerspective in October.

There are a number of advantages to the WMR options, including lower prices and hardware requirements, higher image resolution and simpler setup. Oculus and HTC’s VR-Ready requirements have always been a bit excessive for 360 video, because unlike true 3D VR there is no 3D rendering involved when playing back footage from a fixed perspective. But would it work? No one seemed to know if it would, but Lenovo was willing to let me try.

The first step is to get your installation of Windows 10 upgraded with the Fall Creators Update, which includes integrated support for Windows Mixed Reality headsets. Once that’s done, plug the headset’s single cable into a USB 3.0 port and an HDMI port, and Windows will automatically configure the device and its drivers for you. You will also need to install Valve’s Steam application and SteamVR, which adds support for VR content. The next step is to find Microsoft’s Windows Mixed Reality for SteamVR in the Steam store, which is a free installation. Once you confirm that the headset is functioning in WMR and then in SteamVR, open up Premiere Pro and test it out.

Working in Premiere Pro
In Premiere Pro, preview and playback worked immediately within my existing immersive project. I watched footage captured with my Samsung Gear 360 and GoPro Fusion cameras. The files played, and the increased performance in the new version of the software is noticeable. My 4K and 5K 30fps content worked great, but my new 3Kp60 content only played when Mercury Playback was set to software-only, which disabled most of the new Immersive Video effects. In CUDA mode, I could hold down the right arrow and watch it progress in slow motion, but pressing the spacebar caused the VR preview to freeze even though it played fine on the laptop monitor. The 60p content played fine in the Rift, so this appears to be an issue specific to WMR. Hopefully, that will be addressed in a software update in the near future.

The motion controllers were visible in the interface and allowed me to scrub the timeline, but I still had to use the spacebar to start and stop playback. (Update: The 12.1 release of Premiere Pro supports WMR headsets, and testing confirms that 60p now works and that the motion controllers are fully functional and can control playback.) One other issue that arose was that the mouse cursor is hidden when the display is snapped down into place over my eyes, which is an intrinsic feature of WMR. I had to tip it up out of the way every time I wanted to make a change, instead of just peeking under it, which is a lot of snapping up and down for the headset.

I found the WMR experience to be slightly less solid than the Oculus system. It would occasionally lag on the tracking for a couple of frames, causing the image to visibly jump. This may be due to the integrated tracking instead of dedicated external cameras. The boundary system is a visual distraction, so I would recommend disabling it if you are primarily using the headset for 360 video, since that doesn’t require moving much within your space. Setup on the WMR is better: it is much easier, with lower requirements and fewer ports needed. The resolution is higher than that of the Oculus Rift I had tested (1440×1440 per eye instead of 1080×1200), so I wanted to see how much of a difference that would make. The Explorer also has a narrower field of view (105 degrees instead of 110), which I wouldn’t expect to make a difference, but I think it did.

By my calculations, the increased resolution should allow you to resolve roughly a 5K sphere, compared to about 3.5K from the Rift (1,440 pixels / 105 degrees × 360 versus 1,080 pixels / 110 degrees × 360). You will also want a pair of headphones or earbuds to plug into the headset so the audio tracks with your head (compared to your computer speakers, which are fixed).
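If you want to run that back-of-the-envelope math yourself, here is a minimal sketch in Python. It assumes pixels are spread evenly across the stated horizontal field of view, which real lens distortion doesn’t quite honor, so treat the results as rough estimates rather than a formal spec.

```python
def equivalent_sphere_width(per_eye_h_pixels, horizontal_fov_degrees):
    """Rough horizontal resolution of a full 360-degree sphere a headset can resolve,
    assuming pixels are spread evenly across the horizontal field of view."""
    return per_eye_h_pixels / horizontal_fov_degrees * 360

print(equivalent_sphere_width(1440, 105))  # Lenovo Explorer: ~4937 px, roughly a 5K sphere
print(equivalent_sphere_width(1080, 110))  # Oculus Rift: ~3535 px, roughly 3.5K
```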

The Feel of the Headset
The headset is designed very differently from the Rift, and the display can be tipped up out of the way while the headband is still on. It is also way easier to put on and remove, but a bit less comfortable to keep on for longer periods of time. The headband has to be tight enough to hold the display in front of your eyes, since it doesn’t rest on your face, and the cabling has to slide through a clip on the headband when you fold the display upward. And since you have to fold the display upward to use the mouse, that becomes a frequent annoyance. But between the motion controllers and the keyboard, you can navigate and play back while the headset is on.

Using the Microsoft WMR lobby interface was an interesting experience, but I’m not sure it’s going to catch on. SteamVR’s lobby experience isn’t much better, but Steam does offer a lot more content for its users. I anticipate Steam will be the dominant software platform, given that most hardware vendors (HTC, Oculus, WMR) support it. The fact that Adobe chose SteamVR for its immersive preview experience is why these new WMR headsets work in Premiere Pro without any further software updates needed on Adobe’s part. (Adobe doesn’t officially support this configuration yet, hence the “beta” designation in SteamVR, but aside from 60p playback, I was very happy.) Hopefully we will see further support and integration between the various hardware and software options in the future.

Summing Up
Currently, the Lenovo Explorer and the Oculus Rift are priced the same at $399. I say currently because prices have been fluctuating, so investigate thoroughly. So which one is better? Well, neither is a clear winner; each has its own strengths. The Rift has more specific hardware requirements and lower total resolution. The Explorer requires Windows 10, but will work on a wider array of systems. The Rift is probably better for periods of extended use, while I would recommend the Explorer if you are going to be doing something that involves taking it on and off all the time (like tweaking effects settings in Adobe apps). Large fixed installations may offer a better user experience with the Rift or Vive on a powerful GPU, but most laptop users will probably have an easier time with the Explorer (no external camera to calibrate and fewer ports needed).


Mike McCarthy is an online editor/workflow consultant with 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

Technicolor Experience Center launches with HP Mars Home Planet

By Dayna McCallum

Technicolor’s Tim Sarnoff and Marcie Jastrow oversaw the official opening of the Technicolor Experience Center (TEC), with the help of HP’s Sean Young and Rick Champagne, on June 15. The kickoff event also featured the announcement that TEC is teaming up with HP to develop HP Mars Home Planet, an experimental VR experience to reinvent life on Mars for one million humans.

The purpose-built TEC space is located in Blackwelder creative park, a business district in Culver City designed specifically for the needs of creative and media companies. The center, dedicated to bringing artists and scientists together to explore immersive media, covers almost 27,000 square feet, with 3,000 square feet dedicated to motion capture. The TEC serves as a hub connecting Technicolor’s creative houses and research labs across the globe (including an R&D team from France that appeared during the event via a remote demo) with technology partners such as HP.

Sarnoff, Technicolor deputy CEO and president of production services, said, “The TEC is about realizing the aspirations of all the players who are part of the nascent immersive ecosystem we work in, from content creation, to content distribution and content consumption. Designing and delivering immersive experiences will require a massive convergence of artistic, technological and economic talent. They will have to come together productively. That is why the TEC has been formed. It is designed to be a practical place where we take theoretical constructs and move systematically to tactical implementation through a creative and dynamic process of experimentation.”

The HP Mars Home Planet project is a global, immersive media collaboration uniting engineers, architects, designers, artists and students to design an urban area on Mars in a VR environment. The project will be built on terrain from Fusion’s “Mars 2030” game, which is grounded in NASA research, imagery and expertise. In addition to HP, Fusion and TEC, partners include Nvidia, Unreal Engine, Autodesk and HTC Vive. Additional details will be released at SIGGRAPH 2017.

Young, worldwide segment manager for product development and AEC for HP Inc., said of the Mars project, “To ensure fidelity and professional-grade quality and a fantastic end-user experience, the TEC is going to oversee the virtual reality development process of the work that is going to be done by collaborators from all over the world. It is an incredible opportunity for anybody from anywhere in the world that is interested in VR to work with Technicolor.”

Sound editor/mixer Korey Pereira on 3D audio workflows for VR

By Andrew Emge

As the technologies for VR and 360 video rapidly advance and become more accessible, media creators are realizing the crucial role that sound plays in achieving realism. Sound designers are exploring this new frontier of 3D audio at the same time that tools for the medium are being developed and introduced. When everything is so new and constantly evolving, how does one learn where to start or decide where to invest time and experimentation?

To better understand this process, I spoke with Korey Pereira, a sound editor and mixer based in Austin, Texas. He recently entered the VR/360 audio world and has started developing a workflow.

Can you provide some background about who you are, the work you’ve done, and what you’ve been up to lately?
I’m the owner/creative director at Soularity Sound, an Austin-based post company. We primarily work with indie filmmakers, but also do some television and ad work. In addition to my work at Soularity, I also work as a sound editor and mixer at a few other Austin post facilities, including Soundcrafter. My credits with them include Richard Linklater’s Boyhood and Everybody Wants Some!!, as well as TV shows such as Shipping Wars and My 600-lb Life.

You recently purchased the Pro Sound Effects NYC Ambisonics library. Can you talk about some VR projects you are working on?
In the coming months I plan to start creating audio content for VR with a local content creator, Deepak Chetty. Over the years we have collaborated on a number of projects; most recently, I worked on his stereoscopic 3D sci-fi/action film, Hard Reset, which won the 2016 “Best 3D Live Action Short” award from the Advanced Imaging Society.

Deepak Chetty shooting a VR project.

I love sci-fi as a genre, because there really are no rules. It lets you really go for it as far as sound. Deepak has been shifting his creative focus toward 360 content and we are hoping to start working together in that aspect in the near future.

Deepak is currently working mostly on non-fiction and documentary-based content in 360, mainly environment capture with a through line of audio storytelling that serves as the backbone of the piece. He is also looking forward to experimenting with fiction-based narratives in the 360 space, especially with the use of spatial audio to enhance immersion for the viewer.

Prior to meeting Deepak, did you have any experience working with VR/3D audio?
No, this is my first venture into the world of VR audio or 3D audio. I have been mixing in surround for over a decade, but I am excited about the additional possibilities this format brings to the table.

What have been the most helpful sources for studying up and figuring out a workflow?
The Internet! There is such a wealth of information out there, and you kind of just have to dive in. The benefit of 360 audio being a relatively new format is that people are still willing to talk openly about it.

Was there anything particularly challenging to get used to or wrap your head around?
In a lot of ways designing audio for VR is not that different from traditional sound mixing for film. You start with a bed of ambiences and then place elements within a surround space. I guess the most challenging part of the transition is anticipating how the audience might hear your mix. If the viewer decides to watch a whole video facing the surrounds, how will it sound?

Can you describe the workflow you’ve established so far? What are some decisions you’ve made regarding DAW, monitoring, software, plug-ins, tools, formats and order of operation?
I am a Pro Tools guy, so my main goal was finding a solution that works seamlessly inside the Pro Tools environment. As I started looking into different options, the Two Big Ears Spatial Workstation really stood out to me as being the most intuitive and easiest platform to hit the ground running with. (Two Big Ears recently joined Facebook, so Spatial Workstation is now available for free!)

Basically, you install a Pro Tools plug-in that works as a 3D audio engine and gives you a Pro Tools project with all the routing and tracks laid out for you. There are object-based tracks that allow you to place sounds within a 3D environment as well as ambience tracks that allow you to add stereo or ambisonic beds as a basis for your mix.

The coolest thing about this platform is that it includes a 3D video player that runs in sync with Pro Tools. There is a binaural preview pathway in the template that lets you hear the shift in perspective as you move the video around in the player. Pretty cool!

In September 2016, another audio workflow for VR in Pro Tools entered the market from the Dutch company Audio Ease: the 360 pan suite. Much like the Spatial Workstation, the suite offers an object-based panner (360 pan) that, when placed on every audio track, allows you to pan individual items within the 360-degree field of view. The 360 pan suite also includes the 360 monitor, which allows you to preview head tracking within Pro Tools.

Where the 360 pan suite really stands out is its video overlay function. When you load a 360 video inside Pro Tools, Audio Ease adds an overlay on top of the Pro Tools video window that lets you pan each track in real time, which is really useful. For the features it offers, it is relatively affordable. The suite does not come with its own template, but there is a quick video guide to get you up and going fairly easily.

Are there any aspects that you’re still figuring out?
Delivery is still a bit up in the air. You may need to export in multiple formats to be able to upload to Facebook, YouTube, etc. I was glad to see that YouTube is supporting the ambisonic format for delivery, but I look forward to seeing workflows become more standardized across the board.

Any areas in which you see the need for further development, and/or where the tech just isn’t there yet?
I think the biggest limitation with VR is the lack of affordable and easy-to-use 3D audio capture devices. I would love to see a super-portable ambisonic rig that filmmakers can easily use in conjunction with shooting 360 video. Especially as media giants like YouTube are gravitating toward the ambisonic format for delivery, it would be great for them to be able to capture the actual space in the same format.

In January 2017, Røde announced the VideoMic Soundfield — an on-camera ambisonic, 360-degree surround sound microphone — though pricing and release dates have not yet been made public.

One new product I am really excited about is the Sennheiser Ambeo VR mic, which is around $1,650. That’s a bit pricey for the casual user once you factor in a four-track recorder, but for the professional user who already has a 788T, the Ambeo VR mic offers a nice turnkey solution. I like that the mic looks a little less fragile than some of the other options on the market. It has a built-in windscreen/cage similar to what you would see on a live handheld microphone. It also comes with a Rycote shockmount and a cable that breaks out to four XLR connectors, which is nice.

Some leading companies have recently selected ambisonics as the standard spatial audio format — can you talk a bit about how you use ambisonics for VR?
Yeah, I think this is a great decision. I like the “future-proof” nature of the ambisonic format. Even in traditional film mixing, I like having the option to export to stereo, 5.1 or 7.1 depending on the project. Until ambisonics becomes more standardized, I like that the Two Big Ears/FB 360 encoder allows you to export to the .tbe B-format (FuMa or ambiX/YouTube) as well as quad-binaural.
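For readers new to those flavors: FuMa and ambiX are both first-order B-format, but they order and scale the four channels differently. FuMa uses the order W, X, Y, Z with W attenuated by 3dB, while ambiX (the variant YouTube accepts) uses the ACN order W, Y, Z, X with SN3D normalization. As a rough, minimal sketch of that difference (not taken from any of the tools mentioned above), a first-order conversion is just a channel reorder plus a gain change on W:

```python
import numpy as np

def fuma_to_ambix(fuma_samples):
    """Convert first-order FuMa B-format audio (channels W, X, Y, Z, with W
    attenuated by 3 dB) to ambiX (ACN channel order W, Y, Z, X, SN3D normalization).

    `fuma_samples` is a (num_samples, 4) float array; returns the same shape.
    First-order only; higher orders need full ACN/SN3D remapping tables.
    """
    w, x, y, z = fuma_samples.T
    w = w * np.sqrt(2)  # undo FuMa's -3 dB scaling on the W channel
    return np.stack([w, y, z, x], axis=1)  # reorder to ACN: W, Y, Z, X
```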

I am a huge fan of the ambisonic format in general. The Pro Sound Effects NYC Ambisonics Library (and now Chicago and Tokyo as well) was my first experience using the format and I was blown away. In a traditional mixing environment it adds another level of depth to the backgrounds. I really look forward to being able to bring it to the VR format as well.


Andrew Emge is operations manager at Pro Sound Effects.


Imaginary Forces expands with EP Chris Hill and director of biz dev Sami Tahari

Imaginary Forces has added executive producer Chris Hill and director of business development Sami Tahari to its Los Angeles studio. The additions come at a time when the creative studio is looking to further expand its cross-platform presence with projects that mix VR/AR/360 with traditional, digital and social media.

Celebrating 20 years in business this year, the independently owned Imaginary Forces is a creative company specializing in brand strategy and visual storytelling encompassing many disciplines, including full-service design, production and post production. Being successful for that long in this business means they are regularly innovating and moving where the industry takes them. This led to the hiring of Hill and Tahari, whose diverse backgrounds will help strengthen the company’s long-standing relationships, as well as its continuous expansion into emerging markets.

Recent work of note includes main titles for Netflix’s beloved Stranger Things, the logo reveal for Michael Bay’s Transformers: The Last Knight and an immersive experience for the Empire State Building.

Hill’s diverse production experience includes commercials, experience design, entertainment marketing and branding for such clients as HBO Sports, Google, A&E and the Jacksonville Jaguars, among others. He joins Imaginary Forces after recently presiding over the broadcast division of marketing agency BPG.

Tahari brings extensive marketing, business and product development experience spanning the tech and entertainment spaces. His resume includes time at Lionsgate and Google, where he played an instrumental role in the creative development and marketing of Google Glass.

“Imaginary Forces has a proven ability to use design and storytelling across any medium or industry,” adds Hill. “We can expand that ability to new markets, whether it’s emerging technologies, original content or sports franchises. When you consider, for example, the investment in massive screens and new technologies in stadiums across the country, it demands [that] same high level of brand strategy and visual storytelling.”

Our Main Image: L-R: Chris Hill and Sami Tahari.