Tag Archives: Lucas Wilson

HPA Tech Retreat takes on realities of virtual reality

By Tom Coughlin

The HPA Tech Retreat, run by the Hollywood Professional Association in association with SMPTE, began with an insightful one-day VR seminar, Integrating Virtual Reality/Augmented Reality into Entertainment Applications. Lucas Wilson from SuperSphere kicked off the sessions and helped with much of the organization of the seminar.

The seminar addressed virtual reality (VR), augmented reality (AR) and mixed reality (MR, a subset of AR where the real world and the digital world interact, like Pokémon Go). As in traditional planar video, 360-degree video still requires a director to tell a story and direct the eye to see what is meant to be seen. Successful VR requires understanding how people look at things and how they perceive reality, and using that understanding to help tell a story. Reinforcing the viewer’s gaze with color and sound cues can help, and those cues may vary with the viewer; for example, they might differ for the “good guy” and the “bad guy.”

VR workflows are quite different from traditional ones, with many elements changing with multiple-camera content. For instance, it is much more difficult to keep a camera crew out of the image, and providing proper illumination for all the cameras can be a challenge. The image below from Jaunt shows their 360-degree workflow, including the use of their cloud-based computational image service to stitch the images from the multiple cameras.

Snapchat is the biggest MR application, said Wilson, and Snapchat Stories could be the basis of future post tools.

Because stand-alone headsets (head-mounted displays, or HMDs) are expensive, most users of VR rely on smartphone-based displays. There are also venues that allow one or more people to experience VR, such as the IMAX center in Los Angeles. Activities such as VR viewing will be one of the big drivers for higher-resolution mobile device displays.

Tools that allow artists and directors to get fast feedback on their shots are still in development. But progress is being made, and today over 50 percent of VR is used for video viewing rather than games. Participants in a VR/AR market session, moderated by the Hollywood Reporter’s Carolyn Giardina and including Marcie Jastrow, David Moretti, Catherine Day and Phil Lelyveld, seemed to agree that the biggest immediate opportunity is probably with AR.

Koji Gardiner from Jaunt gave a great talk on their approach to VR. He discussed the various ways that 360-degree video can be captured and the processing required to create finished stitched video. For an array of cameras with some separation between them (no common axis point for the imaging cameras), there will be areas between camera images that must be stitched together using common reference points, as well as blind spots close to the cameras where no camera captures an image.
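
To get a feel for where those blind spots end, here is a minimal sketch (illustrative numbers and simplified geometry, not Jaunt’s actual rig math) that models the cameras as evenly spaced on a ring and solves for the point where the facing field-of-view edges of two neighboring cameras first cross; anything closer to the rig along that seam is seen by neither camera.

```python
import numpy as np

def coverage_distance(num_cams, ring_radius_m, cam_hfov_deg):
    """Distance from the rig center at which two adjacent cameras on a ring
    start to overlap; closer than this, the seam region is a blind spot.
    Assumes each camera's FOV is wider than the angular spacing."""
    delta = 2 * np.pi / num_cams                 # angular spacing between cameras
    half_fov = np.radians(cam_hfov_deg) / 2

    # Camera A at angle 0 and camera B at angle delta, both facing outward.
    a_pos = np.array([ring_radius_m, 0.0])
    b_pos = ring_radius_m * np.array([np.cos(delta), np.sin(delta)])
    a_edge = np.array([np.cos(half_fov), np.sin(half_fov)])                   # A's edge toward B
    b_edge = np.array([np.cos(delta - half_fov), np.sin(delta - half_fov)])   # B's edge toward A

    # Intersect the two edge rays: a_pos + t*a_edge = b_pos + s*b_edge.
    t, _ = np.linalg.solve(np.column_stack([a_edge, -b_edge]), b_pos - a_pos)
    return np.linalg.norm(a_pos + t * a_edge)

# Example: a 24-camera ring with a 10cm radius and 130-degree lenses
# (loosely based on the Jaunt One figures quoted later in this piece).
print(f"coverage begins ~{coverage_distance(24, 0.10, 130):.2f} m from the rig center")
```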

If all of the cameras effectively share a single axis, there are no blind spots and no parallax to correct in stitching, as shown in the image below. Covering the full 360-degree space still requires additional cameras located on that axis.

The Fraunhofer Institute in Germany has for several years been showing a 360-degree video camera that gives several cameras an effectively single axis, as shown below. They do this using mirrors to reflect the scene into the individual cameras.

As the number of cameras is increased, the parallax between adjacent views shrinks, reducing the mathematical work needed to stitch the 360-degree images together.

Stitching
There are two approaches commonly used to stitch multiple camera videos for VR. The easier to implement is a geometric approach that uses known geometries and assumed distances to objects. It requires limited computational resources, but it results in unavoidable ghosting artifacts at the seams between the separate images.

The optical flow approach synthesizes every pixel by computing correspondences between neighboring cameras. This eliminates the ghosting artifacts at the seams, but it has its own more subtle artifacts and requires far more processing capability than most content creators have available. This has led to a growing market for cloud services that take in multi-camera video streams and perform the stitching to create finished 360-degree videos.
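
As a rough illustration of the difference between the two approaches (a simplified sketch, not Jaunt’s actual pipeline), the snippet below blends the overlap region from two neighboring cameras two ways: a plain average, which ghosts wherever there is parallax, and an optical-flow blend that warps each image halfway toward the other before averaging. The inputs strip_a and strip_b are assumed to be the overlapping strips already projected into a common coordinate space.

```python
import cv2
import numpy as np

def geometric_blend(strip_a, strip_b):
    # Geometric approach: assume a fixed scene distance and simply average the
    # overlap. Cheap, but any parallax between the cameras shows up as ghosting.
    return cv2.addWeighted(strip_a, 0.5, strip_b, 0.5, 0)

def optical_flow_blend(strip_a, strip_b):
    # Optical-flow approach: estimate per-pixel correspondences between the two
    # views, warp each image halfway toward the other, then average. Slower,
    # but the seam lines up instead of ghosting.
    gray_a = cv2.cvtColor(strip_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(strip_b, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(gray_a, gray_b, None,
                                        0.5, 3, 31, 5, 5, 1.1, 0)
    h, w = gray_a.shape
    gx, gy = np.meshgrid(np.arange(w, dtype=np.float32),
                         np.arange(h, dtype=np.float32))
    # Sample each source half a correspondence away so both meet in the middle.
    b_half = cv2.remap(strip_b, gx + 0.5 * flow[..., 0], gy + 0.5 * flow[..., 1],
                       cv2.INTER_LINEAR)
    a_half = cv2.remap(strip_a, gx - 0.5 * flow[..., 0], gy - 0.5 * flow[..., 1],
                       cv2.INTER_LINEAR)
    return cv2.addWeighted(a_half, 0.5, b_half, 0.5, 0)
```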

Files from the Jaunt One camera system are first downloaded and organized on a laptop and then uploaded to Jaunt’s cloud servers, which perform the stitching to create a finished 360-degree video. Omnidirectionally captured audio can also be uploaded and mixed ambisonically, giving the audio directionality that is tied to the VR video experience.

Google and Facebook also have cloud-based resources for computational photography used for this sort of image stitching.

The Jaunt One 360-degree camera has a 1-inch 20MP rolling-shutter sensor per module, with frame rates up to 60fps, a maximum of ISO 3200, and 29dB SNR at ISO 800. Each camera module offers 10 stops of dynamic range, a 130-degree diagonal FOV and f/2.9 optics, and the system supports up to 16K resolution (8K per eye). At 60fps the Jaunt One produces 200GB per minute uncompressed, which can fill a 1TB SSD in five minutes. They are forced to use compression to be able to use currently affordable storage devices; the compressed output is 11GB per minute, which fills a 1TB SSD in about 90 minutes.
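
The fill-time arithmetic checks out (decimal gigabytes and terabytes assumed):

```python
# Minutes of Jaunt One footage that fit on a 1TB SSD at the quoted data rates.
ssd_gb = 1000                 # 1TB, decimal as drive vendors count it
print(ssd_gb / 200)           # uncompressed at 200GB/min -> 5.0 minutes
print(round(ssd_gb / 11))     # compressed at 11GB/min    -> ~91 minutes
```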

The actual stitched image, laid out flat, looks like a distorted projection. But when viewed in a stereoscopic viewer it appears as a natural image of the world around the viewer, giving an immersive experience. At any point in time the viewer does not see all of the image, only the restricted region they are looking at directly, shown as the red box in the figure below.

The full 360-degree image can be fairly high resolution, but the resolution inside the region being viewed at any point in time will be much less than the resolution of the overall scene, unless special steps are taken.

The image below shows that for a 4K 360-degree video, the resolution in the field of view (FOV) may be only about 1K, a loss that is quite perceptible to the human eye.

To provide a better viewing experience in the FOV, either the resolution of the entire view must be higher (e.g., the high-resolution Jaunt One configuration delivers 8K per eye, or 16K total displayed resolution) or there must be a way to increase the resolution in the most significant FOV of a video, so that at least there the resolution gives a greater feeling of reality.
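
A quick back-of-the-envelope calculation shows where those numbers come from, assuming a roughly 96-degree horizontal headset FOV (a typical HMD value, not a figure from the talk):

```python
def pixels_across_fov(equirect_width, fov_deg=96):
    # An equirectangular frame spreads its width across 360 degrees,
    # so the headset only ever shows a slice of it at a time.
    return equirect_width * fov_deg / 360

print(pixels_across_fov(3840))    # "4K" master  -> 1024 pixels across the view (~1K)
print(pixels_across_fov(15360))   # "16K" master -> 4096 pixels across the view (~4K)
```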

Virtual reality, augmented reality and mixed reality create new ways of interacting with the world around us, and they will drive consumer technologies and the need for 360-degree video. New tools and stitching software, much of it cloud-based, will enable these workflows for those who want to participate in this revolution in content. The role of the director is as important as ever, since new methods are needed to tell stories and guide the viewer to engage with them.

2017 Creative Storage Conference
You can learn more about the growth of VR content in professional video, and how it will drive new digital storage demand and technologies to support the high data rates needed for captured content and cloud-based VR services, at the 2017 Creative Storage Conference, taking place May 24, 2017 in Culver City.


Thomas M. Coughlin of Coughlin Associates is a storage analyst and consultant. He has over 30 years in the data storage industry and is the author of Digital Storage in Consumer Electronics: The Essential Guide.

HPA Tech Retreat takes on VR/AR at Tech Retreat Extra

The long-standing HPA Tech Retreat is always a popular destination for tech-focused post pros, and while they have touched on virtual reality and augmented reality in the past, this year they are dedicating an entire day to the topic — February 20, the day before the official Retreat begins. TR-X (Tech Retreat Extra) will feature VR experts and storytellers sharing their knowledge and experiences. The traditional HPA Tech Retreat runs from February 21-24 in Indian Wells, California.

TR-X VR/AR is co-chaired by Lucas Wilson (Founder/Executive Producer at SuperSphereVR) and Marcie Jastrow (Senior VP, Immersive Media & Head of Technicolor Experience Center), who will lead a discussion focused on the changing VR/AR landscape in the context of rapidly growing integration into entertainment and applications.

Marcie Jastrow

Experts and creative panelists will tackle questions such as: What do you need to understand to enable VR in your environment? How do you adapt? What are the workflows? Storytellers, technologists and industry leaders will provide an overview of the technology and discuss how to harness emerging technologies in the service of the artistic vision. A series of diverse case studies and creative explorations — from NASA to the NFL — will examine how to engage the audience.

The TR-X program, along with the complete HPA Tech Retreat program, is available here. Additional sessions and speakers will be announced.

TR-X VR/AR Speakers and Panel Overview
Monday, February 20

Opening and Introductions
Seth Hallen, HPA President

Technical Introduction: 360/VR/AR/MR
Lucas Wilson

Panel Discussion: The VR/AR Market
Marcie Jastrow
David Moretti, Director of Corporate Development, Jaunt
Catherine Day, Head of VR/AR, Missing Pieces
Phil Lelyveld, VR/AR Initiative Program Lead, Entertainment Technology Center at USC

Acquisition Technology
Koji Gardiner, VP, Hardware, Jaunt

Live 360 Production Case Study
Andrew McGovern, VP of VR/AR Productions, Digital Domain

Live 360 Production Case Study
Michael Mansouri, Founder, Radiant Images

Interactive VR Production Case Study
Tim Dillon, Head of VR & Immersive Content, MPC Advertising USA

Immersive Audio Production Case Study
Kyle Schember, CEO, Subtractive

Panel Discussion: The Future
Alan Lasky, Director of Studio Product Development, 8i
Ben Grossmann, CEO, Magnopus
Scott Squires, CTO, Creative Director, Pixvana
Moderator: Lucas Wilson
Jen Dennis, EP of Branded Content, RSA

Panel Discussion: New Voices: Young Professionals in VR
Anne Jimkes, Sound Designer and Composer, Ecco VR
Jyotsna Kadimi, USC Graduate
Sho Schrock, Chapman University Student
Brian Handy, USC Student

TR-X also includes an ATSC 3.0 seminar, focusing on the next-generation television broadcast standard, which is nearing completion and offers a wide range of new content delivery options to the TV production community. This session will explore the expanding possibilities that the new standard provides in video, audio, interactivity and more. Presenters and panelists will also discuss the complex next-gen television distribution ecosystem that content must traverse, and the technologies that will bring the content to life in consumers’ homes.

Early registration is highly recommended for TR-X and the HPA Tech Retreat, which is a perennially sold-out event. Attendees can sign up for TR-X VR/AR, TR-X ATSC or the HPA Tech Retreat.

Main Image: Lucas Wilson.

Paul McCartney’s VR doc series produced by Jaunt

When Sir Paul McCartney, a genuine living legend, agrees to let you shoot a virtual reality documentary series in conjunction with his new album, what do you do? You pull together an A-list team of production and post pros to get it right! Especially when so much of this road to VR remains uncharted.

Paul McCartney and Tony Kaye

Jaunt, which produces and publishes cinematic virtual reality content, was up to the task of shooting, posting and publishing this six-part series of VR documentary shorts. The first two episodes of Pure McCartney VR are available now. As part of the upcoming launch of Pure McCartney, an album that spans his career, each of these immersive VR experiences digs into the stories behind some of McCartney’s most iconic songs.

That A-list team we referred to before includes director/DP Tony Kaye, producer and soundscape architect Geoff Emerick (who worked on the Beatles albums Revolver, Sgt. Pepper’s Lonely Hearts Club Band, The Beatles and Abbey Road), and executive producers Cliff Plumer, Lucas Wilson and Doug Allenstein. We will get into the post team shortly, but the series was shot on a Jaunt One camera and used the Jaunt Cloud Services Platform for stitching and publishing.

In the VR experience, viewers go inside this rock legend’s home studio as he shares stories related to various songs. He also shares archived and never-before-seen footage. The Pure McCartney VR episodes Dance Tonight and Coming Up are available now, with My Valentine, Mull of Kintyre and Early Days to be released one episode at a time leading up to the Pure McCartney album release on June 10.

The VR episodes include digitally remastered and spatially oriented ambisonic audio mixed in Dolby Atmos. This project marks the first time an original Paul McCartney track has been remixed in Dolby Atmos. Sound designer/re-recording mixer on the project was Luke Bechthold.

“The pieces are creative, have a point of view and are very quirky and real 360 pieces,” explains EP Wilson. “These were conceived by Tony Kaye as 360 pieces in stereo, not 2D.”

Post Production
The post was done in a big room at Jaunt Studios in LA. “That’s where we lived for a couple of months,” explains Wilson. “There was a ton of creative back and forth, which we needed because so much of this stuff hadn’t been done before. There is no body of work to look at as an example. We just tried things out until we got it to where we wanted it to be.”

The series was edited in Final Cut Pro X by Duncan Shepherd, working with two assistants. VFX were created by Machineyes using Houdini, Maya and After Effects. Dave Franks did the color grade using Scratch. “Scratch offers realtime VR color grading, so Dave sat with Tony Kaye using an Oculus Rift and just did the grade.”

“We used to see artists connect with their fans through album covers and liner notes but that personal expression, and deeper understanding of the music, has diminished over the years,” says Plumer, president of Jaunt Studios. “With virtual reality, Paul McCartney is taking the most innovative step yet; he’s connecting directly with his fans, to share his innermost thoughts and experiences in an entirely new, personal and immersive way.”

Each track will be released into the custom Paul McCartney world within the Jaunt VR App. Jaunt VR is available on iOS, Android, Gear VR, Oculus Rift, HTC Vive and Desktop 360.

Lucas Wilson on Scratch’s new end-to-end VR workflow

With NAB looming, and chatter pointing to virtual reality being ubiquitous on the show floor, Assimilate has launched its new Scratch VR Suite, an end-to-end virtual reality workflow with an all-inclusive realtime toolset for working within a 360 environment. The Scratch VR Suite includes features from Scratch V8.4 (soon to be Scratch 8.5). Scratch VR also includes Scratch Web and creative tools that are specific to the VR Suite. Scratch Web enables realtime, online collaboration and review via Google Cardboard and Samsung GearVR headsets.

We reached out to Lucas Wilson, VR producer at Assimilate, to find out more about the product and workflow.

Lucas Wilson on set.

Can you walk us through the workflow of someone shooting VR content and using Scratch VR from on-set to post?
In many ways, Assimilate has kind of just removed “VR” as an issue in a lot of post production, bringing it back to “just production.” Scratch does not do any stitching. So, once material is stitched you take these steps…

1) Publish to Scratch Web and generate review links for clients.
2) Review links can be opened and played in either “Magic Window” or VR-Cardboard mode, allowing for an effective, real, headset-based review workflow for dailies.
3) VR goes through to editorial.
4) Conform in Scratch.
5) Scratch can then grade in a true VR mode, with 360 viewer options on the desktop, to an external monitor, or live to an Oculus Rift DK2, in mono or stereo.
6) In addition, grading “respects” 360 mode: shapes will wrap around in 360 mode, respecting the edges in a lat/long frame, etc. It is real grading and finishing in 360/VR.
7) Publish to YouTube 360 with correctly inserted metadata, or as a normal equirectangular video for publishing elsewhere.

What are some common areas of focus that people who are just jumping into VR need to know from the outset?
The best advice I can give is to think through your workflow carefully from the beginning. Planning and pre-production are so important in any VR project, and skimping on them can get you into trouble more quickly than with many traditional projects.

You’ve been out on the road shooting VR in real-world situations. What has surprised you the most about this process, and can you talk about some of the tools that were built into the VR suite based on your input?
The biggest surprise was two-fold: the necessity of dealing quickly and effectively with stitching, and then the complete lack of good review and finish tools in VR. Getting anything reviewed by a client was a painful process before Scratch Web. My real-world experience (I think) had a big influence on the VR suite, because in that sense I was kind of a “customer with an inside track.” I was able to feed back my pain quickly and easily to the team, and they listened. Using Scratch and Scratch Web, I can review, conform, color, finish and deliver in VR.

——–

Assimilate and Wilson will be at NAB offering demos

Talking to Assimilate about new VR dailies/review tool

CEO Jeff Edson and VP of biz dev Lucas Wilson answer our questions

By Randi Altman

As you can tell from our recent Sundance coverage, postPerspective has a little crush on VR. While we know that today’s VR is young and creatives are still figuring out how it will be used — narrative storytelling, gaming, immersive concerts (looking at you Paul McCartney), job training, therapy, etc. — we cannot ignore how established film fests and trade shows are welcoming it, or the tools that are coming out for its production and post.

One of those tools comes from Assimilate, which is expanding its Scratch Web cloud-platform capabilities to offer a professional, web-based dailies/review tool for reviewing headset-based 360-degree VR content, regardless of location.

How does it work? Kind of simply: Users launch the link vr360.sweb.media on an Android phone (Samsung S6 or other) via Chrome, click the goggles in the lower right corner, put the phone in their Google Cardboard and view immediate headset-based VR. Once users launch the Scratch Web review link for the VR content, they can play back the VR imagery, pan around it or create a “magic window” so they can move their smartphone around, similar to looking through a window to see the 360-degree content behind it.

The VR content, including metadata, is automatically formatted for 360-degree video headsets, such as Google Cardboard. The reviewer can then make notes and comments on their mobile device to send back to the sender. The company says they will be announcing support for other mobile devices, headsets and browsers in the near future.

On the heels of this news, we decided to reach out to Assimilate CEO Jeff Edson and VP of business development Lucas Wilson to find out more.

Assimilate has been offering tools for VR, but with this new dailies and reviews tool, you’ve taken it to a new level. Can you talk about the evolution of how you service VR and how this newest product came to be?
Jeff Edson: Professional imagery needs professional tools and workflows to succeed. Much like imagery evolutions to date (digital cinema), this is a new way to capture and tell stories and provide experiences. VR provides a whole new way for people to tell stories amongst other experiences.

So regarding the evolution of tools, Scratch has supported the 360 format for a while now. It has allowed people to playback their footage as well as do basic DI — basic functionality to help produce the best output. As the production side of VR continues to evolve, the workflow aligns itself with a more standard process. This means the same toolset for VR as exists for non-VR. Scratch Web-VR is the natural progression to provide VR productions with the ability to review dailies worldwide.

Lucas Wilson: When VR first started appearing as a real deliverable for creative professionals, Assimilate jumped in. Scratch has supported 360 video live to an Oculus Rift for more than a year now. But with the new Scratch Web toolset and the additional tools added in Scratch to make 360 work more easily and be more accessible, it is no longer just a feature added to a product. It is a workflow and process — review and approval for Cardboard via a web link, or via the free Scratch Play tool, along with color and finishing with Scratch.

It seems pretty simple to use, how are you able to do this via the cloud and through a standard browser?
Jeff: The product is very straightforward to use, as there is a very wide range of people who will have access to it, most of whom do not want the technology to get in the way of the solution. We work very hard at the core of all we have developed — interactive performance.

Lucas: Good programmers (smiles)! Seriously though, we looked at what was needed and what was missing in the VR delivery chain and tried to serve those needs. Scratch Web allows users to upload a clip and generate a link that will work in Cardboard. Review and approval is now just clicking a link and putting your phone into a headset.

What’s the price?
Jeff: The same price as Scratch Web — Free-Trial, Basic-$79/month, Extended-$249/month and Enterprise for special requirements.

Prior to this product, how were those working on VR production going about dailies and reviews?
Jeff: In most cases they were doing it by looking at output from several cameras for review. The main process for viewing was to edit and publish. There really was no tool targeted at dailies/review of VR.

Lucas: It has been really difficult. Reviews are typically done on a flat screen and by guessing, or by reverse engineering MilkVR or Oculus Videos in GearVR.

Can you talk about real-world testing of the product? VR productions that used this tool?
Lucas: We have a few large productions doing review and approval right now with Scratch Web. We can’t talk about them yet, but one of them is the first VR project directed by an A-list director. Two of the major sports leagues in the US have also employed the tool.

SuperSphere and Fox team on ‘Scream Queens’ VR videos

Fox Television decided to help fans of its Scream Queens horror/comedy series visit the show’s set in a way that wasn’t previously possible, thanks to eight new virtual reality short videos. For those of you who haven’t seen the show, Scream Queens focuses on a series of murders tied to a sorority and is set at a fictional college in New Orleans. The VR videos have been produced for the Samsung Milk VR, YouTube 360° and Facebook platforms and are rolling out in the coming weeks.

Fox called on SuperSphere Productions — a consultancy that helps with virtual reality project execution and delivery — to bring their VR concepts to life. SuperSphere founder Lucas Wilson worked closely with Fox creative and marketing executives to develop the production, post and delivery process using the talent, tools and equipment already in place for Scream Queens.

“It was the first VR shoot for a major episodic that proved the ability to replicate a realistic production formula, because so much of VR is very ‘science project-y’ right now,” explains industry vet Wilson, who many of you might know from his work with Assimilate and his own company Revelens.

Lucas Wilson

Wilson reports that this project had a reasonable budget and a small crew. “This allowed us to work together to produce and deliver a wide series of experiences for the show that — judging by reaction on Facebook — are pretty successful. As of late November, the Closet Set Tour (extended) has over 660,000 views, over 31,000 likes and over 10,500 shares. In product terms, that price/performance ratio is pretty damn impressive.”

The VR content, captured over a two-day shoot on the show’s set in New Orleans, was directed by Jessica Sanders and shot by the 360Heros team. “The fact that a woman directed these videos is relevant, and it’s intentional. For virtual reality to take root and grow in every corner of the globe, it must become clear very quickly that VR is for everyone,” says Wilson. “So in addition to creating compelling content, it is critical for that content to be produced and influenced by talented people who bring a wide range of perspectives and experiences. Hiring smart, ambitious women like Jessica as directors and DPs is a no-brainer. SuperSphere’s mission is to open up a whole new kind of immersive, enriching experience to everyone on the planet. To reach everyone, you have to include everyone… from the beginning.”

In terms of post, editorial and the 5.1 sound mix were done by Fox’s internal team. SuperSphere did the conform and finish on Assimilate Scratch VR. Local Hero did the VR grading, also on Scratch VR. “The way we worked with Local Hero was actually kinda cool,” explains Wilson. “Most of the pieces are very single-location with highly controlled lighting. We sent them representative still frames, and they graded the stills and sent back a Scratch preset, which we used to then render and conform/output. SuperSphere then output the three different VR deliverables — Facebook, Milk VR and YouTube.”

Two videos have already launched — the first includes a behind-the-scenes tour, mentioned earlier, of the set and closet of Chanel Oberlin (Emma Roberts), created by Scream Queens production designer Andrew Murdock. The second shows a screaming match between the Scream Queens‘ Zayday Williams (Keke Palmer) and Grace Gardner (Skyler Samuels).

Following the Emmy-winning Comic-Con VR experience for its drama Sleepy Hollow last year, these Scream Queens videos mark the first of an ongoing Fox VR and augmented reality initiative for its shows.

“The intelligent way that Fox went about it, and how SuperSphere and Fox worked together to very specifically create a formula for replication and success, is in my opinion a model for how episodic television can leverage VR into an overall experience,” concludes Wilson.

Quick Chat: Assimilate’s Lucas Wilson talks about Scratch Web

Recently, Assimilate launched Scratch Web, a cloud-based multi-user collaboration tool that offers individual clip, timeline or timeline plus version sharing (known as Scratch Construct) as well as native support for industry-standard RAW camera formats.

It’s already in use at Santa Monica’s Local Hero Post, where founder and supervising colorist Leandro Marini has made it part of his everyday workflow. Keep an eye on this space for a Scratch Web review from Marini.

To find out more about the product itself, we picked the brain of Assimilate’s VP of business development, Lucas Wilson.

Revelens: Making online viewing more interactive

By Randi Altman

Industry vet Lucas Wilson loves metadata. So much so he has built a company around it. Revelens offers non-disruptive web-based contextual video bookmarking. What’s that you ask?

Imagine watching an episode of your favorite show online and seeing a watch you might consider buying… if only you knew where to find it. That is where Revelens technology comes in. Tap or click on the screen to register a bookmark – you can either immediately swipe up to open the link and get the information you want, or just continue watching the show. When you do choose to open that bookmark, there’s a link to the watch, the UPC code, the price and information on the product.
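
A Revelens-style bookmark implies little more than a timestamp in the video plus the product metadata to surface later; the sketch below uses hypothetical field names of my own, not the company’s actual schema.

```python
from dataclasses import dataclass

@dataclass
class VideoBookmark:
    # Hypothetical fields, based on the attributes described above.
    video_id: str        # which episode the viewer was watching
    timecode_s: float    # where in the video the viewer tapped
    product_name: str    # e.g. the watch seen on screen
    product_url: str     # link to buy or learn more
    upc: str             # UPC code of the product
    price_usd: float     # listed price

# Registered at tap time; opened immediately with a swipe up, or saved for later.
bookmark = VideoBookmark("episode-101", 734.2, "hero's wristwatch",
                         "https://example.com/product", "012345678905", 249.99)
print(bookmark)
```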


The future of post — one man’s vision, part II

By Lucas Wilson

Tremors lead to earthquakes, and the industry has felt a few… the shelves are starting to rattle. And the Big One is not far away.

There is a fundamental change happening in the minds of creators right now. It is possibly the biggest shift since the dawn of film and the ability to make still pictures appear to move.


Blog: The future of post — one man’s vision

By Lucas Wilson

Content is exploding, yet production and post production are crumbling. Those two statements put together make no sense if you’re stuck thinking about the market in the terms we all grew up with.

The world of tentpoles and episodic television is doing what markets do: maturing, automating, and flowing to the lowest cost centers. That will not change or reverse course. Finance… Steel… Automotive… Fabric/Clothing… this is not a new or unpredictable course.