
We Want to Hear From You – Take Our Storage Survey!

If you’re working in post production, animation, VFX and/or VR/AR/360, please take our short survey and tell us what works (and what doesn’t work) for your day-to-day needs.

As today’s visual effects continue to become more sophisticated, and as VR/AR/360 video open up new frontiers for content creation, storage is more important than ever.

What do you need from a storage solution? Your opinion is important to us, so please complete the survey by March 8th.

We want to hear your thoughts… so click here to get started now!


Review: Mettle VR plug-ins for Adobe Premiere

By Barry Goch

I was very frustrated. I took a VR production class and bought an LG 360 camera, but I still felt like I was missing something. Then it dawned on me — I wanted more control. I started editing 360 videos using the VR video viewing tools in Adobe Premiere Pro, but I was still lacking the control I desired. I wanted my audience to have a guided, immersive experience without having to be in a swivel chair to get the most out of my work. Then, like a bolt of lightning, it came to me — I needed to rotate the 360 video sphere. I needed to be able to reorient it to accomplish my vision, but how would I do that?

Rotate Sphere plug-in showing keyframing.

Mettle’s Skybox 360/VR Tools are exactly what I was looking for. The Rotate Sphere plug-in alone is worth the price of the entire plug-in package. With this one plug-in, you’re able to re-orient your 360 video without worrying about any technical issues — it gives you complete creative control to re-frame your 360 video — and it’s completely keyframable too! For example, I mounted my 360 camera on my ski helmet this winter and went down a ski run at Heavenly in Lake Tahoe. There are amazing views of the lake from this run, but I also needed to follow the skiers ahead of me. Plus, the angle of the slope changed and the angle to the subjects I was following changed as well. Since the camera was fixed, how could I guide the viewer? By using the Rotate Sphere plug-in from Mettle and keyframing the orientation of the shot as the slope/subject relationship changed relative to my position.
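Mettle hasn’t published how the plug-in works internally, but the underlying “crazy math” is standard spherical geometry, and a minimal sketch makes the idea concrete. In the Python/NumPy toy below (the function and parameter names are mine, not Mettle’s), every pixel of an equirectangular frame maps to a direction on the unit sphere, the directions are rotated by yaw/pitch/roll, and the frame is resampled. Keyframing the effect, as in the ski shot above, amounts to animating those three angles from frame to frame.

```python
import numpy as np

def rotate_equirect(img, yaw=0.0, pitch=0.0, roll=0.0):
    """Re-orient an equirectangular frame (H x W x 3) by rotating the
    viewing sphere. Angles in radians; nearest-neighbor resampling."""
    h, w = img.shape[:2]
    # Each output pixel corresponds to a longitude/latitude on the sphere
    lon = (np.arange(w) + 0.5) / w * 2 * np.pi - np.pi
    lat = np.pi / 2 - (np.arange(h) + 0.5) / h * np.pi
    lon, lat = np.meshgrid(lon, lat)
    # Angles -> unit direction vectors
    dirs = np.stack([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)], axis=-1)
    # Compose yaw (about z, vertical), pitch (about y) and roll (about x)
    def rz(a): return np.array([[np.cos(a), -np.sin(a), 0],
                                [np.sin(a),  np.cos(a), 0], [0, 0, 1]])
    def ry(a): return np.array([[np.cos(a), 0, np.sin(a)], [0, 1, 0],
                                [-np.sin(a), 0, np.cos(a)]])
    def rx(a): return np.array([[1, 0, 0], [0, np.cos(a), -np.sin(a)],
                                [0, np.sin(a), np.cos(a)]])
    dirs = dirs @ (rz(yaw) @ ry(pitch) @ rx(roll)).T
    # Rotated directions -> source pixel coordinates
    src_lon = np.arctan2(dirs[..., 1], dirs[..., 0])
    src_lat = np.arcsin(np.clip(dirs[..., 2], -1.0, 1.0))
    col = ((src_lon + np.pi) / (2 * np.pi) * w).astype(int) % w
    row = np.clip(((np.pi / 2 - src_lat) / np.pi * h).astype(int), 0, h - 1)
    return img[row, col]
```

A pure yaw change reduces to a simple horizontal shift of the frame; it is pitch and roll that require the full spherical remap.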

My second favorite plug-in is Project 2D. Without it, titles added to your 360 videos become warped, and you have very little control over their appearance. With Project 2D, you create your title using the built-in titler in Premiere Pro, add it to the timeline, then apply the Project 2D Mettle Skybox plug-in. Now you have complete control over the scale and rotation of the titling element, as well as its placement within the 360 video sphere. You can also use Project 2D to composite graphics or video into your 360 video environment.
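As an illustration only (this is not Mettle’s algorithm; the function and its parameters are hypothetical), the geometry behind placing a flat element into the sphere can be sketched as an inverse projection: treat the title as a plane floating one unit in front of a chosen view direction, and for every equirectangular pixel whose ray hits that plane, sample the title instead of warping the title itself.

```python
import numpy as np

def project_2d(equirect, overlay, yaw=0.0, pitch=0.0, fov_deg=60.0):
    """Composite a flat overlay (oh x ow x 3) into an equirectangular frame
    (H x W x 3) by treating it as a plane one unit in front of the view
    direction (yaw/pitch in radians). Nearest-neighbor, no alpha, for brevity."""
    H, W = equirect.shape[:2]
    oh, ow = overlay.shape[:2]
    lon = (np.arange(W) + 0.5) / W * 2 * np.pi - np.pi
    lat = np.pi / 2 - (np.arange(H) + 0.5) / H * np.pi
    lon, lat = np.meshgrid(lon, lat)
    # Direction of each output pixel, rotated so the overlay centre sits on +x
    lon_r = lon - yaw
    x = np.cos(lat) * np.cos(lon_r)
    y = np.cos(lat) * np.sin(lon_r)
    z = np.sin(lat)
    cp, sp = np.cos(pitch), np.sin(pitch)
    x, z = cp * x + sp * z, -sp * x + cp * z
    # Intersect forward-facing rays with the plane x = 1
    half_w = np.tan(np.radians(fov_deg) / 2)   # plane half-width
    half_h = half_w * oh / ow                  # preserve overlay aspect ratio
    fwd = x > 1e-6
    u = np.divide(y, x, out=np.zeros_like(x), where=fwd)
    v = np.divide(z, x, out=np.zeros_like(x), where=fwd)
    hit = fwd & (np.abs(u) < half_w) & (np.abs(v) < half_h)
    # Plane coordinates -> overlay pixels, then composite
    px = ((u / half_w + 1) / 2 * (ow - 1)).astype(int)
    py = ((1 - (v / half_h + 1) / 2) * (oh - 1)).astype(int)
    out = equirect.copy()
    out[hit] = overlay[py[hit], px[hit]]
    return out
```

Scale maps naturally to the field-of-view parameter here, and placement to the yaw/pitch angles, which mirrors the kind of scale and placement controls the plug-in exposes.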

Mobius Zoom transition in action.

Rounding out the Skybox plug-in set are 360 video-aware plug-ins that every content creator needs. What do I mean by 360 video-aware? For example, when you apply a blur that is not 360-content-aware, it crosses the seam where the equirectangular video’s edges join together and makes the seam unseemly. With the Skybox Blur, Denoise, Glow and Sharpen plug-ins, you don’t have this problem. Like the Rotate Sphere plug-in, which does the crazy math to rotate your 360 video without distortion or artifacts, these plug-ins account for the spherical geometry of the frame.
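To make the seam problem concrete, here is a toy box blur in Python/NumPy that wraps around the left/right edge instead of stopping at it. This only illustrates the principle; a production-grade 360-aware filter, like the ones described above, also has to account for the stretching toward the poles, which this sketch ignores:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def seam_aware_box_blur(img, radius=4):
    """Box-blur an equirectangular image (H x W x 3) without breaking the
    seam: pad left/right by wrapping (those edges meet on the sphere)
    and pad top/bottom by clamping."""
    f = img.astype(np.float64)
    f = np.pad(f, ((radius, radius), (0, 0), (0, 0)), mode="edge")
    f = np.pad(f, ((0, 0), (radius, radius), (0, 0)), mode="wrap")
    k = 2 * radius + 1
    # Separable box filter: average over a sliding window per axis
    f = sliding_window_view(f, k, axis=0).mean(axis=-1)
    f = sliding_window_view(f, k, axis=1).mean(axis=-1)
    return f.astype(img.dtype)

# Padding with mode="constant" instead of "wrap" is effectively what a
# non-360-aware blur does, and it leaves a visible vertical line at the seam.
```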

Transitioning between cuts in 360 video is an evolving art form. There is really no right or wrong way. Longer cuts, shorter cuts, dissolves and dips to black are some of the basic options. Now, Mettle is adding to our creative toolkit by applying their crazy math skills to transitions in 360 video. Mettle started with their first pack of four transitions: Mobius Zoom, Random Blocks, Gradient Wipe and Iris Wipe. I used the Mobius Zoom to transition from the header card to the video and then the Iris Wipe with a soft edge to transition from one shot to the next in the linked video.

Check out this video, which uses the Rotate Sphere, Project 2D, Mobius Zoom and Iris Wipe effects.

New Plug-Ins
I’m pleased to be among the first to show you Mettle’s second set of plug-ins specifically designed for 360/VR video: Chroma Leaks, Light Leaks, Spherical Blurs and everyone’s favorite, Light Rays!

Mettle plug-ins work on both Mac and Windows platforms — on qualified systems — and in realtime. The Mettle plug-ins are also both mono- and stereo-aware.

The Skybox plug-in set for Adobe Premiere Pro is truly the answer I’ve been looking for since I started exploring 360 video. It’s changed the way I work and opened up a world of control that I had been wishing for. Try it for yourself by downloading a demo at www.mettle.com.


Barry Goch is currently a digital intermediate editor for Deluxe in Culver City, working on Autodesk Flame. He started his career as a camera tech for Panavision Hollywood and then transitioned to offline editing on Avid and FCP. His resume includes Passengers, Money Monster, Eye in the Sky and Game of Thrones. His latest endeavor is VR video.

Virtual Reality Roundtable

By Randi Altman

Virtual reality is seemingly everywhere, especially this holiday season. Just one look at your favorite electronics store’s website and you will find VR headsets from the inexpensive, to the affordable, to the “if I win the lottery” ones.

While there are many companies popping up to service all aspects of VR/AR/360 production, for the most part traditional post and production companies are starting to add these services to their menu, learning best practices as they go.

We reached out to a sampling of pros who are working in this area to talk about the problems and evolution of this burgeoning segment of the industry.

Nice Shoes Creative Studio: Creative director Tom Westerlin

What is the biggest issue with VR productions at the moment? Is it lack of standards?
A big misconception is that a VR production is like a standard 2D video/animation commercial production. There are some similarities, but it gets more complicated when we add interaction, different hardware options, realtime data and multiple distribution platforms. It actually takes a lot more time and man-hours to create a 360 video or VR experience than a comparable 2D video production.


Tom Westerlin

More development time needs to be scheduled for research, user experience and testing. We’re adding more stages to the overall production. None of this should discourage anyone from exploring a concept in virtual reality, but there is a lot of consideration and research that should be done in the early stages of a project. The lack of standards presents some creative challenges for brands and agencies considering a VR project. The hardware and software choices made for distribution can have an impact on the size of the audience you want to reach as well as the approach to build it.

The current landscape provides the following options:
• YouTube and Facebook can reach a ton of people with a 360 video, but offer limited VR functionality.
• A WebVR experience works within certain browsers, like Chrome or Firefox, but not others, limiting your audience.
• A custom app or experimental installation using the Oculus or HTC Vive allows for full interactivity, but presents the issue of audience limitations.

There is currently no one best way to create a VR experience. It’s still very much a time of discovery and experimentation.

What should clients ask of their production and post teams when embarking on their VR project?
We shouldn’t just apply what we’ve all learned from 2D filmmaking to the creation of a VR experience, so it is crucial to include the production, post and development teams in the design phase of a project.

Most clients are coming from a point of view where many standard constructs of traditional production (quick camera moves or cuts, extreme close-ups) have negative physiological implications in VR (nausea and disorientation). The impact of seemingly simple creative or design decisions can have huge repercussions on complexity, time, cost and the user experience. It’s important for clients to be open to telling a story in a different manner than they’re used to.

What is the biggest misconception about VR — content, process or anything relating to VR?
The biggest misconception is clients thinking that 360 video and VR are the same. As we’ve started to introduce this technology to our clients, we’ve worked to explain the core differences between these extremely different experiences: VR is interactive and most of the time a full CG environment, while 360 is video and, although immersive, a more passive experience. Each has its own unique challenges and rewards, so as we think about the end user’s experience, we can determine what will work best.

There’s also the misconception that VR will make you sick. If executed poorly, VR can make a user sick, but the right creative ideas executed with the right equipment can result in an experience that’s quite enjoyable and nausea free.

Nice Shoes’ ‘Mio Garden’ 360 experience.

Another misconception is that VR is capable of anything. While many may confuse VR and 360 and think an experience is limited to passively looking around, there are others who have bought into the hype and inflated promises of a new storytelling medium. That’s why it’s so important to understand the limitations of different devices at the early stages of a concept, so that creative, production and post can all work together to deliver an experience that takes advantage of VR storytelling, rather than falling victim to the limitations of a specific device.

The advent of affordable systems that are capable of interactivity, like the Google Daydream, should lead to more popular apps that show off a higher level of interactivity. Even sharing video of people experiencing VR while interacting with their virtual worlds could have a huge impact on the understanding of the difference between passively watching and truly reaching out and touching.

How do we convince people this isn’t stereo 3D?
In one word: Interactivity. By definition VR is interactive and giving the user the ability to manipulate the world and actually affect it is the magic of virtual reality.

Assimilate: CEO Jeff Edson

What is the biggest issue with VR productions at the moment? Is it lack of standards?
The biggest issue in VR is establishing straightforward workflows — from camera to delivery — and then, of course, delivery to what? Compared to a year ago, shooting 360/VR video today has made big steps in ease of use because more people have experience doing it. But it is a LONG way from point and shoot. As integrated 360/VR video cameras come to market more and more, VR storytelling will become much more straightforward and the creators can focus more on the story.

Jeff Edson

And then delivery to what? There are many online platforms for 360/VR video playback today: Facebook, YouTube 360 and others for mobile headset viewing, and then there is delivery to a PC for non-mobile headset viewing. The viewing perspective is different for all of these, which means extra work to ensure continuity on all the platforms. To cover all possible viewers one needs to publish to all. This is not an optimal business model, which is really the crux of this issue.

Can standards help in this? Standards as we have known them in the video world? Yes and no. The standards for 360/VR video are happening by default, such as equirectangular and cubic formats, and delivery formats like H.264, MOV and more. Standards would help, but they are not the limiting factor for growth. The market is not waiting on a defined set of formats because demand for VR is quickly moving forward. People are busy creating.

What should clients ask of their production and post teams when embarking on their VR project?
We hear from our customers that the best results come when the director, DP and post supervisor collaborate on the expectations for look and feel, as well as the possible creative challenges and resolutions. And experience and budget are big contributors. A key issue is: what camera/rig requirements are needed for your targeted platform(s)? For example, how many cameras and what type (4K, 6K, GoPro, etc.), as well as lighting? And what about sound, which plays a key role in the viewer’s VR experience?


This Yael Naim mini-concert was posted in Scratch VR by Alex Regeffe at Neotopy.

What is the biggest misconception about VR — content, process or anything relating to VR?
I see two. One: the perception that VR is a flash in the pan, just a fad. What we see today is just the launch pad. The applications for VR are vast within entertainment alone, and then there is the extensive list of other markets like training and learning in such fields as medical, military, online universities, flight, manufacturing and so forth. Two: that VR post production is a difficult process with too many steps and tools. This definitely doesn’t need to be the case. Our Scratch VR customers are getting high-quality results within a single, simplified VR workflow.

How do we convince people this isn’t stereo 3D?
The main issue with stereo 3D is that it never really scaled beyond a theater experience, whereas with VR it may end up being just the opposite. It’s unclear if VR can be a true theater experience beyond classical technologies like domes and simulators. 360/VR video in the near term is, in general, a short-form media play. It’s clear that sooner rather than later smartphones will be able to shoot 360/VR video as a standard feature, and usage will skyrocket overnight. When that happens, the younger demographic will never shoot anything that is not 360, so the Snapchat/Instagram kinds of platforms will be filled with 360 snippets. VR headsets based on mobile devices already make the pure number of displays significant. The initial tethered devices are not insignificant in numbers, but with the next generation of higher-resolution, untethered devices, maybe most significantly at a much lower price point, we will see the numbers become massive. None of this was ever the case with stereo 3D film/video.

Pixvana: Executive producer Aaron Rhodes

What is the biggest issue with VR productions at the moment? Is it lack of standards?
There are many issues with VR productions, and many of them are just growing pains: not being able to see a live stitch, how to direct without being in the shot, what to do about lighting. These are all part of the learning curve and the evolution of VR as a craft. Resolution and management of big data are the biggest issues I see on the set. Pixvana is all about resolution — it plays a key role in better immersion. Many of the cameras out there only master at 4K, and that just doesn’t cut it. But when they do shoot 8K and above, the data management is extreme. Don’t underestimate the responsibility you are giving to your DIT!


Aaron Rhodes

The biggest issue is that these are early days for VR capture. We’re used to a century of 2D filmmaking and a decade of high-definition capture with an assortment of camera gear. All current VR camera rigs have compromises, and will until technology catches up. It’s too early for standards since we’re still learning and this space is changing rapidly. VR production and post also require different approaches. In some cases we have to unlearn what worked in standard 2D filmmaking.

What should clients ask of their production and post teams when embarking on their VR project?
Give me a schedule, and make it realistic. Stitching takes time, and unless you have a fleet of render nodes at your disposal, rendering your shot locally is going to take time — and everything you need to update or change it will take more time. VR post has lots in common with a non-VR spot, but the magnitude of data and rendering is much greater — make sure you plan for it.

Other questions to ask, because you really can’t ask enough:
• Why is this project being done as VR?
• Does the client have team members who understand the VR medium?
• If not, will they be willing to work with a production team to design and execute with VR in mind?
• Has this project been designed for VR rather than just a 2D project in VR?
• Where will this be distributed? (Headsets? Which ones? YouTube? Facebook? Etc.)
• Will this require an app or will it be distributed to headsets through other channels?
• If it is an app, who will build the app and submit it to the VR stores?
• Do they want to future proof it by finishing greater than 4K?
• Is this to be mono or stereo? (If it’s stereo, it better be very good stereo.)
• What quality level are they aiming for? (Seamless stitches? Good stereo?)
• Is there time and budget to accomplish the quality they want?
• Is this to have spatialized audio?

What is the biggest misconception about VR — content, process or anything relating to VR?
VR is a narrative component, just like any actor or plot line. It’s not something that should just be done for its own sake; shooting VR should be purposeful. It’s the same with stereo. Don’t shoot stereo just because you can — sure, you can experiment and play (we need to do that always), but don’t do it without purpose. The medium of VR is not for every situation.
Other misconceptions, because there are a lot out there:
• It’s as easy as shooting normal 2D.
• You need to have action going on constantly in 360 degrees.
• Everything has to be in stereo.
• There are fixed rules.
• You can simply shoot with a VR camera and it will be interesting, without any idea of specific placement, story or design.

How do we convince people this isn’t stereo 3D?
Education. There are tiers of immersion with VR, and stereo 3D is one of them. I see these tiers starting with the desktop experience and going up in immersion from there, and it’s important to understand the strengths and weaknesses of each:
• YouTube/Facebook on the desktop [low immersion]
• Cardboard, GearVR, Daydream 2D/3D low-resolution
• Headset Rift and Vive 2D/3D 6 degrees of freedom [high immersion]
• Computer generated experiences [high immersion]

Maxon US: President/CEO Paul Babb


Paul Babb

What is the biggest issue with VR productions at the moment? Is it lack of standards?
Project file size. Huge files. Lots of pixels. Telling a story. How do you get the viewer to look where you want them to look? How do you tell and drive a story in a 360 environment?

What should clients ask of their production and post teams when embarking on their VR project?
I think it’s more that production teams are going to have to ask the questions to focus what clients want out of their VR. Too many companies just want to get into VR (buzz!) without knowing what they want to do, what they should do and what the goal of the piece is.

What is the biggest misconception about VR — content, process or anything relating to VR? How do we convince people this isn’t stereo 3D?
Oh boy. Let me tell you, that’s a tough one. People don’t even know that “3D” is really “stereography.”

Experience 360°: CEO Ryan Moore

What is the biggest issue with VR productions at the moment? Is it lack of standards?
One of the biggest issues plaguing the current VR production landscape is the lack of true professionals in the field. While the vast majority of independent filmmakers are doing their best to adapt their current techniques, they have been unsuccessful in perceiving how films and VR experiences genuinely differ. This apparent lack of virtual understanding generally leads to poor UX creation within finalized VR products.

Given the novelty of virtual reality and 360 video, standards are only just being determined in terms of minimum quality and image specifications, and they are constantly changing. To keep a finger on the pulse, VR companies are encouraged to stay plugged into 360 video communities through social media platforms. It is through this essential interaction that VR production technology can continually be reintroduced.

What should clients ask of their production and post teams when embarking on their VR project?
When first embarking on a VR project, it is highly beneficial to walk prospective clients through the entirety of the process, before production actually begins. This allows the client a full understanding of how the workflow is used, while also ensuring client satisfaction with the eventual partnership. It’s vital that production partners convey an ultimate understanding of VR and its use, and explain their tactics in “cutting” VR scenes in post — this can affect the user’s experience in a pronounced way.

‘The Backwoods Tennessee VR Experience’ via Experience 360.

What is the biggest misconception about VR — content, process or anything relating to VR? How do we convince people that this isn’t stereo 3D?
The biggest misconception about VR and 360 video is that they are an offshoot of traditional storytelling and can be used in ways similar to the cinematic and documentary worlds. The mistake VR producers make in equating the two is that it often limits the potential of the user’s experience to that of a voyeur only. Content producers need to think much further outside this box and begin to embrace pairing images with interaction and interactivity. It helps to keep in mind that the intended user will feel as if these VR experiences are very personal to them, because they are usually isolated in an HMD when viewing the final product.

VR is being met with appropriate skepticism and is still widely considered a “fad” within the media landscape. This is often because the critic has not actually had a chance to try a virtual reality experience firsthand and does not understand the wide-reaching potential of immersive media. Three years in, a majority of adults in the United States have never had a chance to try VR themselves, relying on what they learn from TV commercials and online reviews. One of the best ways to convince a doubtful viewer is to give them a chance to try a VR headset themselves.

Radeon Technologies Group at AMD: Head of VR James Knight

What is the biggest issue with VR productions at the moment? Is it lack of standards?
The biggest issue for us is (or was) probably stitching and the excessive amount of time it takes, but we’re tackling that head-on with Project Loom. We have realtime stitching with Loom; you can already download an early version of it on GPUopen.com. But you’re correct, there is a lack of standards in VR/360 production, mainly because there are no really established common practices. That’s to be expected, though, when you’re shooting for a new medium. Hollywood and entertainment professionals are showing up to the space in a big way, so I suspect we’ll all be working out lots of the common practices on sets in 2017.

James Knight

What should clients ask of their production and post teams when embarking on their VR project?
Double-check that they have experience shooting 360, and ask them for a detailed post production pipeline outline. Occasionally, we hear horror stories of people awarding projects to companies that think they can shoot 360 without having personally explored 360 shooting and made mistakes themselves. You want to use an experienced crew that has made the mistakes and is cognizant of what works and what doesn’t. The caveat there, though, is that, again, there are no established rules necessarily, so people should be willing to try new things… sometimes it takes someone not knowing they shouldn’t do something to discover something great, if that makes sense.

What is the biggest misconception about VR — content, process or anything relating to VR? How do we convince people this isn’t stereo 3D?
That’s a fun question. The overarching misconception for me, honestly, is that, just as a cliché politician might make a fleeting judgment that video games are bad for society, people oftentimes assume that VR is for kids, or for 16-year-old boys at home in their boxer shorts. It isn’t. This young industry is really starting to build up a decent library of content, and the payoff is huge when you see well-produced content! It’s transformative, and you can genuinely envision the potential when you first put on a VR headset.

The best way to convince them this isn’t 3D is to get a naysayer to put the headset on… let’s agree we all look rather silly with a VR headset on, and once you get over that, you’ll find out what’s inside. It’s magical. I had the CEO of BAFTA LA, Chantal Rickards, tell me upon seeing VR for the first time, “I remember when my father had arrived home on Christmas Eve with a color TV set in the 1960s and the excitement that brought to me and my siblings. The thrill of seeing virtual reality for the first time was like seeing color TV for the first time, but times 100!”

Missing Pieces: Head of AR/VR/360 Catherine Day

Catherine Day

What is the biggest issue with VR productions at the moment?
The biggest issue with VR production today is the fact that everything keeps changing so quickly. Every day there’s a new camera, a new set of tools, a new proprietary technology and new formats to work with. It’s difficult to understand how all of these things work, and even harder to make them work together seamlessly in a deadline-driven production setting. So much of what is happening on the technology side of VR production is evolving very rapidly. Teams often reinvent the wheel from one project to the next as there are endless ways to tell stories in VR, and the workflows can differ wildly depending on the creative vision.

The lack of funding for creative content is also a huge issue. There’s ample funding to create in other mediums, and we need more great VR content to drive consumer adoption.

Is it lack of standards?
In any new medium and any pioneering phase of an industry, it’s dangerous to create standards too early. You don’t want to stifle people from trying new things. As an example, with our recent NBA VR project, we broke all of the conventional rules that exist around VR — there was a linear narrative, fast-cut edits, it was over 25 minutes long — yet it was still very well received. So it’s not a lack of standards, just a lack of bravery.

What should clients ask of their production and post teams when embarking on their VR project?
Ask to see what kind of work the team has done in the past. They should also delve in and find out exactly who completed the work and how much of it, if any, was outsourced. There is a curtain between the client and the production/post company that often closes once the work is awarded. Clients need to know exactly who is working on their project, as much of the legwork involved in creating a VR project — stitching, compositing, etc. — is outsourced.

It’s also important to work with a very experienced post supervisor — one with a very discerning eye. You want someone who really knows VR and can evaluate every aspect of what a facility will assemble, from stitching and compositing to editorial and color; the level of attention to detail and quality control for VR is paramount. This is key not only for current releases but also as technology evolves. As new standards and formats are applied, you want your content to be as future-proofed as possible, so that if it requires a re-render to accommodate a new, higher-res format in the future, it will still hold up and look fantastic.

What is the biggest misconception about VR — content, process or anything relating to VR?
On the consumer level, the biggest misconception is that people think that 360 video on YouTube or Facebook is VR. Another misconception is that regular filmmakers are the creative talents best suited to create VR content. Many of them are great at it, but traditional filmmakers have the luxury of being in control of everything, and in a VR production setting you have no box to work in and you have to think about a billion moving parts at once. So it either requires a creative that is good with improvisation, or a complete control freak with eyes in the back of their head. It’s been said before, but film and theater are as different as film and VR. Another misconception is that you can take any story and tell it in VR — you actually should only embark on telling stories in VR if they can, in some way, be elevated through the medium.

How do we convince people this isn’t stereo 3D?
With stereo 3D, there was no simple, affordable path for consumer adoption. We’re still getting there with VR, but today there are a number of options for consumers and soon enough there will be a demand for room-scale VR and more advanced immersive technologies in the home.

VR Audio: Virtual and spatial soundscapes

By Beth Marchant

The first things most people think of when starting out in VR are which 360-degree camera rig they need and which software is best for stitching. But virtual reality is not just a Gordian knot for production and post. Audio is as important — and as complex — a component as the rest. In fact, audio designers, engineers and composers have been fascinated and challenged by VR’s potential for some time and, working alongside future-looking production facilities, are equally engaged in forging its future path. We talked to several industry pros on the front lines.

Howard Bowler

Music industry veteran and Hobo Audio founder Howard Bowler traces his interest in VR back to the groundbreaking film Avatar. “When that movie came out, I saw it three times in the same week,” he says. “I was floored by the technology. It was the first time I felt like you weren’t just watching a film, but actually in the film.” As close to virtual reality as 3D films had gotten to that point, it was the blockbuster’s evolved process of motion capture and virtual cinematography that ultimately delivered its breathtaking result.

“Sonically it was extraordinary, but visually it was stunning as well,” he says. “As a result, I pressed everyone here at the studio to start buying 3D televisions, and you can see where that has gotten us — nowhere.” But a stepping stone in technology is more often a sturdy bridge, and Bowler was not discouraged. “I love my 3D TVs, and I truly believe my interest in that led me and the studio directly into VR-related projects.”

When discussing the kind of immersive technology Hobo Audio is involved with today, Bowler — like others interviewed for this series — clearly defines VR’s parallel deliverables. “First, there’s 360 video, which is passive viewing, but still puts you in the center of the action. You just don’t interact with it. The second type, more truly immersive VR, lets you interact with the virtual environment as in a video game. The third area is augmented reality,” like the Pokemon Go phenomenon of projecting virtual objects and views onto your actual, natural environment. “It’s really important to know what you’re talking about when discussing these types of VR with clients, because there are big differences.”

With each segment comes related headsets, lenses and players. “Microsoft’s HoloLens, for example, operates solely in AR space,” says Hobo producer Jon Mackey. “It’s a headset, but will project anything that is digitally generated, either on the wall or to the space in front of you. True VR separates you from all that, and really good VR separates all your senses: your sight, your hearing and even touch and feeling, like some of those 4D rides at Disney World.” Which technology will triumph? “Some think VR will take it, and others think AR will have wider mass adoption,” says Mackey. “But we think it’s too early to decide between either one.”


‘Boxed Out’ is a Hobo indie project about how gentrification is affecting artists’ studios in the Gowanus section of Brooklyn.

Those kinds of end-game obstacles are beside the point, says Bowler. “The main reason why we’re interested in VR right now is that the experiences, beyond the limitations of whatever headset you watch it on, are still mind-blowing. It gives you enough of a glimpse of the future that it’s incredible. There are all kinds of obstacles it presents just because it’s new technology, but from our point of view, we’ve honed it to make it pretty seamless. We’re digging past a lot of these problem areas, so at least from the user standpoint, it seems very easy. That’s our goal. Down the road, people from medical, education and training are going to need to understand VR for very productive reasons. And we’re positioning ourselves to be there on behalf of our clients.”

Hobo’s all-in commitment to VR has brought changes to its services as well. “Because VR is an emerging technology, we’re investing in it globally,” says Bowler. “Our company is expanding into complete production, from concepting — if the client needs it — to shooting, editing and doing all of the audio post. We have the longest experience in audio post, but we find that this is just such an exciting area that we wanted to embrace it completely. We believe in it and we believe this is where the future is going to be. Everybody here is completely on board to move this forward and sees its potential.”

To ramp up on the technology, Hobo teamed up with several local students who were studying at specialty schools. “As we expanded out, we got asked to work with a few production companies, including East Coast Digital and End of Era Productions, that are doing the video side of it. We’re bundling our services with them to provide a comprehensive set of services.” Hobo is also collaborating with Hidden Content, a VR production and post production company, to provide 360 audio for premium virtual reality content. Hidden Content’s clients include Samsung, 451 Media, Giant Step, PMK-BNC, Nokia and Popsugar.

There is still plenty of magic sauce in VR audio that continues to make it a very tricky part of the immersive experience, but Bowler and his team are engineering their way through it. “We’ve been developing a mixing technique that allows you to tie the audio to the actual object,” he says. “What that does is disrupt the normal stereo mix. Say you have a public speaker in the center of the room; normally that voice would turn with you in your headphones if you turn away from him. What we’re able to do is tie the audio of the speaker to the actual object, so when you turn your head, it will pan to the right earphone. That also allows you to use audio as a signaling device in the storyline. If you want the viewer to look in a certain direction in the environment, you can use an audio cue to do that.”
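Hobo’s actual mix chain isn’t public, but the core of what Bowler describes can be sketched in a few lines: the source keeps a fixed direction in the world, and the stereo pan is recomputed from the difference between that direction and the listener’s current head yaw. The names and the constant-power pan law below are my assumptions, and a plain pan cannot resolve front from back the way a full binaural render can:

```python
import numpy as np

def world_locked_pan(mono, source_azimuth_deg, head_yaw_deg):
    """Constant-power stereo pan of a mono signal, driven by the source's
    azimuth relative to the listener's current head orientation
    (positive relative azimuth = source to the listener's right)."""
    rel = np.radians(source_azimuth_deg - head_yaw_deg)
    pan = np.sin(rel)                      # -1 = hard left, +1 = hard right
    theta = (pan + 1.0) * np.pi / 4.0      # map to 0..pi/2
    left = mono * np.cos(theta)
    right = mono * np.sin(theta)
    return np.stack([left, right], axis=-1)

# A speaker fixed at azimuth 0: facing them, the voice is centered;
# turn your head 90 degrees to the left and the voice pans to the right ear.
# facing = world_locked_pan(signal, 0.0, 0.0)
# turned = world_locked_pan(signal, 0.0, -90.0)
```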

Hobo engineer Diego Jimenez drove a lot of that innovation, says Mackey. “He’s a real VR aficionado and just explored a lot of the software and mixing techniques required to do audio in VR. We started out just doing a ton of tests and they all proved successful.” Jimenez was always driven by new inspiration, notes Bowler. “He’s certainly been leading our sound design efforts on a lot of fronts, from creating instruments to creating all sorts of unusual and original sounds. VR was just the natural next step for him, and for us. For example, one of the spots that we did recently was to create a music video and we had to create an otherworldly environment. And because we could use our VR mixing technology, we could also push the viewer right into the experience. It was otherworldly, but you were in that world. It’s an amazing feeling.”


‘Boxed Out’

What advice do Bowler and Mackey have for those interested in VR production and post? “360 video is to me the entry point to all other versions of immersive content,” says Bowler. “It’s the most basic, and it’s passive, like what we’re used to — television and film. But it’s also a completely undefined territory when it comes to production technique.” So what’s the way in? “You can draw on some of the older ways of doing productions,” he says, “but how do you storyboard in 360? Where does the director sit? How do you hide the crew? How do you light this stuff? All of these things have to be considered when creating 360 video. That also includes everyone on camera: all the viewer has to do is look around the virtual space to see what’s going on. You don’t want anything that takes the viewer out of that experience.”

Bowler thinks 360 video is also the perfect entry point to VR for marketers and advertisers creating branded VR content, and Hobo’s clients agree. “When we’ve suggested 360 video on certain projects and clients want to try it out, what that does is it allows the technology to breathe a little while it’s underwritten at the same time. It’s a good way to get the technology off the ground and also to let clients get their feet wet in it.”

Any studio or client contemplating VR, adds Mackey, should first find what works for them and develop an efficient workflow. “This is not really a solidified industry yet,” he says. “Nothing is standard, and everyone’s waiting to see who comes out on top and who falls by the wayside. What’s the file standard going to be? Or the export standard? Will it be custom-made apps on (Google) YouTube or Facebook? We’ll see Facebook and Google battle it out in the near term. Facebook has recently acquired an audio company to help them produce audio in 360 for their video app, and Google has the Daydream platform,” though neither platform’s codec is compatible with the other, he points out. “If you mix your audio to Facebook audio specs, you can actually have your audio come out in 360. For us, it’s been trial and error, where we’ve experimented with these different mixing techniques to see what fits and what works.”

Still, Bowler concedes, there is no true business yet in VR. “There are things happening and people getting things out there, but it’s still so early in the game. Sure, our clients are intrigued by it, but they are still a little mystified by what the return will be. I think this is just part of what happens when you deal with new technology. I still think it’s a very exciting area to be working in, and it wouldn’t surprise me if it doesn’t touch across many, many different subjects, from history to the arts to original content. Think about applications for geriatrics, with an aging population that gets less mobile but still wants to experience the Caribbean or our National Parks. The possibilities are endless.”

At some point, he admits, it may even become difficult to distinguish one’s real memory from one’s virtual memory. But is that really such a bad thing? “I’m already having this problem. I was watching an immersive video of Cuban music that was pretty beautifully done, and by the end of the five-minute spot I had the visceral experience that I was actually there. It’s just a very powerful way of experiencing content. Let me put it another way: 3D TVs were at the rabbit hole, and immersive video will take you down the rabbit hole into the other world.”

Source Sound
LA-based Source Sound has provided supervision and sound design on a number of Jaunt-produced cinematic VR experiences, including a virtual fashion show, a horror short and a Godzilla short film written and directed by Oscar-winning VFX artist Ian Hunter, as well as final Atmos audio mastering for the early immersive release Sir Paul McCartney Live. The studio is ready for the spatial mixes to come. That wasn’t initially the case.


Tim Gedemer

“When Jaunt first got into this space three years ago, they went to Dolby to try to figure out the audio component,” says Source Sound owner/supervising sound designer/editor Tim Gedemer. “I got a call from Dolby, who told me about what Jaunt was doing, and the first thing I said was, ‘I have no idea what you are talking about!’ Whatever it is, I thought, there’s really no budget and I was dragging my feet. But I asked them to show me exactly what they were doing. I was getting curious at that point.”

After meeting the team at Jaunt, who strapped some VR goggles on him and showed him some footage, Gedemer was hooked. “It couldn’t have been more than 30 seconds in and I was just blown away. I took off the headset and said, ‘What the hell is this?! We have to do this right now.’ They could have reached out to a lot of people, but I was thrilled that we were able to help them by seizing the moment.”

Gedemer says Source Sound’s business has expanded in multiple directions in the past few years, and VR is still a significant part of the studio’s revenue. “People are often surprised when I tell them VR counts for about 15-20 percent of our business today,” he says. “It could be a lot more, but we’d have to allocate the studios differently first.”

With a background in mixing and designing sound for film, gaming and theatrical trailers, Gedemer and his studio have a very focused definition of immersive experiences, and it all includes spatial audio. “Stereo 360 video with mono audio is not VR. For us, there’s cinematic, live-action VR, then straight-up game development that can easily migrate into a virtual reality world and, finally, VR for live broadcast.” Mass adoption of VR won’t happen, he believes, until enterprise and job-training applications jump on the bandwagon with entertainment. “I think virtual reality may also be a stopover before we get to a world where augmented reality is commonplace. It makes more sense to me that we’ll just overlay all this content onto our regular days, instead of escaping from one isolated experience to the next.”

On set for the European launch of the Nokia Ozo VR camera in London, which featured live musical performances captured in 360 VR.

For now, Source Sound’s VR work is completed in dedicated studios configured with gear for that purpose. “It doesn’t mean that we can’t migrate more into other studios, and we’re certainly evolving our systems to be dual-purpose,” he says. “About a year ago we were finally able to get a grip on the kinds of hardware and software we needed to really start coagulating this workflow. It was also clear from the beginning of our foray into VR that we needed to partner with manufacturers, like Dolby and Nokia. Both of those companies’ R&D divisions are on the front lines of VR in the cinematic and live broadcast space, with Dolby’s Atmos for VR and Nokia’s Ozo camera.”

What missing tools and technology have to be developed to achieve VR audio nirvana? “We delivered a wish list to Dolby, and I think we got about a quarter of the list,” he says. “But those guys have been awesome in helping us out. Still, it seems like just about every VR project that we do, we have to invent something to get us to the end. You definitely have to have an adventurous spirit if you want to play in this space.”

The work has already influenced his approach to more traditional audio projects, he says, and he now notices the lack of spatial sound everywhere. “Everything out there is a boring rectangle of sound. It’s on my phone, on my TV, in the movie theater. I didn’t notice it as much before, but it really pops out at me now. The actual creative work of designing and mixing immersive sound has realigned the way I perceive it.”

Main Image: One of Hobo’s audio rooms, where the VR magic happens.


Beth Marchant has been covering the production and post industry for 21 years. She was the founding editor-in-chief of Studio/monthly magazine and the co-editor of StudioDaily.com. She continues to write about the industry.


Missing Pieces hires head of VR/AR/360, adds VR director

Production company Missing Pieces has been investing in VR recently by way of additional talent. Catherine Day has joined the studio as head of VR/AR/360. She was most recently at Jaunt VR where she was executive producer/head of unscripted. VR director Sam Smith has also joined the company as part of its VR directing team.

This bi-coastal studio has a nice body of VR work under its belt. They are responsible for Dos Equis’ VR Masquerade and for bringing a president into VR with Bill Clinton’s Inside Impact series. They also created Follow My Lead: The Story of the NBA 2016 Finals, a VR sports documentary for the NBA and Oculus.

In her new role, Day (pictured) will drive VR/AR/360 efforts from the studio’s Los Angeles office and oversee several original VR series that will be announced jointly with WME and partners in the coming months. In her previous role at Jaunt VR, Day led projects for ABC News, RYOT/Huffington Post, Camp 4 Collective, XRez, Tastemade, Outside TV, Civic Nation and Conservation International.

Smith is a creative director and VR director who previously worked with MediaMonks on projects for Expedia, Delta, Converse and YT. He also has an extensive background in commercial visual effects and a deep understanding of post and VFX, which is helpful when developing VR/360 projects. He will also act as technical advisor.

Experiencing autism in VR via Happy Finish

While people with autism might “appear” to be like the rest of us, the way they experience the world is decidedly different. Imagine sensory overload times 10. In an effort to help the public understand autism, the UK’s National Autistic Society and agency Don’t Panic have launched a campaign called “Too Much Information” (#autismTMI) that is set to challenge myths, misconceptions and stereotypes relating to this neurobiological disorder.

In order to help tell that story, the NAS called on London’s Happy Finish to help create a 360-degree VR film that puts viewers into the shoes of a child with autism during a visit to a store. A 2D film had previously been developed based on the experience of a 10-year-old autistic boy named Alexander. Happy Finish provided visual effects for that version, which, since March of last year, has garnered over 54 million views and over 850K shares. The new 360-degree VR experience takes the viewer into Alexander’s world in a more immersive way.

After interviewing several autistic adults as part of its research, Happy Finish worked on this idea, which aims to trigger viewers’ empathy and understanding. Working with Don’t Panic and The National Autistic Society, they share Alexander’s experience in an immersive and moving way.

The piece was shot by DP Michael Hornbogen using a six-camera GoPro array in a 3D-printed housing. For stitching, Happy Finish called on Kolor’s Autopano, The Foundry’s Nuke and Adobe After Effects. Editing was done in Adobe Premiere, and color grading via Blackmagic’s Resolve.

“It was a long process of compositing using various tools,” explains Jamie Mossahebi, director of the VR shooting at Happy Finish. “We created 18 versions and amended and tweaked based on initial feedback from autistic adults.”

He says that most of the studio’s VR experiences aim to create something comfortable and pleasant, but this one needed to be uncomfortable while remaining engaging. “The main challenge was to be as realistic as possible. For that, we focused a lot on the sound design, as well as testing a wide variety of visual effects and selecting the key ones that contributed to making it as immersive and as close to a sensory overload as possible,” explains Mossahebi, who directed the VR film.

“This is Don’t Panic’s first experience of creating a virtual reality campaign,” says Richard Beer, creative director of Don’t Panic. “The process of creating a virtual reality film has a whole different set of rules: it’s about creating a place for people to visit and a person for them to become, rather than simply telling a story. This interactivity of virtual reality gives it a unique sense of ‘presence’ — it has the power to take us somewhere else in time and space, to help us feel, just for a while, what it’s like to be someone else – which is why it was the perfect tool to communicate exactly what a sensory overload feels like for someone with autism for the NAS.”

Sponsored by Tangle Teaser and Intu, the film will tour shopping centers around the UK and will also be available through the Autism TMI Virtual Reality Experience app.

VR Audio: Crytek goes to new heights for VR game ‘The Climb’

By Jennifer Walden

Dealing with locomotion, such as walking and especially running, is a challenge for VR content developers — but what hasn’t been a challenge in creating VR content? Climbing, on the other hand, has proved to be a simple yet interesting form of locomotion that independent game developer Crytek found to be sustainable for the duration of a full-length game.

Crytek, known for the Crysis game series, recently released their first VR game title, The Climb, a rock climbing adventure exclusively for the Oculus Rift. Players climb, swing and jump their way up increasingly difficult rock faces modeled after popular climbing destinations in places like Indonesia, the Grand Canyon and the Alps.

Crytek’s director of audio, Simon Pressey, says their game engine, CryEngine, is capable of UltraHD resolutions higher than 8K. They could have taken GPS data from anywhere in the world and turned it into a level in The Climb. “But to make the climbing interesting and compelling, we found that real geography wasn’t the way to go. Still, we liked the idea of representing different areas of the world,” he says. While the locations Crytek designed aren’t perfect geographical imitations, geologically they’re pretty accurate. “The details of how the rocks look up close — the color, the graininess and texture — they are as close to photorealistic as we can get in the Oculus Rift. We are running at a resolution that the Rift can handle. So how detailed it looks depends on the Rift’s capabilities.”

Keep in mind that this is first-generation VR technology. “It’s going to get better,” promises Pressey. “By the third-generation of this, I’m sure we’ll have visuals you can’t tell apart from reality.”


Simon Pressey

The Sound Experience
Since the visuals aren’t perfect imitations of reality, the audio is vital for maintaining immersion and supporting the game play. Details in the audio actually help the brain process the visuals faster. Even still, flaws and all, first-gen VR headsets give the player a stronger connection to his/her actions in-game than was previously possible with traditional 2D (flat screen) games. “You can look away from the screen in a traditional game, but you can’t in VR. When you turn around in The Climb, you can see a thousand feet below you. You can see that it’s a long way down, and it feels like a long way down.”

One key feature of the Oculus Rift is the integrated audio — it comes equipped with headphones. For Pressey, that meant knowing the exact sound playback system of the end user, a real advantage from a design and mix standpoint. “We were designing for a known playback variable. We knew that it would be a binaural experience. Early on we started working with the Oculus-provided 3D encoder plug-in for Audiokinetic’s Wwise, which Oculus includes with their audio SDK. That plug-in provides HRTF binaural encoding, adding the z-axis that you don’t normally experience even with surround sound,” says Pressey.

He explains that the sounds start as mono point sources, positioned in a 3D space using middleware like Wwise. Then, using the Oculus audio SDK via the middleware, those audio signals are downmixed to binaural stereo, which gets HRTF (head-related transfer function) processing, adding a spatialized effect to the sounds. So even though the player is listening through two speakers, he/she perceives sounds as coming from the left, the right, in front, behind, above and below.
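The Oculus spatializer relies on measured HRTF filter banks, but the two strongest cues those filters encode can be faked in a few lines, which makes the concept concrete. The toy below is my own simplification, not the SDK’s processing: it applies an interaural time difference (the classic Woodworth approximation) and a crude interaural level difference to place a mono source to the left or right.

```python
import numpy as np

def toy_binaural(mono, azimuth_deg, sr=48000):
    """Place a mono signal (1-D NumPy array) left/right using the two main
    HRTF cues: an interaural time difference (Woodworth approximation,
    reasonable for |azimuth| <= 90 degrees) and a simple interaural level
    difference. Positive azimuth = source on the listener's right."""
    a, c = 0.0875, 343.0                  # head radius (m), speed of sound (m/s)
    az = np.radians(azimuth_deg)
    itd = (a / c) * (np.sin(abs(az)) + abs(az))     # far-ear lag in seconds
    lag = int(round(itd * sr))                      # ~31 samples at 90 degrees
    far_gain = 10 ** (-6.0 * abs(np.sin(az)) / 20)  # far ear up to ~6 dB quieter
    near = mono.astype(np.float64)
    far = np.concatenate([np.zeros(lag), near])[: len(near)] * far_gain
    left, right = (far, near) if azimuth_deg > 0 else (near, far)
    return np.stack([left, right], axis=-1)
```

What real HRTF processing adds on top of these cues is the direction-dependent spectral filtering of the head and outer ear, which is what lets a listener resolve elevation and front versus back, the z-axis Pressey mentions.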

Since most VR is experienced with headphones, Pressey feels there is an opportunity to improve the binaural presentation of the audio [i.e., better headphones or in-ear monitors], and to improve 3D positional audio with personalized HRTFs and Ambisonics. “While the visuals are still very apparently a representation of reality, the audio is perceived as realistic, even if it is a totally manufactured reality. The headphone environment is very intimate and allows greater use of dynamic range, so subtle mixes and more realistic recordings and rendering are sort of mandatory.”

Realistic Sound
Pressey leads the Crytek audio team, and together they collaborated on The Climb’s audio design, which includes many different close-up hand movements and grabs that signify the quality of the player’s grip. There are sweaty, wet-sounding hand grabs. There are drier, firmer hand grabs for when a player’s hands are freshly chalked. There are rock crumbles for when holds crumble away.

At times a player needs to wipe dirt away from a hold, or brush aside vegetation. These are very subtle details that in most games wouldn’t be sounded, says Pressey. “But in VR, we are going into very subtle detail. Like, when you rub your hands over plants searching for grips, we are following your movement speed to control how much sound it makes as you ruffle the leaves.” It’s that level of detail that makes the immersion work. Even though in real life a sound so small would probably be masked by other environmental sounds, in the intimacy of VR, those sounds engage the player in the action of climbing.


Breathing and heartbeat elements also pull a player into the game experience. After moving through several holds, a player’s hands get sweaty, and the breathing sound becomes more labored. If the hold crumbles or if a player is losing his/her grip, the audio design employs a heartbeat sound. “It is not like your usual game situation where you hear a heartbeat if you have low health. In The Climb you actually think, ‘I’ve got to jump!’ Your heart is racing, and after you make the jump and chalk your hands, then your heartbeat and your breathing slow down, and you physically relax,” he says.

Crytek’s aim was to make The Climb believable, with realistic qualities, dynamic environments and a focused sound that mimics the intensity of focus felt when concentrating on important life-or-death decisions. They wanted the environment sounds to change, such as the wind shifting as a player moves around a corner. But they didn’t want to intentionally draw the player’s attention away from climbing.

For example, there’s a waterfall near one of the climbs, and the sound for it plays subtly in the background. If the player turns to look at it, then the waterfall sound fades up. They are able to focus the player’s attention by attenuating non-immediate sounds. “You don’t want to hear that waterfall as the focus of your attention and so we steer the sound. But, if that is what you’re focusing on, then we want to be more obvious,” explains Pressey.
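As a back-of-the-envelope illustration of that steering (in production this logic would live in Wwise game parameters; the falloff curve and names here are my assumptions), a non-immediate sound’s gain can simply follow how well the player’s view direction aligns with the direction to the source:

```python
import numpy as np

def focus_gain(view_dir, to_source, min_gain=0.2):
    """Scale a sound's gain by how closely the player is looking at it:
    full volume when centered in view, min_gain when directly behind."""
    v = np.asarray(view_dir, dtype=float)
    s = np.asarray(to_source, dtype=float)
    alignment = np.dot(v / np.linalg.norm(v), s / np.linalg.norm(s))  # -1..1
    t = (alignment + 1.0) / 2.0
    return min_gain + (1.0 - min_gain) * t

print(focus_gain([1, 0, 0], [1, 0, 0]))   # looking at the waterfall -> 1.0
print(focus_gain([1, 0, 0], [-1, 0, 0]))  # facing away -> 0.2
```

In practice the gain would also be smoothed over time, so the fade-up Pressey describes feels gradual rather than switch-like.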


The Crytek audio team

The Crytek audio team records, designs and edits sounds in Steinberg’s Nuendo 7, which works directly with Audiokinetic’s Wwise middleware, which in turn connects to CryEngine. The audio team, which has been working this way for the past two years, finds the workflow highly iterative, with audio flowing easily through that pipeline from Nuendo 7 to Wwise to CryEngine and back again. They are often able to verify the audio in-game without needing to request code support. If a sound isn’t working in-game, it can be tweaked in Wwise or completely reworked in Nuendo. All aspects of the pipeline are version controlled and built for sharing work across the audio team.

“It’s a really tight workflow and we can do things quickly. In the game world, speed is everything,” says Pressey. “The faster you get your game to market the sooner you recoup on your very heavy R&D.”

Two factors that propelled this workflow are the collaboration between Crytek, Audiokinetic and Steinberg in designing software tailored to the specific needs of game audio pros, and Crytek’s overhaul of CryEngine where they removed the integrated FMOD-based audio engine in favor of using an external audio engine. Running the audio engine separate from the game engine not only improves the game engine efficiency, it also allows updates to the audio engine as needed without fear of breaking the game engine.

Within hours of Wwise releasing an update, for example, Pressey says their system can be up to date. “Previously, it could’ve been a long and complicated process to incorporate the latest updates. There was always the risk of crashing the whole system by making a change because the code was so mixed up with the rest of the system. By separating them we can always be running the latest versions of things without risking anything.”

Having that adaptability is essential for VR content creation since the industry is changing all the time. For example, Sony’s PS4 VR headset release is slated for this fall, so they’re releasing a new SDK about every week or so, according to Pressey.

CryEngine is freely available for anyone to use, and VR games developed with it will work on any VR platform. CryEngine is also audio middleware agnostic, meaning it can talk to any audio middleware, be it Wwise, FMOD or a proprietary solution. Users can choose the workflow that best suits the needs of their game.
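That middleware-agnostic design implies a translation layer between the engine and whatever audio system sits behind it: game code talks to an abstract interface, and each middleware gets its own adapter. CryEngine’s actual layer is written in C++; this Python sketch, with invented class names, just illustrates the pattern:

```python
from abc import ABC, abstractmethod

class AudioBackend(ABC):
    """Engine-facing audio interface; each middleware gets its own adapter."""

    @abstractmethod
    def post_event(self, event_name: str, object_id: int) -> None: ...

    @abstractmethod
    def set_parameter(self, name: str, value: float) -> None: ...

class WwiseBackend(AudioBackend):
    def post_event(self, event_name, object_id):
        print(f"[Wwise] event {event_name} on object {object_id}")

    def set_parameter(self, name, value):
        print(f"[Wwise] parameter {name} = {value}")

class FmodBackend(AudioBackend):
    def post_event(self, event_name, object_id):
        print(f"[FMOD] event {event_name} on object {object_id}")

    def set_parameter(self, name, value):
        print(f"[FMOD] parameter {name} = {value}")

# Game code only ever talks to AudioBackend, so the middleware (or a new
# middleware version) can be swapped without touching game logic.
backend: AudioBackend = WwiseBackend()
backend.post_event("Play_Waterfall", object_id=42)
backend.set_parameter("heartbeat_intensity", 0.5)
```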

Pressey finds creating for VR to be an intensely experimental process, for every discipline involved in game development. While most members on the Crytek team have solved problems relating to a new IP or a new console, Pressey says, “We were not prepared for this amount of new. We were all used to knowing what we were doing, and now we are experimenting with no net to fall back on. The experience is surprisingly different; the interaction using your eye and head tracking is much more physical. It is more intimate. There is an undeniable and inescapable immersion, in that you can’t look away as the game world is all around you. You can’t switch off your ears.” The first time Pressey put on a VR headset, he knew there was no going back. “Before that, I had no real idea. It is the difference between reading about a country and visiting it.”

Upcoming Release
Crytek will be presenting a new VR release titled Robinson: The Journey at E3 this month, and Pressey gives us a few hints as to what the game experience might be like. He says that VR offers new ways of storytelling, such as nonlinear storytelling. "Crytek and the CryEngine team have developed a radically new Dynamic Response System to allow the game to be intelligent in what dialog gets presented to the player at what time. Aspects of a story can be sewn together and presented based on the player’s approach to the game. This technology takes the idea of RPG-like branching storylines to a new level, and allows narrative progression in what I hope will be new and exciting territory for VR."

The Climb uses this Dynamic Response System in a limited capacity during the tutorial where the instructor is responsive to the player’s actions. “Previously, to be that responsive, a narrative designer or level designer would have to write pages of logic to do what our new system does very simply,” concludes Pressey.
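Pressey doesn’t detail the system’s internals, but rule-based dialog systems of this general type match the current game state against per-line conditions and play the most specific match. A hypothetical sketch of that pattern, with invented rules and state keys:

```python
# Hypothetical sketch of rule-based dialog selection; the rules and state
# keys are invented examples, not Crytek's actual Dynamic Response System.

RULES = [
    {"when": {"fell_recently": True},             "line": "Shake it off. Chalk up and try again."},
    {"when": {"on_tutorial": True, "idle": True}, "line": "Look for the next hold above you."},
    {"when": {"on_tutorial": True},               "line": "Nice reach. Keep your momentum."},
]

def pick_line(state):
    """Return the line from the most specific rule whose conditions all match."""
    best, best_score = None, -1
    for rule in RULES:
        if all(state.get(key) == value for key, value in rule["when"].items()):
            score = len(rule["when"])  # more conditions = more specific
            if score > best_score:
                best, best_score = rule["line"], score
    return best

print(pick_line({"on_tutorial": True, "idle": True, "fell_recently": False}))
# Look for the next hold above you.
```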

Jennifer Walden is an audio engineer and writer based in New Jersey.

Talking VR content with Phillip Moses of studio Rascali

Phillip Moses, head of VR content developer Rascali, has been working in visual effects for over 25 years. His resume boasts big-name films, including Alice in Wonderland, Speed Racer and Spider-Man 3. Seven years ago he launched a small boutique visual effects studio, called The Resistance VFX, with VFX supervisor Jeff Goldman.

Two years ago, after getting a demo of an Oculus pre-release Dev Kit 2, Moses realized that “we were poised on the edge of not just a technological breakthrough, but what will ultimately be a new platform for consuming content. To me, this was a shift almost as big as the smartphone, and an exciting opportunity for content creators to begin creating in a whole new ecosystem.”

Phillip Moses

Shortly after that, his friends James Chung and Taehoon Oh launched Reload Studios, with the vision of creating the first independently developed first-person shooter game designed from the ground up for VR. "As one of the first companies formed around the premise of VR, they attracted quite a bit of interest in the non-gaming sector as well," he explains. "Last year, they asked me to come aboard and direct their non-gaming division, Rascali. I saw this as a huge opportunity to do what I love best: explore, create and innovate."

Rascali has been busy. They recently debuted trailers for their first episodic VR projects, Raven and The Storybox Project, on YouTube, Facebook/Oculus Video, Jaunt, Littlstar, Vrideo and Samsung MilkVR. Let’s find out more…

You recently directed two VR trailers. How is directing for VR different than directing for traditional platforms?
Directing for VR is a tricky beast and demands a lot of technical knowledge of the whole process that would not normally be expected of directors. To be fair, today’s directors are a very savvy bunch, and most have a solid working knowledge of how visual effects are used in the process. However, the way I have chosen to shoot the series requires a pretty solid understanding of not just what can be done, but how to actually do it. Being able to previsualize the process and, ultimately, the end result in your head first is critical to communicating that vision down the line.

Also, from a script and performance perspective, I think it’s important to start with one essential question: "Why VR?" Once you believe you have a compelling answer, then you need to start thinking about how to use VR in your story. Will you require interaction and participation from the viewer? Will you involve the viewer in any way? Or will you simply allow VR to serve as an additional element of presence and immersion?

While you gain many things in VR, you also have to go into the process with a full knowledge of what you ultimately lose. The power of lenses, for example, to capture nuance and to frame an image to evoke an emotional response, is all but lost. You find yourself going back to exploring what works best in a real-world framing — almost like you are directing a play in an intimate theater.

What is the biggest challenge in the post workflow for VR?
Rendering! Everything we are producing for Raven is at 4K for the left eye, 4K for the right eye and 60fps. Rendering alone guarantees the project will take longer than you hoped. It also guarantees that you will need more data storage than you ever thought necessary.
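To put rough numbers on that storage claim: assuming UHD (3840x2160) per eye, 8-bit RGB and uncompressed frames (all illustrative assumptions), the math adds up quickly:

```python
# Back-of-the-envelope storage estimate; resolution, bit depth and the
# uncompressed assumption are illustrative, not Rascali's actual pipeline.

width, height = 3840, 2160
bytes_per_pixel = 3            # 8-bit RGB
eyes, fps = 2, 60

frame_bytes = width * height * bytes_per_pixel   # ~24.9 MB per eye per frame
rate_gb_s = frame_bytes * eyes * fps / 1e9       # ~2.99 GB/s
per_minute_tb = rate_gb_s * 60 / 1e3             # ~0.18 TB per minute

print(f"{rate_gb_s:.2f} GB/s, {per_minute_tb:.2f} TB per minute of footage")
```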

But other than rendering, I find that the editorial process is also more challenging. With VR, those shots that you thought you were holding onto way too long are actually still too short, and conforming everything for review in a headset between revisions is an elaborate process. In many ways, it’s similar to the old process of making your edit decisions, then walking the print into the screening room. You forget how tedious the process can be.
By the way, I’m looking forward to integrating some realtime 360 review into the editorial process. Make it happen, Adobe/Avid!

These trailers are meant to generate interest from production partners to greenlight these as full episodic series. What is the intended length of each episode, and what’s the projected length of time from concept to completion for each episode of the all-CG Storybox and the live-action Raven?
Each one of these projects is designed for completely different audiences, so the answer is a bit different for each one. For Storybox, we are looking to keep each episode under five minutes, with the intention that it is a fairly easy-to-consume piece of content that is accessible to a broad spectrum of ages. We really hope to make the experiences fun, playful and surprising for the viewer, and to create a context for telling these stories that fuels the imagination of kids.

For Storybox, I believe that we can start delivering finished episodes before the end of the third quarter, with a full season representing 12 to 15 episodes. Raven, on the other hand, is a much more complex undertaking. While the VR market is being developed, we are betting that core VR consumers will want stories and experiences closer to 12 to 15 minutes in duration. We feel this is enough time to tell more complex stories, but still make each episode feel like a fantastic experience they could not get anywhere else. If green-lit tomorrow, I believe we would be looking at a four-month production schedule for the pilot episode.

Rascali is a division of Reload Studios, which is developing VR games. Is there a technology transfer of workflows and pipelines and shared best practices across production for entertainment content and games within the company?
Absolutely! While VR is a new technology, there is such a rich heritage of knowledge present at Reload Studios. For example, one question that VR directors are asking themselves is: "How can I direct my audience’s attention to action in ways that are organic and natural?" While this is a new question for film directors, who typically rely on the camera to do this work for them, it is a question that the gaming community has been answering for years. Having some of the top designers in the game industry at our disposal is an invaluable asset.

That being said, Reload is much different from most independent game companies. One of their first hires was senior Disney animator Nik Ranieri. Our producing team is composed of top animation producers from Marvel and DC. We have a deep bench of people who give the whole company a very comprehensive knowledge of how content of all types is created.

What was the equipment set-up for the Raven VR shoot? Which camera was used? What tools were used in the post pipeline?
Much of the creative IP for Raven is very much in development, including designs, characters, etc. For this reason, we elected to construct a teaser that highlighted immersive VR vistas that you could expect in the world we are creating. This required us to lean very heavily on the visual effects/CG production process: the VFX pipeline included Autodesk 3ds Max, rendering in V-Ray, with some assistance from Nuke and even Softimage XSI. The entire project was edited in Adobe Premiere.

Our one live-action element was shot with a single Red camera and then projected onto geometry for accurate stereo integration.

Where do you think the prevailing future of VR content is? Narrative, training, therapy, gaming, etc.?
I think your question represents the future of VR. Games, for sure, are going to be leading the charge, as this demographic is the only one on a large scale that will be purchasing the devices required to build a viable market. But much more than games, I’m excited to see growth in all of the areas you listed above, including, most significantly, education. Education could be a huge winner in the growing VR/AR ecosystem.

The reason I elected to join Rascali is to help provide solutions and pave the way in markets that mostly don’t yet exist. It’s exciting to be a part of a new industry that has the power to improve and benefit so many aspects of the global community.

What does Fraunhofer Digital Media Alliance do? A lot!

By Jonathan Abrams

While the vast majority of the companies with exhibit space at NAB are for-profit, there is one non-profit that stands out. With a history of providing ubiquitous technology to the masses since 1949, Fraunhofer focuses on applied research and developments that end up — at some point in the near future — as practical products or ready-for-market technology.

One-third of Fraunhofer’s funding supports basic research, while the remaining two-thirds is applied toward industry projects and comes directly from private companies. Their business model is focused on contract research and licensing of technologies. They have sold first prototypes and work with distributors, though Fraunhofer always keeps the rights to continue development.

What projects were they showcasing at NAB 2016 that have real-world applications in the near future? You may have heard about the Lytro camera. Fraunhofer Digital Media Alliance member Fraunhofer IIS has been taking a camera-agnostic approach to its work with light-field technology. Their goal is to make this technology available for many different camera set-ups, and they were proving it with a demo of their multi-cam light-field plug-in for The Foundry’s Nuke. After capturing a light-field, users can perform framing correction and relighting, including changes to angles and depth, as well as the creation of point clouds.

The Nuke plug-in (see our main image) allows the user to create virtual lighting (relighting) and interactive lighting. Light-field data also allows for depth estimation (creating depth maps), which is useful for mattes and secondary color correction. Similar to Lytro, focus pulling can be performed with this light-field plug-in. Why Nuke? That is what their users requested. Even though Nuke is an OFX host, the Fraunhofer IIS light-field plug-in only works within Nuke. As for running it on other platforms, I was told that "porting to Mac should be an easy task." Hopefully that is an accurate statement, though we will have to wait to find out.
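As a simple illustration of why depth maps make mattes easy: once you have per-pixel depth, isolating a foreground subject is essentially a threshold with a soft edge. A minimal numpy sketch with synthetic depth values (not the Fraunhofer plug-in’s actual method):

```python
import numpy as np

# Synthetic example: a depth map in meters, with a foreground subject
# nearer than 2m and a distant background.
depth = np.array([
    [5.0, 5.0, 1.2, 1.3],
    [5.0, 1.1, 1.2, 5.0],
])

# A depth-keyed matte is a threshold; ramping between near and far limits
# gives it a soft edge (the limits here are illustrative).
near, far = 1.5, 2.5
matte = np.clip((far - depth) / (far - near), 0.0, 1.0)

print(matte)  # 1.0 on the near subject, 0.0 on the distant background
```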

DCP
Fraunhofer IIS has its hand in other parts of production and post as well. The last two steps of most projects are the creation of deliverables and their delivery. If you need to create and deliver a DCP (Digital Cinema Package), then easyDCP may be for you.

This project began in 2008, when creating a DCP was far less familiar to most users than it is today and correctly making one required deep expertise in complex specifications. Small- to medium-sized post companies, in particular, benefit from the easy-to-use easyDCP suite. Fraunhofer IIS engineers also worked on the DCI specifications for digital cinema, so they are experienced in integrating all of the important DCP features into this software.

The demo I saw indicated that the JPEG2000 encode ran as fast as 108fps! In 2013, Fraunhofer partnered with both Blackmagic and Quantel to make this software available to the users of those respective finishing suites. The demo used a Final Cut Pro X project file and the Creator+ version, since it supports encryption. Avid Media Composer users will have to export their sequence and import it into Resolve to use easyDCP Creator. Amazingly, this software works as far back as Mac OS X Leopard. IMF creation and playback can also be done with the easyDCP software suite.
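For context on that 108fps figure, a quick back-of-the-envelope calculation shows what it means for a typical feature (the running time and frame rate here are illustrative assumptions):

```python
# Quick estimate of encode time at the demoed JPEG2000 speed.
encode_fps = 108
feature_minutes, delivery_fps = 90, 24   # a typical feature, for illustration

total_frames = feature_minutes * 60 * delivery_fps
encode_minutes = total_frames / encode_fps / 60
print(f"{total_frames} frames -> about {encode_minutes:.0f} minutes to encode")
# 129600 frames -> about 20 minutes to encode
```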

VR/360
VR and 360-degree video were prominent at NAB, and the institutes of the Fraunhofer Digital Media Alliance are involved in this as well, having worked on live streaming and surround sound as part of a project with the Berlin Symphony Orchestra.

Fraunhofer had a VR demo pod at the ATSC 3.0 Consumer Experience (in South Hall Upper) — I tried it and the sound did track with my head movement. Speaking of ATSC 3.0, it calls for an immersive audio codec. Each country or geographic region that adopts ATSC 3.0 can choose to implement either Dolby AC-4 or MPEG-H, the latter of which is the result of research and development by Fraunhofer, Technicolor and Qualcomm. South Korea announced earlier this year that they will begin ATSC 3.0 (UHDTV) broadcasting in February 2017 using the MPEG-H audio codec.
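Head-tracked audio of this kind works by counter-rotating the sound field, or each source position, against the listener’s head orientation, so sources stay anchored in the world as you turn. A minimal yaw-only Python sketch of the principle (a simplification; MPEG-H’s actual renderer is far more sophisticated):

```python
import math

def world_to_head(source_xy, head_yaw_rad):
    """Rotate a world-space source position into head-relative space.

    Counter-rotating by the head yaw keeps the source anchored in the
    world as the listener turns (yaw-only is a simplifying assumption).
    """
    x, y = source_xy
    c, s = math.cos(-head_yaw_rad), math.sin(-head_yaw_rad)
    return (x * c - y * s, x * s + y * c)

# A source straight ahead; after the listener turns 90 degrees left,
# it should render from the listener's right.
print(world_to_head((0.0, 1.0), math.radians(90)))
# approximately (1.0, 0.0)
```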

From what you see to what you hear, from post to delivery, the Fraunhofer Digital Media Alliance has been involved in the process.

Jonathan S. Abrams is the Chief Technical Engineer at Nutmeg, a creative marketing, production and post resource.

NAB: Las Vegas SuperMeet adds VR/360 to event coverage

The Rio Hotel will be hopping on April 19 when it hosts this year’s Las Vegas SuperMeet. The annual Creative Pro User Group (CPUG) Network event draws Final Cut Pro, Adobe, Avid and DaVinci Resolve editors, gurus, digital filmmakers and content creators during NAB.

The second half of this year’s event is focusing on VR and 360 video, the hot topics at this year’s show. We wanted to know what attendees can expect, so we threw some questions at Daniel Bérubé and Michael Horton, the architects of this event, to find out more.

Some compare VR and 360 video to stereo 3D. Why do you feel this is different?
VR/360 video is more accessible to the indie filmmaker than 3D ever was. The camera rigs can be inexpensive and still be professional, or you can rent the expensive ones. The feeling we are getting from everyone is one of revolution, and we have not seen that since the year 2000. This is a new way to tell stories. There are no rules yet, and we are making a lot of this stuff up as we go along, but that’s what is fun. We are actually seeing people giggle again. We never saw this level of excitement with 3D. All we really saw was skepticism.

In what ways are you going to be highlighting VR/360 video?
The second half of the SuperMeet will be devoted to VR and 360 video. We are titling it, “Can I Tell a Compelling Story in VR and 360 Video?” Futurist Ted Schilowitz is going to act as a sort of ringmaster and introduce us to what we need to know. He will then bring on Csillia Kozma Andersen from Nokia to show off the new Ozo camera and how to use it. Next will be John Hendicott of Aurelia Soundworks, who will explain how spatial audio works. And, finally, we will introduce Alex Gollner, who will show how we edit all this stuff.

So the idea here is to try and give you a bit of what you need to know, and then hope it will help you get started on your way to creating your own compelling VR masterpiece.

What can attendees expect?
Expect to have a crazy fun time. Even if you have zero interest in 360 video, SuperMeets are a place to hang out with each other and network. Honestly, you just might meet someone who will change your life. You also can hang out at one of the 25 sponsor tables, where folks will be showing off the latest and greatest software and hardware solutions. VR camera rigs will be running around this area as well. And there will be free food, cash bars and close to $100,000 worth of raffle prizes to give away. It’s going to be a great show and, more importantly, a great time.

To enjoy $5 off your ticket price for the Las Vegas SuperMeet, courtesy of postPerspective, click here.

Daniel Bérubé, of the Boston Creative Pro User Group (BOSCPUG), is co-producer of these SuperMeets with Michael Horton, the founder of the Los Angeles Creative Pro User Group (LACPUG).