Tag Archives: MR

30 Ninjas’ Julina Tatlock to keynote SMPTE 2018, will focus on emerging tech

30 Ninjas CEO Julina Tatlock, an award-winning writer-producer, virtual reality director and social TV specialist, will present the keynote address at the SMPTE 2018 conference, which takes place from October 22-25 in downtown Los Angeles. Tatlock’s keynote will take place on October 23 at 9am, immediately following the SMPTE annual general membership meeting.

Tatlock specializes in producing and directing VR, creating social media and web-based narrative games for movies and broadcast, as well as collaborating with developers on integrating new tech intellectual property into interactive stories.

During her keynote, she will discuss the ways that content creation and entertainment production can leverage emerging technologies. Tatlock will also address topics such as how best to evaluate what might be the next popular entertainment technology and platform, as well as how to write, direct and build for technology and platforms that don’t exist yet.

Tatlock’s 30 Ninjas is an award-winning immersive-entertainment company she founded with director Doug Liman (The Bourne Identity, Mr. & Mrs. Smith, Edge of Tomorrow, American Made). 30 Ninjas creates original narratives and experiences in new technologies such as virtual reality, augmented reality, mixed reality and location-based entertainment for clients such as Warner Bros., USA Network, Universal Cable Productions and HarperCollins.

Tatlock also is the executive producer and director of episodes three and four of the six-part VR miniseries “Invisible,” with production partners Condé Nast Entertainment, Jaunt VR and Samsung.

Before founding 30 Ninjas, she spent eight years at Oxygen Media, where she was VP of programming strategy. In an earlier role with Martha Stewart Living Omnimedia, Tatlock wrote and produced more than 100 of NBC’s Martha Stewart Living morning show segments.

Registration is open for both SMPTE 2018 and for the SMPTE 2018 Symposium, an all-day session that will precede the technical conference and exhibition on Oct. 22. Pre-registration pricing is available through Oct. 13. Further details are available at smpte2018.org.

NAB 2016: VR/AR/MR and light field technology impressed

By Greg Ciaccio

The NAB 2016 schedule included its usual share of evolutionary developments, which are truly exciting (HDR, cloud hosting/rendering, etc.). One, however, was a game changer with reach far beyond media and entertainment.

This year’s NAB floor plan featured a Virtual Reality Pavilion in the North Hall. In addition, the ETC (USC’s Entertainment Technology Center) held a Virtual Reality Summit that featured many great panel discussions and opened quite a few minds. At least that’s what I gathered from the standing-room-only crowds that filled the suite. The ETC’s Ken Williams and Erik Weaver, among others, should be credited for delivering quite a program. While VR itself is not a new development, the availability of relatively inexpensive viewers (with Google Cardboard the most accessible) will put VR in the hands of practically everyone.

Programs included discussions on where VR/AR (Augmented Reality) and now MR (Mixed Reality) are heading, business cases and, not to be forgotten, audio. Keep in mind that with headset VR experiences, multi-channel directional sound must be perceivable with just our two ears.
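That two-ear constraint comes down to binaural cues: small interaural time and level differences are what let us localize sound with only two ears. Here is a deliberately simplified Python/NumPy sketch of those two cues (my own toy model, not anything demoed at NAB — the head radius and gain curve are rough assumptions, and production VR audio uses full HRTF filtering rather than this):

```python
import numpy as np

SAMPLE_RATE = 48000        # Hz
HEAD_RADIUS = 0.0875       # metres; average human head (assumption)
SPEED_OF_SOUND = 343.0     # m/s

def binaural_pan(mono, azimuth_deg):
    """Very simplified binaural rendering of a mono source.

    Applies an interaural time difference (ITD, Woodworth's
    approximation) and a crude interaural level difference (ILD)
    so the source appears at the given azimuth (-90 = hard left,
    +90 = hard right).
    """
    az = np.radians(azimuth_deg)
    # Woodworth ITD model: arrival-time gap between the two ears.
    itd = (HEAD_RADIUS / SPEED_OF_SOUND) * (abs(az) + np.sin(abs(az)))
    delay_samples = int(round(itd * SAMPLE_RATE))
    # Crude ILD: attenuate the far ear as the source moves off-axis.
    far_gain = np.cos(az / 2) ** 2

    delayed = np.concatenate([np.zeros(delay_samples), mono])  # far ear
    padded = np.concatenate([mono, np.zeros(delay_samples)])   # near ear
    if azimuth_deg >= 0:          # source on the right
        left, right = far_gain * delayed, padded
    else:                         # source on the left
        left, right = padded, far_gain * delayed
    return np.stack([left, right], axis=-1)

# One second of a 440Hz tone, placed 60 degrees to the right.
tone = np.sin(2 * np.pi * 440 * np.arange(SAMPLE_RATE) / SAMPLE_RATE)
stereo = binaural_pan(tone, 60)
```

Even this crude version conveys the point the panels made: direction has to be encoded entirely in the timing and level relationships between two channels.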

The panels featured experts from Dolby, DTS, Nokia, NextVR, Fox and CNN. In fact, Juan Santillian from Vantage.tv mentioned that Coachella is streaming live in VR. Concerts and other live events have a fixed audience size, and many fans can’t attend because of cost or sold-out venues. VR can offer a more intimate and immersive experience than almost any vantage point short of the stage itself.

One example, from Fox Sports’ Michael Davies, involved two friends in different cities virtually attending a football game in a third city. They sat next to each other and chatted during the game, with their audio correctly mapped to their seats. The applications for VR/AR/MR seem limitless, and, by all accounts, once you experience it there is little doubt that this tech is here to stay.

I’ve heard many times this year that mobile will be the monetary driver for wide adoption of VR. Halsey Minor with Voxelus estimates that 85 percent of VR usage will be via a mobile device. Given that far more photos and videos are shot on our phones than on dedicated cameras, this is not surprising. Some of the latest crop of mobile phones are not only fast, with high-dynamic-range and wide-color-gamut displays, but also feature high-end audio processing from Dolby and others. Plus, our reliance on our phones ensures we’ll never forget to bring them.

Light Field Imaging
On both Sunday and Tuesday of NAB 2016, programs were devoted to light field imaging. I was already familiar with this truly revolutionary tech, having learned about Lytro, Inc. a few years ago from Internet ads for an early consumer camera. I was intrigued by the idea of controlling focus after shooting. I visited www.lytro.com and was impressed, but the resolution was low, so, for me, this was mainly a proof of concept. Fast forward three years, and Lytro now has a cinema camera!

Jon Karafin, Lytro’s head of Light Field Imaging, not only unveiled the camera onstage, but debuted their short Life, produced in association with The Virtual Reality Company (VRC). Life takes us through a man’s life and is told with no dialog, letting us take in the moving images without distraction. Jon then took us through all the picture aspects using Nuke plug-ins, and minds started blowing. The short is directed by Academy Award-winner Robert Stromberg, and shot by veteran cinematographer David Stump, who is chief imaging scientist at VRC.

Many of us are familiar with camera raw capture and know that ISO, color temperature and other picture aspects can be changed post-shooting. This has proven to be very valuable. However, things like focus, f-stop, shutter angle and many other parameters can now be changed, thanks to light field technology — think of it as an X-ray compared to an MRI. In the interest of keeping a complicated technology relatively simple: sensors in the camera capture light not only in X and Y space, but also in two more “angular” directions, forming what Lytro calls 4D space. The result is accurate depth mapping, which opens up many options for filmmakers.
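The core trick behind refocusing after the fact is “shift-and-add”: each angular sample is a slightly offset view of the scene, and shifting those views in proportion to their angular offset before averaging brings a chosen depth plane into focus. A toy Python/NumPy sketch of that general light field idea (an illustration of the technique, not Lytro’s actual pipeline — the array layout and the integer-pixel shift are my own simplifications):

```python
import numpy as np

def refocus(light_field, shift):
    """Synthetic refocus of a 4D light field by shift-and-add.

    light_field has shape (U, V, Y, X): a U x V grid of angular samples,
    each a Y x X image. Each sub-aperture view is shifted in proportion
    to its angular offset from the center, then all views are averaged.
    Objects whose parallax matches `shift` line up and stay sharp;
    everything at other depths smears into defocus blur.
    """
    U, V, Y, X = light_field.shape
    out = np.zeros((Y, X))
    for u in range(U):
        for v in range(V):
            du = int(round(shift * (u - U // 2)))
            dv = int(round(shift * (v - V // 2)))
            out += np.roll(light_field[u, v], (du, dv), axis=(0, 1))
    return out / (U * V)

# Toy 3x3 angular grid of 64x64 views standing in for a captured field.
lf = np.random.rand(3, 3, 64, 64)
image = refocus(lf, shift=1)   # re-render with focus at a new depth
```

Varying `shift` sweeps the focal plane through the scene from a single capture, which is exactly the after-the-fact focus control that made the consumer Lytro camera intriguing.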

Lytro Cinema Camera

For those who may think that this opens up too many options in post, all parameters can be locked so only those who are granted access can make edits. Some of the parameters that can be changed in post include: focus, f-stop, depth of field, shutter speed, camera position, shutter angle, shutter blade count, aperture aspect ratio and fine control of depth (for mattes/comps).
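That last item, depth control for mattes and comps, is worth unpacking: an accurate per-pixel depth map means a matte can be pulled by depth range alone, no green screen required. A hypothetical helper to show the idea (this is my own sketch, not a Lytro or Nuke API, and the linear feathering scheme is an assumption):

```python
import numpy as np

def depth_matte(depth_map, near, far, feather=0.5):
    """Pull a soft matte from a per-pixel depth map.

    Pixels whose depth lies inside [near, far] get matte value 1.0,
    with a `feather`-wide linear falloff on either side, so the matte
    edge blends rather than cutting hard at the depth boundary.
    """
    inside = np.clip((depth_map - (near - feather)) / feather, 0.0, 1.0)
    outside = np.clip(((far + feather) - depth_map) / feather, 0.0, 1.0)
    return np.minimum(inside, outside)

# Toy depth values in metres: only the 2-6m band should be matted in.
depth = np.array([[1.0, 3.0],
                  [5.0, 9.0]])
matte = depth_matte(depth, near=2.0, far=6.0)
```

The same depth map could drive relighting or layer separation, which is why the depth accuracy of the 4D capture matters so much to the post pipeline.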

Yes, this camera generates a lot of data. The good news is that you can make changes anywhere with an Internet connection, thanks to proxy mode in Nuke and rendering handled in the cloud. Jon demoed this, and images were quickly processed using Google’s cloud.

The camera itself is very large, and Lytro knows it will need to reduce the size (from around seven feet long) to a more maneuverable form factor. However, this is a huge step in proving that a light field cinema camera and a powerful, manageable workflow are not only possible, but will no doubt prove valuable to filmmakers who want the power and control offered by light field cinematography.

Greg Ciaccio is a technologist focused primarily on finding new technology and workflow solutions for motion picture and television clients. Ciaccio has served in technical management roles in the Creative Services divisions of both Deluxe and Technicolor.