
Hollywood’s Digital Jungle moves to Santa Clarita

Digital Jungle, a long-time Hollywood-based post house, has moved its operations to a new facility in Santa Clarita, California, which has become a growing hub for production and post in the suburbs of Los Angeles. The new headquarters is now home to both Digital Jungle Post and its recent offshoot Digital Jungle Pictures, a feature film development and production studio.

“I don’t mind saying, it was a bit of an experiment moving to Santa Clarita,” explains Digital Jungle president and chief creative Dennis Ho. “With so many filmmakers and productions working out here — including Disney/ABC Studios, Santa Clarita Studios and Universal Locations — this area has developed into a vast untapped market for post production professionals. I decided that now was a good time to tap into that opportunity.”

Digital Jungle’s new facility offers a full complement of digital workflow solutions from HD to 4K. The facility has multiple suites featuring Autodesk Smoke, Blackmagic DaVinci Resolve, audio recording via Avid’s S6 console and Pro Tools, production offices, a conference area, a full kitchen and a client lounge.

Digital Jungle is well into the process of adding further capabilities: phase two of the build-out includes a high-end 4K DI theater and screening room, a greenscreen stage, a VFX bullpen, multiple edit bays and additional production offices.

Digital Jungle Post services include DI/color grading; VFX/motion graphics; audio recording/mixing and sound design; ADR and VO; HD to 4K deliverables for tape and data; DCI and DCDM; promo/bumper design and film/television title design.

Commenting on Digital Jungle Pictures, Ho says, “It was a natural step for me. I started my career by directing and producing promos and interstitials for network TV, studios and distributors. I think that our recent involvement in producing several independent films has enhanced our credibility on the post side. Filmmakers tend to feel more comfortable entrusting their post work to other filmmakers. For example, we recently completed audio post and DI for a new Hallmark film called Love at First Glance.”

In addition to Love at First Glance, Digital Jungle Pictures’ recent projects include the indie films Day of Days, A Better Place (available now on digital and DVD) and Broken Memories, which screened at the Sedona Film Festival.

 

Review: BenQ’s 4K/UHD monitor

By Brady Betzel

If you have been dabbling in multimedia production at resolutions above 1920×1080, you have likely been investigating a color-accurate and affordable 4K/UHD monitoring solution.

If you’ve Googled the Dolby PRM-4220 professional reference monitor you probably had a heart attack when you saw the near $40K price tag. This monitor is obviously not for the prosumer, or even the work-at-home professional. You may have found yourself in the forum-reading rabbit-hole where Flanders Scientific, Inc. (FSI) comes up a lot — unfortunately, if you aren’t able to shell out between $2K and $8K then you have been left in the dark.

While Dolby, FSI and others, like Sony, have amazing reference monitor solutions, they come with price tags that are a hard stop for anyone on a work-at-home budget. This is where the BenQ PV3200PT 32-inch LED-backlit IPS LCD monitor comes in.

BenQ has been around for a while. You may remember them from stores like Best Buy or Circuit City (if you are that old). When I worked at Best Buy, BenQ was the option next to Sony, but I remember thinking, “I’ve never heard of BenQ!” Now, after playing around with the PV3200PT monitor, I know the name BenQ, and I won’t forget it.

Digging In
The 32-inch PV3200PT is a professional multimedia monitor. Not only is it a gigantic and gorgeous 32-inch 10-bit display, it has some great technology running it — including 100% Rec. 709 color accuracy. If you don’t deal with the tech-spec nerdom behind color science, Rec. 709 is the international technical standard for high-definition color issued by the ITU’s Radiocommunication Sector (you’ve probably heard it referred to as CCIR, that body’s former name, if you’ve heard of it at all).

Simply put, it’s the standard governing how color is broadcast in a high-definition television environment, and if you produce video or other multimedia for television, you want your Rec. 709 color to be as close to 100% accurate as possible. That way you can be confident that what you are creating on your monitor is technically what most people will see when it is broadcast… whew, that is long and boring, but essential to me saying that the PV3200PT monitor is 100% Rec. 709 accurate.

But it is not Rec. 2020 accurate; that’s the newer standard applied to ultra-high-definition television (think 4K UHD, or 2160p, and the imminent 8K UHD, 4320p). So while you are color-accurate in HD space, you can’t necessarily rely on this monitor for the wider range of color values offered by UHD’s Rec. 2020. That’s not necessarily a bad thing, but it is something to be aware of. And once you see the price you probably won’t care anyway. As I write this review, it is selling on BenQ’s website for $1,299, which is a really, really great price for the punch this monitor packs.
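To visualize that gap, here is a quick back-of-the-envelope sketch (my own illustration, not BenQ’s math) using the published CIE 1931 xy chromaticity coordinates of each standard’s primaries. The area of the triangle the primaries span in xy space is only a crude, non-perceptual proxy for gamut size, but it shows the scale of the difference:

```python
# Rough comparison of the Rec. 709 and Rec. 2020 gamuts using each
# standard's published CIE 1931 xy primary coordinates (R, G, B).
REC709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

def triangle_area(pts):
    """Shoelace formula for the area of a triangle given three (x, y) points."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

ratio = triangle_area(REC2020) / triangle_area(REC709)
print(f"Rec. 2020 spans ~{ratio:.2f}x the xy area of Rec. 709")
# Prints roughly 1.9x, which is why a display can be 100% Rec. 709
# accurate and still fall well short of the Rec. 2020 gamut.
```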

As a video editor, I love large, color-accurate monitors. Who doesn’t? I want my whites properly exposed (if possible) and my blacks detailed and dark; it’s a lot to ask for but it’s what I want and what I need when color correcting footage. While using the BenQ PV3200PT, I was not disappointed with its output.

Rotation
I am also testing an HP Z1 G3 all-in-one workstation at the moment, so I opened the BenQ box, plugged the PV3200PT right into the HP’s Mini DisplayPort and was off and running. I noticed immediately how many ways I could physically move the display around to match the environment I was in, including rotating it 90 degrees for some sweet vertical Adobe Photoshop work, visiting www.postperspective.com to read all the articles at once, or even displaying your Media Pool when using Blackmagic’s DaVinci Resolve 12.5 (and, yes, it does work with the vertical display!). Using the PV3200PT vertically in Resolve was really eye-opening and could become a great way to use such big screen real estate.

To get the PV3200PT to rotate the image vertically, I tried using the BenQ-provided software, Display Pilot, but eventually realized that I had to use Nvidia’s Control Panel. That detour did get me into using Display Pilot to break the BenQ’s screen (and the other monitor’s, for that matter) into quadrants to display multiple windows at once easily and efficiently.

I put Adobe Premiere on my left screen and set up the BenQ PV3200PT to have it split three ways: a large left column with Adobe Media Encoder and two right rows with Internet browsers. I really liked that feature, especially because I love to watch tutorials, and this type of set-up allows me to watch and create at the same time. It’s an awesome feature.

When using the PV3200PT I didn’t notice any lag or smearing, which you can sometimes see on lower-priced monitors. I also noticed that the monitor ships with Brightness set to 100% and Color Mode set to Standard, so if you don’t want your eyes bugging out of your head after 10 hours of work, and you do want that Rec. 709 color, you need to change those settings yourself. Luckily, the menu on the monitor is easy to navigate, which isn’t always the case with monitors, so I wanted to make sure to point that out. It isn’t a touchscreen, so don’t be a dummy like me and poke at your monitor wondering why the menus aren’t working.

I hand-picked a few tech specs below for the BenQ PV3200PT monitor that I felt are important, but you can see the entire list here under Specs:
– Resolution: 3840×2160 (UHD – NOT true 4K)
– Native Contrast: 1000:1
– Panel Type: IPS
- Response Time: 5ms
- Display Colors: 1.07 billion
- Color Gamut: 100% Rec. 709
- Color Bit Depth: 10 bits
- Input Connectors: HDMI 1.4, DisplayPort 1.2, Mini DisplayPort 1.2
– Weight: 27 to 33 pounds depending on the mounting option installed

Summing Up
In the end I really loved this monitor, not only for its price but for the technology inside of it, from the beautiful 32-inch IPS real estate to the built-in SD card reader and two USB 3.0 ports. I learned that I love the vertical feature and may have to incorporate it into my daily color correction and editing style.

One thing I didn’t mention earlier is the included external OSD Controller, which allows you to quickly select between Rec. 709, EBU and SMPTE-C color spaces. Also included is BenQ’s proprietary Palette Master Element calibration software, which allows for custom calibration with devices like Datacolor’s Spyder.

I would recommend taking a look at this beautiful display if you are in the market for a UHD, 100% Rec. 709 color accurate, adjustable display for around $1,299, if you are lucky enough to get in on that price.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter @allbetzroff. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Sony launches Z Series line of UHD TVs for 4K HDR

Sony recently introduced a line of UHD television displays that could be suitable as client monitors in post houses. Sony’s new Z series television display technology — including the X930D and X940D — adopts Sony’s Backlight Master Drive backlight-boosting technology, which expands brightness and contrast to better exploit 4K HDR. While testing will need to be done, rumor has it the displays may easily comply with Ultra HD Alliance requirements, which would make them an excellent choice for large client-viewing monitors.

To further enhance contrast, the Backlight Master Drive includes a dense LED structure, discrete lighting control and an optical design with a calibrated beam LED. Previously, local dimming was controlled by zones with several LEDs. The discrete LED control feature allows the Backlight Master Drive to dim and boost each LED individually for greater precision, contrast and realism.

Additionally, the Z series features a newly developed 4K image processor, the 4K HDR Processor X1 Extreme. Combined with Backlight Master Drive, the Z series features expanded contrast and more accurate color expression. The 4K Processor X1 Extreme incorporates three new technologies: an object-based HDR remaster, dual database processing and Super Bit Mapping 4K HDR. With these three technologies, 4K HDR Processor X1 Extreme reproduces a wide variety of content with immersive 4K HDR picture quality.

The Z series runs on Android TV with a Sony user interface that includes a new content bar with enhanced content navigation, voice search and a genre-filtering function. Instead of selecting a program from several channels, users can select from favorite genres, including sports, music and news.

Pricing is as follows:

  • XBR65Z9D, 65″ class (64.5″ diagonal), $6,999 MSRP, available summer 2016
  • XBR75Z9D, 75″ class (74.5″ diagonal), $9,999 MSRP, available summer 2016
  • XBR100Z9D, 100″ class (99.5″ diagonal), pricing and availability details to be announced later this year.

Storage Workflows for 4K and Beyond

Technicolor-PostWorks and Deluxe Creative Services share their stories.

By Beth Marchant

Once upon a time, an editorial shop was a sneaker-net away from the other islands in the pipeline archipelago. That changed when the last phases of the digital revolution set many traditional editorial facilities into swift expansion mode to include more post production services under one roof.

The consolidating business environment in the post industry of the past several years then brought more of those expanded, overlapping divisions together. That’s a lot for any network to handle, let alone one containing some of the highest quality and most data-dense sound and pictures being created today. The networked storage systems connecting them all must be robust, efficient and realtime without fail, but also capable of expanding and contracting with the fluctuations of client requests, job sizes, acquisitions and, of course, evolving technology.

There’s a “relief valve” in the cloud and object storage, say facility CTOs minding the flow, but it’s still a delicate balance between local pooled and tiered storage and iron-clad cloud-based networks their clients will trust.

Technicolor-PostWorks
Joe Beirne, CTO of Technicolor-PostWorks New York, is probably as familiar as one can be with complex nonlinear editorial workflows. A user of Avid’s earliest NLEs, an early adopter of networked editing and an immersive interactive filmmaker who experimented early with bluescreen footage, Beirne began his career as a technical advisor and producer for high-profile mixed-format feature documentaries, including Michael Moore’s Fahrenheit 9/11 and the last film in Godfrey Reggio’s KOYAANISQATSI trilogy.

Joe Beirne

In his 11 years as a technology strategist at Technicolor-PostWorks New York, Beirne has also become fluent in evolving color, DI and audio workflows for clients such as HBO, Lionsgate, Discovery and Amazon Studios. CTO since 2011, when PostWorks NY acquired the East Coast Technicolor facility and the color science that came with it, he now oversees the increasingly complicated ecosystem that moves and stores vast amounts of high-resolution footage and data while simultaneously holding those separate and variously intersecting workflows together.

As the first post facility in New York to handle petabyte levels of editorial-based storage, Technicolor-PostWorks learned early how to manage the data explosion unleashed by digital cameras and NLEs. “That’s not because we had a petabyte SAN or NAS or near-line storage,” explains Beirne. “But we had literally 25 to 30 Avid Unity systems that were all in aggregate at once. We had a lot of storage spread out over the campus of buildings that we ran on the traditional PostWorks editorial side of the business.”

The TV finishing and DI business that developed at PostWorks in 2005, when Beirne joined the company (he was previously a client), eventually necessitated a different route. “As we’ve grown, we’ve expanded out to tiered storage, as everyone is doing, and also to the cloud,” he says. “Like we’ve done with our creative platforms, we have channeled our different storage systems and subsystems to meet specific needs. But they all have a very promiscuous relationship with each other!”

TPW’s high-performance storage in its production network is a combination of local or semi-locally attached near-line storage tethered by several Quantum StorNext SANs, all of it air-gapped — or physically segregated — from the public Internet. “We’ve got multiple SANs in the main Technicolor mothership on Leroy Street with multiple metadata controllers,” says Beirne. “We’ve also got some client-specific storage, so we have a SAN that can be dedicated to a particular account. We did that for a particular client who has very restrictive policies about shared storage.”

TPW’s editorial media, for the most part, resides in Avid’s ISIS system and is in the process of transitioning to its software-defined replacement, Nexis. “We have hundreds of Avids, a few Adobe and even some Final Cut systems connected to that collection of Nexis and ISIS and Unity systems,” he says. “We’re currently testing the Nexis pipeline for our needs but, in general, we’re going to keep using this kind of storage for the foreseeable future. We have multiple storage servers that serve that part of our business.”

Beirne says most every project the facility touches is archived to LTO tape. “We have a little bit of disc-to-tape archiving going on for the same reasons everybody else does,” he adds. “And some SAN volume hot spots that are all SSD (solid state drives) or a hybrid.” The facility is also in the process of improving the bandwidth of its overall switching fabric, both on the Fibre Channel side and on the Ethernet side. “That means we’re moving to 32Gb and multiple 16Gb links,” he says. “We’re also exploring a 40Gb Ethernet backbone.”

Technicolor-PostWorks’ 4K theater at its Leroy Street location.

This backbone, he adds, carries an exponential amount of data every day. “Now we have what are like two nested networks of storage at a lot of the artist workstations,” he explains. “That’s a complicating feature. It’s this big kind of octopus, actually. Scratch that: it’s like two octopi on top of one another. That’s not even mentioning the baseband LAN network that interweaves this whole thing. They, of course, are now getting intermixed because we are also doing IT-based switching. The entire, complex ecosystem is evolving and everything that interacts with it is evolving right along with it.”

The cloud is providing some relief and handles multiple types of storage workflows across TPW’s various business units. “Different flavors of the commercial cloud, as well as our own private cloud, handle those different pools of storage outside our premises,” Beirne says. “We’re collaborating right now with an international account in another territory and we’re touching their storage envelope through the Azure cloud (Microsoft’s enterprise-grade cloud platform). Our Azure cloud and theirs touch and we push data from that storage back and forth between us. That particular collaboration happened because we both had an Azure instance, and those kinds of server-to-server transactions that occur entirely in the cloud work very well. We also had a relationship with one of the studios in which we made a similar connection through Amazon’s S3 cloud.”
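Beirne doesn’t spell out the tooling behind those server-to-server pushes, but as a loose sketch of the idea, here is what an account-to-account, server-side blob copy looks like with Microsoft’s azure-storage-blob Python SDK (a current library, not necessarily what TPW uses; the account names, container names and credentials below are all made up):

```python
# Hypothetical sketch of a server-side, cloud-to-cloud copy in Azure.
# The bytes move between the two storage accounts inside Azure itself;
# they never transit either facility's own network.
from azure.storage.blob import BlobClient

# Source blob in the partner facility's account, authorized with a
# SAS token they issue (URL and token are placeholders).
source_url = ("https://partnerfacility.blob.core.windows.net/"
              "dailies/reel_004.mxf?<sas-token>")

# Destination blob in our own account.
dest = BlobClient.from_connection_string(
    conn_str="<our-connection-string>",
    container_name="incoming",
    blob_name="reel_004.mxf",
)

# Kicks off an asynchronous copy that Azure performs on its side.
props = dest.start_copy_from_url(source_url)
print(props["copy_status"])  # 'pending' until Azure finishes, then 'success'
```

The appeal of this pattern is that the transfer never touches on-premises bandwidth, which is what makes those entirely-in-the-cloud handoffs practical.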

Given the trepidations most studios still have about the cloud, Beirne admits there will always be some initial, instinctive mistrust from both clients and staff when you start moving any content away from computers that are not your own and you don’t control. “What made that first cloud solution work, and this is kind of goofy, is we used Aspera to move the data, even though it was between adjacent racks. But we took advantage of the high-bandwidth backbone to do it efficiently.”

Both TPW in New York and Technicolor in Los Angeles have since leveraged the cloud aggressively. “We have our own cloud that we built, and big Technicolor has a very substantial purpose-built cloud, as well as Technicolor Pulse, their new storage-related production service in the cloud. They also use object storage and have some even newer technology that will be launching shortly.”

The caveat to moving any storage-related workflow into the cloud is thorough and continual testing, says Beirne. “Do I have more concern for my clients’ media in the cloud than I do when sending my own tax forms electronically? Yeah, I probably do,” he says. “It’s a very, very high threshold that we need to pass. But that said, there’s quite a bit of low-impact support stuff that we can do on the cloud. Review and approval stuff has been happening in the cloud for some time.” As a result, the facility has seen an increase, like everyone else, in virtual client sessions, like live color sessions and live mix sessions from city to city or continent to continent. “To do that, we usually have a closed circuit that we open between two facilities and have calibrated displays on either end. And, we also use PIX and other normal dailies systems.”

“How we process and push this media around ultimately defines our business,” he concludes. “It’s increasingly bigger projects that are made more demanding from a computing point of view. And then spreading that out in a safe and effective way to where people want to access it, that’s the challenge we confront every single day. There’s this enormous tension between the desire to be mobile and open and computing everywhere and anywhere, with these incredibly powerful computer systems we now carry around in our pockets and the bandwidth of the content that we’re making, which is high frame rate, high resolution, high dynamic range and high everything. And with 8K — HDR and stereo wavefront data goes way beyond 8K and what the retina even sees — and 10-bit or more coming in the broadcast chain, it will be more of the same.” TPW is already doing 16-bit processing for all of its film projects and most of its television work. “That’s piles and piles and piles of data that also scales linearly. It’s never going to stop. And we have a VR lab here now, and there’s no end of the data when you start including everything in and outside of the frame. That’s what keeps me up at night.”

Deluxe Creative Services
Before becoming CTO at Deluxe Creative Services, Mike Chiado had a 15-year career as a color engineer and image scientist at Company 3, the grading and finishing powerhouse acquired by Deluxe in 2010. He now manages the pipelines of a commercial, television and film Creative Services division that encompasses not just dailies, editorial and color, but sound, VFX, 3D conversion, virtual reality, interactive design and restoration.

Mike Chiado

That’s a hugely data-heavy load to begin with, and as VR and 8K projects become more common, managing the data stored and coursing through DCS’ network will get even more demanding. Branded companies currently under the monster Deluxe umbrella include Beast, Company 3, DDP, Deluxe/Culver City, Deluxe VR, Editpool, Efilm, Encore, Flagstaff Studios, Iloura, Level 3, Method Studios, StageOne Sound, Stereo D, and Rushes.

“Actually, that’s nothing when you consider that all the delivery and media teams from Deluxe Delivery and Deluxe Digital Cinema are downstream of Creative Services,” says Chiado. “That’s a much bigger network and storage challenge at that level.” Still, the storage challenges of Chiado’s segment are routinely complicated by the twin monkey wrenches of the collaborative and computer kind that can unhinge any technology-driven art form.

“Each area of the business has its own specific problems that recur: television has its issues, commercial work has its issues and features have theirs. For us, commercials and features are more alike than you might think, partly due to the constantly changing visual effects but also due to shifting schedules. Television is much more regimented,” he says. “But sometimes we get hard drives in on a commercial or feature and we think, ‘Well, that’s not what we talked about at all!’”

Company 3’s file-based digital intermediate work quickly clarified Chiado’s technical priorities. “The thing that we learned early on is realtime playback is just so critical,” he says. “When we did our very first file-based DI job 13 years ago, we were so excited that we could display a certain resolution. OK, it was slipping a little bit from realtime, maybe we’ll get 22 frames a second, or 23, but then the director walked out after five minutes and said, ‘No. This won’t work.’ He couldn’t care less about the resolution because it was always about realtime and solid playback. Luckily, we learned our lesson pretty quickly and learned it well! In Deluxe Creative Services, that still is the number one priority.”

It’s also helped him cut through unnecessary sales pitches from storage vendors unfamiliar with Deluxe’s business. “When I talk to them, I say, ‘Don’t tell me about bit rates. I’m going to tell you a frame rate I want to hit and a resolution, and you tell me if we can hit it or not with your solution. I don’t want to argue bits; I want to tell you this is what I need to do and you’re going to tell me whether or not your storage can do that.’ The storage vendors that we’re going to bank our A-client work on better understand fundamentally what we need.”

Because some of the Deluxe company brands share office space — Method and Company 3 moved into a 63,376-square-foot former warehouse in Santa Monica a few years ago — they have access to the same storage infrastructure. “But there are often volumes specially purpose-built for a particular job,” says Chiado. “In that way, we’ve created volumes focused on supporting 4K feature work and others set up specifically for CG desktop environments that are shared across 400 people in that one building. We also have similar business units in Company 3 and Efilm, so sometimes it makes sense that we would want, for artist or client reasons, to have somebody in a different location from where the data resides. For example, having the artist in Santa Monica and the director and DP in Hollywood is something we do regularly.”

Chiado says Deluxe has designed and built with network solution and storage solution providers a system “that suits our needs. But for the most part, we’re using off-the-shelf products for storage. The magic is how we tune them to be able to work with our systems.”

Those vendors include Quantum, DDN Storage and EMC’s network-attached storage Isilon. “For our most robust needs, like 4K feature workflows, we rely on DDN,” he says. “We’ve actually already done some 8K workflows. Crazy world we live in!” For long-term archiving, each Deluxe Creative Service location worldwide has an LTO-tape robot library. “In some cases, we’ll have a near-line tier two volume that stages it. And for the past few years, we’re using object storage in some locations to help with that.”

Although the entire group of Deluxe divisions and offices are linked by a robust 10GigE network that sometimes takes advantage of dark fiber, unused fiber optic cables leased from larger fiber-optic communications companies, Chiado says the storage they use is all very specific to each business unit. “We’re moving stuff around all the time but projects are pretty much residing in one spot or another,” he says. “Often, there are a thousand reasons why — it may be for tax incentives in a particular location, it may be for project-specific needs. Or it’s just that we’re talking about the London and LA locations.”

With one eye on the future and another on budgets, Chiado says pooled storage has helped DCS keep costs down while managing larger and larger subsets of data-heavy projects. “We are always on the lookout for ways to handle the next thing, like the arrival of 8K workflows, but we’ve gained huge, huge efficiencies from pooled storage,” he says. “So that’s the beauty of what we build, specific to each of our world locations. We move it around if we have to between locations but inside that location, everybody works with the content in one place. That right there was a major efficiency in our workflows.”

Beyond that, he says, how to handle 8K is still an open question. “We may have to make an island (so far it’s just been testing), but we do everything we can to keep it in one place and leverage whatever technology is required for the job,” Chiado says. “We have isolated instances of SSDs (solid-state drives) but we don’t have large-scale deployment of SSDs yet. On the other end, we’re working with cloud vendors, too, to be able to maximize our investments.”

Although the company is still working through cloud security issues, Chiado says Deluxe is “actively engaging with cloud vendors because we aren’t convinced that our clients are going to be happy with the security protocols in place right now. The nature of the business is we are regularly involved with our clients and MPAA and have ongoing security audits. We also have a group within Deluxe that helps us maintain the best standards, but each show that comes in may have its own unique security needs. It’s a constant, evolving process. It’s been really difficult to get our heads and our clients’ heads around using the cloud for rendering, transcoding or for storage.”

Luckily, that’s starting to change. “We’re getting good traction now, with a few of the studios getting ready to greenlight cloud use and our own pipeline development to support it,” he adds. “They are hand in hand. But I think once we move over this hurdle, this is going to help the industry tremendously.”

Beyond those longer-term challenges, Chiado says the day-to-day demands of each division haven’t changed much. “Everybody always needs more storage, so we are constantly looking at ways to make that happen,” he says. “The better we can monitor our storage and make our in-house people feel comfortable moving stuff off near-line to tape and bring it back again, the better we can put the storage where we need it. But I’m very optimistic about the future, especially about having a relief valve in the cloud.”

Our main image is the shared 4K theater at Company 3 and Method.

Talking VR content with Phillip Moses of studio Rascali

Phillip Moses, head of VR content developer Rascali, has been working in visual effects for over 25 years. His resume boasts some big-name films, including Alice in Wonderland, Speed Racer and Spider-Man 3, just to name a few. Seven years ago he launched a small boutique visual effects studio, called The Resistance VFX, with VFX supervisor Jeff Goldman.

Two years ago, after getting a demo of an Oculus pre-release Dev Kit 2, Moses realized that “we were poised on the edge of not just a technological breakthrough, but what will ultimately be a new platform for consuming content. To me, this was a shift almost as big as the smartphone, and an exciting opportunity for content creators to begin creating in a whole new ecosystem.”

Phillip Moses

Shortly after that, his friends James Chung and Taehoon Oh launched Reload Studios, with the vision of creating the first independently-developed first-person shooter game, designed from the ground up for VR. “As one of the first companies formed around the premise of VR, they attracted quite a bit of interest in the non-gaming sector as well,” he explains. “Last year, they asked me to come aboard and direct their non-gaming division, Rascali. I saw this as a huge opportunity to do what I love best: explore, create and innovate.”

Rascali has been busy. They recently debuted trailers for their first episodic VR projects, Raven and The Storybox Project, on YouTube, Facebook/Oculus Video, Jaunt, Littlstar, Vrideo and Samsung MilkVR. Let’s find out more…

You recently directed two VR trailers. How is directing for VR different than directing for traditional platforms?
Directing for VR is a tricky beast and requires a lot of technical knowledge of the whole process that would not normally be required of directors. To be fair, today’s directors are a very savvy bunch, and most have a solid working knowledge of how visual effects are used in the process. However, for the way I have chosen to shoot the series, it requires the ability to have a pretty solid understanding of not just what can be done, but how to actually do it. To be able to previsualize the process and, ultimately, the end result in your head first is critical to being able to communicate that vision down the line.

Also, from a script and performance perspective, I think it’s important to start with a very important question of “Why VR?” And once you believe you have a compelling answer to that question, then you need to start thinking about how to use VR in your story.  Will you require interaction and participation from the viewer? Will you involve the viewer in any way? Or will you simply allow VR to serve as an additional element of presence and immersion for the viewer?

While you gain many things in VR, you also have to go into the process with a full knowledge of what you ultimately lose. The power of lenses, for example, to capture nuance and to frame an image to evoke an emotional response, is all but lost. You find yourself going back to exploring what works best in a real-world framing — almost like you are directing a play in an intimate theater.

What is the biggest challenge in the post workflow for VR?
Rendering! Everything we are producing for Raven is at 4K left eye, 4K right eye and 60fps. The rendering process alone guarantees that the process will take longer than you hoped. It also guarantees that you will need more data storage than you ever thought necessary.

But other than rendering, I find that the editorial process is also more challenging. With VR, those shots that you thought you were holding onto way too long are actually still too short, and it involves an elaborate process to conform everything for review in a headset between revisions. In many ways, it’s similar to the old process of making your edit decisions, then walking the print into the screening room. You forget how tedious the process can be.
By the way, I’m looking forward to integrating some realtime 360 review into the editorial process. Make it happen Adobe/Avid!
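To put rough numbers on Moses’ rendering-and-storage point, here is a hypothetical back-of-the-envelope calculation; Rascali hasn’t published its actual frame sizes or bit depth, so the figures below are placeholders:

```python
# Illustrative math on why stereo 4K/60 VR renders eat storage.
width, height = 3840, 2160   # assumed "4K" resolution per eye
eyes, fps = 2, 60
bits_per_pixel = 30          # assumed 10-bit RGB, uncompressed

bits_per_sec = width * height * eyes * fps * bits_per_pixel
gb_per_min = bits_per_sec * 60 / 8 / 1e9
print(f"{bits_per_sec / 1e9:.1f} Gb/s uncompressed")  # ~29.9 Gb/s
print(f"~{gb_per_min:.0f} GB per minute")             # ~224 GB/min
```

At those rates, a single 15-minute episode is north of 3TB before any compression is applied.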

These trailers are meant to generate interest from production partners to green light these as full episodic series. What is the intended length of each episode, and what’s the projected length of time from concept to completion for each episode of the all-CG Storybox, and live-action Raven?
Each one of these projects is designed for completely different audiences, so the answer is a bit different for each one. For Storybox, we are looking to keep each episode under five minutes, with the intention that it is a fairly easy-to-consume piece of content that is accessible to a broad spectrum of ages. We really hope to make the experiences fun, playful and surprising for the viewer, and to create a context for telling these stories that fuels the imagination of kids.

For Storybox, I believe that we can start delivering finished episodes before the end of the third quarter — with a full season representing 12 to 15 episodes. Raven, on the other hand, is a much more complex undertaking. While the VR market is being developed, we are betting on the core VR consumers to really want stories and experiences that range closer to 12 to 15 minutes in duration. We feel this is enough time to tell more complex stories, but still make each episode feel like a fantastic experience that they could not experience anywhere else. If green-lit tomorrow, I believe we would be looking at a four-month production schedule for the pilot episode.

Rascali is a division of Reload Studios, which is developing VR games. Is there a technology transfer of workflows and pipelines and shared best practices across production for entertainment content and games within the company?
Absolutely! While VR is a new technology, there is such a rich heritage of knowledge present at Reload Studios. For example, one question that VR directors are asking themselves is: “How can I direct my audience’s attention to action in ways that are organic and natural?” While this is a new question for film directors — who typically rely on camera to do this work for them — this is a question that the gaming community has been answering for years. Having some of the top designers in the game industry at our disposal is an invaluable asset.

That being said, Reload is much different than most independent game companies. One of their first hires was senior Disney animator Nik Ranieri. Our producing team is composed of top animation producers from Marvel and DC. We have a deep bench of people who give the whole company a very comprehensive knowledge of how content of all types is created.

What was the equipment set-up for the Raven VR shoot? Which camera was used? What tools were used in the post pipeline?
Much of the creative IP for Raven is very much in development, including designs, characters, etc. For this reason, we elected to construct a teaser that highlighted immersive VR vistas that you could expect in the world we are creating. This required us to lean very heavily on the visual effects / CG production process — the VFX pipeline included Autodesk 3ds Max, rendering in V-Ray, with some assistance from Nuke and even Softimage XSI. The entire project was edited in Adobe Premiere.

For our one live-action element, this was shot with a single Red camera, and then projected onto geometry for accurate stereo integration.

Where do you think the prevailing future of VR content is? Narrative, training, therapy, gaming, etc.?
I think your question represents the future of VR. Games, for sure, are going to be leading the charge, as this demographic is the only one on a large scale that will be purchasing the devices required to build a viable market. But much more than games, I’m excited to see growth in all of the areas you listed above, including, most significantly, education. Education could be a huge winner in the growing VR/AR ecosystem.

The reason I elected to join Rascali is to help provide solutions and pave the way for solutions in markets that mostly don’t yet exist.  It’s exciting to be a part of a new industry that has the power to improve and benefit so many aspects of the global community.

UHD Alliance’s Victor Matsuda: updates from NAB 2016

Victor Matsuda from the UHD Alliance was at NAB 2016. The Alliance was formed about 15 months ago as 4K UHD products began exploding into the market. The goal of the Alliance was to establish certifications for these new products and for content. All of this is to ensure a quality experience for consumers, who will ultimately drive 4K/UHD adoption throughout the market.

Watch our video with Matsuda to find out more.

Digging Deeper: NASA TV UHD executive producer Joel Marsden

It’s hard to deny the beauty of images of Earth captured from outer space. And NASA and partner Harmonic agree, boldly going where no one has gone before — creating NASA TV UHD, the first non-commercial consumer UHD channel in North America. Leveraging the resolution of ultra high definition, the channel gives viewers a front row seat to some gorgeous views captured from the International Space Station (ISS), other current NASA missions and remastered historical footage.

We recently reached out to Joel Marsden, executive producer of NASA TV UHD, to find out how this exciting new endeavor reached “liftoff.”

Joel Marsden

This was obviously a huge undertaking. How did you get started and how is the channel set up?
The new channel was launched with programming created from raw video footage and imagery supplied by NASA. Since that time, Harmonic has also shot and contributed 4K footage, including video of recent rocket launches. They provide the end-to-end UHD video delivery system and post production services while managing operations. It’s all hosted at a NASA facility managed by Encompass Digital Media in Atlanta, which is home to the agency’s satellite and NASA TV hubs.

Like the current NASA TV channels, and on the same transponder, NASA TV UHD is transmitted via the SES AMC-18C satellite, in the clear, with a North American footprint. The channel is delivered at 13.5Mbps, as compared with many of the UHD demo channels in the industry, which have required between 50 and 100 Mbps. NASA’s ability to minimize bandwidth use is based on a combination of encoding technology from Harmonic in conjunction with the next-generation H.265 HEVC compression algorithm.
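As a rough sanity check on those figures (assuming a 3840×2160 channel at 60fps; the article doesn’t specify the frame rate), the per-pixel bit budgets compare like this:

```python
# Bits-per-pixel comparison of NASA TV UHD's 13.5 Mbps HEVC feed
# against the 50-100 Mbps UHD demo channels mentioned above.
pixels_per_sec = 3840 * 2160 * 60  # assumed 2160p at 60fps

for label, mbps in [("NASA TV UHD (HEVC)", 13.5),
                    ("demo channel, low end", 50),
                    ("demo channel, high end", 100)]:
    bpp = mbps * 1e6 / pixels_per_sec
    print(f"{label}: {bpp:.3f} bits/pixel")
# NASA's feed spends ~0.027 bits per pixel, a roughly 4-7x bandwidth
# saving attributable to H.265/HEVC plus encoder tuning.
```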

Can you talk about how the footage was captured and how it got to you for post?
When the National Aeronautics and Space Act of 1958 was created, one of the legal requirements of NASA was to keep the public apprised of its work in the most efficient means possible and with the ultimate goal of bringing everyone on Earth as close as possible to being in space. Over the years, NASA has used imagery as the primary means of demonstration. The group in charge of these efforts, the NASA Imagery Experts Program, provides the public with a wide array of digital television, web video and still images based on the agency’s activities. Today, NASA’s broadcast offerings via NASA TV include an HD consumer channel, an HD media channel and an SD education channel.

In 2015, the agency introduced NASA TV UHD. Naturally, NASA archives provide remastered footage from historical missions and shots from NASA’s development and training processes, all of which are used for production of broadcast programming. In fact, before the agency launched NASA TV, it had already begun production of its own documentary series, based on footage collected during missions.

Just five or six years ago, NASA also began documenting major events in 4K resolution or higher. The agency has been using 6K Red Dragon digital cinema cameras for some time. NASA TV UHD video content is sourced from high-resolution images and video generated on the ISS, Hubble Space Telescope and other current NASA missions. The raw content files are then sent to Harmonic for post.

Can you walk us through the workflow?
Raw video files are mailed on physical discs or sent via FTP from a variety of NASA facilities to Harmonic’s post studio in San Jose and stored on the Harmonic MediaGrid system, which supports an edit-in-place workflow with Final Cut Pro and other third-party editing tools.

During the content processing phase, Harmonic uses Adobe After Effects to paint out dead pixels that result from the impact of cosmic radiation on camera sensors. They have built bad-pixel maps that they use in post production to remove the distracting white dots from the picture. The detail of UHD means that the footage also shows scratches on the windows of the ISS through which the camera is shooting, but these are left in for authenticity.
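Harmonic does this work inside After Effects, but the underlying bad-pixel-map idea is simple enough to sketch. Here is a hypothetical NumPy version that replaces each mapped dead pixel with the median of its 3×3 neighborhood (the dead pixel itself is outvoted by its eight neighbors):

```python
import numpy as np

def repair_dead_pixels(frame, bad_pixel_map):
    """Replace mapped dead pixels with the median of their 3x3 window.

    A toy stand-in for the bad-pixel-map cleanup described above;
    Harmonic's actual pipeline runs in After Effects, not NumPy.
    frame:         HxWx3 array holding one video frame
    bad_pixel_map: (row, col) sensor positions known to be dead
    """
    fixed = frame.copy()
    h, w = frame.shape[:2]
    for r, c in bad_pixel_map:
        # 3x3 window clamped to the frame edges
        r0, r1 = max(r - 1, 0), min(r + 2, h)
        c0, c1 = max(c - 1, 0), min(c + 2, w)
        window = frame[r0:r1, c0:c1].reshape(-1, 3)
        fixed[r, c] = np.median(window, axis=0)
    return fixed

# The map is built once per camera, then applied to every frame.
frame = np.random.randint(0, 1024, (2160, 3840, 3), dtype=np.uint16)
clean = repair_dead_pixels(frame, bad_pixel_map=[(108, 422), (1920, 77)])
```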

 

Blackmagic’s DaVinci Resolve is used to color grade footage, and Maxon Cinema 4D Studio is used to create animations of images. Final Cut Pro X and Adobe Creative Suite are used to set the video to music and add text and graphics, along with the programming name, logo and branding.

Final programs are then transferred in HD back to the NASA teams for review, and in UHD to the Harmonic team in Atlanta to be loaded onto the Spectrum X for playout.

————

You can check out NASA TV’s offerings here.

A glimpse at what Sony has in store for NAB

By Fergus Burnett

I visited Sony HQ in Manhattan for their pre-NAB Show press conference recently. In a board room with tiny muffins, mini bagels and a great view of New York, we sat pleasantly for a few hours to learn about the direction the company is taking in 2016.

Sony announced details for a slew of 4K- and HDR-capable broadcast cameras and workflow systems, all backward-compatible with standard HD to ease the professional and consumer transition to Ultra HD.

As well as broadcast and motion picture, Sony’s Pro division has a finger in the corporate, healthcare, education and faith markets. They have been steadily pushing their new products and systems into universities, private companies, hospitals and every other kind of institution. Last year, they helped to fit out the very first 4K church.

I work as a DIT/dailies technician in the motion picture industry rather than broadcast, so many of these product announcements were outside my sphere of professional interest, but it was fascinating to gain an understanding of the immense scale and variety of markets that Sony is working in.

There were only a handful of new additions to the CineAlta (pictured) line: firmware updates for the F5 and F55, and a new 4K recording module. These two cameras have really endured in popularity since their introduction in 2012.

The new AXS-R7 recording module (right) offers a few improvements over its predecessor the AXS-R5. It’s capable of full 4K up to 120fps and has a nifty 30-second cache capability, which is going to be really useful for shooting water droplets in slow motion. The AXS-R7 uses a new kind of high-speed media card that looks like a slightly smaller SxS — it’s called AXSM-S48. Sony is really on fire with these names!

A common and unfortunate problem when I am dealing with on-set dailies is sketchy card readers. This is something that ALL motion picture camera companies are guilty of producing. USB 3.0 is just not fast enough when copying huge chunks of critical camera data to multiple drives, and I’ve found the power connector on the current AXS card reader to be touchy on separate occasions with different readers, causing the card to eject in the midst of offloading. Though there are no details yet, I was assured that the AXSM-S48 reader would use a faster connection than USB 3.0. I certainly hope so; it’s a weak point in what is otherwise a fairly trouble-free camera ecosystem.
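To put rough numbers on that bottleneck, here is an illustrative calculation with a hypothetical 512GB mag and ~350MB/s of real-world USB 3.0 throughput (actual cards, readers and copy tools vary):

```python
# Why USB 3.0 feels slow on set: offload time for one camera mag.
card_gb = 512            # placeholder mag capacity
usb3_mb_per_s = 350      # realistic sustained USB 3.0 throughput

one_copy_min = card_gb * 1000 / usb3_mb_per_s / 60
print(f"~{one_copy_min:.0f} min per copy")           # ~24 min
print(f"~{one_copy_min * 2:.0f} min to two drives")  # if copied in series
# Nearly an hour per mag for a standard two-backup workflow,
# hence the appetite for a faster-than-USB-3.0 reader connection.
```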

Looming at the top of the CineAlta lineup, the F65 is still Sony’s flagship camera for cinema production. Its specs were outrageous four years ago and still are, but it never became a common sight on film sets. The 8K resolution was mostly unnecessary even for top-tier productions. I inquired where Sony saw the F65 sitting among its competition from Arri and Red, as well as their own F55, which has become a staple of TV drama.

Sony sees the F65 as their true cinema camera, ideally suited for projection on large screens. They admitted that while uptake of the camera was slow after its introduction, rentals have been increasing as more DPs gain experience with the camera, enjoying its low-light capabilities, color gamut and sheer physical bulk.

Sony manufactures a gigantic fleet of sensible, soberly named cameras for every conceivable purpose. They are very capable production tools, but it’s only a small part of Sony’s overall strategy.

With 4K HDR delivery fast becoming standard and expected, we are headed for a future world where pictures are more appealing than reality. From production to consumption, Sony could well be set to dominate that world. We already watch Sony-produced movies shot on Sony cameras playing on Sony screens, and we listen to Sony musicians on Sony stereos as we make our way to worship the God of sound and vision in a 4K church.

Enjoy NAB everyone!

Panasonic offers compact 4K Super 35 VariCam LT

At an event held at the Directors Guild Theater in LA, Panasonic introduced the VariCam LT, its next generation of 4K cinema cameras. The lightweight VariCam LT camcorder features the Super 35mm sensor and imaging capabilities of the VariCam 35, but with reductions in size, weight and price.

Incorporating this identical imaging “DNA” in a more compact rendition, the VariCam LT (model AU-V35LT1G) delivers 14+ stops of dynamic range with V-Log, and cinematic VariCam image quality and color science, as well as the VariCam 35’s dual native ISOs of 800/5000.

Weighing in at just under six pounds, the VariCam LT is suited for handheld, Steadicam, jib, crane, drone, gimbal and overall cinéma vérité work. The VariCam LT will also target owner/operators, independent filmmakers, documentary makers and corporate productions.

The VariCam LT will be available at the end of March in two packages, with a suggested list price of $18,000 (body only) and $24,000 (body + AU-VCVF10G viewfinder).

The VariCam LT handles formats ranging from HD and 2K up to UHD and 4K and, like the VariCam 35, is fully capable of high dynamic range (HDR) field capture. The new 4K camcorder offers Apple ProRes 4444 (up to 30p) and ProRes 422 HQ (up to 60p) support for HD recording, as well as Panasonic’s AVC-ULTRA family of advanced video codecs.

New codecs introduced in the VariCam LT include AVC-Intra LT and AVC-Intra 2K-LT, both of which are designed to offer capture rates up to 240fps in imager crop mode, ideal for sports and other fast motion footage.

The new camera features color management capabilities along with VariCam’s extended color gamut and support for the Academy Color Encoding System (ACES) workflow, which allows for full fidelity mastering of original source material. The VariCam LT offers in-camera color grading, with the ability to record an ungraded 4K master along with all on-set grading metadata. A new color-processing feature is V-Look, which acts as a blend of V-Log and video, and allows filmic documentary acquisition without the same need for intense color grading.

The VariCam LT differs from the VariCam 35 in being a one-piece, short-bodied camcorder versus a two-piece camera head plus recorder. While the VariCam LT does not feature parallel sub-recording, it does have an SD slot for high-resolution proxy recording. Proxy files can be wirelessly uploaded via FTP, which facilitates wireless color grading. Variable frame rates are available with LongG6 recording.

There is one expressP2 card for all formats, including high frame rate and HD/2K/UHD and 4K recording (the 256GB expressP2 card can record up to 90 minutes of 4K/4:2:2/23.98p content). RAW output from SDI will likely be supported by a firmware upgrade in early summer 2016.
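Taking those quoted figures at face value, the 90-minute capacity implies a sustained record rate in the high 300Mb/s range; a quick check:

```python
# Sanity check on the quoted expressP2 capacity: 90 minutes of
# 4K/4:2:2/23.98p on a 256GB card implies this sustained bitrate.
card_gb, minutes = 256, 90
mbps = card_gb * 8 * 1000 / (minutes * 60)  # decimal GB to megabits/s
print(f"~{mbps:.0f} Mb/s sustained")        # ~379 Mb/s
```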

The VariCam LT features an EF mount (vs. the VariCam 35’s PL mount), suitable for the wide array of lenses available for smaller cameras. The EF mount can be switched out to a robust standard PL mount, expanding the range of compatible lenses that can be used. The control panel can be separated from the camera body to facilitate realtime control and easy menu access. The camcorder has a production-tough magnesium body to assure durability and reliability in challenging shooting locations.

Other features new to the VariCam LT are power hot swap, IR shooting (further enhancing the camcorder’s extreme low-light capture at ISO 5000), 23.98 PsF output and image presets as scene files.

Among the camcorder’s top-level production assets are ND filters (CLEAR, 0.6, 1.2, 1.8), an optional OLED electronic viewfinder (EVF) with optical zoom functionality, 24-bit LPCM audio for in-camera audio master recording, Focus Assist, anamorphic lens de-squeeze, special REC functions (PreRec, interval, one-shot), IP control via Panasonic’s AK-HRP200 camera remote controller, and built-in GPS.

Pro interfaces include 3G-HD-SDI x 3 (SDI-OUT x 2 and VF), LAN, genlock in, timecode in/out, USB 2.0 host and USB 2.0 device (mini-B), and three XLR inputs (one 5-pin, two 3-pin) to record four channels of 24-bit, 48kHz audio. In addition, its flexible interfaces allow use of the Panasonic AU-VCVF10G viewfinder, as well as third-party viewfinder solutions.

Light Iron beefs up TV division, adds colorist Jeremy Sawyer

President Michael Cioni discusses increased episodic work and his studio’s growth.

The quality of television programming — broadcast, cable and streaming — has never been better… from the writing to the acting to the final look of the shows. In response to the new business this production has brought to its facility, Light Iron is growing its television episodic division with talent and gear.

New hire Jeremy Sawyer is a colorist who brings with him a wealth of experience with TV, including grading The Walking Dead, The Closer, South Park, Major Crimes, Limitless and The Affair. He comes to Light Iron from MTI.  Prior to that he spent time at Company 3, The Syndicate and Finish Post.

Light Iron’s Hollywood location is adding a second television bay, a new online room and a dailies department for in-house and overnight dailies. Expect a similar expansion at the company’s New York studio in early 2016. In both cases, new hardware has been added specifically for television workflow, such as UHD and HDR monitors and dedicated SAN storage.

Michael Cioni

“We are coloring with the Sony BVM-X300, which satisfies our needs for HDR 4K displays,” explains Light Iron president Michael Cioni. “We are also using the Sony 940C for a consumer confidence check on HDR 4K material, which our clients appreciate. Our newest, optimized one-petabyte SAN comes from Quantum and runs StorNext 5.”

 

Sawyer’s upcoming projects at Light Iron include Season 6 of AMC’s The Walking Dead, Season 1 of History Channel’s Live to Tell and Season 1 of OWN’s Greenleaf. The post house, which is a Panavision company, says to expect more hires in the near future.

In terms of color grading gear, Sawyer is currently using Blackmagic DaVinci Resolve 12 Studio, running on Supermicro computers with multiple Nvidia GeForce GTX Titan X graphics cards optimized for 4K 60p content, which Light Iron is already handling on one of its new shows, Wheeler Dealers for Discovery Channel.

“Episodic projects make up about a third of our DI business in Los Angeles right now,” reports Cioni. “We expect to increase episodic finishing significantly in 2016 at our Los Angeles and New York facilities. Our newest location in New Orleans will support dailies and editorial for both episodic and feature projects.”

One can’t help but wonder how much of this television work is thanks to streaming services now creating their own content. “The truth is that OTT episodic content owners, such as Amazon and Netflix, are very interested in future-proofing their investments by embracing the same elements that Light Iron has been championing for years: file-based capture, mobile post, high dynamic range, wide color gamut and 4K-plus resolutions,” explains Cioni. “Our broadband clients are helping drive many of these innovations, and we’re excited that the balance of projects is shifting.”

Earlier in this piece, Cioni referenced Light Iron’s new studio in New Orleans. This location is part of parent company Panavision’s new 30,500-square-foot space, which will also house Light Iron’s first brick-and-mortar facility in Louisiana. The facility represents the first location the companies have shared since Panavision acquired Light Iron at the start of 2015.