Category Archives: 4k

Hollywood’s Digital Jungle moves to Santa Clarita

Digital Jungle, a long-time Hollywood-based post house, has moved its operations to a new facility in Santa Clarita, California, which has become a growing hub for production and post in the suburbs of Los Angeles. The new headquarters is now home to both Digital Jungle Post and its recent offshoot Digital Jungle Pictures, a feature film development and production studio.

“I don’t mind saying, it was a bit of an experiment moving to Santa Clarita,” explains Digital Jungle president and chief creative Dennis Ho. “With so many filmmakers and productions working out here — including Disney/ABC Studios, Santa Clarita Studios and Universal Locations — this area has developed into a vast untapped market for post production professionals. I decided that now was a good time to tap into that opportunity.”

Digital Jungle’s new facility offers the full complement of digital workflow solutions for HD to 4K. The facility has multiple suites featuring Smoke, DaVinci Resolve, audio recording via Avid’s S6 console and Pro Tools, production offices, a conference area, a full kitchen and a client lounge.

Digital Jungle is well into the process of adding further capabilities, with a new high-end luxury 4K DI theater and screening room, greenscreen stage, VFX bullpen, multiple edit bays and additional production offices as part of its phase-two build-out.

Digital Jungle Post services include DI/color grading; VFX/motion graphics; audio recording/mixing and sound design; ADR and VO; HD to 4K deliverables for tape and data; DCI and DCDM; promo/bumper design and film/television title design.

Commenting on Digital Jungle Pictures, Ho says, “It was a natural step for me. I started my career by directing and producing promos and interstitials for network TV, studios and distributors. I think that our recent involvement in producing several independent films has enhanced our credibility on the post side. Filmmakers tend to feel more comfortable entrusting their post work to other filmmakers. One example is we recently completed audio post and DI for a new Hallmark film called Love at First Glance.”

In addition to Love at First Glance, Digital Jungle Pictures’ recent projects include indie films Day of Days, A Better Place (available now on digital and DVD) and Broken Memories, which was screened at the Sedona Film Festival.

 

Mozart in the Jungle

The colorful dimensions of Amazon’s Mozart in the Jungle

By Randi Altman

How do you describe Amazon’s Mozart in the Jungle? Well, in its most basic form it’s a comedy about the changing of the guard — or maestro — at the New York Philharmonic, and the musicians that make up that orchestra. When you dig deeper you get a behind-the-scenes look at the back-biting and crazy that goes on in the lives and heads of these gifted artists.

Timothy Vincent

Based on the novel Mozart in the Jungle: Sex, Drugs, and Classical Music by oboist Blair Tindall, the series — which won the Golden Globe last year and was nominated this year — has shot in a number of locations over its three seasons, including Mexico and Italy.

Since its inception, Mozart in the Jungle has been finishing in 4K and streaming in both SDR and HDR. We recently reached out to Technicolor’s senior color timer, Timothy Vincent, who has been on the show since the pilot, to find out more about the show’s color workflow.

Did Technicolor have to gear up infrastructure-wise for the show’s HDR workflow?
We were doing UHD 4K already and were just getting our HDR workflows worked out.

What is the workflow from offline to online to color?
The dailies are done in New York based on the Alexa K1S1 709 LUT. (Technicolor On-Location Services handled dailies out of Italy, and Technicolor PostWorks in New York.) After the offline and online, I get the offline reference made with the dailies so I can look at it if I have a question about what was intended.
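For context, Alexa dailies like these start from ARRI's LogC encoding; a viewing LUT such as K1S1 maps that flat log signal to Rec.709 for monitoring. A minimal sketch of the LogC (v3, EI 800) transfer curve, using ARRI's published constants (the helper names are ours, and this is the curve only, not the full K1S1 LUT):

```python
import math

# ARRI's published LogC (v3) parameters for EI 800. A viewing LUT such
# as K1S1 starts from this log encoding and maps it on to Rec.709.
CUT, A, B = 0.010591, 5.555556, 0.052272
C, D, E, F = 0.247190, 0.385537, 5.367655, 0.092809

def linear_to_logc(x):
    """Scene-linear reflectance (0.18 = mid gray) -> LogC code value."""
    return C * math.log10(A * x + B) + D if x > CUT else E * x + F

def logc_to_linear(t):
    """Inverse: LogC code value -> scene-linear reflectance."""
    return (10 ** ((t - D) / C) - B) / A if t > E * CUT + F else (t - F) / E

# 18% gray lands near code value 0.39, which is why ungraded log
# footage looks washed out before a LUT is applied.
mid_gray_code = linear_to_logc(0.18)
```

Round-tripping a value through both functions returns the original, which is the property a dailies pipeline relies on when it bakes a LUT for offline and grades from the camera original later.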

If someone was unsure about watching in HDR versus SDR, what would you tell them?
The emotional feel of both the SDR and the HDR is the same. That is always the goal in the HDR pass for Mozart. One of the experiences that is enhanced in the HDR is the depth of field and the three-dimensional quality you gain in the image. This really plays nicely with the feel in the landscapes of Italy, the stage performances where you feel more like you are in the audience, and the long streets of New York just to name a few.

When I’m grading the HDR version, I’m able to retain more highlight detail than I was in the SDR pass. For someone who has not yet been able to experience HDR, I would actually recommend that they watch an episode of the show in SDR first and then in HDR so they can see the difference between them. At that point they can choose what kind of viewing experience they want. I think that Mozart looks fantastic in both versions.

What about the “look” of the show? What kind of direction were you given?
We established the look of the show based on conversations and collaboration in my bay. It has always been a filmic look with soft blacks and yellow warm tones as the main palette for the show. Then we added in a fearlessness to take the story in and out of strong shadows. We shape the look of the show to guide the viewers to exactly the story that is being told and the emotions that we want them to feel. Color has always been used as one of the storytelling tools on the show. There is a realistic beauty to the show.

What was your creative partnership like with the show’s cinematographer, Tobias Datum?
I look forward to each episode and discovering what Tobias has given me as palette and mood for each scene. For Season 3 we picked up where we left off at the end of Season 2. We had established the look and feel of the show and only had to account for a large portion of Season 3 being shot in Italy, making sure to capture the different quality of light and the warmth and beauty of the country. We did this by playing with natural, warm skin tones and the contrast of light and shadow he was creating for the different moods and locations. The same can be said for the two episodes shot in Mexico in Season 2. I now know what Tobias likes and can make decisions I’m confident he will like.

From a director and cinematographer’s point of view, what kind of choices does HDR open up creatively?
It depends on if they want to maintain the same feel of the SDR or if they want to create a new feel. If they choose to go in a different direction, they can accentuate the contrast and color more with HDR. You can keep more low-light detail while being dark, and you can really create a separate feel to different parts of the show… like a dream sequence or something like that.

Any workflow tricks/tips/trouble spots within the workflow or is it a well-oiled machine at this point?
I have actually changed the way I grade my shows based on the evolution of this show. My end results are the same, but I learned how to build grades that translate to HDR much more easily and consistently.

Do you have a color assistant?
I have a couple of assistants that I work with who help me with prepping the show, getting proxies generated, color tracing and some color support.

What tools do you use — monitor, software, computer, scope, etc.?
I am working on Autodesk Lustre 2017 on an HP Z840, while monitoring on both a Panasonic CZ950 and a Sony X300. I work on Omnitek scopes off the downconverter to 2K. The show is shot on both Alexa XT and Alexa Mini, framing for 16×9. All finishing is done in 4K UHD for both SDR and HDR.

Anything you would like to add?
I would only say that everyone should be open to experiencing both SDR and HDR and giving themselves that opportunity to choose which they want to watch and when.


Digital Anarchy’s new video sharpening tool for FCP X, Premiere and AE

Digital Anarchy has released Samurai Sharpen for Video, a sharpening tool for artists using Apple FCP X, Adobe After Effects and Adobe Premiere Pro. Samurai Sharpen is edge-aware sharpening software that uses a number of intelligent algorithms, as well as GPU acceleration, to give users working with HD and 4K footage precise image quality control.

Samurai Sharpen gives video editors and colorists the ability to sharpen the details they want and to be highly selective about the areas they want left untouched. This means areas like skin tones or dark, noisy backgrounds, which usually do not look better sharpened, can be protected from sharpening.

The built-in masks within the software allow editors and colorists to apply specific focus on areas within the video for sharpening, while avoiding dark, shadow areas and bright, highlight areas. This allows for maintaining pristine image quality throughout the shot. The edge-aware algorithm provides another level of masking by restricting the sharpening to only the significant details you choose. For example, enhancing details of an actor’s eyes while preventing areas like skin from being affected.
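Digital Anarchy hasn't published Samurai Sharpen's algorithms, but the edge-aware idea described above can be sketched generically: an unsharp mask whose strength is gated by an edge mask, so flat regions like skin or noisy shadows pass through untouched. A toy NumPy illustration (our own, not the shipping code):

```python
import numpy as np

def blur3(img):
    """3x3 mean blur with edge replication (toy stand-in for a Gaussian)."""
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def edge_aware_sharpen(img, amount=1.0, edge_threshold=0.05):
    """Unsharp mask gated by an edge mask: only pixels near strong
    gradients are sharpened; flat or noisy regions pass through."""
    low = blur3(img)
    detail = img - low                  # high-frequency detail to boost
    gy, gx = np.gradient(low)           # crude edge-strength estimate
    mask = np.clip((np.hypot(gx, gy) - edge_threshold) / edge_threshold,
                   0.0, 1.0)
    return np.clip(img + amount * mask * detail, 0.0, 1.0)
```

On a step edge the transition gets accentuated while flat areas come back unchanged, which is the behavior the built-in masks described above are aiming for; a real implementation would use a proper Gaussian and more robust edge detection.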

While the software supports After Effects, Premiere Pro and Final Cut Pro X currently, support for Avid and OpenFX is coming soon. Samurai Sharpen is available now. It’s regularly priced at $129, but Digital Anarchy is offering it for $99 until November 15.


Quick Chat: Josh Haynie, Light Iron’s VP of US operations

Post services company Light Iron has named veteran post pro Josh Haynie VP of US operations, a newly created position. Based in Light Iron’s Hollywood facility, Haynie will be responsible for leveraging the company’s resources across Los Angeles, New York, New Orleans and future locations.

Haynie joins Light Iron after 13 years at Efilm, where, as managing director, he maintained direct responsibility for all aspects of the company’s operations, including EC3 (on-location services), facility dailies, trailers, digital intermediate, home video and restoration. He managed a team of 100-plus employees. Previously, Haynie held positions at Sunset Digital, Octane/Lightning Dubs and other production and post companies. Haynie is an associate member of the ASC and is also actively involved in the HPA, SMPTE, and VES.

“From the expansion of Light Iron’s episodic services and New York facilities to the development of the color science in the new Millennium DXL camera, it is clear that the integration of Panavision and Light Iron brings significant benefits to clients,” says Haynie.

He was kind enough to take time out of his schedule to answer some of our questions…

Your title hints at Light Iron opening up in new territories. Can you talk about this? What is happening in the industry that makes this the right move?
We want to be strategically located near the multiple Panavision locations. Productions and filmmakers need the expertise and familiarity of Light Iron resources in the region with the security and stability of a solid infrastructure. Projects often have splinter and multiple units in various locations, and they demand workflow continuity across these disparate locations. We can help facilitate projects working in those various regions and offer unparalleled support and guidance.

What do you hope to accomplish in your first 6 to 12 months? What are your goals for Light Iron?
I want to learn from this very agile team of professionals and bring in operational and workflow options to the rapidly changing production/post production convergence we are all encountering. We have a very solid footing in LA, NY and NOLA. I want to ensure that each unit is working together using effective skills and technology to collaborate and allow filmmakers creative freedom. My goal is to help navigate this team through the traditional growth patterns as well as the unpredictable challenges that lie ahead in the emerging market.

You have a wealth of DI experience and knowledge. How has DI changed over the years?
The change depends on the elevation. From a very high level, it was the same simple process for many years: shoot, edit, scan, VFX, color — and our hero was always a film print. Flying lower, we have seen massive shifts in technology that have rewritten the playbooks. The DI really starts in the camera testing phase and begins to mature during the production photography stage. The importance of look setting, dailies and VFX collaboration takes on a whole new meaning with each day of shooting.

The image data that is captured needs to be available for near-set cutting while VFX elements are being pulled within a few short days of photography. This image data needs to be light and nimble, albeit massive in file size and run time. The turnarounds are shrinking in the feature space exponentially. We are experiencing international collaboration on the finish and color of each project, and the final render dates are increasingly close to worldwide release dates. We are now seeing a tipping point like we encountered a few years back when we asked ourselves, “Is the hero a print or DCP?” Today, we are at the next hero question: DCP or HDR?

Do you have any advice for younger DI artists based on your history?
I think it is always good to learn from the past and understand how we got here. I would say younger artists need to aggressively educate themselves on workflow, technology, and collaboration. Each craft in the journey has experienced rapid evolution in the last few years. There are many outlets to learn about the latest capture, edit, VFX, sound and distribution techniques being offered, and that research time needs to be on everyone’s daily task list. Seeking out new emerging creative talent is critical learning at this stage as well. Every day, a filmmaker is formulating a vision that is new to the world. We are fortunate here at Light Iron to work with these emerging filmmakers who share the same passion for taking that bold next step in storytelling.


Bluefish444 offering new range of I/O cards with Kronos

Bluefish444, makers of uncompressed 4K/2K/HD/SD video I/O cards for Windows, Mac OS X and Linux, has introduced the Kronos range of video and audio I/O cards. The new line extends the feature set of Bluefish444’s Epoch video cards, which support up to 4K 60 frame per second workflows. Kronos is developed for additional workflows requiring Ultra HD up to 8K, high frame rates up to 120fps, high dynamic range and video over IP.

With these capabilities, Bluefish444 cards are developed to support all areas of TV and feature film production, post, display and restoration, in addition to virtual reality and augmented reality.

Kronos adds video processing technologies including resolution scaling, interlacing and de-interlacing, hardware codec support, and SDI-to-IP and IP-to-SDI conversion, and it continues to offer the 12-bit color space conversion and low-latency capabilities of Epoch.
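De-interlacing in general (this is a generic illustration, not Bluefish444's implementation) reconstructs progressive frames from the two fields of an interlaced frame. A toy "bob" deinterlacer, which fills each field's missing lines by averaging vertical neighbors:

```python
import numpy as np

def bob_deinterlace(frame):
    """Toy "bob" deinterlacer: emit one progressive frame per field,
    filling the other field's lines by averaging vertical neighbors."""
    f = frame.astype(np.float64)
    h = f.shape[0]
    out = []
    for parity in (0, 1):             # 0 = top field, 1 = bottom field
        g = f.copy()
        for y in range(h):
            if y % 2 != parity:       # this line belongs to the other field
                neighbors = [v for v in (y - 1, y + 1)
                             if 0 <= v < h and v % 2 == parity]
                g[y] = sum(f[v] for v in neighbors) / len(neighbors)
        out.append(g)
    return out
```

Hardware deinterlacers use motion-adaptive filtering rather than plain averaging, but the field-and-line bookkeeping is the same.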

“With the choice of HD BNC SD/HD/3G connectivity, or SFP+ connectivity enabling greater than 3G SDI and Video over IP across 10Gbps Ethernet, the Kronos range will suit today’s demanding requirements and cover emerging technologies as they mature,” says product manager Tom Lithgow. “4K ultra high definition, high frame rate and ultra-high frame rate video, high dynamic range, video over IP and hardware assisted processing are now available to OEM developers, professional content creators and system integrators with the Bluefish444 Kronos range.”

HDMI 2.0 I/O, additional SMPTE 2022 IP standards and emerging IP standards are earmarked for future support via firmware update.

Kronos will offer the choice of SDI I/O connectivity with the Kronos elektron featuring eight high-density BNC connectors capable of SD/HD/3G SDI. Each HD BNC connector is fully bi-directional, enabling numerous configuration options, including eight inputs, eight outputs, or a mixture of SDI input and output connections.

The Kronos optikós offers future-proof connectivity with three SFP+ cages in addition to two HD BNC connectors for SD/HD/3G SDI I/O. The SFP+ cages on Kronos optikós provide wide-ranging connectivity options, exposing greater-than-3G SDI, IP connectivity across 10Gb Ethernet, and the flexibility to choose from numerous physical interfaces.

All Kronos cards will have an eight-lane Gen 3 PCIe interface and will provide access to high-bandwidth UHD, high frame rate and high dynamic range video I/O and processing across traditional SDI as well as emerging IP standards such as SMPTE 2022.
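To put those interfaces in perspective, it helps to run the raw numbers. A back-of-the-envelope calculation of uncompressed video payloads (our own arithmetic, not Bluefish444's published figures):

```python
def video_gbps(width, height, fps, bits_per_sample, samples_per_pixel=3.0):
    """Raw active-picture payload in gigabits per second (no blanking)."""
    return width * height * fps * bits_per_sample * samples_per_pixel / 1e9

# UHD 4K at 60fps, 10-bit 4:2:2 (2 samples/pixel on average) is ~10 Gbps,
# which is why quad 3G-SDI links or 10GbE SFP+ are needed to carry it.
uhd_60 = video_gbps(3840, 2160, 60, 10, samples_per_pixel=2)

# 8K at 120fps, 10-bit 4:4:4 is ~119 Gbps raw, far beyond a single link,
# which is where hardware codec support and multi-link I/O come in.
eight_k_120 = video_gbps(7680, 4320, 120, 10)
```

The same arithmetic explains the PCIe Gen 3 x8 host interface: getting even one of these streams to and from system memory in realtime leaves little headroom to spare.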

Kronos specs include:
* SD SMPTE 259M
* HD 1.5G SMPTE 292M
* 3G (A+B) SMPTE 424M
* ASI
* 4:2:2:4 / 4:4:4:4 SDI
* Single Link / Dual Link / Quad Link interfaces
* 12/10-bit SDI
* Full 4K frame buffer
* 3Gbps Bypass Relays
* 12-bit video processing pipeline
* 4 x 4 x 32bit Matrix
* MR2 Routing resources
* Hardware Keyer (2K/HD)
* Customizable and flexible pixel formats
* AES Audio Input / AES Audio Output
* LTC I/O
* RS422
* Bi/Tri-level Genlock Input & Crosslocking
* Genlock loop through
* VANC complete access
* HANC Embedded Audio/Payload ID/Custom Packets/RP188
* MSA and Non-MSA compatible SFP+
* SMPTE 2022-6

The Kronos range will be available in Q4 2016 with pricing announced then.


Sony launches Z Series line of UHD TVs for 4K HDR

Sony recently introduced a line of UHD television displays that could be suitable as client monitors in post houses. Sony’s new Z series display technology — including the X930D and X940D — adopts Sony’s Backlight Master Drive backlight boosting technology, which expands brightness and contrast to better exploit 4K HDR. While testing remains to be done, rumor has it the displays may easily comply with Ultra HD Alliance requirements, which would make them an excellent large-screen choice for the client experience.

To further enhance contrast, the Backlight Master Drive includes a dense LED structure, discrete lighting control and an optical design with a calibrated beam LED. Previously, local dimming was controlled by zones with several LEDs. The discrete LED control feature allows the Backlight Master Drive to dim and boost each LED individually for greater precision, contrast and realism.

Additionally, the Z series features a newly developed 4K image processor, the 4K HDR Processor X1 Extreme. Combined with Backlight Master Drive, the Z series features expanded contrast and more accurate color expression. The 4K Processor X1 Extreme incorporates three new technologies: an object-based HDR remaster, dual database processing and Super Bit Mapping 4K HDR. With these three technologies, 4K HDR Processor X1 Extreme reproduces a wide variety of content with immersive 4K HDR picture quality.

The Z series runs on Android TV with a Sony user interface that includes a new content bar with enhanced content navigation, voice search and a genre filtering function. Instead of selecting a program from several channels, users can select from favorite genres, such as sports, music and news.

Pricing is as follows:

  • XBR65Z9D, 65″ class (64.5″ diagonal), $6,999 MSRP, available summer 2016
  • XBR75Z9D, 75″ class (74.5″ diagonal), $9,999 MSRP, available summer 2016
  • XBR100Z9D, 100″ class (99.5″ diagonal), pricing and availability details to be announced later this year.

Storage Workflows for 4K and Beyond

Technicolor-PostWorks and Deluxe Creative Services share their stories.

By Beth Marchant

Once upon a time, an editorial shop was a sneaker-net away from the other islands in the pipeline archipelago. That changed when the last phases of the digital revolution set many traditional editorial facilities into swift expansion mode to include more post production services under one roof.

The consolidating business environment in the post industry of the past several years then brought more of those expanded, overlapping divisions together. That’s a lot for any network to handle, let alone one containing some of the highest quality and most data-dense sound and pictures being created today. The networked storage systems connecting them all must be robust, efficient and realtime without fail, but also capable of expanding and contracting with the fluctuations of client requests, job sizes, acquisitions and, of course, evolving technology.

There’s a “relief valve” in the cloud and object storage, say facility CTOs minding the flow, but it’s still a delicate balance between local pooled and tiered storage and iron-clad cloud-based networks their clients will trust.

Technicolor-PostWorks
Joe Beirne, CTO of Technicolor-PostWorks New York, is probably as familiar as one can be with complex nonlinear editorial workflows. A user of Avid’s earliest NLEs, an early adopter of networked editing and an immersive interactive filmmaker who experimented early with bluescreen footage, Beirne began his career as a technical advisor and producer for high-profile mixed-format feature documentaries, including Michael Moore’s Fahrenheit 9/11 and the last film in Godfrey Reggio’s KOYAANISQATSI trilogy.

Joe Beirne

In his 11 years as a technology strategist at Technicolor-PostWorks New York, Beirne has also become fluent in evolving color, DI and audio workflows for clients such as HBO, Lionsgate, Discovery and Amazon Studios. CTO since 2011, when PostWorks NY acquired the East Coast Technicolor facility and the color science that came with it, he now oversees the increasingly complicated ecosystem that moves and stores vast amounts of high-resolution footage and data while simultaneously holding those separate and variously intersecting workflows together.

As the first post facility in New York to handle petabyte levels of editorial-based storage, Technicolor-PostWorks learned early how to manage the data explosion unleashed by digital cameras and NLEs. “That’s not because we had a petabyte SAN or NAS or near-line storage,” explains Beirne. “But we had literally 25 to 30 Avid Unity systems that were all in aggregate at once. We had a lot of storage spread out over the campus of buildings that we ran on the traditional PostWorks editorial side of the business.”

The TV finishing and DI business that developed at PostWorks in 2005, when Beirne joined the company (he was previously a client), eventually necessitated a different route. “As we’ve grown, we’ve expanded out to tiered storage, as everyone is doing, and also to the cloud,” he says. “Like we’ve done with our creative platforms, we have channeled our different storage systems and subsystems to meet specific needs. But they all have a very promiscuous relationship with each other!”

TPW’s high-performance storage in its production network is a combination of local or semi-locally attached near-line storage tethered by several Quantum StorNext SANs, all of it air-gapped — or physically segregated — from the public Internet. “We’ve got multiple SANs in the main Technicolor mothership on Leroy Street with multiple metadata controllers,” says Beirne. “We’ve also got some client-specific storage, so we have a SAN that can be dedicated to a particular account. We did that for a particular client who has very restrictive policies about shared storage.”

TPW’s editorial media, for the most part, resides in Avid’s ISIS system and is in the process of transitioning to its software-defined replacement, Nexis. “We have hundreds of Avids, a few Adobe and even some Final Cut systems connected to that collection of Nexis and ISIS and Unity systems,” he says. “We’re currently testing the Nexis pipeline for our needs but, in general, we’re going to keep using this kind of storage for the foreseeable future. We have multiple storage servers that serve that part of our business.”

Beirne says most every project the facility touches is archived to LTO tape. “We have a little bit of disc-to-tape archiving going on for the same reasons everybody else does,” he adds. “And some SAN volume hot spots that are all SSD (solid state drives) or a hybrid.” The facility is also in the process of improving the bandwidth of its overall switching fabric, both on the Fibre Channel side and on the Ethernet side. “That means we’re moving to 32Gb and multiple 16Gb links,” he says. “We’re also exploring a 40Gb Ethernet backbone.”

Technicolor-PostWorks’ 4K theater at its Leroy Street location.

This backbone, he adds, carries an exponential amount of data every day. “Now we have what are like two nested networks of storage at a lot of the artist workstations,” he explains. “That’s a complicating feature. It’s this big, kind of octopus, actually. Scratch that: it’s like two octopi on top of one another. That’s not even mentioning the baseband LAN network that interweaves this whole thing. They, of course, are now getting intermixed because we are also doing IT-based switching. The entire, complex ecosystem is evolving and everything that interacts with it is evolving right along with it.”

The cloud is providing some relief and handles multiple types of storage workflows across TPW’s various business units. “Different flavors of the commercial cloud, as well as our own private cloud, handle those different pools of storage outside our premises,” Beirne says. “We’re collaborating right now with an international account in another territory and we’re touching their storage envelope through the Azure cloud (Microsoft’s enterprise-grade cloud platform). Our Azure cloud and theirs touch and we push data from that storage back and forth between us. That particular collaboration happened because we both had an Azure instance, and those kinds of server-to-server transactions that occur entirely in the cloud work very well. We also had a relationship with one of the studios in which we made a similar connection through Amazon’s S3 cloud.”

Given the trepidations most studios still have about the cloud, Beirne admits there will always be some initial, instinctive mistrust from both clients and staff when you start moving any content away from computers that are not your own and you don’t control. “What made that first cloud solution work, and this is kind of goofy, is we used Aspera to move the data, even though it was between adjacent racks. But we took advantage of the high-bandwidth backbone to do it efficiently.”

Both TPW in New York and Technicolor in Los Angeles have since leveraged the cloud aggressively. “We have our own cloud that we built, and big Technicolor has a very substantial purpose-built cloud, as well as Technicolor Pulse, their new storage-related production service in the cloud. They also use object storage and have some even newer technology that will be launching shortly.”

The caveat to moving any storage-related workflow into the cloud is thorough and continual testing, says Beirne. “Do I have more concern for my clients’ media in the cloud than I do when sending my own tax forms electronically? Yeah, I probably do,” he says. “It’s a very, very high threshold that we need to pass. But that said, there’s quite a bit of low-impact support stuff that we can do on the cloud. Review and approval stuff has been happening in the cloud for some time.” As a result, the facility has seen an increase, like everyone else, in virtual client sessions, like live color sessions and live mix sessions from city to city or continent to continent. “To do that, we usually have a closed circuit that we open between two facilities and have calibrated displays on either end. And, we also use PIX and other normal dailies systems.”

“How we process and push this media around ultimately defines our business,” he concludes. “It’s increasingly bigger projects that are made more demanding from a computing point of view. And then spreading that out in a safe and effective way to where people want to access it, that’s the challenge we confront every single day. There’s this enormous tension between the desire to be mobile and open and computing everywhere and anywhere, with these incredibly powerful computer systems we now carry around in our pockets and the bandwidth of the content that we’re making, which is high frame rate, high resolution, high dynamic range and high everything. And with 8K — HDR and stereo wavefront data goes way beyond 8K and what the retina even sees — and 10-bit or more coming in the broadcast chain, it will be more of the same.” TPW is already doing 16-bit processing for all of its film projects and most of its television work. “That’s piles and piles and piles of data that also scales linearly. It’s never going to stop. And we have a VR lab here now, and there’s no end of the data when you start including everything in and outside of the frame. That’s what keeps me up at night.”

Deluxe Creative Services
Before becoming CTO at Deluxe Creative Services, Mike Chiado had a 15-year career as a color engineer and image scientist at Company 3, the grading and finishing powerhouse acquired by Deluxe in 2010. He now manages the pipelines of a commercial, television and film Creative Services division that encompasses not just dailies, editorial and color, but sound, VFX, 3D conversion, virtual reality, interactive design and restoration.

Mike Chiado

That’s a hugely data-heavy load to begin with, and as VR and 8K projects become more common, managing the data stored and coursing through DCS’ network will get even more demanding. Branded companies currently under the monster Deluxe umbrella include Beast, Company 3, DDP, Deluxe/Culver City, Deluxe VR, Editpool, Efilm, Encore, Flagstaff Studios, Iloura, Level 3, Method Studios, StageOne Sound, Stereo D, and Rushes.

“Actually, that’s nothing when you consider that all the delivery and media teams from Deluxe Delivery and Deluxe Digital Cinema are downstream of Creative Services,” says Chiado. “That’s a much bigger network and storage challenge at that level.” Still, the storage challenges of Chiado’s segment are routinely complicated by the twin monkey wrenches of the collaborative and computer kind that can unhinge any technology-driven art form.

“Each area of the business has its own specific problems that recur: television has its issues, commercial work has its issues and features its issues. For us, commercials and features are more alike than you might think, partly due to the constantly changing visual effects but also due to shifting schedules. Television is much more regimented,” he says. “But sometimes we get hard drives in on a commercial or feature and we think, ‘Well, that’s not what we talked about at all!’”

Company 3’s file-based digital intermediate work quickly clarified Chiado’s technical priorities. “The thing that we learned early on is realtime playback is just so critical,” he says. “When we did our very first file-based DI job 13 years ago, we were so excited that we could display a certain resolution. OK, it was slipping a little bit from realtime, maybe we’ll get 22 frames a second, or 23, but then the director walked out after five minutes and said, ‘No. This won’t work.’ He couldn’t care less about the resolution because it was always about realtime and solid playback. Luckily, we learned our lesson pretty quickly and learned it well! In Deluxe Creative Services, that still is the number one priority.”
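The realtime-playback bar Chiado describes is easy to quantify. A back-of-the-envelope sketch using common DI frame sizes (our own numbers, assuming 10-bit RGB DPX frames, where three 10-bit samples pack into one 32-bit word):

```python
def dpx_playback_mb_per_s(width, height, fps=24):
    """Sustained read rate for 10-bit RGB DPX playback. Three 10-bit
    samples pack into a 32-bit word, so 4 bytes/pixel (headers ignored)."""
    return width * height * 4 * fps / 1e6

# The classic 2K film-DI stream: roughly 306 MB/s, sustained, with no
# dropped frames. That is the bar the director in the anecdote set.
two_k = dpx_playback_mb_per_s(2048, 1556)

# Full-aperture 4K quadruples that to roughly 1.2 GB/s per stream,
# which the storage must guarantee before resolution even matters.
four_k = dpx_playback_mb_per_s(4096, 3112)
```

Numbers like these are why the conversation with storage vendors, below, is framed in frame rates and resolutions rather than abstract bit rates.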

It’s also helped him cut through unnecessary sales pitches from storage vendors unfamiliar with Deluxe’s business. “When I talk to them, I say, ‘Don’t tell me about bit rates. I’m going to tell you a frame rate I want to hit and a resolution, and you tell me if we can hit it or not with your solution. I don’t want to argue bits; I want to tell you this is what I need to do and you’re going to tell me whether or not your storage can do that.’ The storage vendors that we’re going to bank our A-client work on better understand fundamentally what we need.”

Because some of the Deluxe company brands share office space — Method and Company 3 moved into a 63,376-square-foot former warehouse in Santa Monica a few years ago — they have access to the same storage infrastructure. “But there are often volumes specially purpose-built for a particular job,” says Chiado. “In that way, we’ve created volumes focused on supporting 4K feature work and others set up specifically for CG desktop environments that are shared across 400 people in that one building. We also have similar business units in Company 3 and Efilm, so sometimes it makes sense that we would want, for artist or client reasons, to have somebody in a different location from where the data resides. For example, having the artist in Santa Monica and the director and DP in Hollywood is something we do regularly.”

Chiado says Deluxe has worked with its network and storage solution providers to design and build a system “that suits our needs. But for the most part, we’re using off-the-shelf products for storage. The magic is how we tune them to be able to work with our systems.”

Those vendors include Quantum, DDN Storage and EMC’s Isilon network-attached storage. “For our most robust needs, like 4K feature workflows, we rely on DDN,” he says. “We’ve actually already done some 8K workflows. Crazy world we live in!” For long-term archiving, each Deluxe Creative Services location worldwide has a robotic LTO tape library. “In some cases, we’ll have a near-line tier-two volume that stages it. And for the past few years, we’ve been using object storage in some locations to help with that.”

Although the entire group of Deluxe divisions and offices is linked by a robust 10GigE network that sometimes takes advantage of dark fiber (unused fiber-optic cable leased from larger fiber-optic communications companies), Chiado says the storage they use is all very specific to each business unit. “We’re moving stuff around all the time, but projects are pretty much residing in one spot or another,” he says. “Often, there are a thousand reasons why — it may be for tax incentives in a particular location, or it may be for project-specific needs. Or it’s just that we’re talking about the London and LA locations.”

With one eye on the future and another on budgets, Chiado says pooled storage has helped DCS keep costs down while managing a growing slate of data-heavy projects. “We are always on the lookout for ways to handle the next thing, like the arrival of 8K workflows, but we’ve gained huge, huge efficiencies from pooled storage,” he says. “So that’s the beauty of what we build, specific to each of our world locations. We move it around if we have to between locations, but inside that location, everybody works with the content in one place. That right there was a major efficiency in our workflows.”

Beyond that, he says, how to handle 8K is still an open question. “We may have to make an island, and we’ve been testing that so far, but we do everything we can to keep it in one place and leverage whatever technology is required for the job,” Chiado says. “We have isolated instances of SSDs (solid-state drives), but we don’t have large-scale deployment of SSDs yet. On the other end, we’re working with cloud vendors, too, to be able to maximize our investments.”

Although the company is still working through cloud security issues, Chiado says Deluxe is “actively engaging with cloud vendors because we aren’t convinced that our clients are going to be happy with the security protocols in place right now. The nature of the business is we are regularly involved with our clients and the MPAA and have ongoing security audits. We also have a group within Deluxe that helps us maintain the best standards, but each show that comes in may have its own unique security needs. It’s a constant, evolving process. It’s been really difficult to get our heads and our clients’ heads around using the cloud for rendering, transcoding or for storage.”

Luckily, that’s starting to change. “We’re getting good traction now, with a few of the studios getting ready to greenlight cloud use and our own pipeline development to support it,” he adds. “They are hand in hand. But I think once we move over this hurdle, this is going to help the industry tremendously.”

Beyond those longer-term challenges, Chiado says the day-to-day demands of each division haven’t changed much. “Everybody always needs more storage, so we are constantly looking at ways to make that happen,” he says. “The better we can monitor our storage and make our in-house people feel comfortable moving stuff off near-line to tape and bringing it back again, the better we can put the storage where we need it. But I’m very optimistic about the future, especially about having a relief valve in the cloud.”

Our main image is the shared 4K theater at Company 3 and Method.


Blending Ursa Mini and Red footage for Aston Martin spec spot

By Daniel Restuccio

When producer/director Jacob Steagall set out to make a spec commercial for Aston Martin, he chose to lens it on the Blackmagic Ursa Mini 4.6K and the Red Scarlet. He says the camera combo worked so seamlessly he dares anyone to tell which shots are Blackmagic and which are Red.

L-R: Blackmagic’s Moritz Fortmann and Shawn Carlson with Jacob Steagall and Scott Stevens.

“I had the idea of filming a spec commercial to generate new business,” says Steagall. He convinced the high-end car maker to lend him an Aston Martin 2016 V12 Vanquish for a weekend. “The intent was to make a nice product that could be on their website and also be a good-looking piece on the demo reel for my production company.”

Steagall immediately pulled together his production team, which consisted of co-director Jonathan Swecker and cinematographers Scott Stevens and Adam Pacheco. “The team and I collaborated together about the vision for the spot which was to be quick, clean and to the point, but we would also accentuate the luxury and sexiness of the car.”

“We had access to the new Blackmagic Ursa Mini 4.6K and an older Red Scarlet with the MX chip,” says Stevens. “I was really interested in seeing how both cameras performed.”

He set up the Ursa Mini to shoot ProRes HQ at Ultra HD (3840×2160) and the Scarlet at 8:1 compression at 4K (4096×2160). He used both Canon still camera primes and a 24-105mm zoom, switching them from camera to camera depending on the shot. “For some wide shots we set them up side by side,” explains Stevens. “We also would have one camera shooting the back of the car and the other camera shooting a close-up on the side.”

In addition to his shooting duties, Stevens also edited the spot, using Adobe Premiere, and exported the XML into Blackmagic DaVinci Resolve Studio 12. Stevens notes that, in addition to loving cinematography, he’s also “really into” color correction. “Jacob (Steagall) and I liked the way the Red footage looked straight out of the camera in the RedGamma4 color space. I matched the Blackmagic footage to the Red footage to get a basic look.”

Blackmagic colorist Moritz Fortmann took Stevens’ basic color correction and finessed the grade even more. “The first step was to talk to Jacob and Scott and find out what they were envisioning, what feel and look they were going for. They had already established a look so we saved a few stills as reference images to work off. The spot was shot on two different types of cameras, and in different formats. Step two was to analyze the characteristics of each camera and establish a color correction to match the two. Step three was to tweak and refine the look. We did what I would describe as a simple color grade, only relying on primaries, without using any Power Windows or keys.”

If you’re planning to shoot mixed footage, Fortmann suggests you use cameras with similar characteristics, matching resolution, dynamic range and format. “Shooting RAW and/or Log provides for the highest dynamic range,” he says. “The more ‘room’ a colorist has to make adjustments, the easier it will be to match mixed footage. When color correcting, the key is to make mixed footage look consistent. One camera may perform well in low light while another one does not. You’ll need to find that sweet spot that works for all of your footage, not just one camera.”
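Fortmann’s primaries-only approach can be illustrated with a toy per-channel gain match: scale each RGB channel of one camera’s footage so its average matches the reference camera’s. This is a hypothetical sketch of the underlying math only, not how Resolve implements its color tools.

```python
# Toy per-channel gain match between two cameras' footage.
# A "frame" here is just a list of (r, g, b) pixel tuples.

def channel_means(frame):
    """Average value of each RGB channel across a frame."""
    n = len(frame)
    sums = [0.0, 0.0, 0.0]
    for px in frame:
        for c in range(3):
            sums[c] += px[c]
    return [s / n for s in sums]

def match_gains(reference_frame, target_frame):
    """Per-channel gains that map the target's channel averages
    onto the reference's, a crude stand-in for a primaries match."""
    ref = channel_means(reference_frame)
    tgt = channel_means(target_frame)
    return [r / t for r, t in zip(ref, tgt)]

# Toy frames: the second camera's red channel reads a stop darker.
red_frame = [(0.8, 0.5, 0.4), (0.6, 0.5, 0.4)]
bmd_frame = [(0.4, 0.5, 0.4), (0.3, 0.5, 0.4)]
gains = match_gains(red_frame, bmd_frame)
print([round(g, 2) for g in gains])  # → [2.0, 1.0, 1.0]
```

A real grade layers this kind of global correction with the camera-specific transforms and refinements Fortmann describes; the point is only that a consistent per-channel offset between cameras can be measured and cancelled.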

Daniel Restuccio is a writer and chair of the multimedia department at California Lutheran University.


Quick Chat: Dom Rom, new president/GM of Deluxe TV Post Production Services

Domenic Rom, a fixture in the New York post community for 30 years, has been promoted to president and GM of Deluxe TV Post Production Services. Rom was most recently managing director of Deluxe’s New York studio, which incorporates Encore/Company 3/Method. He will now be leading Deluxe’s global services for television, specifically, the Encore and Level 3 branded companies. He will be making the move to Los Angeles.

Rom’s resume is long. He joined DuArt Film Labs in 1984 as a colorist, working his way up to EVP of the company, running both its digital and film lab divisions. In 2000, he joined stock footage/production company Sekani (acquired by Corbis), helping to build the first fully digital content distribution network. In 2002, he founded The Lab at Moving Images, the first motion picture lab to open in New York in 25 years. It was acquired by PostWorks, which named Rom COO overseeing its Avid rentals, remote set-ups, audio mixing, color correction and editorial businesses. In 2010, Rom joined Technicolor NY as SVP post production. When PostWorks NY acquired Technicolor NY, Rom again became COO of the now-larger company. He joined Deluxe in 2013 as GM of its New York operations.

“I love what I’m seeing today in the industry,” he says. “It has been said many times, but we’re truly in a golden age of television. The best entertainment in the world is coming from the networks and a whole new generation of original content creators. It’s exciting to be in a position to service that work. There are few, if any, companies that have invested in the research, technology and talent to the degree Deluxe has, to help clients take advantage of the latest advancements — whether it’s HDR, 4K, or whatever comes next, to create amazing new experiences for viewers.”

postPerspective reached out to Rom, as he was making his transition to the West Coast, to find out more about his new role and his move.

What does this position mean to you?
This position is the biggest honor and challenge of my career. I have always considered Encore and Level 3 to be the premier television facilities in the world, and to be responsible for them is amazing and daunting all at the same time. I am so looking forward to working with the facilities in Vancouver, Toronto, New York and London.

What do you expect/hope to accomplish in this new role?
To bring our worldwide teams even closer and grow the client relationships even stronger than they already are, because at the end of the day this is a total relationship business and probably my favorite part of the job.

How have you seen post and television change over the years?
I was talking about this with the staff out here the other day. I have seen the business go from film to 2-inch tape to 1-inch to D2 to D5 to HDCAM (more formats than I can remember) to nonlinear editing and digital acquisition — I could go on and on. Right now the quality and sheer amount of content coming from the studios, networks, cablenets and many, many new creators is both exciting and challenging. The fact that this business is constantly changing helps to keep me young.

How is today’s production and post technology helping make TV an even better experience for audiences?
In New York we just completed the first Dolby Vision project for an entire episodic television season (which I can’t name yet), and it looks beautiful. HDR opens up a whole new visual world to the artists and the audience.

Are you looking forward to living in Los Angeles?
I have always danced with the idea of living in LA throughout my career, and to do so this far in is interesting timing. My family and, most importantly, my brand new grandson are all on the east coast so I will maintain my roots there while spreading them out west as well.

Talking VR content with Phillip Moses of studio Rascali

Phillip Moses, head of VR content developer Rascali, has been working in visual effects for over 25 years. His resume boasts some big-name films, including Alice in Wonderland, Speed Racer and Spider-Man 3, just to name a few. Seven years ago he launched a small boutique visual effects studio, called The Resistance VFX, with VFX supervisor Jeff Goldman.

Two years ago, after getting a demo of an Oculus pre-release Dev Kit 2, Moses realized that “we were poised on the edge of not just a technological breakthrough, but what will ultimately be a new platform for consuming content. To me, this was a shift almost as big as the smartphone, and an exciting opportunity for content creators to begin creating in a whole new ecosystem.”

Phillip Moses

Shortly after that, his friends James Chung and Taehoon Oh launched Reload Studios, with the vision of creating the first independently-developed first-person shooter game, designed from the ground up for VR. “As one of the first companies formed around the premise of VR, they attracted quite a bit of interest in the non-gaming sector as well,” he explains. “Last year, they asked me to come aboard and direct their non-gaming division, Rascali. I saw this as a huge opportunity to do what I love best: explore, create and innovate.”

Rascali has been busy. They recently debuted trailers for their first episodic VR projects, Raven and The Storybox Project, on YouTube, Facebook/Oculus Video, Jaunt, Littlstar, Vrideo and Samsung MilkVR. Let’s find out more…

You recently directed two VR trailers. How is directing for VR different than directing for traditional platforms?
Directing for VR is a tricky beast and requires a lot of technical knowledge of the whole process that would not normally be required of directors. To be fair, today’s directors are a very savvy bunch, and most have a solid working knowledge of how visual effects are used in the process. However, for the way I have chosen to shoot the series, it requires the ability to have a pretty solid understanding of not just what can be done, but how to actually do it. To be able to previsualize the process and, ultimately, the end result in your head first is critical to being able to communicate that vision down the line.

Also, from a script and performance perspective, I think it’s important to start with a very important question of “Why VR?” And once you believe you have a compelling answer to that question, then you need to start thinking about how to use VR in your story.  Will you require interaction and participation from the viewer? Will you involve the viewer in any way? Or will you simply allow VR to serve as an additional element of presence and immersion for the viewer?

While you gain many things in VR, you also have to go into the process with a full knowledge of what you ultimately lose. The power of lenses, for example, to capture nuance and to frame an image to evoke an emotional response, is all but lost. You find yourself going back to exploring what works best in a real-world framing — almost like you are directing a play in an intimate theater.

What is the biggest challenge in the post workflow for VR?
Rendering! Everything we are producing for Raven is at 4K left eye, 4K right eye and 60fps. The rendering process alone guarantees that the process will take longer than you hoped. It also guarantees that you will need more data storage than you ever thought necessary.
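Moses’ point about data storage is easy to quantify with a back-of-the-envelope sketch. The frame size (a 4096×2048 equirectangular frame per eye) and 8-bit RGB depth are illustrative assumptions, not Rascali’s actual render settings.

```python
# Back-of-the-envelope: uncompressed storage for stereo VR frames.

def vr_storage_gb_per_minute(width=4096, height=2048, fps=60,
                             eyes=2, bytes_per_pixel=3):
    """GB of uncompressed frames per minute of finished footage."""
    frame_bytes = width * height * bytes_per_pixel
    frames_per_minute = fps * 60 * eyes
    return frame_bytes * frames_per_minute / 1e9

# One minute of 60fps stereo at 8-bit RGB:
print(round(vr_storage_gb_per_minute()))  # → 181
```

At roughly 180GB per finished minute, before any intermediate renders or revisions, the storage appetite adds up quickly.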

But other than rendering, I find that the editorial process is also more challenging. With VR, those shots that you thought you were holding onto way too long are actually still too short, and it involves an elaborate process to conform everything for review in a headset between revisions. In many ways, it’s similar to the old process of making your edit decisions, then walking the print into the screening room. You forget how tedious the process can be.
By the way, I’m looking forward to integrating some realtime 360 review into the editorial process. Make it happen, Adobe/Avid!

These trailers are meant to generate interest from production partners to green light these as full episodic series. What is the intended length of each episode, and what’s the projected length of time from concept to completion for each episode of the all-CG Storybox, and live-action Raven?
Each one of these projects is designed for completely different audiences, so the answer is a bit different for each one. For Storybox, we are looking to keep each episode under five minutes, with the intention that it is a fairly easy-to-consume piece of content that is accessible to a broad spectrum of ages. We really hope to make the experiences fun, playful and surprising for the viewer, and to create a context for telling these stories that fuels the imagination of kids.

For Storybox, I believe that we can start delivering finished episodes before the end of the third quarter — with a full season representing 12 to 15 episodes. Raven, on the other hand, is a much more complex undertaking. While the VR market is being developed, we are betting on the core VR consumers to really want stories and experiences that range closer to 12 to 15 minutes in duration. We feel this is enough time to tell more complex stories, but still make each episode feel like a fantastic experience that they could not experience anywhere else. If green-lit tomorrow, I believe we would be looking at a four-month production schedule for the pilot episode.

Rascali is a division of Reload Studios, which is developing VR games. Is there a technology transfer of workflows and pipelines and shared best practices across production for entertainment content and games within the company?
Absolutely! While VR is a new technology, there is such a rich heritage of knowledge present at Reload Studios. For example, one question that VR directors are asking themselves is: “How can I direct my audience’s attention to action in ways that are organic and natural?” While this is a new question for film directors — who typically rely on camera to do this work for them — this is a question that the gaming community has been answering for years. Having some of the top designers in the game industry at our disposal is an invaluable asset.

That being said, Reload is much different than most independent game companies. One of their first hires was senior Disney animator Nik Ranieri. Our producing team is composed of top animation producers from Marvel and DC. We have a deep bench of people who give the whole company a very comprehensive knowledge of how content of all types is created.

What was the equipment set-up for the Raven VR shoot? Which camera was used? What tools were used in the post pipeline?
Much of the creative IP for Raven, including designs and characters, is still in development. For this reason, we elected to construct a teaser that highlighted immersive VR vistas that you could expect in the world we are creating. This required us to lean very heavily on the visual effects/CG production process — the VFX pipeline included Autodesk 3ds Max, rendering in V-Ray, with some assistance from Nuke and even Softimage XSI. The entire project was edited in Adobe Premiere.

Our one live-action element was shot with a single Red camera, then projected onto geometry for accurate stereo integration.

Where do you think the prevailing future of VR content is? Narrative, training, therapy, gaming, etc.?
I think your question represents the future of VR. Games, for sure, are going to be leading the charge, as this demographic is the only one on a large scale that will be purchasing the devices required to build a viable market. But much more than games, I’m excited to see growth in all of the areas you listed above, including, most significantly, education. Education could be a huge winner in the growing VR/AR ecosystem.

The reason I elected to join Rascali is to help provide solutions and pave the way in markets that mostly don’t yet exist. It’s exciting to be a part of a new industry that has the power to improve and benefit so many aspects of the global community.