HPA releases 2019 Tech Retreat program, includes eSports

The Hollywood Professional Association (HPA) has announced the program for the 2019 HPA Tech Retreat, running February 11-15. The Tech Retreat, which is celebrating its 25th year, takes place over the course of a week at the JW Marriott Resort & Spa in Palm Desert, California.

The HPA Tech Retreat spans five days of sessions, technology demonstrations and events. During this week, important aspects of production, broadcast, post, distribution and related M&E trends are explored. One of the key differentiators of the Tech Retreat is its strict adherence to a non-commercial focus: marketing-oriented presentations are prohibited except at breakfast roundtables.

“Once again, we’ve received many more submissions than we could use,” says Mark Schubin, the Program Maestro of the HPA Tech Retreat. “To say this year’s were ‘compelling’ is an understatement. We could have programmed a few more days. Rejecting terrific submissions is always the hardest thing we have to do. I’m really looking forward to learning the latest on HDR, using artificial intelligence to restore old movies and machine learning to deal with grunt work, the Academy’s new software foundation, location-based entertainment with altered reality and much more.”

This year’s program is as follows:

Monday February 11: TR-X
eSports: Dropping the Mic on Center Stage
Separate registration required
A half day of targeted panels, speakers and interaction, TR-X will focus on the rapidly growing arena of eSports, with a keynote from Yvette Martinez, CEO – North America of eSports organizer and production company ESL North America.
Tuesday February 12: Supersession
Next-Gen Workflows and Infrastructure: From the Set to the Consumer

Wednesday February 13: Main Program Highlights
• Mark Schubin’s Technology Year in Review
• Washington Update (Jim Burger, Thompson Coburn LLP)
The highly anticipated review of legislation and its impact on our business from a leading Washington attorney.

• Deep Fakes (Moderated by Debra Kaufman, ETCentric; Panelists Marc Zorn, HBO; Ed Grogan, Department of Defense; Alex Zhukov, Video Gorillas)
It might seem nice to be able to use actors long dead, but the concept of “fake news” takes a terrifying new turn with deepfakes, the term that Wikipedia describes as a portmanteau of “deep learning” and “fake.” Although people have been manipulating images for centuries – long before the creation of Adobe Photoshop – the new AI-powered tools allow the creation of very convincing fake audio and video.

• The Netflix Media Database (Rohit Puri, Netflix)
An optimized user interface, meaningful personalized recommendations, efficient streaming and a high-quality catalog of content are the principal factors that define the Netflix end-user experience. A myriad of business workflows of varying complexities come together to realize this experience. Under the covers, they use computationally expensive computer vision, audio processing and natural language-processing based media analysis algorithms. These algorithms generate temporally and spatially dynamic metadata that is shared across the various use cases. The Netflix Media Database (NMDB) is a multi-tenant data system that is used to persist this deeply technical metadata about various media assets at Netflix and that enables querying the same at scale. The “shared nothing” distributed database architecture allows NMDB to store large amounts of media timeline data, thus forming the backbone for various Netflix media processing systems.
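As a rough illustration of what “temporally and spatially dynamic” metadata means in practice, here is a hypothetical Python sketch of labels tied to a time range and a screen region, with a simple overlap query. The structure and names are invented for illustration and are not Netflix’s actual NMDB schema or API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TimedLabel:
    asset_id: str                    # which media asset the label describes
    start_ms: int                    # temporal extent of the observation
    end_ms: int
    kind: str                        # e.g. "shot_boundary", "face", "loudness"
    region: Optional[tuple] = None   # optional (x, y, w, h) spatial extent
    payload: Optional[dict] = None   # algorithm-specific data

def labels_in_range(labels, asset_id, t0, t1, kind):
    """Return labels of one kind that overlap the time window [t0, t1)."""
    return [l for l in labels
            if l.asset_id == asset_id and l.kind == kind
            and l.start_ms < t1 and l.end_ms > t0]

labels = [
    TimedLabel("tt123", 0, 4000, "shot_boundary"),
    TimedLabel("tt123", 1200, 2600, "face", region=(640, 180, 200, 240),
               payload={"cluster": 7}),
]
print(labels_in_range(labels, "tt123", 1000, 2000, "face"))
```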

• AI Film Restoration at 12 Million Frames per Second (Alex Zhukov, Video Gorillas)

• Is More Media Made for Subways Than for TV and Cinema? (and does it Make More $$$?) (Andy Quested, BBC)

• Broadcasters Panel (Moderator: Matthew Goldman, MediaKind)

• CES Review (Peter Putman, ROAM Consulting)
Pete Putman traveled to Las Vegas to see what’s new in the world of consumer electronics and returns to share his insights with the HPA Tech Retreat audience.

• 8K: Whoa! How’d We Get There So Quickly? (Peter Putman, ROAM Consulting)

• Issues with HDR Home Video Deliverables for Features (Josh Pines, Technicolor)

• HDR “Mini” Session
• HDR Intro: Seth Hallen, Pixelogic
• Ambient Light Compensation for HDR Presentation: Don Eklund, Sony Pictures Entertainment
• HDR in Anime: Haruka Miyagawa, Netflix
• Pushing the Limits of Motion Appearance in HDR: Richard Miller, Pixelworks
• Downstream Image Presentation Management for Consumer Displays:
• Moderator: Michael Chambliss, International Cinematographers Guild
• Michael Keegan, Netflix
• Annie Chang, UHD Alliance
• Steven Poster, ASC, International Cinematographers Guild
• Toshi Ogura, Sony

• Solid Cinema Screens with Front Sound: Do They Work? (Julien Berry, Delair Studios)
Direct-view displays bring high image quality to the cinema but suffer from a low pixel fill factor that can lead to heavy moiré and aliasing patterns. Cinema projectors have a much better fill factor, which avoids most of those issues, even though some moiré can still be produced by the screen perforations needed for the audio. With the advent of high-contrast, EDR and soon HDR image quality in cinema, screen perforations affect the perceived brightness and contrast of the same image, though the effect has never been quantified because some perforations have always been needed for cinema audio. With high-quality cinema audio systems that no longer require perforated screens, it is now possible to quantify this effect.

Thursday, February 14: Main Program Highlights

• A Study Comparing Synthetic Shutter and HFR for Judder Reduction (Ianik Beitzel and Aaron Kuder, ARRI and Stuttgart Media University (HdM))

• Using Drones and Photogrammetry Techniques to Create Detailed (High Resolution) Point Cloud Scenes (Eric Pohl, Singularity Imaging)
Drone aerial photography can capture multiple geotagged images that are processed into a 3D point cloud of a ground scene. The point cloud may then be used for production previsualization or background creation for videogames or VR/AR new-media products.

• Remote and Mobile Production Panel (Moderator: Mark Chiolis, Mobile TV Group; Wolfgang Schram, PRG; Scott Rothenberg, NEP)
With a continuing appetite for content from viewers of all the major networks, as well as niche networks, streaming services, web, eGames/eSports and venue and concert-tour events, the battle is on to make it possible to watch almost every sporting and entertainment event that takes place, all live as it is happening. Key members of the remote and mobile community explore what’s new and what workflows are behind the content production and delivery in today’s fast-paced environments. Expect to hear about new REMI applications, IP workflows, AI, UHD/HDR, eGames, and eSports.

• IMSC 1.1: A Single Subtitle and Caption Format for the Entertainment Chain (Pierre-Anthony Lemieux, Sandflow Consulting (supported by MovieLabs); Dave Kneeland, Fox)
IMSC is a W3C standard for worldwide subtitles/captions, and the result of an international collaboration. The initial version of IMSC (IMSC 1) was published in 2016, and has been widely adopted, including by SMPTE, MPEG, ATSC and DVB. With the recent publication of IMSC 1.1, we now have the opportunity to converge on a single subtitle/caption format across the entire entertainment chain, from authoring to consumer devices. IMSC 1.1 improves on IMSC 1 with support for HDR, advanced Japanese language features, and stereoscopic 3D. Learn about IMSC’s history, capabilities, operational deployment, implementation experience, and roadmap — and how to get involved.
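To give a sense of the format, the sketch below uses Python’s standard library to build a bare-bones IMSC text-profile document. Real deliverables carry far more styling, region and timing detail, and the IMSC 1.1 profile designator shown is an assumption to verify against the W3C recommendation.

```python
import xml.etree.ElementTree as ET

TTML = "http://www.w3.org/ns/ttml"
TTP = "http://www.w3.org/ns/ttml#parameter"
ET.register_namespace("", TTML)
ET.register_namespace("ttp", TTP)

# Root <tt> element; ttp:contentProfiles signals the IMSC 1.1 text profile.
tt = ET.Element(f"{{{TTML}}}tt", {
    "xml:lang": "en",
    f"{{{TTP}}}contentProfiles": "http://www.w3.org/ns/ttml/profile/imsc1.1/text",
})
body = ET.SubElement(tt, f"{{{TTML}}}body")
div = ET.SubElement(body, f"{{{TTML}}}div")

# One timed subtitle paragraph.
p = ET.SubElement(div, f"{{{TTML}}}p",
                  {"begin": "00:00:01.000", "end": "00:00:03.500"})
p.text = "A single subtitle format for the entire chain."

print(ET.tostring(tt, encoding="unicode"))
```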

• ACESNext and the Academy Digital Source Master: Extensions, Enhancements and a Standardized Deliverable (Andy Maltz, Academy of Motion Picture Arts & Sciences; Annie Chang, Universal Pictures)

• Mastering for Multiple Display and Surround Brightness Levels Using the Human Perceptual Model to Ensure the Original Creative Intent Is Maintained (Bill Feightner, Colorfront)
Maintaining a consistent creative look across today’s many different cinema and home displays can be a big challenge, especially with the wide disparity in possible display brightness and contrast as well as in viewing environments or surrounds. Even if it were possible to have individual creative sessions for each, maintaining creative consistency would be difficult at best. By drawing on knowledge of how the human visual system works, the perceptual model, processing that fits source content to a given display’s brightness and surround can be applied automatically while maintaining the original creative intent with little to no trimming.

• Cloud: Where Are We Now? (Moderator: Erik Weaver, Western Digital)

• Digitizing Workflow – Leveraging Platforms for Success (Roger Vakharia, Salesforce)
While the business of content creation hasn’t changed much over time, the technologies enabling processes around production, the digital supply chain and marketing resource management, among other areas, have become increasingly complex. Enabling an agile, platform-based workflow can help decrease time and complexity, but cost, scale and business sponsorship are often inhibitors to success.

Driving efficiency at scale can be daunting but many media leaders have taken the plunge to drive agility across their business process. Join this discussion to learn best practices, integrations, workflows and techniques that successful companies have used to drive simplicity and rigor around their workflow and business process.

• Leveraging Machine Learning in Image Processing (Rich Welsh, Sundog Media Toolkit)
How to use AI (ML and DL networks) to perform “creative” tasks that are boring, tasks humans spend time doing but would rather not (working real-world examples included).

• Leveraging AI in Post Production: Keeping Up with Growing Demands for More Content (Van Bedient, Adobe)
Expectations for more and more content continue to increase — yet staffing remains the same or only marginally bigger. How can advancements from machine learning help content creators? AI can be an incredible boon to remove repetitive tasks and tedious steps allowing humans to concentrate on the creative; ultimately AI can provide the one currency creatives yearn for more than anything else: Time.

• Deploying Component-Based Workflows: Experiences from the Front Lines (Moderator: Pierre-Anthony Lemieux, Sandflow Consulting (supported by MovieLabs))
The content landscape is shifting, with an ever-expanding essence and metadata repertoire, viewing experiences, global content platforms and automated workflows. Component-based workflows and formats, such as the Interoperable Master Format (IMF) standard, are being deployed to meet the challenges brought by this shift. Come and join us for a first-hand account from those on the front lines.

• Content Rights, Royalties and Revenue Management via Blockchain (Adam Lesh, SingularDTV)
The blockchain entertainment economy: adding transparency, disintermediating the supply chain, and empowering content creators to own, manage and monetize their IP to create sustainable, personal and connected economies. As we all know, rights and revenue (including royalties, residuals, etc.) management is a major pain point for content creators in the entertainment industry.

Friday, February 15: Main Program Highlights

• Beyond SMPTE Time Code: The TLX Project (Peter Symes)
SMPTE Time Code, ST 12, was developed and standardized in the 1970s to support the emerging field of electronic editing. It has been, and continues to be, a robust standard; its application is almost universal in the media industry, and the standard has found use in other industries. However, ST 12 was developed using criteria and restrictions that are not appropriate today, and it has many shortcomings in today’s environment.

A new project in SMPTE, the Extensible Time Label (TLX), is gaining traction and appears to have the potential to meet a wide range of requirements. TLX is designed to be transport-agnostic, with a modern data structure.
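To make ST 12’s built-in assumptions concrete, here is a minimal Python sketch of classic non-drop-frame timecode arithmetic. The integer frames-per-second and the fixed HH:MM:SS:FF label, wrapping at 24 hours, are exactly the kinds of constraints (along with drop-frame compensation for fractional rates like 29.97) that an extensible label aims to move beyond.

```python
def frames_to_timecode(frame_count: int, fps: int) -> str:
    """Convert an absolute frame count to HH:MM:SS:FF at an integer frame rate.
    Fractional broadcast rates (29.97, 59.94) need drop-frame compensation,
    one of ST 12's long-standing complications, which is omitted here."""
    frames = frame_count % fps
    total_seconds = frame_count // fps
    seconds = total_seconds % 60
    minutes = (total_seconds // 60) % 60
    hours = (total_seconds // 3600) % 24   # ST 12 labels wrap at 24 hours
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}:{frames:02d}"

print(frames_to_timecode(1_800_090, 30))   # -> 16:40:03:00
```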

• Blindsided: The Game-Changers We Might Not See Coming (Mark Harrison, Digital Production Partnership)
The world’s number one company for gaming revenue makes as much as Sony and Microsoft combined. It isn’t American or Japanese. Marketers project that by 2019, spending on video advertising for out-of-home displays will rival their spending on TV. Meanwhile, a single US tech giant could buy every franchise of the top five US sports leagues. From its offshore reserves. And still have $50 billion in change.

We all know consumers like OTT video. But that’s the least of it. There are trends in the digital economy that, if looked at globally, could have sudden, and profound, implications for the professional content creation industry. In this eye-widening presentation, Mark Harrison steps outside the western-centric, professional media industry perspective to join the technology, consumer and media dots and ask: what could blindside us if we don’t widen our point of view?

• Interactive Storytelling: Choose What Happens Next (Andy Schuler, Netflix)
Looking to experiment with nonlinear storytelling, Netflix launched its first interactive episodes in 2017. Both were children’s titles, and the shows encouraged even the youngest viewers to touch or click on their screens to control the trajectory of the story (think Choose Your Own Adventure books from the 1980s). How did Netflix overcome some of the more interesting technical challenges of the project (i.e., mastering, encoding, streaming), how was SMPTE IMF used to streamline the process, and why are more formalized mastering practices needed for future projects?

• HPA Engineering Excellence Award Winners (Moderator: Joachim Zell, EFILM, Chair HPA Engineering Excellence Awards; Joe Bogacz, Canon; Paul Saccone, Blackmagic Design; Lance Maurer, Cinnafilm; Michael Flathers, IBM; Dave Norman, Telestream).

Since the HPA launched in 2008, the HPA Awards for Engineering Excellence have honored some of the most groundbreaking, innovative, and impactful technologies. Spend a bit of time with a select group of winners and their contributions to the way we work and the industry at large.

• The Navajo Strategic Digital Plan (John Willkie, Luxio)

• Adapting to a COTS Hardware World (Moderator: Stan Moote, IABM)
Transitioning to off-the-shelf hardware is one of the biggest topics on all sides of the industry, from manufacturers, software and service providers through to system integrators, facilities and users themselves. It’s also incredibly uncomfortable. Post production was an early adopter of specialized workstations (e.g. SGI), and has now embraced a further migration up the stack to COTS hardware and IP networks, whether bare metal, virtualized, hybrid or fully cloud-based. As the industry deals with the global acceleration of formats, platforms and workflows, what are the limits of COTS hardware when software innovation is continually testing the limits of general-purpose CPUs, GPUs and network protocols? The session covers “hidden” issues in using COTS hardware from the point of view of users and facility operators as well as manufacturers, service providers and systems integrators.

• Academy Software Foundation: Enabling Cross-Industry Collaboration for Open Source Projects (David Morin, Academy Software Foundation)
In August 2018, the Academy of Motion Picture Arts and Sciences and The Linux Foundation launched the Academy Software Foundation (ASWF) to provide a neutral forum for open source software developers in the motion picture and broader media industries to share resources and collaborate on technologies for image creation, visual effects, animation and sound. This presentation will explain why the Foundation was formed and how it plans to increase the quality and quantity of open source contributions by lowering the barrier to entry for developing and using open source software across the industry.

AJA ships HDR Image Analyzer developed with Colorfront

AJA is now shipping HDR Image Analyzer, a realtime HDR monitoring and analysis solution developed in partnership with Colorfront. HDR Image Analyzer features waveform, histogram and vectorscope monitoring and analysis of 4K/UltraHD/2K/HD, HDR and WCG content for broadcast and OTT production, post, QC and mastering.

Combining AJA’s video I/O with HDR analysis tools from Colorfront in a compact 1RU chassis, the HDR Image Analyzer features a toolset for monitoring and analyzing HDR formats, including Perceptual Quantizer (PQ) and Hybrid Log Gamma (HLG) for 4K/UltraHD workflows. The HDR Image Analyzer takes in up to 4K sources across 4x 3G-SDI inputs and loops the video out, allowing analysis at any point in the production workflow.
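For context on what PQ analysis involves, the sketch below implements the SMPTE ST 2084 (PQ) EOTF in Python, mapping a normalized code value to absolute luminance in nits. The constants come from the published standard; the code is an illustration, not anything from AJA’s or Colorfront’s implementation.

```python
# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(e: float) -> float:
    """Map a normalized PQ signal value (0.0-1.0) to luminance in cd/m^2."""
    p = e ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(round(pq_to_nits(1.0)))    # 10000: the 10,000-nit peak of the PQ system
print(round(pq_to_nits(0.58)))   # ~200: mid-highlight territory
```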

Additional feature highlights include:
– Support for display-referred SDR (Rec.709), HDR ST 2084/PQ and HLG analysis
– Support for scene-referred ARRI, Canon, Panasonic, Red and Sony camera color spaces
– Display and color processing look up table (LUT) support
– Automatic color space conversion based on the award-winning Colorfront Engine
– CIE graph, vectorscope, waveform and histogram support
– Nit levels and phase metering
– False color mode to easily spot out-of-gamut/out-of-brightness pixels (see the sketch after this list)
– Advanced out-of-gamut and out-of-brightness detection with error tolerance
– Data analyzer with pixel picker
– Line mode to focus a region of interest onto a single horizontal or vertical line
– File-based error logging with timecode
– Reference still store
– UltraHD UI for native-resolution picture display
– Up to 4K/UltraHD 60p over 4x 3G-SDI inputs, with loop out
– SDI auto signal detection
– Loop through output to broadcast monitors
– Three-year warranty
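As referenced in the false-color item above, the underlying idea fits in a few lines of NumPy. This is a hypothetical illustration of the concept, assuming linear-light RGB frames already expressed in nits; it is not the analyzer’s actual implementation.

```python
import numpy as np

def false_color(rgb_nits: np.ndarray, max_nits: float = 1000.0) -> np.ndarray:
    """rgb_nits: HxWx3 array of linear-light pixel values in cd/m^2.
    Out-of-brightness pixels are painted red, out-of-gamut pixels blue."""
    out = rgb_nits / max_nits                     # normalize for display
    too_bright = rgb_nits.max(axis=-1) > max_nits
    out_of_gamut = (rgb_nits < 0.0).any(axis=-1)  # negative after conversion
    out[too_bright] = [1.0, 0.0, 0.0]
    out[out_of_gamut] = [0.0, 0.0, 1.0]
    return out

frame = np.array([[[120.0, 90.0, 60.0],      # legal pixel
                   [1500.0, 900.0, 700.0],   # above the brightness ceiling
                   [50.0, -4.0, 30.0]]])     # outside the target gamut
print(false_color(frame))
```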

The HDR Image Analyzer is the second technology collaboration between AJA and Colorfront, following the integration of Colorfront Engine into AJA’s FS-HDR realtime HDR/WCG converter. Colorfront has exclusively licensed its Colorfront HDR Image Analyzer software to AJA for the HDR Image Analyzer.

The HDR Image Analyzer is available through AJA’s worldwide reseller network for $15,995.

Roundtable Post tackles HFR, UHD and HDR image processing

If you’re involved in post production, especially episodic TV, documentaries and feature films, then it’s highly probable that High Frame Rate (HFR), Ultra High Definition (UHD) and High Dynamic Range (HDR) have come your way.

“On any single project, the combination of HFR, UHD and HDR image-processing can be a pretty demanding, cutting-edge technical challenge, but it’s even more exacting when particular specs and tight turnarounds are involved,” says Jack Jones, digital colorist and CTO of full-service boutique facility Roundtable Post Production.

Among the central London facility’s credits are online virals for brands including Kellogg’s, Lurpak, Rolex and Ford, music films for Above & Beyond and John Mellencamp, plus broadcast TV series and feature documentaries for ITV, BBC, Sky, Netflix, Amazon, Discovery, BFI, Channel 4, Showtime and film festivals worldwide. These include Sean McAllister’s A Northern Soul, Germaine Bloody Greer (BBC) and White Right: Meeting The Enemy (ITV Exposure/Netflix).

“Yes, you can render-out HFR/UHD/HDR deliverables from a variety of editing and grading systems, but there are not many that can handle the simultaneous combination of these formats, never mind the detailed delivery stipulations and crunching deadlines that often accompany such projects,” says Jones.

Rewinding to the start of 2017, Jones says, “Looking forward to the future landscape of post, the proliferation of formats, resolutions, frame rates and color spaces involved in modern screened entertainment seemed an inevitability for our business. We realized that we were going to need to tackle the impending scenario head-on. Having assessed the alternatives, we took the plunge and gambled on Colorfront Transkoder.”

Transkoder is a standalone, automated system for fast digital file conversion. Roundtable Post’s initial use of Colorfront Transkoder turned out to be the creation of encrypted DCP masters and worldwide deliverables of a variety of long-form projects, such as Nick Broomfield’s Whitney: Can I Be Me, Noah Media Group’s Bobby Robson: More Than a Manager, Peter Medak’s upcoming feature The Ghost of Peter Sellers, and the Colombian feature-documentary To End A War, directed by Marc Silver.

“We discovered from these experiences that, along with incredible quality in terms of image science, color transforms and codecs, Transkoder is fast,” says Jones. “For example, the deliverables for To End A War involved 10 different language versions, plus subtitles. It would have taken several days to render these straight out of an Avid, but rendering in Transkoder took just four hours.”

More recently, Roundtable Post was faced with the task of delivering country-specific graphics packages, designed and created by production agency Noah Media Group, for use by FIFA rights holders and broadcasters during the 2018 World Cup.

The project involved delivering a mix of HFR, UHD, HDR and HD SDR formats, resulting in 240 bespoke animations, and the production of a mammoth 1,422 different deliverables. These included: 59.94p UHD HDR, 50p UHD HDR, 59.94p HD SDR, 50p HD SDR, 59.94i HD SDR and 50i HD SDR with a variety of clock, timecode, pre-roll, soundtrack, burn-in and metadata requirements as part of the overall specification. Furthermore, the job encompassed the final QC of all deliverables, and it had to be completed within a five-day work week.

“For a facility of our size, this was a significant job in terms of its scale and deadline,” says Jones. “Traditionally, projects like these would involve throwing a lot of people and time at them, and there’s always the chance of human error creeping in. Thankfully, we already had positive experiences with Transkoder, and were eager to see how we could harness its power.”

Using technical data from FIFA, Jones built an XML file containing timelines with all of the relevant timecode, clock, image metadata, WAV audio and file-naming information for the required deliverables. He also liaised with Colorfront’s R&D team, and was quickly provided with an initial set of Python script templates that would help to automate the various requirements of the job in Transkoder.
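To give a flavor of that kind of automation, here is a hypothetical Python sketch that expands a deliverables matrix into individual render-job descriptions. Every field name, value and file-naming convention here is invented for illustration; it does not reproduce the FIFA specification or Transkoder’s scripting API.

```python
from itertools import product

FORMATS = [("59.94p", "UHD", "HDR"), ("50p", "UHD", "HDR"),
           ("59.94p", "HD", "SDR"),  ("50p", "HD", "SDR"),
           ("59.94i", "HD", "SDR"),  ("50i", "HD", "SDR")]
COUNTRIES = ["ARG", "BRA", "ENG", "FRA"]           # truncated for the example
ANIMATIONS = ["opener", "lower_third", "replay"]   # truncated for the example

def render_jobs():
    """Yield one job description per deliverable permutation."""
    for (rate, res, rng), country, anim in product(FORMATS, COUNTRIES, ANIMATIONS):
        yield {
            "source": f"{country}_{anim}.mov",
            "rate": rate, "resolution": res, "range": rng,
            "output": f"WC2018_{country}_{anim}_{res}_{rng}_{rate}.mxf",
        }

jobs = list(render_jobs())
print(len(jobs), "jobs;", jobs[0]["output"])
```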

Roundtable Post was able to complete the FIFA 2018 World Cup job, including the client-attend QC of the 1,422 different UHD HDR and HD SDR assets, in under three days.

Our Virtual Color Roundtable

By Randi Altman

The number of things you can do with color in today’s world is growing daily. It’s not just about creating a look anymore; it’s about using color to tell or enhance a story. And because filmmakers recognize this power, they are getting colorists involved in the process earlier than ever before. And while the industry is excited about HDR and all it offers, the process also creates its own set of challenges and costs.

To find out what those in the trenches are thinking, we reached out to makers of color gear as well as hands-on colorists with the same questions, all in an effort to figure out today’s trends and challenges.

Company 3 Senior Colorist Stephen Nakamura
Company 3 is a global group of creative studios specializing in color and post services for features, TV and commercials. 

How has the finishing of color evolved most recently?
By far, the most significant change in the work that I do is the requirement to master for all the different exhibition mediums. There’s traditional theatrical projection at 14 footlamberts (fL) and HDR theatrical projection at 30fL. There’s IMAX. For home video, there’s UHD and different flavors of HDR. Our task with all of these is to master the movie so it feels and looks the way it’s supposed to feel and look on all the different formats.

There’s no one-size-fits-all approach. The colorist’s job is to work with the filmmakers and make those interpretations. At Company 3 we’re always creating custom LUTs. There are other techniques that help us get where we need to be to get the most out of all these different display types, but there’s no substitute for taking the time and interpreting every shot for the specific display format.

How has laser projection and HDR impacted the work? And do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
Not too long ago, a cinematographer could expose an image specifically for one display format — a film print projected at 14fL. They knew exactly where they could place their highlights and shadows to get a precise look onscreen. Today, they’re thinking in terms of the HDR version, where if they don’t preserve detail in the blacks and whites it can really hurt the quality of the image in some of the newer display methods.

I work frequently with Dariusz Wolski (Sicario: Day of the Soldado, All the Money in the World). We’ve spoken about this a lot, and he’s said that when he started shooting features, he often liked to expose things right at the edge of underexposure because he knew exactly what the resulting print would be like. But now, he has to preserve the detail and fine-tune it with me in post because it has to work in so many different display formats.

There are also questions about how the filmmakers want to use the different ways of seeing the images. Sometimes they really like the qualities of the traditional theatrical standard and don’t want the HDR version to look very different, rather than making the most of the dynamic range. If we have more dynamic range, more light, to work with, it means that in essence we have a larger “canvas” to work on. But you need to take the time to individually treat every shot if you want to get the most out of that “canvas.”

Where do you see the industry moving in the near future?
The biggest change I expect to see is the development of even brighter, higher-contrast exhibition mediums. At NAB, Sony unveiled this wall of LED panels that are stitched together without seams and can display up to 1000 nits. It can be the size of a screen in a movie theater. If that took off, it could be a game changer. If theatrical exhibition gets better with brighter, higher-contrast screens, I think the public will enjoy it, provided that the images are mastered appropriately.

Sicario: Day of the Soldado

What is the biggest challenge you see for the color grading process now and beyond?
As there are more formats, there will be more versions of the master. From P3 to Rec.709 to HDR video in PQ — they all translate color information differently. It’s not just the brightness and contrast but the individual colors. If there’s a specific color blue the filmmakers want for Superman’s suit, or red for Spiderman, or whatever it is, there are multiple layers of challenges involved in maintaining those across different displays. Those are things you have to take a lot of care with when you get to the finishing stage.
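To make the point concrete, the sketch below carries a linear-light Rec.709 blue into P3-D65 through CIE XYZ, using commonly published D65 primaries matrices. It is a simplified illustration (linear light only, no transfer functions, tone mapping or gamut clipping), not Company 3’s pipeline.

```python
import numpy as np

M_709_TO_XYZ = np.array([[0.4124564, 0.3575761, 0.1804375],
                         [0.2126729, 0.7151522, 0.0721750],
                         [0.0193339, 0.1191920, 0.9503041]])
M_P3D65_TO_XYZ = np.array([[0.4865709, 0.2656677, 0.1982173],
                           [0.2289746, 0.6917385, 0.0792869],
                           [0.0000000, 0.0451134, 1.0439444]])

def rec709_to_p3d65(rgb):
    """Map a linear Rec.709 RGB triplet to linear P3-D65 RGB via XYZ."""
    return np.linalg.inv(M_P3D65_TO_XYZ) @ (M_709_TO_XYZ @ np.asarray(rgb))

hero_blue = [0.10, 0.20, 0.85]      # a linear "suit blue" as mastered in Rec.709
print(rec709_to_p3d65(hero_blue))   # different code values, same perceived color
```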

What’s the best piece of work you’ve seen that you didn’t work on?
I know it was 12 years ago now, but I’d still say 300, which was colored by Company 3 CEO Stefan Sonnenfeld. I think that was enormously significant. Everyone who has seen that movie is aware of the graphic-novel-looking imagery that Stefan achieved in color correction working with Zack Snyder and Larry Fong.

We could do a lot in a telecine bay for television, but a lot of people still thought of digital color correction for feature films as an extension of the timing process from the photochemical world. But the look in 300 would be impossible to achieve photo-chemically, and I think that opened a lot of people’s minds about the power of digital color correction.

Alt Systems Senior Product Specialist Steve MacMillian
Alt Systems is a systems provider, integrating compositing, DI, networking and storage solutions for the media and entertainment industry.

How has the finishing of color evolved most recently?
Traditionally, there has been a huge difference between the color finishing process for television production and for cinematic release. It used to be that a target format was just one thing, and finishing for TV was completely different from finishing for the cinema.

Colorists working on theatrical films will spend most of their effort on grading for projection, and only afterward do a detailed trim pass to make a significantly different version for the small screen. Television colorists, who are usually under much tighter schedules, will often only be concerned with making Rec.709 look good on a standard broadcast monitor. Unless a great deal of care is taken to preserve the color and dynamic range of the digital negative throughout the process, the Rec.709 grade will not be suitable for translation to other expanded formats like HDR.

Now, there is an ever-growing number of distribution formats with different color and brightness requirements. And with the expectation of delivering to all of these on ever-tighter production budgets, it has become important to use color management techniques so that the work is not duplicated. If done properly, this allows for one grade to service all of these requirements with the least amount of trimming needed.

How has laser projection and HDR impacted the work?
HDR display technology, in my opinion, has changed everything. The biggest impact on color finishing is the need for monitoring in both HDR and SDR in different color spaces. Also, there is a much larger set of complex delivery requirements, along with the need for greater technical expertise and capabilities. Much of this complexity can be reduced by having the tools that make the various HDR image transforms and complex delivery formats as automatic as possible.

Color management is more important than ever. Efficient and consistent workflows are needed for dealing with multiple sources with unique color sciences, integrating visual effects and color grading while preserving the latitude and wide color gamut of the image.

The color toolset should support remapping to multiple deliverables in a variety of color spaces and luminance levels, and include support for dynamic HDR metadata systems like Dolby Vision and HDR10+. As HDR color finishing has evolved, so has the way it is delivered to studios. Most commonly it is delivered as an HDR IMF package. It is common that Rec.2020 HDR deliverables be color-constrained to the P3 color volume, and also that light-level histograms and HDR QC reports be delivered.
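As an aside on those light-level deliverables: the MaxCLL and MaxFALL values carried with HDR masters (per CTA-861.3) reduce to two simple statistics once per-pixel luminance is known. The Python sketch below assumes luminance frames in nits have already been computed.

```python
import numpy as np

def light_levels(frames):
    """frames: iterable of HxW arrays of pixel luminance in cd/m^2.
    Returns (MaxCLL, MaxFALL): the brightest pixel anywhere in the program,
    and the highest frame-average light level."""
    max_cll = 0.0
    max_fall = 0.0
    for f in frames:
        max_cll = max(max_cll, float(f.max()))
        max_fall = max(max_fall, float(f.mean()))
    return max_cll, max_fall

frames = [np.random.uniform(0, 600, (1080, 1920)) for _ in range(3)]
print(light_levels(frames))
```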

Do you feel DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
Not as much as you would think. Two things are working against this. First, film and high-end digital cameras themselves have for some time been capturing latitude suitable for HDR production. Proper camera exposure is all that is needed to ensure that an image with a wide enough dynamic range is recorded. So from a capture standpoint, nothing needs to change.

The other is cost. There are currently only a small number of suitable HDR broadcast monitors, and most of these are extremely expensive and not designed well for the set. I’m sure HDR monitoring is being used on-set, but not as much as expected for productions destined for HDR release.

Also, it is difficult to truly judge HDR displays in a bright environment, and cinematographers may feel that monitoring in HDR is not needed full time. Traditionally with film production, cinematographers became accustomed to not being able to monitor accurately on-set, and they rely on their experience and other means of judging light and exposure. I think the main concern for cinematographers is the effect of lighting choices and apparent resolution, saturation and contrast when viewed in HDR.

Highlights in the background can potentially become distracting when displayed at 1000 nits versus being clamped at 100. Framing and lighting choices are informed by proper HDR monitoring. I believe we will see more HDR monitoring on-set as more suitable displays become available.

Colorfront’s Transkoder

Where do you see the industry moving in the near future?
Clearly HDR display technology is still evolving, and we will see major advances in HDR emissive displays for the cinema in the very near future. This will bring new challenges and require updated infrastructure for post as well as the cinema. It’s also likely that color finishing for the cinema will become more and more similar to the production of HDR for the home, with only relatively small differences in overall luminance and the ambient light of the environment.

Looking forward, standard dynamic range will eventually go away in the same way that standard definition video did. As we standardize on consumer HDR displays, and high-performance panels become cheaper to make, we may not need the complexity of HDR dynamic remapping systems. I expect that headset displays will continue to evolve and will become more important as time goes on.

What is the biggest challenge you see for the color grading process now and beyond?
We are experiencing a period of change that can be compared to the scope of change from SD to HD production, except it is happening much faster. Even if HDR in the home is slow to catch on, it is happening. And nobody wants their production to be dated as SDR-only. Eventually, it will be impossible to buy a TV that is not HDR-capable.

Aside from the changes in infrastructure, colorists used to working in SDR have some new skills to learn. I think it is a mistake to do separate grading versions for every major delivery format. Even though we have standards for HDR formats, they will continue to evolve, so post production must evolve too. The biggest challenge is meeting all of these different delivery requirements on budgets that are not growing as fast as the formats.

Northern Lights Flame Artist and Colorist Chris Hengeveld
NY- and SF-based Northern Lights, along with sister companies Mr. Wonderful for design, SuperExploder for composing and audio post, and Bodega for production, offers one-stop-shop services.

How has the finishing of color evolved most recently?
It’s interesting that you use the term “finishing of color.” In my clients’ world, finishing and color now go hand in hand. My commercial clients expect not only a great grade but seamless VFX work in finalizing their spots. Both of these are now often taking place with the same artist. Work has been pushed from just straight finishing with greenscreen, product replacement and the like to doing a grade up to par with some of the higher-end coloring studios. Price is pushing vastly separate disciplines into one final push.

Clients now expect to have a rough look ready not only of the final VFX, but also of the color pass before they attend the session. I usually only do minor VFX tweaks when clients arrive. Sending QuickTimes back and forth between studio and client usually gets us to a place where our client, and their client, are satisfied with at least the direction if not the final composites.

Color, as a completely subjective experience, is best enjoyed with the colorist in the room. We do grade some jobs remotely, but my experience has clearly been that from both time and creativity standpoints, it’s best to be in the grading suite. Unfortunately, recently due to time constraints and budget issues, even higher-end projects are being evaluated on a computer/phone/tablet back at the office. This leads to more iterations and less “the whole is greater than the sum of the parts” mentality. Client interaction, especially at the grading level, is best enjoyed in the same room as the colorist. Often the final product is markedly better than what either could envision separately.

Where do you see the industry moving in the near future?
I see the industry continuing to coalesce around multi-divisional companies that are best suited to fulfill many clients’ needs at once. Most projects that come to us have diverse needs that center around one creative idea. We’re all just storytellers. We do our best to tell the client’s story with the best talent we offer, in a reasonable timeframe and at a reasonable cost.

The future will continue to evolve, putting more pressure on the editorial staff to deliver near perfect rough cuts that could become finals in the not-too-distant future.

Invisalign

The tools continue to level the playing field. More generalists will be trained in disciplines including video editing, audio mixing, graphic design, compositing and color grading. This is not to say that the future of singularly focused creatives is over. It’s just that those focused creatives are assuming more and more responsibilities. This is a continuation of the consolidation of roles that has been going on for several years now.

What is the biggest challenge you see for the color grading process now and beyond?
The biggest challenge going forward is both technical and budgetary. Many new formats have emerged, including the new ProRes RAW. New working color spaces have also emerged. Many of us work without on-staff color scientists and must find our way through the morass of HDR, ACES, Scene Linear and Rec.709. Working with materials that round trip in-house is vastly easier than dealing with multiple shops all with their own way of working. As we collaborate with outside shops, it behooves us to stay at the forefront of technology.

But truth be told, perhaps the biggest challenge is keeping the creative flow and putting the client’s needs first. Making sure the technical challenges don’t get in the way. Clients need to see a seamless view without technical hurdles.

What’s the best piece of work you’ve seen that you didn’t work on?
I am constantly amazed at the quality of work coming out of Netflix. Some of the series are impeccably graded. Early episodes of Bloodline, which was shot with the Sony F65, come to mind. The visuals were completely absorbing, both daytime and nighttime scenes.

Codex VP Business Development Brian Gaffney
Codex designs tools for color, dailies creation, archiving, review and network-attached storage. Their offerings include the new Codex ColorSynth with Keys and the MediaVault desktop NAS.

How has the finishing of color evolved most recently?
While it used to be a specialized suite in a post facility, color finishing has evolved tremendously over the last 10 years, with low-cost access to powerful systems like Resolve for everything from on-set use and commercial finishing to final DI color grading. These systems have evolved into more than just color. Now they are editorial, sound mixing and complete finishing platforms.

How has laser projection and HDR impacted the work?
Brighter images in the theatre and the home, via laser projection, OLED walls and HDR displays, will certainly change the viewers’ experience, and they have helped create more work in post, offering up another pass for grading.

However, brighter images also show off image artifacts and can bring attention to highlights that may already be clipping. Shadow detail that was graded in SDR may now look milky in HDR. These new display mediums require that you spend more time optimizing the color correction for both display types. There is no magic one grade fits all.

Do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
I think cinematographers are still figuring this out. Much like color correction between SDR and HDR, lighting for the two is different. A window that was purposely blown out in SDR, to hide a lighting rig outside, may show up in HDR, exposing the rig itself. Color correction might be able to correct for this, but unless a cinematographer can monitor in HDR on-set, these issues will come up in post. To do it right, lighting optimization between the two spaces is required, plus SDR and HDR monitoring on-set and near-set and in editorial.

Where do you see the industry moving in the near future?
It’s all about content. With the traditional studio infrastructure and broadcast television market changing to Internet Service Providers (ISPs), the demand for content, both original and HDR remastered libraries, is helping prop up post production and is driving storage- and cloud-based services.

Codex’s ColorSynth and Media Vault

In the long term, if the competition in this space continues and the demand for new content keeps expanding, traditional post facilities will become “secure data centers” and managed service providers. With cloud-based services, the talent no longer needs to be in the facility with the client. Shared projects with realtime interactivity from desktop and mobile devices will allow more collaboration among global-based productions.

What is the biggest challenge you see for the color grading process now and beyond?
Project management — sharing color set-ups among different workstations. Monitoring of the color with properly calibrated displays in both SDR and HDR, and in support of multiple deliverables, is always a challenge. New display technologies, like laser projection and the new Samsung and Sony videowalls, may not be cost-effective for the creative community to access for final grading. Only certain facilities may wind up specializing in this type of grading experience, limiting global access for directors and cinematographers to fully visualize how their product will look on these new display mediums. It’s a cost that may not get the needed ROI, so in the near future many facilities may not be able to support the full demand of deliverables properly.

Blackmagic Director of Sales/Operations Bob Caniglia
Blackmagic creates DaVinci Resolve, a solution that combines professional offline and online editing, color correction, audio post production and visual effects in one software tool.

How has the finishing of color evolved most recently?
The ability to work in 8K, and whatever flavor of HDR you see, is happening. But if you are talking evolution, it is about the ability to collaborate with everyone in the post house, and the ability to do high-quality color correction anywhere. Editors, colorists, sound engineers and VFX artists should not be kept apart or kept from being able to collaborate on the same project at the same time.

New collaborative workflows will speed up post production because you will no longer need to import, export or translate projects between different software applications.

How has laser projection and HDR impacted the work?
The most obvious impact has been on the need for colorists to be using software that can finish a project in whatever HDR format the client asks for. That is the same with laser projection. If you do not use software that is constantly updating to whatever new format is introduced, being able to bid on HDR projects will be hard.

HDR is all about more immersive colors. Any colorist should be ecstatic to be able to work with images that are brighter, sharper and with more data. This should allow them to be even more creative with telling a story with color.

Do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
As for cinematographers, HDR gives viewers a whole new level of image detail. But that hyper-reality could draw the viewer away from the intended target in a shot. The beautiful details shining back on a coffee pot in a tracking shot may not be worth worrying about in SDR, but in HDR every shot will create more work for the colorist to make sure the viewer doesn’t get distracted by the little things. For DPs, it means they are going to have to be much more aware of lighting, framing and planning the impact of every possible item and shadow in an image.

Where do you see the industry moving in the near future?
Peace in our time amongst all of the different post silos, because those silos will finally be open. And there will be collaboration between all parts of the post workflow. Everyone — audio, VFX, editing and color correction — can work together on the same project seamlessly.

For example, in our Resolve tool, post pros can move between them all. This is what we see happening with colorists and post houses right now, as each member of the post team can be much more creatively flexible because anyone can explore new toolsets. And with new collaboration tools, multiple assistants, editors, colorists, sound designers and VFX artists can all work on the same project at the same time.

Resolve 15

For a long-term view, you will always have true artists in each of the post areas. People who have mastered the craft and can separate themselves as being color correction artists. What is really going to change is that everyone up and down the post workflow at larger post houses will be able to be much more creative and efficient, while small boutique shops and freelancers can offer their clients a full set of post production services.

What is the biggest challenge you see for the color grading process now and beyond?
Speed and flexibility. With everyone now collaborating and the colorist being part of every stage of the post process, you will be asked to do things immediately… and in any format. So if you are not able to work in real time or with whatever footage format is thrown at you, they will find someone who can.

This also comes with the challenge of changing the old notion that the colorist is one of the last people to touch a project. You will be asked to jump in early and often. Because every client would love to show early edits that are graded to get approvals faster.

FilmLight CEO Wolfgang Lempp
FilmLight designs, creates and manufactures color grading systems, image processing applications and workflow tools for the film and television industry.

How has the finishing of color evolved recently?
When we started FilmLight 18 years ago, color management was comparatively simple: Video looked like video, and digital film was meant to look like film. And that was also the starting point for the DCI — the digital cinema standard tried to make digital projection look exactly like conventional cinema. This understanding lasted for a good 10 years, and even ACES today is very much built around film as the primary reference. But now we have an explosion of new technologies, new display devices and new delivery formats.

There are new options in resolution, brightness, dynamic range, color gamut, frame rate and viewing environments. The idea of a single deliverable has gone: There are just too many ways of getting the content to the viewer. That is certainly affecting the finishing process — the content has to look good everywhere. But there is another trend visible, too, which here in the UK you can see best on TV. The color and finishing tools are getting more powerful and the process is getting more productive. More programs than ever before are getting a professional color treatment before they go out, and they look all the better for it.

Either way, there is more work for the colorist and finishing house, which is of course something we welcome.

How has laser projection and HDR impacted the work?
Laser projection and HDR for cinema and TV are examples of what I described above. We have the color science and the tools to move comfortably between these different technologies and environments, in that the color looks “right,” but that is not the whole story.

The director and DP will choose to use a format that will best suit their story, and will shoot for their target environment. In SDR, you might have a bright window in an interior scene, for example, which will shape the frame but not get in the way of the story. But in HDR, that same window will be too bright, obliterate the interior scene and distract from the story. So you would perhaps frame it differently, or light up the interior to restore some balance. In other words, you have to make a choice.

HDR shouldn’t be an afterthought, it shouldn’t be a decision made after the shoot is finished. The DP wants to keep us on the edge of our seats — but you can’t be on the edge in HDR and SDR at the same time. There is a lot that can be done in post, but we are still a long way from recreating the multispectral, three-dimensional real world from the output of a camera.

HDR, of course, looks fantastic, but the industry is still learning how to shoot for best effect, as well as how to serve all the distribution formats. It might well become the primary mastering format soon, but SDR will never go away.

Where do you see the industry moving in the future?
For me, it is clear that as we have pushed resolution, frame rate, brightness and color gamut, it has affected the way we tell stories. Less is left to the imagination. Traditional “film style” gave a certain pace to the story, because there was the expectation that the audience was having to interpret, to think through, to fill in the black screen in between.

Now technology has made things more explicit and more immersive. We now see true HDR cinema technology emerging with a brightness of 600 nits and more. Technology will continue to surge forward, because that is how manufacturers sell more televisions or projectors — or even phones. And until there is a realistic simulation of a full virtual reality environment, I don’t see that process coming to a halt. We have to be able to master for all these new technologies, but still ensure compatibility with existing standards.

What is the biggest challenge for color grading now and in the future?
Color grading technology is very much unfinished business. There is so much that can be done to make it more productive, to make the content look better and to keep us entertained.

Blackboard

As much as we might welcome all the extra work for our customers, generating an endless stream of versions for each program is not what color grading should be about. So it will be interesting to see how this problem will be solved. Because one way or another, it will have to be. But while this is a big challenge, it hopefully isn’t what we put all our effort into over the coming years.

The real challenge is to understand what makes us appreciate certain images over others. How composition and texture, how context, noise and temporal dynamics — not just color itself — affect our perception.

It is interesting that film as a capture medium is gaining popularity again, especially large-format capture. It is also interesting that the “film look” is still precious when it comes to color grading. It puts all the new technology into perspective. Filmmaking is storytelling. Not just a window to the world outside, replaced by a bigger and clearer window with new technology, but a window to a different world. And the colorist can shape that world to a degree that is limited only by her imagination.

Olympusat Entertainment Senior DI Colorist Jim Wicks
A colorist since 2007, Jim has been a senior DI colorist at Olympusat Entertainment since 2011. He has color-restored hundreds of classic films and is very active in the color community.

How has the finishing of color evolved most recently?
The phrase I’m keying in on in your question is “most recently.” I believe the role of a colorist has been changing exponentially for the last several years, maybe longer. I would say that we are becoming, if we haven’t already, more like finishing artists. Color is now just one part of what we do. Because technologies are changing more rapidly than at any time I’ve witnessed, we now have a lot to understand and comprehend in addition to just color. There is ACES, HDR, changing color spaces, integrating VFX workflows into our timelines, laser projection and so on. The list isn’t endless, but it’s growing.

How has laser projection and HDR impacted the work?
For the time being, they do not impact my work. I am currently required to deliver in Rec.709. However, within that confine I am grading a wider range of media than ever before, such as 2K and 4K uncompressed DPX; Phantom digital video files; Red Helium 8K in the IPP2 workspace; and much more. Laser projection and HDR are subjects I continue to study by attending symposiums, or wherever I can find that information. I believe laser projection and HDR are important to know now. When the opportunity to work with them is available to me, I plan to be ready.

Do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
Of course! At the very heart of every production, the cinematographer is the creator and author of the image. It is her creative vision. The colorist is the protector of that image. The cinematographer entrusts us with her vision. In this respect, the colorist needs to be in sync with the cinematographer as never before. As cinematographers move because of technology, so we move. It’s all about the deliverable and how it will be displayed. I see no benefit for the colorist and the cinematographer to not be on the same page because of changing technology.

Where do you see the industry moving in the near future and the long-range future?
In the near future: HDR, laser projection, 4K and larger and larger formats.

In the long-range future: I believe we only need to look to the past to see the changes that are inevitably ahead of us.

Technological changes forced film labs, telecine and color timers to change and evolve. In the nearly two decades since O Brother, Where Art Thou?, we no longer color grade movies the way we did back when the Coen classic was released in 2000. I believe it is inevitable: Change begets change. Nothing stays the same.

In keeping with the types of changes that came before, it is only a matter of time before today’s colorist is forced to change and evolve just as those before us were forced to do so. In this respect I believe AI technology is a game-changer. After all, we are moving towards driverless cars. So, if AI advances the way we have been told, will we need a human colorist in the future?

What is the biggest challenge you see for the color grading process now and beyond?
Not to sound like a “get off my lawn” rant, but education is the biggest challenge, and it’s a two-fold problem. Firstly, at many fine film schools in the US, color grading is not taught as a degree-granting course, or at all.

Secondly, the glut of for-profit websites that teach color grading courses has no standardized curriculum, which wouldn’t be a problem, but at present there is no way to measure how much anyone actually knows. I have personally encountered individuals who claim to be colorists and yet do not know how to color grade. As a manager I have interviewed them — their resumes look strong, but their skills are not there. They can’t do the work.

What’s the best piece of work you’ve seen that you didn’t work on?
Just about anything shot by Roger Deakins. I am a huge fan of his work. Mitch Paulson and his team at Efilm did great work on protecting Roger’s vision for Blade Runner 2049.

Colorist David Rivero
This Madrid-born colorist is now based in China. He color grades and supervises the finishing of feature films and commercials, normally all versions, and often the trailers associated with them.

How has the finishing of color evolved most recently?
The line between strictly color grading and finishing is getting blurrier by the year. Although it is true there is still a clearer separation in the commercial world, on the film side the colorist has become the “de facto” finishing or supervising finishing artist. I think it is another sign of the bigger role color grading is starting to play in post.

In the last two to three years I’ve noticed that fewer clients are looking at it as an afterthought, or as simply “color matching.” I’ve seen how the very same people went from a six- to seven-day DI schedule five years ago to a 20-day schedule now. The idea that spending a relatively small amount of extra time and budget on the final step can get you a far superior result is finally sinking in.

The tools and technology are finally moving into a “modern age” of grading:
– HDR is a game changer on the image-side of things, providing a noticeable difference for the audience and a different approach on our side on how to deal with all that information.

– The eventual acceptance by all color systems of what were traditionally compositing or VFX tools is also a turning point, although a controversial one. There are many who think that colorists should focus on grading. However, I think that rather than colorists becoming compositors, it is the color grading concept and mission that is (still) evolving.

How has laser projection and HDR impacted the work? And do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
Well, on my side of the world (China), the laser and HDR technologies are just starting to get to the public. Cinematographers are not really changing how they work yet, as it is a very small fraction of the whole exhibition system.

As for post, it requires more careful handling of the image: higher-quality plates, composites, CG and VFX, and a more careful grade. You can’t get away with as many tricks as you could when it was just SDR. The bright side is the marvelous images, and how different they can be from each other. I believe HDR is totally compatible with every style you could achieve in SDR, while opening the doors to new ones. It also invites different approaches to shooting and lighting for cinematographers and CG artists.

Goldbuster

The biggest challenge it has created has been on the exhibition side in China. Although Dolby cinemas (Vision+Atmos) are controlled and require a specific pass and DCP, there are other laser-projection theaters that show the same DCP that is delivered to common (xenon-lamp) theaters. This creates a frustrating environment. For example, during a 3D grade you not only need to consider the very dark theaters running at 3fL-3.5fL, but also the new laser rooms that rack up their lamps to 7fL-8fL to show off why they charge higher ticket prices.
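For readers who think in nits rather than foot-lamberts, the gap Rivero describes is easy to quantify; 1fL equals about 3.426 nits (cd/m²), a standard photometric constant, and the short script below is just that arithmetic.

    # Convert the foot-lambert levels mentioned above into nits (cd/m^2).
    # 1 fL = 3.426 cd/m^2 (standard photometric constant).
    FL_TO_NITS = 3.426

    for fl in (3.0, 3.5, 7.0, 8.0):
        print(f"{fl:4.1f} fL = {fl * FL_TO_NITS:5.1f} nits")

    # Dark 3D rooms at 3.0-3.5 fL sit around 10-12 nits, while laser rooms
    # pushed to 7-8 fL reach roughly 24-27 nits -- more than double the
    # luminance for the very same DCP.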

Where do you see the industry moving in the near future and the long-range future?
I hope to see the HDR technologies settling and becoming the new standard within the next five to six years, with HDR used as the reference master from which all other deliveries are created. I also expect all these relatively new practices and workflows (involving ACES, EXRs with the VFX/CG passes, non-LUT deliveries) to become more standardized and controlled.

In the long term, I could imagine two main changes happening, closely related to each other:
– The concept of grading and colorist, especially in films or long formats, evolving in importance and relationship within the production. I believe the separation or independence between photography and grading will get wider (and necessary) as tools evolve and the process is more standardized. We might get into something akin to how sound editors and sound mixers relate and work together on the sound.

– The addition of (serious) compositing in essentially all the main color systems is the first step toward the possibilities of future grading. A feature like the recent Face Refinement tool in Resolve is one of the things I dreamed about five or six years ago.

What is the biggest challenge you see for the color grading process now and beyond?
Nowadays one of the biggest challenges is possibly the multi-mastering environment, with several versions in different color spaces, displays and aspect ratios. It is becoming easier, but it is still more painful than it should be.

Shrinking margins also hurt the whole industry. We all work thanks to the profits, but cutting budgets and expecting the same results is not something that is going to happen.

What’s the best piece of work you’ve seen that you didn’t work on?
The Revenant, Mad Max, Fury and 300.

Carbon Colorist Aubrey Woodiwiss
Full-service creative studio Carbon has offices in New York, Chicago and Los Angeles.

How has the finishing of color evolved most recently?
It is always evolving: the tools are becoming ever more powerful, and camera formats are becoming larger, with more range and information in them. Probably the most significant evolution I see is a greater understanding of color science and color space workflows.

How has laser projection and HDR impacted the work? And do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
These elements impact how footage is viewed and dealt with in post. As far as I can see, it isn’t affecting how things are shot.

Where do you see the industry moving in the near future? What about in the long-range future?
I see formats becoming larger, viewing spaces and color gamuts becoming wider, and more streaming- and laptop-based technologies and workflows.

What is the biggest challenge you see for the color grading process now and beyond?
The constant challenge is reconciling the space you traditionally color grade in with how things are viewed outside of that space.

What’s the best piece of work you’ve seen that you didn’t work on?
Knight of Cups, directed by Terrence Malick with cinematography by Emmanuel Lubezki.

Ntropic Colorist Nick Sanders
Ntropic creates and produces work for commercials, music videos, and feature films as well as experiential and interactive VR and AR media. They have locations in San Francisco, Los Angeles and New York City.

How has the finishing of color evolved most recently?
SDR grading in Rec.709 and 2.4 gamma is still here, still looks great, and will be prominent for a long time. However, I think we’re becoming more aware of how exciting grading in HDR is, and how many creative doors it opens. I’ve noticed a feeling of disappointment when switching from the HDR to the SDR version of a project, wondering for a second if I’m accidentally viewing the ungraded raw footage, or if my final SDR grade is actually as flat as it appears to my eyes. There is a dramatic difference between the two formats.

HDR is incredible because you can make the highlights blisteringly hot, saturate a color to nuclear levels or keep things mundane and save those heavier-handed tools in your pocket for choice moments in the edit where you might want some extra visceral impact.

How has laser projection and HDR impacted the work? And do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
In one sense, cinematographers don’t need to do anything differently. Colorists are able to create high-quality SDR and HDR interpretations of the exact same source footage, so long as it was captured in a high-bit-depth raw format and exposed well. We’re even seeing modern HDR reimaginings of classic films. Movies as varied in subject matter as Saving Private Ryan and the original Blade Runner are coming back to life because the latitude of classic film stocks allows it. However, HDR has the power to greatly exaggerate details that may have otherwise been subtle or invisible in SDR formats, so some extra care should be taken in projects destined for HDR.

Extra contrast and shadow detail mean that noise is far more apparent in HDR projects, so ISO and exposure should be adjusted on-set accordingly. Also, the increased highlight range has some interesting consequences in HDR. For example, large blown-out highlights, such as overexposed skies, can look particularly bad. HDR can also retain more detail and color in the upper ranges in a way that may not be desirable. An unremarkable, desaturated background in SDR can become a bright, busy and colorful background in HDR. It might prove distracting to the point that the DP may want to increase his or her key lighting on the foreground subjects to refocus our attention on them.

Panasonic “PvP”

Where do you see the industry moving in the near future? What about the long-range future?
I foresee more widespread adoption of HDR — in a way that I don’t with 3D and VR — because there’s no headset device required to feel and enjoy it. Having some HDR nature footage running on a loop is a great way to sell a TV in Best Buy. Where the benefits of another recent innovation, 4K, are really only detectable on larger screens and begin to deteriorate with the slightest bit of compression in the image pipeline, HDR’s magic is apparent from the first glance.

I think we’ll first start to see HDR and SDR orders on everything, then a gradual phasing out of the SDR deliverables as the technology becomes more ubiquitous, just like we saw with the standard definition transition to HD.

For the long-range, I wouldn’t be surprised to see a phasing out of projectors as LED walls become more common for theater exhibitions due to their deeper black levels. This would effectively blur the line between technologies available for theater and home for good.

What is the biggest challenge you see for the color grading process now and beyond?
The lack of a clear standard makes workflow decisions a little tricky at the moment. One glaring issue is that consumer HDR displays don’t replicate the maximum brightness of professional monitors, so there is a question of mastering one’s work for the present, or for the near future when that higher capability will be more widely available. And where does this evolution stop? 4,000 nits? 10,000 nits?

Maybe a more pertinent creative challenge in the crossover period is which version to grade first, SDR or HDR, and how to produce the other version. There are a couple of ways to go about it, from using LUTs to initiate and largely automate the conversion to starting over from scratch and regrading the source footage in the new format.
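To make the LUT route concrete, here is a minimal, illustrative numpy sketch of a fixed SDR-to-HDR expansion, the kind of transform a conversion LUT might bake in: decode gamma-2.4 SDR to linear light, pin SDR reference white to roughly 203 nits (the ITU-R BT.2408 convention, an assumption on our part) and encode to PQ. This is not any particular facility’s pipeline; real conversions use far more sophisticated tone mapping.

    import numpy as np

    # Naive SDR (gamma 2.4) to HDR (PQ, SMPTE ST 2084) expansion. Illustrative only.
    def sdr_to_pq(sdr_code, diffuse_white_nits=203.0):
        """Map SDR code values in [0, 1] to PQ code values in [0, 1]."""
        linear = np.clip(sdr_code, 0.0, 1.0) ** 2.4      # display-referred light
        nits = linear * diffuse_white_nits               # SDR white -> ~203 nits

        # ST 2084 inverse EOTF, normalized to 10,000 nits
        m1, m2 = 2610 / 16384, 2523 / 4096 * 128
        c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
        y = nits / 10000.0
        return ((c1 + c2 * y**m1) / (1 + c3 * y**m1)) ** m2

    # SDR peak white lands near PQ code 0.58, leaving the top of the HDR range
    # untouched -- one reason many colorists prefer to regrade from the source.
    print(sdr_to_pq(np.array([0.0, 0.18, 1.0])))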

What’s the best piece of work you’ve seen that you didn’t work on?
Chef’s Table on Netflix was one of the first things I saw in HDR; I still think it looks great!

Main Image: Courtesy of Jim Wicks.

Understanding and partnering on HDR workflows

By Karen Moltenbrey

Every now and then a new format or technology comes along that has a profound effect on post production. Currently, that tech is high dynamic range, or HDR, which offers a heightened visual experience through a greater dynamic range of luminosity.

Michel Suissa

So why is HDR important to the industry? “That is a massive question to answer, but to make a pretty long story relatively short, it is by far one of the recent technologies to emerge with the greatest potential to change how images are affecting audiences,” says Michel Suissa, manager of professional solutions at The Studio–B&H. “Regardless of the market and the medium used to distribute programming, irrelevant to where and how these images are consumed, it is a clearly noticeable enhancement, and at the same time a real marketing gold mine for manufacturers as well as content producers, since a premium can be attached to offering HDR as a feature.”

And he should know. Suissa has been helping a multitude of post studios navigate the HDR waters in their quest for the equipment necessary to meet their high dynamic range needs.

Suissa started seeing a growing appetite for HDR roughly three years ago, both in the consumer and professional markets and at about the same time. “Three years ago, if someone had said they were creating HDR content, a very small percentage of the community would have known what they were talking about,” he notes. “Now, if you don’t know what HDR is and you’re in the industry, then you are probably behind the times.”

Nevertheless, HDR is demanding in terms of the knowledge one needs to create HDR content and distribute it, as well as make sure people can consume it in a way that’s satisfying, Suissa points out. “And there’s still a lot of technical requirements that people have to carefully navigate through because it is hardly trivial,” he says.

How does a company like B&H go about helping a post studio select the right tools for their individual workflow needs? “The basic yet critically important task is understanding their workflow, their existing tool set and what is expected of them in terms of delivery to their clients,” says Suissa.

To assist studios and content creators working in post, The Studio–B&H team follows a blueprint that’s based on engaging customers about the nature of the work they do, asking questions like: Which camera material do they work from? In which form is the original camera material used? What platform do they use for editing? What is the preferred application to master HDR images? What is the storage and network infrastructure? What are the master delivery specifications they must adhere to (what flavor of HDR)?

“People have the most difficulty understanding the nature of the workflow: Do the images need to be captured differently from a camera? Do they need to be ingested in the post system differently? Do they need to be viewed differently? Do they need to be formatted differently? Do they need to be mastered differently? All those things created a new set of specifications that people have to learn, and this is where it has changed the way people handle post production,” Suissa contends. “There’s a lot of intricacies, and you have to understand what it is you’re looking at in order to make sure you’re making the correct decisions — not just technically, but creatively as well.”

When adding an HDR workflow, studios typically approach B&H looking for equipment across their entire pipeline. However, Suissa states that similar parameters apply for HDR work as for other high-performance environments. People will continue to need decent workstations, powerful GPUs, professional storage for performance and increased capacity, and an excellent understanding of monitoring. “Other aspects of a traditional pipeline can sometimes remain in play, but it is truly a case-by-case analysis,” he says.

The most critical aspect of working with HDR is the viewing experience, Suissa says, so selecting an appropriate monitoring solution is vital — as is knowing the output specifications that will be used for final delivery of the content.

Without question, Suissa has seen an increase in the number of studios asking about HDR equipment of late. “Generally speaking, the demand by people wanting to at least understand what they need in order to deliver HDR content is growing, and that’s because the demand for content is growing,” he says.

Yes, there are compromises that studios are making in terms of HDR that are based on budget. Nevertheless, there is a tipping point that can lead to the rejection of a project if it is not up to HDR standards. In fact, Suissa foresees in the next six months or so the tightening of standards on the delivery side, whether for Amazon, Netflix or the networks, and the issuance of mandates by over-the-air distribution channels in order for content to be approved as HDR.

B&H/Light Iron Collaboration
Among the studios that have purchased HDR equipment from B&H is Light Iron, a Panavision company with six facilities spanning the US that offer a range of post solutions, including dailies and DI. According to Light Iron co-founder Katie Fellion, the number of their clients requesting HDR finishing has increased in the past year. She estimates that one out of every three clients is considering HDR finishing, and in some cases, they are doing so even if they don’t have distribution in place yet.

Suissa and Light Iron SVP of innovation Michael Cioni gradually began forging a fruitful collaboration during the last few years, partnering a number of times at various industry events. “At the same time, we doubled up on our relationship of providing technology to them,” Suissa adds, whether for demonstrations or for Light Iron’s commercial production environment.

Katie Fellion

For some time, Light Iron has been moving toward HDR, purchasing equipment from various vendors along the way. In fact, Light Iron was one of the very first vendors to become involved with HDR finishing when Amazon introduced HDR-10 mastering for the second season of one of its flagship shows, Transparent, in 2015.

“Shortly after Transparent, we had several theatrical releases that also began to remaster in both HDR-10 and Dolby Vision, but the requests were not necessarily the norm,” says Fellion. “Over the last three years, that has steadily changed, as more studios are selling content to platforms that offer HDR distribution. Now, we have several shows that started their Season 1 with a traditional HD finish, but then transitioned to 4K HDR finishes in order to accommodate these additional distribution platform requirements.”

Some of the more recent HDR-finished projects at Light Iron include Glow (Season 2) and 13 Reasons Why (Season 2) for Netflix, Uncle Drew for Lionsgate, Life Itself for Amazon, Baskets (Season 3) and Better Things (Season 2) for FX and Action Point for Paramount.

Without question, HDR is important to today’s finishing, but one cannot just step blindly into this new, highly detailed world. There are important factors to consider. For instance, the source requirements for HDR mastering — 4K 16-bit files — require more robust tools and storage. “A show that was previously shot and mastered in 2K or HD may now require three or four times the amount of storage in a 4K HDR workflow. Since older post facilities had been previously designed around a 2K/HD infrastructure, newer companies that had fewer issues with legacy infrastructure were able to adopt 4K HDR faster,” says Fellion. Light Iron was designed around a 4K+ infrastructure from day one, she adds, allowing the post house to much more easily integrate HDR at a time when other facilities were still transitioning from 2K to 4K.

Nevertheless, this adoption required changes to the post house’s workflow. Fellion explains: “In a theatrical world, because HDR color is set in a much larger color gamut than P3, the technically correct way to master is to start with the HDR color first and then trim down for P3. However, since HDR theatrical exhibition is still in its infancy, there are no options for most feature films to monitor in a projected environment — which, in a feature workflow, is an expected part of the finishing process. As a result, we often use color-managed workflows that allow us to master first in a P3 theatrical projection environment and then to version for HDR as a secondary pass.”

Light Iron NY colorist Steven Bodner grading the music video Picture Day in HDR on a Sony BVM X300.

In the episodic world, if a project is delivering in HDR, unless creative preference determines otherwise, Light Iron will typically start with the HDR version first and then trim down for the SDR Rec.709 versions.

For either, versioning and delivery have to be considered. For Dolby Vision, Light Iron’s workflow starts with an analysis of the timeline to output an XML for the Rec.709 derivative, explains Fellion. From that derivative, the colorist reviews and tweaks the XML values as necessary, sometimes going back to the HDR version and re-analyzing if a larger adjustment needs to be made for the Rec.709 version. An HDR-10 workflow usually involves a different color pass and delivered file set, as well as analysis of the final HDR sequence to create metadata values, she adds.
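As a rough illustration of what that analysis computes: Dolby Vision’s per-shot L1 metadata boils down to minimum, average (“mid”) and maximum luminance values for each shot, which the XML then carries into the trim pass. The sketch below is a toy version of that idea only; it is not Dolby’s implementation, and the real analysis runs on PQ-encoded frames and writes a proprietary XML schema.

    import numpy as np

    # Toy per-shot luminance analysis in the spirit of Dolby Vision L1 metadata.
    def analyze_shot(frames_nits):
        """frames_nits: iterable of HxW arrays of pixel luminance in nits."""
        lum = np.concatenate([f.ravel() for f in frames_nits])
        return {"min": float(lum.min()),
                "mid": float(lum.mean()),
                "max": float(lum.max())}

    # A fake 24-frame shot with luminance between 0.005 and 1,000 nits.
    shot = [np.random.uniform(0.005, 1000.0, (270, 480)) for _ in range(24)]
    print(analyze_shot(shot))   # e.g. {'min': ~0.005, 'mid': ~500, 'max': ~1000}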

Needless to say, embracing HDR is not without challenges. Currently, HDR is only used in the final color process, since there are not many workflows that support HDR through the dailies or editorial process, says Fellion. “This can certainly be a challenge to creatives who have spent the past few months staring at images in SDR only to have a different reaction when they first view them in HDR.” Also, in HDR there may be elements on screen that weren’t previously visible in SDR dailies or offline (such as outside a window or production cables under a table), which creates new VFX requirements to adjust those elements.

“As more options are developed for on-set monitoring — such as Light Iron’s HDR Video Village System — productions are given an opportunity to see HDR earlier in the process and make mental and physical adjustments to help accommodate for the final HDR picture,” Fellion says.

Having an HDR monitor on set can aid in flagging potential issues that might not be seen in SDR. Currently, however, for dailies and editorial, HDR monitoring is not really used, according to Fellion, who hopes to see that change in the future. Conversely, in the finishing world, “an HDR monitor capable of a minimum 1,000-nit display, such as the Sony [BVM] X300, as well as a consumer-grade HDR UHD TV for client reviews, are part of our standard tool set for mastering,” she notes.

In fact, several months ago, Light Iron purchased new high-end HDR mastering monitors from B&H. The studio also sourced AJA Hi5 4K Plus converter boxes from B&H for its HDR workflow.

And, no doubt, there will be additional HDR equipment needs in Light Iron’s future, as delivery of HDR content continues to ramp up. But there’s a hefty cost involved in moving to HDR. Depending on whether a facility’s DI systems already had the capacity to play back 4K 16-bit files — a key requirement for HDR mastering — the cost can range from a few thousand dollars for a consumer-grade monitor to tens of thousands for professional reference monitoring, DI system, storage and network upgrades, as well as licensing and training for the Dolby Vision platform, according to Fellion.

That is one reason why it’s important for suppliers and vendors to form relationships. But there are other reasons, too. “Those leading the charge [in HDR] are innovators and people you want to be associated with,” Suissa explains. “You learn a lot by associating yourself with professionals on the other side of things. We provide technology. We understand it. We learn it. But we also practice it differently than people who create content. The exchange of knowledge is critical, and it enables us to help our customers better understand the technology they are purchasing.”

Main Image: Netflix’s Glow


Karen Moltenbrey is a longtime technical writer with more than two decades of experience in segments of the CG and post industries.

A colorist weighs in on ‘the new world’ of HDR

By Maxine Gervais

HDR is on many people’s minds these days. Some embrace it, some are hesitant and some simply do not like the idea.

But what is HDR really? I find that manufacturers often use the term too loosely. Anything that offers higher dynamic range can fall into the HDR category, but let’s focus on the luminance and greater contrast ratio brought by HDR.

We have come a long way in the last 12 years — from film prints to digital projection. This was a huge shift, and one could argue it happened relatively fast. Since then, technology has been on fast forward.

Film allows incredible capture of detail in large formats; when digital was first introduced, we couldn’t say the same. At the time, cameras were barely capable of capturing true 2K and wide dynamic range. Many would shoot film and scan it into digital files, hoping to preserve more of the dynamic range offered by film. Eventually, cameras got better and film started to disappear, mostly for convenience and cost reasons.

Through all this, target devices (projectors and monitors) stayed pretty much the same. Monitors went from CRT to plasma to LCD but kept the same characteristics: everything was in the Rec.709 color space at a luminance of 100 nits. Projectors were in the P3 color space, but with a lower luminance of about 48 nits.

Maxine at work on the FilmLight Baselight.

Philosophically, one could argue that all creative intent was in some ways limited by the display. The files might have held much more information than the display was able to show. So, the aesthetics we learned to love were a direct result of the displays’ limitations.

What About Now?
Now, we are at the start of a revolution in these displays. With the introduction of OLED monitors and laser projection for theaters, contrast ratios, color spaces and luminance are larger than before. It is now possible to see the details captured by cameras and/or film. This allows for greater artistic freedom: with less limitation, one can push the aesthetic to a new level.

However, that doesn’t mean everything is suddenly brighter and more colorful. It is very easy to create the same aesthetic one used to love, but it is now possible to bring to the screen details in shadows and highlights that were never an option before. This even means better color separation. What creatives can do with “HDR” is still very much in their control.

The more difficult part is that HDR has not yet taken over theaters or homes. If someone has set their look in a P3 48-nit world and is now asked to take this look to a 4,000-nit P3 PQ display, it might be difficult to decide how to approach it. How do we maintain the original intent yet embrace what HDR has to offer? There are many ways to go about it, and no one way is better than another. You can redefine your look for the new displays, and in some ways have a new look that becomes its own entity, or you can mimic your original look, taking advantage of only a few elements of HDR.
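For context on what a “4,000-nit P3 PQ display” implies: PQ, the SMPTE ST 2084 curve used for HDR masters, encodes absolute luminance rather than a fraction of display white. A normalized code value $N$ maps to luminance $L$ in cd/m² (nits) as

$$L = 10000\left(\frac{\max\!\left(N^{1/m_2} - c_1,\ 0\right)}{c_2 - c_3\,N^{1/m_2}}\right)^{1/m_1}, \qquad m_1 = \tfrac{2610}{16384},\ m_2 = \tfrac{2523 \cdot 128}{4096},\ c_1 = \tfrac{3424}{4096},\ c_2 = \tfrac{2413 \cdot 32}{4096},\ c_3 = \tfrac{2392 \cdot 32}{4096}.$$

Because the curve is anchored in absolute nits all the way up to 10,000, a 48-nit theatrical grade occupies only a small slice of the PQ code range, which is why carrying a cinema look onto a 4,000-nit display is a creative decision rather than a mechanical remap.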

The more we start using brighter luminance, bigger contrast ratios and a bigger color cube as our starting point, the more we will be able to future-proof and protect the creative intent. Retrofitting HDR as an afterthought, on projects that never planned for it, is still difficult and in some cases controversial.

The key is to have those philosophical discussions with creatives ahead of time and come up with a workflow that will have the expected results.

Main Image: Maxine Gervais working with director Albert Hughes on his upcoming film, Alpha.


Maxine Gervais is a senior supervising colorist at Technicolor Hollywood. Her past credits include Black Panther, The 15:17 to Paris, Pitch Perfect 3 and American Sniper.

Colorist Stephen Nakamura on grading Stephen King’s It

By Randi Altman

A scary clown can be thanked for helping boost what had been a lackluster summer box office. In its first weekend, Stephen King’s It opened with an impressive $125 million. Not bad!

Stephen Nakamura

This horror film takes place in a seemingly normal small town, but of course things aren’t what they seem. And while most horror films set most of the action in shadowy darkness, the filmmakers decided to let a lot of this story unfold in the bright glow of daylight in order to make the most of the darkness that eventually takes over. That presented some interesting opportunities for Deluxe’s Company 3 veteran colorist Stephen Nakamura.

How early did you get involved on It?
We came onboard early to do the first trailer. The response on YouTube and other places was enormous. I can’t speak for the filmmakers, but that was when I first realized how much excitement there was out there for this movie.

Had you worked with director Andy Muschietti before? What kind of direction were you given and how did he explain the look he wanted?
One of the concepts about the look that evolved during production, and we continued it in the DI, was this idea that a lot of the film takes place in fairly high-key situations, not the kind of dark, shadowy world some horror films exist in. It’s a period piece. It’s set in a small town that sort of looks like this pleasant place to be, but all this wild stuff is happening! You see these scary movies and everything’s creepy and it’s overcast outside and it’s clearly a horror movie from the outset. Naturally, that can work, but it can be even scarier when you play against that. The violence and everything feels more shocking.

How would you describe the look of the film?
You have the parts that are like I just described and then it does get very dark and shadowy as the action goes into dark spaces and into the sewer. And all that is particularly effective because we’ve kind of gotten to know all the kids who are in what’s called the Losers’ Club, and we’re rooting for them and scared about what might happen to them.

Can you talk about the Dolby Cinema pass? People generally talk about how bright you can get something with HDR, but I understand you were more interested in how dark the image can look.
Right. When you’re working in HDR, like Dolby lets you do, you have a lot more contrast to work with than you do in the normal digital cinema version. I worked on some of the earliest movies to do a Dolby Cinema version, and when I was working with Brad Bird and Claudio Miranda on Tomorrowland, we experimented with how much brighter we could make portions of the frame than what would be possible with normal digital cinema projection, without making the image into something that had a completely different feel from the P3 version. But when you’re in that space, you can also make things appear much much darker too. So the overall level in the theater can get really dark but because of that contrast you can actually see more detail on a person’s face, or a killer clown’s face, even when the overall level is so low. It’s more like you’re really in that dark space.

It doesn’t make it a whole different movie or anything, but it’s a good example of where Dolby can add something to the experience. I’d tell people to see it in Dolby Cinema if they could.

There was obviously a lot of VFX work that helped the terrifying shapeshifting clown, Pennywise, do what he does, but you also did some work on him in the DI, correct?
Yes. We had alpha channel mattes cut around his eyes for every shot he’s in and we used the color corrector to make changes to his eyes. Sometimes the changes were very subtle — making them brighter or pushing the color around — and sometimes we went more extreme, but I don’t want to talk about that too much. People can see for themselves when they see the movie.

What system do you use, and why? How does that tool allow you to be more creative?
I use Blackmagic’s DaVinci Resolve. I’ve been a colorist since the ‘90s and I’ve used Resolve pretty much my whole career. There are other systems out there that are also very good, but for the kinds of projects I do and the way I like to work, I find it the fastest and most intuitive and every time there’s a new upgrade, I find some new tool that helps me be even more efficient.

Canon targets HDR with EOS C200, C200B cinema cameras

Canon has grown its Cinema EOS line of pro cinema cameras with the EOS C200 and EOS C200B. These new offerings target filmmakers and TV productions. They offer two 4K video formats — Canon’s new Cinema RAW Light and MP4 — and are optimized for those interested in shooting HDR video.

Alongside a newly developed dual Digic DV6 image processing system, Canon’s Dual Pixel CMOS AF system and improved operability for pros, these new cameras are built for capturing 4K video across a variety of production applications.

Based on feedback from Cinema EOS users, these new offerings will be available in two configurations, while retaining the same core technologies within. The Canon EOS C200 is a production-ready solution that can be used right out of the box, accompanied by an LCD monitor, LCD attachment, camera grip and handle unit. The camera also features a 1.77 million-dot OLED electronic view finder (EVF). For users who need more versatility and the ability to craft custom setups tailored to their subject or environment, the C200B offers cinematographers the same camera without these accessories and the EVF to optimize shooting using a gimbal, drone or a variety of other configurations.

Canon’s Peter Marr was at Cine Gear demo-ing the new cameras.

New Features
Both cameras feature the same 8.85MP CMOS sensor that combines with a newly developed dual Digic DV6 image processing system to help process high-resolution image data and record video from full HD (1920×1080) and 2K (2048×1080) to 4K UHD (3840×2160) and 4K DCI (4096×2160). A core staple of the third-generation Cinema EOS system, this new processing platform offers wide-ranging expressive capabilities and improved operation when capturing high-quality HDR video.

The combination of the sensor and a newly developed processing system also allows for the support for two new 4K file formats designed to help optimize workflow and make 4K and HDR recording more accessible to filmmakers. Cinema RAW Light, available in 4K 60p/50p at 10-bit and 30p/25p/24p at 12-bit, allows users to record data internally to a CFast card by cutting data size to about one-third to one-fifth of a Cinema RAW file, without losing grading flexibility. Due to the reduced file size, users will appreciate rich dynamic range and easier post processing without sacrificing true 4K quality. Alongside recording to a CFast card, proxy data (MP4) can also be simultaneously recorded to an SD card for use in offline editing.
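A back-of-the-envelope calculation shows what that reduction buys on a card. The roughly 1Gbps figure for 4K Cinema RAW Light below is our assumption for illustration, not a number from this announcement, and filesystem overhead is ignored.

    # Rough recording-time estimate for Cinema RAW Light on a CFast card.
    # The ~1 Gbps data rate is an assumption for illustration only.
    def minutes_per_card(card_gb, bitrate_gbps):
        return card_gb * 8 / bitrate_gbps / 60

    for card_gb in (128, 256):
        print(f"{card_gb} GB CFast ~= {minutes_per_card(card_gb, 1.0):.0f} min "
              "of Cinema RAW Light")

    # An uncompressed Cinema RAW stream at 3-5x that data rate would cut these
    # times to a third or a fifth -- the saving Canon describes above.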

Additionally, filmmakers will be able to record 4K in MP4 format to SD cards at 60/50/30/25/24p at 8-bit. Support for UHD recording allows for use in cinema and broadcasting applications, or in scenarios where long recording times are needed while still maintaining top image quality. The cameras also offer slow-motion full-HD recording at up to 120fps.

The Canon EOS C200 and EOS C200B feature Innovative Focus Control to assist with 4K shooting that demands precise focusing, whether in single or remote operation. According to Canon, its Dual Pixel CMOS AF technology helps expand the distance of the subject area to enable faster focus during 4K video recording. This also allows for highly accurate continuous AF and face-detection AF when using EF lenses. For 4K shoots that call for focus accuracy that can’t be checked on an HD monitor, users can also take advantage of the LCD Monitor LM-V1 (supplied with the EOS C200), which provides intuitive touch-focusing support to help filmmakers achieve sophisticated focusing even as a single operator.

In addition to these features, the cameras offer:
• Oversampling HD processing: enhances sensitivity and helps minimize noise
• Wide DR Gamma: helps reduce overexposure by retaining continuity with a gamma curve
• ISO 100-102,400 and 54dB gain: high quality in both low-sensitivity and low-light environments
• In-camera ND filter: internal ND unit allows cleaning of glass for easier maintenance
• ACESproxy support: delivers standardized color space in images, helping to improve efficiency
• Two SD card and one CFast card slots for internal recording
• Improved grip and Cinema-EOS-system-compatible attachment method
• Support for Canon Cine-Servo and EF cinema lenses

Editing and grading of the Cinema RAW Light video format will be supported in Blackmagic Resolve. Editing will also be possible in Avid Media Composer, using a Canon RAW plugin for Avid Media Access. This format can also be processed using the Canon application, Cinema RAW Development.

Also, Adobe’s Premiere Pro CC will support this format by the end of 2017. Editing will also be possible in Apple’s Final Cut Pro X, using the Canon RAW Plugin for Final Cut Pro X, starting in the second half of this year.

The Canon EOS C200 and EOS C200B are scheduled to be available in August for estimated retail prices of $7,499 and $5,999, respectively. The EOS C200 comes equipped with additional accessories including the LM-V1 LCD monitor, LA-V1 LCD attachment, GR-V1 camera grip and HDU-2 handle unit. Available in September, these accessories will also be sold separately.

Bluefish444 supports Adobe CC and 4K HDR with Epoch card

Bluefish444 Epoch video audio and data I/O cards now support the advanced 4K high dynamic range (HDR) workflows offered in the latest versions of the Adobe Creative Cloud.

Epoch SDI and HDMI solutions are suited for Adobe’s Premiere Pro CC, After Effects CC, Audition CC and other tools that are part of the Creative Cloud. With GPU-accelerated performance for emerging post workflows, including 4K HDR and video over IP, Adobe and Bluefish444 are providing a strong option for pros.

Bluefish444’s Adobe Mercury Transmit support for Adobe Creative Cloud brings improved performance in demanding workflows requiring realtime video I/O from UHD and 4K HDR sequences.

Bluefish444 Epoch video card support adds:
• HD/SD SDI input and output
• 4K/2K SDI input and output
• 12/10/8-bit SDI input and output
• 4K/2K/HD/SD HDMI preview
• Quad split 4K UHD SDI
• Two sample interleaved 4K UHD SDI
• 23, 24, 25, 29, 30fps video input and output
• 48, 50, 59, 60fps video input and output
• Dual-link 1.5Gbps SDI
• 3Gbps level A & B SDI
• Quad link 1.5Gbps and 3Gbps SDI
• AES digital audio
• Analog audio monitoring
• RS-422 machine control
• 12-bit video color space conversions

“Recent updates have enabled performance which was previously unachievable,” reports Tom Lithgow, product manager at Bluefish444. “Thanks to GPU acceleration, and [the] Adobe Mercury Transmit plug-in, Bluefish444 and Adobe users can be confident of smooth realtime video performance for UHD 4K 60fps and HDR content.”

Sony’s offerings at NAB

By Daniel Rodriguez

Sony has always been a company that prioritizes and implements the requests of its customers, constantly innovating across all aspects of production — from initial capture to display. At NAB 2017, Sony’s goal was to build on the benchmarks the company has set in the past few months.

To reflect that focus, Sony’s NAB booth concentrated on four areas: image capture, media solutions, IP Live and HDR (high dynamic range). Sony set out to demonstrate its ability to anticipate future demands in capture and distribution while introducing firmware updates to many of its existing products to complement those demands.

Cameras
Since Sony provides customers and clients with a path from capture to delivery, it’s natural to start with what’s new in imaging. Having already tackled the prosumer market with its introduction of the a7sii, a7rii, FS5 and FS7ii, and firmly established its presence in the cinema camera line with the Sony F5, F55 and F65, Sony’s immediate step wasn’t to follow up on those models so soon, but rather to introduce models that fit more specific needs and situations.

The newest Sony camera introduced at NAB was the UMC-S3CA. Sporting the extremely popular sensor from the a7sii, the UMC-S3CA is a 4K interchangeable-lens E-mount camera that is much smaller than its sensor sibling. Its Genlock ability allows users to monitor, operate and sync many cameras at a time, something extremely promising for emerging media like VR and 360 video. It boasts an incredible ISO range of 100-409,600 and internal 4K UHD recording at 23.98p, 25p and 29.97p in 100Mbps and 60Mbps modes. The small size of this camera is promising for those who love the a7sii but want to employ it in more specific cases, such as crash cams, drones, cranes and sliders.

To complement its current camera line, Sony has released an updated version of its DVF-EL100 electronic viewfinder —the DVF-EL200 (pictured)— which boasts a full 1920x1080 resolution image and is about twice as bright as the previous model. Much like updated versions of Sony’s cameras, this viewfinder’s ergonomics are the result of extensive input from users of the previous model, something the company prides itself on. (Our main image shows the F55 with the DVF-EL200 viewfinder.)

Just because Sony is introducing new products doesn’t mean that it has forgotten about older products, especially those that are part of its camera lines. Prosumer models, like the Sony PXW-Z150 and Sony PXW-FS5, to professional cinema cameras, such as the Sony PMW-F5 and PMW-F55, are all receiving firmware updates coming in July 2017.

The most notable firmware update for the Z150 will be its ability to capture images in HLG (Hybrid Log Gamma) to support easier HDR capture and workflow. The FS5 will also gain the ability to capture in HLG, in addition to the option to change the native ISO from 2000 to 3200 when shooting in SLog2 or SLog3, and 120fps capture at 1080p full HD. While many consider the F65 to be Sony’s flagship camera, some consider the F55 to be the more industry-friendly of Sony’s cinema camera line, and Sony backs that up by increasing its high-frame-rate capture in a new firmware update, which will allow the F55 to record at 72, 75, 90, 96 and 100fps in 4K RAW and in the company’s new compressed Extended Original Camera Negative (X-OCN) format.

X-OCN
Sony’s new X-OCN codec continues to be a highlight of the company’s developments: it boasts an incredible 16-bit depth despite being compressed, yet is virtually indistinguishable from Sony’s own RAW format. Thanks to that compression, its files are roughly 50 percent smaller than 2K 4:3 ArriRaw and 4K ProRes 4444 XQ, and 30 percent smaller than F55 RAW. It is considered one of the most suitable formats for HDR content capture. With cameras like the F5 and F55, and smaller alternatives like the FS7 and FS7II, allowing RAW recording, Sony is offering a nearly indistinguishable alternative that cuts down on storage space and allows more recording time on set.

Speed and Storage
As Sony continues to increase its support for HDR and larger resolutions like 8K, it’s easy to consider the emergence of X-OCN as an introduction of what to expect from Sony in the future.

Despite X-OCN being the company’s answer to the large file sizes of RAW shooting, Sony still maintains a firm understanding of the need for storage and the read/write speeds that come with such innovations. To that end, Sony has introduced the AXS-AR1 AXS memory and SxS Thunderbolt card reader. Using a Thunderbolt 2 connection, which can be daisy-chained since the reader has two ports, the AXS-AR1 has a theoretical transfer speed of approximately 9.6Gbps, or 1,200MBps. Supporting SxS cards and Sony’s new AXS cards, it could download an hour’s worth of true 4K footage at 24fps, shot in X-OCN, in only about 2.5 minutes.
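Those two figures are consistent with each other, as a quick check shows (treating the theoretical 1,200MBps as sustained throughput and using decimal units):

    # Sanity-check the AXS-AR1 claim: ~1,200 MB/s moving an hour of 4K 24fps
    # X-OCN in ~2.5 minutes.
    reader_mb_s = 1200                # ~9.6 Gbps / 8
    transfer_s = 2.5 * 60             # claimed transfer time

    footage_gb = reader_mb_s * transfer_s / 1000   # data moved in 2.5 minutes
    implied_mbps = footage_gb * 8000 / 3600        # implied camera data rate

    print(f"~{footage_gb:.0f} GB per shooting hour -> ~{implied_mbps:.0f} Mbps")
    # ~180 GB/hour, i.e. ~400 Mbps -- plausible for compressed X-OCN at 4K 24p
    # and far below uncompressed RAW.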

To complement these leaps in storage space and read/write speeds, Sony’s Optical Disc Archive Generation 2 is an optical disc-based storage system with expandable robotic libraries called PetaSites, which, through the use of 3.3TB Optical Disc Archive Cartridges, guarantees a staggering 100-year shelf life. Unlike LTO tapes, which are generally written and read back only a handful of times, Sony’s optical discs can be quickly and randomly accessed as needed.

HDR
HDR continues to gain traction in the world of broadcast and cinema. From capture to monitoring, the introduction of HDR has spurred many companies to implement new ways to create, monitor, display and distribute HDR content. As mentioned earlier, Sony is implementing firmware updates in many of its cameras to allow internal HLG, or Instant HDR, capture without the need for color grading, as well as compressed X-OCN RAW recording to allow more complex HDR grading to be possible without the massive amounts of data that uncompressed RAW takes up.

HDR gamma displays can now be monitored on screens like the Sony FS5’s, as well as higher-end displays such as their BVM E171, BVM X300/2 and PVM X550.

IP Live
What stood out about Sony’s mission with HDR is its push to bring the format to realtime, non-fiction content and broadcasts, such as sporting events, through IP Live. The goal is to offer instantaneous conversions that output media not only in 4K HDR and SDR but also in full HD HDR and SDR at the same time. With its SR Live System, Sony hopes to update its camera lines with HLG to provide instant HDR, which can be processed through its HDRC-4000 converters. In line with the company’s stated goal of supporting the full production process, Sony has also introduced XDCAM Air, an ENG-based cloud service that addresses the growing need for speed to air. XDCAM Air will launch in June 2017.

Managing Files
To round out its production-through-delivery goals, Sony continues with Media Backbone Navigator X, an online content storage and management solution designed to ease the work between capture and delivery. It accepts nearly any file type and allows multiple users to easily search for keywords, and even phrases spoken in videos, while streaming at realtime speeds.

Media Backbone Navigator X is designed for productions that create an environment of constant back and forth and will eliminate any excessive deliberation when figuring out storage and distribution of materials.

Sony’s goal at NAB wasn’t to shock or awe but rather to build on an established foundation for current and new clients and customers who are readying for an ever-changing production environment. For Sony, this year’s NAB could be considered preparation for the “upcoming storm” as firmware updates roll out more support for promising formats like HDR.


Daniel Rodriguez is a New York-based cinematographer, photographer and director. Follow him on Instagram: https://www.instagram.com/realdanrodriguez.