Tag Archives: HDR

Abu Dhabi’s twofour54 is now Dolby Vision certified

Abu Dhabi’s twofour54 has become Dolby Vision certified to meet the demand for color grading and mastering of Dolby Vision HDR content. twofour54 is the first certified Dolby Vision facility in the UAE, offering services in both Arabic and English.

“The way we consume content has been transformed by connectivity and digitalization, with consumers able to choose not only what they watch but where, when and how,” says Katrina Anderson, director of commercial services at twofour54. “This means it is essential that content creators have access to technology such as Dolby Vision in order to ensure their content reaches as wide an audience as possible around the world.”

With Netflix, Amazon Prime and others now competing with existing broadcasters, there is a big demand around the world for high-quality production facilities. According to twofour54, Netflix’s expenditure on content creation soared from $4.6 billion in 2015 to $12 billion last year, while other platforms — such as Amazon Prime, Apple TV and YouTube — are also seeking to create more unique content. Consequently, the global demand for production facilities such as those offered by twofour54 is outstripping supply.

“We have seen increased interest in Dolby Vision in home entertainment due to the growing popularity of digital streaming services in the Middle East, and we are now able to support studios and content creators with leading-edge tools deployed at twofour54’s world-class post facility,” explains Pankaj Kedia, managing director of emerging markets for Dolby Laboratories. “Dolby Vision is the preferred HDR mastering workflow for leading studios and a growing number of content creators, and this latest offering demonstrates twofour54’s commitment to making Abu Dhabi a preferred location for film and TV production.”

Why is this important? For color grading of movies and episodic content, Dolby has created a workflow that generates shot-by-shot dynamic metadata that allows filmmakers to see how their content will look on consumer devices. The colorist can then add “trims” to adjust how the mapping looks and to deliver a better-looking SDR version for content providers serving early Ultra HD (UHD) televisions that are capable only of SDR reproduction.
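To make the idea of shot-by-shot dynamic metadata concrete, below is a minimal Python sketch of the kind of per-shot luminance statistics that drive this mapping; Dolby Vision’s L1 metadata similarly records min, average and max luminance per shot. The function and the synthetic frames here are hypothetical, not Dolby’s implementation:

```python
import numpy as np

def shot_luminance_stats(frames):
    """Summarize a shot with min/average/max luminance statistics,
    similar in spirit to the L1 metadata Dolby Vision derives per shot.
    `frames` is an iterable of 2D arrays of pixel luminance in nits."""
    mins, avgs, maxs = [], [], []
    for frame in frames:
        mins.append(float(frame.min()))
        avgs.append(float(frame.mean()))
        maxs.append(float(frame.max()))
    # One metadata record summarizes every frame in the shot.
    return {"min_nits": min(mins),
            "avg_nits": sum(avgs) / len(avgs),
            "max_nits": max(maxs)}

# Hypothetical shot: three synthetic frames of luminance values.
shot = [np.random.uniform(0.005, 1000.0, size=(270, 480)) for _ in range(3)]
print(shot_luminance_stats(shot))
```

Statistics like these tell a target display how to map each shot into its own brightness range, which is what the colorist’s “trims” then adjust.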

The colorists at twofour54 use both Blackmagic DaVinci Resolve and FilmLight Baselight systems.

Main Image: Engineer Noura Al Ali

Dalet AmberFin now includes Dolby Vision HDR technology

Dalet’s AmberFin software-based transcoding and asset management system now includes Dolby Vision HDR technology, targeting media professionals, studios and streaming service providers. The newly updated Dalet AmberFin engine enables mastering and distribution workflows for production, post and streaming.

In addition to Dolby Vision and a number of other new software-only features, Dalet AmberFin can also be enhanced with hardware acceleration tools such as the M820L HEVC encoder plugin card from SoC (System On a Chip) specialist Socionext.

Dalet AmberFin media processing capabilities enable automated Dolby Vision mastering and distribution workflows that generate outputs ranging from IMF, through broadcast, cable and satellite packages, to OTT HLS and DASH bundles. The integrated BPMN-compliant workflow engine and API let administrators configure user interfaces, assign tasks to operators and push through ad-hoc QC processes, creating workflows that are just right for their facility.
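The article doesn’t detail the engine’s API, but the general shape of a BPMN-style workflow, automated steps punctuated by operator-facing QC gates, can be sketched generically in Python. Every name below (tasks, the approval callback) is hypothetical and is not Dalet AmberFin’s actual API:

```python
# A toy BPMN-style workflow: ordered tasks, some automated, some human-gated.
WORKFLOW = [
    {"task": "ingest",        "auto": True},
    {"task": "transcode_imf", "auto": True},
    {"task": "qc_review",     "auto": False},  # assigned to an operator
    {"task": "package_ott",   "auto": True},
]

def run(workflow, operator_approves):
    """Walk the task list, pausing at human-gated steps for sign-off."""
    for step in workflow:
        if step["auto"]:
            print(f"running {step['task']} automatically")
        elif not operator_approves(step["task"]):
            print(f"{step['task']} rejected; routing back for rework")
            return False
        else:
            print(f"{step['task']} approved by operator")
    return True

run(WORKFLOW, operator_approves=lambda task: True)
```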

Dalet AmberFin’s file outputs can vary in packaging sophistication from simple MP4s through to complex IMF packages that are automatically synthesized from collections of input files and automation instructions. Dalet is an active participant in SMPTE’s IMF plugfests, making sure that all of its IMF processes are interoperable at the highest level.

“By integrating Dalet AmberFin with our Dalet Galaxy MAM platform we have managed to build the content factory that we always dreamed of,” says Jean-Christophe Coin, CEO of French post house VDM. “Work orders for original and versioned content flow quickly and efficiently through our facility with human operators lending creative skills where required and automation fulfilling all other processes.

“There are more and more variations in the complexity of a modern title,” he continues. “When it was just SD or HD life was simple. Today we track different HDR variants and multiple languages, compliance and re-versioned variants for the large number of platforms that a single title might appear on.”

Dalet AmberFin version 11.8.2.0 is now available to all existing customers on a support contract.

Updated Apple Final Cut Pro features new Metal engine

Apple has updated Final Cut Pro X with a new Metal engine designed to provide performance gains across a wide range of Mac systems. It takes advantage of the new Mac Pro and the high-resolution, high-dynamic-range viewing experience of the Apple Pro Display XDR. The company has also optimized Motion and Compressor with Metal.

The Metal-based engine improves playback and accelerates graphics tasks in FCP X, including rendering, realtime effects and exporting on compatible Mac computers. According to Apple, video editors with a 15-inch MacBook Pro will benefit from performance that’s up to 20 percent faster, while editors using an iMac Pro will see gains up to 35 percent.

Final Cut Pro also works with the new Sidecar feature of macOS Catalina, which allows users to extend their Mac workspace by using an iPad as a second display to show the browser or viewer. Video editors can use Sidecar with a cable or they can connect wirelessly.

Final Cut Pro now supports multiple GPUs and up to 28 CPU cores. This means that rendering is up to 2.9 times faster and transcoding up to 3.2 times faster than on the previous-generation 12-core Mac Pro. Final Cut Pro also uses the new Afterburner card when working with ProRes and ProRes RAW, allowing editors to simultaneously play up to 16 streams of 4K ProRes 422 video or work in 8K with support for up to three streams of 8K ProRes RAW video.
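As a back-of-envelope check on what 16 simultaneous streams implies: Apple’s ProRes white paper lists a target rate of roughly 589 Mb/s for ProRes 422 at 3840x2160/29.97, which puts the aggregate decode load near 9.4 Gb/s. Treat the sketch below as rough arithmetic under that assumption, not an Apple specification:

```python
PRORES_422_UHD_2997_MBPS = 589  # Apple's published target rate, 3840x2160 at 29.97 fps

streams = 16
total_gbps = streams * PRORES_422_UHD_2997_MBPS / 1000
print(f"{streams} streams ~ {total_gbps:.1f} Gb/s sustained decode")
# ~9.4 Gb/s: a load that explains why dedicated ProRes decode
# hardware (Afterburner) and fast storage matter at this scale.
```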

Pro Display XDR
The Pro Display XDR features a 32-inch Retina 6K display, P3 wide color and extreme dynamic range. Final Cut Pro users can view, edit, grade and deliver HDR video with 1,000 nits of full screen sustained brightness, 1,600 nits peak brightness and a 1,000,000:1 contrast ratio. Pro Display XDR connects to the Mac through a single Thunderbolt cable, and pros using Final Cut Pro on Mac Pro can simultaneously use up to three Pro Display XDR units — two for the Final Cut Pro interface and one as a dedicated professional reference monitor.

Final Cut Pro 10.4.7 is available now as a free update for existing users and for $299.99 for new users on the Mac App Store. Motion 5.4.4 and Compressor 4.4.5 are also available today as free updates for existing users and for $49.99 each for new users on the Mac App Store.

AJA adds HDR Image Analyzer 12G and more at IBC

AJA will soon offer the new HDR Image Analyzer 12G, bringing 12G-SDI connectivity to its realtime HDR monitoring and analysis platform developed in partnership with Colorfront. The new product streamlines 4K/Ultra HD HDR monitoring and analysis workflows by supporting the latest high-bandwidth 12G-SDI connectivity. The HDR Image Analyzer 12G will be available this fall for $19,995.

HDR Image Analyzer 12G offers waveform, histogram and vectorscope monitoring and analysis of 4K/Ultra HD/2K/HD, HDR and WCG content for broadcast and OTT production, post, QC and mastering. It also features HDR-capable monitor outputs that go beyond HD resolution, offer color accuracy and make it possible to configure layouts to place the preferred tool where needed.

“Since its release, HDR Image Analyzer has powered HDR monitoring and analysis for a number of feature and episodic projects around the world. In listening to our customers and the industry, it became clear that a 12G version would streamline that work, so we developed the HDR Image Analyzer 12G,” says Nick Rashby, president of AJA.

AJA’s video I/O technology integrates with HDR analysis tools from Colorfront in a compact 1-RU chassis to bring HDR Image Analyzer 12G users a comprehensive toolset to monitor and analyze HDR formats, including PQ (Perceptual Quantizer) and hybrid log gamma (HLG). Additional feature highlights are listed below, followed by a brief sketch of the PQ decode such an analyzer performs:

● Up to 4K/Ultra HD 60p over 12G-SDI inputs, with loop-through outputs
● Ultra HD UI for native resolution picture display over DisplayPort
● Remote configuration, updates, logging and screenshot transfers via an integrated web UI
● Remote Desktop support
● Support for display referred SDR (Rec.709), HDR ST 2084/PQ and HLG analysis
● Support for scene referred ARRI, Canon, Panasonic, Red and Sony camera color spaces
● Display and color processing lookup table (LUT) support
● Nit levels and phase metering
● False color mode to easily spot out-of-gamut or out-of-brightness pixels
● Advanced out-of-gamut and out-of-brightness detection with error tolerance
● Data analyzer with pixel picker
● Line mode to focus a region of interest onto a single horizontal or vertical line
● File-based error logging with timecode
● Reference still store
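For readers curious about the numbers behind PQ analysis, the sketch below implements the SMPTE ST 2084 (PQ) EOTF, the transfer function an analyzer decodes in order to report absolute nit levels. It is an illustrative Python implementation, not AJA’s or Colorfront’s code:

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal):
    """Decode a normalized PQ signal (0..1) to absolute luminance in nits.
    This is the decode a nit-level meter applies to each PQ-coded pixel."""
    signal = np.clip(signal, 0.0, 1.0)
    p = signal ** (1.0 / M2)
    return 10000.0 * (np.maximum(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)

print(pq_eotf(np.array([0.0, 0.508, 1.0])))  # ~[0, 100, 10000] nits
```

A code value of about 0.508 decodes to roughly 100 nits, which is why SDR-equivalent levels occupy only the lower half of the PQ signal range.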

At IBC 2019, AJA also showed new products and updates designed to advance broadcast, production, post and pro AV workflows. On the stand were the Kumo 6464-12G for routing and the newly shipping Corvid 44 12G developer I/O models. AJA has also introduced the FS-Mini utility frame sync Mini-Converter and three new OpenGear-compatible cards: OG-FS-Mini, OG-ROI-DVI and OG-ROI-HDMI. Additionally, the company previewed Desktop Software updates for Kona, Io and T-Tap; Ultra HD support for IPR Mini-Converter receivers; and FS4 frame synchronizer enhancements.

SGO Mistika Boutique at IBC with Dolby Vision, color workflows

At IBC, SGO will be showing enhancements and upgrades of its subscription-based finishing solution, Mistika Boutique. The company will demo color management solutions as well as HDR content delivery workflows with recently integrated Dolby Vision support.

Mistika Boutique’s professional color grading toolset, combined with its finishing functionality, will be showcased running on a Mac Pro workstation with Tangent Arc control panels and output to a Canon 4K HDR reference display through Blackmagic Design DeckLink I/O.

Mistika Boutique is hardware-agnostic and runs on both Windows and macOS.

At its stand, SGO is offering a variety of sessions highlighting trending topics for the content creation industry, featuring Mistika Boutique as well as Mistika Workflows and Mistika VR.

While at the show, SGO is offering a special IBC promotion for Mistika Boutique. Anyone who subscribes by September 30, 2019, will get the Professional Immersive Edition for €99/month or €990/year (or whatever your bank’s conversion rate works out to), a saving of over 65% off the normal price. The special IBC promotional price will be maintained for as long as the subscription remains active.

GLOW’s DP and colorist adapt look of new season for Vegas setting

By Adrian Pennington

Netflix’s Gorgeous Ladies of Wrestling (GLOW) are back in the ring for a third round of the dramatic comedy, but this time the girls are in Las Vegas. The glitz and glamour of Sin City seem tailor-made for the 1980s-set GLOW and provided the main creative challenge for Season 3 cinematographer Chris Teague (Russian Doll, Broad City).

DP Chris Teague

“Early on, I met with Christian Sprenger, who shot the first season and designed the initial look,” says Teague, who was recently nominated for an Emmy for his work on Russian Doll. “We still want GLOW to feel like GLOW, but the story and character arc of Season 3 and the new setting led us to build on the look and evolve elements like lighting and dynamic range.”

The GLOW team is headlining the Fan-Tan Hotel & Casino, which, along with a hotel, is one of two main sets built for the series and features the distinctive Vegas skyline as a backdrop.

“We discussed compositing actors against greenscreen, but that would have turned every shot into a VFX shot and would have been too costly, not to mention time-intensive on a TV schedule like ours,” he says. “Plus, working with a backdrop just felt aesthetically right.”

In that vein, production designer Todd Fjelsted built a skyline using miniatures, a creative decision in keeping with the handcrafted look of the show. That decision, though, required extensive testing of lenses, lighting and look prior to shooting. This testing was done in partnership with post house Light Iron.

“There was no overall shift in the look of the show, but together with Light Iron, we felt the baseline LUT needed to be built on, particularly in terms of how we lit the sets,” explains Teague.

“Chris was clear early on that he wanted to build upon the look of the first two seasons,” says Light Iron colorist Ian Vertovec. “We adjusted the LUT to hold a little more color in the highlights than in past seasons. Originally, the LUT was based on a film emulation and adjusted for HDR. In Season 1, we created a period film look and transformed it for HDR to get a hybrid film emulation LUT. For Season 3, for HDR and standard viewing, we made tweaks to the LUT so that some of the colors would pop more.”
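The LUTs Vertovec describes are typically 3D lookup tables: color values sampled on a cubic grid and interpolated per pixel. As a rough illustration of the mechanics (not Light Iron’s actual pipeline; the 33-point size is simply a common grading-LUT resolution), here is a minimal trilinear 3D LUT lookup in Python:

```python
import numpy as np

def apply_3d_lut(rgb, lut):
    """Trilinear interpolation into a cubic 3D LUT.
    rgb: input triplet in 0..1; lut: (N, N, N, 3) array mapping input to output."""
    n = lut.shape[0]
    pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * (n - 1)
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    f = pos - lo  # fractional position inside the lattice cell
    out = np.zeros(3)
    # Blend the eight lattice points surrounding the input color.
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((f[0] if dr else 1 - f[0]) *
                     (f[1] if dg else 1 - f[1]) *
                     (f[2] if db else 1 - f[2]))
                out += w * lut[hi[0] if dr else lo[0],
                               hi[1] if dg else lo[1],
                               hi[2] if db else lo[2]]
    return out

# Sanity check with a 33-point identity LUT: output should equal input.
grid = np.linspace(0.0, 1.0, 33)
identity = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
print(apply_3d_lut([0.25, 0.5, 0.75], identity))  # ~[0.25 0.5 0.75]
```

Roughly speaking, adjusting a LUT “to hold a little more color in the highlights” amounts to reshaping the output values stored at the table’s upper lattice points.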

The show was also finished in Dolby Vision HDR. “There was some initial concern about working with backdrops and stages in HDR,” Teague says. “We are used to the way film treats color over its exposure range — it tends to desaturate as it gets more overexposed — whereas HDR holds a lot more color information in overexposure. However, Ian showed how it can be a creative tool.”

Colorist Ian Vertovec

“The goal was to get the 1980s buildings in the background and out the hotel windows to look real — emulating marquees with flashing lights,” adds Vertovec. “We also needed it to be a believable Nevada sky and skyline. Skies and clouds look different in HDR. So, when dialing this in, we discussed how they wanted it to look. Did it feel real? Is the sky in this scene too blue? Information from testing informed production, so everything was geared toward these looks.”

“Ian has been on the first two seasons, so he knows the look inside and out and has a great eye,” Teague continues. “It’s nice to come into a room and have his point of view. Sometimes when you are staring at images all day, it’s easy to lose your objectivity, so I relied on Ian’s insight.” Vertovec grades the show on FilmLight’s Baselight.

As with Season 2, GLOW Season 3 was a Red Helium shoot using Red’s IPP2 color pipeline in conjunction with Vertovec’s custom LUTs all the way through to post. Teague shot at full 8K resolution to accommodate his choice of Cooke anamorphic lenses, with the footage desqueezed and finished in a 2:1 aspect ratio.

“For dailies, I used an iPad with Moxion, which is perhaps the best dailies viewing platform I’ve ever worked with,” says Teague. “I feel like the color is more accurate than other platforms, which is extremely useful for checking contrast and shadow level. Too many times with dailies, you get blacks washed out and highlights blown, and you can’t judge anything critical.”

Teague sat in on the grade of the first three of the 10 episodes and then used the app to pull stills and make notes remotely. “With Ian I felt like we were both on the same page. We also had a great DIT [Peter Brunet] who was doing on-set grading for reference and was able to dial in things at a much higher level than I’ve been able to do in the past.”

The most challenging but also rewarding work was shooting the wrestling performances. “We wanted to do something that felt a little bigger, more polished, more theatrical,” Teague says. “The performance space had tiered seating, which gave us challenges and options in terms of moving the cameras. For example, we could use telescoping crane work to reach across the room and draw characters in as they enter the wrestling ring.”

He commends gaffer Eric Sagot for inspiring lighting cues and building them into the performance. “The wrestling scenes were the hardest to shoot but they’re exciting to watch — dynamic, cinematic and deliberately a little hokey in true ‘80s Vegas style.”


Adrian Pennington is a UK-based journalist, editor and commentator in the film and TV production space. He has co-written a book on stereoscopic 3D and edited several publications.

Blackmagic intros Teranex Mini SDI to DisplayPort 8K HDR monitor

Blackmagic’s Teranex Mini SDI to DisplayPort 8K HDR is an advanced 8K monitoring solution that lets you use the new Apple Pro Display XDR as a color-critical reference monitor on set and in post.

With dual on-screen scope overlays, HDR support, 33-point 3D LUTs and monitor calibration designed for the pro film and television market, the new Teranex Mini SDI to DisplayPort 8K HDR works with the new generation of monitors, like Apple’s just-announced Pro Display XDR. It will be available in October for $1,295.

The Teranex Mini SDI to DisplayPort 8K HDR can use third-party calibration probes to accurately align connected displays for precise color. There are two on-screen scopes, each selectable between waveform, parade, vectorscope and histogram.

The front panel includes controls and a color display for input video, audio meters and the video standard indicator. The rear panel has Quad Link 12G-SDI for HD, Ultra HD and 8K formats. There are two DisplayPort connections for regular computer monitors or USB-C-style DisplayPort monitors, such as the Pro Display XDR. The built-in scaler ensures the video input standard is scaled to the native resolution of the connected DisplayPort monitor. Customers can connect either 2SI or Square Division inputs.

Teranex Mini SDI to DisplayPort 8K HDR makes it easy to work in 8K. Users need only connect an HDR-compatible DisplayPort monitor to enable HDR SDI monitoring. Static metadata PQ and Hybrid Log Gamma (HLG) formats in the VPID are handled according to the ST 2108-1, ST 2084 and ST 425 standards.

Teranex Mini SDI to DisplayPort 8K HDR handles ST 425, which defines two new bits in the VPID to indicate a transfer characteristic of SDR, HLG or PQ, while the ST 2108-1 standard defines how to transport HDR static or dynamic metadata over SDI. There is also support for ST 2082-10 for 12G-SDI as well as ST 425 for 3G-SDI sources. The unit supports both Rec.2020 and Rec.709 color spaces and 100% of DCI-P3.

Features include:
• Support for HDR via SDI and DisplayPort
• Two built-in scopes live overlaid on the monitor
• Film-industry-quality 33-point 3D LUTs
• Automatic monitor calibration support using color probes
• Advanced Quad Link 12G-SDI inputs for 8K
• Scales input video to the native monitor resolution
• Includes LCD for monitoring and menu settings
• Utility software included for Mac and Windows
• Supports latest 8K DisplayPort monitors and displays
• Can be used on a desktop or rack mounted

Colorfront at NAB with 8K HDR, product updates

Colorfront, which makes on-set dailies and transcoding systems, has rolled out new 8K HDR capabilities and updates across its product lines. The company has also deepened its technology partnership with AJA and entered into a new collaboration with Pomfort to bring more efficient color and HDR management on-set.

Colorfront Transkoder is a post workflow tool for handling UHD, HDR camera, color and editorial/deliverables formats, with recent customers such as Sky, Pixelogic, The Picture Shop and Hulu. With a new HDR GUI, Colorfront’s Transkoder 2019 performs realtime decompression/de-Bayer/playback of Red and Panavision DXL2 8K R3D material displayed on a Samsung 82-inch Q900R QLED 8K Smart TV in HDR and at full 8K resolution (7680 x 4320). The de-Bayering process is optimized through Nvidia GeForce RTX graphics cards with Turing GPU architecture (also available in Colorfront On-Set Dailies 2019), with 8K video output (up to 60p) using AJA Kona 5 video cards.

“8K TV sets are becoming bigger, as well as more affordable, and people are genuinely awestruck when they see 8K camera footage presented on an 8K HDR display,” said Aron Jaszberenyi, managing director, Colorfront. “We are actively working with several companies around the world originating 8K HDR content. Transkoder’s new 8K capabilities — across on-set, post and mastering — demonstrate that 8K HDR is perfectly accessible to an even wider range of content creators.”

Powered by a re-engineered version of Colorfront Engine and featuring the HDR GUI and 8K HDR workflow, Transkoder 2019 supports camera/editorial formats including Apple ProRes RAW, Blackmagic RAW, ARRI Alexa LF/Alexa Mini LF and Codex HDE (High Density Encoding).

Transkoder 2019’s mastering toolset has been further expanded to support Dolby Vision 4.0 as well as Dolby Atmos for the home with IMF and Immersive Audio Bitstream capabilities. The new Subtitle Engine 2.0 supports CineCanvas and IMSC 1.1 rendering for preservation of content, timing, layout and styling. Transkoder can now also package multiple subtitle language tracks into the timeline of an IMP. Further features support fast and efficient audio QC, including solo/mute of individual tracks on the timeline, and a new render strategy for IMF packages enabling independent audio and video rendering.

Colorfront also showed the latest versions of its On-Set Dailies and Express Dailies products for motion pictures and episodic TV production. On-Set Dailies and Express Dailies both now support ProRes RAW, Blackmagic RAW, ARRI Alexa LF/Alexa Mini LF and Codex HDE. As with Transkoder 2019, the new version of On-Set Dailies supports realtime 8K HDR workflows, enabling a set-to-post pipeline from HDR playback through QC and rendering of HDR deliverables.

In addition, AJA Video Systems has released v3.0 firmware for its FS-HDR realtime HDR/WCG converter and frame synchronizer. The update introduces enhanced coloring tools together with several other improvements for broadcast, on-set, post and pro AV HDR production developed by Colorfront.

A new, integrated Colorfront Engine Film Mode offers an ACES-based grading and look creation toolset with ASC Color Decision List (CDL) controls, built-in LOOK selection including film emulation looks, and variable Output Mastering Nit Levels for PQ, HLG Extended and P3 colorspace clamp.

Since launching in 2018, FS-HDR has been used on a wide range of TV and live outside broadcast productions, as well as motion pictures including Paramount Pictures’ Top Gun: Maverick, shot by Claudio Miranda, ASC.

Colorfront licensed its HDR Image Analyzer software to AJA for AJA’s HDR Image Analyzer in 2018. A new version of AJA HDR Image Analyzer is set for release during Q3 2019.

Finally, Colorfront and Pomfort have teamed up to integrate their respective HDR-capable on-set systems. This collaboration, harnessing Colorfront Engine, will include live CDL reading in ACES pipelines between Colorfront On-Set/Express Dailies and Pomfort LiveGrade Pro, giving motion picture productions better control of HDR images while simplifying their on-set color workflows and dailies processes.

Atomos’ new Shogun 7: HDR monitor, recorder, switcher

The new Atomos Shogun 7 is a seven-inch HDR monitor, recorder and switcher that offers an all-new 1500-nit, daylight-viewable, 1920x1200 panel with a 1,000,000:1 contrast ratio and 15+ stops of displayed dynamic range. It also offers ProRes RAW recording and realtime Dolby Vision output. Shogun 7 will be available in June 2019, priced at $1,499.

The Atomos screen uses a combination of advanced LED and LCD technologies that together offer deeper, better blacks that the company says rival OLED screens, “but with the much higher brightness and vivid color performance of top-end LCDs.”

A new 360-zone backlight is combined with this new screen technology and controlled by the Dynamic AtomHDR engine to show millions of shades of brightness and color. It allows Shogun 7 to display 15+ stops of real dynamic range on-screen. The panel, says Atomos, is also incredibly accurate, with ultra-wide color and 105% of DCI-P3 covered, allowing for the same on-screen dynamic range, palette of colors and shades that your camera sensor sees.

Atomos and Dolby have teamed up to create Dolby Vision HDR “live” — a tool that allows you to see HDR live on-set and carry your creative intent from the camera through into HDR post. Dolby has optimized its target display HDR processing algorithm, which Atomos runs inside the Shogun 7. It brings realtime automatic frame-by-frame analysis of the Log or RAW video and processes it for optimal HDR viewing on a Dolby Vision-capable TV or monitor over HDMI. Connect Shogun 7 to a Dolby Vision TV, and AtomOS 10 automatically analyzes the image, queries the TV and applies the right color and brightness profiles for the maximum HDR experience on the display.

Shogun 7 records images up to 5.7Kp30, 4Kp120 or 2Kp240 slow motion from compatible cameras, in RAW/Log or HLG/PQ over SDI/HDMI. Footage is stored directly to AtomX SSDmini or approved off-the-shelf SATA SSD drives. There are recording options for Apple ProRes RAW and ProRes, Avid DNx and Adobe CinemaDNG RAW codecs. Shogun 7 has four SDI inputs plus an HDMI 2.0 input, with both 12G-SDI and HDMI 2.0 outputs. It can record ProRes RAW in up to 5.7Kp30, 4Kp120 DCI/UHD and 2Kp240 DCI/HD, depending on the camera’s capabilities. Also, 10-bit 4:2:2 ProRes or DNxHR recording is available up to 4Kp60 or 2Kp240. The four SDI inputs enable the connection of most quad-link, dual-link or single-link SDI cinema cameras. Pixels are preserved with data rates of up to 1.8Gb/s.

In terms of audio, Shogun 7 eliminates the need for a separate audio recorder. Users can add 48V stereo mics via an optional balanced XLR breakout cable, or select mic or line input levels, plus record up to 12 channels of 24/96 digital audio from HDMI or SDI. Monitoring selected stereo tracks is via the 3.5mm headphone jack. There are dedicated audio meters, gain controls and adjustments for frame delay.

Shogun 7 features the latest version of the AtomOS 10 touchscreen interface, first seen on the Ninja V. The new body of Shogun 7 has a Ninja V-like exterior with ARRI anti-rotation mounting points on the top and bottom of the unit to ensure secure mounting.

AtomOS 10 on Shogun 7 has the full range of monitoring tools, including waveform, vectorscope, false color, zebras, RGB parade, focus peaking, pixel-to-pixel magnification, audio level meters and blue-only for noise analysis.

Shogun 7 can also be used as a portable touchscreen-controlled multi-camera switcher with asynchronous quad-ISO recording. Users can switch up to four 1080p60 SDI streams, record each plus the program output as a separate ISO, then deliver ready-for-edit recordings with marked cut-points in XML metadata straight to your NLE. The current Sumo19 HDR production monitor-recorder will also gain the same functionality in a free firmware update.

There is asynchronous switching, plus genlock in and out to connect to existing AV infrastructure. Once the recording is over, users can import the XML file into an NLE, and the timeline populates with all the edits in place. XLR audio from a separate mixer or audio board is recorded within each ISO, alongside two embedded channels of digital audio from the original source. The program stream always records the analog audio feed as well as a second track that switches between the digital audio inputs to match the switched feed.

Pixelogic London adds audio mix, digital cinema theaters

Pixelogic has added new theaters and production suites to its London facility, which offers creation and mastering of digital cinema packages and theatrical screening of digital cinema content, as well as feature and episodic audio mixing.

Pixelogic’s London location now features six projector-lit screening rooms: three theaters and three production suites. Purpose-built from the ground up, the theaters offer HDR picture and immersive audio technologies, including Dolby Atmos and DTS:X.

The equipment offered in the three theaters includes Avid S6 and S3 consoles and Pro Tools systems that support a wide range of theatrical mixing services, complemented by two new ADR booths.

HPA releases 2019 Tech Retreat program, includes eSports

The Hollywood Professional Association (HPA) has announced the schedule for the 2019 HPA Tech Retreat, which runs February 11-15. The Tech Retreat, celebrating its 25th year, takes place over the course of a week at the JW Marriott Resort & Spa in Palm Desert, California.

The HPA Tech Retreat spans five days of sessions, technology demonstrations and events. During this week, important aspects of production, broadcast, post, distribution and related M&E trends are explored. One of the key differentiators of the Tech Retreat is its strict adherence to a non-commercial focus: marketing-oriented presentations are prohibited except at breakfast roundtables.

“Once again, we’ve received many more submissions than we could use,” says Mark Schubin, the Program Maestro of the HPA Tech Retreat. “To say this year’s were ‘compelling’ is an understatement. We could have programmed a few more days. Rejecting terrific submissions is always the hardest thing we have to do. I’m really looking forward to learning the latest on HDR, using artificial intelligence to restore old movies and machine learning to deal with grunt work, the Academy’s new software foundation, location-based entertainment with altered reality and much more.”

This year’s program is as follows:

Monday February 11: TR-X
eSports: Dropping the Mic on Center Stage
Separate registration required
A half day of targeted panels, speakers and interaction, TR-X will focus on the rapidly growing arena of eSports, with a keynote from Yvette Martinez, CEO – North America of eSports organizer and production company ESL North America.
Tuesday February 12: Supersession
Next-Gen Workflows and Infrastructure: From the Set to the Consumer

Wednesday February 13: Main Program Highlights
• Mark Schubin’s Technology Year in Review
• Washington Update (Jim Burger, Thompson Coburn LLP)
The highly anticipated review of legislation and its impact on our business from a leading Washington attorney.

• Deep Fakes (Moderated by Debra Kaufman, ETCentric; Panelists Marc Zorn, HBO; Ed Grogan, Department of Defense; Alex Zhukov, Video Gorillas)
It might seem nice to be able to use actors long dead, but the concept of “fake news” takes a terrifying new turn with deepfakes, the term that Wikipedia describes as a portmanteau of “deep learning” and “fake.” Although people have been manipulating images for centuries – long before the creation of Adobe Photoshop – the new AI-powered tools allow the creation of very convincing fake audio and video.

• The Netflix Media Database (Rohit Puri, Netflix)
An optimized user interface, meaningful personalized recommendations, efficient streaming and a high-quality catalog of content are the principal factors that define the Netflix end-user experience. A myriad of business workflows of varying complexity come together to realize this experience. Under the covers, they use computationally expensive computer vision, audio processing and natural language processing-based media analysis algorithms. These algorithms generate temporally and spatially dynamic metadata that is shared across the various use cases. The Netflix Media Database (NMDB) is a multi-tenant data system used to persist this deeply technical metadata about various media assets at Netflix and to enable querying it at scale. The “shared nothing” distributed database architecture allows NMDB to store large amounts of media timeline data, forming the backbone for various Netflix media processing systems.

• AI Film Restoration at 12 Million Frames per Second (Alex Zhukov, Video Gorillas)

• Is More Media Made for Subways Than for TV and Cinema? (and does it Make More $$$?) (Andy Quested, BBC)

• Broadcasters Panel (Moderator: Matthew Goldman, MediaKind)

• CES Review (Peter Putman, ROAM Consulting)
Pete Putman traveled to Las Vegas to see what’s new in the world of consumer electronics and returns to share his insights with the HPA Tech Retreat audience.

• 8K: Whoa! How’d We Get There So Quickly (Peter Putman, ROAM Consulting)

• Issues with HDR Home Video Deliverables for Features (Josh Pines, Technicolor)

• HDR “Mini” Session
• HDR Intro: Seth Hallen, Pixelogic
• Ambient Light Compensation for HDR Presentation: Don Eklund, Sony Pictures Entertainment
• HDR in Anime: Haruka Miyagawa, Netflix
• Pushing the Limits of Motion Appearance in HDR: Richard Miller, Pixelworks
• Downstream Image Presentation Management for Consumer Displays:
• Moderator: Michael Chambliss, International Cinematographers Guild
• Michael Keegan, Netflix
• Annie Chang, UHD Alliance
• Steven Poster, ASC, International Cinematographers Guild
• Toshi Ogura, Sony

• Solid Cinema Screens with Front Sound: Do They Work? (Julien Berry, Delair Studios)
Direct-view displays bring high image quality to the cinema but suffer from a low pixel fill factor that can lead to heavy moiré and aliasing patterns. Cinema projectors have a much better fill factor, which avoids most of those issues, even though some moiré can be produced by the screen perforations needed for the audio. With the advent of high-contrast, EDR and soon HDR image quality in cinema, screen perforations impact the perceived brightness and contrast of the same image, though the effect has never been quantified, since some perforations have always been needed for cinema audio. With the advent of high-quality cinema audio systems that work with solid screens, it is now possible to quantify this effect.

Thursday, February 14: Main Program Highlights

• A Study Comparing Synthetic Shutter and HFR for Judder Reduction (Ianik Beitzel and Aaron Kuder, ARRI and Stuttgart Media University (HdM))

• Using Drones and Photogrammetry Techniques to Create Detailed (High Resolution) Point Cloud Scenes (Eric Pohl, Singularity Imaging)
Drone aerial photography may be used to create multiple geotagged images that are processed to create a 3D point cloud set of a ground scene. The point cloud may be used for production previsualization or background creation for videogames or VR/AR new-media products.

• Remote and Mobile Production Panel (Moderator: Mark Chiolis, Mobile TV Group; Wolfgang Schram, PRG; Scott Rothenberg, NEP)
With a continuing appetite for content from viewers of all the major networks, as well as niche networks, streaming services, web, eGames/eSports and venue and concert-tour events, the battle is on to make it possible to watch almost every sporting and entertainment event that takes place, all live as it is happening. Key members of the remote and mobile community explore what’s new and what workflows are behind the content production and delivery in today’s fast-paced environments. Expect to hear about new REMI applications, IP workflows, AI, UHD/HDR, eGames, and eSports.

• IMSC 1.1: A Single Subtitle and Caption Format for the Entertainment Chain (Pierre-Anthony Lemieux, Sandflow Consulting (supported by MovieLabs); Dave Kneeland, Fox)
IMSC is a W3C standard for worldwide subtitles/captions, and the result of an international collaboration. The initial version of IMSC (IMSC 1) was published in 2016, and has been widely adopted, including by SMPTE, MPEG, ATSC and DVB. With the recent publication of IMSC 1.1, we now have the opportunity to converge on a single subtitle/caption format across the entire entertainment chain, from authoring to consumer devices. IMSC 1.1 improves on IMSC 1 with support for HDR, advanced Japanese language features, and stereoscopic 3D. Learn about IMSC’s history, capabilities, operational deployment, implementation experience, and roadmap — and how to get involved.

• ACESNext and the Academy Digital Source Master: Extensions, Enhancements and a Standardized Deliverable (Andy Maltz, Academy of Motion Picture Arts & Sciences; Annie Chang, Universal Pictures)

• Mastering for Multiple Display and Surround Brightness Levels Using the Human Perceptual Model to Insure the Original Creative Intent Is Maintained (Bill Feightner, Colorfront)
Maintaining a consistent creative look across today’s many different cinema and home displays can be a big challenge, especially with the wide disparity in possible display brightness and contrast as well as viewing environments or surrounds. Even if it were possible to have individual creative sessions for each, maintaining creative consistency would be very difficult at best. By using knowledge of how the human visual system works (the perceptual model), processing that fits source content to a given display’s brightness and surround can be applied automatically while maintaining the original creative intent with little to no trimming.

• Cloud: Where Are We Now? (Moderator: Erik Weaver, Western Digital)

• Digitizing Workflow – Leveraging Platforms for Success (Roger Vakharia, Salesforce)
While the business of content creation hasn’t changed much over time, the technologies enabling processes around production, digital supply chain and marketing resource management, among other areas, have become increasingly complex. Enabling an agile, platform-based workflow can help decrease time and complexity, but cost, scale and business sponsorship are often inhibitors to driving success.

Driving efficiency at scale can be daunting but many media leaders have taken the plunge to drive agility across their business process. Join this discussion to learn best practices, integrations, workflows and techniques that successful companies have used to drive simplicity and rigor around their workflow and business process.

• Leveraging Machine Learning in Image Processing (Rich Welsh, Sundog Media Toolkit)
How to use AI (ML and DL networks) to perform “creative” tasks that are boring, that humans spend time doing but don’t want to (working real-world examples included).

• Leveraging AI in Post Production: Keeping Up with Growing Demands for More Content (Van Bedient, Adobe)
Expectations for more and more content continue to increase — yet staffing remains the same or only marginally bigger. How can advancements from machine learning help content creators? AI can be an incredible boon to remove repetitive tasks and tedious steps allowing humans to concentrate on the creative; ultimately AI can provide the one currency creatives yearn for more than anything else: Time.

• Deploying Component-Based Workflows: Experiences from the Front Lines (Moderator: Pierre-Anthony Lemieux, Sandflow Consulting (supported by MovieLabs))
The content landscape is shifting, with an ever-expanding essence and metadata repertoire, viewing experiences, global content platforms and automated workflows. Component-based workflows and formats, such as the Interoperable Master Format (IMF) standard, are being deployed to meet the challenges brought by this shift. Come and join us for a first-hand account from those on the front lines.

• Content Rights, Royalties and Revenue Management via Blockchain (Adam Lesh, SingularDTV)
The blockchain entertainment economy: adding transparency, disintermediating the supply chain, and empowering content creators to own, manage and monetize their IP to create sustainable, personal and connected economies. As we all know, rights and revenue (including royalties, residuals, etc.) management is a major pain point for content creators in the entertainment industry.

Friday, February 15: Main Program Highlights

• Beyond SMPTE Time Code: The TLX Project (Peter Symes)
SMPTE Time Code, ST 12, was developed and standardized in the 1970s to support the emerging field of electronic editing. It has been, and continues to be, a robust standard; its application is almost universal in the media industry, and the standard has found use in other industries. However, ST 12 was developed using criteria and restrictions that are not appropriate today, and it has many shortcomings in today’s environment.

A new project in SMPTE, the Extensible Time Label (TLX), is gaining traction and appears to have the potential to meet a wide range of requirements. TLX is designed to be transport-agnostic, with a modern data structure.

• Blindsided: The Game-Changers We Might Not See Coming (Mark Harrison, Digital Production Partnership)
The world’s number one company for gaming revenue makes as much as Sony and Microsoft combined. It isn’t American or Japanese. Marketers project that by 2019, video advertising on out-of-home displays will be as important as their spending on TV. Meanwhile, a single US tech giant could buy every franchise of the top five US sports leagues, from its offshore reserves, and still have $50 billion in change.

We all know consumers like OTT video. But that’s the least of it. There are trends in the digital economy that, if looked at globally, could have sudden, and profound, implications for the professional content creation industry. In this eye-widening presentation, Mark Harrison steps outside the western-centric, professional media industry perspective to join the technology, consumer and media dots and ask: what could blindside us if we don’t widen our point of view?

• Interactive Storytelling: Choose What Happens Next (Andy Schuler, Netflix)
Looking to experiment with nonlinear storytelling, Netflix launched its first interactive episodes in 2017. Both in children’s programming, the shows encouraged even the youngest of viewers to touch or click on their screens to control the trajectory of the story (think Choose Your Own Adventure books from the 1980s). How did Netflix overcome some of the more interesting technical challenges of the project (i.e., mastering, encoding, streaming), how was SMPTE IMF used to streamline the process, and why are more formalized mastering practices needed for future projects?

• HPA Engineering Excellence Award Winners (Moderator: Joachim Zell, EFILM, Chair HPA Engineering Excellence Awards; Joe Bogacz, Canon; Paul Saccone, Blackmagic Design; Lance Maurer, Cinnafilm; Michael Flathers, IBM; Dave Norman, Telestream).

Since launching in 2008, the HPA Awards for Engineering Excellence have honored some of the most groundbreaking, innovative and impactful technologies. Spend a bit of time with a select group of winners and their contributions to the way we work and the industry at large.

• The Navajo Strategic Digital Plan (John Willkie, Luxio)

• Adapting to a COTS Hardware World (Moderator: Stan Moote, IABM)
Transitioning to off-the-shelf hardware is one of the biggest topics on all sides of the industry, from manufacturers, software and service providers through to system integrators, facilities and users themselves. It’s also incredibly uncomfortable. Post production was an early adopter of specialized workstations (e.g., SGI) and has now embraced a further migration up the stack to COTS hardware and IP networks, whether bare metal, virtualized, hybrid or fully cloud-based. As the industry deals with the global acceleration of formats, platforms and workflows, what are the limits of COTS hardware when software innovation is continually testing the limits of general-purpose CPUs, GPUs and network protocols? The session covers “hidden” issues in using COTS hardware from the point of view of users and facility operators as well as manufacturers, services and systems integrators.

• Academy Software Foundation: Enabling Cross-Industry Collaboration for Open Source Projects (David Morin, Academy Software Foundation)
In August 2018, the Academy of Motion Picture Arts and Sciences and The Linux Foundation launched the Academy Software Foundation (ASWF) to provide a neutral forum for open source software developers in the motion picture and broader media industries to share resources and collaborate on technologies for image creation, visual effects, animation and sound. This presentation will explain why the Foundation was formed and how it plans to increase the quality and quantity of open source contributions by lowering the barrier to entry for developing and using open source software across the industry.

AJA ships HDR Image Analyzer developed with Colorfront

AJA is now shipping HDR Image Analyzer, a realtime HDR monitoring and analysis solution developed in partnership with Colorfront. HDR Image Analyzer features waveform, histogram and vectorscope monitoring and analysis of 4K/UltraHD/2K/HD, HDR and WCG content for broadcast and OTT production, post, QC and mastering.

Combining AJA’s video I/O with HDR analysis tools from Colorfront in a compact 1RU chassis, the HDR Image Analyzer features a toolset for monitoring and analyzing HDR formats, including Perceptual Quantizer (PQ) and Hybrid Log Gamma (HLG) for 4K/UltraHD workflows. The HDR Image Analyzer takes in up to 4K sources across 4x 3G-SDI inputs and loops the video out, allowing analysis at any point in the production workflow.

Additional feature highlights include:
– Support for display referred SDR (Rec.709), HDR ST 2084/PQ and HLG analysis
– Support for scene referred ARRI, Canon, Panasonic, Red and Sony camera color spaces
– Display and color processing look up table (LUT) support
– Automatic color space conversion based on the award-winning Colorfront Engine
– CIE graph, vectorscope, waveform and histogram support
– Nit levels and phase metering
– False color mode to easily spot out-of-gamut/out-of-brightness pixels
– Advanced out-of-gamut and out-of-brightness detection with error tolerance
– Data analyzer with pixel picker
– Line mode to focus a region of interest onto a single horizontal or vertical line
– File-based error logging with timecode
– Reference still store
– UltraHD UI for native-resolution picture display
– Up to 4K/UltraHD 60p over 4x 3G-SDI inputs, with loop out
– SDI auto signal detection
– Loop through output to broadcast monitors
– Three-year warranty

The HDR Image Analyzer is the second technology collaboration between AJA and Colorfront, following the integration of Colorfront Engine into AJA’s FS-HDR realtime HDR/WCG converter. Colorfront has exclusively licensed its Colorfront HDR Image Analyzer software to AJA for the HDR Image Analyzer.

The HDR Image Analyzer is available through AJA’s worldwide reseller network for $15,995.

Roundtable Post tackles HFR, UHD and HDR image processing

If you’re involved in post production, especially episodic TV, documentaries and feature films, then it’s highly probable that High Frame Rate (HFR), Ultra High Definition (UHD) and High Dynamic Range (HDR) have come your way.

“On any single project, the combination of HFR, UHD and HDR image-processing can be a pretty demanding, cutting-edge technical challenge, but it’s even more exacting when particular specs and tight turnarounds are involved,” says Jack Jones, digital colorist and CTO of full-service boutique facility Roundtable Post Production.

Among the central London facility’s credits are online virals for brands including Kellogg’s, Lurpak, Rolex and Ford, music films for Above & Beyond and John Mellencamp, plus broadcast TV series and feature documentaries for ITV, BBC, Sky, Netflix, Amazon, Discovery, BFI, Channel 4, Showtime and film festivals worldwide. These include Sean McAllister’s A Northern Soul, Germaine Bloody Greer (BBC) and White Right: Meeting The Enemy (ITV Exposure/Netflix).

“Yes, you can render-out HFR/UHD/HDR deliverables from a variety of editing and grading systems, but there are not many that can handle the simultaneous combination of these formats, never mind the detailed delivery stipulations and crunching deadlines that often accompany such projects,” says Jones.

Rewinding to the start of 2017, Jones says, “Looking forward to the future landscape of post, the proliferation of formats, resolutions, frame rates and color spaces involved in modern screened entertainment seemed an inevitability for our business. We realized that we were going to need to tackle the impending scenario head-on. Having assessed the alternatives, we took the plunge and gambled on Colorfront Transkoder.”

Transkoder is a standalone, automated system for fast digital file conversion. Roundtable Post’s initial use of Colorfront Transkoder turned out to be the creation of encrypted DCP masters and worldwide deliverables of a variety of long-form projects, such as Nick Broomfield’s Whitney: Can I Be Me, Noah Media Group’s Bobby Robson: More Than a Manager, Peter Medak’s upcoming feature The Ghost of Peter Sellers, and the Colombian feature-documentary To End A War, directed by Marc Silver.

“We discovered from these experiences that, along with incredible quality in terms of image science, color transforms and codecs, Transkoder is fast,” says Jones. “For example, the deliverables for To End A War involved 10 different language versions, plus subtitles. It would have taken several days to complete these straight out of an Avid, but rendering in Transkoder took just four hours.”

More recently, Roundtable Post was faced with the task of delivering country-specific graphics packages, designed and created by production agency Noah Media Group, for use by FIFA rights holders and broadcasters during the 2018 World Cup.

The project involved delivering a mix of HFR, UHD, HDR and HD SDR formats, resulting in 240 bespoke animations, and the production of a mammoth 1,422 different deliverables. These included: 59.94p UHD HDR, 50p UHD HDR, 59.94p HD SDR, 50p HD SDR, 59.94i HD SDR and 50i HD SDR with a variety of clock, timecode, pre-roll, soundtrack, burn-in and metadata requirements as part of the overall specification. Furthermore, the job encompassed the final QC of all deliverables, and it had to be completed within a five-day work week.

“For a facility of our size, this was a significant job in terms of its scale and deadline,” says Jones. “Traditionally, projects like these would involve throwing a lot of people and time at them, and there’s always the chance of human error creeping in. Thankfully, we already had positive experiences with Transkoder, and were eager to see how we could harness its power.”

Using technical data from FIFA, Jones built an XML file containing timelines with all of the relevant timecode, clock, image metadata, WAV audio and file-naming information for the required deliverables. He also liaised with Colorfront’s R&D team and was quickly provided with an initial set of Python script templates that would help automate the various requirements of the job in Transkoder.
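The article doesn’t reproduce Jones’ scripts, but the flavor of this kind of automation, crossing a set of source animations with a table of delivery specs to produce uniquely named render jobs, can be sketched in Python. All field names and values below are hypothetical, not Transkoder’s scripting API; with these toy numbers the product is 1,440 jobs, whereas the real job’s 1,422 reflected per-format exceptions:

```python
import itertools, json

# Hypothetical delivery specs like those in the FIFA job (illustrative only).
FORMATS = [
    {"fmt": "UHD_HDR_5994p", "res": "3840x2160", "rate": "59.94p", "eotf": "PQ"},
    {"fmt": "UHD_HDR_50p",   "res": "3840x2160", "rate": "50p",    "eotf": "PQ"},
    {"fmt": "HD_SDR_5994p",  "res": "1920x1080", "rate": "59.94p", "eotf": "Rec709"},
    {"fmt": "HD_SDR_50p",    "res": "1920x1080", "rate": "50p",    "eotf": "Rec709"},
    {"fmt": "HD_SDR_5994i",  "res": "1920x1080", "rate": "59.94i", "eotf": "Rec709"},
    {"fmt": "HD_SDR_50i",    "res": "1920x1080", "rate": "50i",    "eotf": "Rec709"},
]

def build_manifest(animations, formats):
    """Cross every source animation with every delivery format,
    yielding one uniquely named render job per deliverable."""
    return [{"source": anim, "output": f"{anim}_{spec['fmt']}.mxf", **spec}
            for anim, spec in itertools.product(animations, formats)]

animations = [f"country_pack_{i:03d}" for i in range(240)]
manifest = build_manifest(animations, FORMATS)
print(len(manifest), "render jobs")       # 1,440 with these toy numbers
print(json.dumps(manifest[0], indent=2))
```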

Roundtable Post was able to complete the FIFA 2018 World Cup job, including the client-attend QC of the 1,422 different UHD HDR and HD SDR assets, in under three days.

Our Virtual Color Roundtable

By Randi Altman

The number of things you can do with color in today’s world is growing daily. It’s not just about creating a look anymore; it’s about using color to tell or enhance a story. And because filmmakers recognize this power, they are getting colorists involved in the process earlier than ever before. And while the industry is excited about HDR and all it offers, the process also creates its own set of challenges and costs.

To find out what those in the trenches are thinking, we reached out to makers of color gear as well as hands-on colorists with the same questions, all in an effort to figure out today’s trends and challenges.

Company 3 Senior Colorist Stephen Nakamura
Company 3 is a global group of creative studios specializing in color and post services for features, TV and commercials. 

How has the finishing of color evolved most recently?
By far, the most significant change in the work that I do is the requirement to master for all the different exhibition mediums. There’s traditional theatrical projection at 14 footlamberts (fL) and HDR theatrical projection at 30fL. There’s IMAX. For home video, there’s UHD and different flavors of HDR. Our task with all of these is to master the movie so it feels and looks the way it’s supposed to feel and look on all the different formats.
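For reference, those footlambert figures convert to the nit values used in video mastering at roughly 3.426 nits per fL (1 fL is 1/π candela per square foot), so the two theatrical targets sit near 48 and 103 nits, far below the 1,000-nit-plus peaks of home HDR grades:

```python
FL_TO_NITS = 3.426  # 1 footlambert is about 3.426 candelas per square meter (nits)

for fl in (14, 30):
    print(f"{fl} fL ~ {fl * FL_TO_NITS:.0f} nits")
# 14 fL ~ 48 nits (standard theatrical); 30 fL ~ 103 nits (HDR theatrical)
```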

There’s no one-size-fits-all approach. The colorist’s job is to work with the filmmakers and make those interpretations. At Company 3 we’re always creating custom LUTs. There are other techniques that help us get where we need to be to get the most out of all these different display types, but there’s no substitute for taking the time and interpreting every shot for the specific display format.

How have laser projection and HDR impacted the work? And do you feel that DPs are working differently now that laser projection and home HDR experiences are becoming more relevant?
Not too long ago, a cinematographer could expose an image specifically for one display format — a film print projected at 14fL. They knew exactly where they could place their highlights and shadows to get a precise look onscreen. Today, they’re thinking in terms of the HDR version, where if they don’t preserve detail in the blacks and whites it can really hurt the quality of the image in some of the newer display methods.

I work frequently with Dariusz Wolski (Sicario: Day of the Soldado, All the Money in the World). We’ve spoken about this a lot, and he’s said that when he started shooting features, he often liked to expose things right at the edge of underexposure because he knew exactly what the resulting print would be like. But now he has to preserve the detail and fine-tune it with me in post because it has to work in so many different display formats.

There are also questions about how the filmmakers want to use the different ways of seeing the images. Sometimes they really like the qualities of the traditional theatrical standard and don’t want the HDR version to look very different or to push the expanded dynamic range. If we have more dynamic range, more light, to work with, it means that in essence we have a larger “canvas” to work on. But you need to take the time to individually treat every shot if you want to get the most out of that “canvas.”

Where do you see the industry moving in the near future?
The biggest change I expect to see is the development of even brighter, higher-contrast exhibition mediums. At NAB, Sony unveiled a wall of LED panels that are stitched together without seams and can display up to 1,000 nits. It can be the size of a screen in a movie theater. If that took off, it could be a game changer. If theatrical exhibition gets better with brighter, higher-contrast screens, I think the public will enjoy it, provided that the images are mastered appropriately.

Sicario: Day of the Soldado

What is the biggest challenge you see for the color grading process now and beyond?
As there are more formats, there will be more versions of the master. From P3 to Rec.709 to HDR video in PQ — they all translate color information differently. It’s not just the brightness and contrast but the individual colors. If there’s a specific color blue the filmmakers want for Superman’s suit, or red for Spiderman, or whatever it is, there are multiple layers of challenges involved in maintaining those across different displays. Those are things you have to take a lot of care with when you get to the finishing stage.

What’s the best piece of work you’ve seen that you didn’t work on?
I know it was 12 years ago now, but I’d still say 300, which was colored by Company 3 CEO Stefan Sonnenfeld. I think that was enormously significant. Everyone who has seen that movie is aware of the graphic-novel-looking imagery that Stefan achieved in color correction working with Zack Snyder and Larry Fong.

We could do a lot in a telecine bay for television, but a lot of people still thought of digital color correction for feature films as an extension of the timing process from the photochemical world. But the look in 300 would be impossible to achieve photo-chemically, and I think that opened a lot of people’s minds about the power of digital color correction.

Alt Systems Senior Product Specialist Steve MacMillian
Alt Systems is a systems provider, integrating compositing, DI, networking and storage solutions for the media and entertainment industry.

How has the finishing of color evolved most recently?
Traditionally, there has been a huge difference between the color finishing process for television production versus cinematic release. It used to be that a target format was just one thing, and finishing for TV was completely different from finishing for cinema.

Colorists working on theatrical films will spend most of their effort on grading for projection, and only afterward is there a detailed trim pass to make a significantly different version for the small screen. Television colorists, who are usually under much tighter schedules, will often only be concerned with making Rec.709 look good on a standard broadcast monitor. Unless a great deal of care is taken to preserve the color and dynamic range of the digital negative throughout the process, the Rec.709 grade will not be suitable for translation to expanded formats like HDR.

Now, there is an ever-growing number of distribution formats with different color and brightness requirements. And with the expectation of delivering to all of these on ever-tighter production budgets, it has become important to use color management techniques so that the work is not duplicated. If done properly, this allows for one grade to service all of these requirements with the least amount of trimming needed.
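
Structurally, "one grade services all deliverables" means the creative grade is applied once in a scene-referred working space, and each deliverable gets only an output transform plus a small trim. A hypothetical sketch of that shape (my illustration, not Alt Systems' software):

    # Hypothetical color-managed finishing pipeline: grade once, render many.
    def finish(shot_scene_linear, creative_grade, output_transforms, trims=None):
        """shot_scene_linear: image in a scene-referred working space.
        creative_grade: applied once, independent of any display.
        output_transforms: {deliverable name: display rendering transform}.
        trims: optional {deliverable name: small per-target adjustment}."""
        graded = creative_grade(shot_scene_linear)
        trims = trims or {}
        identity = lambda img: img
        return {name: odt(trims.get(name, identity)(graded))
                for name, odt in output_transforms.items()}

The point of the structure is that adding a new deliverable means adding one output transform, not regrading the show.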

How has laser projection and HDR impacted the work?
HDR display technology, in my opinion, has changed everything. The biggest impact on color finishing is the need to monitor in both HDR and SDR in different color spaces. Also, there is a much larger set of complex delivery requirements, along with the need for greater technical expertise and capabilities. Much of this complexity can be reduced by tools that make the various HDR image transforms and delivery formats as automatic as possible.

Color management is more important than ever. Efficient and consistent workflows are needed for dealing with multiple sources with unique color sciences, integrating visual effects and color grading while preserving the latitude and wide color gamut of the image.

The color toolset should support remapping to multiple deliverables in a variety of color spaces and luminance levels, and include support for dynamic HDR metadata systems like Dolby Vision and HDR10+. As HDR color finishing has evolved, so has the way it is delivered to studios; most commonly it arrives as an HDR IMF package. Rec.2020 HDR deliverables are commonly constrained to the P3 color volume, and light-level histograms and HDR QC reports are often required alongside the master.
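
Those light-level reports usually boil down to two numbers carried as static HDR metadata: MaxCLL and MaxFALL. A minimal Python sketch of the computation as defined in CTA-861.3 (my example, not a specific QC tool):

    import numpy as np

    def content_light_levels(frames):
        """frames: iterable of (H, W, 3) float arrays in linear light, in nits.
        Per CTA-861.3:
          MaxCLL  = brightest single pixel component anywhere in the program.
          MaxFALL = highest per-frame average of per-pixel max(R, G, B)."""
        max_cll = max_fall = 0.0
        for frame in frames:
            per_pixel_max = frame.max(axis=2)   # max(R, G, B) for each pixel
            max_cll = max(max_cll, float(per_pixel_max.max()))
            max_fall = max(max_fall, float(per_pixel_max.mean()))
        return max_cll, max_fall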

Do you feel DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
Not as much as you would think. Two things are working against this. First, film and high-end digital cameras themselves have for some time been capturing latitude suitable for HDR production. Proper camera exposure is all that is needed to ensure that an image with a wide enough dynamic range is recorded. So from a capture standpoint, nothing needs to change.

The other is cost. There are currently only a small number of suitable HDR broadcast monitors, and most of these are extremely expensive and not designed well for the set. I’m sure HDR monitoring is being used on-set, but not as much as expected for productions destined for HDR release.

Also, it is difficult to truly judge HDR displays in a bright environment, and cinematographers may feel that monitoring in HDR is not needed full time. With film production, cinematographers became accustomed to not being able to monitor accurately on-set, and learned to rely on their experience and other means of judging light and exposure. I think the main concern for cinematographers is the effect of lighting choices and apparent resolution, saturation and contrast when viewed in HDR.

Highlights in the background can potentially become distracting when displayed at 1,000 nits versus being clamped at 100. Framing and lighting choices are informed by proper HDR monitoring. I believe we will see more HDR monitoring on-set as more suitable displays become available.

Colorfront’s Transkoder

Where do you see the industry moving in the near future?
Clearly HDR display technology is still evolving, and we will see major advances in HDR emissive displays for the cinema in the very near future. This will bring new challenges and require updated infrastructure for post as well as the cinema. It’s also likely that color finishing for the cinema will become more and more similar to the production of HDR for the home, with only relatively small differences in overall luminance and the ambient light of the environment.

Looking forward, standard dynamic range will eventually go away in the same way that standard definition video did. As we standardize on consumer HDR displays, and high-performance panels become cheaper to make, we may not need the complexity of HDR dynamic remapping systems. I expect that headset displays will continue to evolve and will become more important as time goes on.

What is the biggest challenge you see for the color grading process now and beyond?
We are experiencing a period of change that can be compared to the scope of change from SD to HD production, except it is happening much faster. Even if HDR in the home is slow to catch on, it is happening. And nobody wants their production to be dated as SDR-only. Eventually, it will be impossible to buy a TV that is not HDR-capable.

Aside from the changes in infrastructure, colorists used to working in SDR have some new skills to learn. I think it is a mistake to do separate grading versions for every major delivery format. Even though we have standards for HDR formats, they will continue to evolve, so post production must evolve too. The biggest challenge is meeting all of these different delivery requirements on budgets that are not growing as fast as the formats.

Northern Lights Flame Artist and Colorist Chris Hengeveld
NY- and SF-based Northern Lights, along with sister companies Mr. Wonderful for design, SuperExploder for composing and audio post, and Bodega for production, offers one-stop-shop services.

How has the finishing of color evolved most recently?
It’s interesting that you use the term “finishing of color.” In my clients’ world, finishing and color now go hand in hand. My commercial clients expect not only a great grade but seamless VFX work in finalizing their spots. Both of these are now often taking place with the same artist. Work has been pushed from just straight finishing with greenscreen, product replacement and the like to doing a grade up to par with some of the higher-end coloring studios. Price is pushing vastly separate disciplines into one final push.

Clients now expect to have a rough look ready not only of the final VFX, but also of the color pass before they attend the session. I usually only do minor VFX tweaks when clients arrive. Sending QuickTimes back and forth between studio and client usually gets us to a place where our client, and their client, are satisfied with at least the direction if not the final composites.

Color, as a completely subjective experience, is best judged with the colorist in the room. We do grade some jobs remotely, but my experience has clearly been that, from both time and creativity standpoints, it's best to be in the grading suite. Unfortunately, due to time constraints and budget issues, even higher-end projects are now being evaluated on a computer, phone or tablet back at the office. This leads to more iterations and less of a "the whole is greater than the sum of the parts" mentality. Client interaction, especially at the grading level, works best in the same room as the colorist; often the final product is markedly better than what either party could envision separately.

Where do you see the industry moving in the near future?
I see the industry continuing to coalesce around multi-divisional companies that are best suited to fulfill many clients’ needs at once. Most projects that come to us have diverse needs that center around one creative idea. We’re all just storytellers. We do our best to tell the client’s story with the best talent we offer, in a reasonable timeframe and at a reasonable cost.

The future will continue to evolve, putting more pressure on editorial staff to deliver near-perfect rough cuts that could become finals in the not-too-distant future.

Invisalign

The tools continue to level the playing field. More generalists will be trained in disciplines including video editing, audio mixing, graphic design, compositing and color grading. This is not to say that the future of singularly focused creatives is over. It’s just that those focused creatives are assuming more and more responsibilities. This is a continuation of the consolidation of roles that has been going on for several years now.

What is the biggest challenge you see for the color grading process now and beyond?
The biggest challenge going forward is both technical and budgetary. Many new formats have emerged, including ProRes RAW, along with new working color spaces. Many of us work without on-staff color scientists and must find our way through the morass of HDR, ACES, scene-linear and Rec.709. Working with materials that round-trip in-house is vastly easier than dealing with multiple shops, each with their own way of working. As we collaborate with outside shops, it behooves us to stay at the forefront of technology.

But truth be told, perhaps the biggest challenge is keeping the creative flow and putting the client's needs first: making sure the technical challenges don't get in the way. Clients need a seamless experience without technical hurdles.

What’s the best piece of work you’ve seen that you didn’t work on?
I am constantly amazed at the quality of work coming out of Netflix. Some of the series are impeccably graded. Early episodes of Bloodline, which was shot with the Sony F65, come to mind. The visuals were completely absorbing, both daytime and nighttime scenes.

Codex VP Business Development Brian Gaffney
Codex designs tools for color, dailies creation, archiving, review and network-attached storage. Their offerings include the new Codex ColorSynth with Keys and the MediaVault desktop NAS.

How has the finishing of color evolved most recently?
While it used to be a specialized suite in a post facility, color finishing has evolved tremendously over the last 10 years, with low-cost access to powerful systems like Resolve used everywhere from on-set commercial finishing to final DI color grading. These systems have also evolved into more than just color; they are now editorial, sound mixing and complete finishing platforms.

How has laser projection and HDR impacted the work?
Brighter images in the theater and the home via laser projection, OLED walls and HDR displays will certainly change the viewer's experience, and they have helped create more work in post, offering up another pass for grading.

However, brighter images also show off image artifacts and can bring attention to highlights that may already be clipping. Shadow detail that was graded in SDR may now look milky in HDR. These new display mediums require spending more time optimizing the color correction for each display type; there is no magic one-grade-fits-all.

Do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
I think cinematographers are still figuring this out. Much like color correction between SDR and HDR, lighting for the two is different. A window that was purposely blown out in SDR, to hide a lighting rig outside, may show up in HDR, exposing the rig itself. Color correction might be able to correct for this, but unless a cinematographer can monitor in HDR on-set, these issues will come up in post. To do it right, lighting optimization between the two spaces is required, plus SDR and HDR monitoring on-set and near-set and in editorial.

Where do you see the industry moving in the near future?
It’s all about content. With the traditional studio infrastructure and broadcast television market changing to Internet Service Providers (ISPs), the demand for content, both original and HDR remastered libraries, is helping prop up post production and is driving storage- and cloud-based services.

Codex’s ColorSynth and Media Vault

In the long term, if the competition in this space continues and the demand for new content keeps expanding, traditional post facilities will become "secure data centers" and managed service providers. With cloud-based services, the talent no longer needs to be in the facility with the client. Shared projects with realtime interactivity from desktop and mobile devices will allow more collaboration among globally distributed productions.

What is the biggest challenge you see for the color grading process now and beyond?
Project management — sharing color setups among different workstations. Monitoring color on properly calibrated displays, in both SDR and HDR and in support of multiple deliverables, is always a challenge. New display technologies, like laser projection and the new Samsung and Sony videowalls, may not be cost-effective for the creative community to access for final grading. Only certain facilities may wind up specializing in this type of grading experience, limiting global access for directors and cinematographers who want to fully visualize how their product will look on these new display mediums. It's a cost that may not get the needed ROI, so in the near future many facilities may not be able to support the full demand of deliverables properly.

Blackmagic Director of Sales/Operations Bob Caniglia
Blackmagic creates DaVinci Resolve, a solution that combines professional offline and online editing, color correction, audio post production and visual effects in one software tool.

How has the finishing of color evolved most recently?
The ability to work in 8K, and whatever flavor of HDR you see, is happening. But if you are talking evolution, it is about the ability to collaborate with everyone in the post house, and the ability to do high-quality color correction anywhere. Editors, colorists, sound engineers and VFX artists should not be kept apart or kept from being able to collaborate on the same project at the same time.

New collaborative workflows will speed up post production because you will no longer need to import, export or translate projects between different software applications.

How has laser projection and HDR impacted the work?
The most obvious impact has been on the need for colorists to be using software that can finish a project in whatever HDR format the client asks for. That is the same with laser projection. If you do not use software that is constantly updating to whatever new format is introduced, being able to bid on HDR projects will be hard.

HDR is all about more immersive colors. Any colorist should be ecstatic to be able to work with images that are brighter, sharper and with more data. This should allow them to be even more creative with telling a story with color.

Do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
As for cinematographers, HDR gives viewers a whole new level of image detail. But that hyper-reality could draw the viewer away from the intended target in a shot. The beautiful details shining back on a coffee pot in a tracking shot may not be worth worrying about in SDR, but in HDR every shot will create more work for the colorist to make sure the viewer doesn't get distracted by the little things. For DPs, it means they are going to have to be much more aware of lighting, framing and planning the impact of every possible item and shadow in an image.

Where do you see the industry moving in the near future?
Peace in our time amongst all of the different post silos, because those silos will finally be open. And there will be collaboration between all parts of the post workflow. Everyone — audio, VFX, editing and color correction — can work together on the same project seamlessly.

For example, in our Resolve tool, post pros can move between them all. This is what we see happening with colorists and post houses right now, as each member of the post team can be much more creatively flexible because anyone can explore new toolsets. And with new collaboration tools, multiple assistants, editors, colorists, sound designers and VFX artists can all work on the same project at the same time.

Resolve 15

For a long-term view, you will always have true artists in each of the post areas. People who have mastered the craft and can separate themselves as being color correction artists. What is really going to change is that everyone up and down the post workflow at larger post houses will be able to be much more creative and efficient, while small boutique shops and freelancers can offer their clients a full set of post production services.

What is the biggest challenge you see for the color grading process now and beyond?
Speed and flexibility. With everyone now collaborating and the colorist part of every stage of the post process, you will be asked to do things immediately… and in any format. If you are not able to work in real time with whatever footage format is thrown at you, they will find someone who can.

This also comes with the challenge of changing the old notion that the colorist is one of the last people to touch a project. You will be asked to jump in early and often. Because every client would love to show early edits that are graded to get approvals faster.

FilmLight CEO Wolfgang Lempp
FilmLight designs, creates and manufactures color grading systems, image processing applications and workflow tools for the film and television industry.

How has the finishing of color evolved recently?
When we started FilmLight 18 years ago, color management was comparatively simple: Video looked like video, and digital film was meant to look like film. And that was also the starting point for the DCI — the digital cinema standard tried to make digital projection look exactly like conventional cinema. This understanding lasted for a good 10 years, and even ACES today is very much built around film as the primary reference. But now we have an explosion of new technologies, new display devices and new delivery formats.

There are new options in resolution, brightness, dynamic range, color gamut, frame rate and viewing environments. The idea of a single deliverable has gone: There are just too many ways of getting the content to the viewer. That is certainly affecting the finishing process — the content has to look good everywhere. But there is another trend visible, too, which here in the UK you can see best on TV. The color and finishing tools are getting more powerful and the process is getting more productive. More programs than ever before are getting a professional color treatment before they go out, and they look all the better for it.

Either way, there is more work for the colorist and finishing house, which is of course something we welcome.

How has laser projection and HDR impacted the work?
Laser projection and HDR for cinema and TV are examples of what I described above. We have the color science and the tools to move comfortably between these different technologies and environments, in that the color looks “right,” but that is not the whole story.

The director and DP will choose to use a format that will best suit their story, and will shoot for their target environment. In SDR, you might have a bright window in an interior scene, for example, which will shape the frame but not get in the way of the story. But in HDR, that same window will be too bright, obliterate the interior scene and distract from the story. So you would perhaps frame it differently, or light up the interior to restore some balance. In other words, you have to make a choice.

HDR shouldn’t be an afterthought, it shouldn’t be a decision made after the shoot is finished. The DP wants to keep us on the edge of our seats — but you can’t be on the edge in HDR and SDR at the same time. There is a lot that can be done in post, but we are still a long way from recreating the multispectral, three-dimensional real world from the output of a camera.

HDR, of course, looks fantastic, but the industry is still learning how to shoot for best effect, as well as how to serve all the distribution formats. It might well become the primary mastering format soon, but SDR will never go away.

Where do you see the industry moving in the future?
For me, it is clear that as we have pushed resolution, frame rate, brightness and color gamut, it has affected the way we tell stories. Less is left to the imagination. Traditional "film style" gave a certain pace to the story because the audience was expected to interpret, to think through, to fill in the black screen in between.

Now technology has made things more explicit and more immersive. We now see true HDR cinema technology emerging with a brightness of 600 nits and more. Technology will continue to surge forward, because that is how manufacturers sell more televisions or projectors — or even phones. And until there is a realistic simulation of a full virtual reality environment, I don’t see that process coming to a halt. We have to be able to master for all these new technologies, but still ensure compatibility with existing standards.

What is the biggest challenge for color grading now and in the future?
Color grading technology is very much unfinished business. There is so much that can be done to make it more productive, to make the content look better and to keep us entertained.

Blackboard

As much as we might welcome all the extra work for our customers, generating an endless stream of versions for each program is not what color grading should be about. So it will be interesting to see how this problem will be solved. Because one way or another, it will have to be. But while this is a big challenge, it hopefully isn’t what we put all our effort into over the coming years.

The real challenge is to understand what makes us appreciate certain images over others. How composition and texture, how context, noise and temporal dynamics — not just color itself — affect our perception.

It is interesting that film as a capture medium is gaining popularity again, especially large-format capture. It is also interesting that the “film look” is still precious when it comes to color grading. It puts all the new technology into perspective. Filmmaking is storytelling. Not just a window to the world outside, replaced by a bigger and clearer window with new technology, but a window to a different world. And the colorist can shape that world to a degree that is limited only by her imagination.

Olympusat Entertainment Senior DI Colorist Jim Wicks
A colorist since 2007, Jim has been a senior DI colorist at Olympusat Entertainment since 2011. He has color restored hundreds of classic films and is very active in the color community.

How has the finishing of color evolved most recently?
The phrase I'm keying in on in your question is "most recently." I believe the role of a colorist has been changing exponentially for the last several years, maybe longer. I would say that we are becoming, if we haven't already, more like finishing artists. Color is now just one part of what we do. Because technologies are changing more rapidly than at any time I've witnessed, we now have a lot to understand and comprehend in addition to color. There is ACES, HDR, changing color spaces, integrating VFX workflows into our timelines, laser projection and so on. That list isn't exhaustive, and it's growing.

How has laser projection and HDR impacted the work?
For the time being, they do not impact my work. I am currently required to deliver in Rec.709. However, within that confine I am grading a wider range of media than ever before, such as 2K and 4K uncompressed DPX, Phantom digital video files and Red Helium 8K in the IPP2 workspace. Laser projection and HDR are something I continue to study by attending symposiums, or wherever I can find that information. I believe laser projection and HDR are important to know now, and when the opportunity to work with them becomes available to me, I plan to be ready.

Do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
Of course! At the very heart of every production, the cinematographer is the creator and author of the image. It is her creative vision. The colorist is the protector of that image. The cinematographer entrusts us with her vision. In this respect, the colorist needs to be in sync with the cinematographer as never before. As cinematographers move because of technology, so do we. It's all about the deliverable and how it will be displayed. I see no benefit in the colorist and the cinematographer being on different pages because of changing technology.

Where do you see the industry moving in the near future and the long-range future?
In the near future: HDR, laser projection, 4K and larger and larger formats.

In the long-range future: I believe we only need to look to the past to see the changes that are inevitably ahead of us.

Technological changes forced film labs, telecine and color timers to change and evolve. In the nearly two decades since O Brother, Where Art Thou?, we no longer color grade movies the way we did back when the Coen classic was released in 2000. I believe it is inevitable: Change begets change. Nothing stays the same.

In keeping with the types of changes that came before, it is only a matter of time before today’s colorist is forced to change and evolve just as those before us were forced to do so. In this respect I believe AI technology is a game-changer. After all, we are moving towards driverless cars. So, if AI advances the way we have been told, will we need a human colorist in the future?

What is the biggest challenge you see for the color grading process now and beyond?
Not to sound like a "get off my lawn" rant, but education is the biggest challenge, and it's a two-fold problem. First, at many fine film schools in the US, color grading is not taught as a degree-granting course, or at all.

Second, the glut of for-profit websites teaching color grading courses has no standardized curriculum, which wouldn't be a problem except that, at present, there is no way to measure how much anyone actually knows. I have personally encountered individuals who claim to be colorists and yet do not know how to color grade. As a manager I have interviewed them; their resumes look strong, but the skills are not there. They can't do the work.

What’s the best piece of work you’ve seen that you didn’t work on?
Just about anything shot by Roger Deakins. I am a huge fan of his work. Mitch Paulson and his team at Efilm did great work on protecting Roger’s vision for Blade Runner 2049.

Colorist David Rivero
This Madrid-born colorist is now based in China. He color grades and supervises the finishing of feature films and commercials, normally all versions, and often the trailers associated with them.

How has the finishing of color evolved most recently?
The line between strictly color grading and finishing is getting blurrier by the year. Although there is still a clearer separation in the commercial world, on the film side the colorist has become the de facto finishing or supervising finishing artist. I think it is another sign of the bigger role color grading is starting to play in post.

In the last two to three years I’ve noticed that fewer clients are looking at it as an afterthought, or as simply “color matching.” I’ve seen how the very same people went from a six- to seven-day DI schedule five years ago to a 20-day schedule now. The idea that spending a relatively small amount of extra time and budget on the final step can get you a far superior result is finally sinking in.

The tools and technology are finally moving into a “modern age” of grading:
– HDR is a game changer on the image side of things, providing a noticeable difference for the audience and requiring a different approach on our side in how we deal with all that information.

– The eventual acceptance by all color systems of tools that were traditionally for compositing or VFX is also a turning point, although a controversial one. There are many who think that colorists should focus on grading. However, I think that rather than colorists becoming compositors, it is the color grading concept and mission that is (still) evolving.

How has laser projection and HDR impacted the work? And do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
Well, on my side of the world (China), the laser and HDR technologies are just starting to get to the public. Cinematographers are not really changing how they work yet, as it is a very small fraction of the whole exhibition system.

As for post, it requires a more careful way of handling the image: It needs higher-quality plates, composites, CG and VFX, and a more careful grade, and you can't get away with as many tricks as you did when it was just SDR. The bright side is the marvelous images, and how different they can be from each other. I believe HDR is totally compatible with every style you could do in SDR, while opening the doors to new ones. There are also different approaches to shooting and lighting for cinematographers and CG artists.

Goldbuster

The biggest challenge it has created has been on the exhibition side in China. Although Dolby cinemas (Vision+Atmos) are controlled and require a specific pass and DCP, there are other laser-projection theaters that show the same DCP being delivered to common (xenon-lamp) theaters. This creates a frustrating environment. During a 3D grade, for example, you not only need to consider the very dark theaters running at 3fL-3.5fL, but also the new laser rooms that rack up their lamps to 7fL-8fL to show off why they charge higher ticket prices.

Where do you see the industry moving in the near future and the long-range future?
I hope to see the HDR technologies settling and becoming the new standard within the next five to six years, with HDR used as the reference master from which all other deliveries are created. I also expect all these relatively new practices and workflows (involving ACES, EXRs with the VFX/CG passes, non-LUT deliveries) to become more standardized and controlled.

In the long term, I could imagine two main changes happening, closely related to each other:
– The concept of grading and the colorist, especially in films and long formats, evolving in importance and in their relationship within the production. I believe the separation, or independence, between photography and grading will get wider (necessarily so) as tools evolve and the process becomes more standardized. We might get into something akin to how sound editors and sound mixers relate and work together on the sound.

– The addition of (serious) compositing in essentially all the main color systems is the first step toward the possibilities of future grading. A feature like the recent Face Refinement in Resolve is one of the things I dreamed about five or six years ago.

What is the biggest challenge you see for the color grading process now and beyond?
Nowadays one of the biggest challenges is possibly the multi-mastering environment, with several versions in different color spaces, displays and aspect ratios. It is becoming easier, but it is still more painful than it should be.

Shrinking margins also hurt the whole industry. We all work thanks to those margins, and cutting budgets while expecting the same results is not something that is going to happen.

What’s the best piece of work you’ve seen that you didn’t work on?
The Revenant, Mad Max, Fury and 300.

Carbon Colorist Aubrey Woodiwiss
Full-service creative studio Carbon has offices in New York, Chicago and Los Angeles.

How has the finishing of color evolved most recently?
It is always evolving. The tools are becoming ever more powerful, and camera formats are becoming larger, with more range and information in them. Probably the most significant evolution I see is a greater understanding of color science and color space workflows.

How has laser projection and HDR impacted the work? And do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
These elements impact how footage is viewed and dealt with in post. As far as I can see, it isn’t affecting how things are shot.

Where do you see the industry moving in the near future? What about in the long-range future?
I see formats becoming larger, viewing spaces and color gamuts becoming wider, and more streaming- and laptop-based technologies and workflows.

What is the biggest challenge you see for the color grading process now and beyond?
The constant challenge is integrating the space you traditionally color grade in to how things are viewed outside of this space.

What’s the best piece of work you’ve seen that you didn’t work on?
Knight of Cups, directed by Terrence Malick with cinematography by Emmanuel Lubezki.

Ntropic Colorist Nick Sanders
Ntropic creates and produces work for commercials, music videos, and feature films as well as experiential and interactive VR and AR media. They have locations in San Francisco, Los Angeles and New York City.

How has the finishing of color evolved most recently?
SDR grading in Rec.709 and 2.4 Gamma is still here, still looks great, and will be prominent for a long time. However, I think we’re becoming more aware of how exciting grading in HDR is, and how many creative doors it opens. I’ve noticed a feeling of disappointment when switching from an HDR to an SDR version of a project, and wondered for a second if I’m accidentally viewing the ungraded raw footage, or if my final SDR grade is actually as flat as it appears to my eyes. There is a dramatic difference between the two formats.

HDR is incredible because you can make the highlights blisteringly hot, saturate a color to nuclear levels or keep things mundane and save those heavier-handed tools in your pocket for choice moments in the edit where you might want some extra visceral impact.

How has laser projection and HDR impacted the work? And do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
In one sense, cinematographers don’t need to do anything differently. Colorists are able to create high-quality SDR and HDR interpretations of the exact same source footage, so long as it was captured in a high-bit-depth raw format and exposed well. We’re even seeing modern HDR reimaginings of classic films. Movies as varied in subject matter as Saving Private Ryan and the original Blade Runner are coming back to life because the latitude of classic film stocks allows it. However, HDR has the power to greatly exaggerate details that may have otherwise been subtle or invisible in SDR formats, so some extra care should be taken in projects destined for HDR.

Extra contrast and shadow detail mean that noise is far more apparent in HDR projects, so ISO and exposure should be adjusted on-set accordingly. Also, the increased highlight range has some interesting consequences in HDR. For example, large blown-out highlights, such as overexposed skies, can look particularly bad. HDR can also retain more detail and color in the upper ranges in a way that may not be desirable. An unremarkable, desaturated background in SDR can become a bright, busy and colorful background in HDR. It might prove distracting to the point that the DP may want to increase his or her key lighting on the foreground subjects to refocus our attention on them.

Panasonic “PvP”

Where do you see the industry moving in the near future? What about the long-range future?
I foresee more widespread adoption of HDR — in a way that I don’t with 3D and VR — because there’s no headset device required to feel and enjoy it. Having some HDR nature footage running on a loop is a great way to sell a TV in Best Buy. Where the benefits of another recent innovation, 4K, are really only detectable on larger screens and begin to deteriorate with the slightest bit of compression in the image pipeline, HDR’s magic is apparent from the first glance.

I think we’ll first start to see HDR and SDR orders on everything, then a gradual phasing out of the SDR deliverables as the technology becomes more ubiquitous, just like we saw with the standard definition transition to HD.

For the long-range, I wouldn’t be surprised to see a phasing out of projectors as LED walls become more common for theater exhibitions due to their deeper black levels. This would effectively blur the line between technologies available for theater and home for good.

What is the biggest challenge you see for the color grading process now and beyond?
The lack of a clear standard makes workflow decisions a little tricky at the moment. One glaring issue is that consumer HDR displays don’t replicate the maximum brightness of professional monitors, so there is a question of mastering one’s work for the present, or for the near future when that higher capability will be more widely available. And where does this evolution stop? 4,000 nits? 10,000 nits?
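
The ceiling Sanders is asking about is actually built into the signal: PQ (SMPTE ST 2084) encodes absolute luminance up to 10,000 nits, so masters graded to 1,000 or 4,000 nits simply sit partway up the curve. A short sketch using the standard ST 2084 constants (the code itself is my illustration):

    # PQ (SMPTE ST 2084) inverse EOTF: absolute nits -> normalized code value.
    m1 = 2610 / 16384         # 0.1593017578125
    m2 = 2523 / 4096 * 128    # 78.84375
    c1 = 3424 / 4096          # 0.8359375
    c2 = 2413 / 4096 * 32     # 18.8515625
    c3 = 2392 / 4096 * 32     # 18.6875

    def pq_encode(nits):
        y = min(max(nits / 10000.0, 0.0), 1.0)   # normalize to the 10,000-nit ceiling
        return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

    # 100, 1,000 and 4,000 nits all land well below the top of the signal range.
    for level in (100, 1000, 4000, 10000):
        print(level, round(pq_encode(level), 4))

A 100-nit SDR peak encodes to only about half the PQ code range, which is why remastering headroom exists, and why "where does it stop?" is bounded, for now, at 10,000 nits.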

Maybe a more pertinent creative challenge in the crossover period is which version to grade first, SDR or HDR, and how to produce the other version. There are a couple of ways to go about it, from using LUTs to initiate and largely automate the conversion to starting over from scratch and regrading the source footage in the new format.
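
As a toy illustration of the "largely automated" route — a simple Reinhard-style highlight compression, not any vendor's actual transform — an HDR grade's luminance can be squeezed under an SDR ceiling and then trimmed by the colorist:

    def tonemap_hdr_to_sdr(nits, hdr_peak=1000.0, sdr_peak=100.0):
        """Toy Reinhard-style curve: maps 0..hdr_peak nits onto 0..sdr_peak nits,
        rolling off highlights while scaling shadows and mids more gently."""
        x = nits / hdr_peak
        y = 2.0 * x / (1.0 + x)   # Reinhard curve, rescaled so hdr_peak -> sdr_peak
        return sdr_peak * min(y, 1.0)

A LUT-based workflow bakes a curve like this (plus gamut mapping) into a 3D LUT so every shot starts from a consistent SDR rendering; regrading from scratch trades that consistency for full per-shot control.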

What’s the best piece of work you’ve seen that you didn’t work on?
Chef’s Table on Netflix was one of the first things I saw in HDR; I still think it looks great!

Main Image: Courtesy of Jim Wicks.

Understanding and partnering on HDR workflows

By Karen Maierhofer

Every now and then a new format or technology comes along that has a profound effect on post production. Currently, that tech is high dynamic range, or HDR, which offers a heightened visual experience through a greater dynamic range of luminosity.

Michel Suissa

So why is HDR important to the industry? "That is a massive question to answer, but to make a pretty long story relatively short, it is by far one of the recent technologies to emerge with the greatest potential to change how images affect audiences," says Michel Suissa, manager of professional solutions at The Studio–B&H. "Regardless of the market and the medium used to distribute programming, and regardless of where and how these images are consumed, it is a clearly noticeable enhancement, and at the same time a real marketing gold mine for manufacturers as well as content producers, since a premium can be attached to offering HDR as a feature."

And he should know. Suissa has been helping a multitude of post studios navigate the HDR waters in their quest for the equipment necessary to meet their high dynamic range needs.

Suissa started seeing a growing appetite for HDR roughly three years ago, both in the consumer and professional markets and at about the same time. “Three years ago, if someone had said they were creating HDR content, a very small percentage of the community would have known what they were talking about,” he notes. “Now, if you don’t know what HDR is and you’re in the industry, then you are probably behind the times.”

Nevertheless, HDR is demanding in terms of the knowledge needed to create and distribute HDR content, as well as to make sure people can consume it in a way that's satisfying, Suissa points out. "And there are still a lot of technical requirements that people have to carefully navigate through because it is hardly trivial," he says.

How does a company like B&H go about helping a post studio select the right tools for their individual workflow needs? “The basic yet critically important task is understanding their workflow, their existing tool set and what is expected of them in terms of delivery to their clients,” says Suissa.

To assist studios and content creators working in post, The Studio–B&H team follows a blueprint that’s based on engaging customers about the nature of the work they do, asking questions like: Which camera material do they work from? In which form is the original camera material used? What platform do they use for editing? What is the preferred application to master HDR images? What is the storage and network infrastructure? What are the master delivery specifications they must adhere to (what flavor of HDR)?

“People have the most difficulty understanding the nature of the workflow: Do the images need to be captured differently from a camera? Do they need to be ingested in the post system differently? Do they need to be viewed differently? Do they need to be formatted differently? Do they need to be mastered differently? All those things created a new set of specifications that people have to learn, and this is where it has changed the way people handle post production,” Suissa contends. “There’s a lot of intricacies, and you have to understand what it is you’re looking at in order to make sure you’re making the correct decisions — not just technically, but creatively as well.”

When adding an HDR workflow, studios typically approach B&H looking for equipment across their entire pipeline. However, Suissa states that similar parameters apply for HDR work as for other high-performance environments. People will continue to need decent workstations, powerful GPUs, professional storage for performance and increased capacity, and an excellent understanding of monitoring. “Other aspects of a traditional pipeline can sometimes remain in play, but it is truly a case-by-case analysis,” he says.

The most critical aspect of working with HDR is the viewing experience, Suissa says, so selecting an appropriate monitoring solution is vital — as is knowing the output specifications that will be used for final delivery of the content.

Without question, Suissa has seen an increase in the number of studios asking about HDR equipment of late. “Generally speaking, the demand by people wanting to at least understand what they need in order to deliver HDR content is growing, and that’s because the demand for content is growing,” he says.

Yes, there are compromises that studios are making in terms of HDR that are based on budget. Nevertheless, there is a tipping point that can lead to the rejection of a project if it is not up to HDR standards. In fact, Suissa foresees in the next six months or so the tightening of standards on the delivery side, whether for Amazon, Netflix or the networks, and the issuance of mandates by over-the-air distribution channels in order for content to be approved as HDR.

B&H/Light Iron Collaboration
Among the studios that have purchased HDR equipment from B&H is Light Iron, a Panavision company with six facilities spanning the US that offer a range of post solutions, including dailies and DI. According to Light Iron co-founder Katie Fellion, the number of their clients requesting HDR finishing has increased in the past year. She estimates that one out of every three clients is considering HDR finishing, and in some cases, they are doing so even if they don’t have distribution in place yet.

Suissa and Light Iron SVP of innovation Michael Cioni gradually began forging a fruitful collaboration during the last few years, partnering a number of times at various industry events. “At the same time, we doubled up on our relationship of providing technology to them,” Suissa adds, whether for demonstrations or for Light Iron’s commercial production environment.

Katie Fellion

For some time, Light Iron has been moving toward HDR, purchasing equipment from various vendors along the way. In fact, Light Iron was one of the very first vendors to become involved with HDR finishing when Amazon introduced HDR-10 mastering for the second season of one of its flagship shows, Transparent, in 2015.

“Shortly after Transparent, we had several theatrical releases that also began to remaster in both HDR-10 and Dolby Vision, but the requests were not necessarily the norm,” says Fellion. “Over the last three years, that has steadily changed, as more studios are selling content to platforms that offer HDR distribution. Now, we have several shows that started their Season 1 with a traditional HD finish, but then transitioned to 4K HDR finishes in order to accommodate these additional distribution platform requirements.”

Some of the more recent HDR-finished projects at Light Iron include GLOW (Season 2) and 13 Reasons Why (Season 2) for Netflix, Uncle Drew for Lionsgate, Life Itself for Amazon, Baskets (Season 3) and Better Things (Season 2) for FX, and Action Point for Paramount.

Without question, HDR is important to today's finishing, but one cannot just step blindly into this new, highly detailed world. There are important factors to consider. For instance, the 4K 16-bit source files required for HDR mastering demand more robust tools and storage. "A show that was previously shot and mastered in 2K or HD may now require three or four times the amount of storage in a 4K HDR workflow. Since older post facilities had been designed around a 2K/HD infrastructure, newer companies that had fewer issues with legacy infrastructure were able to adopt 4K HDR faster," says Fellion. Light Iron was designed around a 4K+ infrastructure from day one, she adds, allowing the post house to integrate HDR much more easily at a time when other facilities were still transitioning from 2K to 4K.
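
Back-of-the-envelope numbers (mine, not Light Iron's) show where that multiplier comes from, since a 4K 16-bit RGB frame carries roughly four times the pixels of 2K plus more bits per sample:

    # Rough uncompressed sizes for a 4K DCI, 16-bit RGB DI master.
    width, height, channels, bytes_per_sample = 4096, 2160, 3, 2
    frame_bytes = width * height * channels * bytes_per_sample
    per_second = frame_bytes * 24            # 24 fps playback
    per_hour_tb = per_second * 3600 / 1e12

    print(f"{frame_bytes / 1e6:.1f} MB/frame, "
          f"{per_second / 1e9:.2f} GB/s, {per_hour_tb:.2f} TB/hour")
    # About 53.1 MB/frame, 1.27 GB/s and 4.59 TB/hour, several times a 2K/10-bit show.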

Nevertheless, this adoption required changes to the post house's workflow. Fellion explains: "In a theatrical world, because HDR color is set in a much larger color gamut than P3, the technically correct way to master is to start with the HDR color first and then trim down for P3. However, since HDR theatrical exhibition is still in its infancy, there are no options for most feature films to monitor in a projected environment — which, in a feature workflow, is an expected part of the finishing process. As a result, we often use color-managed workflows that allow us to master first in a P3 theatrical projection environment and then to version for HDR as a secondary pass."

Light Iron NY colorist Steven Bodner grading the music video "Picture Day" in HDR on a Sony BVM-X300.

In the episodic world, if a project is delivering in HDR, unless creative preference determines otherwise, Light Iron will typically start with the HDR version first and then trim down for the SDR Rec.709 versions.

For either, versioning and delivery have to be considered. For Dolby Vision, this starts with an analysis of the timeline to output an XML for the 709 derivative, explains Fellion of Light Iron’s workflow. And then from that 709 derivative, the colorist will review and tweak the XML values as necessary, sometimes going back to the HDR version and re-analyzing if a larger adjustment needs to be made for the Rec.709 version. For an HDR-10 workflow, this usually involves a different color pass and delivered file set, as well as analysis of the final HDR sequence, to create metadata values, she adds.

Needless to say, embracing HDR is not without challenges. Currently, HDR is only used in the final color process, since there aren't many workflows that support HDR throughout the dailies or editorial process, says Fellion. "This can certainly be a challenge to creatives who have spent the past few months staring at images in SDR, only to have a different reaction when they first view them in HDR." Also, in HDR there may be elements on screen that weren't previously visible in SDR dailies or offline (such as outside a window or production cables under a table), which creates new VFX requirements in order to adjust those elements.

“As more options are developed for on-set monitoring — such as Light Iron’s HDR Video Village System — productions are given an opportunity to see HDR earlier in the process and make mental and physical adjustments to help accommodate for the final HDR picture,” Fellion says.

Having an HDR monitor on set can aid in flagging potential issues that might not be seen in SDR. Currently, however, for dailies and editorial, HDR monitoring is not really used, according to Fellion, who hopes to see that change in the future. Conversely, in the finishing world, “an HDR monitor capable of a minimum 1,000-nit display, such as the Sony [BVM] X300, as well as a consumer-grade HDR UHD TV for client reviews, are part of our standard tool set for mastering,” she notes.

In fact, several months ago, Light Iron purchased new high-end HDR mastering monitors from B&H. The studio also sourced AJA Hi5 4K Plus converter boxes from B&H for its HDR workflow.

And, no doubt, there will be additional HDR equipment needs in Light Iron’s future, as delivery of HDR content continues to ramp up. But there’s a hefty cost involved in moving to HDR. Depending on whether a facility’s DI systems already had the capacity to play back 4K 16-bit files — a key requirement for HDR mastering — the cost can range from a few thousand dollars for a consumer-grade monitor to tens of thousands for professional reference monitoring, DI system, storage and network upgrades, as well as licensing and training for the Dolby Vision platform, according to Fellion.

That is one reason why it’s important for suppliers and vendors to form relationships. But there are other reasons, too. “Those leading the charge [in HDR] are innovators and people you want to be associated with,” Suissa explains. “You learn a lot by associating yourself with professionals on the other side of things. We provide technology. We understand it. We learn it. But we also practice it differently than people who create content. The exchange of knowledge is critical, and it enables us to help our customers better understand the technology they are purchasing.”

Main Image: Netflix’s Glow


Karen Maierhofer is a longtime technical writer with more than two decades of experience in segments of the CG and post industries.

A colorist weighs in on ‘the new world’ of HDR

By Maxine Gervais

HDR is on many people’s minds these days. Some embrace it, some are hesitant and some simply do not like the idea.

But what is HDR really? I find that manufacturers often use the term too loosely. Anything that offers higher dynamic range can fall into the HDR category, but let’s focus on the luminance and greater contrast ratio brought by HDR.

We have come a long way in the last 12 years, from film prints to digital projection. This was a huge shift, and one could argue it happened relatively fast. Since then, technology has been on fast-forward.

Film allows incredible capture of detail, especially in large formats, and when digital was first introduced we couldn't say the same. At the time, cameras were barely capable of capturing true 2K and wide dynamic range. Many would shoot film and scan it into digital files, hoping to preserve more of the dynamic range offered by film. Eventually, cameras got better and film started to disappear, mostly for convenience and cost reasons.

Through all this, target devices (projectors and monitors) stayed pretty much the same. Monitors went from CRT to plasma to LCD but kept the same characteristics: everything in a Rec.709 color space at a luminance of 100 nits. Projectors were in the P3 color space, but with a lower luminance of about 48 nits.

Maxine at work on the FilmLight Baselight.

Philosophically, one could argue that all creative intent was in some ways limited by the display. The files might have held much more information than the display was able to show. So the aesthetics we learned to love were a direct result of the displays' limitations.

What About Now?
Now we are on the brink of a revolution in these displays. With the introduction of OLED monitors and laser projection for theaters, contrast ratios, color spaces and luminance are larger than before. It is now possible to see the details captured by cameras or film. This allows for greater artistic freedom: With fewer limitations, one can push the aesthetic to a new level.

However, that doesn't mean everything is suddenly brighter and more colorful. It is very easy to create the same aesthetic one used to love, but it is now possible to bring to the screen details in shadows and highlights that were never an option before. It even means better color separation. What creatives do with "HDR" is still very much in their control.

The more difficult part is that HDR has not yet taken over theaters or homes. If someone has set their look in a P3 48-nit world and is now asked to take that look to a 4,000-nit P3 PQ display, it might be difficult to decide how to approach it. How do we maintain the original intent yet embrace what HDR has to offer? There are many ways to go about it, and no one way is better than another. You can redefine your look for the new displays, in some ways creating a new look that becomes its own entity, or you can mimic your original look, taking advantage of only a few elements of HDR.

The more we start from a brighter luminance, a bigger contrast ratio and a larger color gamut, the more we will be able to future-proof and protect the creative intent. Producing HDR as an afterthought, when it was never planned for, is still difficult and in some cases controversial.

The key is to have those philosophical discussions with creatives ahead of time and come up with a workflow that will have the expected results.

Main Image: Maxine Gervais working with director Albert Hughes on his upcoming film, Alpha.


Maxine Gervais is a senior supervising colorist at Technicolor Hollywood. Her past credits include Black Panther, The 15:17 to Paris, Pitch Perfect 3 and American Sniper.

Colorist Stephen Nakamura on grading Stephen King’s It

By Randi Altman

A scary clown can be thanked for helping boost what had been a lackluster summer box office. In its first weekend, Stephen King’s It opened with an impressive $125 million. Not bad!

Stephen Nakamura

This horror film takes place in a seemingly normal small town, but of course things aren’t what they seem. And while most horror films set most of the action in shadowy darkness, the filmmakers decided to let a lot of this story unfold in the bright glow of daylight in order to make the most of the darkness that eventually takes over. That presented some interesting opportunities for Deluxe’s Company 3 veteran colorist Stephen Nakamura.

How early did you get involved on It?
We came onboard early to do the first trailer. The response on YouTube and other places was enormous. I can’t speak for the filmmakers, but that was when I first realized how much excitement there was out there for this movie.

Had you worked with director Andy Muschietti before? What kind of direction were you given and how did he explain the look he wanted?
One of the concepts about the look that evolved during production, and we continued it in the DI, was this idea that a lot of the film takes place in fairly high-key situations, not the kind of dark, shadowy world some horror films exist in. It’s a period piece. It’s set in a small town that sort of looks like this pleasant place to be, but all this wild stuff is happening! You see these scary movies and everything’s creepy and it’s overcast outside and it’s clearly a horror movie from the outset. Naturally, that can work, but it can be even scarier when you play against that. The violence and everything feels more shocking.

How would you describe the look of the film?
You have the parts that are like I just described and then it does get very dark and shadowy as the action goes into dark spaces and into the sewer. And all that is particularly effective because we’ve kind of gotten to know all the kids who are in what’s called the Losers’ Club, and we’re rooting for them and scared about what might happen to them.

Can you talk about the Dolby Cinema pass? People generally talk about how bright you can get something with HDR, but I understand you were more interested in how dark the image can look.
Right. When you're working in HDR, like Dolby lets you do, you have a lot more contrast to work with than you do in the normal digital cinema version. I worked on some of the earliest movies to do a Dolby Cinema version, and when I was working with Brad Bird and Claudio Miranda on Tomorrowland, we experimented with how much brighter we could make portions of the frame than what would be possible with normal digital cinema projection, without making the image into something that had a completely different feel from the P3 version. But when you're in that space, you can also make things appear much, much darker. So the overall level in the theater can get really dark, but because of that contrast you can actually see more detail on a person's face, or a killer clown's face, even when the overall level is so low. It's more like you're really in that dark space.

It doesn’t make it a whole different movie or anything, but it’s a good example of where Dolby can add something to the experience. I’d tell people to see it in Dolby Cinema if they could.

There was obviously a lot of VFX work that helped the terrifying shapeshifting clown, Pennywise, do what he does, but you also did some work on him in the DI, correct?
Yes. We had alpha channel mattes cut around his eyes for every shot he’s in and we used the color corrector to make changes to his eyes. Sometimes the changes were very subtle — making them brighter or pushing the color around — and sometimes we went more extreme, but I don’t want to talk about that too much. People can see for themselves when they see the movie.

What system do you use, and why? How does that tool allow you to be more creative?
I use Blackmagic’s DaVinci Resolve. I’ve been a colorist since the ‘90s and I’ve used Resolve pretty much my whole career. There are other systems out there that are also very good, but for the kinds of projects I do and the way I like to work, I find it the fastest and most intuitive and every time there’s a new upgrade, I find some new tool that helps me be even more efficient.

Canon targets HDR with EOS C200, C200B cinema cameras

Canon has grown its Cinema EOS line of pro cinema cameras with the EOS C200 and EOS C200B. These new offerings target filmmakers and TV productions. They offer two 4K video formats — Canon’s new Cinema RAW Light and MP4 — and are optimized for those interested in shooting HDR video.

Alongside a newly developed dual Digic DV6 image processing system, Canon’s Dual Pixel CMOS AF system and improved operability for pros, these new cameras are built for capturing 4K video across a variety of production applications.

Based on feedback from Cinema EOS users, these new offerings will be available in two configurations, while retaining the same core technologies within. The Canon EOS C200 is a production-ready solution that can be used right out of the box, accompanied by an LCD monitor, LCD attachment, camera grip and handle unit. The camera also features a 1.77 million-dot OLED electronic view finder (EVF). For users who need more versatility and the ability to craft custom setups tailored to their subject or environment, the C200B offers cinematographers the same camera without these accessories and the EVF to optimize shooting using a gimbal, drone or a variety of other configurations.

Canon’s Peter Marr was at Cine Gear demo-ing the new cameras.

New Features
Both cameras feature the same 8.85MP CMOS sensor that combines with a newly developed dual Digic DV6 image processing system to help process high-resolution image data and record video from full HD (1920×1080) and 2K (2048×1080) to 4K UHD (3840×2160) and 4K DCI (4096×2160). A core staple of the third-generation Cinema EOS system, this new processing platform offers wide-ranging expressive capabilities and improved operation when capturing high-quality HDR video.

The combination of the sensor and the newly developed processing system also allows for support of two new 4K file formats designed to help optimize workflow and make 4K and HDR recording more accessible to filmmakers. Cinema RAW Light, available in 4K 60p/50p at 10-bit and 30p/25p/24p at 12-bit, allows users to record data internally to a CFast card by cutting data size to about one-third to one-fifth of a Cinema RAW file, without losing grading flexibility. The reduced file size preserves a rich dynamic range while easing post processing, without sacrificing true 4K quality. Alongside recording to a CFast card, proxy data (MP4) can also be simultaneously recorded to an SD card for use in offline editing.
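
Taking that "one-third to one-fifth" figure at face value, a rough sizing sketch is below. The uncompressed baseline is a first-principles estimate (one 12-bit sample per photosite of a 4K DCI Bayer sensor at 24fps), and the 256GB card is a hypothetical size, not a Canon specification.

```python
# Back-of-the-envelope Cinema RAW Light sizing -- a minimal sketch.
# Assumption (not a Canon figure): uncompressed RAW stores one 12-bit
# sample per photosite; RAW Light is 1/3 to 1/5 of that, per the article.

WIDTH, HEIGHT, BITS, FPS = 4096, 2160, 12, 24

raw_bytes_per_sec = WIDTH * HEIGHT * BITS / 8 * FPS   # ~318 MB/s uncompressed
for divisor in (3, 5):
    light = raw_bytes_per_sec / divisor               # RAW Light estimate
    card_gb = 256                                     # hypothetical CFast card
    minutes = card_gb * 1e9 / light / 60
    print(f"1/{divisor}: ~{light / 1e6:.0f} MB/s, ~{minutes:.0f} min on {card_gb}GB")
```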

Additionally, filmmakers will be able to record 4K in MP4 format to SD cards at 60/50/30/25/24p at 8-bit. Support for UHD recording allows for use in cinema and broadcast applications, or in scenarios where long recording times are needed while still maintaining top image quality. The digital cinema cameras also offer slow-motion full-HD recording at up to 120fps.

The Canon EOS C200 and Canon EOS C200B feature Innovative Focus Control that assists with 4K shooting demanding precise focus, whether from single or remote operation. According to Canon, its Dual Pixel CMOS AF technology helps expand the subject-distance range over which the camera can achieve faster focus during 4K video recording. This also allows for highly accurate continuous AF and face-detection AF when using EF lenses. For 4K shots that call for focus accuracy that can’t be checked on an HD monitor, users can also take advantage of the LCD Monitor LM-V1 (supplied with the EOS C200 camera), which provides intuitive touch-focusing support to help filmmakers achieve sophisticated focusing even as a single operator.

In addition to these features, the cameras offer:
• Oversampling HD processing: enhances sensitivity and helps minimize noise
• Wide DR Gamma: helps reduce overexposure by retaining continuity with a gamma curve
• ISO 100-102400 and 54dB gain: high quality in both low-sensitivity and low-light environments
• In-camera ND filter: internal ND unit allows cleaning of glass for easier maintenance
• ACESproxy support: delivers standardized color space in images, helping to improve efficiency
• Two SD card slots and one CFast card slot for internal recording
• Improved grip and Cinema-EOS-system-compatible attachment method
• Support for Canon Cine-Servo and EF cinema lenses

Editing and grading of the Cinema RAW Light video format will be supported in Blackmagic’s DaVinci Resolve. Editing will also be possible in Avid Media Composer, using a Canon RAW plugin for Avid Media Access. The format can also be processed using Canon’s own application, Cinema RAW Development.

Adobe’s Premiere Pro CC will also support the format by the end of 2017, and editing will be possible in Apple’s Final Cut Pro X, using the Canon RAW Plugin for Final Cut Pro X, in the second half of this year.

The Canon EOS C200 and EOS C200B are scheduled to be available in August for estimated retail prices of $7,499 and $5,999, respectively. The EOS C200 comes equipped with additional accessories including the LM-V1 LCD monitor, LA-V1 LCD attachment, GR-V1 camera grip and HDU-2 handle unit. Available in September, these accessories will also be sold separately.

Bluefish444 supports Adobe CC and 4K HDR with Epoch card

Bluefish444 Epoch video, audio and data I/O cards now support the advanced 4K high dynamic range (HDR) workflows offered in the latest versions of Adobe Creative Cloud.

Epoch SDI and HDMI solutions are suited for Adobe’s Premiere Pro CC, After Effects CC, Audition CC and other tools that are part of the Creative Cloud. With GPU-accelerated performance for emerging post workflows, including 4K HDR and video over IP, Adobe and Bluefish444 are providing a strong option for pros.

Bluefish444’s Adobe Mercury Transmit support for Adobe Creative Cloud brings improved performance in demanding workflows requiring realtime video I/O from UHD and 4K HDR sequences.

Bluefish444 Epoch video card support adds:
• HD/SD SDI input and output
• 4K/2K SDI input and output
• 12/10/8-bit SDI input and output
• 4K/2K/HD/SD HDMI preview
• Quad split 4K UHD SDI
• Two sample interleaved 4K UHD SDI
• 23, 24, 25, 29, 30fps video input and output
• 48, 50, 59, 60fps video input and output
• Dual-link 1.5Gbps SDI
• 3Gbps level A & B SDI
• Quad link 1.5Gbps and 3Gbps SDI
• AES digital audio
• Analog audio monitoring
• RS-422 machine control
• 12-bit video color space conversions

“Recent updates have enabled performance which was previously unachievable,” reports Tom Lithgow, product manager at Bluefish444. “Thanks to GPU acceleration, and [the] Adobe Mercury Transmit plug-in, Bluefish444 and Adobe users can be confident of smooth realtime video performance for UHD 4K 60fps and HDR content.”

Sony’s offerings at NAB

By Daniel Rodriguez

Sony has always been a company that prioritizes and implements the requests of its customers, constantly innovating across all aspects of production — from initial capture to display. At NAB 2017, Sony’s goal was to build on the benchmarks the company has set in the past few months.

To reflect its focus as a company, Sony’s NAB booth concentrated on four areas: image capture, media solutions, IP Live and HDR (high dynamic range). Sony set out to demonstrate its ability to anticipate future demands in capture and distribution while introducing firmware updates to many of its existing products to complement those demands.

Cameras
Since Sony provides customers and clients with a path from capture to delivery, it’s natural to start with what’s new for imaging. Having already tackled the prosumer market with the a7sii, a7rii, FS5 and FS7ii, and firmly established its presence in the cinema camera line with the F5, F55 and F65, Sony’s immediate step wasn’t to follow up on those models so soon, but rather to introduce models that fit more specific needs and situations.

The newest Sony camera introduced at NAB was the UMC-S3CA. Sporting the extremely popular sensor from the a7sii, the UMC-S3CA is a 4K interchangeable-lens E-mount camera that is much smaller than its sensor sibling. Its genlock ability lets users monitor, operate and sync many cameras at a time, something extremely promising for emerging media like VR and 360 video. It boasts an incredible ISO range of 100-409,600 and records internal 4K UHD at 23.98p, 25p and 29.97p in 100Mbps and 60Mbps modes. The small size of this camera is promising for those who love the a7sii but want to employ it in more specific cases, such as crash cams, drones, cranes and sliders.

To complement its current camera line, Sony has released an updated version of its DVF-EL100 electronic viewfinder — the DVF-EL200 (pictured) — which boasts a full 1920x1080p image and is about twice as bright as the previous model. Much like updated versions of Sony’s cameras, this viewfinder’s ergonomics are attributed to the vast input from users of the previous model, something the company prides itself on. (Our main image shows the F55 with the DVF-EL200 viewfinder.)

Just because Sony is introducing new products doesn’t mean it has forgotten about older ones, especially in its camera lines. From prosumer models like the PXW-Z150 and PXW-FS5 to professional cinema cameras such as the PMW-F5 and PMW-F55, all are receiving firmware updates in July 2017.

The most notable firmware update for the Z150 is the ability to capture in HLG (Hybrid Log Gamma) to support an easier HDR capture workflow. The FS5 will also gain HLG capture, along with the ability to change the native ISO from 2000 to 3200 when shooting in SLog2 or SLog3, and 120fps capture at 1080p full HD. While many consider the F65 to be Sony’s flagship camera, some consider the F55 the more industry-friendly of Sony’s cinema line, and Sony backs that up by increasing its high-frame-rate capture in a new firmware update, which will allow the F55 to record at 72, 75, 90, 96 and 100fps in 4K RAW and in the company’s new compressed Extended Original Camera Negative (X-OCN) format.

X-OCN
Sony’s new X-OCN codec continues to be a highlight of the company’s developments: it boasts 16-bit depth despite being compressed and is virtually indistinguishable from Sony’s own RAW format. Thanks to that compression, file sizes are roughly 50 percent smaller than 2K 4:3 Arriraw and 4K ProRes 4444 XQ, and 30 percent smaller than F55 RAW, making it a very suitable format for HDR content capture. With cameras like the F5 and F55, and smaller alternatives like the FS7 and FS7II, allowing RAW recording, Sony is offering a nearly indistinguishable alternative that cuts down on storage space and allows more recording time on set.

Speed and Storage
As Sony continues to increase its support for HDR and larger resolutions like 8K, it’s easy to consider the emergence of X-OCN as an introduction of what to expect from Sony in the future.

Despite X-OCN being the company’s answer to large RAW file sizes, Sony still maintains a firm understanding of the need for storage and the read/write speeds that come with such innovations. To that end, Sony has introduced the AXS-AR1 AXS memory and SxS Thunderbolt card reader. Using a Thunderbolt 2 connector that can be daisy-chained (the reader has two ports), it offers a theoretical transfer speed of approximately 9.6Gbps, or 1200MBps. The reader supports SxS and Sony’s new AXS cards; downloading an hour’s worth of true 4K footage at 24fps, shot in X-OCN, would take only about 2.5 minutes.
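
Those two figures can be cross-checked with a few lines of arithmetic. This is only a sanity check on the stated numbers; the per-hour footage size that falls out is an inference, not a published Sony spec.

```python
# Sanity check of the stated AXS-AR1 transfer claim -- a minimal sketch.
# Stated: ~9.6Gbps (~1200MB/s) throughput; an hour of 4K 24fps X-OCN
# offloads in about 2.5 minutes.

reader_mb_per_s = 9.6e9 / 8 / 1e6     # 9.6 Gbps -> 1200 MB/s
transfer_min = 2.5

implied_gb_per_hour = reader_mb_per_s * transfer_min * 60 / 1000
print(f"Implied footage size: ~{implied_gb_per_hour:.0f} GB per shooting hour")
# -> ~180 GB/hour, i.e. an average recorded data rate near 0.4 Gbps,
#    consistent with a heavily compressed 4K RAW-class codec.
```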

To complement these leaps in storage space and read/write speeds, Sony’s Optical Disc Archive Generation 2 is an optical disc-based storage system with expandable robotic libraries called PetaSites, which through the use of 3.3TB Optical Disc Archive Cartridges guarantees a staggering 100-year shelf life. Unlike LTO tapes, which are generally only used a handful of times for storing and retrieving, Sony’s optical discs can be quickly and randomly accessed as needed.

HDR
HDR continues to gain traction in the world of broadcast and cinema. From capture to monitoring, the introduction of HDR has spurred many companies to implement new ways to create, monitor, display and distribute HDR content. As mentioned earlier, Sony is implementing firmware updates in many of its cameras to allow internal HLG, or Instant HDR, capture without the need for color grading, as well as compressed X-OCN recording, which makes more complex HDR grading possible without the massive amounts of data that uncompressed RAW takes up.

HDR gamma can now be monitored on screens like the FS5’s, as well as on higher-end displays such as the BVM-E171, BVM-X300/2 and PVM-X550.

IP Live
What stood out about Sony’s mission with HDR is the push to bring it to realtime, non-fiction content and broadcasts like sporting events through IP Live. The goal is to offer instantaneous conversions that output media not only in 4K HDR and SDR but also in full HD HDR and SDR at the same time. With its SR Live System, Sony hopes to update its camera lines with HLG to provide instant HDR, which can be processed through its HDRC-4000 converters. In line with its stated goal of offering full support throughout the production process, Sony also introduced XDCAM Air, an ENG-based cloud service that addresses the growing need for speed to air. XDCAM Air will launch in June 2017.

Managing Files
To round out its production-through-delivery goals, Sony continues with Media Backbone Navigator X, an online content storage and management solution designed to ease the work between capture and delivery. It accepts nearly any file type and allows multiple users to search for keywords and even phrases spoken in videos, while streaming at realtime speeds.

Media Backbone Navigator X is designed for productions with constant back and forth, eliminating excessive deliberation over the storage and distribution of materials.

Sony’s goal at NAB wasn’t to shock or awe but rather to build on an established foundation for current and new clients and customers who are readying for an ever-changing production environment. For Sony, this year’s NAB could be considered preparation for the “upcoming storm” as firmware updates roll out more support for promising formats like HDR.


Daniel Rodriguez is a New York-based cinematographer, photographer and director. Follow him on Instagram: https://www.instagram.com/realdanrodriguez.

A glimpse at what was new at NAB

By Lance Holte

I made the trek out to Las Vegas last week for the annual NAB show to take in the latest in post production technology, discuss new trends and products and get lost in a sea of exhibits. With over 1,700 exhibitors, it’s impossible to see everything (especially in the two days I was there), but here are a handful of notable things that caught my eye.

Blackmagic DaVinci Resolve Studio 14: While the “non-studio” version is still free, it’s hard to beat the $299 license for the full version of Resolve. As 4K and 3D media becomes increasingly prevalent, and with the release of their micro and mini panels, Resolve can be a very affordable solution for editors, mobile colorists and DITs.

The new editorial and audio tools are particularly appealing to someone like me, who is often more hands-on on the editorial side than the grading side of post. In that regard, the new tracking features look to provide extra ease of use for quick and simple grades. I also love that Blackmagic has gotten rid of the dongles, which removes the hassle of tracking numerous dongles in a post environment where systems and rooms are swapped regularly. Oh, and there’s bin, clip and timeline locking for collaborative workflows, which easily pushes Resolve into the competition for an end-to-end post solution.

Adobe Premiere Pro CC 2017 with After Effects and Audition: Adobe Premiere is typically my editorial application of choice, and the increased integration of AE and Audition promises to make an end-to-end Creative Cloud workflow even smoother. I’ve been hoping for a revamp of Premiere’s title tool for a while, and the Essential Graphics panel/new Title Tool appears to greatly increase and streamline Premiere’s motion graphics capabilities — especially for someone who does almost all graphics work in After Effects and Photoshop. The more integrated the various applications can be, the better; and Adobe has been pushing that aspect for some time now.

On the audio side, Premiere’s Essential Sound Panel tools for volume matching, organization, cleanup and other effects without going directly into Audition (or exporting for ProTools, etc.) will be really helpful, especially for smaller projects and offline mixes. And as a last note, the new Camera Shake Deblur effect in After Effects is fantastic.

Dell UltraSharp 4K HDR Monitor — There were a lot of great looking HDR monitors at the show, but I liked that this one fell in the middle of the pack in terms of price point ($2K), with solid specs (1000 nits, 97.7% of P3, and 76.9% of Rec. 2020) and a reasonable size (27 inches). Seems like a good editorial or VFX display solution, though the price might be pushing budgetary constraints for smaller post houses. I wish it was DCI 4K instead of UHD and a little more affordable, but that will hopefully come with time.

On that note, I really like HP’s DreamColor Z31x Studio Display. It’s not HDR, but it’s 99% of the P3 colorspace, and it’s DCI 4K — as well as 2K, by multiplying every pixel at 2K resolution into exactly 4 pixels — so there’s no odd-numbered scaling and sharpening required. Also, I like working with large monitors, especially at high resolutions. It offers automated (and schedulable) color calibration, though I’d love to see a non-automated display in the future if it could bring the price down. I could see the HP monitor as a great alternative to using more expensive HDR displays for the majority of workstations at many post houses.
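
As a minimal sketch of what that integer scaling means, the snippet below doubles a 2K DCI frame into 4K DCI by replicating each pixel into an exact 2x2 block; the black frame is just a placeholder.

```python
import numpy as np

def pixel_double(img: np.ndarray) -> np.ndarray:
    """Replicate every pixel into a 2x2 block (integer nearest-neighbor)."""
    return img.repeat(2, axis=0).repeat(2, axis=1)

frame_2k = np.zeros((1080, 2048, 3), dtype=np.uint8)  # 2K DCI placeholder
frame_4k = pixel_double(frame_2k)
print(frame_4k.shape)  # (2160, 4096, 3): exactly 4 display pixels per source pixel
```

Because every source pixel maps to a whole number of display pixels, no fractional resampling or sharpening is needed.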

As another side note, Flanders Scientific’s OLED 55-inch HDR display was among the most beautiful I’ve ever seen, but with numerous built-in interfaces and scaling capabilities, it’s likely to come at a higher price.

Canon 4K600STZ 4K HDR laser projector — This looks to be a great projection solution for small screening rooms or large editorial bays. It offers huge 4096×2400 resolution, is fairly small and compact, and apparently has very few restraints when it comes to projection angle, which would be nice for a theatrical edit bay (or a really nice home theater). The laser light source is also attractive because it will be low maintenance. At $63K, it’s at the more affordable end of 4K projector pricing.

Mettle 360 Degree/VR Depth plug-ins: I haven’t worked with a ton of 360-degree media, but I have dealt with the challenges of doing depth-related effects in a traditional single-camera space, so the fact that Mettle is doing depth-of-field effects, dolly effects and depth volumetric effects with 360-degree/VR content is pretty incredible. Plus, their plug-ins are designed to integrate with Premiere and After Effects, which is good news for an Adobe power user. I believe they’re still going to be in beta for a while, but I’m very curious to see how their plug-ins play out.

Finally, in terms of purely interesting tech, Sony’s Bravia 4K acoustic surface TVs are pretty wild. Their displays are OLED, so they look great, and the fact that the screen vibrates to create sound instead of having separate speakers or an attached speaker bar is awfully cool. Even at very close viewing, the screen doesn’t appear to move, though it can clearly be felt vibrating when touched. A vibrating acoustic surface raises some questions about mounting, so it may not be perfect for every environment, but interesting nonetheless.


Lance Holte is an LA-based post production supervisor and producer. He has spoken and taught at such events as NAB, SMPTE, SIGGRAPH and Createasphere. You can email him at lance@lanceholte.com.

MTI Film updates Cortex for V.4, includes Dolby Vision HDR metadata editing

MTI Film is updating its family of Cortex applications and tools, highlighted by Cortex V.4. In addition to legacy features such as a dailies toolset, IMF and AS-02 packaging and up-res algorithms, Cortex V.4 adds DCP packaging (with integrated ACES color support), an extended edit tool and officially certified Dolby Vision metadata editing capabilities.

“We allow users to manipulate Dolby Vision HDR metadata in the same way that they edit segments of video,” says Randy Reck, MTI Film’s director of development. “In the edit tool, they can graphically trim, cut and paste, add metadata to video, analyze new segments that need metadata and adjust parameters within the Dolby Vision metadata on a shot-by-shot basis.”

With the integration of the Dolby Vision ecosystem, Cortex V.4 provides a method for simultaneously QC-ing HDR and SDR versions of a shot with Dolby Vision metadata. For delivery, the inclusion of the Dolby Vision “IMF-like” output format allows for the rendering and delivery of edited Dolby Vision metadata alongside HDR media in one convenient package.
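
As a toy illustration of the idea (not Dolby’s actual metadata schema or Cortex’s API), per-shot dynamic metadata can be pictured as records on a timeline that a trim simply rewrites:

```python
from dataclasses import dataclass

@dataclass
class ShotMetadata:
    """Hypothetical per-shot analysis record (field names are illustrative)."""
    start_frame: int
    end_frame: int
    min_nits: float   # analyzed shot minimum luminance
    avg_nits: float   # analyzed shot average luminance
    max_nits: float   # analyzed shot maximum luminance

def trim_shot(shot: ShotMetadata, gain: float) -> ShotMetadata:
    """Apply a simple trim by scaling the analyzed levels of one shot."""
    return ShotMetadata(shot.start_frame, shot.end_frame,
                        shot.min_nits * gain,
                        shot.avg_nits * gain,
                        shot.max_nits * gain)

# "Cut, paste and adjust" then becomes ordinary list manipulation:
timeline = [ShotMetadata(0, 47, 0.005, 38.0, 620.0)]
timeline[0] = trim_shot(timeline[0], 0.9)  # darken the mapping for shot 1
```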

Cortex V.4’s Edit Tool has been updated to include industry-standard trimming and repositioning of edited segments within the timeline through a drag-and-drop function. The entire look of the Edit Tool (available in the Dailies and Enterprise editions of Cortex) has also been updated to accommodate a new dual-monitor layout, making it easier for users to scrub through media in the source monitor while keeping the composition in context in the record monitor.

MTI Film is also offering a new subscription-based DIT+ edition of Cortex. “It doesn’t make sense for productions to purchase a full software package if their schedule includes a hiatus when it won’t be used,” explains Reck.

DIT+ contains all aspects of the free DIT version of Cortex with the added ability to render HD ProRes, DNx and H.264 files for delivery. A DIT+ subscription starts at $95 per month, and MTI Film is offering a special NAB price of $595 for the first year.

Assimilate Scratch and Scratch VR Suite upgraded to V.8.6

Assimilate is now offering an open beta for Scratch 8.6 and the Scratch VR Suite 8.6, the latest versions of its realtime post tools and workflow for VR/360 and 2D/3D content, from dailies through conform, grading, compositing and finishing. Expanded HDR functions are featured throughout the product line, including in Scratch VR, which now offers stitching capabilities.

Both open beta versions give pros the opportunity to actively use the full suite of Scratch and Scratch VR tools, while evaluating them and submitting requests and recommendations for additional features or updates.

Scratch Web for cloud-based, realtime review and collaboration, and Scratch Play for immediate review and playback, are also included in the ecosystem updates. Both products support VR/360 and 2D/3D content.

Current users of the Scratch VR Suite 8.5 and Scratch Finishing 8.5 can download the Scratch 8.6 and Scratch VR Suite 8.6 open betas, which are available now.

“V8.6 is a major update for both Scratch and the Scratch VR Suite with significant enhancements to the HDR and ACES workflows. We’ve added stitching to the VR toolset so that creators have a complete and streamlined end-to-end VR workflow,” says Jeff Edson, CEO at Assimilate. “The open beta helps us to continue developing the best and most useful post production features and techniques all artists need to perfect their creativity in color grading and finishing. We act on all input, much of it immediately and some in regular updates.”

Here are some details of the update:

HDR
• PQ and HLG transfer functions are now an integral part of Scratch color management.
• Scopes automatically switch to HDR mode when needed and show levels on a nit scale, highlighting any reference level you set.
• At the project level, define the HDR mastering metadata: color space, color primaries and white levels, luminance levels and more. The metadata is automatically included in the Video HDMI interface (AJA, BMD, Bluefish444) for display.
• In addition to static metadata, a built-in function calculates content-derived luminance values such as MaxCLL and MaxFALL (see the sketch after this list).
• HDR footage can be published directly to YouTube with HDR metadata.
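
For readers new to those two values, here is a minimal sketch of how they fall out of the content. It assumes each frame is already an array of per-pixel light levels in nits; in practice CTA-861.3 derives them from the maximum of the linear R, G and B components.

```python
import numpy as np

def maxcll_maxfall(frames):
    max_cll = 0.0   # brightest single pixel anywhere in the program
    max_fall = 0.0  # highest frame-average light level
    for frame in frames:
        max_cll = max(max_cll, float(frame.max()))
        max_fall = max(max_fall, float(frame.mean()))
    return max_cll, max_fall

# Two synthetic four-pixel "frames", values in nits:
frames = [np.array([[0.5, 100.0], [40.0, 900.0]]),
          np.array([[10.0, 80.0], [120.0, 300.0]])]
print(maxcll_maxfall(frames))   # (900.0, 260.125)
```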

VR/360 – Scratch VR Suite
• 360 stitching functionality: load all your source media from your 360 cameras into Scratch VR and combine it into a single equirectangular image (see the sketch after this list). Support for camera stitch templates: AutoPano projects, Hugin and PTStitch scripts.
• Ambisonic audio: Scratch VR can load, set and play back ambisonic audio files to complete the 360 immersive experience.
• Video with 360 sound can be published directly to YouTube 360.
• Additional overlay handles extend the existing 2D-equirectangular feature, making it easier to position 2D elements in a 360 scene.
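
A quick sketch of why stitching targets an equirectangular canvas: every view direction maps to exactly one pixel, with longitude spread across the width and latitude across the height (which is also why shapes near the poles look stretched).

```python
import math

def direction_to_equirect(dx, dy, dz, width, height):
    """Map a unit 3D view direction to equirectangular pixel coordinates."""
    lon = math.atan2(dx, dz)   # -pi..pi around the viewer
    lat = math.asin(dy)        # -pi/2..pi/2 up and down
    x = (lon / (2 * math.pi) + 0.5) * width
    y = (0.5 - lat / math.pi) * height
    return x, y

# Looking straight ahead lands in the center of a 4096x2048 canvas:
print(direction_to_equirect(0.0, 0.0, 1.0, 4096, 2048))   # (2048.0, 1024.0)
```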

DIT Reporting Function
• Create a report of all clips of either a timeline, a project or just a selection of shots.
• Reports include metadata, such as a thumbnail, clip-name, timecode, scene, take, comments and any metadata attached to a clip.
• Choose from predefined templates or create your own.

The colorful dimensions of Amazon’s Mozart in the Jungle

By Randi Altman

How do you describe Amazon’s Mozart in the Jungle? Well, in its most basic form it’s a comedy about the changing of the guard — or maestro — at the New York Philharmonic, and the musicians that make up that orchestra. When you dig deeper you get a behind-the-scenes look at the back-biting and crazy that goes on in the lives and heads of these gifted artists.

Timothy Vincent

Based on the novel Mozart in the Jungle: Sex, Drugs, and Classical Music by oboist Blair Tindall, the series — which won the Golden Globe last year and was nominated this year — has shot in a number of locations over its three seasons, including Mexico and Italy.

Since its inception, Mozart in the Jungle has been finishing in 4K and streaming in both SDR and HDR. We recently reached out to Technicolor’s senior color timer, Timothy Vincent, who has been on the show since the pilot, to find out more about the show’s color workflow.

Did Technicolor have to gear up infrastructure-wise for the show’s HDR workflow?
We were doing UHD 4K already and were just getting our HDR workflows worked out.

What is the workflow from offline to online to color?
The dailies are done in New York based on the Alexa K1S1 709 LUT. (Technicolor On-Location Services handled dailies out of Italy, and Technicolor PostWorks in New York.) After the offline and online, I get the offline reference made with the dailies, so I can check it if I have a question about what was intended.

If someone was unsure about watching in HDR versus SDR, what would you tell them?
The emotional feel of both the SDR and the HDR is the same. That is always the goal in the HDR pass for Mozart. One of the experiences that is enhanced in the HDR is the depth of field and the three-dimensional quality you gain in the image. This really plays nicely with the feel in the landscapes of Italy, the stage performances where you feel more like you are in the audience, and the long streets of New York just to name a few.

When I’m grading the HDR version, I’m able to retain more highlight detail than I was in the SDR pass. For someone who has not yet been able to experience HDR, I would actually recommend that they watch an episode of the show in SDR first and then in HDR so they can see the difference between them. At that point they can choose what kind of viewing experience they want. I think that Mozart looks fantastic in both versions.

What about the “look” of the show? What kind of direction were you given?
We established the look of the show based on conversations and collaboration in my bay. It has always been a filmic look with soft blacks and yellow warm tones as the main palette for the show. Then we added in a fearlessness to take the story in and out of strong shadows. We shape the look of the show to guide the viewers to exactly the story that is being told and the emotions that we want them to feel. Color has always been used as one of the storytelling tools on the show. There is a realistic beauty to the show.

What was your creative partnership like with the show’s cinematographer, Tobias Datum?
I look forward to each episode and discovering what Tobias has given me as palette and mood for each scene. For Season 3 we picked up where we left off at the end of Season 2. We had established the look and feel of the show and only had to account for a large portion of Season 3 being shot in Italy, making sure to convey the different quality of light and the warmth and beauty of Italy. We did this by playing with natural warm skin tones and the contrast of light and shadow he was creating for the different moods and locations. The same can be said for the two episodes in Mexico in Season 2. I know now what Tobias likes and can make decisions I’m confident he will like.

From a director and cinematographer’s point of view, what kind of choices does HDR open up creatively?
It depends on if they want to maintain the same feel of the SDR or if they want to create a new feel. If they choose to go in a different direction, they can accentuate the contrast and color more with HDR. You can keep more low-light detail while being dark, and you can really create a separate feel to different parts of the show… like a dream sequence or something like that.

Any workflow tricks/tips/trouble spots within the workflow or is it a well-oiled machine at this point?
I have actually changed the way I grade my shows based on the evolution of this show. My end results are the same, but I learned how to build grades that translate to HDR much more easily and consistently.

Do you have a color assistant?
I have a couple of assistants that I work with who help me with prepping the show, getting proxies generated, color tracing and some color support.

What tools do you use — monitor, software, computer, scope, etc.?
I am working on Autodesk Lustre 2017 on an HP Z840, while monitoring on both a Panasonic CZ950 and a Sony X300. I work on Omnitek scopes off the downconverter to 2K. The show is shot on both Alexa XT and Alexa Mini, framing for 16×9. All finishing is done in 4K UHD for both SDR and HDR.

Anything you would like to add?
I would only say that everyone should be open to experiencing both SDR and HDR and giving themselves that opportunity to choose which they want to watch and when.

New England SMPTE holding free session on UHD/HDR/HFR, more

The New England Section of SMPTE is holding a free day-long “New Technologies Boot Camp” that focuses on working with high resolution (UHD, 4K and beyond), high-dynamic-range (HDR) imaging and higher frame rates (HFR). In addition, they will discuss how to maintain resolution independence on screens of every size, as well as how to leverage IP and ATSC 3.0 for more efficient movement of this media content.

The boot camp will run from 9am to 9pm on May 19 at the Holiday Inn in Dedham, Massachusetts.

“These are exciting times for those of us working on the technical side of broadcasting, and the array of new formats and standards we’re facing can be a bit overwhelming,” says Martin P. Feldman, chair of SMPTE New England Section. “No one wants — or can afford — to be left behind. That’s why we’re gathering some of the industry’s foremost experts for a free boot camp designed to bring engineers up to speed on new technologies that enable more efficient creation and delivery of a better broadcast product.”

Boot camp presentations will include:

• “High-Dynamic-Range and Wide Color Gamut in Production and Distribution” by Hugo Gaggioni, chief technical officer at Sony Electronics.
• “4K/UHD/HFR/HDR — HEVC H.265 — ATSC 3.0” by Karl Kuhn of Tektronix.
• “Where Is 4K (UHD) Product Used Today — 4K Versus HFR — 4K and HFR Challenges” by Bruce Lane of Grass Valley.
• “Using MESH Networks” by Al Kornak of JVC Kenwood Corporation.
• “IP in Infrastructure-Building (Replacing HD-SDI Systems and Accommodating UHD)” by Paul Briscoe of Evertz Microsystems.
• “Scripted Versus Live Production Requirements” by Michael Bergeron of Panasonic.
• “The Transition from SDI to IP, Including IP Infrastructure and Monitoring” by John Shike of SAM (formerly Snell/Quantel).
• “8K, High-Dynamic-Range, OLED, Flexible Displays” by consultant Peter Putman.
• “HDR: The Great, the Okay, and the WTF” by Mark Schubin, engineer-in-charge at the Metropolitan Opera, Sesame Street and Great Performances (PBS).

The program will conclude with a panel discussion by the program’s presenters.

No RSVP is required, and both SMPTE members and non-members are welcome.

NAB 2016 from an EP’s perspective

By Tara Holmes

Almost two weeks ago, I found myself at NAB for the first time. I am the executive producer of color and finishing at Nice Shoes, a post production studio in New York City. I am not an engineer and I am not an artist, so why would an EP go to NAB? I went because one of my main goals for 2016 is to make sure the studio remains at the forefront of technology. While I feel that our engineering team and artists represent us well in that respect, I wanted to make sure that I, along with our producers, were fully educated on these emerging technologies.

One of our first priorities for NAB was to meet with top monitor manufacturers in hopes of landing on UHD HDR monitors that would meet our standards for professional client viewing. We came to the conclusion that the industry is not there yet, and we have more research to do before we upgrade our studio viewing environments.

Everyone with me was in agreement: the monitors aren’t where they need to be. Most are only outputting around 400-800 nits and are experiencing luminance and contrast issues. None of this should stop the process of coloring for HDR. As the master monitor for the colorist, the Sony BVM-X300 OLED, which we are currently using, seems to be the ideal choice, as you can still work in traditional Rec 709 as well as Rec 2020 for HDR.

After checking out some monitors, we headed to the FilmLight booth to go over the 5.0 upgrades to Baselight. Our colorist Ron Sudul, along with Nice Shoes Creative Studio VFX supervisor Adrian Winter, sat with me and the FilmLight reps to discuss the upgrades, which included incredible new isolation tracking capabilities. These upgrades will reinvent what can be achieved in the color suite: from realtime comps to retouch being done in color. The possibilities are exciting.

I also spent time learning about the upgrades to FilmLight’s Flip, its on-set color hardware. The Flip lets you develop your color look on set, apply it during your edit process (with the Baselight plug-in for Avid) and refine it in final color, all without affecting your RAW files. In addition to the Flip, FilmLight has developed Prelight, software that supports on-set look development and grading. I asked if these new technologies could enable us to do high-end things like sky replacements on set and was told that the hardware within the Flip very well could.

We also visited our friends at DFT, the manufacturers of the Scanity film scanner, to catch up and discuss the business of archiving. With Scanity, Nice Shoes can scan at 4K where other scanners only go up to 2K resolution. This is a vital tool not only in preserving past materials, but in future-proofing for emerging formats when archiving scans from film.

VR
On Sunday evening before the exhibits opened, we attended a panel on VR that was hosted by the Foundry. At this event we got to experience a few of the most talked about VR projects including Defrost, one of the first narrative VR films, from the director of Grease, Randal Kleiser, who was on the panel along with moderator Morris May (CEO/founder, Specular Theory), Bryn Mooser (co-founder, RYOT), Tim Dillon (executive producer, MPC) and Jake Black (head of VR, Create Advertising).

The Foundry’s VR panel.

The panel inspired me to delve deeper into the VR world, and on Wednesday I spent most of my last day exploring the Virtual & Augmented Reality Pavilion. In addition to seeing the newest VR camera rig offerings, experiencing a live VR feed and demo-ing the Samsung Gear, I explored viewing options for the color workflow. Some people I spoke to mentioned that multiple Oculus set-ups all attached to a single feed were the way to go, but another option we explored in a preliminary way was the “dome” possibility, which offers a focused 180-degree view so everyone involved can comment on the same section of a VR scene. This would enable all involved to be sure they are experiencing and viewing the same thing at the same time.

HDR Workflow
Another panel we attended was about HDR workflows. Nice Shoes has already had the opportunity to work on HDR material and has begun to develop workflows for this emerging medium. Most HDR deliverables are for episodic and long-form content for companies such as Netflix, Hulu and the like. It may be some time before commercial clients request an HDR deliverable, but the workflows will be much the same, so the development being performed now is extremely valuable.

My biggest takeaway was that there are still no set standards. There’s Dolby Vision vs. HDR10 vs. PQ vs. others. But it appears that everyone agrees standards are not needed right now: we need to get tools into the hands of the artists and figure out what works best, and standards will come out of that. The good news is that we appear to be future-proofed should the standard change. For the most part, every camera we are shooting on is shooting for HDR, and should standards change — say from 1000 nits to 10,000 nits — the footage and process are still there to go back in and color for the new request.

Summing Up
I truly believe my time at NAB has prepared me for the myriad questions that will be put forth throughout the year and will help us develop our workflows to evolve the creative process of post. I’ll be sure to be there again next year to prepare myself for the questions of 2017 and beyond.

Our Main Image: The view walking into the South Hall Lower at the LVCC.

Quick Chat: SGO CEO Miguel Angel Doncel

By Randi Altman

When I first happened upon Spanish company SGO, they were giving demos of their Mistika system on a small stand in the back of the post production hall at IBC. That was about eight years ago. Since then, the company has grown its Mistika DI finishing system, added a new product called Mamba FX, and brought them both to the US and beyond.

With NAB fast approaching, I thought I would check in with SGO CEO Miguel Angel Doncel to find out how the company began, where they are now and where they are going. I also checked in about some industry trends.

Can you talk about the genesis of your company and the Mistika product?
SGO was born out of a technically oriented mentality to find the best ways to use open architectures and systems to improve media content creation processes. That is not a challenging concept today, but it was an innovative view in 1993 when most of the equipment used in the industry was proprietary hardware. The idea of using computers to replace proprietary solutions was the reason SGO was founded.

It seems you guys were ahead of the curve in terms of one product that could do many things. Was that your goal from the outset?
Ten years ago, most manufacturers approached the industry with a set of different solutions addressing different parts of the workflow. This gave us an opportunity to capitalize on improving the workflow, as disjointed solutions imply inefficient workflows due to their linear, sequential nature.

We always thought that by improving the workflow, our technology would be able to play in all those arenas without having to change tools. Making the workflow parallel saves time when a problem is detected: it avoids going backwards in the pipeline and lets us keep moving forward.

I think after so many years, the industry is saying we were right, and all are going in that direction.

How is SGO addressing HDR?
We are excited about HDR, as it really improves the visual experience, but at the same time it is a big challenge to define a workflow that can work in both HDR and SDR in a smooth way. Our solution to that challenge is the four-dimensional grading that is implemented with our 4th ball. This allows the colorist to work not only in the three traditional dimensions — R, G and B — but also to work in the highlights as a parallel dimension.

What about VR?
VR pieces together all the requirements of the most demanding stereo 3D with the requirements of 360. Considering what SGO already offers in stereo 3D production, we feel we are well positioned to provide a 360/VR solution. For that reason, we want to introduce a specific workflow for VR that helps customers work on VR projects, addressing the most difficult requirements, such as discontinuities at the poles or dealing with shapes.

The new VR mode we are preparing for Mistika 8.7 will be much more than a VR visualization tool. It will allow users to work in VR environments the same way they would work in a normal production. Not having to worry about circles ending up being highly distorted ellipses and so forth.

What do you see as the most important trends happening in post and production currently?
The industry is evolving in many different directions at the moment — 8K realtime, 4K/UHD, HDR, HFR, dual-stream stereo/VR. These innovations improve and enhance the audience’s experience in many different ways. They are all interesting individually, but the most vital aspect for us is that all of them actually have something in common — they all require a very smart way of how to deal with increasing bandwidths. We believe that a variety of content will use different types of innovation relevant to the genre.

Where do you see things moving in the future?
I personally envision a lot more UHD, HDR and VR material in the near future. The technology is evolving in a direction that can really make the entertainment experience very special for audiences, leaving a lot of room to still evolve. An example is the Quantum Break game from Remedy Studios/Microsoft, where the actual users’ experience is part of the story. This is where things are headed.

I think the immersive aspect is the challenge and goal. The reason why we all exist in this industry is to make people enjoy what they see, and all these tools and formulas combined together form a great foundation on which to build realistic experiences.

Reid Burns named president of post at Cognition

Hollywood-based post studio Cognition has hired Reid Burns as president of post production. In his new role, Burns is tasked with building relationships with studios, independent producers and filmmakers. His initial focus will be on growing the company’s digital intermediate finishing and color grading business.

Launched last fall, Cognition is nearing completion of a multi-million-dollar expansion of its creative campus in Hollywood that will include the addition of two 4K digital cinema finishing theaters, a scalable visual effects pipeline, creative office space and an emerging technologies research facility.

Burns began his career as a color timer and has nearly 100 films to his credit. He later managed laboratory operations at Deluxe. That was followed by tenures as director of global sales and technology at Consolidated Film Industries, SVP at Fotokem and COO at Reliance MediaWorks. His work at Reliance included oversight of 3D work for such titles as Avatar, Prometheus and Men in Black 3. He also founded the VFX and title post house The Image Resolution and served as CEO of the VFX/DI boutique post house Ollin Studio in Mexico City. He joined Mexico City’s Labodigital in 2012 as SVP. Burns is an associate member of the American Society of Cinematographers.

Commenting on Cognition’s adoption of SGO’s Mistika platform, he says, “We are ready for EDR (Extended Dynamic Range), planning ahead for HDR and using the ACES workflow on our current feature. We are also thrilled with Mistika’s robust stereo tools and look forward to projects that are a good fit in that arena.”

Among Burns’ first tasks will be to add to the facility’s DI staff. New hires are expected to include a senior DI producer, DI colorist and a business development specialist.

New version of Media Composer supports HDR

Avid’s latest version of its Avid Media Composer editing system offers support for high dynamic range (HDR) workflows, so users can now edit and grade projects using new color specs that display a greater dynamic range than standard video. The new version also features a more intuitive interface, making the tool more friendly to those editors who also work on Premiere Pro or Final Cut.

With what the company calls “better-organized tools,” Media Composer’s tools and interface elements are now arranged in a more logical order, enabling editors to work more efficiently and intuitively, regardless of the system they have worked on before.

The new version of Media Composer also enables editors to work with up to 64 audio tracks — 250 percent more than previously available — and delivers more power under the hood, allowing editors to work faster and focus on the creative.

The new version of Media Composer — which allows users to work in HD, 2K, 4K, 8K and HDR — is now available to purchase from the Avid Store and as a free download to current Media Composer customers with an active Upgrade and Support Plan or Subscription.

Here are some more details of the new version:
• Faster editing and better-organized tools – Users can access key tools and features faster thanks to several updates to menus and drop downs that make accessing tools more intuitive, productive, and fun. This delivers even greater efficiency, enabling editors to focus more time on their creative storytelling.
• Better visual feedback when editing – Users can edit with more precision thanks to high-visibility feedback displayed as they edit in the timeline.
• Ability to straighten images quickly with FrameFlex rotation – Users can rotate images a little or a lot by rotating the framing box in FrameFlex.
• Better performance – All played frames — and all effects applied to those clips — are now cached in RAM, allowing for smoother, stutter-free editing when scrubbing or playing back a complex sequence multiple times.

Other enhancements include: full screen playback mode; sync Broadcast Wave audio files with video clips with subframe accuracy; add one or more custom columns to a bin easily through a contextual menu; copy and paste frame numbers in bin columns; find and filter effects faster and easier with the updated Effects palette; rename, edit and delete custom project presets with the Preset Manager; use Media Composer on OSX 10.11 (El Capitan) and Windows 10 computers; group master clips using audio waveform analysis; start a frame count at “1” instead of “0” (zero) for timecode burn-in segments; resize and configure the Audio Mixer for the project at hand; and preserve field recorder metadata across multiple audio tracks when importing/linking.

Quick Chat: FilmLight CEO Wolfgang Lempp on HDR

FilmLight, creator of the popular Baselight color grading system, has been making products targeting color since 2002. Over the years it has added other products that surround the color workflow, such as image processing applications and on-set tools for film and television.

With high dynamic range (HDR) a hot topic among those making tools for production and post and those who believe in HDR’s future, we reached out to FilmLight CEO and co-founder Wolfgang Lempp to pick his brain about the benefits of HDR and extended color gamut, and what we need to do to make it a reality.

Are you a fan of HDR?
Definitely. It opens up more creative possibilities, and it adds depth to the picture. Not everything benefits from looking more real, but the real world is certainly HDR. There is a certain aesthetic to dim highlights, as there is to black-and-white photography, but that is no justification to stick with black-and-white television, or with dim displays.

And consumers will appreciate the benefits of HDR too. When they walk into an electronics store and see a couple of HDR televisions among the standard screens, the HDR sets will leap out as clearly better. That is very different from stereo 3D technology, and it will drive the adoption of HDR in a big way.

So, what will it take to get HDR to consumers?
High-end cameras have been HDR for quite a while. It is just that we have compressed the output to make it look okay on standard displays. We now have the displays, and we are starting to get the projectors, too. The biggest obstacle is the infrastructure in between, and the implications regarding the proposed standards.

So there will be a time of confusion, as well as a time of bad HDR, before the dust settles. And sadly, as with 4K and UHD, we will probably end up with two different standards for film and TV. The big question at the moment is whether the least disruptive method, which uses the same signal for both standard dynamic range and HDR displays, is all we can realistically hope for in TV at this point, and whether that is actually good enough.

Samsung’s HDR-ready KS9500 SUHD TV with Quantum dot display.

Is the SMPTE PQ standard the answer?
SMPTE 2084 — which formalizes the Dolby Perceptual Quantization (PQ) concept — is already in use and has its merits, certainly for movies where you can send the right version to each cinema. But it is a bit too forward-looking for the broadcast industry, which prefers to send a common signal to both standard dynamic range and HDR displays at home.

The existing broadcast infrastructure can be made to work with the current generation of HDR displays, and that might well be good enough for many years to come. SMPTE PQ is looking further into the future, but ironically the projection technology for cinema is trailing behind in terms of absolute brightness, so for the foreseeable future there is even less of a need to provide for that extra dynamic range.

The critical issue in the short term is banding of contours, not in the very dark parts of the image which we are all familiar with, but in the mid-range. PQ is the safer bet in that respect, but it needs a higher bit depth than the broadcast distribution channels are offering.
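
To make the banding point concrete, below is a minimal sketch of the ST 2084 EOTF using the constants published in the standard, printing the luminance jump between two adjacent 10-bit codes in the mid-range.

```python
# SMPTE ST 2084 (PQ) EOTF constants, as published in the standard:
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(n: float) -> float:
    """Map a normalized PQ code value (0..1) to absolute luminance in nits."""
    p = n ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

# Step between adjacent 10-bit codes near a mid-gray-ish level (~45 nits):
for code in (440, 441):
    print(code, round(pq_to_nits(code / 1023), 3), "nits")
# The step is under 1 percent of the level, just below visibility; spread
# the same range over fewer code values and mid-range contours appear.
```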

BBC in the UK and NHK, the Japanese national broadcaster, have put forward a proposal for a hybrid logarithmic and gamma encoding that could be a reasonable compromise for broadcasting, but it remains to be seen if it is a compromise too far when a wide variety of HDR content becomes available. It would be a shame if we end up with a long list of do’s and don’ts to make the images look acceptable.

At FilmLight, we support both standards, and if the industry can agree on something better, we will of course support that too. Our interest is in taking the technical limitations away from post and allowing people to concentrate on creativity.

HDR broadcast at CES 2016.

What happens when an HDR signal reaches televisions in the home?
The real concern is setup, because to see the benefits you have to set things up correctly. And a relatively subtle shift, like extended color gamut, or a not-so-subtle shift, like HDR, can easily be badly configured.

When we moved from 4×3 to 16×9 displays, many people didn’t bother to adjust their screens correctly, so 4×3 content was stretched, making everyone look squashed and fat. Even today, that problem hasn’t gone away completely. Whatever system is in place for delivering HDR to the home, it has to be simple to set up accurately for whatever receiving device the consumer chooses to use.

Some colorists are expressing concern about working with HDR and eye strain. Is this a serious issue?
The real world is HDR. Go outside into the sunshine and see what extended color and dynamics really means. The new generation of displays deliver only a pale imitation of this reality. Our eyes and brain have the ability to adjust over an amazingly wide range.

The serious point is that HDR should help to create more realistic, as well as more engaging and enticing pictures. If all we do with HDR is make the highlights brighter then it has failed as an addition to the creative toolset.

Dolby’s HDR offering at CES.

Colorists today are used to working in a very dim environment. It will be different in the future, and it will take some time to get used to, but I think we all have faced more serious challenges.

What do you think the timeframe is for HDR?
It is already happening. Movies are out there and television is ready to go. NAB 2015 saw the gee-whiz demonstrations, and NAB 2016 will see workable, affordable, practical solutions. January’s CES featured many HDR-ready displays, so there is real pressure on broadcasters to provide the content.

If it is used carefully and creatively, I am very excited by the prospect, and I believe viewers will absolutely love it.

 

Colorfront demos UHD HDR workflows at SMPTE 2015

Colorfront used the SMPTE 2015 Conference in Hollywood to show off the capabilities of its upcoming 2016 products supporting UHD/HDR workflows. New products include the Transkoder 2016 and On-Set Dailies 2016. Upgrades allow for faster, more flexible processing of the latest UHD HDR camera, color, editorial and deliverables formats for digital cinema, high-end episodic TV and OTT Internet entertainment channels.

Colorfront’s Bruno Munger filled us in on some of the highlights:
• Transkoder and On-Set Dailies feature Colorfront Engine, an ACES-compliant, HDR-managed color pipeline, enabling on-set look creation and ensuring color fidelity of UHD/HDR materials and metadata through the camera-to-post chain. Colorfront Engine supports the full dynamic range and color gamut of the latest digital camera formats and mapping into industry-standard deliverables such as the latest IMF specs, AS-11 DPP and HEVC, at a variety of brightness, contrast and color ranges on current display devices.
• The mastering toolset for Transkoder 2016 is enhanced with new statistical analysis tools for immediate HDR data graphing. Highlights include MaxCLL and MaxFALL calculations, as well as HDR mastering tools with tone and gamut mapping for a variety of target color spaces, including Rec. 2020 and P3D65, as well as XYZ, the PQ curve and BBC-NHK Hybrid Log Gamma.
• New for Transkoder 2016 are tools to concurrently color grade HDR and SDR UHD versions, cutting down the complexity, time and cost of delivering multiple masters at once.
• Transkoder 2016 will output simultaneous, realtime grades on 4K 60p material to dual Sony OLED BVM-X300 broadcast monitors — concurrently processing HDR 2084 PQ Rec. 2020 at 1000 nits and SDR Rec. 709 at 100 nits — while visually graphing MaxFALL/MaxCLL light values per frame.

Advanced dailies toolsets enhancements include:
·    Support for the latest camera formats, including full Panasonic Varicam35 VRAW; AVC-Intra 444, 422 and LT; Canon EOS C300 Mark II with the new Canon Log2 gamma; ARRI Alexa 65 and Alexa SXT; Red Weapon; Sony XAVC; and the associated image metadata from all of these.
·    The new Multi-view Dailies capability for On-Set Dailies 2016, which allows concurrent, realtime playback and color grading of all cameras and camera views.
·    Transwrapping, which allows video essence data (the RAW or compressed audio/video and metadata inside a container such as MXF or MOV) to pass through the transcoding process without re-encoding, enabling frame-accurate insert editing on closed digital deliverables. This workflow can be a great time saver in day-to-day production: Transkoder users can quickly generate new masters based on changes and versioning of content in the major mastering formats, such as IMF, DCI and ProRes, and efficiently trim camera-original media for VFX pulls and final conform from ARRI, Red and Sony cameras.
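
Colorfront has not published how its transwrap engine works internally, but the underlying idea (copy the encoded essence into a new container rather than decode and re-encode it) is easy to demonstrate with the stream-copy mode of the widely used ffmpeg tool. A hypothetical rewrap, with placeholder file names:

```python
import subprocess

# Rewrap without re-encoding: "-c copy" passes the audio/video essence
# through untouched, so the operation is fast and has no generation loss.
subprocess.run(
    ["ffmpeg", "-i", "master_v1.mxf", "-c", "copy", "master_v1_rewrapped.mov"],
    check=True,
)
```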

Dolby Cinema combines HDR video, immersive surround sound

By Mel Lambert

In addition to its advances in immersive surround sound, culminating in the new object-based Atmos format for theatrical and consumer playback, Dolby remains committed to innovating video solutions for the post and digital cinema communities.

Leveraging video technologies developed for high-resolution monitors targeted at on-location, colorist and QC use, the company has also been developing Dolby Cinema, which combines proprietary Dolby Vision high dynamic range (HDR) projection with Dolby Atmos immersive sound playback.

The first Dolby Cinema installations are part of a joint venture with AMC Entertainment — the nation’s second-largest theater chain — and, according to AMC’s EVP of US operations, John McDonald, the companies are planning to unveil up to 100 such “Dolby Cinema at AMC Prime” theaters around the world within the next decade. To date, approximately a dozen of these premium large format (PLF) locations have opened in the US and Europe.

Dolby Vision requires two specially modified Christie Digital 4K laser projectors, together with state-of-the-art optics and image processing, to provide an HDR output with light levels significantly greater than those of conventional Xenon digital projectors. Dolby Vision’s HDR output, with enhanced color technology, has been lauded by filmmakers for its contrast, brightness and a color gamut said to more closely match human vision.

Unique to the Dolby Vision projection system, beyond its brightness and vivid color reproduction, is its claimed ability to deliver HDR images with an extended contrast ratio that exceeds any other image technology currently on the market. The result is described by Dolby as a “richer, more detailed viewing experience, with strikingly vivid and realistic images that transport audiences into a movie’s immersive world.”

During a recent system demo at AMC16 in Burbank, Doug Darrow, Dolby’s SVP of Cinema, said, “Today’s movie audiences have an insatiable appetite for experiences. They want to be moved, and they want to feel [the on-screen action]. The combination of our Dolby Vision technology and Dolby Atmos offers audiences an immersive audio-video experience.”

The new proprietary system offers up to 31 foot-lamberts (fL) of screen brightness for 2D Dolby Vision content, more than twice the 14 fL specified by Digital Cinema Initiatives (DCI).
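
For readers who think in nits (cd/m²) rather than foot-lamberts, the standard conversion is 1 fL ≈ 3.426 cd/m², and quick arithmetic on the quoted figures bears out the “more than twice” claim:

```python
NITS_PER_FL = 3.426  # 1 foot-lambert ~= 3.426 cd/m^2 (nits)

dolby_vision_2d = 31 * NITS_PER_FL      # ~106 nits of screen brightness
dci_reference = 14 * NITS_PER_FL        # ~48 nits of screen brightness
print(dolby_vision_2d / dci_reference)  # ~2.2x, i.e. "more than twice"
```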

Recent films released in Dolby Cinema include Sony’s The Perfect Guy; Paramount’s Mission: Impossible – Rogue Nation; Fox’s Maze Runner: The Scorch Trials; Fox’s The Martian; Warner’s Pan; and Universal’s Everest. Upcoming releases include Warner’s In the Heart of the Sea, Lionsgate’s The Hunger Games: Mockingjay — Part 2 and Disney’s The Jungle Book.

During a series of endorsement videos shown at the Burbank showcase, Wes Ball, director of Maze Runner: The Scorch Trials, said, “It’s the only way I want to show movies.”

The new theatrical presentation format fits into existing post workflows, according to Stuart Bowling, Dolby’s director of content and creative relations. “Digital cameras are capable of capturing images with tremendous dynamic range that is suitable for Dolby Vision, which is capable of delivering a wide P3 color gamut. Laser projection can also extend beyond the P3 color space toward Rec. 2020 [ITU-R Recommendation BT.2020], which is invaluable for animation and VFX. For now, however, we will likely see filmmakers stay within the P3 gamut.”

For enhanced visual coverage, the large-format screens extend from wall to wall and floor to ceiling, with matte-black side walls and fittings to reduce the ambient light scattering that can easily diminish the HDR experience. “Whereas conventional presentations offer maybe 2,000:1 contrast ratios,” Bowling stressed, “Dolby Vision offers 1,000,000:1 [dynamic range], with true, inky blacks.”
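
Those ratios are easier to compare on a logarithmic scale. In photographic stops (each stop is a doubling of light), the quoted numbers work out as follows; this is simple arithmetic on Dolby’s claims, not an independent measurement:

```python
import math

# A contrast ratio of R:1 spans log2(R) stops from black to peak white
print(math.log2(2_000))      # ~11.0 stops for a conventional presentation
print(math.log2(1_000_000))  # ~19.9 stops for the claimed Dolby Vision range
```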

Mel Lambert is principal of Content Creators, an LA-based copywriting and editorial service. He can be reached at mel.lambert@content-creators.com. Follow him on Twitter @MelLambertLA.

IBC 2015 Blog: Rainy days but impressive displays, solutions

By Robert Keske

While I noted in my first post that we were treated to beautiful weather in Amsterdam during the first days of IBC 2015, the weather on day four was not nearly as nice: it was full of rain and thunderstorms, the latter of which could be heard eerily throughout the RAI Exhibition Centre.

The next-gen Clipster

I spent day three exploring content delivery and automation platforms.

Rohde & Schwarz’s next-gen Clipster is finally here and is a standout — built on an entirely new hardware platform. It’s seamless, simplified, faster and looks to have a hardware and software future that will not require a forklift upgrade. 

Colorfront, also a leader in on-set dailies solutions, has hit the mark with its Transkoder product. The new HDR mathematical node is nothing less than impressive, which is exactly what we’ve come to expect from Colorfront engineering.

Colorfront Transkoder

UHD and HDR were also at the forefront of the show as the need for higher-quality content continues to grow, and I spent day four examining these emerging display and delivery technologies. Both governments and corporate entities are leading the global community toward delivery of UHD to households starting in 2015, so I was especially interested in seeing how display and content providers would be raising the standards in display tech.

Sony, Samsung and Panasonic (our main image) all showcased impressive results to support UHD and HDR, and I’m looking forward to seeing what further developments and improvements the industry has to offer for both professional and consumer adoption.

Overall, while it seemed like a smaller show this year, I’ve been impressed by the quality of technology on display. IBC never fails to deliver a showcase of imagination and innovation, and this year was no different.

New York-based Robert Keske is CIO/CTO at Nice Shoes (@NiceShoesOnline).

IBC 2015 Blog: HDR displays

By Simon Ray

It was an interesting couple of days in Amsterdam. I was hoping to get more clarity on where things are going with high dynamic range, in both professional and consumer panels, as well as the delivery mechanisms to get it to consumers. I am leaving IBC knowing more, but no nearer to a coherent idea of exactly where this is heading.

I initially visited Dolby to get an update on Dolby Vision (our main image) and, most importantly, to collect my reserved tickets for the screening of Fantastic Four in the Auditorium (laser projection and Dolby Atmos). It all sounded very positive, with news that a number of consumer panel manufacturers are close to releasing Dolby Vision-capable TVs (Vizio with its Reference Series panel, for example) and that streaming services like VUDU will stream Dolby Vision HDR content, although just in the USA to begin with. I also had my first look at a Dolby “Quantum Dot” HDR display panel, which did look good and surely has the best name of any tech out here.

There are other HDR offerings out there too. Amazon Prime announced in August that it will be streaming HDR content in the UK, though not initially in the Dolby Vision format (HDR video is available via the Amazon Instant Video app on Samsung SUHD TVs such as the JS9000, JS9100 and JS9500 series, and on selected LG TVs in the G9600 and G9700 series), and the “big” TV manufacturers have launched, or are about to launch, HDR panels. So far so good.

Pro HDR Monitors
Things got a bit more vague again when I started looking into HDR-equipped professional panels for color correction. There were only two I could find at the show: Sony had an impressive HDR-ready panel connected to a FilmLight Baselight tucked away on their large stand in Hall 12, and Canon had an equally impressive prototype display tucked away in Hall 11, connected to an SGO Mistika. Both displays had different brightness specs and gamma options.

When I asked other manufacturers about their HDR panels, the response was the same: “We are going to wait until the specifications are finalized before committing to an HDR monitor.” This leads me to think it is a bad time to be buying a monitor. You are either going to buy an HDR monitor now, which may not conform to the final specifications, or you are going to buy a non-HDR monitor that is likely to be superseded in the near future.

Another thing I noticed was that the professional HDR panels were all being shown off in a carefully lit environment (or as carefully as a trade show allows) to give them the best opportunity to make an impact. Any ambient light getting into the viewing environment is going to detract from the benefits of the increased dynamic range and brightness of an HDR display, which I imagine might be a problem in the average living room. I hope this does not reduce the chance of this technology making an impact, because it is great to see images with seemingly more depth and quality to them. As a representative on the Sony stand said, “It feels more immersive — I am so much more engaged in the picture.”

Dolby
The problem of ambient light was also picked up on in an interesting talk in the Auditorium, part of the “HDR: From zero to infinity” series, with speakers from IMAX, Dolby, Barco and Sony discussing the challenges of bringing HDR to the cinema. I had already come across the idea of HDR in cinema through Dolby’s “Dolby Cinema” project, which brings together HDR picture and immersive sound with Dolby Atmos.

I am in the process of building a theatre to mix theatrical soundtracks in Dolby Atmos, but despite the exciting opportunities Atmos offers sound teams, in the UK at least the take-up by cinemas is slow. One of the best things about Dolby Atmos for me is that if you go to see a film in Atmos, you know the speaker system is going to be of a certain standard; otherwise Dolby would not have given it Atmos status. For too long, cinemas have been allowed to let their speaker systems wear down to the point where they become unlistenable. If these new initiatives can give cinemas an opportunity to reinvest in their equipment and get a return on that investment (the various financial implications, challenges and who would meet these costs were all discussed), it could be a chance to stop the rot and improve the cinema-going experience. Importantly for us in post, it also gives us an exciting, high benchmark to aim for when working on films.

Simon Ray is head of operations and engineering at Goldcrest Post Production in London.

Panel: The future of post production — 4K and HDR

By Larry Jordan

Last week, I had the pleasure of moderating a panel sponsored by KeyCode Media and Sony on “The Future of Post: 4K and HDR.” We spent 90 minutes discussing whether it was time for editors and post facilities to start editing 4K and/or HDR images, and what changes these new formats would require.

The panel featured Michael Cioni, president, Light Iron; Mike Whipple, executive director of post, Sony Pictures Entertainment; and Bryan McMahan, senior digital colorist, Modern VideoFilm.

Some Background
4K is the term used to describe image frame sizes that are roughly 4,000 pixels wide (DCI 4K, for example, is 4096×2160). 4K actually comes in a variety of aspect ratios – Michael Cioni listed six off the top of his head – along with a variation of 4K called Ultra HD (UHD), which is 3840×2160.

HDR is the term used to describe High Dynamic Range video, which provides more grayscale values than traditional video. HDR is described as more “life-like,” and is especially notable because it provides richer blacks and more vibrant highlights.

HDR generally requires RAW files with a bit depth of 12 bits or greater, which means file sizes will be much larger than standard HD video files. Also, for best results, HDR images should not use a compressed video codec. Additionally, footage needs to be captured as HDR during production; you can’t add it to footage after the fact in post.
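
As a rough illustration of why those files get large, here is the arithmetic for uncompressed 12-bit RGB at DCI 4K. Real RAW formats vary (many record single-channel sensor data before debayering), so treat these as order-of-magnitude figures:

```python
width, height = 4096, 2160          # DCI 4K frame
channels, bit_depth, fps = 3, 12, 24

frame_mb = width * height * channels * bit_depth / 8 / 1e6  # ~39.8 MB/frame
hour_tb = frame_mb * 1e6 * fps * 3600 / 1e12                # ~3.4 TB/hour
print(round(frame_mb, 1), round(hour_tb, 2))
```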

Wide Color Gamut is the term used to describe video with greater color saturation than traditional video. Not “different” colors, but richer, more saturated colors.

In the shorthand of the panel, these formats were described as: more pixels, more grayscales and more saturation. These new image standards are described in ITU-R Recommendation BT.2020, commonly called “Rec. 2020.” It is similar in concept, but not in values, to the Rec. 709 spec we use for HD and the Rec. 601 spec we used for SD.
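
For the technically curious, re-expressing Rec. 709 material in Rec. 2020 primaries is a single 3×3 matrix applied in linear light; the coefficients below come from ITU-R BT.2087, the standard Rec. 709-to-Rec. 2020 conversion. A minimal numpy sketch:

```python
import numpy as np

# Linear-light RGB conversion from Rec. 709 primaries to Rec. 2020 primaries
# (coefficients per ITU-R BT.2087; apply to linearized, not gamma-encoded, RGB)
M_709_TO_2020 = np.array([
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
])

def rec709_to_rec2020(rgb_linear):
    """Same colors, new primaries: Rec. 709 RGB expressed as Rec. 2020 RGB."""
    return np.asarray(rgb_linear) @ M_709_TO_2020.T

# Pure Rec. 709 red sits comfortably inside the 2020 gamut (all positive)
print(rec709_to_rec2020([1.0, 0.0, 0.0]))  # ~[0.6274 0.0691 0.0164]
```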

As Cioni said: “People often speak of 4K or HDR or Wide Color Gamut. But it isn’t ‘or,’ it’s ‘and.’ The video we’ll be editing in the future will contain higher-resolution images and greater dynamic range and wider color gamut. Think of it as three legs of a tripod supporting the video of the future.”

Making Adjustments
New video technology often requires adjustments to support it; however, from the artist’s perspective, those adjustments are fairly minor. As McMahan described, there’s no difference from the creative perspective when grading 4K video vs. 2K or HD. There may be more pixels to work with, but the techniques he uses still work.

There is, however, a difference between color grading HDR video and SDR (“Standard Dynamic Range,” as Cioni called it). McMahan said it took him a day or two to get comfortable with the new HDR format.

Once McMahan became comfortable with the format, he said it took him about the same amount of time to color grade an HDR master as an SDR master. In fact, “I think I can do HDR a little faster than SDR, because I have a broader palette to work with.”

The big difference with HDR, all three panelists stressed, is not the workflow but getting a monitor that properly displays HDR video. Here, prices are not cheap. While no specific brands were suggested, a color-grade-capable HDR monitor is in the $30,000 range.

Which brought up a key question for me: “Where’s the money?”

Who’s Buying?
Of the three panelists, only Cioni is directly involved in client prospecting and billing. So he and I talked about how editors and post houses would make money in this new format.

Cioni charges a “little bit” more for editing 4K video and “more” for HDR. We didn’t get into specific pricing.

Then he surprised me by saying, “The money for HDR and 4K won’t come from broadcasters or cable. They are a long way from updating their infrastructure to support this technology because the upgrades are expensive and time consuming. The market is broadband companies — Netflix, Amazon, Hulu, Microsoft and Apple — who are able to instantly deliver 4K media directly to the home via the Internet.”

This agrees with trends I’ve been seeing. Traditional broadcast audiences are declining for everything but live events, while audiences for Internet-based video delivery are skyrocketing. The money is still in the older distribution formats, but the audiences are on the web.

Can You See the Difference, and Does It Matter?
We had a long discussion on whether the typical audience can actually see the image improvements of 4K. While panel members felt that 4K is instantly perceptible, I am less sure. On the other hand, if editing 4K allows editors to get more work, I’m in favor of it whether anyone can see the difference or not.

Where the panel was in full agreement was that HDR is a massive improvement over traditional HD video. As McMahan said: “Once you’ve seen a properly graded HDR image, going back to SDR looks flat and lifeless.”

At this point, Cioni made an interesting comment: “It is easy to make a 2K, even a 1080 version of a 4K master file. Those conversion transforms are well known and don’t damage the image. With HDR, there’s no easy way to convert from HDR to SDR. For those cases, you’ll need to create two different color grades of your material.”
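
To make Cioni’s point concrete: an automatic HDR-to-SDR conversion has to squeeze highlights that may reach 1,000 nits or more into a roughly 100-nit display, and a global curve applied blindly compresses exactly the highlight detail a colorist would protect. A toy Reinhard-style curve illustrates the problem; this is not anyone’s actual transform:

```python
import numpy as np

def naive_hdr_to_sdr(nits, sdr_peak=100.0):
    """Globally compress HDR luminance toward an SDR peak. Midtones survive,
    but bright highlights bunch together near the peak, which is why a
    separate, hand-trimmed SDR grade is usually needed."""
    x = np.asarray(nits, dtype=float) / sdr_peak
    return sdr_peak * x / (1.0 + x)

print(naive_hdr_to_sdr([50, 100, 1000, 4000]))
# -> [33.3, 50.0, 90.9, 97.6]: a 4x difference in highlights nearly vanishes
```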

Hardware Needs
If an editor is successfully editing 1080 video, they can probably step up to 4K without needing to buy much new gear. Clearly, 4K requires more storage space and a 4K video monitor if you need to see your images pixel accurately. But for most creative editing, seeing the image at full resolution is not necessary, which means that editors don’t need a 4K monitor to do the creative cut.

However, as Mike Whipple pointed out, it is important to see the image at full resolution at some point during the edit just to make sure shots are in focus. Viewing images at less than full resolution tends to hide focus problems.

HDR and Wide Color Gamut video require vastly larger storage due to the size of the source files, plus video monitoring gear that can display the extended color range.

The big gating factor, as McMahan pointed out, is that an HDR monitor suitable for color grading costs about $30,000, which means we need to find ways to charge more to cover the cost of the gear required.

NOTE: Currently, Avid Media Composer, Premiere Pro CC and Final Cut Pro X don’t support HDR, except in a very rudimentary fashion.

Future Proofing
I decided to put Cioni on the spot by asking: “We are currently shooting 4K, 5K, even 6K images. NHK in Japan is planning to air 8K images next year, and 16K was demonstrated at NAB last spring. Should we just wait three months for all the resolution specs to change again?”

Michael replied: “I expect 4K to be a standard delivery format for the next 10 years. While resolutions we use in production will continue to increase, the resolution we deliver will remain constant for a while. This means that editorial houses can standardize on a 4K deliverable.”

“HDR will take longer to develop because we need to get HDR-capable TV sets into the home to drive demand. The interesting thing about HDR is that it looks great regardless of the resolution of the video. HD, even SD, looks much better when displayed using HDR.”

Summary
It was a fascinating discussion, which made me realize that both higher resolutions and HDR/Wide Color Gamut are in our future. But maybe not today, given the lack of widespread software support and the fact that the companies driving demand are focused on streaming to the web.

But the future evolves faster than we think, and the discussion gave me a good idea of where we are headed. Thanks to KeyCode for allowing me to be a part of it.

Larry Jordan is a producer, director, editor, writer, consultant and trainer who has worked in media for more than 40 years. He runs the LarryJordan.com and DigitalProductionBuzz.com websites.

Lightmap updates lighting software to HDR Light Studio 5

Lightmap has released HDR Light Studio 5, a new version of the company’s lighting software that brings an updated user interface and new features for professional 3D artists working in advertising, design, animation and VFX.

HDR Light Studio’s “click-to-light” system is designed to reduce an artist’s lighting time by placing lights directly into a render view and providing instant updates on the user’s progress. Re-engineered to handle very large images and 3D data, HDR Light Studio 5 also supports Alembic, OpenImageIO and OpenColorIO.

The tool’s new full-screen user interface is customizable and includes dockable panels and preset layouts. In addition, a wide range of light sources and effects can now be dragged and dropped directly onto the render view. Users can import their own images, with support for LDR, HDR and alpha images. HDR Light Studio 5 also supports mip-mapped files for faster lighting with large images, along with features such as a procedural sun/sky, linear and radial gradients, secondary alpha control for lights and additional blend modes. Planar, 3D and spherical content mappings can add content to an artist’s HDRI map for more creative control; for example, tilting HDRI maps or maintaining verticals.
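
Lightmap hasn’t published how click-to-light works internally, but the basic geometry behind any such scheme is straightforward: work out which direction on the HDRI environment corresponds to the clicked point (for a reflective highlight, the view ray reflected about the surface normal) and place the light there. A hedged sketch, with hypothetical function names and one common lat-long mapping convention:

```python
import numpy as np

def direction_to_latlong_uv(d):
    """Map a unit direction vector to (u, v) on an equirectangular HDRI.
    (One common convention; real tools may orient the map differently.)"""
    x, y, z = d
    u = 0.5 + np.arctan2(x, -z) / (2 * np.pi)
    v = 0.5 - np.arcsin(np.clip(y, -1.0, 1.0)) / np.pi
    return u, v

def click_to_light_uv(view_dir, normal):
    """Place a light so its reflection appears under the clicked pixel:
    reflect the view ray about the shading normal, then map that mirror
    direction onto the HDRI."""
    v = np.asarray(view_dir, float); v /= np.linalg.norm(v)
    n = np.asarray(normal, float); n /= np.linalg.norm(n)
    r = v - 2.0 * np.dot(v, n) * n  # mirror reflection of the view ray
    return direction_to_latlong_uv(r)

# Camera looking down -Z at a surface tilted 22.5 degrees toward the camera;
# prints the lat-long (u, v) where the light should be placed on the map
print(click_to_light_uv([0, 0, -1], [0, 0.3827, 0.9239]))
```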