Category Archives: HDR

Eizo intros DCI-4K reference monitor for HDR workflows

Eizo will be at NAB next week demonstrating its ColorEdge Prominence CG3145 31.1-inch reference monitor, which offers DCI-4K resolution (4096×2160) for pro HDR post workflows.

Eizo says the ColorEdge Prominence CG3145 can display both very bright and very dark areas on the screen without sacrificing the integrity of either. The monitor achieves the 1,000cd/m² (typical) brightness needed for displaying HDR content and a typical contrast ratio of 1,000,000:1 for displaying true blacks.
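
As a quick sanity check, and assuming the quoted figures describe peak white and full-field contrast, simple division gives the implied black level:

```python
# Rough arithmetic implied by Eizo's quoted figures (typical values, not measurements).
peak_luminance_nits = 1000.0      # quoted typical brightness, cd/m^2 (nits)
contrast_ratio = 1_000_000.0      # quoted typical contrast ratio

black_level_nits = peak_luminance_nits / contrast_ratio
print(f"Implied black level: {black_level_nits:.4f} cd/m^2")  # 0.0010 cd/m^2
```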

The ColorEdge Prominence CG3145 supports both the HLG (Hybrid Log-Gamma) and PQ (Perceptual Quantization) curves so post pros can rely on a monitor compliant with industry standards for HDR video.

The ColorEdge Prominence CG3145 supports various video formats, including an HDMI input compatible with 10-bit 4:2:2 at 50/60p and a DisplayPort input that supports up to 10-bit 4:4:4 at 50/60p. Additional features include coverage of 98 percent of the DCI-P3 color space, smooth gradations with 10-bit display from a 24-bit LUT (look-up table) and an optional light-shielding hood.
The ColorEdge Prominence CG3145 will begin shipping in early 2018.

In addition to the new ColorEdge Prominence CG3145 HDR reference monitor, Eizo currently offers optional HLG and PQ curves for many of its current CG Series monitors. The optimized gamma curves render images that appear truer to how the human eye perceives the real world than SDR. This HDR gamma support is available as an option for the ColorEdge CG318-4K, CG248-4K, CG277 and CG247X. Both gamma curves were standardized by the ITU as ITU-R BT.2100; in addition, the PQ curve was standardized by SMPTE as ST-2084.
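
Unlike a relative gamma curve, PQ is an absolute transfer function: each code value maps to a specific luminance, up to 10,000 nits. Here is a minimal sketch of the ST-2084 EOTF, assuming normalized code values in [0, 1]; the constants are those published in the standard, while the function name and example values are ours:

```python
import numpy as np

# SMPTE ST-2084 (PQ) constants as published in the standard / ITU-R BT.2100.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal):
    """Map non-linear PQ code values in [0, 1] to absolute luminance in cd/m^2 (nits)."""
    v = np.asarray(signal, dtype=np.float64)
    p = np.power(v, 1.0 / M2)
    return 10000.0 * np.power(np.maximum(p - C1, 0.0) / (C2 - C3 * p), 1.0 / M1)

# A code value of about 0.75 corresponds to roughly 1,000 nits -- near the CG3145's quoted peak.
print(pq_eotf([0.0, 0.5, 0.75, 1.0]))  # approx. [0, 92, 978, 10000] nits
```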

Assimilate Scratch and Scratch VR Suite upgraded to V.8.6

Assimilate is now offering an open beta for Scratch 8.6 and the Scratch VR Suite 8.6, the latest versions of its realtime post tools and workflow for VR/360 and 2D/3D content, from dailies to conform, grading, compositing and finishing. Expanded HDR functions are featured throughout the product line, including in Scratch VR, which now offers stitching capabilities.

Both open beta versions give pros the opportunity to actively use the full suite of Scratch and Scratch VR tools while evaluating and submitting requests and recommendations for additional features or updates.

Scratch Web for cloud-based, realtime review and collaboration, and Scratch Play for immediate review and playback, are also included in the ecosystem updates. Both products support VR/360 and 2D/3D content.

Both the Scratch 8.6 open beta and the Scratch VR Suite 8.6 open beta are available now; current users of the Scratch VR Suite 8.5 and Scratch Finishing 8.5 can download them.

“V8.6 is a major update for both Scratch and the Scratch VR Suite with significant enhancements to the HDR and ACES workflows. We’ve added stitching to the VR toolset so that creators have a complete and streamlined end-to-end VR workflow,” says Jeff Edson, CEO at Assimilate. “The open Beta helps us to continue developing the best and most useful post production features and techniques all artists need to perfect their creativity in color grading and finishing. We act on all input, much of it immediately and some in regular updates.”

Here are some details of the update:

HDR
• PQ and HLG transfer functions are now an integral part of Scratch color management.
• Scopes automatically switch to HDR mode when needed, show levels on a nit scale and highlight any reference level you set.
• At the project level, define the HDR mastering metadata: color space, color primaries and white levels, luminance levels and more. The metadata is automatically included in the Video HDMI interface (AJA, BMD, Bluefish444) for display.
• A built-in function calculates luminance metadata such as MaxCLL and MaxFALL from the content (see the sketch after this list).
• HDR footage can be published directly to YouTube with HDR metadata.
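
Those two values have simple definitions: MaxCLL (maximum content light level) is the brightest max(R, G, B) of any pixel in the program, and MaxFALL (maximum frame-average light level) is the highest per-frame average of that same per-pixel maximum. Below is a minimal sketch of the general calculation, not Assimilate's implementation; the helper name and toy frames are ours:

```python
import numpy as np

def max_cll_and_fall(frames):
    """Compute (MaxCLL, MaxFALL) from an iterable of HxWx3 RGB frames in absolute nits."""
    max_cll = 0.0
    max_fall = 0.0
    for frame in frames:
        per_pixel_max = np.max(frame, axis=-1)        # max(R, G, B) per pixel
        max_cll = max(max_cll, float(per_pixel_max.max()))
        max_fall = max(max_fall, float(per_pixel_max.mean()))
    return max_cll, max_fall

# Toy example: two synthetic 4x4 frames, pixel values in nits.
frames = [np.full((4, 4, 3), 100.0), np.full((4, 4, 3), 800.0)]
frames[1][0, 0] = [1200.0, 300.0, 300.0]              # one bright specular highlight
print(max_cll_and_fall(frames))                       # (1200.0, 825.0)
```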

VR/360 – Scratch VR Suite
• 360 stitching functionality: load all your source media from your 360 cameras into Scratch VR and combine it into a single equirectangular image (see the sketch after this list). Supports camera stitch templates: AutoPano projects, Hugin and PTStitch scripts.
• Ambisonic audio: Scratch VR can load, set and play back ambisonic audio files to complete the 360 immersive experience.
• Video with 360 sound can be published directly to YouTube 360.
• Additional overlay handles for the existing 2D-to-equirectangular feature make it easier to position 2D elements in a 360 scene.
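
For readers new to the term, an equirectangular image is simply a latitude/longitude unwrapping of the full sphere, which is why stitched 360 footage is usually twice as wide as it is tall. A minimal sketch of the mapping from a viewing direction to equirectangular pixel coordinates (our own illustration, not Scratch VR's stitching code) looks like this:

```python
import numpy as np

def direction_to_equirect(direction, width, height):
    """Map a 3D viewing direction (x, y, z) to (column, row) in an equirectangular image.

    Longitude (yaw) spans the image width and latitude (pitch) spans the height,
    the same lat/long layout a 360 stitcher writes out.
    """
    x, y, z = direction / np.linalg.norm(direction)
    longitude = np.arctan2(x, z)          # -pi .. pi, 0 = straight ahead (+z)
    latitude = np.arcsin(y)               # -pi/2 .. pi/2, +pi/2 = straight up
    col = (longitude / (2 * np.pi) + 0.5) * (width - 1)
    row = (0.5 - latitude / np.pi) * (height - 1)
    return col, row

# Looking straight ahead lands in the center of a 4096x2048 equirectangular frame.
print(direction_to_equirect(np.array([0.0, 0.0, 1.0]), 4096, 2048))  # (2047.5, 1023.5)
```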

DIT Reporting Function
• Create a report of all clips in either a timeline, a project or just a selection of shots (a generic sketch of the kind of output follows this list).
• Reports include metadata such as a thumbnail, clip name, timecode, scene, take, comments and any other metadata attached to a clip.
• Choose from predefined templates or create your own.
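
The sketch below shows the kind of tabular clip report such a function produces; the field names and CSV format are illustrative assumptions, not Scratch's actual schema or templates:

```python
import csv

# Hypothetical clip metadata of the kind listed above (clip name, timecode, scene, take, comments).
clips = [
    {"clip_name": "A001_C001", "timecode": "01:00:00:00", "scene": "12", "take": "3",
     "comments": "hero shot"},
    {"clip_name": "A001_C002", "timecode": "01:00:42:12", "scene": "12", "take": "4",
     "comments": ""},
]

with open("dit_report.csv", "w", newline="") as report:
    writer = csv.DictWriter(report, fieldnames=list(clips[0].keys()))
    writer.writeheader()
    writer.writerows(clips)
```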

The colorful dimensions of Amazon’s Mozart in the Jungle

By Randi Altman

How do you describe Amazon’s Mozart in the Jungle? Well, in its most basic form it’s a comedy about the changing of the guard — or maestro — at the New York Philharmonic, and the musicians that make up that orchestra. When you dig deeper you get a behind-the-scenes look at the back-biting and crazy that goes on in the lives and heads of these gifted artists.

Timothy Vincent

Based on the novel Mozart in the Jungle: Sex, Drugs, and Classical Music by oboist Blair Tindall, the series — which won the Golden Globe last year and was nominated this year — has shot in a number of locations over its three seasons, including Mexico and Italy.

Since its inception, Mozart in the Jungle has been finishing in 4K and streaming in both SDR and HDR. We recently reached out to Technicolor’s senior color timer, Timothy Vincent, who has been on the show since the pilot to find out more about the show’s color workflow.

Did Technicolor have to gear up infrastructure-wise for the show’s HDR workflow?
We were doing UHD 4K already and were just getting our HDR workflows worked out.

What is the workflow from offline to online to color?
The dailies are done in New York based on the Alexa K1S1 709 LUT. (Technicolor On-Location Services handled dailies out of Italy, and Technicolor PostWorks in New York.) After the offline and online, I get the offline reference made with the dailies so I can refer to it if I have a question about what was intended.

If someone was unsure about watching in HDR versus SDR, what would you tell them?
The emotional feel of both the SDR and the HDR is the same. That is always the goal in the HDR pass for Mozart. One of the experiences that is enhanced in the HDR is the depth of field and the three-dimensional quality you gain in the image. This really plays nicely with the feel in the landscapes of Italy, the stage performances where you feel more like you are in the audience, and the long streets of New York just to name a few.

When I’m grading the HDR version, I’m able to retain more highlight detail than I was in the SDR pass. For someone who has not yet been able to experience HDR, I would actually recommend that they watch an episode of the show in SDR first and then in HDR so they can see the difference between them. At that point they can choose what kind of viewing experience they want. I think that Mozart looks fantastic in both versions.

What about the “look” of the show? What kind of direction were you given?
We established the look of the show based on conversations and collaboration in my bay. It has always been a filmic look with soft blacks and yellow warm tones as the main palette for the show. Then we added in a fearlessness to take the story in and out of strong shadows. We shape the look of the show to guide the viewers to exactly the story that is being told and the emotions that we want them to feel. Color has always been used as one of the storytelling tools on the show. There is a realistic beauty to the show.

What was your creative partnership like with the show’s cinematographer, Tobias Datum?
I look forward to each episode and discovering what Tobias has given me as palette and mood for each scene. For Season 3 we picked up where we left off at the end of Season 2. We had established the look and feel of the show and only had to account for a large portion of Season 3 being shot in Italy, making sure to convey the different quality of light and the warmth and beauty of Italy. We did this by playing with natural warm skin tones and the contrast of light and shadow he was creating for the different moods and locations. The same can be said for the two episodes shot in Mexico in Season 2. I now know what Tobias likes and can make decisions I’m confident he will like.

From a director and cinematographer’s point of view, what kind of choices does HDR open up creatively?
It depends on if they want to maintain the same feel of the SDR or if they want to create a new feel. If they choose to go in a different direction, they can accentuate the contrast and color more with HDR. You can keep more low-light detail while being dark, and you can really create a separate feel to different parts of the show… like a dream sequence or something like that.

Any tricks, tips or trouble spots within the workflow, or is it a well-oiled machine at this point?
I have actually changed the way I grade my shows based on the evolution of this show. My end results are the same, but I have learned how to build grades that translate to HDR more easily and consistently.

Do you have a color assistant?
I have a couple of assistants that I work with who help me with prepping the show, getting proxies generated, color tracing and some color support.

What tools do you use — monitor, software, computer, scope, etc.?
I am working on Autodesk Lustre 2017 on an HP Z840, while monitoring on both a Panasonic CZ950 and a Sony X300. I work on Omnitek scopes off the downconverter to 2K. The show is shot on both Alexa XT and Alexa Mini, framing for 16×9. All finishing is done in 4K UHD for both SDR and HDR.

Anything you would like to add?
I would only say that everyone should be open to experiencing both SDR and HDR and giving themselves that opportunity to choose which they want to watch and when.


Bluefish444 offering new range of I/O cards with Kronos

Bluefish444, maker of uncompressed 4K/2K/HD/SD video I/O cards for Windows, Mac OS X and Linux, has introduced the Kronos range of video and audio I/O cards. The new line extends the feature set of Bluefish444’s Epoch video cards, which support workflows up to 4K at 60 frames per second. Kronos is developed for additional workflows requiring Ultra HD up to 8K, high frame rates up to 120fps, high dynamic range and video over IP.

With these capabilities, Bluefish444 cards are developed to support all areas of TV and feature film production, post, display and restoration, in addition to virtual reality and augmented reality.

Kronos adds video processing technologies including resolution scaling, interlacing and de-interlacing, hardware codec support, and SDI-to-IP and IP-to-SDI conversion, while continuing to offer the 12-bit color space conversion and low-latency capabilities of Epoch.

“With the choice of HD BNC SD/HD/3G connectivity, or SFP+ connectivity enabling greater than 3G SDI and Video over IP across 10Gbps Ethernet, the Kronos range will suit today’s demanding requirements and cover emerging technologies as they mature,” says product manager Tom Lithgow. “4K ultra high definition, high frame rate and ultra-high frame rate video, high dynamic range, video over IP and hardware assisted processing are now available to OEM developers, professional content creators and system integrators with the Bluefish444 Kronos range.”

HDMI 2.0 I/O, additional SMPTE 2022 IP standards and emerging IP standards are earmarked for future support via firmware update.

Kronos will offer a choice of SDI I/O connectivity, with the Kronos elektron featuring eight high-density BNC connectors capable of SD/HD/3G SDI. Each HD BNC connector is fully bi-directional, enabling numerous configuration options, including eight inputs, eight outputs, or a mixture of SDI inputs and outputs.

The Kronos optikós offers future-proof connectivity with three SFP+ cages in addition to two HD BNC connectors for SD/HD/3G SDI I/O. The SFP+ cages provide extensive connectivity options, exposing greater-than-3G SDI, IP connectivity across 10Gb Ethernet, and the flexibility to choose from numerous physical interfaces.

All Kronos cards will have an eight-lane Gen 3 PCIe interface and will provide access to high-bandwidth UHD, high-frame-rate and high-dynamic-range video I/O and processing across traditional SDI as well as emerging IP standards such as SMPTE 2022.
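
Some back-of-the-envelope arithmetic (active picture only, ignoring blanking and ancillary data, and not a statement about how Kronos packs its payloads) shows why UHD pushes past 3G SDI and toward 10Gb Ethernet:

```python
# Approximate uncompressed video payloads, active picture only (no blanking or ANC data).
def active_picture_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

# 10-bit 4:2:2 averages 20 bits per pixel (10 for Y, 10 shared between Cb and Cr).
print(f"1080p60 10-bit 4:2:2: {active_picture_gbps(1920, 1080, 60, 20):.2f} Gbps")  # ~2.49, fits 3G SDI
print(f"2160p60 10-bit 4:2:2: {active_picture_gbps(3840, 2160, 60, 20):.2f} Gbps")  # ~9.95, needs >3G / 10GbE
```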

Kronos specs include:
* SD SMPTE 259M
* HD 1.5G SMPTE 292M
* 3G (A+B) SMPTE 424M
* ASI
* 4:2:2:4 / 4:4:4:4 SDI
* Single Link / Dual Link / Quad Link interfaces
* 12/10-bit SDI
* Full 4K frame buffer
* 3Gbps Bypass Relays
* 12-bit video processing pipeline
* 4 x 4 x 32bit Matrix
* MR2 Routing resources
* Hardware Keyer (2K/HD)
* Customizable and flexible pixel formats
* AES Audio Input / AES Audio Output
* LTC I/O
* RS422
* Bi/Tri-level Genlock Input & Crosslocking
* Genlock loop through
* VANC complete access
* HANC Embedded Audio/Payload ID/Custom Packets/RP188
* MSA and Non-MSA compatible SFP+
* SMPTE 2022-6

The Kronos range will be available in Q4 2016 with pricing announced then.


Assimilate Scratch 8.5, Scratch VR Suite available for open beta

Assimilate is offering an open-beta version of Scratch 8.5, its realtime post system and workflow for dailies, conform, grading, compositing and finishing. Also in open beta is the Scratch VR Suite. Both open-beta versions give users the chance to work with the full suite of Scratch 8.5 and Scratch VR tools while evaluating and submitting requests and recommendations for additional features or updates.

Scratch Web for cloud-based, realtime review and collaboration, and Scratch Play for immediate review and playback, are also included in the ecosystem updates. Current users of Scratch 8.4 can download the Scratch 8.5 open beta. Those who are new to Scratch can access the Scratch 8.5 open-beta version for a 30-day free trial. The Scratch VR open-beta version can also be accessed for a 30-day free trial.

“Thanks to open-beta programs, we get a lot of feedback from current Scratch users about the features and functions that will simplify their workflows, increase their productivity and enhance their storytelling,” explains Assimilate CEO Jeff Edson. “We have two significant Scratch releases a year for the open-beta program and then provide several incremental builds throughout the year. In this way Scratch is continually evolving to offer bleeding-edge functionality, as well as support for the latest formats; for example, Scratch was the first to support Arri’s mini-camera MXF format.”

New to Scratch 8.5
• Easy validation of the availability of physical media and file references throughout a project, timeline and render
• Fast access to all external resources (media / LUT / CTL / etc.) through bookmarks
• Full set of ACES transforms as published by the Academy
• Publishing media directly to Facebook
• Option to launch Scratch from a command line with a series of XML script commands, which allows closer integration with post infrastructure and third-party software and scripts

The new Scratch VR Suite includes all the features and functions of Scratch 8.5, Scratch Play and Scratch Web, plus substantial features, functions and enhancements that are specific to working in a 360 media environment.


New England SMPTE holding free session on UHD/HDR/HFR, more

The New England Section of SMPTE is holding a free day-long “New Technologies Boot Camp” that focuses on working with high resolution (UHD, 4K and beyond), high-dynamic-range (HDR) imaging and higher frame rates (HFR). In addition, they will discuss how to maintain resolution independence on screens of every size, as well as how to leverage IP and ATSC 3.0 for more efficient movement of this media content.

The boot camp will run from 9am to 9pm on May 19 at the Holiday Inn in Dedham, Massachusetts.

“These are exciting times for those of us working on the technical side of broadcasting, and the array of new formats and standards we’re facing can be a bit overwhelming,” says Martin P. Feldman, chair of SMPTE New England Section. “No one wants — or can afford — to be left behind. That’s why we’re gathering some of the industry’s foremost experts for a free boot camp designed to bring engineers up to speed on new technologies that enable more efficient creation and delivery of a better broadcast product.”

Boot camp presentations will include:

• “High-Dynamic-Range and Wide Color Gamut in Production and Distribution” by Hugo Gaggioni, chief technical officer at Sony Electronics.
• “4K/UHD/HFR/HDR — HEVC H.265 — ATSC 3.0” by Karl Kuhn of Tektronix.
• “Where Is 4K (UHD) Product Used Today — 4K Versus HFR — 4K and HFR Challenges” by Bruce Lane of Grass Valley.
• “Using MESH Networks” by Al Kornak of JVC Kenwood Corporation.
• “IP in Infrastructure-Building (Replacing HD-SDI Systems and Accommodating UHD)” by Paul Briscoe of Evertz Microsystems.
• “Scripted Versus Live Production Requirements” by Michael Bergeron of Panasonic.
• “The Transition from SDI to IP, Including IP Infrastructure and Monitoring” by John Shike of SAM (formerly Snell/Quantel).
• “8K, High-Dynamic-Range, OLED, Flexible Displays” by consultant Peter Putman.
• “HDR: The Great, the Okay, and the WTF” by Mark Schubin, engineer-in-charge at the Metropolitan Opera, Sesame Street and Great Performances (PBS).

The program will conclude with a panel discussion by the program’s presenters.

No RSVP is required, and both SMPTE members and non-members are welcome.


NAB 2016 from an EP’s perspective

By Tara Holmes

Almost two weeks ago, I found myself at NAB for the first time. I am the executive producer of color and finishing at Nice Shoes, a post production studio in New York City. I am not an engineer and I am not an artist, so why would an EP go to NAB? I went because one of my main goals for 2016 is to make sure the studio remains at the forefront of technology. While I feel that our engineering team and artists represent us well in that respect, I wanted to make sure that I, along with our producers, were fully educated on these emerging technologies.

One of our first priorities for NAB was to meet with top monitor manufacturers to determine which UHD HDR monitors would meet our standards for professional client viewing. We came to the conclusion that the industry is not there yet and that we have more research to do before we upgrade our studio viewing environments.

Everyone with me was in agreement: the monitors aren’t where they need to be. Most are only outputting around 400-800 nits and exhibit luminance and contrast issues. None of this should stop the process of coloring for HDR. As the master monitor for the colorist, the Sony BVM-X300 OLED, which we are currently using, seems to be the ideal choice, as you can still work in traditional Rec 709 as well as Rec 2020 for HDR.

After checking out some monitors, we headed to the FilmLight booth to go over the 5.0 upgrades to Baselight. Our colorist Ron Sudul, along with Nice Shoes Creative Studio VFX supervisor Adrian Winter, sat with me and the FilmLight reps to discuss the upgrades, which included incredible new isolation tracking capabilities. These upgrades will reinvent what can be achieved in the color suite: from realtime comps to retouch being done in color. The possibilities are exciting.

I also spent time learning about the upgrades to FilmLight’s Flip, its on-set color hardware. The Flip allows you to develop your color look on set, apply it during the edit process (with the Baselight plug-in for Avid) and refine it in final color, all without affecting your RAW files. In addition to the Flip, FilmLight has developed Prelight, software that supports on-set look development and grading. I asked if these new technologies could enable us to do high-end things like sky replacements on set and was told that the hardware within the Flip very well could.

We also visited our friends at DFT, the manufacturers of the Scanity film scanner, to catch up and discuss the business of archiving. With Scanity, Nice Shoes can scan at 4K where other scanners only scan up to 2K resolution. This is a vital tool not only for preserving past materials, but for future-proofing against emerging formats when archiving scans from film.

VR
On Sunday evening, before the exhibits opened, we attended a panel on VR hosted by the Foundry. At this event we got to experience a few of the most talked-about VR projects, including Defrost, one of the first narrative VR films, from Grease director Randal Kleiser, who was on the panel along with moderator Morris May (CEO/founder, Specular Theory), Bryn Mooser (co-founder, RYOT), Tim Dillon (executive producer, MPC) and Jake Black (head of VR, Create Advertising).

The Foundry’s VR panel.

The panel inspired me to delve deeper into the VR world, and on Wednesday I spent most of my last day exploring the Virtual & Augmented Reality Pavilion. In addition to seeing the newest VR camera rig offerings, experiencing a live VR feed and demoing the Samsung Gear, I explored viewing options for the color workflow. Some people I spoke to said that multiple Oculus setups attached to a single feed were the way to go for a color workflow, but another option we explored very preliminarily was the “dome,” which offers a focused 180-degree view so that everyone involved can comment on the same section of a VR scene and be sure they are experiencing and viewing the same thing at the same time.

HDR Workflow
Another panel we attended was about HDR workflows. Nice Shoes has already had the opportunity to work on HDR material and has begun to develop workflows for this emerging medium. Most HDR deliverables are for episodic and long-form work for companies such as Netflix, Hulu and the like. It may be some time before commercial clients are requesting HDR deliverables, but the workflows will be much the same, so the development being done now is extremely valuable.

My biggest takeaway was that there are still no set standards. There’s Dolby Vision vs. HDR10 vs. PQ vs. others. But it appears that everyone agrees standards are not needed right now: we need to get tools into the hands of the artists and figure out what works best, and standards will come out of that. The good news is that we appear to be future-proofed for the standard to change. For the most part, every camera we are shooting on is capturing for HDR, and should standards change, say from 1,000 nits to 10,000 nits, the footage and process are still there to go back in and color for the new request.

Summing Up
I truly believe my time spent at NAB has prepared me for the myriad questions that will be put forth throughout the year and will help us develop our workflows to evolve the creative process of post. I’ll be sure to be there again next year in order to prepare myself for the questions of 2017 and beyond.

Our Main Image: The view walking into the South Hall Lower at the LVCC.