Tag Archives: HDR

Canon targets HDR with EOS C200, C200B cinema cameras

Canon has grown its Cinema EOS line of pro cinema cameras with the EOS C200 and EOS C200B. These new offerings target filmmakers and TV productions. They offer two 4K video formats — Canon’s new Cinema RAW Light and MP4 — and are optimized for those interested in shooting HDR video.

Alongside a newly developed dual Digic DV6 image processing system, Canon’s Dual Pixel CMOS AF system and improved operability for pros, these new cameras are built for capturing 4K video across a variety of production applications.

Based on feedback from Cinema EOS users, these new offerings will be available in two configurations, while retaining the same core technologies within. The Canon EOS C200 is a production-ready solution that can be used right out of the box, accompanied by an LCD monitor, LCD attachment, camera grip and handle unit. The camera also features a 1.77 million-dot OLED electronic viewfinder (EVF). For users who need more versatility and the ability to craft custom setups tailored to their subject or environment, the C200B offers cinematographers the same camera without these accessories and the EVF, optimizing it for shooting on a gimbal, drone or a variety of other rigs.

Canon’s Peter Marr was at Cine Gear demo-ing the new cameras.

New Features
Both cameras feature the same 8.85MP CMOS sensor that combines with a newly developed dual Digic DV6 image processing system to help process high-resolution image data and record video from full HD (1920×1080) and 2K (2048×1080) to 4K UHD (3840×2160) and 4K DCI (4096×2160). A core staple of the third-generation Cinema EOS system, this new processing platform offers wide-ranging expressive capabilities and improved operation when capturing high-quality HDR video.

The combination of the sensor and the newly developed processing system also enables two new 4K file formats designed to help optimize workflow and make 4K and HDR recording more accessible to filmmakers. Cinema RAW Light, available in 4K 60p/50p at 10-bit and 30p/25p/24p at 12-bit, allows users to record internally to a CFast card by cutting data size to about one-third to one-fifth of a Cinema RAW file, without losing grading flexibility. Thanks to the reduced file size, users get easier post processing while retaining rich dynamic range and true 4K quality. Alongside recording to a CFast card, proxy data (MP4) can also be simultaneously recorded to an SD card for use in offline editing.

Additionally, filmmakers will be able to record 4K UHD in MP4 format to SD cards at 60/50/30/25/24p at 8-bit. Support for UHD recording allows for use in cinema and broadcasting applications, or in scenarios where long recording times are needed while still maintaining top image quality. The digital cinema cameras also offer slow-motion full-HD recording at up to 120fps.

The Canon EOS C200 and Canon EOS C200B feature Innovative Focus Control that helps assist with 4K shooting that demands precise focusing, whether from single or remote operation. According to Canon, its Dual Pixel CMOS AF technology helps to expand the distance of the subject area to enable faster focus during 4K video recording. This also allows for highly accurate continuous AF and face detection AF when using EF lenses. For 4K video opportunities that call for precise focus accuracy that can’t be checked on an HD monitor, users can also take advantage of the LCD Monitor LM-V1 (supplied with the EOS C200 camera), which provides intuitive touch focusing support to help filmmakers achieve sophisticated focusing even as a single operator.

In addition to these features, the cameras offer:
• Oversampling HD processing: enhances sensitivity and helps minimize noise
• Wide DR Gamma: helps reduce overexposure by retaining continuity with a gamma curve
• ISO 100-102400 and 54dB gain: high quality in both low-sensitivity and low-light environments
• In-camera ND filter: internal ND unit allows cleaning of glass for easier maintenance
• ACESproxy support: delivers standardized color space in images, helping to improve efficiency
• Two SD card slots and one CFast card slot for internal recording
• Improved grip and Cinema-EOS-system-compatible attachment method
• Support for Canon Cine-Servo and EF cinema lenses

Editing and grading of the Cinema RAW Light video format will be supported in Blackmagic Resolve. Editing will also be possible in Avid Media Composer, using a Canon RAW plugin for Avid Media Access. This format can also be processed using the Canon application, Cinema RAW Development.

Adobe’s Premiere Pro CC will also support this format by the end of 2017. Editing will be possible in Apple’s Final Cut Pro X as well, using the Canon RAW Plugin for Final Cut Pro X, in the second half of this year.

The Canon EOS C200 and EOS C200B are scheduled to be available in August for estimated retail prices of $7,499 and $5,999, respectively. The EOS C200 comes equipped with additional accessories including the LM-V1 LCD monitor, LA-V1 LCD attachment, GR-V1 camera grip and HDU-2 handle unit. Available in September, these accessories will also be sold separately.

Bluefish444 supports Adobe CC and 4K HDR with Epoch card

Bluefish444 Epoch video audio and data I/O cards now support the advanced 4K high dynamic range (HDR) workflows offered in the latest versions of the Adobe Creative Cloud.

Epoch SDI and HDMI solutions are suited for Adobe’s Premiere Pro CC, After Effects CC, Audition CC and other tools that are part of the Creative Cloud. With GPU-accelerated performance for emerging post workflows, including 4K HDR and video over IP, Adobe and Bluefish444 are providing a strong option for pros.

Bluefish444’s Adobe Mercury Transmit support for Adobe Creative Cloud brings improved performance in demanding workflows requiring realtime video I/O from UHD and 4K HDR sequences.

Bluefish444 Epoch video card support adds:
• HD/SD SDI input and output
• 4K/2K SDI input and output
• 12/10/8-bit SDI input and output
• 4K/2K/HD/SD HDMI preview
• Quad split 4K UHD SDI
• Two sample interleaved 4K UHD SDI
• 23, 24, 25, 29, 30fps video input and output
• 48, 50, 59, 60fps video input and output
• Dual-link 1.5Gbps SDI
• 3Gbps level A & B SDI
• Quad link 1.5Gbps and 3Gbps SDI
• AES digital audio
• Analog audio monitoring
• RS-422 machine control
• 12-bit video color space conversions

“Recent updates have enabled performance which was previously unachievable,” reports Tom Lithgow, product manager at Bluefish444. “Thanks to GPU acceleration, and [the] Adobe Mercury Transmit plug-in, Bluefish444 and Adobe users can be confident of smooth realtime video performance for UHD 4K 60fps and HDR content.”

Sony’s offerings at NAB

By Daniel Rodriguez

Sony has always been a company that prioritizes and implements the requests of its customers. It is constantly innovating throughout all aspects of production — from initial capture to display. At NAB 2017, Sony’s goal was to build on the benchmarks the company has set in the past few months.

To reflect that focus, Sony’s NAB booth was organized around four areas: image capture, media solutions, IP Live and HDR (high dynamic range). Sony aimed to demonstrate its ability to anticipate future demands in capture and distribution while introducing firmware updates to many of its existing products to complement those demands.

Cameras
Since Sony provides customers and clients with a path from capture to delivery, it’s natural to start with what’s new for imaging. Having already tackled the prosumer market with the a7S II, a7R II, FS5 and FS7 II, and firmly established its presence in the cinema camera line with the Sony F5, F55 and F65, it’s natural that Sony’s immediate steps weren’t to follow up on those models so soon, but rather to introduce models that fit more specific needs and situations.

The newest Sony camera introduced at NAB was the UMC-S3CA. Sporting the extremely popular sensor from the a7S II, the UMC-S3CA is a 4K interchangeable-lens E-mount camera that is much smaller than its sensor sibling. Its genlock capability lets users monitor, operate and sync many units at a time, something extremely promising for emerging media like VR and 360 video. It boasts an incredible ISO range of 100-409,600 and records 4K UHD internally at 23.98p, 25p and 29.97p in 100Mbps and 60Mbps modes. The size of this particularly small camera is promising for those who love the a7S II but want to employ it in more specific cases, such as crash cams, drones, cranes and sliders.

To complement its current camera line, Sony has released an updated version of its DVF-EL100 electronic viewfinder — the DVF-EL200 (pictured) — which boasts a full 1920×1080 image and is about twice as bright as the previous model. Much like updated versions of Sony’s cameras, this viewfinder’s ergonomics are attributed to the vast input from users of the previous model, something that the company prides itself on. (Our main image shows the F55 with the DVF-EL200 viewfinder.)

Just because Sony is introducing new products doesn’t mean that it has forgotten about older ones, especially those in its camera lines. From prosumer models, like the Sony PXW-Z150 and PXW-FS5, to professional cinema cameras, such as the PMW-F5 and PMW-F55, all are receiving firmware updates in July 2017.

The most notable firmware update for the Z150 is its ability to capture images in HLG (Hybrid Log Gamma) to support an easier HDR capture workflow. The FS5 will also gain HLG capture, in addition to the ability to change the native ISO from 2000 to 3200 when shooting in SLog2 or SLog3, and 120fps capture at 1080p full HD. While many consider the F65 to be Sony’s flagship camera, some consider the F55 to be the more industry-friendly of Sony’s cinema cameras, and Sony backs that up by increasing its high-frame-rate capture in a new firmware update, which will allow the F55 to record at 72, 75, 90, 96 and 100fps in 4K RAW and in the company’s new compressed Extended Original Camera Negative (X-OCN) format.

X-OCN
Sony’s new X-OCN codec continues to be a highlight of the company’s developments, as it boasts an incredible 16-bit depth despite being compressed, and it’s virtually indistinguishable from Sony’s own RAW format. Thanks to its compression, file sizes are roughly 50 percent smaller than 2K 4:3 Arriraw and 4K ProRes 4444 XQ, and 30 percent smaller than F55 RAW. It’s considered an optimal format for HDR content capture. With cameras like the F5 and F55, and smaller alternatives like the FS7 and FS7 II, allowing RAW recording, Sony is offering a nearly indistinguishable alternative that cuts down on storage space and allows more recording time on set.

Speed and Storage
As Sony continues to increase its support for HDR and larger resolutions like 8K, it’s easy to consider the emergence of X-OCN as an introduction of what to expect from Sony in the future.

Despite the introduction of X-OCN being the company’s answer to large file sizes from shooting RAW, Sony still maintains a firm understanding of the need for storage and the read/write speeds that come with such innovations. To that end, Sony has introduced the AXS-AR1 AXS memory and SxS Thunderbolt card reader. Using a Thunderbolt 2 connector, which can be daisy-chained since the reader has two ports, the reader has a theoretical transfer speed of approximately 9.6Gbps, or 1200MBps. Supporting SxS and Sony’s new AXS cards, if one were to download an hour’s worth of true 4K footage at 24fps, shot in X-OCN, it would take only about 2.5 minutes to complete the transfer.
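Those throughput figures are simple to sanity-check. Here is a quick sketch in Python (the 1200MB/s number is the article’s theoretical maximum; sustained real-world offloads will be slower):

```python
# Back-of-the-envelope check of the transfer-time claim above.
READER_MBPS = 1200.0  # AXS-AR1: ~9.6Gbps over Thunderbolt 2 = 1200 MB/s

def transfer_minutes(footage_gb: float, speed_mbps: float = READER_MBPS) -> float:
    """Minutes needed to offload footage at a sustained speed in MB/s."""
    return footage_gb * 1000.0 / speed_mbps / 60.0

# 2.5 minutes at 1200 MB/s implies an hour of 24fps 4K X-OCN is ~180GB:
implied_gb = READER_MBPS * 2.5 * 60.0 / 1000.0
print(implied_gb)                    # 180.0
print(transfer_minutes(implied_gb))  # 2.5
```

Working backward, the quoted 2.5-minute offload implies roughly 180GB per hour of footage, which is consistent with the compression savings described for X-OCN.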

To complement these leaps in storage space and read/write speeds, Sony’s Optical Disc Archive Generation 2 is an optical disc-based storage medium with expandable robotic libraries called PetaSites, which, through the use of 3.3TB Optical Disc Archive cartridges, guarantees a staggering 100-year shelf life. Unlike LTO tapes, which are generally only used a handful of times for storing and retrieving, Sony’s optical discs can be quickly and randomly accessed as needed.

HDR
HDR continues to gain traction in the world of broadcast and cinema. From capture to monitoring, the introduction of HDR has spurred many companies to implement new ways to create, monitor, display and distribute HDR content. As mentioned earlier, Sony is implementing firmware updates in many of its cameras to allow internal HLG, or Instant HDR, capture without the need for color grading, as well as compressed X-OCN RAW recording to allow more complex HDR grading to be possible without the massive amounts of data that uncompressed RAW takes up.

HDR gamma can now be monitored on screens like the Sony FS5’s, as well as on higher-end displays such as Sony’s BVM-E171, BVM-X300/2 and PVM-X550.

IP Live
What stood out about Sony’s mission with HDR is its push to use it in realtime, non-fiction content and broadcasts, like sporting events, through IP Live. The goal is to offer instantaneous conversions that not only output media in 4K HDR and SDR but also deliver full HD HDR and SDR at the same time. With its SR Live system, Sony hopes to update its camera lines with HLG to provide instant HDR, which can be processed through its HDRC-4000 converters. In line with its stated goal of offering full support throughout the production process, Sony has introduced XDCAM Air, an ENG-based cloud service that addresses the growing need for speed to air. XDCAM Air will launch in June 2017.

Managing Files
To round out its production-through-delivery goals, Sony continues with Media Backbone Navigator X, an online content storage and management solution designed to ease the work between capture and delivery. It accepts nearly any file type and allows multiple users to easily search for keywords and even phrases spoken in videos, while streaming at realtime speeds.

Media Backbone Navigator X is designed for productions that create an environment of constant back and forth and will eliminate any excessive deliberation when figuring out storage and distribution of materials.

Sony’s goal at NAB wasn’t to shock or awe but rather to build on an established foundation for current and new clients and customers who are readying for an ever-changing production environment. For Sony, this year’s NAB could be considered preparation for the “upcoming storm” as firmware updates roll out more support for promising formats like HDR.


Daniel Rodriguez is a New York-based cinematographer, photographer and director. Follow him on Instagram: https://www.instagram.com/realdanrodriguez.

A glimpse at what was new at NAB

By Lance Holte

I made the trek out to Las Vegas last week for the annual NAB show to take in the latest in post production technology, discuss new trends and products and get lost in a sea of exhibits. With over 1,700 exhibitors, it’s impossible to see everything (especially in the two days I was there), but here are a handful of notable things that caught my eye.

Blackmagic DaVinci Resolve Studio 14: While the “non-studio” version is still free, it’s hard to beat the $299 license for the full version of Resolve. As 4K and 3D media become increasingly prevalent, and with the release of the Micro and Mini panels, Resolve can be a very affordable solution for editors, mobile colorists and DITs.

The new editorial and audio tools are particularly appealing to someone like me, who is often more hands-on on the editorial side than the grading side of post. In that regard, the new tracking features look to provide extra ease of use for quick and simple grades. I also love that Blackmagic has gotten rid of the dongles, which removes the hassle of tracking numerous dongles in a post environment where systems and rooms are swapped regularly. Oh, and there’s bin, clip and timeline locking for collaborative workflows, which easily pushes Resolve into the competition for an end-to-end post solution.

Adobe Premiere CC 2017 with After Effects and Audition: Adobe Premiere is typically my editorial application of choice, and the increased integration of AE and Audition promises to make an end-to-end Creative Cloud workflow even smoother. I’ve been hoping for a revamp of Premiere’s title tool for a while, and the Essential Graphics panel/new Title Tool appears to greatly increase and streamline Premiere’s motion graphics capabilities — especially for someone like me who does almost all my graphics work in After Effects and Photoshop. The more integrated the various applications can be, the better; and Adobe has been pushing that aspect for some time now.

On the audio side, Premiere’s Essential Sound Panel tools for volume matching, organization, cleanup and other effects without going directly into Audition (or exporting for ProTools, etc.) will be really helpful, especially for smaller projects and offline mixes. And as a last note, the new Camera Shake Deblur effect in After Effects is fantastic.

Dell UltraSharp 4K HDR Monitor — There were a lot of great-looking HDR monitors at the show, but I liked that this one fell in the middle of the pack in terms of price point ($2K), with solid specs (1,000 nits, 97.7% of P3 and 76.9% of Rec. 2020) and a reasonable size (27 inches). It seems like a good editorial or VFX display solution, though the price might push budgetary constraints for smaller post houses. I wish it were DCI 4K instead of UHD and a little more affordable, but that will hopefully come with time.

On that note, I really like HP’s DreamColor Z31x Studio Display. It’s not HDR, but it covers 99% of the P3 color space, and it’s DCI 4K — as well as 2K, by multiplying every pixel of a 2K image into exactly four pixels — so there’s no odd-numbered scaling or sharpening required. Also, I like working with large monitors, especially at high resolutions. It offers automated (and schedulable) color calibration, though I’d love to see a non-automated display in the future if it could bring the price down. I could see the HP monitor as a great alternative to using more expensive HDR displays for the majority of workstations at many post houses.
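That integer relationship is easy to see in code. A minimal sketch in Python with NumPy (illustrative only; the monitor does this scaling in hardware):

```python
import numpy as np

def double_pixels(frame: np.ndarray) -> np.ndarray:
    """Nearest-neighbor 2x upscale: each pixel becomes an exact 2x2 block."""
    return frame.repeat(2, axis=0).repeat(2, axis=1)

frame_2k = np.zeros((1080, 2048, 3), dtype=np.uint8)  # a DCI 2K frame
frame_4k = double_pixels(frame_2k)
print(frame_4k.shape[:2])  # (2160, 4096): exactly DCI 4K, no fractional scaling
```

Because 2048×1080 maps to 4096×2160 by a whole-number factor of two, every source pixel lands on a clean 2×2 block with no resampling artifacts, which is the point the display makes.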

As another side note, Flanders Scientific’s OLED 55-inch HDR display was among the most beautiful I’ve ever seen, but with numerous built-in interfaces and scaling capabilities, it’s likely to come at a higher price.

Canon 4K600STZ 4K HDR laser projector — This looks to be a great projection solution for small screening rooms or large editorial bays. It offers a huge 4096×2400 resolution, is fairly small and compact, and apparently has very few restrictions when it comes to projection angle, which would be nice for a theatrical edit bay (or a really nice home theater). The laser light source is also attractive because it will be low-maintenance. At $63K, it’s at the more affordable end of 4K projector pricing.

Mettle 360 Degree/VR Depth plug-ins: I haven’t worked with a ton of 360-degree media, but I have dealt with the challenges of doing depth-related effects in a traditional single-camera space, so the fact that Mettle is doing depth-of-field effects, dolly effects and depth volumetric effects with 360-degree/VR content is pretty incredible. Plus, their plug-ins are designed to integrate with Premiere and After Effects, which is good news for an Adobe power user. I believe they’re still going to be in beta for a while, but I’m very curious to see how their plug-ins play out.

Finally, in terms of purely interesting tech, Sony’s Bravia 4K acoustic surface TVs are pretty wild. Their displays are OLED, so they look great, and the fact that the screen vibrates to create sound instead of having separate speakers or an attached speaker bar is awfully cool. Even at very close viewing, the screen doesn’t appear to move, though it can clearly be felt vibrating when touched. A vibrating acoustic surface raises some questions about mounting, so it may not be perfect for every environment, but interesting nonetheless.


Lance Holte is an LA-based post production supervisor and producer. He has spoken and taught at such events as NAB, SMPTE, SIGGRAPH and Createasphere. You can email him at lance@lanceholte.com.

MTI Film updates Cortex for V.4, includes Dolby Vision HDR metadata editing

MTI Film is updating its family of Cortex applications and tools, highlighted by Cortex V.4. In addition to legacy features such as a dailies toolset, IMF and AS-02 packaging and up-res algorithms, Cortex V.4 adds DCP packaging (with integrated ACES color support), an extended edit tool and officially certified Dolby Vision metadata editing capabilities.

“We allow users to manipulate Dolby Vision HDR metadata in the same way that they edit segments of video,” says Randy Reck, MTI Film’s director of development. “In the edit tool, they can graphically trim, cut and paste, add metadata to video, analyze new segments that need metadata and adjust parameters within the Dolby Vision metadata on a shot-by-shot basis.”

With the integration of the Dolby Vision ecosystem, Cortex V.4 provides a method for simultaneously QC-ing HDR and SDR versions of a shot with Dolby Vision metadata. For delivery, the inclusion of the Dolby Vision “IMF-like” output format allows for the rendering and delivery of edited Dolby Vision metadata alongside HDR media in one convenient package.

Cortex V.4’s Edit Tool has been updated to include industry-standard trimming and repositioning of edited segments within the timeline through a drag-and-drop function. The entire look of the Edit Tool (available in the Dailies and Enterprise editions of Cortex) has also been updated to accommodate a new dual-monitor layout, making it easier for users to scrub through media in the source monitor while keeping the composition in context in the record monitor.

MTI Film is also offering a new subscription-based DIT+ edition of Cortex. “It doesn’t make sense for productions to purchase a full software package if their schedule includes a hiatus when it won’t be used,” explains Reck.

DIT+ contains all aspects of the free DIT version of Cortex with the added ability to render HD ProRes, DNx and H.264 files for delivery. A DIT+ subscription starts at $95 per month, and MTI Film is offering a special NAB price of $595 for the first year.

Assimilate Scratch and Scratch VR Suite upgraded to V.8.6

Assimilate is now offering an open beta for Scratch 8.6 and the Scratch VR Suite 8.6, the latest versions of its realtime post tools and workflow for VR/360 and 2D/3D content, covering dailies through conform, grading, compositing and finishing. Expanded HDR functions are featured throughout the product line, including in Scratch VR, which now offers stitching capabilities.

Both open beta versions give pros the opportunity to actively use the full suite of Scratch and Scratch VR tools while evaluating and submitting requests and recommendations for additional features or updates.

Scratch Web for cloud-based, realtime review and collaboration, and Scratch Play for immediate review and playback, are also included in the ecosystem updates. Both products support VR/360 and 2D/3D content.

Current users of the Scratch VR Suite 8.5 and Scratch Finishing 8.5 can download the Scratch 8.6 open beta. Scratch 8.6 open beta and the Scratch VR Suite open beta are available now.

“V8.6 is a major update for both Scratch and the Scratch VR Suite with significant enhancements to the HDR and ACES workflows. We’ve added stitching to the VR toolset so that creators have a complete and streamlined end-to-end VR workflow,” says Jeff Edson, CEO at Assimilate. “The open Beta helps us to continue developing the best and most useful post production features and techniques all artists need to perfect their creativity in color grading and finishing. We act on all input, much of it immediately and some in regular updates.”

Here are some details of the update:

HDR
• PQ and HLG transfer functions are now an integral part of Scratch color management.
• Scopes automatically switch to HDR mode when needed, show levels on a nit scale and highlight any reference level you set.
• At the project level, define the HDR mastering metadata: color space, color primaries and white levels, luminance levels and more. The metadata is automatically included in the Video HDMI interface (AJA, BMD, Bluefish444) for display.
• A built-in function calculates static luminance metadata such as MaxCLL and MaxFALL.
• HDR footage can be published directly to YouTube with HDR metadata.
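For context, MaxCLL is the brightest pixel anywhere in the program, and MaxFALL is the brightest per-frame average light level. A simplified sketch in Python (it assumes per-pixel luminance is already expressed in nits; real encoders first derive luminance from RGB per CTA-861.3 before taking these statistics):

```python
import numpy as np

def max_cll_fall(frames):
    """MaxCLL: peak pixel luminance; MaxFALL: peak frame-average luminance."""
    max_cll = max(float(f.max()) for f in frames)
    max_fall = max(float(f.mean()) for f in frames)
    return max_cll, max_fall

frames = [np.full((4, 4), 100.0), np.full((4, 4), 400.0)]
frames[0][0, 0] = 1000.0  # one specular highlight in an otherwise dim frame
print(max_cll_fall(frames))  # (1000.0, 400.0)
```

Note how a single highlight drives MaxCLL to 1000 nits while MaxFALL stays at the brighter frame's 400-nit average, which is why the two values are reported separately.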

VR/360 – Scratch VR Suite
• 360 stitching functionality: load all of your source media from your 360 cameras into Scratch VR and combine it into a single equirectangular image. Includes support for camera stitch templates: AutoPano projects, Hugin and PTStitch scripts.
• Ambisonic Audio: Scratch VR can load, set and playback ambisonic audio files to complete the 360 immersive experience.
• Video with 360 sound can be published directly to YouTube 360.
• Additional overlay handles have been added to the existing 2D-equirectangular feature for easier positioning of 2D elements in a 360 scene.

DIT Reporting Function
• Create a report of all clips of either a timeline, a project or just a selection of shots.
• Reports include metadata, such as a thumbnail, clip-name, timecode, scene, take, comments and any metadata attached to a clip.
• Choose from predefined templates or create your own.

Mozart in the Jungle

The colorful dimensions of Amazon’s Mozart in the Jungle

By Randi Altman

How do you describe Amazon’s Mozart in the Jungle? Well, in its most basic form it’s a comedy about the changing of the guard — or maestro — at the New York Philharmonic, and the musicians who make up that orchestra. When you dig deeper, you get a behind-the-scenes look at the back-biting and craziness that go on in the lives and heads of these gifted artists.

Timothy Vincent

Based on the novel Mozart in the Jungle: Sex, Drugs, and Classical Music by oboist Blair Tindall, the series — which won the Golden Globe last year and was nominated this year — has shot in a number of locations over its three seasons, including Mexico and Italy.

Since its inception, Mozart in the Jungle has been finishing in 4K and streaming in both SDR and HDR. We recently reached out to Technicolor’s senior color timer, Timothy Vincent, who has been on the show since the pilot to find out more about the show’s color workflow.

Did Technicolor have to gear up infrastructure-wise for the show’s HDR workflow?
We were doing UHD 4K already and were just getting our HDR workflows worked out.

What is the workflow from offline to online to color?
The dailies are done in New York based on the Alexa K1S1 709 LUT. (Technicolor On-Location Services handled dailies out of Italy, and Technicolor PostWorks in New York.) After the offline and online, I get the offline reference made with the dailies so I can refer to it if I have a question about what was intended.

If someone was unsure about watching in HDR versus SDR, what would you tell them?
The emotional feel of both the SDR and the HDR is the same. That is always the goal in the HDR pass for Mozart. One of the experiences that is enhanced in the HDR is the depth of field and the three-dimensional quality you gain in the image. This really plays nicely with the feel in the landscapes of Italy, the stage performances where you feel more like you are in the audience, and the long streets of New York just to name a few.

When I’m grading the HDR version, I’m able to retain more highlight detail than I was in the SDR pass. For someone who has not yet been able to experience HDR, I would actually recommend that they watch an episode of the show in SDR first and then in HDR so they can see the difference between them. At that point they can choose what kind of viewing experience they want. I think that Mozart looks fantastic in both versions.

What about the “look” of the show? What kind of direction were you given?
We established the look of the show based on conversations and collaboration in my bay. It has always been a filmic look with soft blacks and yellow warm tones as the main palette for the show. Then we added in a fearlessness to take the story in and out of strong shadows. We shape the look of the show to guide the viewers to exactly the story that is being told and the emotions that we want them to feel. Color has always been used as one of the storytelling tools on the show. There is a realistic beauty to the show.

What was your creative partnership like with the show’s cinematographer, Tobias Datum?
I look forward to each episode and to discovering what Tobias has given me as palette and mood for each scene. For Season 3, we picked up where we left off at the end of Season 2. We had established the look and feel of the show and only had to account for a large portion of Season 3 being shot in Italy, making sure to convey the different quality of light and the warmth and beauty of Italy. We did this by playing with natural warm skin tones and the contrast of light and shadow he was creating for the different moods and locations. The same can be said for the two episodes shot in Mexico in Season 2. I know now what Tobias likes and can make decisions I’m confident he will like.

From a director and cinematographer’s point of view, what kind of choices does HDR open up creatively?
It depends on if they want to maintain the same feel of the SDR or if they want to create a new feel. If they choose to go in a different direction, they can accentuate the contrast and color more with HDR. You can keep more low-light detail while being dark, and you can really create a separate feel to different parts of the show… like a dream sequence or something like that.

Any workflow tricks/tips/trouble spots within the workflow or is it a well-oiled machine at this point?
I have actually changed the way I grade my shows based on the evolution of this show. My end results are the same, but I learned how to build grades that translate to HDR much more easily and consistently.

Do you have a color assistant?
I have a couple of assistants that I work with who help me with prepping the show, getting proxies generated, color tracing and some color support.

What tools do you use — monitor, software, computer, scope, etc.?
I am working on Autodesk Lustre 2017 on an HP Z840, while monitoring on both a Panasonic CZ950 and a Sony X300. I work on Omnitek scopes off the downconverter to 2K. The show is shot on both Alexa XT and Alexa Mini, framing for 16×9. All finishing is done in 4K UHD for both SDR and HDR.

Anything you would like to add?
I would only say that everyone should be open to experiencing both SDR and HDR and giving themselves that opportunity to choose which they want to watch and when.

New England SMPTE holding free session on UHD/HDR/HFR, more

The New England Section of SMPTE is holding a free day-long “New Technologies Boot Camp” that focuses on working with high resolution (UHD, 4K and beyond), high-dynamic-range (HDR) imaging and higher frame rates (HFR). In addition, they will discuss how to maintain resolution independence on screens of every size, as well as how to leverage IP and ATSC 3.0 for more efficient movement of this media content.

The boot camp will run from 9am to 9pm on May 19 at the Holiday Inn in Dedham, Massachusetts.

“These are exciting times for those of us working on the technical side of broadcasting, and the array of new formats and standards we’re facing can be a bit overwhelming,” says Martin P. Feldman, chair of SMPTE New England Section. “No one wants — or can afford — to be left behind. That’s why we’re gathering some of the industry’s foremost experts for a free boot camp designed to bring engineers up to speed on new technologies that enable more efficient creation and delivery of a better broadcast product.”

Boot camp presentations will include:

• “High-Dynamic-Range and Wide Color Gamut in Production and Distribution” by Hugo Gaggioni, chief technical officer at Sony Electronics.
• “4K/UHD/HFR/HDR — HEVC H.265 — ATSC 3.0” by Karl Kuhn of Tektronix.
• “Where Is 4K (UHD) Product Used Today — 4K Versus HFR — 4K and HFR Challenges” by Bruce Lane of Grass Valley.
• “Using MESH Networks” by Al Kornak of JVC Kenwood Corporation.
• “IP in Infrastructure-Building (Replacing HD-SDI Systems and Accommodating UHD)” by Paul Briscoe of Evertz Microsystems.
• “Scripted Versus Live Production Requirements” by Michael Bergeron of Panasonic.
• “The Transition from SDI to IP, Including IP Infrastructure and Monitoring” by John Shike of SAM (formerly Snell/Quantel).
• “8K, High-Dynamic-Range, OLED, Flexible Displays” by consultant Peter Putman.
• “HDR: The Great, the Okay, and the WTF” by Mark Schubin, engineer-in-charge at the Metropolitan Opera, Sesame Street and Great Performances (PBS).

The program will conclude with a panel discussion by the program’s presenters.

No RSVP is required, and both SMPTE members and non-members are welcome.

NAB 2016 from an EP’s perspective

By Tara Holmes

Almost two weeks ago, I found myself at NAB for the first time. I am the executive producer of color and finishing at Nice Shoes, a post production studio in New York City. I am not an engineer and I am not an artist, so why would an EP go to NAB? I went because one of my main goals for 2016 is to make sure the studio remains at the forefront of technology. While I feel that our engineering team and artists represent us well in that respect, I wanted to make sure that our producers and I were fully educated on these emerging technologies.

One of our first priorities for NAB was to meet with top monitor manufacturers in the hopes of determining which UHD HDR monitors would meet our standards for professional client viewing. We came to the conclusion that the industry is not there yet, and we have more research to do before we upgrade our studio viewing environments.

Everyone with me was in agreement: the monitors aren’t where they need to be. Most output only around 400-800 nits and suffer from luminance and contrast issues. None of this should stop the process of coloring for HDR. As the colorist’s master monitor, the Sony BVM-X300 OLED, which we are currently using, seems to be the ideal choice, as it lets you work in traditional Rec 709 as well as Rec 2020 for HDR.

After checking out some monitors, we headed to the FilmLight booth to go over the 5.0 upgrades to Baselight. Our colorist Ron Sudul, along with Nice Shoes Creative Studio VFX supervisor Adrian Winter, sat with me and the FilmLight reps to discuss the upgrades, which included incredible new isolation tracking capabilities. These upgrades will reinvent what can be achieved in the color suite, from realtime comps to retouch being done in color. The possibilities are exciting.

I also spent time learning about the upgrades to FilmLight’s Flip, the company’s on-set color hardware. The Flip allows you to develop your color look on set, apply it during your edit process (with the Baselight plug-in for Avid) and refine it in final color, all without affecting your RAW files. In addition to the Flip, they have developed Prelight, software that supports on-set look development and grading. I asked if these new technologies could enable us to do high-end things like sky replacements on set, and was told that the hardware within the Flip very well could.

We also visited our friends at DFT, the manufacturers of the Scanity film scanner, to catch up and discuss the business of archiving. With Scanity, Nice Shoes can scan film at 4K, while many other scanners top out at 2K. This is a vital tool not only for preserving past materials, but also for future-proofing archival scans for emerging formats.

VR
On Sunday evening before the exhibits opened, we attended a panel on VR that was hosted by the Foundry. At this event we got to experience a few of the most talked about VR projects including Defrost, one of the first narrative VR films, from the director of Grease, Randal Kleiser, who was on the panel along with moderator Morris May (CEO/founder, Specular Theory), Bryn Mooser (co-founder, RYOT), Tim Dillon (executive producer, MPC) and Jake Black (head of VR, Create Advertising).

The Foundry’s VR panel.

The panel inspired me to delve deeper into the VR world, and on Wednesday I spent most of my last day exploring the Virtual & Augmented Reality Pavilion. In addition to seeing the newest VR camera rig offerings, experiencing a live VR feed and demo-ing the Samsung Gear, I explored viewing options for the color workflow. Some people I spoke to said that multiple Oculus set-ups attached to a single feed were the way to go for the color workflow, but another option we explored in a very preliminary way was the “dome,” which offers a focused 180-degree view so that everyone involved can comment on the same section of a VR scene. This would ensure that all involved are experiencing and viewing the same thing at the same time.

HDR Workflow
Another panel we attended was about HDR workflows. Nice Shoes has already had the opportunity to work on HDR material and has begun to develop workflows for this emerging medium. Most HDR deliverables are for episodic and long-form work for companies such as Netflix, Hulu and the like. It may be some time before commercial clients request an HDR deliverable, but the workflows will be much the same, so the development being done now is extremely valuable.

My biggest takeaway was that there are still no set standards. There’s Dolby Vision vs. HDR10 vs. PQ vs. others. But it appears that everyone agrees standards are not needed right now. We need to get tools into the hands of the artists and figure out what works best; standards will come out of that. The good news is that we appear to be future-proofed should the standard change. For the most part, every camera we are shooting on is capturing for HDR, and should standards change — say, from 1,000 nits to 10,000 nits — the footage and process are still there to go back in and color for the new request.
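As an aside, the relationship between those nit levels and PQ signal values can be sketched in a few lines. This is my own illustration of the SMPTE ST 2084 (PQ) inverse EOTF, not something presented at the panel; the function name is mine:

```python
def pq_encode(nits):
    """SMPTE ST 2084 inverse EOTF: absolute luminance (cd/m^2)
    mapped to a normalized PQ signal in [0, 1]."""
    m1 = 2610 / 16384        # 0.1593017578125
    m2 = 2523 / 4096 * 128   # 78.84375
    c1 = 3424 / 4096         # 0.8359375
    c2 = 2413 / 4096 * 32    # 18.8515625
    c3 = 2392 / 4096 * 32    # 18.6875
    y = nits / 10000.0       # PQ is defined up to 10,000 cd/m^2
    yp = y ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

# 10,000 nits fills the entire signal range, while a 1,000-nit peak
# already sits at roughly 75% of it -- the curve leaves encoded
# headroom that a regrade for a brighter standard can reach into.
print(pq_encode(10000))           # 1.0
print(round(pq_encode(1000), 3))  # 0.752
```

This is why footage graded today is not stranded if the nit target moves: the signal above a 1,000-nit grade is headroom, not missing data.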

Summing Up
I truly believe my time spent at NAB has prepared me for the myriad of questions that will be put forth throughout the year and will help us develop our workflows to evolve the creative process of post. I’ll be sure to be there again next year in order to prepare myself for the questions of 2017 and beyond.

Our Main Image: The view walking into the South Hall Lower at the LVCC.

Quick Chat: SGO CEO Miguel Angel Doncel

By Randi Altman

When I first happened upon Spanish company SGO, they were giving demos of their Mistika system on a small stand in the back of the post production hall at IBC. That was about eight years ago. Since then, the company has grown its Mistika DI finishing system, added a new product called Mamba FX, and brought them both to the US and beyond.

With NAB fast approaching, I thought I would check in with SGO CEO Miguel Angel Doncel to find out how the company began, where they are now and where they are going. I also checked in about some industry trends.

Can you talk about the genesis of your company and the Mistika product?
SGO was born out of a technically oriented mentality to find the best ways to use open architectures and systems to improve media content creation processes. That is not a challenging concept today, but it was an innovative view in 1993 when most of the equipment used in the industry was proprietary hardware. The idea of using computers to replace proprietary solutions was the reason SGO was founded.

It seems you guys were ahead of the curve in terms of one product that could do many things. Was that your goal from the outset?
Ten years ago, most manufacturers approached the industry with a set of different solutions addressing different parts of the workflow. This gave us an opportunity to capitalize on improving the workflow, as disjointed solutions imply inefficient workflows because of their linear, sequential nature.

We always thought that by improving the workflow, our technology would be able to play in all those arenas without the user having to change tools. Making the workflow parallel saves time when a problem is detected: instead of going backwards in the pipeline, we can keep moving forward.

I think after so many years, the industry is saying we were right, and all are going in that direction.

How is SGO addressing HDR?
We are excited about HDR, as it really improves the visual experience, but at the same time it is a big challenge to define a workflow that can work in both HDR and SDR in a smooth way. Our solution to that challenge is the four-dimensional grading that is implemented with our 4th ball. This allows the colorist to work not only in the three traditional dimensions — R, G and B — but also to work in the highlights as a parallel dimension.

What about VR?
VR combines all the requirements of the most demanding stereo 3D with the requirements of 360-degree video. Considering what SGO already offers in stereo 3D production, we feel we are well positioned to provide a 360/VR solution. For that reason, we want to introduce a specific workflow for VR that helps customers work on VR projects, addressing the most difficult requirements, such as discontinuities at the poles, or dealing with shapes.

The new VR mode we are preparing for Mistika 8.7 will be much more than a VR visualization tool. It will allow users to work in VR environments the same way they would in a normal production, without having to worry about circles ending up as highly distorted ellipses and so forth.
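The “distorted ellipses” come from the equirectangular projection commonly used to flatten 360 footage: pixels are stretched horizontally by a factor of 1/cos(latitude), so a circle near a pole becomes a wide ellipse in the flat frame. A minimal sketch of that stretch factor, as my own illustration rather than anything from SGO’s code:

```python
import math

def equirect_stretch(latitude_deg):
    """Horizontal stretch factor applied to a point at the given
    latitude when a sphere is unwrapped to an equirectangular frame."""
    return 1.0 / math.cos(math.radians(latitude_deg))

# No distortion at the equator; at 60 degrees latitude a circle is
# already twice its original width; near the poles the factor blows up.
print(equirect_stretch(0))   # 1.0
print(equirect_stretch(60))  # ~2.0
```

A VR-aware tool has to compensate for this stretch so that shapes, tracking windows and brush strokes behave the same at any latitude.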

What do you see as the most important trends happening in post and production currently?
The industry is evolving in many different directions at the moment — 8K realtime, 4K/UHD, HDR, HFR, dual-stream stereo/VR. These innovations improve and enhance the audience’s experience in many different ways. They are all interesting individually, but the most vital aspect for us is that they all have something in common: each requires a very smart way of dealing with increasing bandwidth. We believe that a variety of content will use different types of innovation relevant to the genre.
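To put a rough number on that bandwidth pressure, here is a back-of-the-envelope sketch (my own illustration, not from SGO; the function name and parameters are mine) of the uncompressed data rate for one of the formats mentioned:

```python
def uncompressed_gbps(width, height, fps, bits_per_sample, samples_per_pixel=3):
    """Raw, uncompressed video data rate in gigabits per second."""
    return width * height * fps * bits_per_sample * samples_per_pixel / 1e9

# UHD (3840x2160) at 60 fps, 10-bit RGB/4:4:4, before any compression
print(round(uncompressed_gbps(3840, 2160, 60, 10), 2))  # 14.93
```

Roughly 15 Gb/s for a single UHD HFR stream, before HDR metadata or a second eye for stereo/VR is even considered, which is why every one of these formats ultimately becomes a bandwidth problem.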

Where do you see things moving in the future?
I personally envision a lot more UHD, HDR and VR material in the near future. The technology is evolving in a direction that can really make the entertainment experience very special for audiences, leaving a lot of room to still evolve. An example is the Quantum Break game from Remedy Studios/Microsoft, where the actual users’ experience is part of the story. This is where things are headed.

I think the immersive aspect is the challenge and goal. The reason why we all exist in this industry is to make people enjoy what they see, and all these tools and formulas combined together form a great foundation on which to build realistic experiences.