Tag Archives: HDR

Assimilate Scratch and Scratch VR Suite upgraded to v8.6

Assimilate is now offering an open beta for Scratch 8.6 and the Scratch VR Suite 8.6, the latest versions of its realtime post tools and workflow for VR/360 and 2D/3D content, covering everything from dailies to conform, grading, compositing and finishing. Expanded HDR functions are featured throughout the product line, including in Scratch VR, which now offers stitching capabilities.

Both open beta versions give pros the opportunity to actively use the full suite of Scratch and Scratch VR tools while evaluating them and submitting requests and recommendations for additional features or updates.

Scratch Web for cloud-based, realtime review and collaboration, and Scratch Play for immediate review and playback, are also included in the ecosystem updates. Both products support VR/360 and 2D/3D content.

Current users of the Scratch VR Suite 8.5 and Scratch Finishing 8.5 can download the Scratch 8.6 open beta; both the Scratch 8.6 and Scratch VR Suite 8.6 open betas are available now.

“V8.6 is a major update for both Scratch and the Scratch VR Suite, with significant enhancements to the HDR and ACES workflows. We’ve added stitching to the VR toolset so that creators have a complete and streamlined end-to-end VR workflow,” says Jeff Edson, CEO at Assimilate. “The open beta helps us to continue developing the best and most useful post production features and techniques all artists need to perfect their creativity in color grading and finishing. We act on all input, much of it immediately and some in regular updates.”

Here are some details of the update:

HDR
• PQ and HLG transfer functions are now an integral part of Scratch color management.
• Scopes automatically switch to HDR mode when needed, show levels on a nit scale and highlight any reference level you set.
• At the project level, define the HDR mastering metadata: color space, color primaries and white levels, luminance levels and more. The metadata is automatically included in the Video HDMI interface (AJA, BMD, Bluefish444) for display.
• In addition to static metadata, Scratch can calculate dynamic luminance metadata such as MaxCLL and MaxFALL.
• HDR footage can be published directly to YouTube with HDR metadata.
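
The static mastering metadata above describes the display the content was graded on, while MaxCLL and MaxFALL summarize the content itself. As a rough illustration of how those two values are derived (a minimal sketch, not Scratch’s implementation — per CTA-861.3, each pixel’s light level is taken as the maximum of its linear R, G and B components, in nits):

```python
def max_cll_fall(frames):
    """Compute MaxCLL/MaxFALL (CTA-861.3) from per-pixel light levels.

    frames: a list of frames, each a list of per-pixel light levels in
    cd/m^2 (nits).  MaxCLL is the brightest pixel anywhere in the
    program; MaxFALL is the highest frame-average light level.
    """
    max_cll = 0.0
    max_fall = 0.0
    for pixels in frames:
        max_cll = max(max_cll, max(pixels))                   # brightest single pixel
        max_fall = max(max_fall, sum(pixels) / len(pixels))   # brightest frame average
    return max_cll, max_fall
```

A real implementation would run over the full image raster after converting code values to linear light, but the two statistics themselves are just this max-of-max and max-of-mean.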

VR/360 – Scratch VR Suite
• 360 stitching functionality: load all your source media from your 360 cameras into Scratch VR and combine it into a single equirectangular image. Camera stitch templates are supported: AutoPano projects, Hugin and PTStitch scripts.
• Ambisonic audio: Scratch VR can load, set and play back ambisonic audio files to complete the 360 immersive experience.
• Video with 360 sound can be published directly to YouTube 360.
• Additional overlay handles extend the existing 2D-equirectangular feature, making it easier to position 2D elements in a 360 scene.
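
Both stitching and 2D-overlay placement rest on the equirectangular projection, in which longitude and latitude map linearly to x and y. A minimal sketch of that mapping (an illustration of the projection math only, not Scratch VR code):

```python
def sphere_to_equirect(lon_deg, lat_deg, width, height):
    """Map a viewing direction (longitude/latitude in degrees) to pixel
    coordinates in a width x height equirectangular image.

    Longitude spans -180..180 across the width; latitude spans -90..90
    down the height, with north (+90) at the top.
    """
    x = (lon_deg + 180.0) / 360.0 * width
    y = (90.0 - lat_deg) / 180.0 * height
    return x, y
```

This mapping is also why shapes distort near the poles: a fixed angular size covers ever more pixels in x as latitude approaches ±90 degrees, so circles render as stretched ellipses there.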

DIT Reporting Function
• Create a report of all clips of either a timeline, a project or just a selection of shots.
• Reports include metadata, such as a thumbnail, clip-name, timecode, scene, take, comments and any metadata attached to a clip.
• Choose from predefined templates or create your own.

Mozart in the Jungle

The colorful dimensions of Amazon’s Mozart in the Jungle

By Randi Altman

How do you describe Amazon’s Mozart in the Jungle? Well, in its most basic form, it’s a comedy about the changing of the guard — or maestro — at the New York Philharmonic, and the musicians who make up that orchestra. When you dig deeper, you get a behind-the-scenes look at the backbiting and craziness that goes on in the lives and heads of these gifted artists.

Timothy Vincent

Based on the novel Mozart in the Jungle: Sex, Drugs, and Classical Music by oboist Blair Tindall, the series — which won the Golden Globe last year and was nominated this year — has shot in a number of locations over its three seasons, including Mexico and Italy.

Since its inception, Mozart in the Jungle has been finishing in 4K and streaming in both SDR and HDR. We recently reached out to Technicolor’s senior color timer, Timothy Vincent, who has been on the show since the pilot, to find out more about the show’s color workflow.

Did Technicolor have to gear up infrastructure-wise for the show’s HDR workflow?
We were doing UHD 4K already and were just getting our HDR workflows worked out.

What is the workflow from offline to online to color?
The dailies are done in New York based on the Alexa K1S1 709 LUT. (Technicolor On-Location Services handled dailies out of Italy, and Technicolor PostWorks handled them in New York.) After the offline and online, I get the offline reference made with the dailies so I can refer to it if I have a question about what was intended.

If someone was unsure about watching in HDR versus SDR, what would you tell them?
The emotional feel of both the SDR and the HDR is the same. That is always the goal in the HDR pass for Mozart. One of the experiences that is enhanced in the HDR is the depth of field and the three-dimensional quality you gain in the image. This really plays nicely with the feel in the landscapes of Italy, the stage performances where you feel more like you are in the audience, and the long streets of New York just to name a few.

When I’m grading the HDR version, I’m able to retain more highlight detail than I was in the SDR pass. For someone who has not yet been able to experience HDR, I would actually recommend that they watch an episode of the show in SDR first and then in HDR so they can see the difference between them. At that point they can choose what kind of viewing experience they want. I think that Mozart looks fantastic in both versions.

What about the “look” of the show? What kind of direction were you given?
We established the look of the show based on conversations and collaboration in my bay. It has always been a filmic look with soft blacks and yellow warm tones as the main palette for the show. Then we added in a fearlessness to take the story in and out of strong shadows. We shape the look of the show to guide the viewers to exactly the story that is being told and the emotions that we want them to feel. Color has always been used as one of the storytelling tools on the show. There is a realistic beauty to the show.

What was your creative partnership like with the show’s cinematographer, Tobias Datum?
I look forward to each episode and discovering what Tobias has given me as a palette and mood for each scene. For Season 3, we picked up where we left off at the end of Season 2. We had established the look and feel of the show and only had to account for a large portion of Season 3 being shot in Italy, making sure to capture the different quality of light and the warmth and beauty of the country. We did this by playing with natural warm skin tones and the contrast of light and shadow he was creating for the different moods and locations. The same can be said for the two episodes shot in Mexico in Season 2. I now know what Tobias likes and can make decisions I’m confident he will like.

From a director and cinematographer’s point of view, what kind of choices does HDR open up creatively?
It depends on if they want to maintain the same feel of the SDR or if they want to create a new feel. If they choose to go in a different direction, they can accentuate the contrast and color more with HDR. You can keep more low-light detail while being dark, and you can really create a separate feel to different parts of the show… like a dream sequence or something like that.

Any workflow tricks/tips/trouble spots within the workflow or is it a well-oiled machine at this point?
I have actually changed the way I grade my shows based on the evolution of this show. My end results are the same, but I have learned how to build grades that translate to HDR much more easily and consistently.

Do you have a color assistant?
I have a couple of assistants that I work with who help me with prepping the show, getting proxies generated, color tracing and some color support.

What tools do you use — monitor, software, computer, scope, etc.?
I am working on Autodesk Lustre 2017 on an HP Z840, while monitoring on both a Panasonic CZ950 and a Sony X300. I work on Omnitek scopes off the downconverter to 2K. The show is shot on both Alexa XT and Alexa Mini, framing for 16×9. All finishing is done in 4K UHD for both SDR and HDR.

Anything you would like to add?
I would only say that everyone should be open to experiencing both SDR and HDR and giving themselves that opportunity to choose which they want to watch and when.

New England SMPTE holding free session on UHD/HDR/HFR, more

The New England Section of SMPTE is holding a free day-long “New Technologies Boot Camp” that focuses on working with high resolution (UHD, 4K and beyond), high-dynamic-range (HDR) imaging and higher frame rates (HFR). In addition, they will discuss how to maintain resolution independence on screens of every size, as well as how to leverage IP and ATSC 3.0 for more efficient movement of this media content.

The boot camp will run from 9am to 9pm on May 19 at the Holiday Inn in Dedham, Massachusetts.

“These are exciting times for those of us working on the technical side of broadcasting, and the array of new formats and standards we’re facing can be a bit overwhelming,” says Martin P. Feldman, chair of SMPTE New England Section. “No one wants — or can afford — to be left behind. That’s why we’re gathering some of the industry’s foremost experts for a free boot camp designed to bring engineers up to speed on new technologies that enable more efficient creation and delivery of a better broadcast product.”

Boot camp presentations will include:

• “High-Dynamic-Range and Wide Color Gamut in Production and Distribution” by Hugo Gaggioni, chief technical officer at Sony Electronics.
• “4K/UHD/HFR/HDR — HEVC H.265 — ATSC 3.0” by Karl Kuhn of Tektronix.
• “Where Is 4K (UHD) Product Used Today — 4K Versus HFR — 4K and HFR Challenges” by Bruce Lane of Grass Valley.
• “Using MESH Networks” by Al Kornak of JVC Kenwood Corporation.
• “IP in Infrastructure-Building (Replacing HD-SDI Systems and Accommodating UHD)” by Paul Briscoe of Evertz Microsystems.
• “Scripted Versus Live Production Requirements” by Michael Bergeron of Panasonic.
• “The Transition from SDI to IP, Including IP Infrastructure and Monitoring” by John Shike of SAM (formerly Snell/Quantel).
• “8K, High-Dynamic-Range, OLED, Flexible Displays” by consultant Peter Putman.
• “HDR: The Great, the Okay, and the WTF” by Mark Schubin, engineer-in-charge at the Metropolitan Opera, Sesame Street and Great Performances (PBS).

The program will conclude with a panel discussion by the program’s presenters.

No RSVP is required, and both SMPTE members and non-members are welcome.

NAB 2016 from an EP’s perspective

By Tara Holmes

Almost two weeks ago, I found myself at NAB for the first time. I am the executive producer of color and finishing at Nice Shoes, a post production studio in New York City. I am not an engineer and I am not an artist, so why would an EP go to NAB? I went because one of my main goals for 2016 is to make sure the studio remains at the forefront of technology. While I feel that our engineering team and artists represent us well in that respect, I wanted to make sure that I, along with our producers, was fully educated on these emerging technologies.

One of our first priorities for NAB was to meet with top monitor manufacturers in the hope of landing on UHD HDR monitors that would meet our standards for professional client viewing. We came to the conclusion that the industry is not there yet and that we have more research to do before we upgrade our studio viewing environments.

Everyone with me was in agreement: the monitors aren’t where they need to be. Most output only around 400-800 nits and exhibit luminance and contrast issues. None of this should stop the process of coloring for HDR. As the master monitor for the colorist, the Sony BVM-X300 OLED, which we are currently using, seems the ideal choice, as you can still work in traditional Rec 709 as well as Rec 2020 for HDR.

After checking out some monitors, we headed to the FilmLight booth to go over the 5.0 upgrades to Baselight. Our colorist Ron Sudul, along with Nice Shoes Creative Studio VFX supervisor Adrian Winter, sat with me and the FilmLight reps to discuss the upgrades, which included incredible new isolation-tracking capabilities. These upgrades will reinvent what can be achieved in the color suite: from realtime comps to retouching being done in color. The possibilities are exciting.

I also spent time learning about the upgrades to FilmLight’s Flip, the company’s on-set color hardware. The Flip allows you to develop your color look on set, apply it during your edit process (with the Baselight plug-in for Avid) and refine it in final color, all without affecting your RAW files. In addition to the Flip, they have developed Prelight, software that supports on-set look development and grading. I asked if these new technologies could enable us to do high-end work like sky replacements on set and was told that the hardware within the Flip very well could.

We also visited our friends at DFT, the manufacturers of the Scanity film scanner, to catch up and discuss the business of archiving. With Scanity, Nice Shoes can scan at 4K where other scanners only go up to 2K resolution. This is a vital tool not only for preserving past materials, but also for future-proofing film scans for emerging formats.

VR
On Sunday evening before the exhibits opened, we attended a panel on VR hosted by the Foundry. At this event we got to experience a few of the most talked-about VR projects, including Defrost, one of the first narrative VR films, from Grease director Randal Kleiser. Kleiser was on the panel along with moderator Morris May (CEO/founder, Specular Theory), Bryn Mooser (co-founder, RYOT), Tim Dillon (executive producer, MPC) and Jake Black (head of VR, Create Advertising).

The Foundry’s VR panel.

The panel inspired me to delve deeper into the VR world, and on Wednesday I spent most of my last day exploring the Virtual & Augmented Reality Pavilion. In addition to seeing the newest VR camera rigs, experiencing a live VR feed and demoing the Samsung Gear, I explored viewing options for the color workflow. Some people I spoke to mentioned that multiple Oculus set-ups attached to a single feed was the way to go for a color workflow. Another option we explored in a very preliminary way was a “dome,” which offers a focused 180-degree view so that everyone involved can comment on the same section of a VR scene. This would enable all involved to be sure they are experiencing and viewing the same thing at the same time.

HDR Workflow
Another panel we attended was about HDR workflows. Nice Shoes has already had the opportunity to work on HDR material and has begun to develop workflows for this emerging medium. Most HDR deliverables are for episodic and long-form content for companies such as Netflix and Hulu. It may be some time before commercial clients request an HDR deliverable, but the workflows will be much the same, so the development being performed now is extremely valuable.

My biggest takeaway was that there are still no set standards. There’s Dolby Vision vs. HDR10 vs. PQ vs. others. But it appears that everyone agrees standards are not needed right now. We need to get tools into the hands of the artists and figure out what works best; standards will come out of that. The good news is that we appear to be future-proofed against the standard changing. For the most part, every camera we are shooting on is capturing for HDR, and should standards change — say, from 1,000 nits to 10,000 nits — the footage and process are still there to go back in and color for the new request.

Summing Up
I truly believe my time spent at NAB has prepared me for the myriad questions that will be put forth throughout the year and will help us develop our workflows to evolve the creative process of post. I’ll be sure to be there again next year to prepare myself for the questions of 2017 and beyond.

Our Main Image: The view walking into the South Hall Lower at the LVCC.

Quick Chat: SGO CEO Miguel Angel Doncel

By Randi Altman

When I first happened upon Spanish company SGO, they were giving demos of their Mistika system on a small stand in the back of the post production hall at IBC. That was about eight years ago. Since then, the company has grown its Mistika DI finishing system, added a new product called Mamba FX, and brought them both to the US and beyond.

With NAB fast approaching, I thought I would check in with SGO CEO Miguel Angel Doncel to find out how the company began, where they are now and where they are going. I also checked in about some industry trends.

Can you talk about the genesis of your company and the Mistika product?
SGO was born out of a technically oriented mentality to find the best ways to use open architectures and systems to improve media content creation processes. That is not a challenging concept today, but it was an innovative view in 1993 when most of the equipment used in the industry was proprietary hardware. The idea of using computers to replace proprietary solutions was the reason SGO was founded.

It seems you guys were ahead of the curve in terms of one product that could do many things. Was that your goal from the outset?
Ten years ago, most manufacturers approached the industry with a set of different solutions addressing different parts of the workflow. This gave us an opportunity to capitalize on improving the workflow, as disjointed solutions imply inefficient workflows due to their linear, sequential nature.

We always thought that by improving the workflow, our technology would be able to play in all those arenas without having to change tools. Making the workflow parallel saves time when a problem is detected: instead of going backwards in the pipeline, we can focus on moving forward.

I think after so many years, the industry is saying we were right, and all are going in that direction.

How is SGO addressing HDR?
We are excited about HDR, as it really improves the visual experience, but at the same time it is a big challenge to define a workflow that works smoothly in both HDR and SDR. Our solution to that challenge is four-dimensional grading, implemented with our 4th ball. This allows the colorist to work not only in the three traditional dimensions — R, G and B — but also in the highlights as a parallel dimension.

What about VR?
VR combines all the requirements of the most demanding stereo 3D work with the requirements of 360 video. Considering what SGO already offers in stereo 3D production, we feel we are well positioned to provide a 360/VR solution. For that reason, we want to introduce a specific workflow for VR that helps customers work on VR projects while addressing the most difficult requirements, such as discontinuities at the poles or dealing with shapes.

The new VR mode we are preparing for Mistika 8.7 will be much more than a VR visualization tool. It will allow users to work in VR environments the same way they would in a normal production, without having to worry about circles ending up as highly distorted ellipses and so forth.

What do you see as the most important trends happening in post and production currently?
The industry is evolving in many different directions at the moment — 8K realtime, 4K/UHD, HDR, HFR, dual-stream stereo/VR. These innovations improve and enhance the audience’s experience in many different ways. They are all interesting individually, but the most vital aspect for us is that they all have something in common — they all require a very smart way of dealing with increasing bandwidth. We believe that different kinds of content will use the types of innovation relevant to each genre.

Where do you see things moving in the future?
I personally envision a lot more UHD, HDR and VR material in the near future. The technology is evolving in a direction that can really make the entertainment experience very special for audiences, while leaving a lot of room still to evolve. An example is the Quantum Break game from Remedy Entertainment/Microsoft, where the user’s experience is part of the story. This is where things are headed.

I think the immersive aspect is the challenge and goal. The reason why we all exist in this industry is to make people enjoy what they see, and all these tools and formulas combined together form a great foundation on which to build realistic experiences.

Reid Burns named president of post at Cognition

Hollywood-based post studio Cognition has hired Reid Burns as president of post production. In his new role, Burns is tasked with building relationships with studios, independent producers and filmmakers. His initial focus will be on growing the company’s digital intermediate finishing and color grading business.

Launched last fall, Cognition is nearing completion of a multi-million-dollar expansion of its creative campus in Hollywood that will include the addition of two 4K digital cinema finishing theaters, a scalable visual effects pipeline, creative office space and an emerging technologies research facility.

Burns began his career as a color timer and has nearly 100 films to his credit. He later managed laboratory operations at Deluxe. That was followed by tenures as director of global sales and technology at Consolidated Film Industries, SVP at Fotokem and COO at Reliance MediaWorks. His work at Reliance included oversight of 3D work on such titles as Avatar, Prometheus and Men in Black 3. He also founded the VFX and title post house The Image Resolution and served as CEO of the VFX/DI boutique post house Ollin Studio in Mexico City. He joined Mexico City’s Labodigital in 2012 as SVP. Burns is an associate member of the American Society of Cinematographers.

Commenting on Cognition’s adoption of SGO’s Mistika platform, he says, “We are ready for EDR (Extended Dynamic Range), planning ahead for HDR and using the ACES workflow on our current feature. We are also thrilled with Mistika’s robust stereo tools and look forward to projects that are a good fit in that arena.”

Among Burns’ first tasks will be to add to the facility’s DI staff. New hires are expected to include a senior DI producer, DI colorist and a business development specialist.

New version of Media Composer supports HDR

The latest version of Avid’s Media Composer editing system offers support for high-dynamic-range (HDR) workflows, so users can now edit and grade projects using new color specs that display a greater dynamic range than standard video. The new version also features a more intuitive interface, making the tool friendlier to editors who also work in Premiere Pro or Final Cut.

With what the company calls “better-organized tools,” Media Composer’s tools and interface elements are now arranged in a more logical order, enabling editors to work more efficiently and intuitively, regardless of which system they have worked on before.

The new version of Media Composer also enables editors to work with up to 64 audio tracks — 250 percent more than previously available — and delivers more power under the hood, allowing editors to work faster and focus on the creative.

The new version of Media Composer — which allows users to work in HD, 2K, 4K, 8K and HDR — is now available to purchase from the Avid Store and as a free download to current Media Composer customers with an active Upgrade and Support Plan or Subscription.

Here are some more details of the new version:
• Faster editing and better-organized tools – Users can access key tools and features faster thanks to several updates to menus and drop downs that make accessing tools more intuitive, productive, and fun. This delivers even greater efficiency, enabling editors to focus more time on their creative storytelling.
• Better visual feedback when editing – Users can edit with more precision thanks to high-visibility feedback displayed as they edit in the timeline.
• Ability to straighten images quickly with FrameFlex rotation – Users can rotate images a little or a lot by rotating the framing box in FrameFlex.
• Better performance – All played frames — and all effects applied to those clips — are now cached in RAM. This allows for smoother, stutter-free editing when scrubbing or playing back a complex sequence multiple times.

Other enhancements include:
• Full-screen playback mode.
• Sync Broadcast Wave audio files with video clips with subframe accuracy.
• Add one or more custom columns to a bin easily through a contextual menu.
• Copy and paste frame numbers in bin columns.
• Find and filter effects faster and more easily with the updated Effects palette.
• Rename, edit and delete custom project presets with the Preset Manager.
• Use Media Composer on OS X 10.11 (El Capitan) and Windows 10 computers.
• Group master clips using audio waveform analysis.
• Start a frame count at “1” instead of “0” for timecode burn-in segments.
• Resize and configure the Audio Mixer for the project at hand.
• Preserve field recorder metadata across multiple audio tracks when importing/linking.

Quick Chat: FilmLight CEO Wolfgang Lempp on HDR

FilmLight, creator of the popular Baselight color grading system, has been making color products since 2002. Over the years, the company has added other products that surround the color workflow, such as image processing applications and on-set tools for film and television.

With high dynamic range (HDR) a hot topic among those making tools for production and post and those who believe in HDR’s future, we reached out to FilmLight CEO and co-founder Wolfgang Lempp to pick his brain about the benefits of HDR and extended color gamut, and what we need to do to make it a reality.

Are you a fan of HDR?
Definitely. It opens up more creative possibilities, and it adds depth to the picture. Not everything benefits from looking more real, but the real world is certainly HDR. There is a certain aesthetic to dim highlights, as there is to black-and-white photography, but that is no justification to stick with black-and-white television, or with dim displays.

And consumers will appreciate the benefit of HDR too. When they walk into an electronics store and see a couple of HDR televisions among the standard screens, they will leap out as being clearly better. That is very different from stereo 3D technology, and it will drive the adoption of HDR in a big way.

So, what will it take to get HDR to consumers?
High-end cameras have been HDR for quite a while. It is just that we have compressed the output to make it look okay on standard displays. We now have the displays, and we are starting to get the projectors, too. The biggest obstacle is the infrastructure in between, and the implications regarding the proposed standards.

So there will be a time of confusion, as well as a time of bad HDR, before the dust settles. And sadly, as with 4K and UHD, we will probably end up with two different standards for film and TV. The big question at the moment is whether the least disruptive method, which uses the same signal for both standard-dynamic-range and HDR displays, will be all we can realistically hope for in TV at this point, and whether that is actually good enough.

Samsung’s HDR-ready KS9500 SUHD TV with Quantum dot display.

Is the SMPTE PQ standard the answer?
SMPTE 2084 — which formalizes the Dolby Perceptual Quantization (PQ) concept — is already in use and has its merits, certainly for movies where you can send the right version to each cinema. But it is a bit too forward-looking for the broadcast industry, which prefers to send a common signal to both standard dynamic range and HDR displays at home.

The existing broadcast infrastructure can be made to work with the current generation of HDR displays, and that might well be good enough for many years to come. SMPTE PQ is looking further into the future, but ironically the projection technology for cinema is trailing behind in terms of absolute brightness, so for the foreseeable future there is even less of a need to provide for that extra dynamic range.

The critical issue in the short term is banding of contours — not in the very dark parts of the image, which we are all familiar with, but in the mid-range. PQ is the safer bet in that respect, but it needs a higher bit depth than the broadcast distribution channels are offering.

The BBC in the UK and NHK, the Japanese national broadcaster, have put forward a proposal for a hybrid log-gamma encoding that could be a reasonable compromise for broadcasting, but it remains to be seen whether it is a compromise too far once a wide variety of HDR content becomes available. It would be a shame if we end up with a long list of dos and don’ts to make the images look acceptable.
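
For reference, the PQ curve discussed here is fully specified in SMPTE ST 2084. A minimal sketch of its EOTF, mapping a non-linear signal value to absolute luminance using the constants published in the standard (an illustration only, not FilmLight code):

```python
def pq_eotf(signal):
    """SMPTE ST 2084 (Dolby PQ) EOTF: non-linear signal in [0, 1] ->
    absolute display luminance in cd/m^2 (nits), peaking at 10,000."""
    m1 = 2610 / 16384        # 0.1593017578125
    m2 = 2523 / 4096 * 128   # 78.84375
    c1 = 3424 / 4096         # 0.8359375
    c2 = 2413 / 4096 * 32    # 18.8515625
    c3 = 2392 / 4096 * 32    # 18.6875
    p = signal ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)
```

Because the encoding is absolute — code values correspond to fixed nit levels rather than a fraction of display peak — grades carry exact luminance intent, which is also why PQ demands more bit depth than relative, gamma-style encodings to avoid mid-range banding.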

At FilmLight, we support both standards, and if the industry can agree on something better, we will of course support that too. Our interest is in taking the technical limitations away from post and allowing people to concentrate on creativity.

HDR broadcast at CES 2016.

What happens when an HDR signal reaches televisions in the home?
The real concern is setup — because to see the benefits you have to set things up correctly. And a relatively subtle shift like extended color gamut, or a not-so-subtle shift like HDR, can easily be badly configured.

When we moved from 4×3 to 16×9 displays, many people didn’t bother to adjust their screens correctly, so 4×3 content was stretched, making everyone look squashed and fat. Even today, that problem hasn’t gone away completely. Whatever system is in place for delivering HDR to the home, it has to be simple to set up accurately for whatever receiving device the consumer chooses to use.

Some colorists are expressing concern about working with HDR and eye strain. Is this a serious issue?
The real world is HDR. Go outside into the sunshine and see what extended color and dynamics really means. The new generation of displays deliver only a pale imitation of this reality. Our eyes and brain have the ability to adjust over an amazingly wide range.

The serious point is that HDR should help to create more realistic, as well as more engaging and enticing pictures. If all we do with HDR is make the highlights brighter then it has failed as an addition to the creative toolset.

Dolby’s HDR offering at CES.

Colorists today are used to working in a very dim environment. It will be different in the future, and it will take some time to get used to, but I think we all have faced more serious challenges.

What do you think the timeframe is for HDR?
It is already happening. Movies are out there and television is ready to go. NAB 2015 saw the gee-whiz demonstrations, and NAB 2016 will see workable, affordable, practical solutions. January’s CES featured many HDR-ready displays, so there is real pressure on broadcasters to provide the content.

If it is used carefully and creatively, I am very excited by the prospect, and I believe viewers will absolutely love it.

 

Colorfront demos UHD HDR workflows at SMPTE 2015

Colorfront used the SMPTE 2015 Conference in Hollywood to show off the capabilities of its upcoming 2016 products supporting UHD/HDR workflows. New products include the Transkoder 2016 and On-Set Dailies 2016. Upgrades allow for faster, more flexible processing of the latest UHD HDR camera, color, editorial and deliverables formats for digital cinema, high-end episodic TV and OTT Internet entertainment channels.

Colorfront’s Bruno Munger filled us in on some of the highlights:
·   Transkoder and On-Set Dailies feature Colorfront Engine, an ACES-compliant, HDR-managed color pipeline that enables on-set look creation and ensures the color fidelity of UHD/HDR materials and metadata through the camera-to-post chain. Colorfront Engine supports the full dynamic range and color gamut of the latest digital camera formats, mapping into industry-standard deliverables such as the latest IMF specs, AS-11 DPP and HEVC at a variety of brightness, contrast and color ranges on current display devices.
·   The mastering toolset for Transkoder 2016 is enhanced with new statistical analysis tools for immediate HDR data graphing. Highlights include MaxCLL and MaxFALL calculations, as well as HDR mastering tools with tone and gamut mapping for a variety of target color spaces, including Rec. 2020 and P3D65, as well as XYZ, PQ curve and BBC-NHK Hybrid Log Gamma.
·    New for Transkoder 2016 are tools to concurrently color grade HDR and SDR UHD versions, cutting down the complexity, time and cost of delivering multiple masters at once.
·    Transkoder 2016 will output simultaneous, realtime grades on 4K 60p material to dual Sony OLED BVM-X300 broadcast monitors — concurrently processing HDR 2084 PQ Rec. 2020 at 1,000 nits and SDR Rec. 709 at 100 nits — while visually graphing MaxFALL/MaxCLL light values per frame.
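The MaxCLL and MaxFALL metrics mentioned above are defined in CTA-861.3: MaxCLL (Maximum Content Light Level) is the brightest single pixel's light level across all frames of the content, while MaxFALL (Maximum Frame-Average Light Level) is the highest per-frame average. As a rough sketch of the calculation — not Colorfront's implementation; the `max_cll_fall` helper and the synthetic frames are illustrative — assuming frames arrive as linear-light nit values per channel:

```python
import numpy as np

def max_cll_fall(frames):
    """Compute MaxCLL and MaxFALL (in nits) per CTA-861.3.

    frames: iterable of (H, W, 3) arrays of linear-light values in nits.
    A pixel's light level is the max of its R, G, B components.
    """
    max_cll = 0.0   # brightest single pixel across all frames
    max_fall = 0.0  # highest per-frame average light level
    for frame in frames:
        pixel_levels = frame.max(axis=2)           # per-pixel light level
        max_cll = max(max_cll, pixel_levels.max())
        max_fall = max(max_fall, pixel_levels.mean())
    return max_cll, max_fall

# Two synthetic frames: a uniform dim frame and one with a bright highlight
dim = np.full((4, 4, 3), 50.0)
bright = np.full((4, 4, 3), 100.0)
bright[0, 0] = [1000.0, 800.0, 600.0]
cll, fall = max_cll_fall([dim, bright])  # cll = 1000.0, fall = 156.25
```

A real mastering tool would run this over every frame of the program and write the results into the HDR static metadata of the deliverable.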

Advanced dailies toolsets enhancements include:
·    Support for the latest camera formats, including full Panasonic Varicam35 VRAW, AVC Intra 444, 422 and LT support, Canon EOS C300 Mark II with new Canon Log2 Gamma, ARRI Alexa 65 and Alexa SXT, Red Weapon, Sony XAVC and the associated image metadata from all of these.
·    The new Multi-view Dailies capability for On-Set Dailies 2016, which allows concurrent, realtime playback and color grading of all cameras and camera views.
·    Transwrapping, which allows video essence data (the RAW, compressed audio/video and metadata inside a container such as MXF or MOV) to be passed through the transcoding process without re-encoding, enabling frame-accurate insert editing on closed digital deliverables. This workflow can be a great time saver in day-to-day production, allowing Transkoder users to quickly generate new masters based on changes and versioning of content in the major mastering formats, like IMF, DCI and ProRes, and efficient trimming of camera original media for VFX pulls and final conform from Arri, Red and Sony cameras.

Dolby Cinema combines HDR video, immersive surround sound

By Mel Lambert

In addition to its advances in immersive surround sound, culminating in the new object-based Atmos format for theatrical and consumer playback, Dolby remains committed to innovating video solutions for the post and digital cinema communities.

Leveraging video technologies developed for high-resolution video monitors targeted at on-location, colorist and QC displays, the company also has been developing Dolby Cinema, which combines proprietary high dynamic range (HDR) Dolby Vision with Dolby Atmos immersive sound playback.

The first Dolby Cinema installations comprise a joint venture with AMC Entertainment — the nation’s second-largest theater chain — and, according to AMC’s EVP of US operations, John McDonald, the companies are planning to unveil up to 100 such “Dolby Cinema at AMC Prime” theaters around the world within the next decade. To date, approximately a dozen such premium large format (PLF) locations have opened in the US and Europe.

Dolby Vision requires two specially modified HDR Christie Digital 4K laser projectors, together with state-of-the-art optics and image processing, to provide an HDR output with light levels significantly greater than conventional Xenon digital projectors. Dolby Vision’s HDR output, with enhanced color technology, has been lauded by filmmakers for its enhanced contrast, high brightness and wide color gamut, which is said to more closely match human vision.

Unique to the Dolby Vision projection system, beyond its brightness and vivid color reproduction, is its claimed ability to deliver HDR images with an extended contrast ratio that exceeds any other image technology currently on the market. The result is described by Dolby as a “richer, more detailed viewing experience, with strikingly vivid and realistic images that transport audiences into a movie’s immersive world.”

During a recent system demo at AMC16 in Burbank, Doug Darrow, Dolby’s SVP of Cinema, said, “Today’s movie audiences have an insatiable appetite for experiences. They want to be moved, and they want to feel [the on-screen action]. The combination of our Dolby Vision technology and Dolby Atmos offers audiences an immersive audio-video experience.”

The new proprietary system offers up to 31 foot-lamberts (fL) of screen brightness for 2D Dolby Vision content, more than twice the 14 fL required by the Digital Cinema Initiatives (DCI) specification.
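For readers more used to nits (cd/m²) than foot-lamberts, the conversion is a single multiply: 1 fL ≈ 3.426 cd/m². An illustrative sketch of where those screen-brightness figures land:

```python
FL_TO_NITS = 3.426  # 1 foot-lambert is approximately 3.426 cd/m2 (nits)

def fl_to_nits(fl):
    """Convert a luminance value from foot-lamberts to nits (cd/m2)."""
    return fl * FL_TO_NITS

# Dolby Cinema 2D peak brightness vs. the DCI reference level
dolby_nits = fl_to_nits(31)  # roughly 106 nits
dci_nits = fl_to_nits(14)    # roughly 48 nits
```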

Recent films released in Dolby Cinema include Sony’s The Perfect Guy; Paramount’s Mission: Impossible – Rogue Nation; Fox’s Maze Runner: The Scorch Trials; Fox’s The Martian; Warner’s Pan; and Universal’s Everest. Upcoming releases include Warner’s In the Heart of the Sea; Lionsgate’s The Hunger Games: Mockingjay — Part 2 and Disney’s The Jungle Book.

During a series of endorsement videos shown at the Burbank showcase, Wes Ball, director of Maze Runner: The Scorch Trials, said, “It’s the only way I want to show movies.”

The new theatrical presentation format fits into existing post workflows, according to Stuart Bowling, Dolby’s director of content and creative relations. “Digital cameras are capable of capturing images with tremendous dynamic range that is suitable for Dolby Vision, which is capable of delivering a wide P3 color gamut. Laser projection can also extend the P3 color space to exceed Rec. 2020 [ITU-R Recommendation BT.2020], which is invaluable for animation and VFX. For now, however, we will likely see filmmakers stay within the P3 gamut.”

For enhanced visual coverage, the large-format screens extend from wall to wall and floor to ceiling, with matte-black side walls and fittings to reduce ambient light scattering that can easily diminish the HDR experience. “Whereas conventional presentations offer maybe 2,000:1 contrast ratios,” Bowling stressed, “Dolby Vision offers 1,000,000:1 [dynamic range], with true, inky blacks.”
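Those contrast ratios can be translated into photographic stops of dynamic range by taking a base-2 logarithm, which makes the gap more concrete; an illustrative sketch:

```python
import math

def contrast_stops(ratio):
    """Dynamic range in photographic stops for a given contrast ratio."""
    return math.log2(ratio)

conventional = contrast_stops(2_000)      # roughly 11 stops
dolby_vision = contrast_stops(1_000_000)  # roughly 19.9 stops
```

By this measure the quoted Dolby Vision figure is almost nine stops beyond a conventional 2,000:1 presentation.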

Mel Lambert is principal of Content Creators, an LA-based copywriting and editorial service. He can be reached at mel.lambert@content-creators.com. Follow him on Twitter @MelLambertLA.