
Our Virtual Production Roundtable

By Randi Altman

Evolve or die. That old adage, while very dramatic, fits well with the state of our current production workflows. While most productions are now shot digitally, the warmth of film is still in the back of pros’ minds. Camera makers and directors of photography often look for ways to retain that warmth in digital, whether it’s through lighting, vintage lenses, color grading, newer technology or all of the above.

There is also the question of setting looks on-set and how 8K and HDR are affecting the picture and workflows. And let’s not forget shooting for OTT series. There is a lot to cover!

In an effort to get a variety of perspectives, we reached out to a few cinematographers and some camera manufacturers to talk trends and technology. Enjoy!

Claudio Miranda, ASC

Claudio Miranda is a Chilean cinematographer who won an Oscar for his work on Life of Pi. He also worked on The Curious Case of Benjamin Button, the first movie nominated for a cinematography Oscar that was shot entirely on digital. Other films include Oblivion, Tomorrowland and the upcoming Top Gun: Maverick.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
Seems like everyone is shooting large format. Chris Nolan and Quentin Tarantino shot 65mm film for their last projects. New digital cameras such as the Alexa LF and Sony Venice cater to this demand. People seem to like the shallow depth of field of these larger format lenses.

How is HDR affecting the way things are being shot these days? Are productions shooting/monitoring HDR on-set?
For me, too much grain in HDR can be distracting. This must be moderated in the camera acquisition format choice and DI. Panning in a high-contrast environment can cause painful strobing. This can be helped in the DI and set design. HDR done well is more important than 8K or even 3D.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? Or are latitude and framerate more important currently?
8K can be important for VFX plates. For me, creatively it is not important, 4K is enough. The positive of 8K is just more K. The downside is that I would rather the camera companies focus on dynamic range, color latitude, sensitivity and the look and feel of the captured image instead of trying to hit a high K number. Also, there are storage and processing issues.

Can you talk about how shooting streaming content, for OTTs like Netflix/Amazon, has changed production practices, and workflows, if at all?
I have not shot for a streaming service. I do think we need to pay attention to all deliverables and make adjustments accordingly. In the DI, I am there for the standard cinema pass, HDR pass, IMAX pass, home video pass and other formats that arise.

Is the availability of all those camera resolutions a help or a hindrance?
I choose the camera that will fit the job. It is my job in prep to test and pick the camera that best serves the movie.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
On set, I am able to view HDR or 709. I test the pipeline and make sure the LUT is correct and make modifications if needed. I do not play with many LUTs on set, I normally just have one. I treat the camera like a film stock. I know I will be there in the DI to finalize the look. On set is not the place for futzing with LUTs on the camera. My plate is full enough as it is.

If not already covered, how has production changed in the last two years?
I am not sure production has changed, but there are many new tools to use to help make work more efficient and economical. I feel that I have always had to be mindful of the budget, no matter how large the show is. I am always looking for new solutions.

Daryn Okada, ASC
Daryn Okada is known for his work on films such as Mean Girls, Anna Karenina and Just Like Heaven. He has also worked on many TV series, such as Scandal, Grey’s Anatomy and Castle. He served as president of the ASC from 2006 to 2009.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses? 

Modern digital cinema cameras, given the proper workflows and techniques, can achieve a level of quality that lets a story’s visual identity evolve in parallel with what was explored shooting on film. Larger image sensors, state-of-the-art lenses and mining historic optics enable cinematographers to use their experience and knowledge of the past to paint rich visual experiences for today’s audience.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?
HDR is a creative and technical medium just as shooting and projecting 65mm film would be. It’s up to the director and the cinematographer to decide how to orchestrate the use of HDR for their particular story.

Can you address 8K? What are the positives, and the negatives? Do we just have too many Ks these days? Or are latitude and framerate more important currently?
8K is working its way into production much as 65mm and 35mm VistaVision did, by providing more technical resolution for use in VFX or special-venue exhibition. The enormous amount of data, and the cost to handle it, must be justified by its financial return and by whether it benefits a particular story. Latitude and color depth are paramount to creating a motion picture’s palette and texture. Trying to use a format just because it’s technically possible may be distracting to an audience’s acceptance of a story or creative concept.

Can you talk about how shooting streaming content, for OTTs like Netflix/Amazon, has changed production practices, and workflows, if at all?

I think the delivery specifications of OTT have generally raised the bar, making 4K and wide color gamut the norm. Cinematographers who have spent years photographing features are accustomed to creating images with detail for a big screen and a wide color palette. It’s a natural creative process to shoot for 4K and HDR in that respect.

Is the availability of all those camera resolutions a help or a hindrance?
Having the best imaging available is always welcome. Even when a camera’s capabilities aren’t fully exploited, subtler images are possible thanks to the smoother transitions and blending of color, contrast and detail that come from originating at higher resolutions and with a wider color range.

Can you talk about color management from the sensor/film to the screen? How do you ensure correct color management from the set into dailies and post, the DI and final delivery?
As cinematographers, we are still involved in workflows for dailies and post production to ensure that everyone’s creative contributions to the final production are maintained for the immediate viewer and preserved for audiences in the future.

How has production changed over the last two years?
There are more opportunities to produce content with creative, high-quality cinematography thanks to advancements in cameras and cost-effective computing speed, combined with the demand for high-quality displays and projection.

Vanja Černjul, ASC
This New York-based DP recently worked on the huge hit Crazy Rich Asians. In addition to feature film work, Černjul has shot TV shows (The Deuce’s season 1 finale and two seasons of Marco Polo), as well as commercials for Panasonic and others.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
One interesting trend I noticed is the comeback of image texture. In the past, cinematographers used to expose film stock differently according to the grain texture they desired. Different exposure zones within the same frame had different grain character, which produced additional depth of the image. We lost that once we switched to digital. Crude simulations of film grain, such as overall filters, couldn’t produce the dimensionality we had with film.

Today, I am noticing new ways of bringing the texture back as a means of creative expression. The first one comes in the form of new, sophisticated post production tools designed to replicate the three-dimensional texturing that occurs naturally when shooting film, such as the realtime texturing tool LiveGrain. Monitoring the image on the set with a LiveGrain texture applied can impact lighting, filtration or lens choices. There are also new ways to manipulate texture in-camera. With the rise of super-sensitive, dual-native ISO sensors we can now shoot at very low-light levels and incorporate so-called photon shot noise into the image. Shot noise has organic character, very much like film grain.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?

The creative potential of HDR technology is far greater than that of added resolution. Unfortunately, it is hard for cinematographers to take full advantage of HDR because it is still far from being the standard way the audience sees our images. We can’t have two completely different looks for a single project, and we have to make sure the images are working on SDR screens. In addition, it is still impractical to monitor in HDR on the set, which makes it difficult to adjust lighting and lens choices to expanded dynamic range. Once HDR screens become a standard, we will be able to really start creatively exploring this new territory.

Crazy Rich Asians

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? Or are latitude and framerate more important currently?
Additional resolution adds more available choices regarding relationship of optical systems and aspect ratios. I am now able to choose lenses for their artifacts and character regardless of the desired aspect ratio. I can decide to shoot one part of the film in spherical and the other part in anamorphic and crop the image to the project’s predetermined aspect ratio without fear of throwing away too much information. I love that freedom.

Can you talk about how shooting streaming content, for OTTs like Netflix/Amazon, has changed production practices and workflows, if at all?
For me, the only practical difference between shooting high-quality content for cable or streaming is the fact that Netflix demands that their projects be captured in true 4K RAW. I like the commitment to higher technical standards, even though this may be an unwelcome restriction for some projects.

Is the availability of all those camera resolutions a help or a hindrance?
I like choices. As large format lenses become more available, shooting across formats and resolutions will become easier and simpler.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
The key for correct color management from the set to final color grading is in preproduction. It is important to take the time to do proper tests and establish the communication between DIT, the colorist and all other people involved as early as possible. This ensures that original ideas aren’t lost in the process.

Adjusting and fine-tuning the LUT to the lenses, lighting gels and set design and then testing it with the colorist is very important. Once I have a bulletproof LUT, I light and expose all the material for it specifically. If this part of the process is done correctly, the time in final color grading can be spent on creative work rather than on fixing inconsistencies.

I am very grateful for ACES workflow, which offers long-overdue standardization. It is definitely a move in the right direction.

How has production changed over the last two years?
With all the amazing post tools that are becoming more available and affordable, I am seeing negative trends of further cutting of preproduction time, and lack of creative discipline on the set. I sincerely hope this is just a temporary confusion due to recalibration of the process.

Kate Reid, DP
Kate Reid is a UK-based DP working in TV and film. Her recent work includes the TV series Hanna (Amazon), Marcella 2 (Netflix) and additional photography on the final season of Game of Thrones for HBO. She is currently working on Press for the BBC.

Can you talk about some camera trends you’ve been seeing? Such as Large Format? The use of old/vintage lenses?
Large format cameras are being used increasingly on drama productions to satisfy the requirement for additional resolution from certain distribution platforms. And, of course, the choice to use large format cameras in drama gives DPs another aesthetic tool: deciding whether increased depth-of-field falloff, clarity in the image, etc., enhance the particular story they wish to portray on screen.

Like many other DPs, I have always enjoyed using older lenses to help make the digital image softer, more organic and less predictable. But much of this older glass, designed for 35mm-sized sensors, may not cover the increased sensor size of the larger format cameras, so newer lenses designed for large format may become popular by necessity, alongside older large format glass that is enjoying a renaissance.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?
I have yet to shoot a show that requires HDR delivery. It hasn’t yet become the default in drama production in the UK.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? Or are latitude and frame rate more important currently?
I don’t inherently find an ultra sharp image attractive. Through older glass and diffusion filters on the lens, I am usually looking to soften and break down my image, so I personally am not all about the extra Ks. How the camera’s sensor reproduces color and handles highlights and shadows is of more interest to me, and I believe has more impact on the picture.

Of primary importance is how practical a camera is to work with — size and how comfortable the camera is to handle would supersede excessive resolution — as the first requirement of any camera has got to be whether it allows you to achieve the shots you have in mind, because a story isn’t told through its resolution.

Can you talk about how shooting streaming content for OTTs, like Netflix/Amazon, has changed production practices, and workflows, if at all?
The major change is the requirement by Netflix for true 4K resolution, determining which cameras cinematographers are allowed to shoot on. For many cinematographers the Arri Alexa was their digital camera of choice, which was excluded by this rule, and therefore we have had to look to other cameras for such productions. Learning a new camera, its sensor, how it handles highlights, produces color, etc., and ensuring the workflow through to the post facility is something that requires time and testing, which has certainly added to a DP’s workload.

From a creative perspective, however, I found shooting for OTTs (I shot two episodes of the TV series Hanna made by Working Title TV and NBC Universal for Amazon) has been more liberating than making a series for broadcast television as there is a different idea and expectation around what the audience wants to watch and enjoy in terms of storytelling. This allowed for a more creative way of filming.

Is the availability of all those camera resolutions a help or a hindrance?
Where work is seen now can vary from a mobile phone screen to a digital billboard in Times Square, so it is good for DPs to have a choice of cameras and their respective resolutions so we can use the best tool for each job. It only becomes a hindrance if you let the technology lead your creative process rather than assist it.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
Ideally, I will have had the time and opportunity to shoot tests during prep and then spend half a day with the show’s colorist to create a basic LUT I can work with on set. In practice, I have always found that I tweak this LUT during the first days of production with the DIT, and this is what serves me throughout the rest of the show.

I usually work with just one LUT that will be some version of a modified Rec. 709 (unless the look of the show drastically requires something else). It should then be straightforward: the DIT can attach the LUT to the dailies, and this is the same LUT applied by editorial, so that exactly what you see on set is what is being viewed in the edit.

However, where this fails is that the dailies uploaded to FTP sites — for viewing by the execs, producers and other people who have access to the work — are usually very compressed and low resolution, so they bear little resemblance to how the work looked on set or looks in the edit. This is really unsatisfying, as for months key members of production are not seeing an accurate reflection of the picture. Of course, when you get into the grade this can be restored, but it’s dangerous if those viewing the dailies this way have grown accustomed to something that is a pale comparison of what was shot on set.

How has production changed over the last two years?
There is less differentiation between film and television in how productions are being made and, critically, in where they are being seen by audiences, especially with online platforms now making award-winning feature films. The high production values we’ve seen on Netflix and Amazon’s biggest shows have UK television dramas pushing to up their game, which does put pressure on productions, shooting schedules and HODs, as the budgets to help achieve this aren’t there yet.

So, from a ground-level perspective, for DPs working in drama this looks like more pressure to produce work of the highest standard in less time. However, it’s also a more exciting place to be working, as the ideas about how you film something for television versus cinema no longer need apply. The perceived ideas of what an audience is interested in, or expects, are being blown out of the water by the success of new original online content, which flies in the face of more traditional storytelling. Broadcasters are noticing this and, hopefully, this will lead to more exciting and cinematic mainstream television in the future.

Blackmagic’s Bob Caniglia
In addition to its post and broadcast tools, Blackmagic offers many different cameras, including the Pocket Cinema Camera, Pocket Cinema Camera 4K, Micro Studio Camera 4K, Micro Cinema Camera, Studio Camera, Studio Camera 4K, Ursa Mini Pro, Ursa Mini 4.6K and Ursa Broadcast.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
Lens freedom is on everyone’s mind right now… having the freedom to shoot in any style. This is bringing about things like seeing projects shot on 50-year-old glass because the DP liked the feel of a commercial back in the ‘60s.

We actually just had a customer test out actual lenses that were used on The Godfather, The Shining and Casablanca, and it was amazing to see the mixing of those with a new digital cinema camera. And so many people are asking for a camera to work with anamorphic lenses. The trend is really that people expect their camera to be able to handle whatever look they want.

For large format use, I would say that both Hollywood and indie filmmakers are using these cameras more often. Or, at least, they are trying to get the general large format look by using anamorphic lenses to get a shallow depth of field.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?
Right now, HDR is definitely more of a concern for DPs in Hollywood, but also for indie filmmakers and streaming service content creators. Netflix and Hulu have some amazing HDR shows right now. And there is plenty of choice when it comes to the different HDR formats and shooting and monitoring on set. All of that is happening every day, while 8K still needs the industry to catch up with the various production tools.

As for impacting shooting, HDR is about more immersive colors, and a DP needs to plan for it. It gives viewers a whole new level of image detail, so DPs have to be much more aware of every surface or lighting impact so that the viewer doesn’t get distracted. Attention to detail gets even higher in HDR, and DPs and colorists will need to keep a close eye on every shot, including when an image in a sideview mirror’s reflection is just a little too sharp and needs a tweak.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? Or are latitude and framerate more important currently?
You can never have enough Ks! Seriously. It is not just about getting a beautiful 8K TV, it is about giving the production and post pros on a project as much data as possible. More data means more room to be creative, and is great for things like keying.

Latitude and framerate are important as well, and I don’t think any one is more important than another. For viewers, the beauty will be in large displays. You’re already seeing 8K displays in Times Square, and though you may not need 8K on your phone, 8K on the side of a building or highway will be very impactful.

I do think one of the ways 8K is changing production practices is that people are going to be much more storage conscious. Camera manufacturers will need to continue to improve workflows as the images get larger in an effort to maximize storage efficiencies.

Can you talk about how shooting streaming content, for OTTs like Netflix/Amazon, has changed production practices, and workflows, if at all?
For streaming content providers, shoots have definitely been impacted and are forcing productions to plan for shooting in a wider number of formats. Luckily, companies like Netflix have been very good about specifying up front the cameras they approve and which formats are needed.

Is the availability of all those camera resolutions a help or a hindrance?
While it can be a bit overwhelming, it does give creatives some options, especially if they have a smaller delivery size than the acquisition format. For instance, if you’re shooting in 4K but delivering in HD, you can do dynamic zooms from the 4K image that look like an optical zoom, or you can get a tight shot and wide shot from the same camera. That’s a real help on a limited budget of time and/or money.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
Have the production and post people plan together from the start, and create the look everyone should be working toward right up front.

Set the LUTs you want before a single shot is done and manage the workflow from camera to final post. Also, choose post software that can bring color correction on-set, near-set and off-set. That lets you collaborate remotely. Definitely choose a camera that works directly with any post software, and avoid transcoding.

How has production changed in the last two years?
Beyond the rise of HDR, one of the other big changes is that more productions are thinking live and streaming more than ever before. CNN’s Anderson Cooper now does a daily Facebook Live show. AMC has the live Talking Dead-type formats for many of their shows. That trend is going to keep happening, so cinematographers and camera people need to be thinking about being able to jump from scripted to live shooting.

Red Digital Cinema’s Graeme Nattress
Red Digital Cinema manufactures professional digital cameras and accessories. Red’s DSMC2 camera offers three sensor options — Gemini 5K S35, Helium 8K S35 and Monstro 8K VV.

Can you talk about some camera trends you’ve been seeing?
Industry camera trends continue to push image quality in all directions. Sensors are getting bigger, with higher resolutions and more dynamic range. Filmmakers continue to innovate, making new and amazing images all the time, which drives our fascination for advancing technology in service to the creative.

How is HDR affecting the way things are being shot these days?
One of the benefits of a primary workflow based on RAW recording is that HDR is not an added extra, but a core part of the system. Filmmakers do consider HDR important, but there’s some concern that HDR doesn’t always look appealing, and that it’s not always an image quality improvement. Cinematography has always been about light and shade and how they are controlled to shape the image’s emotional or storytelling intent. HDR can be a very important tool in that it greatly expands the display canvas to work on, but a larger canvas doesn’t mean a better picture. The increased display contrast of HDR can make details more visible, and it can also make motion judder more apparent. Thus, more isn’t always better; it’s about how you use what you have.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? What’s more important, resolution or dynamic range?
Without resolution, we don’t have an image. Resolution is always going to be an important image parameter. What we must keep in mind is that camera resolution is based on input resolution to the system, and that can — and often will — be different to the output resolution on the display. Traditionally, in video the input and output resolutions were one and the same, but when film was used — which had a much higher resolution than a TV could display — we were taking a high-resolution input and downsampling it to the display, the TV screen.

As with any sampled system, in a digital cinema camera there are some properties we seek to protect and others to diminish. We want a high level of detail, but we don’t want sharpening artifacts and we don’t want aliasing. The only way to achieve that is through a high-resolution sensor, properly filtered (optical low-pass), that can see a large amount of real, un-enhanced detail. So yes, 8K can give you lots of fine detail should you want it, but the imaging benefits extend beyond downsampling to 4K or 2K: 8K makes for an incredibly robust image in which noise is reduced, and what noise remains takes on more of a texture, which is much more aesthetically pleasing.

One challenge of 8K is an increase in the amount of sensor data to be recorded, but that can be addressed through quality compression systems like RedCode.

Addressing dynamic range is very important because dynamic range and resolution work together to produce the image. It’s easy to think that high resolutions have a negative impact upon dynamic range, but improved pixel design means you can have dynamic range and resolution.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
Color management is vitally important and so much more than just keeping color control from on-set through to delivery. Now with the move to HDR and an increasing amount of mobile viewing, we have a wide variety of displays, all with their own characteristics and color gamuts. Color management allows content creators to display their work at maximum quality without compromise. Red cameras help in multiple ways. On camera, one can monitor in both SDR and HDR simultaneously with the new IPP2 image processing pipeline’s output independence, which also allows you to color via CDL and creative 3D LUT in such a way as to have those decisions represented correctly on different monitor types.

In post and grading, the benefits of output independence continue, but now it’s critical that scene colors, which can so easily go out of gamut, are dealt with tastefully. Through the metadata support in the RedCode format, all the creative decisions taken on set follow through to dailies and post, but never get in the way of producing the correct image output, be it for VFX, editorial or grading.

Panavision’s Michael Cioni 
Panavision designs and manufactures high-precision camera systems, including both film and digital cameras, as well as lenses and accessories for the motion picture and television industries.

Can you talk about some camera trends you’ve been seeing?
With the evolution of digital capture, one of the most interesting things I’ve noticed in the market are new trends emerging from the optics side of cinematography. At a glance, it can appear as if there is a desire for older or vintage lenses based on the increasing resolution of large format digital cameras. While resolution is certainly a factor, I’ve noticed the larger contributor to vintage glass is driven by the quality of sensors, not the resolution itself. As sensors increase in resolution, they simultaneously show improvements in clarity, low-light capability, color science and signal-to-noise ratio.

The compounding effect of all these elements is improving images far beyond what was possible with analog film technology, which explains why the same lens behaves differently on film, S35 digital capture and large-format digital capture. As these looks continue to become popular, Panavision is responding through our investments both in restoring classic lenses and in designing new lenses with classic characteristics and textures that are optimized for large format photography on super sensors.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look?
Creating images is not always about what component is better, but rather how they elevate images by working in concert. HDR images are a tool that increases creative control alongside high resolution and 16-bit color. These components work really well together because a compelling image can make use of more dynamic range, more color and more clarity. Its importance is only amplified by the amalgamation of high-fidelity characteristics working together to increase overall image flexibility.

Today, the studios are still settling into an HDR world because only a few groups, led by OTT, are able to distribute in HDR to wide audiences. On-set tools capable of HDR, 4K and 16-bit color are still in their infancy and currently cost-prohibitive. 4K/HDR on the set is going to become a standard practice by 2021. 4K wireless transmitters are the first step — they are going to start coming online in 2019. Smaller OLED displays capable of 750 nits+ will follow in 2020, creating an excellent way to monitor higher quality images right on set. In 2021, editorial will start to explore HDR and 4K during the offline process. By 2024, all productions will be HDR from set to editorial to post to mobile devices. Early adopters that work out the details today will find themselves ahead of the competition and having more control as these trends evolve. I recommend cinematographers embrace the fundamentals of HDR, because understanding the tools and trends will help prevent images from appearing artificial or overdone.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? What’s more important, resolution or dynamic range?
One of the reasons we partnered with Red is because the Monstro 8K VV sensor makes no sacrifice in dynamic range while still maintaining ultra high smoothness at 16 bits. The beauty of technology like this is that we can finally start to have the best from all worlds — dynamic range, resolution, bit depth, magnification, speed and workflow — without having to make quality sacrifices. When cinematographers have all these elements together, they can create images previously never seen before, and 8K is as much part of that story as any other element.

One important way to view 8K is not solely as a thermometer for high-resolution sharpness. A sensor with 35 million pixels is necessary to increase the image size, similar to trends in professional photography. 8K large format creates a larger, more magnified image with a wider field of view and less distortion, like the difference in images captured by 70mm film. The biggest positive I’ve noticed is that the DXL2’s 8K large-format Red Monstro sensor is of such high quality that it isn’t impacting the images themselves. Lower-quality sensors can add a “fingerprint” to the image, which can distort the original intention or texture of a particular lens.

With sensors like Monstro capable of such high precision, the lenses behave exactly as the lens maker intended. The same Panavision lenses are now exhibiting characteristics that we weren’t able to see on lower-grade sensors, or even on 35mm film. This is literally breathing new life into lenses that didn’t perform this way until Monstro and large format.

Is the availability of so many camera formats a help or a hindrance?
You don’t have to look far to identify individuals who are easily fatigued by having too many choices. Some of these individuals cope with choices by finding ways to regulate them, and they feel fewer choices means more stability and perhaps more control (creative and economic). As an entrepreneur, I find the opposite to be true: I believe regulating our world, especially with regards to the arts and sciences, is a recipe for protecting the status quo. I fully admit there are situations in which people are fatigued by too many complex choices.

I find that the failure is not of the technology itself; rather, it’s the fault of manufacturers who have not provided the options in easy-to-consume ways. Having options is exactly what creatives need in order to explore something new and improved. But it’s also up to manufacturers to deliver the message in ways everyone can understand. We’re still learning how to do that, and with each generation the process changes a bit. And while I am not always certain which are the best ways to help people understand all the options, I am certain that the pursuit of new art will motivate us to go out of our comfort zones and try something previously thought not possible.

Have you encountered any examples of productions that have shot streaming content (i.e. for Netflix/Amazon) and had to change production practices and workflows for this format/deliverable?
Netflix and Amazon are exceptional examples of calculated risk takers. While most headlines discuss their investment in the quantity of content, I find the most interesting investment they make is in relationships. Netflix and Amazon are heavily invested in standards groups, committees, outreach, panels and constant communication. The model of the past and present (incumbent studios) is content creators with technology divisions. The model of the future (Netflix, Amazon, Hulu, Apple, Google and YouTube) is technology companies with the ability to create content. And technology companies approach problems from a completely different angle: they don’t just embrace the technology, they help invent it. In this new technological age, those who lead and those who follow will likely be determined by the tools and techniques used to deliver. What I call “The Netflix Effect” is the impact Netflix has on traditional groups, and how they have all had to strategically pivot in response.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
The DXL2 has an advanced color workflow. In collaboration with LiveGrade by Pomfort, the DXL2 can capture looks wirelessly from DITs in the form of CDLs and LUTs, which are not only saved into the camera’s metadata but also baked into in-camera proxy files in the form of Apple ProRes or Avid DNx. These files now contain visual references of the exact looks viewed on monitors and can be delivered directly to post houses, or even editors. This improves creative control because it eliminates the guesswork in applying external color decisions and streamlines it back to the camera, where the core database is kept with all the other camera information. This metadata can be traced throughout the post pipeline, which also streamlines the process for all entities that come in contact with camera footage.
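The ASC CDL referenced here is a published, vendor-neutral formula (per-channel slope, offset and power, plus a saturation term), which is what lets a look travel intact from set to post. Below is a minimal Python sketch of that math with illustrative values, not an actual LiveGrade or DXL2 implementation:

```python
# Sketch of the ASC CDL transform: Slope-Offset-Power per channel,
# then saturation weighted by Rec. 709 luma, per the ASC CDL spec.
# The sample values are made up for illustration.

def apply_cdl(rgb, slope=(1.0, 1.0, 1.0), offset=(0.0, 0.0, 0.0),
              power=(1.0, 1.0, 1.0), saturation=1.0):
    """Apply an ASC CDL (SOP + saturation) to a normalized RGB triple."""
    # Slope-Offset-Power per channel, clamped at zero before the power step
    out = [max(c * s + o, 0.0) ** p
           for c, s, o, p in zip(rgb, slope, offset, power)]
    # Saturation pivots around Rec. 709 luma
    luma = 0.2126 * out[0] + 0.7152 * out[1] + 0.0722 * out[2]
    return tuple(luma + saturation * (c - luma) for c in out)

# A warm, slightly desaturated look applied to a sample pixel
graded = apply_cdl((0.5, 0.4, 0.3), slope=(1.1, 1.0, 0.9), saturation=0.8)
```

Because the transform is just ten numbers, it can ride along in camera metadata and be replayed identically by any tool downstream.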

How has production changed over the last two years?
Sheesh. A lot!

ARRI‘s Stephan Ukas-Bradley
The ARRI Group manufactures and distributes motion picture cameras, digital intermediate systems and lighting equipment. Its camera offerings include the Alexa LF, Alexa Mini, Alexa 65, Alexa SXT W and the Amira.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
Large format opens some new creative possibilities, using a shallow depth of field to guide the audience’s view and provide a wonderful bokeh. It also conveys a perspective truer to the human eye, resulting in a seemingly increased dimensional depth. The additional resolution, combined with our specially designed large-format Signature Primes, results in beautiful and emotional images.

Old and vintage lenses can enhance a story. For instance, Gabriel Beristain, ASC, used Bausch & Lomb Super Baltars on the Starz show Magic City, and Bradford Young used detuned DNA lenses in conjunction with the Alexa 65 on Solo: A Star Wars Story. Certain characteristics like flares, reflections, distortions and focus fall-off are very difficult to recreate organically in post, so vintage lenses provide an easy way to create a unique look for a specific story and a way for the director of photography to maintain creative control.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?
Currently, things are not done much differently on set when shooting HDR versus SDR. While it would be very helpful to monitor in both modes on-set, HDR reference monitors are still very expensive and very few productions have the luxury to do that. One has to be aware of certain challenges when shooting for an HDR finish. High-contrast edges can result in a more pronounced stutter/strobing effect when panning the camera, and windows that are blown out in SDR might retain detail in the HDR pass, so all of a sudden a ladder or grip stand is visible.

In my opinion, HDR is more important than higher resolution. HDR is resolution-independent with regard to viewing devices like phones and tablets, gives the viewer a perceived increase in sharpness, and is more immersive than increased resolution. Also, let’s not forget that we are working in the motion picture industry and that we are either capturing moving objects or moving the camera, and with that introducing motion blur. Higher resolution only makes sense to me in combination with higher frame rates, and that in turn will start a discussion about aesthetics, as it may look hyper-real compared to traditional 24fps capture. Resolution is one aspect of overall image quality, but in my opinion extended dynamic range, signal/noise performance, sensitivity, color separation and color reproduction are more important.

Can you talk about how shooting streaming content for OTTs, like Netflix/Amazon, has changed production practices and workflows, if at all?
Shooting streaming content has really not changed production practices or workflows. At ARRI, we offer very flexible and efficient workflows, and we are very transparent in documenting our ARRIRAW file format in SMPTE RDD 30 (format) and RDD 31 (processing) and in working with many industry partners to provide native file support in their products.

Is the availability of all those camera resolutions a help or a hindrance?
I would look at all those different camera types and resolutions as different film stocks, and recommend that creatives shoot their own tests and select a camera system based on what suits their project best.

We offer the ARRI Look Library for the Amira, Alexa Mini and Alexa SXT (SUP 3.0), which is a collection of 87 looks, each of them available in three different intensities and provided in Rec. 709 color space. Those looks can either be recorded or used only for monitoring. These looks travel with the picture, embedded in the metadata of the ARRIRAW file, QuickTime atom or HD-SDI stream in the form of the actual LUT and ASC CDL. One can also create a look dynamically on set, feeding the look back to the camera and having the ASC CDL values embedded in the same way.

More commonly, one would record in either ARRIRAW or ProRes LogC while applying a standard Rec. 709 look for monitoring. The “C” in LogC stands for Cineon, a film-like response very much like that of a scanned film image. Colorists and post pros are very familiar with film, so color grading LogC images is easy and quick.
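The LogC curve described above is documented publicly by ARRI. As a sketch of why colorists know exactly where mid gray lands, here is the published LogC (v3) encode at EI 800, with parameters from ARRI's LogC white paper; treat the code as illustrative, not ARRI's own implementation:

```python
import math

# ARRI LogC (v3) encoding parameters at EI 800, from the published
# LogC curve description. Sketch for illustration only.
CUT, A, B = 0.010591, 5.555556, 0.052272
C, D = 0.247190, 0.385537
E, F = 5.367655, 0.092809

def linear_to_logc(x):
    """Encode scene-linear reflectance (0.18 = mid gray) to LogC."""
    # Logarithmic segment above the cut, linear toe below it
    return C * math.log10(A * x + B) + D if x > CUT else E * x + F

# 18% gray encodes to roughly 0.391 on the LogC curve
print(round(linear_to_logc(0.18), 3))
```

The fixed placement of mid gray is what makes LogC footage predictable for monitoring LUTs and for the grade.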

How has production changed over the last two years?
I don’t have the feeling that production has changed a lot in the past two years, but with the growing demand from OTTs and increased production volume, it is even more important to have a reliable and proven system with flexible workflow options.

Main Image: DP Kate Reid.

Behind the Camera: Feature Film DPs

By Karen Moltenbrey

The responsibilities of a director of photography (DP) span far more than cinematography. Perhaps they are best known for their work behind the camera capturing the action on set, but that is just one part of their multi-faceted job. Well before they step onto the set, they meet with the director, at times working hand-in-hand to determine the overall look of the project. They also make a host of technical selections, such as the type of camera and lenses they will use as well as the film stock if applicable – crucial decisions that will support the director’s vision and make it a reality.

Here we focus on two DPs for a pair of recent films with specialized demands and varying aesthetics, as they discuss their workflows on these projects as well as the technical choices they made concerning equipment and the challenges each project presented.

Hagen Bogdanski: Papillon
The 2018 film Papillon, directed by Michael Noer, is a remake of the 1973 classic. Set in the 1930s, it follows two inmates who must serve, and survive, time in a French Guiana penal colony. The safecracker nicknamed Papillon (Charlie Hunnam) is serving a life sentence and offers protection to wealthy inmate Louis Dega (Rami Malek) in exchange for financing Papillon’s escape.

“We wanted to modernize the script, the whole story. It is a great story but it feels aged. To bring it to a new, younger audience, it had to be modernized in a more radical way, even though it is a classic,” says Hagen Bogdanski, the film’s DP, whose credits include the film The Beaver and the TV series Berlin Station, among others. To that end, he notes, “we were not interested in mimicking the original.”

This was done in a number of ways. First, through the camera work, using a semi-documentary style. The director has a history of shooting documentaries and, therefore, the crew shot with two cameras at all times. “We also shot the rehearsals,” notes Bogdanski, who was brought onto the project and given nearly five weeks of prep before shooting began. Although this presented a lot of potential risk for Bogdanski, the film “came out great in the end. I think it’s one of the reasons the look feels so modern, so spontaneous.”

In the film, the main characters face off against the harsh environment of their prison island. But to film such a landscape required the cinematographer and crew to also contend with these trying conditions. They shot on location outdoors for the majority of the feature, using just one physical structure: the prison. Also helping to define the film’s aesthetic was the lighting, which, as is typical with Bogdanski’s films, is as natural as possible without large artificial sources.

Most of the movie was shot in Montenegro, near sun-drenched Greece and Albania. Bogdanski does not mince words: “The locations were difficult.”

Weather seemed to impact Bogdanski the most. “It was very remote, and if it’s raining, it’s really raining. If it’s getting dark, it’s dark, and if it’s foggy, there is fog. You have to deal with a lot of circumstances you cannot control, and that’s always a bit of a nightmare for any cinematographer,” he says. “But, what is good about it is that you get the real thing, and you get texture, layers, and sometimes it’s better when it rains than when the sun is shining. Most of the time we were lucky with the weather and circumstances. The reality of location shooting adds quite heavily to the look and to the whole texture of the movie.”

The location shooting also affected this DP’s choice of cameras. “The footprint [I used] was as small as possible because we basically visited abandoned locales. Therefore, I chose as small a kit — lenses, cameras and lights — as possible,” Bogdanski points out. “Because [the camera] was handheld, every pound counted.” In this regard, he used ARRI Alexa Mini cameras and one Alexa SXT, and only shot with Zeiss Ultra Prime lenses – “no big zooms, no big filters, nothing,” he adds.

The prison build was on a remote mountain. On the upside, Bogdanski could shoot 360 degrees there without requiring the addition of CGI later. On the downside, the crew had to get up the mountain. A road was constructed to transport the gear and for the set construction, but even so, the trek was not easy. “It took two hours or longer each day from our hotel. It was quite an adventure,” he says.

As for the lighting, Bogdanski tried to shoot when the light was good, taking advantage of the location’s natural light as much as possible — within his documentary style. When this was not enough, LEDs were used. “Again, small footprint, smaller lights, smaller electrical power, smaller generators….” The night scenes were especially challenging because the nights were very short, no longer than five to six hours. When artificial rain had to be used, shooting was “a little painful” due to the size of the set, requiring the use of more traditional lighting sources, such as large tungsten units.

According to Bogdanski, filming Papillon followed what he calls an “eclectic” workflow, akin to the European method of filming whereby rehearsal occurred in the morning and was quite long, as the director rehearsed with the actors. Then, scenes were shot in script order, on the first take without technical rehearsals. “From there, we tried to cover the scene in handheld mode with two cameras in a kind of mash-up. We did pick up the close-ups and all that, but always in a very spontaneous and quick way,” says Bogdanski.

Looking back, Bogdanski describes Papillon as a “modern-period film”: a period look, without looking “period.” “It sounds a bit Catch-22, which it is, in my opinion, but that’s what we aimed for, a film that plays basically in the ’40s and ’50s, and later in the ’60s,” he says.

During the time since the original film was made in 1973, the industry has witnessed quite a technical revolution in terms of film equipment, providing the director and DP on the remake with even more tools and techniques at their disposal to leave their own mark on this classic for a new generation.

Nancy Schreiber: Mapplethorpe
Award-winning cinematographer Nancy Schreiber, ASC, has a resume spanning episodic television (The Comeback), documentaries (Eva Hesse) and features (The Nines). Her latest film, Mapplethorpe, paints an unflinching portrait of controversial-yet-revered photographer Robert Mapplethorpe, who died at the age of 42 from AIDS-related complications in 1989. Mapplethorpe, whose daring work influenced popular culture, rose to fame in the 1970s with his black-and-white photography.

In the early stages of planning the film, Schreiber worked with director Ondi Timoner and production designer Jonah Markowitz while they were still in California prior to the shoot in New York, where Mapplethorpe (played by The Crown’s Matt Smith) lived and worked at the height of his popularity.

“We looked at a lot of reference materials — books and photographs — as Ondi and I exchanged look books. Then we homed in on the palette, the color of my lights, the set dressing and wardrobe, and we were off to the races,” says Schreiber. Shooting began in mid-July 2017.

Mapplethorpe is a period piece that spans three decades, all of which have a slightly different feel. “We kept the ’60s and into the ’70s quite warm in tone,” as this is the period when he first meets Patti Smith, his girlfriend at the time, and picks up a camera, explains Schreiber. “It becomes desaturated but still warm tonally when he and Patti visit his parents back home in Queens while the two are living at the Chelsea Hotel. The look progresses until it’s very much on the cool blue/gray side, almost black and white, in the later ’70s and ’80s.” During that time period, Mapplethorpe is successful, with an enormous studio, photographically exploring male body parts like no other person has ever done, while continuing to shoot portraits of the rich and famous.

Schreiber opted to use film, Super 16, rather than digital to capture the life of this famed photographer. “He shot in film, and we felt that format was true to his photography,” she notes. Despite Mapplethorpe’s penchant for mostly shooting in black and white, neither Timoner nor Schreiber considered using that format for the feature, mostly because the ’60s through ’80s in New York had very distinctive color palettes. They felt, however, that film in and of itself was very “textural and beautiful,” whereas you have to work a little harder with digital to make it look like film — even though new ways of adding grain to digital have become quite sophisticated. “Yet, the grain of Super 16 is so distinctive,” she says.

In addition, Kodak had just opened a lab in New York in the spring of 2017, facilitating their ability to shoot film by having it processed quickly nearby.

Schreiber used an ARRI Arriflex 416 camera for the project; when possible, she used two. She also had a set of Zeiss 35mm Super Speed lenses, along with two zoom lenses she used only occasionally for outdoor shots. “The Super Speeds were terrific. They’re vintage and were organic to the look of this period.”

She also used a light meter faithfully. Although Schreiber occasionally uses light meters when shooting digital, it was not optional for shooting film. “I had to use it for every shot, although after a couple of days, I was pretty good at guessing [by eyeing it],” Schreiber points out, “as I used to do when we only shot film.”

Soon after ARRI had introduced the Arriflex 416 – which is small and lightweight – the industry started moving to digital, prompting ARRI to roll out the now-popular Alexa. “But the [Arriflex 416] camera really caught on for those still shooting Super 16, as they do for the series The Walking Dead,” Schreiber says, adding that she was able to get her pair from the TCS Technological Cinevideo Services rental house in New York.

“I had owned an Aaton, a French camera that was very popular in the 1980s and ’90s. But today, the 416 is very much in demand, resembling the shape of my Aaton, both of which are ergonomic, fitting nicely on your shoulder. There were numerous scenes in the car, and I could just jump in the car with this very small camera, much smaller than the digital cameras we use on movies; it was so flexible and easy to work with,” recalls Schreiber.

As for the lenses, “again, I chose the Super Speed Primes not only because they were vintage, but because I needed the speed of the 1.3 lens since film requires more light.” She tested other lenses at TCS, but those were her favorites.

While Schreiber has used film on some commercials and music videos, it had been some time since she had used it for an entire movie. “I had forgotten how freeing it is, how you can really move. There are no cables to worry about. Although, we did transmit to a tiny video village,” she says. “We didn’t always have two cameras [due to cost], so I needed to move fast and get all the coverage the editor needed. We had 19 days, and we were limited in how long we could shoot each day; our budget was small and we couldn’t afford overtime.” At times, though, she was able to hire a Steadicam or B operator who really helped move them along, keeping the camera fluid and getting extra coverage. Timoner also shot a bit of Super 8 along the way.

There was just one disadvantage to using film: The stocks are slow. As Schreiber explains, she used a 500 ASA stock. Therefore, she needed very fast lenses and a fair amount of light to compensate. “That worked OK for me on Mapplethorpe because there was a different sense of lighting in the 1970s, and films seemed more ‘lit.’ For example, I might use backlight or hair light, which I never would do for [a film set in] present day,” she says. “I rated that stock at 400 to get rich blacks, which also slightly minimized the grain; the day-interior stock was 250, which I rated at 200. We are so used to shooting at 800 or 1280 ISO these days. It was an adjustment.”
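Rating a stock below its native speed overexposes the negative by a calculable amount: log2 of the native speed over the rated speed, in stops. A quick sketch using the stocks mentioned above:

```python
import math

# Overexposure in stops when a film stock is rated below its native ASA.
def overexposure_stops(native_asa, rated_asa):
    return math.log2(native_asa / rated_asa)

print(round(overexposure_stops(500, 400), 2))  # ~0.32 stop (500 rated at 400)
print(round(overexposure_stops(250, 200), 2))  # ~0.32 stop (250 rated at 200)
```

Both ratings give the negative about a third of a stop of extra exposure, which is what yields the richer blacks and slightly finer grain Schreiber describes.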

Schreiber on set with “Mapplethorpe” director Ondi Timoner.

Shooting with film was also more efficient for Schreiber. “We had monitors for the video village, but we were standard def, old-school, which is not an exact representation. So, I could move quickly to get enough coverage, and I never looked at a monitor except when we had Steadicam. What you see is not what you get with an SD tap. I was trusted to create the imagery as I saw fit. I think many people today are used to seeing the digital image on the monitor as what the final film will look like and may be nervous about waiting for the processing and transfer, not trusting the mystery or mystique of how celluloid will look.”

To top things off, Schreiber was backed by an all-female A camera team. “I know how hard it is for women to get work,” she adds. “There are so many competent women working behind the camera these days, and I was happy to hire them. I remember how challenging it was when I was a gaffer or started to shoot.”

As for costs, digital camera equipment is more expensive than Super 16 film equipment, yet there were processing and transfer costs associated with getting the film into the edit suite. So, when all was said and done, film was indeed more expensive to use, but not by much.

“I am really proud that we were able to do the movie in 19 days with a very limited budget, in New York, covering many periods,” concludes Schreiber. “We had a great time, and I am happy I was able to hire so many women in my departments. Women are still really under-represented, and we must demonstrate that there is not a scarcity of talent, just a lack of exposure and opportunity.”

Mapplethorpe is expected in theaters this October.


Karen Moltenbrey is a longtime writer and editor in the CG and post industries.

Our Virtual Color Roundtable

By Randi Altman

The number of things you can do with color in today’s world is growing daily. It’s not just about creating a look anymore, it’s using color to tell or enhance a story. And because filmmakers recognize this power, they are getting colorists involved in the process earlier than ever before. And while the industry is excited about HDR and all it offers, this process also creates its own set of challenges and costs.

To find out what those in the trenches are thinking, we reached out to makers of color gear as well as hands-on colorists with the same questions, all in an effort to figure out today’s trends and challenges.

Company 3 Senior Colorist Stephen Nakamura
Company 3 is a global group of creative studios specializing in color and post services for features, TV and commercials. 

How has the finishing of color evolved most recently?
By far, the most significant change in the work that I do is the requirement to master for all the different exhibition mediums. There’s traditional theatrical projection at 14 footlamberts (fL) and HDR theatrical projection at 30fL. There’s IMAX. For home video, there’s UHD and different flavors of HDR. Our task with all of these is to master the movie so it feels and looks the way it’s supposed to feel and look on all the different formats.
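The footlambert figures Nakamura cites map directly onto the nit values used for video masters (1 fL is about 3.426 cd/m², or nits), which shows how far apart the theatrical and 1000-nit home HDR targets really sit. A quick sketch of the conversion:

```python
# Footlamberts to nits: SDR theatrical projection at 14 fL is roughly a
# 48-nit image, HDR theatrical at 30 fL roughly 103 nits, while home HDR
# masters commonly target 1000 nits. 1 fL = 3.426 cd/m^2 (approx.).
FL_TO_NITS = 3.426

for fl in (14, 30):
    print(f"{fl} fL is about {fl * FL_TO_NITS:.0f} nits")
```

With targets this far apart, a single grade cannot simply be rescaled, which is why each format gets its own interpretive pass.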

There’s no one-size-fits-all approach. The colorist’s job is to work with the filmmakers and make those interpretations. At Company 3 we’re always creating custom LUTs. There are other techniques that help us get where we need to be to get the most out of all these different display types, but there’s no substitute for taking the time and interpreting every shot for the specific display format.

How has laser projection and HDR impacted the work? And do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
Not too long ago, a cinematographer could expose an image specifically for one display format — a film print projected at 14fL. They knew exactly where they could place their highlights and shadows to get a precise look onscreen. Today, they’re thinking in terms of the HDR version, where if they don’t preserve detail in the blacks and whites it can really hurt the quality of the image in some of the newer display methods.

I work frequently with Dariusz Wolski (Sicario: Day of the Soldado, All the Money in the World). We’ve spoken about this a lot, and he’s said that when he started shooting features, he often liked to expose things right at the edge of underexposure because he knew exactly what the resulting print would be like. But now he has to preserve the detail and fine-tune it with me in post, because it has to work in so many different display formats.

There are also questions about how the filmmakers want to use the different ways of seeing the images. Sometimes they really like the qualities of the traditional theatrical standard and don’t want the HDR version to look very different or to push the limits of the dynamic range. If we have more dynamic range, more light, to work with, it means that in essence we have a larger “canvas” to work on. But you need to take the time to individually treat every shot if you want to get the most out of that “canvas.”

Where do you see the industry moving in the near future?
The biggest change I expect to see is the development of even brighter, higher-contrast exhibition mediums. At NAB, Sony unveiled this wall of LED panels that are stitched together without seams and can display up to 1000 nits. It can be the size of a screen in a movie theater. If that took off, it could be a game changer. If theatrical exhibition gets better with brighter, higher-contrast screens, I think the public will enjoy it, provided that the images are mastered appropriately.

Sicario: Day of the Soldado

What is the biggest challenge you see for the color grading process now and beyond?
As there are more formats, there will be more versions of the master. From P3 to Rec.709 to HDR video in PQ — they all translate color information differently. It’s not just the brightness and contrast but the individual colors. If there’s a specific color blue the filmmakers want for Superman’s suit, or red for Spider-Man, or whatever it is, there are multiple layers of challenges involved in maintaining those across different displays. Those are things you have to take a lot of care with when you get to the finishing stage.
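One reason those formats translate values so differently is the transfer function itself. HDR video in PQ uses the SMPTE ST 2084 curve, which maps absolute luminance to signal level. A sketch of the encode, with constants taken from the standard:

```python
# SMPTE ST 2084 (PQ) encode: absolute luminance in nits (up to 10,000)
# to a normalized 0-1 signal. Constants are defined in the standard.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

# 100-nit SDR white sits near 0.51 on the PQ curve and 1000 nits near 0.75,
# so SDR and HDR grades cannot simply be rescaled into one another.
print(round(pq_encode(100), 2), round(pq_encode(1000), 2))
```

Because PQ allocates code values perceptually rather than relative to a fixed white, every specific color the filmmakers care about has to be re-verified in each deliverable.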

What’s the best piece of work you’ve seen that you didn’t work on?
I know it was 12 years ago now, but I’d still say 300, which was colored by Company 3 CEO Stefan Sonnenfeld. I think that was enormously significant. Everyone who has seen that movie is aware of the graphic-novel-looking imagery that Stefan achieved in color correction working with Zack Snyder and Larry Fong.

We could do a lot in a telecine bay for television, but a lot of people still thought of digital color correction for feature films as an extension of the timing process from the photochemical world. But the look in 300 would be impossible to achieve photo-chemically, and I think that opened a lot of people’s minds about the power of digital color correction.

Alt Systems Senior Product Specialist Steve MacMillian
Alt Systems is a systems provider, integrating compositing, DI, networking and storage solutions for the media and entertainment industry.

How has the finishing of color evolved most recently?
Traditionally, there has been a huge difference between the color finishing process for television production versus cinematic release. It used to be that a target format was just one thing, and finishing for TV was completely different than finishing for the cinema.

Colorists working on theatrical films will spend most of their effort grading for projection, and only afterward is there a detailed trim pass to make a significantly different version for the small screen. Television colorists, who are usually under much tighter schedules, will often only be concerned with making Rec.709 look good on a standard broadcast monitor. Unless a great deal of care is taken to preserve the color and dynamic range of the digital negative throughout the process, the Rec.709 grade will not be suitable for translation to other expanded formats like HDR.

Now, there is an ever-growing number of distribution formats with different color and brightness requirements. And with the expectation of delivering to all of these on ever-tighter production budgets, it has become important to use color management techniques so that the work is not duplicated. If done properly, this allows for one grade to service all of these requirements with the least amount of trimming needed.

How has laser projection and HDR impacted the work?
HDR display technology, in my opinion, has changed everything. The biggest impact on color finishing is the need for monitoring in both HDR and SDR in different color spaces. Also, there is a much larger set of complex delivery requirements, along with the need for greater technical expertise and capabilities. Much of this complexity can be reduced by having the tools that make the various HDR image transforms and complex delivery formats as automatic as possible.

Color management is more important than ever. Efficient and consistent workflows are needed for dealing with multiple sources with unique color sciences, integrating visual effects and color grading while preserving the latitude and wide color gamut of the image.

The color toolset should support remapping to multiple deliverables in a variety of color spaces and luminance levels, and include support for dynamic HDR metadata systems like Dolby and HDR10+. As HDR color finishing has evolved, so has the way it is delivered to studios. Most commonly it is delivered in an HDR IMF package. It is common that Rec.2020 HDR deliverables be color constrained to the P3 color volume and also that Light Level histograms and HDR QC reports be delivered.

Do you feel DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
Not as much as you would think. Two things are working against this. First, film and high-end digital cameras themselves have for some time been capturing latitude suitable for HDR production. Proper camera exposure is all that is needed to ensure that an image with a wide enough dynamic range is recorded. So from a capture standpoint, nothing needs to change.

The other is cost. There are currently only a small number of suitable HDR broadcast monitors, and most of these are extremely expensive and not designed well for the set. I’m sure HDR monitoring is being used on-set, but not as much as expected for productions destined for HDR release.

Also, it is difficult to truly judge HDR displays in a bright environment, and cinematographers may feel that monitoring in HDR is not needed full time. Traditionally, with film production, cinematographers became accustomed to not being able to monitor accurately on-set, relying instead on their experience and other means of judging light and exposure. I think the main concern for cinematographers is the effect of lighting choices and apparent resolution, saturation and contrast when viewed in HDR.

Highlights in the background can potentially become distracting when displayed at 1000 nits versus being clamped at 100. Framing and lighting choices are informed by proper HDR monitoring. I believe we will see more HDR monitoring on-set as more suitable displays become available.
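The point about a 1000-nit highlight versus a 100-nit clamp can be sketched numerically. This is a minimal illustration assuming simple hard clipping at the display's peak luminance (real SDR masters use a soft highlight roll-off rather than a hard clamp); the function name is illustrative, not from any grading API.

```python
# Minimal sketch: why the same background highlight reads differently
# in SDR and HDR masters. Assumes hard clipping at display peak
# (real pipelines use soft roll-off tone mapping).

def displayed_nits(scene_nits: float, peak_nits: float) -> float:
    """Luminance the viewer actually sees, limited by the display's peak."""
    return min(scene_nits, peak_nits)

highlight = 800.0  # a bright window in the background, in nits

sdr = displayed_nits(highlight, 100.0)   # SDR master clamps it to 100
hdr = displayed_nits(highlight, 1000.0)  # HDR master reproduces all 800

print(sdr, hdr)  # 100.0 800.0
```

An 8x jump in displayed brightness is why a highlight that was harmless in SDR can pull the eye in HDR.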

Colorfront’s Transkoder

Where do you see the industry moving in the near future?
Clearly HDR display technology is still evolving, and we will see major advances in HDR emissive displays for the cinema in the very near future. This will bring new challenges and require updated infrastructure for post as well as the cinema. It’s also likely that color finishing for the cinema will become more and more similar to the production of HDR for the home, with only relatively small differences in overall luminance and the ambient light of the environment.

Looking forward, standard dynamic range will eventually go away in the same way that standard definition video did. As we standardize on consumer HDR displays, and high-performance panels become cheaper to make, we may not need the complexity of HDR dynamic remapping systems. I expect that headset displays will continue to evolve and will become more important as time goes on.

What is the biggest challenge you see for the color grading process now and beyond?
We are experiencing a period of change that can be compared to the scope of change from SD to HD production, except it is happening much faster. Even if HDR in the home is slow to catch on, it is happening. And nobody wants their production to be dated as SDR-only. Eventually, it will be impossible to buy a TV that is not HDR-capable.

Aside from the changes in infrastructure, colorists used to working in SDR have some new skills to learn. I think it is a mistake to do separate grading versions for every major delivery format. Even though we have standards for HDR formats, they will continue to evolve, so post production must evolve too. The biggest challenge is meeting all of these different delivery requirements on budgets that are not growing as fast as the formats.

Northern Lights Flame Artist and Colorist Chris Hengeveld
NY- and SF-based Northern Lights, along with sister companies Mr. Wonderful for design, SuperExploder for composing and audio post, and Bodega for production, offers one-stop-shop services.

How has the finishing of color evolved most recently?
It’s interesting that you use the term “finishing of color.” In my clients’ world, finishing and color now go hand in hand. My commercial clients expect not only a great grade but seamless VFX work in finalizing their spots. Both of these now often take place with the same artist. Work has expanded from straight finishing with greenscreen, product replacement and the like to a grade on par with some of the higher-end coloring studios. Budget pressure is merging vastly separate disciplines into one final pass.

Clients now expect to have a rough look ready not only of the final VFX, but also of the color pass before they attend the session. I usually only do minor VFX tweaks when clients arrive. Sending QuickTimes back and forth between studio and client usually gets us to a place where our client, and their client, are satisfied with at least the direction if not the final composites.

Color, as a completely subjective experience, is best enjoyed with the colorist in the room. We do grade some jobs remotely, but my experience has clearly been that from both time and creativity standpoints, it’s best to be in the grading suite. Unfortunately, due to recent time constraints and budget issues, even higher-end projects are being evaluated on a computer, phone or tablet back at the office. This leads to more iterations and less of a “the whole is greater than the sum of the parts” mentality. When client and colorist work in the same room, the final product is often markedly better than what either could envision separately.

Where do you see the industry moving in the near future?
I see the industry continuing to coalesce around multi-divisional companies that are best suited to fulfill many clients’ needs at once. Most projects that come to us have diverse needs that center around one creative idea. We’re all just storytellers. We do our best to tell the client’s story with the best talent we offer, in a reasonable timeframe and at a reasonable cost.

The future will continue to evolve, putting more pressure on the editorial staff to deliver near perfect rough cuts that could become finals in the not-too-distant future.

Invisalign

The tools continue to level the playing field. More generalists will be trained in disciplines including video editing, audio mixing, graphic design, compositing and color grading. This is not to say that the future of singularly focused creatives is over. It’s just that those focused creatives are assuming more and more responsibilities. This is a continuation of the consolidation of roles that has been going on for several years now.

What is the biggest challenge you see for the color grading process now and beyond?
The biggest challenge going forward is both technical and budgetary. Many new formats have emerged, including the new ProRes RAW. New working color spaces have also emerged. Many of us work without on-staff color scientists and must find our way through the morass of HDR, ACES, Scene Linear and Rec.709. Working with materials that round trip in-house is vastly easier than dealing with multiple shops all with their own way of working. As we collaborate with outside shops, it behooves us to stay at the forefront of technology.

But truth be told, perhaps the biggest challenge is keeping the creative flow and putting the client’s needs first. Making sure the technical challenges don’t get in the way. Clients need to see a seamless view without technical hurdles.

What’s the best piece of work you’ve seen that you didn’t work on?
I am constantly amazed at the quality of work coming out of Netflix. Some of the series are impeccably graded. Early episodes of Bloodline, which was shot with the Sony F65, come to mind. The visuals were completely absorbing, both daytime and nighttime scenes.

Codex VP Business Development Brian Gaffney
Codex designs tools for color, dailies creation, archiving, review and networked attached storage. Their offerings include the new Codex ColorSynth with Keys and the MediaVault desktop NAS.

How has the finishing of color evolved most recently?
While it used to be a specialized suite in a post facility, color finishing has evolved tremendously over the last 10 years with low-cost access to powerful systems like Resolve, used for everything from on-set and commercial finishing to final DI color grading. These systems have evolved into more than just color. Now they are editorial, sound mixing and complete finishing platforms.

How has laser projection and HDR impacted the work?
Brighter images in the theater and the home from laser projection, OLED walls and HDR displays will certainly change the viewer’s experience, and they have helped create more work in post, offering up another pass for grading.

However, brighter images also show off image artifacts and can bring attention to highlights that may already be clipping. Shadow detail that was graded in SDR may now look milky in HDR. These new display mediums require that you spend more time optimizing the color correction for both display types. There is no magic one grade fits all.

Do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
I think cinematographers are still figuring this out. Much like color correction between SDR and HDR, lighting for the two is different. A window that was purposely blown out in SDR, to hide a lighting rig outside, may show up in HDR, exposing the rig itself. Color correction might be able to correct for this, but unless a cinematographer can monitor in HDR on-set, these issues will come up in post. To do it right, lighting optimization between the two spaces is required, plus SDR and HDR monitoring on-set and near-set and in editorial.

Where do you see the industry moving in the near future?
It’s all about content. With the traditional studio infrastructure and broadcast television market changing to Internet Service Providers (ISPs), the demand for content, both original and HDR remastered libraries, is helping prop up post production and is driving storage- and cloud-based services.

Codex’s ColorSynth and Media Vault

In the long term, if the competition in this space continues and the demand for new content keeps expanding, traditional post facilities will become “secure data centers” and managed service providers. With cloud-based services, the talent no longer needs to be in the facility with the client. Shared projects with realtime interactivity from desktop and mobile devices will allow more collaboration among global-based productions.

What is the biggest challenge you see for the color grading process now and beyond?
Project management — sharing color setups among different workstations. Monitoring the color with properly calibrated displays in both SDR and HDR, in support of multiple deliverables, is always a challenge. New display technologies, like laser projection and new Samsung and Sony videowalls, may not be cost-effective for the creative community to access for final grading. Only certain facilities may wind up specializing in this type of grading experience, limiting global access for directors and cinematographers who want to fully visualize how their product will look on these new display mediums. It’s a cost that may not get the needed ROI, so in the near future many facilities may not be able to support the full demand of deliverables properly.

Blackmagic Director of Sales/Operations Bob Caniglia
Blackmagic creates DaVinci Resolve, a solution that combines professional offline and online editing, color correction, audio post production and visual effects in one software tool.

How has the finishing of color evolved most recently?
The ability to work in 8K, and in whatever flavor of HDR you need, is happening. But if you are talking evolution, it is about the ability to collaborate with everyone in the post house and to do high-quality color correction anywhere. Editors, colorists, sound engineers and VFX artists should not be kept apart or kept from collaborating on the same project at the same time.

New collaborative workflows will speed up post production because you will no longer need to import, export or translate projects between different software applications.

How has laser projection and HDR impacted the work?
The most obvious impact has been on the need for colorists to be using software that can finish a project in whatever HDR format the client asks for. That is the same with laser projection. If you do not use software that is constantly updating to whatever new format is introduced, being able to bid on HDR projects will be hard.

HDR is all about more immersive colors. Any colorist should be ecstatic to be able to work with images that are brighter, sharper and with more data. This should allow them to be even more creative with telling a story with color.

Do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
As for cinematographers, HDR gives viewers a whole new level of image detail. But that hyper-reality could draw the viewer’s eye away from the intended target in a shot. The beautiful details shining back on a coffee pot in a tracking shot may not be worth worrying about in SDR, but in HDR every shot will create more work for the colorist to make sure the viewer doesn’t get distracted by the little things. For DPs, it means they are going to have to be much more aware of lighting, framing and planning the impact of every possible item and shadow in an image.

Where do you see the industry moving in the near future?
Peace in our time amongst all of the different post silos, because those silos will finally be open. And there will be collaboration between all parts of the post workflow. Everyone — audio, VFX, editing and color correction — can work together on the same project seamlessly.

For example, in our Resolve tool, post pros can move between them all. This is what we see happening with colorists and post houses right now, as each member of the post team can be much more creatively flexible because anyone can explore new toolsets. And with new collaboration tools, multiple assistants, editors, colorists, sound designers and VFX artists can all work on the same project at the same time.

Resolve 15

For a long-term view, you will always have true artists in each of the post areas. People who have mastered the craft and can separate themselves as being color correction artists. What is really going to change is that everyone up and down the post workflow at larger post houses will be able to be much more creative and efficient, while small boutique shops and freelancers can offer their clients a full set of post production services.

What is the biggest challenge you see for the color grading process now and beyond?
Speed and flexibility. With everyone now collaborating and the colorist being part of every stage of the post process, you will be asked to do things immediately… and in any format. So if you are not able to work in real time, or with whatever footage format is thrown at you, they will find someone who can.

This also comes with the challenge of changing the old notion that the colorist is one of the last people to touch a project. You will be asked to jump in early and often. Because every client would love to show early edits that are graded to get approvals faster.

FilmLight CEO Wolfgang Lempp
FilmLight designs, creates and manufactures color grading systems, image processing applications and workflow tools for the film and television industry.

How has the finishing of color evolved recently?
When we started FilmLight 18 years ago, color management was comparatively simple: Video looked like video, and digital film was meant to look like film. And that was also the starting point for the DCI — the digital cinema standard tried to make digital projection look exactly like conventional cinema. This understanding lasted for a good 10 years, and even ACES today is very much built around film as the primary reference. But now we have an explosion of new technologies, new display devices and new delivery formats.

There are new options in resolution, brightness, dynamic range, color gamut, frame rate and viewing environments. The idea of a single deliverable has gone: There are just too many ways of getting the content to the viewer. That is certainly affecting the finishing process — the content has to look good everywhere. But there is another trend visible, too, which here in the UK you can see best on TV. The color and finishing tools are getting more powerful and the process is getting more productive. More programs than ever before are getting a professional color treatment before they go out, and they look all the better for it.

Either way, there is more work for the colorist and finishing house, which is of course something we welcome.

How has laser projection and HDR impacted the work?
Laser projection and HDR for cinema and TV are examples of what I described above. We have the color science and the tools to move comfortably between these different technologies and environments, in that the color looks “right,” but that is not the whole story.

The director and DP will choose to use a format that will best suit their story, and will shoot for their target environment. In SDR, you might have a bright window in an interior scene, for example, which will shape the frame but not get in the way of the story. But in HDR, that same window will be too bright, obliterate the interior scene and distract from the story. So you would perhaps frame it differently, or light up the interior to restore some balance. In other words, you have to make a choice.

HDR shouldn’t be an afterthought, it shouldn’t be a decision made after the shoot is finished. The DP wants to keep us on the edge of our seats — but you can’t be on the edge in HDR and SDR at the same time. There is a lot that can be done in post, but we are still a long way from recreating the multispectral, three-dimensional real world from the output of a camera.

HDR, of course, looks fantastic, but the industry is still learning how to shoot for best effect, as well as how to serve all the distribution formats. It might well become the primary mastering format soon, but SDR will never go away.

Where do you see the industry moving in the future?
For me, it is clear that as we have pushed resolution, frame rate, brightness and color gamut, it has affected the way we tell stories. Less is left to the imagination. Traditional “film style” gave a certain pace to the story, because there was the expectation that the audience was having to interpret, to think through to fill in the black screen in between.

Now technology has made things more explicit and more immersive. We now see true HDR cinema technology emerging with a brightness of 600 nits and more. Technology will continue to surge forward, because that is how manufacturers sell more televisions or projectors — or even phones. And until there is a realistic simulation of a full virtual reality environment, I don’t see that process coming to a halt. We have to be able to master for all these new technologies, but still ensure compatibility with existing standards.

What is the biggest challenge for color grading now and in the future?
Color grading technology is very much unfinished business. There is so much that can be done to make it more productive, to make the content look better and to keep us entertained.

Blackboard

As much as we might welcome all the extra work for our customers, generating an endless stream of versions for each program is not what color grading should be about. So it will be interesting to see how this problem will be solved. Because one way or another, it will have to be. But while this is a big challenge, it hopefully isn’t what we put all our effort into over the coming years.

The real challenge is to understand what makes us appreciate certain images over others. How composition and texture, how context, noise and temporal dynamics — not just color itself — affect our perception.

It is interesting that film as a capture medium is gaining popularity again, especially large-format capture. It is also interesting that the “film look” is still precious when it comes to color grading. It puts all the new technology into perspective. Filmmaking is storytelling. Not just a window to the world outside, replaced by a bigger and clearer window with new technology, but a window to a different world. And the colorist can shape that world to a degree that is limited only by her imagination.

Olympusat Entertainment Senior DI Colorist Jim Wicks
A colorist since 2007, Jim has been a senior DI colorist at Olympusat Entertainment since 2011. He has color restored hundreds of classic films and is very active in the color community.

How has the finishing of color evolved most recently?
The phrase I’m keying in on in your question is “most recently.” I believe the role of a colorist has been changing exponentially for the last several years, maybe longer. I would say that we are becoming, if we haven’t already, more like finishing artists. Color is now just one part of what we do. Because technologies are changing more rapidly than at any time I’ve witnessed, we now have a lot to understand and comprehend in addition to just color. There is ACES, HDR, changing color spaces, integrating VFX workflows into our timelines, laser projection and so on. The list keeps growing.

How has laser projection and HDR impacted the work?
For the time being, they do not impact my work. I am currently required to deliver in Rec.709. However, within that confine I am grading a wider range of media than ever before, such as 2K and 4K uncompressed DPX, Phantom digital video files and Red Helium 8K in the IPP2 workspace. Laser projection and HDR are something that I continue to study by attending symposiums, or wherever I can find that information. I believe laser projection and HDR are important to know now. When the opportunity to work with them is available to me, I plan to be ready.

Do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
Of course! At the very heart of every production, the cinematographer is the creator and author of the image. It is her creative vision. The colorist is the protector of that image. The cinematographer entrusts us with her vision. In this respect, the colorist needs to be in sync with the cinematographer as never before. As cinematographers move because of technology, so we move. It’s all about the deliverable and how it will be displayed. I see no benefit for the colorist and the cinematographer to not be on the same page because of changing technology.

Where do you see the industry moving in the near future and the long-range future?
In the near future: HDR, laser projection, 4K and larger and larger formats.

In the long-range future: I believe we only need to look to the past to see the changes that are inevitably ahead of us.

Technological changes forced film labs, telecine and color timers to change and evolve. In the nearly two decades since O Brother, Where Art Thou?, we no longer color grade movies the way we did back when the Coen classic was released in 2000. I believe it is inevitable: Change begets change. Nothing stays the same.

In keeping with the types of changes that came before, it is only a matter of time before today’s colorist is forced to change and evolve just as those before us were forced to do so. In this respect I believe AI technology is a game-changer. After all, we are moving towards driverless cars. So, if AI advances the way we have been told, will we need a human colorist in the future?

What is the biggest challenge you see for the color grading process now and beyond?
Not to sound like a “get off my lawn” rant, but education is the biggest challenge, and it’s a two-fold problem. Firstly, at many fine film schools in the US, color grading is not taught as a degree-granting course, or at all.

Secondly, the glut of for-profit websites that teach color grading courses has no standardized curriculum, which wouldn’t be a problem except that at present there is no way to measure how much anyone actually knows. I have personally encountered individuals who claim to be colorists and yet do not know how to color grade. As a manager, I have interviewed them — their resumes look strong, but their skills are not there. They can’t do the work.

What’s the best piece of work you’ve seen that you didn’t work on?
Just about anything shot by Roger Deakins. I am a huge fan of his work. Mitch Paulson and his team at Efilm did great work on protecting Roger’s vision for Blade Runner 2049.

Colorist David Rivero
This Madrid-born colorist is now based in China. He color grades and supervises the finishing of feature films and commercials, normally all versions, and often the trailers associated with them.

How has the finishing of color evolved most recently?
The line between strictly color grading and finishing is getting blurrier by the year. Although there is still a clearer separation in the commercial world, on the film side the colorist has become the de facto finishing or supervising finishing artist. I think it is another sign of the bigger role color grading is starting to play in post.

In the last two to three years I’ve noticed that fewer clients are looking at it as an afterthought, or as simply “color matching.” I’ve seen how the very same people went from a six- to seven-day DI schedule five years ago to a 20-day schedule now. The idea that spending a relatively small amount of extra time and budget on the final step can get you a far superior result is finally sinking in.

The tools and technology are finally moving into a “modern age” of grading:
– HDR is a game changer on the image side of things, providing a noticeable difference for the audience and requiring a different approach on our side to dealing with all that information.

– The eventual acceptance by all color systems of what were traditionally compositing or VFX tools is also a turning point, although a controversial one. Many think that colorists should focus on grading. However, I think that rather than colorists becoming compositors, it is the color grading concept and mission that is (still) evolving.

How has laser projection and HDR impacted the work? And do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
Well, on my side of the world (China), the laser and HDR technologies are just starting to get to the public. Cinematographers are not really changing how they work yet, as it is a very small fraction of the whole exhibition system.

As for post, it requires a more careful way of handling the image, as it needs higher quality plates, compositions, CG, VFX, a more careful grade, and you can’t get away with as many tricks as you did when it was just SDR. The bright side is the marvelous images, and how different they can be from each other. I believe HDR is totally compatible with every style you could do in SDR, while opening the doors to new ones. There are also different approaches on shooting and lighting for cinematographers and CG artists.

Goldbuster

The biggest challenge it has created has been on the exhibition side in China. Although Dolby Cinema (Vision + Atmos) venues are controlled and require a specific pass and DCP, there are other laser projection theaters that show the same DCP being delivered to common (xenon lamp) theaters. This creates a frustrating environment. For example, during the 3D grading you not only need to consider the very dark theaters at 3fL-3.5fL, but also the new laser rooms that are driving their lamps to 7fL-8fL to show off why they charge higher ticket prices.
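To put those projection levels in context, foot-lamberts convert to nits (cd/m²) by a fixed factor of roughly 3.426. A quick sketch (the 14 fL reference for standard 2D cinema comes from SMPTE practice; the helper name is illustrative):

```python
# Rough conversion between foot-lamberts (fL) and nits (cd/m²).
# 1 fL ≈ 3.426 cd/m².

FL_TO_NITS = 3.426

def fl_to_nits(fl: float) -> float:
    return fl * FL_TO_NITS

for label, fl in [("dark 3D theater", 3.0),
                  ("bright 3D laser room", 8.0),
                  ("standard 2D cinema (SMPTE)", 14.0)]:
    print(f"{label}: {fl} fL ≈ {fl_to_nits(fl):.0f} nits")
```

So a 3D grade may have to hold up anywhere from about 10 nits to nearly 30, within a single release.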

Where do you see the industry moving in the near future and the long-range future?
I hope to see the HDR technologies settling and becoming the new standard within the next five to six years, with HDR used as the reference master from which all other deliveries are created. I also expect all these relatively new practices and workflows (involving ACES, EXRs with the VFX/CG passes, non-LUT deliveries) to become more standardized and controlled.

In the long term, I could imagine two main changes happening, closely related to each other:
– The concept of grading and of the colorist, especially in films and long formats, evolving in importance and in its relationship to the production. I believe the separation, or independence, between photography and grading will grow wider (and more necessary) as tools evolve and the process becomes more standardized. We might get to something akin to how sound editors and sound mixers relate and work together on the sound.

– The addition of (serious) compositing in essentially all the main color systems is the first step toward the possibilities of future grading. A feature like the recent Face Refinement in Resolve is one of the things I dreamed about five or six years ago.

What is the biggest challenge you see for the color grading process now and beyond?
Nowadays one of the biggest challenges is possibly the multi-mastering environment, with several versions on different color spaces, displays and aspect ratios. It is becoming easier, but it is still more painful than it should be.

Shrinking margins also hurt the whole industry. We all work thanks to those margins, but cutting budgets and expecting the same results is not something that is going to happen.

What’s the best piece of work you’ve seen that you didn’t work on?
The Revenant, Mad Max, Fury and 300.

Carbon Colorist Aubrey Woodiwiss
Full-service creative studio Carbon has offices in New York, Chicago and Los Angeles.

How has the finishing of color evolved most recently?
It is always evolving. The tools are becoming ever more powerful, and camera formats are becoming larger, with more range and information in them. Probably the most significant evolution I see is a greater understanding of color science and color space workflows.

How has laser projection and HDR impacted the work? And do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
These elements impact how footage is viewed and dealt with in post. As far as I can see, it isn’t affecting how things are shot.

Where do you see the industry moving in the near future? What about in the long-range future?
I see formats becoming larger, viewing spaces and color gamuts becoming wider, and more streaming- and laptop-based technologies and workflows.

What is the biggest challenge you see for the color grading process now and beyond?
The constant challenge is integrating the space you traditionally color grade in to how things are viewed outside of this space.

What’s the best piece of work you’ve seen that you didn’t work on?
Knight of Cups, directed by Terrence Malick with cinematography by Emanuel Lubezki.

Ntropic Colorist Nick Sanders
Ntropic creates and produces work for commercials, music videos, and feature films as well as experiential and interactive VR and AR media. They have locations in San Francisco, Los Angeles and New York City.

How has the finishing of color evolved most recently?
SDR grading in Rec.709 and 2.4 Gamma is still here, still looks great, and will be prominent for a long time. However, I think we’re becoming more aware of how exciting grading in HDR is, and how many creative doors it opens. I’ve noticed a feeling of disappointment when switching from an HDR to an SDR version of a project, and wondered for a second if I’m accidentally viewing the ungraded raw footage, or if my final SDR grade is actually as flat as it appears to my eyes. There is a dramatic difference between the two formats.

HDR is incredible because you can make the highlights blisteringly hot, saturate a color to nuclear levels or keep things mundane and save those heavier-handed tools in your pocket for choice moments in the edit where you might want some extra visceral impact.

How has laser projection and HDR impacted the work? And do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
In one sense, cinematographers don’t need to do anything differently. Colorists are able to create high-quality SDR and HDR interpretations of the exact same source footage, so long as it was captured in a high-bit-depth raw format and exposed well. We’re even seeing modern HDR reimaginings of classic films. Movies as varied in subject matter as Saving Private Ryan and the original Blade Runner are coming back to life because the latitude of classic film stocks allows it. However, HDR has the power to greatly exaggerate details that may have otherwise been subtle or invisible in SDR formats, so some extra care should be taken in projects destined for HDR.

Extra contrast and shadow detail mean that noise is far more apparent in HDR projects, so ISO and exposure should be adjusted on-set accordingly. Also, the increased highlight range has some interesting consequences in HDR. For example, large blown-out highlights, such as overexposed skies, can look particularly bad. HDR can also retain more detail and color in the upper ranges in a way that may not be desirable. An unremarkable, desaturated background in SDR can become a bright, busy and colorful background in HDR. It might prove distracting to the point that the DP may want to increase his or her key lighting on the foreground subjects to refocus our attention on them.

Panasonic “PvP”

Where do you see the industry moving in the near future? What about the long-range future?
I foresee more widespread adoption of HDR — in a way that I don’t with 3D and VR — because there’s no headset device required to feel and enjoy it. Having some HDR nature footage running on a loop is a great way to sell a TV in Best Buy. Where the benefits of another recent innovation, 4K, are really only detectable on larger screens and begin to deteriorate with the slightest bit of compression in the image pipeline, HDR’s magic is apparent from the first glance.

I think we'll first see orders for both HDR and SDR deliverables on everything, then a gradual phasing out of the SDR deliverables as the technology becomes more ubiquitous, just as we saw in the standard-definition transition to HD.

For the long range, I wouldn't be surprised to see a phasing out of projectors as LED walls become more common for theatrical exhibition, thanks to their deeper black levels. That would effectively blur the line between the technologies available in the theater and at home for good.

What is the biggest challenge you see for the color grading process now and beyond?
The lack of a clear standard makes workflow decisions a little tricky at the moment. One glaring issue is that consumer HDR displays don’t replicate the maximum brightness of professional monitors, so there is a question of mastering one’s work for the present, or for the near future when that higher capability will be more widely available. And where does this evolution stop? 4,000 nits? 10,000 nits?
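For context on those nit figures: HDR masters are typically encoded with the PQ (SMPTE ST 2084) transfer curve, which maps absolute luminance up to a 10,000-nit ceiling into a 0-1 signal. A minimal sketch of the encode side, using the constants from the ST 2084 specification, shows why today's 1,000-nit grading monitors use only part of the signal range:

```python
# PQ (SMPTE ST 2084) encode: absolute luminance in nits -> 0..1 code value.
# Constants are taken from the ST 2084 specification.
M1 = 2610 / 16384          # ~0.1593
M2 = 2523 / 4096 * 128     # ~78.84
C1 = 3424 / 4096           # ~0.8359
C2 = 2413 / 4096 * 32      # ~18.85
C3 = 2392 / 4096 * 32      # ~18.69

def pq_encode(nits: float) -> float:
    """Map absolute luminance (0..10,000 nits) to a PQ code value (0..1)."""
    y = min(max(nits / 10000.0, 0.0), 1.0)  # normalize to the 10,000-nit ceiling
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

# A 1,000-nit display tops out around three-quarters of the PQ signal
# range; only a 10,000-nit display would use all of it.
for nits in (100, 1000, 4000, 10000):
    print(f"{nits:>5} nits -> PQ code {pq_encode(nits):.3f}")
```

This is why the mastering-target question matters: a grade checked only on a 1,000-nit monitor never exercises the top of the curve that a future 4,000- or 10,000-nit display would reveal.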

Maybe a more pertinent creative challenge in the crossover period is which version to grade first, SDR or HDR, and how to produce the other version. There are a couple of ways to go about it, from using LUTs to largely automate the conversion, to starting from scratch and regrading the source footage in the new format.
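As one illustration of the automated, LUT-style route (a fixed curve applied uniformly, not any particular colorist's recipe), a highlight roll-off such as the extended Reinhard operator can fold HDR highlight range down to an SDR ceiling. A minimal sketch, with luminance expressed in multiples of SDR reference white (so 10.0 would represent a 1,000-nit HDR peak against 100-nit SDR white):

```python
# Toy HDR -> SDR highlight roll-off (extended Reinhard curve).
# Illustrates the kind of fixed mapping a conversion LUT bakes in;
# a real trim pass is far more nuanced and shot-specific.

def tone_map(x: float, hdr_white: float = 10.0) -> float:
    """Compress linear HDR luminance toward SDR range.

    `x` is luminance in multiples of SDR reference white;
    `hdr_white` is the HDR peak level that should land exactly on SDR white.
    """
    return x * (1.0 + x / (hdr_white * hdr_white)) / (1.0 + x)

# Shadows pass through nearly unchanged, while the HDR peak is rolled
# off onto SDR white instead of clipping.
for x in (0.1, 1.0, 5.0, 10.0):
    print(f"HDR {x:>4.1f}x white -> SDR {tone_map(x):.3f}")
```

The trade-off this sketch makes visible is the usual one: the curve guarantees nothing clips, but it also darkens the midtones, which is exactly the kind of thing a from-scratch regrade fixes by hand.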

What’s the best piece of work you’ve seen that you didn’t work on?
Chef’s Table on Netflix was one of the first things I saw in HDR; I still think it looks great!

Main Image: Courtesy of Jim Wicks.