
Our Virtual Production Roundtable

By Randi Altman

Evolve or die. That old adage, while very dramatic, fits well with the state of our current production workflows. While most productions are now shot digitally, the warmth of film is still in the back of pros’ minds. Camera makers and directors of photography often look for ways to retain that warmth in digital, whether it’s through lighting, vintage lenses, color grading, newer technology or all of the above.

There is also the question of setting looks on-set and how 8K and HDR are affecting the picture and workflows. And let’s not forget shooting for OTT series. There is a lot to cover!

In an effort to get a variety of perspectives, we reached out to a few cinematographers and some camera manufacturers to talk trends and technology. Enjoy!

Claudio Miranda, ASC

Claudio Miranda is a Chilean cinematographer who won an Oscar for his work on Life of Pi. He also worked on The Curious Case of Benjamin Button, the first movie nominated for a cinematography Oscar that was shot entirely on digital. Other films include Oblivion, Tomorrowland and the upcoming Top Gun: Maverick.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
Seems like everyone is shooting large format. Chris Nolan and Quentin Tarantino shot 65mm film for their last projects. New digital cameras such as the Alexa LF and Sony Venice cater to this demand. People seem to like the shallow depth of field of these larger format lenses.

How is HDR affecting the way things are being shot these days? Are productions shooting/monitoring HDR on-set?
For me, too much grain in HDR can be distracting. This must be moderated in the camera acquisition format choice and DI. Panning in a high-contrast environment can cause painful strobing. This can be helped in the DI and set design. HDR done well is more important than 8K or even 3D.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? Or are latitude and framerate more important currently?
8K can be important for VFX plates. For me, creatively it is not important, 4K is enough. The positive of 8K is just more K. The downside is that I would rather the camera companies focus on dynamic range, color latitude, sensitivity and the look and feel of the captured image instead of trying to hit a high K number. Also, there are storage and processing issues.

Can you talk about how shooting streaming content, for OTTs like Netflix/Amazon, has changed production practices, and workflows, if at all?
I have not shot for a streaming service. I do think we need to pay attention to all deliverables and make adjustments accordingly. In the DI, I am there for the standard cinema pass, HDR pass, IMAX pass, home video pass and other formats that arise.

Is the availability of all those camera resolutions a help or a hindrance?
I choose the camera that will fit the job. It is my job in prep to test and pick the camera that best serves the movie.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
On set, I am able to view HDR or 709. I test the pipeline and make sure the LUT is correct and make modifications if needed. I do not play with many LUTs on set, I normally just have one. I treat the camera like a film stock. I know I will be there in the DI to finalize the look. On set is not the place for futzing with LUTs on the camera. My plate is full enough as it is.

If not already covered, how has production changed in the last two years?
I am not sure production has changed, but there are many new tools to use to help make work more efficient and economical. I feel that I have always had to be mindful of the budget, no matter how large the show is. I am always looking for new solutions.

Daryn Okada, ASC
Daryn Okada is known for his work on films such as Mean Girls, Anna Karenina and Just Like Heaven. He has also worked on many TV series, such as Scandal, Grey’s Anatomy and Castle. He served as president of the ASC from 2006 to 2009.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses? 

Modern digital cinema cameras, with the proper workflows and techniques, can achieve a level of quality that lets a story’s visual identity evolve in ways that parallel the explorations once done by shooting on film. Larger image sensors, state-of-the-art lenses and mining historic optics enable cinematographers to use their experience and knowledge of the past to paint rich visual experiences for today’s audience.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?
HDR is a creative and technical medium just as shooting and projecting 65mm film would be. It’s up to the director and the cinematographer to decide how to orchestrate the use of HDR for their particular story.

Can you address 8K? What are the positives, and the negatives? Do we just have too many Ks these days? Or are latitude and framerate more important currently?
8K is working its way into production, as 65mm and 35mm VistaVision did, by providing more technical resolution for use in VFX or special-venue exhibition. The enormous amount of data, and the cost of handling it, must be justified by its financial return and by whether it benefits a particular story. Latitude and color depth are paramount to creating a motion picture’s palette and texture. Trying to use a format just because it’s technically possible may distract from an audience’s acceptance of a story or creative concept.

Can you talk about how shooting streaming content, for OTTs like Netflix/Amazon, has changed production practices, and workflows, if at all?

I think the delivery specifications of OTT have generally raised the bar, making 4K and wide color gamut the norm. Cinematographers who have spent years photographing features are accustomed to creating images with detail for a big screen and a wide color palette. It’s a natural creative process to shoot for 4K and HDR in that respect.

Is the availability of all those camera resolutions a help or a hindrance?
Having the best imaging available is always welcome. Even if a camera’s capabilities are not fully exploited technically, subtler images become possible through the smoother transitions and blending of color, contrast and detail that come from originating at higher resolutions and with a wider color range.

Can you talk about color management from the sensor/film to the screen? How do you ensure correct color management from the set into dailies and post, the DI and final delivery?
As cinematographers, we are still involved in workflows for dailies and post production to ensure that everyone’s creative efforts on the final production are maintained for the immediate viewer and preserved for audiences in the future.

How has production changed over the last two years?
There are more opportunities to produce content with creative high-quality cinematography thanks to advancements in cameras and cost-effective computing speed combined with demands of high quality displays and projection.

Vanja Černjul, ASC
This New York-based DP recently worked on the huge hit Crazy Rich Asians. In addition to feature film work, Černjul has shot TV shows (the season 1 finale of The Deuce and two seasons of Marco Polo), as well as commercials for Panasonic and others.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
One interesting trend I noticed is the comeback of image texture. In the past, cinematographers used to expose film stock differently according to the grain texture they desired. Different exposure zones within the same frame had different grain character, which produced additional depth of the image. We lost that once we switched to digital. Crude simulations of film grain, such as overall filters, couldn’t produce the dimensionality we had with film.

Today, I am noticing new ways of bringing the texture back as a means of creative expression. The first one comes in the form of new, sophisticated post production tools designed to replicate the three-dimensional texturing that occurs naturally when shooting film, such as the realtime texturing tool LiveGrain. Monitoring the image on the set with a LiveGrain texture applied can impact lighting, filtration or lens choices. There are also new ways to manipulate texture in-camera. With the rise of super-sensitive, dual-native ISO sensors we can now shoot at very low-light levels and incorporate so-called photon shot noise into the image. Shot noise has organic character, very much like film grain.
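The organic character of shot noise comes straight from the physics: photon arrivals follow Poisson statistics, so noise grows as the square root of the signal and is therefore proportionally stronger in the shadows. A small illustrative sketch (using a Gaussian approximation of Poisson noise; the photon counts are made-up numbers, not from any particular sensor):

```python
import random
import statistics

def simulate_shot_noise(mean_photons, n_samples=10_000, seed=42):
    """Photon arrivals follow Poisson statistics, so for a mean count N
    the standard deviation is sqrt(N). For reasonably large counts the
    Gaussian approximation used here is close enough to illustrate it."""
    rng = random.Random(seed)
    samples = [rng.gauss(mean_photons, mean_photons ** 0.5)
               for _ in range(n_samples)]
    mean = statistics.fmean(samples)
    sd = statistics.stdev(samples)
    return mean, sd, sd / mean  # relative noise falls as 1 / sqrt(N)

# A bright patch (10,000 photons) vs. a deep shadow (100 photons):
bright = simulate_shot_noise(10_000)
shadow = simulate_shot_noise(100)
# the shadow's relative noise is roughly 10x the bright patch's,
# which is why shooting at low light levels "bakes in" visible texture
```

This signal-dependent behavior is what distinguishes shot noise (and film grain) from the uniform overlay of a simple grain filter.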

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?

The creative potential of HDR technology is far greater than that of added resolution. Unfortunately, it is hard for cinematographers to take full advantage of HDR because it is still far from being the standard way the audience sees our images. We can’t have two completely different looks for a single project, and we have to make sure the images are working on SDR screens. In addition, it is still impractical to monitor in HDR on the set, which makes it difficult to adjust lighting and lens choices to expanded dynamic range. Once HDR screens become a standard, we will be able to really start creatively exploring this new territory.

Crazy Rich Asians

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? Or are latitude and framerate more important currently?
Additional resolution adds more available choices regarding relationship of optical systems and aspect ratios. I am now able to choose lenses for their artifacts and character regardless of the desired aspect ratio. I can decide to shoot one part of the film in spherical and the other part in anamorphic and crop the image to the project’s predetermined aspect ratio without fear of throwing away too much information. I love that freedom.

Can you talk about how shooting streaming content, for OTTs like Netflix/Amazon, has changed production practices and workflows, if at all?
For me, the only practical difference between shooting high-quality content for cable or streaming is the fact that Netflix demands its projects be captured in true 4K RAW. I like the commitment to higher technical standards, even though this may be an unwelcome restriction for some projects.

Is the availability of all those camera resolutions a help or a hindrance?
I like choices. As large format lenses become more available, shooting across formats and resolutions will become easier and simpler.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
The key for correct color management from the set to final color grading is in preproduction. It is important to take the time to do proper tests and establish the communication between DIT, the colorist and all other people involved as early as possible. This ensures that original ideas aren’t lost in the process.

Adjusting and fine-tuning the LUT to the lenses, lighting gels and set design and then testing it with the colorist is very important. Once I have a bulletproof LUT, I light and expose all the material for it specifically. If this part of the process is done correctly, the time in final color grading can be spent on creative work rather than on fixing inconsistencies.

I am very grateful for ACES workflow, which offers long-overdue standardization. It is definitely a move in the right direction.

How has production changed over the last two years?
With all the amazing post tools that are becoming more available and affordable, I am seeing negative trends of further cutting of preproduction time, and lack of creative discipline on the set. I sincerely hope this is just a temporary confusion due to recalibration of the process.

Kate Reid, DP
Kate Reid is a UK-based DP working in TV and film. Her recent work includes the TV series Hanna (Amazon) and Marcella 2 (Netflix), as well as additional photography on the final season of Game of Thrones for HBO. She is currently working on Press for the BBC.

Can you talk about some camera trends you’ve been seeing? Such as Large Format? The use of old/vintage lenses?
Large format cameras are being used increasingly on drama productions to satisfy certain distribution platforms’ requirements for additional resolution. And, of course, the choice to use large format in drama brings with it another aesthetic tool for DPs: deciding whether the different depth-of-field falloff, the clarity in the image and so on enhance the particular story they wish to portray on screen.

Like many other DPs, I have always enjoyed using older lenses to help make the digital image softer, more organic and less predictable. But much of this older glass was designed for 35mm-sized sensors and may not cover the larger sensor sizes, so newer lenses designed for large format cameras may become popular by necessity, alongside older large format glass that is enjoying a renaissance.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?
I have yet to shoot a show that requires HDR delivery. It hasn’t yet become the default in drama production in the UK.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? Or are latitude and frame rate more important currently?
I don’t inherently find an ultra sharp image attractive. Through older glass and diffusion filters on the lens, I am usually looking to soften and break down my image, so I personally am not all about the extra Ks. How the camera’s sensor reproduces color and handles highlights and shadows is of more interest to me, and I believe has more impact on the picture.

Of primary importance is how practical a camera is to work with — size and how comfortable the camera is to handle would supersede excessive resolution — as the first requirement of any camera has got to be whether it allows you to achieve the shots you have in mind, because a story isn’t told through its resolution.

Can you talk about how shooting streaming content for OTTs, like Netflix/Amazon, has changed production practices, and workflows, if at all?
The major change is the requirement by Netflix for true 4K resolution, determining which cameras cinematographers are allowed to shoot on. For many cinematographers the Arri Alexa was their digital camera of choice, which was excluded by this rule, and therefore we have had to look to other cameras for such productions. Learning a new camera, its sensor, how it handles highlights, produces color, etc., and ensuring the workflow through to the post facility is something that requires time and testing, which has certainly added to a DP’s workload.

From a creative perspective, however, I found shooting for OTTs (I shot two episodes of the TV series Hanna made by Working Title TV and NBC Universal for Amazon) has been more liberating than making a series for broadcast television as there is a different idea and expectation around what the audience wants to watch and enjoy in terms of storytelling. This allowed for a more creative way of filming.

Is the availability of all those camera resolutions a help or a hindrance?
Where work is seen now can vary from a mobile phone screen to a digital billboard in Times Square, so it is good for DPs to have a choice of cameras and their respective resolutions so we can use the best tool of each job. It only becomes a hindrance if you let the technology lead your creative process rather than assist it.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
Ideally, I will have had the time and opportunity to shoot tests during prep and then spend half a day with the show’s colorist to create a basic LUT I can work with on set. In practice, I have always found that I tweak this LUT during the first days of production with the DIT, and this is what serves me throughout the rest of the show.

I usually work with just one LUT that will be some version of a modified Rec. 709 (unless the look of the show drastically requires something else). It should then be straightforward: the DIT attaches the LUT to the dailies, and the same LUT is applied by editorial, so that exactly what you see on set is what is being viewed in the edit.
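The idea of one shared LUT traveling from set to dailies to the edit can be sketched in miniature. Below is a per-channel 1D LUT applied by linear interpolation; real show LUTs are typically 3D .cube files, and the control-point values here are hypothetical:

```python
def apply_1d_lut(pixel, lut):
    """Apply a per-channel 1D LUT (output values at evenly spaced input
    positions on [0, 1]) via linear interpolation. Production LUTs are
    usually 3D .cube files, but the workflow idea is the same: one
    transform, applied identically on set, in dailies and in the edit."""
    out = []
    n = len(lut) - 1
    for v in pixel:
        pos = min(max(v, 0.0), 1.0) * n   # clamp, then scale to LUT index range
        i = min(int(pos), n - 1)
        frac = pos - i
        out.append(lut[i] * (1.0 - frac) + lut[i + 1] * frac)
    return out

# A hypothetical "lift the shadows" look with five control points:
lut = [0.05, 0.3, 0.55, 0.8, 1.0]
graded = apply_1d_lut([0.0, 0.5, 1.0], lut)  # -> [0.05, 0.55, 1.0]
```

As long as every stage applies this same table, on-set monitors, dailies and editorial all show the same image.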

However, where this fails is that the dailies uploaded to FTP sites — for viewing by the execs, producers and other people who have access to the work — are usually very compressed and low resolution, so they bear little resemblance to how the work looked on set or looks in the edit. This is really unsatisfying, as for months key members of production are not seeing an accurate reflection of the picture. Of course, when you get into the grade this can be restored, but it’s dangerous if those viewing the dailies this way have grown accustomed to a pale comparison of what was shot on set.

How has production changed over the last two years?
There is less differentiation between film and television in how productions are being made and, critically, where they are being seen by audiences, especially with online platforms now making award-winning feature films. The high production values we’ve seen in Netflix and Amazon’s biggest shows have pushed UK television dramas to up their game, which puts pressure on productions, shooting schedules and HODs, as the budgets to help achieve this aren’t there yet.

So, from a ground-level perspective, for DPs working in drama this looks like more pressure to produce work of the highest standard in less time. However, it’s also a more exciting place to be working, as the ideas about how you film something for television versus cinema no longer need apply. The perceived ideas of what an audience is interested in, or expects, are being blown out of the water by the success of new original online content, which flies in the face of more traditional storytelling. Broadcasters are noticing this and, hopefully, this will lead to more exciting and cinematic mainstream television in the future.

Blackmagic’s Bob Caniglia
In addition to its post and broadcast tools, Blackmagic offers many different cameras, including the Pocket Cinema Camera, Pocket Cinema Camera 4K, Micro Studio Camera 4K, Micro Cinema Camera, Studio Camera, Studio Camera 4K, Ursa Mini Pro, Ursa Mini 4.6K and Ursa Broadcast.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
Lens freedom is on everyone’s mind right now… having the freedom to shoot in any style. This is bringing about things like seeing projects shot on 50-year-old glass because the DP liked the feel of a commercial back in the ‘60s.

We actually just had a customer test out actual lenses that were used on The Godfather, The Shining and Casablanca, and it was amazing to see the mixing of those with a new digital cinema camera. And so many people are asking for a camera to work with anamorphic lenses. The trend is really that people expect their camera to be able to handle whatever look they want.

For large format use, I would say that both Hollywood and indie filmmakers are using them more often. Or, at least, they are trying to get the general large format look by using anamorphic lenses to get a shallow depth of field.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?
Right now, HDR is definitely more of a concern for DPs in Hollywood, but also with indie filmmakers and streaming service content creators. Netflix and Hulu have some amazing HDR shows right now. And there is plenty of choice when it comes to the different HDR formats and shooting and monitoring on set. All of that is happening every day, while 8K still needs the industry to catch up with the various production tools.

As for impacting shooting, HDR is about more immersive color, and a DP needs to plan for it. It gives viewers a whole new level of image detail, so DPs have to be much more aware of every surface and lighting choice so that the viewer doesn’t get distracted. Attention to detail gets even higher in HDR, and DPs and colorists will need to keep a close eye on every shot, including when an image in a side-view mirror’s reflection is just a little too sharp and needs a tweak.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? Or are latitude and framerate more important currently?
You can never have enough Ks! Seriously. It is not just about getting a beautiful 8K TV, it is about giving the production and post pros on a project as much data as possible. More data means more room to be creative, and is great for things like keying.

Latitude and framerate are important as well, and I don’t think any one is more important than another. For viewers, the beauty will be in large displays. You’re already seeing 8K displays in Times Square, and though you may not need 8K on your phone, 8K on the side of a building or highway will be very impactful.

I do think one of the ways 8K is changing production practices is that people are going to be much more storage conscious. Camera manufacturers will need to continue to improve workflows as the images get larger in an effort to maximize storage efficiencies.

Can you talk about how shooting streaming content, for OTTs like Netflix/Amazon, has changed production practices, and workflows, if at all?
For streaming content providers, shoots have definitely been impacted and are forcing productions to plan for shooting in a wider number of formats. Luckily, companies like Netflix have been very good about specifying up front the cameras they approve and which formats are needed.

Is the availability of all those camera resolutions a help or a hindrance?
While it can be a bit overwhelming, it does give creatives some options, especially if they have a smaller delivery size than the acquisition format. For instance, if you’re shooting in 4K but delivering in HD, you can do dynamic zooms from the 4K image that look like an optical zoom, or you can get a tight shot and wide shot from the same camera. That’s a real help on a limited budget of time and/or money.
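That 4K-acquisition/HD-delivery punch-in boils down to computing a crop window in the source frame. A rough sketch of the arithmetic (assuming UHD 3840x2160 acquisition and 1920x1080 delivery; not tied to any particular NLE):

```python
def crop_window(src_w, src_h, dst_w, dst_h, zoom, center_x=0.5, center_y=0.5):
    """Compute the source crop for a 'dynamic zoom' when the acquisition
    format is larger than the delivery format. zoom=1.0 uses the full
    frame; zoom=2.0 on UHD crops a region exactly the size of the HD
    deliverable, i.e. a 1:1-pixel, optical-looking punch-in."""
    crop_w = src_w / zoom
    crop_h = src_h / zoom
    assert crop_w >= dst_w and crop_h >= dst_h, "zoomed past delivery resolution"
    x0 = center_x * src_w - crop_w / 2
    y0 = center_y * src_h - crop_h / 2
    # keep the window inside the frame
    x0 = min(max(x0, 0), src_w - crop_w)
    y0 = min(max(y0, 0), src_h - crop_h)
    return int(x0), int(y0), int(crop_w), int(crop_h)

# A wide shot and a centered 2x "tight" shot from the same UHD frame:
wide = crop_window(3840, 2160, 1920, 1080, zoom=1.0)
tight = crop_window(3840, 2160, 1920, 1080, zoom=2.0)
```

Animating `zoom` (and the center point) over time produces the move; past 2x on UHD-to-HD the crop falls below delivery resolution and must be upscaled.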

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
Have the production and post teams plan together from the start, and create the look everyone should be working toward right up front.

Set the LUTs you want before a single shot is done and manage the workflow from camera to final post. Also, choose post software that can bring color correction on-set, near-set and off-set. That lets you collaborate remotely. Definitely choose a camera that works directly with any post software, and avoid transcoding.

How has production changed in the last two years?
Beyond the rise of HDR, one of the other big changes is that more productions are thinking live and streaming more than ever before. CNN’s Anderson Cooper now does a daily Facebook Live show. AMC has the live Talking Dead-type formats for many of their shows. That trend is going to keep happening, so cinematographers and camera people need to be thinking about being able to jump from scripted to live shooting.

Red Digital Cinema’s Graeme Nattress
Red Digital Cinema manufactures professional digital cameras and accessories. Red’s DSMC2 camera offers three sensor options — Gemini 5K S35, Helium 8K S35 and Monstro 8K VV.

Can you talk about some camera trends you’ve been seeing?
Industry camera trends continue to push image quality in all directions. Sensors are getting bigger, with higher resolutions and more dynamic range. Filmmakers continue to innovate, making new and amazing images all the time, which drives our fascination for advancing technology in service to the creative.

How is HDR affecting the way things are being shot these days?
One of the benefits of a primary workflow based on RAW recording is that HDR is not an added extra, but a core part of the system. Filmmakers do consider HDR important, but there’s some concern that HDR doesn’t always look appealing, and that it’s not always an image quality improvement. Cinematography has always been about light and shade and how they are controlled to shape the image’s emotional or storytelling intent. HDR can be a very important tool in that it greatly expands the display canvas to work on, but a larger canvas doesn’t mean a better picture. The increased display contrast of HDR can make details more visible, and it can also make motion judder more apparent. Thus, more isn’t always better; it’s about how you use what you have.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? What’s more important, resolution or dynamic range?
Without resolution, we don’t have an image. Resolution is always going to be an important image parameter. What we must keep in mind is that camera resolution is based on input resolution to the system, and that can — and often will — be different to the output resolution on the display. Traditionally, in video the input and output resolutions were one and the same, but when film was used — which had a much higher resolution than a TV could display — we were taking a high-resolution input and downsampling it to the display, the TV screen.

As with any sampled system, in a digital cinema camera there are some properties we seek to protect and others to diminish. We want a high level of detail, but we don’t want sharpening artifacts and we don’t want aliasing. The only way to achieve that is through a high-resolution sensor, properly filtered (optical low-pass), that can see a large amount of real, un-enhanced detail. So yes, 8K can give you lots of fine detail should you want it, and the imaging benefits extend to downsampling to 4K or 2K: the downsampled image is incredibly robust, noise is reduced, and what noise remains takes on more of a texture, which is much more aesthetically pleasing.
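The noise reduction from downsampling is easy to demonstrate: averaging each 2x2 block of independent noise samples divides the noise standard deviation by sqrt(4) = 2. A toy illustration (a flat grey frame with synthetic Gaussian noise standing in for a real downsampling filter and real sensor data):

```python
import random
import statistics

def downsample_2x(img):
    """Average each 2x2 block -- a crude stand-in for a proper
    8K-to-4K downsample. Averaging 4 independent noise samples
    halves the noise standard deviation."""
    h, w = len(img), len(img[0])
    return [[(img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4.0
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

rng = random.Random(1)
# a flat grey frame with Gaussian "sensor" noise (sigma = 0.05)
noisy = [[0.5 + rng.gauss(0, 0.05) for _ in range(256)] for _ in range(256)]
flat = [v for row in noisy for v in row]
small = [v for row in downsample_2x(noisy) for v in row]
print(statistics.stdev(flat), statistics.stdev(small))  # roughly 0.05 vs 0.025
```

Real downsampling filters are more sophisticated than block averaging, but the statistics work the same way, which is why 8K-originated 4K looks cleaner than native 4K from the same sensor technology.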

One challenge of 8K is an increase in the amount of sensor data to be recorded, but that can be addressed through quality compression systems like RedCode.

Addressing dynamic range is very important because dynamic range and resolution work together to produce the image. It’s easy to think that high resolutions have a negative impact upon dynamic range, but improved pixel design means you can have dynamic range and resolution.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
Color management is vitally important and so much more than just keeping color control from on-set through to delivery. Now with the move to HDR and an increasing amount of mobile viewing, we have a wide variety of displays, all with their own characteristics and color gamuts. Color management allows content creators to display their work at maximum quality without compromise. Red cameras help in multiple ways. On camera, one can monitor in both SDR and HDR simultaneously with the new IPP2 image processing pipeline’s output independence, which also allows you to color via CDL and creative 3D LUT in such a way as to have those decisions represented correctly on different monitor types.
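The CDL decisions mentioned here travel as just ten numbers per shot. A minimal sketch of the standard ASC CDL transfer (slope, offset and power per channel, plus a saturation term around Rec. 709 luma); the grade values in the example are hypothetical:

```python
def apply_cdl(rgb, slope, offset, power, saturation=1.0):
    """ASC CDL transfer: out = clamp(in * slope + offset) ** power,
    applied per channel, then a saturation adjustment around Rec. 709
    luma. These ten numbers per shot carry on-set color decisions
    through dailies, VFX and into the DI."""
    sop = []
    for v, s, o, p in zip(rgb, slope, offset, power):
        x = min(max(v * s + o, 0.0), 1.0)  # clamp before the power function
        sop.append(x ** p)
    luma = 0.2126 * sop[0] + 0.7152 * sop[1] + 0.0722 * sop[2]
    return [min(max(luma + saturation * (c - luma), 0.0), 1.0) for c in sop]

# A hypothetical warm-up with slightly reduced saturation:
graded = apply_cdl([0.18, 0.42, 0.61],
                   slope=(1.1, 1.0, 0.95),
                   offset=(0.0, 0.0, 0.02),
                   power=(1.0, 1.0, 1.1),
                   saturation=0.9)
```

Because the transform is this small and standardized, it rides along as metadata rather than baked-in pixels, so downstream departments can honor or bypass it as needed.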

In post and grading, the benefits of output independence continue, but now it’s critical that scene colors, which can so easily go out of gamut, are dealt with tastefully. Through the metadata support in the RedCode format, all the creative decisions taken on set follow through to dailies and post, but never get in the way of producing the correct image output, be it for VFX, editorial or grading.

Panavision’s Michael Cioni 
Panavision designs and manufactures high-precision camera systems, including both film and digital cameras, as well as lenses and accessories for the motion picture and television industries.

Can you talk about some camera trends you’ve been seeing?
With the evolution of digital capture, one of the most interesting things I’ve noticed in the market is the new trends emerging from the optics side of cinematography. At a glance, it can appear as if there is a desire for older or vintage lenses based on the increasing resolution of large format digital cameras. While resolution is certainly a factor, I’ve noticed the larger contributor to vintage glass is the quality of sensors, not the resolution itself. As sensors increase in resolution, they simultaneously show improvements in clarity, low-light capability, color science and signal-to-noise ratio.

The compounding effect of all these elements is improving images far beyond what was possible with analog film technology, which explains why the same lens behaves differently on film, S35 digital capture and large format digital capture. As these looks continue to become popular, Panavision is responding through our investments both in restoring classic lenses and in designing new lenses with classic characteristics and textures that are optimized for large format photography on super sensors.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look?
Creating images is not always about what component is better, but rather how they elevate images by working in concert. HDR images are a tool that increases creative control alongside high resolution and 16-bit color. These components work really well together because a compelling image can make use of more dynamic range, more color and more clarity. Its importance is only amplified by the amalgamation of high-fidelity characteristics working together to increase overall image flexibility.

Today, the studios are still settling into an HDR world because only a few groups, led by OTT, are able to distribute in HDR to wide audiences. On-set tools capable of HDR, 4K and 16-bit color are still in their infancy and currently cost-prohibitive. 4K/HDR on the set is going to become a standard practice by 2021. 4K wireless transmitters are the first step — they are going to start coming online in 2019. Smaller OLED displays capable of 750 nits+ will follow in 2020, creating an excellent way to monitor higher quality images right on set. In 2021, editorial will start to explore HDR and 4K during the offline process. By 2024, all productions will be HDR from set to editorial to post to mobile devices. Early adopters that work out the details today will find themselves ahead of the competition and having more control as these trends evolve. I recommend cinematographers embrace the fundamentals of HDR, because understanding the tools and trends will help prevent images from appearing artificial or overdone.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? What’s more important, resolution or dynamic range?
One of the reasons we partnered with Red is because the Monstro 8K VV sensor makes no sacrifice in dynamic range while still maintaining ultra high smoothness at 16 bits. The beauty of technology like this is that we can finally start to have the best from all worlds — dynamic range, resolution, bit depth, magnification, speed and workflow — without having to make quality sacrifices. When cinematographers have all these elements together, they can create images previously never seen before, and 8K is as much part of that story as any other element.

One important way to view 8K is not solely as a yardstick for sharpness. A sensor with 35 million pixels is necessary to increase the image size, similar to trends in professional photography. 8K large format creates a larger, more magnified image with a wider field of view and less distortion, like the difference in images captured on 70mm film. The biggest positive I’ve noticed is that the DXL2’s 8K large-format Red Monstro sensor is so good that it doesn’t impose itself on the images. Lower-quality sensors can add a “fingerprint” to the image, which can distort the original intention or texture of a particular lens.

With sensors like Monstro capable of such high precision, the lenses behave exactly as the lens maker intended. The same Panavision lenses on a lower-grade sensor, or even on 35mm film, exhibit characteristics that we weren’t able to see before. This breathes new life into lenses that didn’t perform this way until Monstro and large format.

Is the availability of so many camera formats a help or a hindrance?
You don’t have to look far to identify individuals who are easily fatigued by having too many choices. Some of these individuals cope with choices by finding ways to regulate them, and they feel fewer choices means more stability and perhaps more control (creative and economic). As an entrepreneur, I find the opposite to be true: I believe regulating our world, especially with regard to the arts and sciences, is a recipe for protecting the status quo. I fully admit there are situations in which people are fatigued by too many complex choices.

I find that the failure is not of the technology itself; rather, it’s the fault of the manufacturers who have not provided the options in easy-to-consume ways. Having options is exactly what creatives need in order to explore something new and improved. But it’s also up to manufacturers to deliver the message in ways everyone can understand. We’re still learning how to do that, and with each generation the process changes a bit. And while I am not always certain which are the best ways to help people understand all the options, I am certain that the pursuit of new art will motivate us to go out of our comfort zones and try something previously thought not possible.

Have you encountered any examples of productions that have shot streaming content (i.e. for Netflix/Amazon) and had to change production practices and workflows for this format/deliverable?
Netflix and Amazon are exceptional examples of calculated risk takers. While most headlines discuss their investment in the quantity of content, I find the most interesting investment they make is in relationships. Netflix and Amazon are heavily invested in standards groups, committees, outreach, panels and constant communication. The model of the past and present (incumbent studios) is content creators with technology divisions. The model of the future (Netflix, Amazon, Hulu, Apple, Google and YouTube) is technology companies with the ability to create content. And technology companies approach problems from a completely different angle: they don’t just embrace the technology, they help invent it. In this new technological age, those who lead and those who follow will likely be determined by the tools and techniques used to deliver. What I call “The Netflix Effect” is the impact Netflix has had on traditional groups, all of which have had to strategically pivot in response.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
The DXL2 has an advanced color workflow. In collaboration with LiveGrade by Pomfort, the DXL2 can capture looks wirelessly from DITs in the form of CDLs and LUTs, which are not only saved into the metadata of the camera, but also baked into in-camera proxy files in the form of Apple ProRes or Avid DNx. These files now contain visual references of the exact looks viewed on monitors and can be delivered directly to post houses, or even editors. This improves creative control because it eliminates the guesswork in the application of external color decisions and streamlines it back to the camera, where the core database is kept with all the other camera information. This metadata can be traced throughout the post pipeline, which also streamlines the process for all entities that come in contact with camera footage.
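The CDL half of this workflow is compact enough to sketch. Below is a minimal, illustrative Python implementation of the ASC CDL transfer function (per-channel slope, offset and power, plus a saturation blend against Rec. 709 luma). It is a sketch of the published CDL math only — not Pomfort’s or Panavision’s actual code — and the clamp-before-power step is one common interpretation of how negative values are handled.

```python
def apply_cdl(rgb, slope, offset, power, saturation=1.0):
    """Apply ASC CDL slope/offset/power per channel, then saturation.

    rgb, slope, offset, power are 3-tuples/lists of floats (R, G, B).
    A sketch of the published CDL math, not any vendor's implementation.
    """
    out = []
    for v, s, o, p in zip(rgb, slope, offset, power):
        v = v * s + o            # slope, then offset
        v = max(v, 0.0) ** p     # clamp negatives before the power step
        out.append(v)
    # Saturation: blend each channel toward Rec. 709 luma
    luma = 0.2126 * out[0] + 0.7152 * out[1] + 0.0722 * out[2]
    return [luma + saturation * (v - luma) for v in out]
```

With identity values (slope 1, offset 0, power 1, saturation 1) the function passes pixels through unchanged, which is a handy sanity check when wiring CDLs into a pipeline.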

How has production changed over the last two years?
Sheesh. A lot!

ARRI‘s Stephan Ukas-Bradley
The ARRI Group manufactures and distributes motion picture cameras, digital intermediate systems and lighting equipment. Their camera offerings include the Alexa LF, Alexa Mini, Alexa 65, Alexa SXT W and the Amira.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
Large format opens some new creative possibilities, using a shallow depth of field to guide the audience’s view and provide a wonderful bokeh. It also conveys a perspective truer to the human eye, resulting in a seemingly increased dimensional depth. The additional resolution combined with our specially designed large format Signature Primes result in beautiful and emotional images.

Old and vintage lenses can enhance a story. For instance, when Gabriel Beristain, ASC, used Bausch & Lomb Super Baltar on the Starz show Magic City, and Bradford Young used detuned DNA lenses in conjunction with Alexa 65 on Solo: A Star Wars Story, certain characteristics like flares, reflections, distortions and focus fall-off are very difficult to recreate in post organically, so vintage lenses provide an easy way to create a unique look for a specific story and a way for the director of photography to maintain creative control.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?
Currently, things are not done much differently on set when shooting HDR versus SDR. While it would be very helpful to monitor in both modes on-set, HDR reference monitors are still very expensive and very few productions have that luxury. One has to be aware of certain challenges when shooting for an HDR finish: high-contrast edges can result in a more pronounced stutter/strobing effect when panning the camera, and windows that are blown out in SDR might retain detail in the HDR pass — and now, all of a sudden, a ladder or a grip stand is visible.

In my opinion, HDR is more important than higher resolution. HDR is resolution-independent in regard to viewing devices like phone/tablets and gives the viewer a perceived increased sharpness, and it is more immersive than increased resolution. Also, let’s not forget that we are working in the motion picture industry and that we are either capturing moving objects or moving the camera, and with that introducing motion blur. Higher resolution only makes sense to me in combination with higher frame rates, and that in return will start a discussion about aesthetics, as it may look hyper-real compared to the traditional 24fps capture. Resolution is one aspect of the overall image quality, but in my opinion extended dynamic range, signal/noise performance, sensitivity, color separation and color reproduction are more important.

Can you talk about how shooting streaming content for OTTs, like Netflix/Amazon, has changed production practices and workflows, if at all?
Shooting streaming content has really not changed production practices or workflows. At ARRI, we offer very flexible and efficient workflows, and we are very transparent, documenting our ARRIRAW file format in SMPTE RDD 30 (format) and RDD 31 (processing) and working with many industry partners to provide native file support in their products.

Is the availability of all those camera resolutions a help or a hindrance?
I would look at all those different camera types and resolutions as different film stocks, and recommend that creatives shoot their own tests and select the camera system that best suits their project.

We offer the ARRI Look Library for the Amira, Alexa Mini and Alexa SXT (SUP 3.0), which is a collection of 87 looks, each available in three different intensities provided in Rec. 709 color space. Those looks can either be recorded or used only for monitoring. These looks travel with the picture, embedded in the metadata of the ARRIRAW file, QuickTime Atom or HD/SDI stream in the form of the actual LUT and ASC CDL. One can also create a look dynamically on set, feeding the look back to the camera and having the ASC CDL values embedded in the same way.

More commonly, one would record in either ARRIRAW or ProRes LogC, while applying a standard Rec. 709 look for monitoring. The “C” in LogC stands for Cineon, a logarithmic encoding with a film-like response very much like that of a scanned film image. Colorists and post pros are very familiar with film, so color grading LogC images is easy and quick.
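To make the film-like response concrete, here is the standard logarithmic encode/decode pair using ARRI’s published LogC (v3, EI 800) constants. The numbers are quoted from ARRI’s public white paper; treat this as a sketch for intuition — showing how scene-linear values compress into a log signal and round-trip back for grading — rather than a drop-in replacement for ARRI’s own tools.

```python
import math

# Published ARRI LogC (v3) parameters for EI 800, from ARRI's white paper.
CUT = 0.010591
A, B, C, D = 5.555556, 0.052272, 0.247190, 0.385537
E, F = 5.367655, 0.092809  # linear segment near black

def linear_to_logc(x):
    """Scene-linear reflectance -> LogC code value (0..1)."""
    if x > CUT:
        return C * math.log10(A * x + B) + D
    return E * x + F  # linear toe below the cut point

def logc_to_linear(t):
    """Inverse curve, used for grading/DI round trips."""
    if t > E * CUT + F:
        return (10 ** ((t - D) / C) - B) / A
    return (t - F) / E
```

A useful anchor: 18% gray (scene-linear 0.18) encodes to roughly 0.39 on this curve, which is why LogC images look flat and gray on a monitor until a Rec. 709 viewing LUT is applied.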

How has production changed over the last two years?
I don’t have the feeling that production has changed a lot in the past two years, but with the growing demand from OTTs and increased production volume, it is even more important to have a reliable and proven system with flexible workflow options.

Main Image: DP Kate Reid.

Behind the Camera: Television DPs

By Karen Moltenbrey

Directors of photography on television series have their work cut out for them. Most collaborate early on with the director on a signature “look.” Then they have to make sure that aesthetic is maintained with each episode and through each season, should they continue on the series past the pilot. Like film cinematographers, their job entails a wide range of responsibilities aside from the camera work. Once shooting is done, they are often found collaborating with the colorists to ensure that the chosen look is maintained throughout the post process.

Here we focus on two DPs working on two popular television series — one drama, one sitcom — both facing unique challenges inherent in their current projects as they detail their workflows and equipment choices.

Ben Kutchins: Ozark
Lighting is a vital aspect in the look of the Netflix family crime drama Ozark. Or, perhaps more accurately, the lack of lighting.

Ben Kutchins (left) on set with actor/director Jason Bateman.

“I’m going for a really naturalistic feel,” says DP Ben Kutchins. “My hope is that it never feels like there’s a light or any kind of artificial lighting on the actors or lighting the space. Rather, it’s something that feels more organic, like sunlight or a lamp that’s on in the room, but still offers a level of being stylized and really leans into the darkness… mining the shadows for the terror that goes along with Ozark.”

Ozark, which just kicked off its second season, focuses on financial planner Marty Byrde, who relocates his family from the Chicago suburbs to a summer resort area in the Missouri Ozarks. After a money laundering scheme goes awry, he must pay off a debt to a Mexican drug lord by moving millions of the cartel’s money from this seemingly quiet place, or die. But, trouble is waiting for them in the Ozarks, as Marty is not the only criminal operating there, and he soon finds himself in much deeper than he ever imagined.

“It’s a story about a family up against impossible odds, who constantly fear for their safety. There is always this feeling of imminent threat. We’re trying to evoke a heightened sense of terror and fear in the audience, similar to what the characters might be feeling,” explains Kutchins. “That’s why a look that creates a vibe of fear and danger is so important. We want it to feel like there is danger lurking around every corner — in the shadows, in the trees behind the characters, in the dark corners of the room.”

In summary, the look of the show is dark — literally and figuratively.

“It is pretty extreme by typical television standards,” Kutchins concedes. “We’ve embraced an aesthetic and are having fun pushing its boundaries, and we’re thrilled that it stands out from a pretty crowded market.”

According to Kutchins, there are numerous examples where the actor disappears into the shadows and then reappears moments later in a pool of light, falling in and out of shadow. For instance, a character may turn off a light and plunge the room into complete darkness, and you do not see that character again until they reappear, lit by moonlight coming through a window or silhouetted against it.

“We’re not spending a lot of time trying to fill in the shadows. In fact, we spend most of our time creating more shadows than exist naturally,” he points out.

Jason Bateman, who plays Marty, is also an executive producer and directed the first two and last two episodes of Season 1. Early on, he, along with Kutchins and Pepe Avila del Pino, who shot the pilot, hashed out the desired look for the show, leaning into a very cyan and dark color palette — and leaning in pretty strongly. “Most people think of [this area as] the South, where it’s warm and bright, sweaty and hot. We just wanted to lean into something more nuanced, like a storm was constantly brewing,” Kutchins explains. “Jason really pushed that aesthetic hard across every department.”

Alas, that was made even more difficult since the show was mainly shot outdoors in the Atlanta area, and a good deal of work went into reacting to Mother Nature and transforming the locations to reflect the show’s Ozark mountain setting. “I spent an immense amount of time and effort killing direct sunlight, using a lot of negative fill and huge overheads, and trying to get rid of that direct, harsh sun,” says Kutchins. “Also, there are so many windows inside the Byrde house that it’s essentially like shooting an exterior location; there’s not a lot of controlled light, so you again are reacting and adapting.”

Kutchins shoots the series on a Panasonic VariCam, which he typically underexposes by a stop or two, mining the darker part of the sensor, “the toe of the exposure curve.” And by doing so, he is able to bring out the dirtier, more naturalistic, grimy parts of the image, rather than something that looks clean and polished. “Something that has a little bit of texture to it, some grit and grain, something that’s evocative of a memory, rather than something that looks like an advertisement,” he says.

To further achieve the look, Kutchins uses an in-camera LUT that mimics old Fuji film stock. “Then we take that into post,” he says, giving kudos to his colorist, Company 3’s Tim Stipan, who he says has been invaluable in helping to develop the “vibe” of the show. “As we moved along through Season 1 and into Season 2, he’s been instrumental in enhancing the footage.”

A lot of Kutchins’ work occurs in post, as the raw images captured on set are so different from the finals. Insofar as the digital intermediate is concerned, significant time is spent darkening parts of the frame, brightening small sections of the frame and working to draw the viewer into the frame. “I want people to be leaning on the edge of their seat, kind of wanting to look inside of the screen and poke their head in for a look around,” Kutchins says. “So I do a lot of vignetting and darkening of the edges, and darkening specific things that I think are distracting.”

Nevertheless, there is a delicate balance he must maintain. “I talk about the darkness of Ozark, but I am trying to ride that fine line of how dark it can be but still be something that’s pleasant to watch. You know, where you’re not straining to see the actor’s face, where there’s just enough information there and the frame is just balanced enough so your eyes feel comfortable looking at it,” he explains. “I spend a lot of time creating a focal point in the frame for your eyes to settle on — highlighting certain areas and letting some areas go black, leaving room for mystery in every frame.”

When filming, Kutchins and his crew use Steadicams, cranes, dollies and handheld. He also uses Cooke Optics’ S4 lenses, which he tends to shoot wide open, “to let the flaws and character of the lenses shine through.”

Before selecting the Panasonic VariCam, Kutchins and his group tested other cameras. Netflix’s requirement for 4K immediately ruled out the ARRI Alexa, which is Kutchins’ preferred camera. “But the Panasonic ended up shining,” he adds.

In Ozark, the urban family is pitted against nature, and thus, the natural elements around them need to feel dangerous, Kutchins points out. “There’s a line in the first season about how people drown in the lake all the time. The audience should always feel that; when we are at the water’s edge, that someone could just slip in and disappear forever,” he says. “So, the natural elements play a huge role in the inspiration for the lighting and the feel of the show.”

Jason Blount: The Goldbergs
A polar opposite to Ozark in almost every way, The Goldbergs is a single-camera comedy sitcom set in the ’80s about a caring but grumpy dad, an overbearing mother and three teens — the oldest, a popular girl; the middle one, who fancies himself a gifted athlete and strives to be popular; and the youngest, a geek who is obsessed with filmmaking, as he chronicles his life and that of his family on film. The series is created and executive-produced by Adam F. Goldberg and is based on his own life and childhood, which he indeed captured on film while growing up.

The series is filmed mostly on stage, with the action taking place within the family home or at the kids’ schools. For the most part, The Goldbergs is an up-lit, broad comedy. The colors are rich, with a definite nod to the vibrant palette of the ’80s. “Our colorist, Scott Ostrowsky [from Level 3], has been grading the show from day one. He knows the look of the show so well that by the time I sit with him, there are very few changes that have to be made,” says DP Jason Blount.

The Goldbergs began airing in 2013 and is now entering its sixth season. And the series’ current cinematographer, Jason Blount, has been involved since the start, first serving as the A camera/Steadicam operator before assuming the role of DP for the Season 1 finale — for a total of 92 episodes now and counting.

As this was a Sony show for ABC, the plan was to shoot with a Sony PMW-F55 CineAlta 4K digital camera, but at the time, it did not record at a fast enough frame rate for some of the high-speed work the production wanted. So, they ended up using the ARRI Alexa for Season 1. Blount took over as DP full time from Season 2 onward, and the decision was made to switch to the F55 for Season 2, as the frame rate issue had been resolved.

“The look of the show had already been established, and I wanted to make sure that the transition between cameras was seamless,” says Blount. “Our show is all about faces and seeing the comedy. From the onset, I was very happy with the Sony F55. The way the camera renders skin tone, the lack of noise in the deep shadows and the overall user-friendly nature of the camera impressed me from the beginning.”

Blount points to one particular episode where the F55 really shined. “The main character was filming a black-and-white noir-style home movie. The F55 handled the contrast beautifully. The blacks were rich and the highlights held onto detail very well,” he says. “We had a lot of smoke, hard light directly into the lens, and really pushed the limits of the sensor. I couldn’t have been happier with the results.”

In fact, the camera has proved its mettle winter, spring, summer and fall. “We’ve used it in the dead of winter, at night in the rain and during day exterior [shots] at the height of summer when it’s been over 100 degrees. It’s never skipped a beat.”

Blount also commends Keslow Camera in Los Angeles, which services The Goldbergs’ cameras. In addition, the rental house has accessorized the F55 camera body with extra bracketry and integrated power ports for more ease of use.

Due to the fast pace at which the show is filmed — often covering 10-plus pages of script a day — Blount uses Angenieux Optimo zoom lenses. “The A camera has a full set of lightweight zooms covering 15mm to 120mm, and the B camera always has the [Optimo] 24-290,” he says. “The Optimo lenses and F55 are a great combination, making it easy to move fast and capture beautiful images.”

Blount points out that he also does all the Steadicam work on the show, and with the F55 being so lightweight, compact and versatile, it makes for a “very comfortable camera in Steadicam mode. It’s perfect to use in all shooting modes.”

The Goldbergs’ DP always shoots with two cameras, sometimes three depending on the scene or action. And, there is never an issue of the cameras not matching, according to Blount. “I’m not a big fan of the GoPro image in the narrative world, and I own a Sony a7S. It’s become my go-to camera for mounts or tight space work on the show, and works perfectly with the F55.”

And, there is something to say for consistency, too. “Having used the same camera and lens package for the past five seasons has made it easy to keep the look consistent for The Goldbergs,” says Blount. “At the beginning of this season, I looked at shooting with the new Sony Venice. It’s a fantastic-looking camera, and I love the options, like the variable ND filters, more color temperature options and the dual ISO, but the limit of 60fps at this stage was a deal-breaker for me; we do a fair amount of 72fps and 120fps.”

“If only the F55 had image stabilization to take out the camera shake when the camera operators are laughing so hard at the actors’ performances during some scenes. Then it would be the perfect camera!” he says with a laugh himself.


Karen Moltenbrey is a longtime writer and editor in the CG and post industries.


Q&A: Camera Operators

By Randi Altman

Camera operators might not always get the glory, but they certainly do get the job done. Working hand in hand with DPs and directors, these artists make sure the camera is in the right place for the right shot, and so much more. As one of the ops we spoke to says, the camera operator is the “protector of the frame.”

We reached out to three different camera operators, all of whom are members of the Society of Camera Operators (SOC), to find out more about their craft and how their job differs from some of the others on set.

Lisa Stacilauskas

Lisa Stacilauskas, SOC
What is the role of the camera operator? What is the camera operator accountable for on set?
The role of the camera operator varies quite a bit depending on the format. I work primarily in scripted television on “single camera” comedies. Don’t let the name “single camera” fool you. It’s meant to differentiate the shooting format from multicam, but these days most single camera shows shoot with two or three cameras. The show I work on, American Housewife, uses three cameras. I am the C camera operator.

In the most basic sense, the camera operator is responsible for the movement of the camera and the inclusion or exclusion of what is in frame. It takes a team of craftspeople to accomplish this. My immediate team includes a 1st and 2nd camera assistant and a dolly grip. Together we get the camera where it needs to be to get the shots and to tell the story as efficiently as possible.

In a larger sense, the camera operator is a storyteller. It is my responsibility to know the story we are trying to tell and assist the director in attaining their vision of that story. As C camera operator, I think about how the scene will come together in editing so I know which pieces of coverage to get.

Another big part of my job is keeping lighting equipment and abandoned water bottles out of my shot. The camera operator is the “protector of the frame”!

How do you typically work with the DP?
The DP is the head of the camera department. Each DP has nuances in the way they work with their operators. Some DPs tell you exactly where to put your camera and what focal length your lenses should be. Others give you an approximate position and an indication of the size (wide, medium, close-up) and let you work it out with the actors or stand-ins.

American Housewife

How does the role of the camera operator and a DP differ?
The DP is in charge of camera and lighting. Officially, I have no responsibility for lighting. However, it’s very important for a camera operator to think like a DP — to know and pay attention to the lighting. Additionally, especially when shooting digitally, once the blocking is determined, camera operators stay on set, working with all the other departments to prepare a shot while the DP is at the monitors evaluating the lighting and/or discussing setups with the director.

What is the relationship between the operator and the director?
The relationship between the operator and director can vary depending on the director and the DP. Some directors funnel all instructions through the DP and only come to you with minor requests once the shots have already been determined.

If the director comes to me directly without going through the DP, it is my responsibility to let the DP know of the requested shot, especially if she/he hasn’t lit for it! Sometimes you are a mediator between the two and hopefully steer them closer to being on the same page. It can be a tough spot to be in if the DP and director have different visions.

Can you talk about recent projects you’ve worked on?
I’m currently working on Season 3 of American Housewife. The C camera position was a day-playing position at the beginning of Season 1, but DP Andrew Rawson loves to work with three cameras, and really knows how to use all three efficiently. Once production saw how much time we saved them, they brought us on full time. Shooting quickly and efficiently is especially important on American Housewife because three of our five principal actors are minors, whose hours on set are restricted by law.

During a recent hiatus, I operated B camera on a commercial with the DP operating A camera. It seemed like the DP appreciated the “extra set of eyes.”

Prior to American Housewife, I worked on several comedies operating the B camera, including Crazy Ex- Girlfriend (Season 1), Teachers (Season 1) and Playing House (Season 2).

Stephen Campanelli, SOC
What is the role of the camera operator? What is the camera operator accountable for on set?
The role of the camera operator on the set is to physically move the camera around to tell the story. The camera operator is accountable for what the director and the director of photography interpret to be the story that needs to be told visually by the camera.

Stephen Campanelli (center) on the set of American Sniper.

As a camera operator, you listen to their input, and sometimes have your own input and opinion to make the shot better or to convey the story point from a different view. It is the best job on set in my opinion, as you get to physically move the camera to tell great stories and work with amazing actors who give you their heart and soul right in front of you!

How do you typically work with the DP?
I have been very fortunate in my career to have worked with some very collaborative DPs. After 24 years of working with Clint Eastwood, I have absorbed so much of his directing style and visual nature that, working closely with the DP, we have created the Eastwood style of filmmaking. When I am doing other films with other DPs, we always talk about conveying the story in the truest, most visual way without letting the camera get in the way of a good story. That is one of the most important things to remember: A camera operator is not to bring attention to the camera, but to bring attention to the story!

How does the role of the camera operator and a DP differ?
A DP usually is in charge of the lighting and the look of the entire motion picture. Some DPs also operate the camera, but that is a lot of work on both sides. A camera operator is very essential, as he or she can rehearse with the actors or stand-ins while the director of photography can concentrate solely on the lighting.

What is the relationship between the operator and the director?
As I mentioned earlier, my relationship with Clint Eastwood has been a very close one, as he works closely with the camera operator rather than the director of photography. We have an incredible bond where very few words are spoken, but we each know how to tell the story once we read the script. On some films, the director and the DP are the ones that work together closely to cohesively set the tone for the movie and to tell the story, and the camera operator interprets that and physically moves the camera with the collaboration of both the director and director of photography.

Can you talk about recent projects you’ve worked on?
I recently wrapped my 22nd movie with Clint Eastwood as his camera operator; it is called The Mule. We filmed in Atlanta, New Mexico and Colorado. It is a very good script, and Clint is back in front of the camera again, acting in it. It also stars Bradley Cooper, Laurence Fishburne and Michael Pena. [Editor’s note: He recently worked on A Star is Born, also with Bradley Cooper.]

Recently, I moved up to directing. In 2015, I directed a movie called Momentum, and this year I directed an award-winning film called Indian Horse that was a big hit in Canada and will soon be released in the United States.

Jamie Hitchcock

Jamie Hitchcock, SOC
What is the role of the camera operator? What is the camera operator accountable for on set?
The role of a camera operator is to compose an assigned shot and physically move the camera if necessary to perform that shot or series of shots as many times as needed to achieve the final take. The camera operator is responsible for maintaining the composition of the shot while also scanning the frame for anything that shouldn’t be there. The camera operator is responsible for communicating to all departments about elements that should or should not be in the frame.

How do you typically work with the DP?
The way a camera operator works with a director of photography varies depending on the type of project they are working on. On a feature film, episodic or commercial, the director of photography is very involved. The DP will set each shot and the operator then repeats it as many times as necessary. On a variety show, live show or soap opera, the DP is usually a lighting designer/director, and the director works with the camera operators to set the shots. On multi-camera sitcoms, the shots are usually set by the director… with the camera operator. When the production requires a complicated scene or location, the DP will become more actively involved with the selection of the shots.

How does the role of the camera operator and a DP differ?
The role of the DP and operator are quite different yet the goal is the same. The DP is involved with the pre-production process, lighting, running the set, managing the crew and the post process. The operator is involved on set working shot by shot. Ultimately, both the DP and operator are responsible for the final image the viewing audience will see.

What is the relationship between the operator and the director?
The relationship between the operator and the director, like that of the operator and DP, varies depending on the type of project. On a feature-type project, the director may be only using one camera. On a sports or variety program the director might be looking at 15 or more cameras. In all cases, the director is counting on the operators to perform their assigned shots each time. Ultimately, when a shot or take is complete, the director is the person who decides to move on or do it again, and they trust the operator to tell them if the shot was good or not.

CBS’s Mom

Can you talk about recent projects you’ve worked on?
I am currently working on The Big Bang Theory and Mom for CBS. Both are produced by Chuck Lorre Productions and Warner Bros. Television. Steven V. Silver, ASC, is the DP for both shows. Both shows use four cameras and are taped in front of a studio audience. We work in what I like to call the “Desilu-type” format because all four cameras are on a J.L. Fisher dolly with a 1st assistant working the lens and a dolly grip physically moving the camera. This format was perfected by Desi Arnaz for I Love Lucy and still works well today.

The working relationship with the DP and director on our shows falls somewhere in the middle. Mark Cendrowski and Jamie Widdoes direct almost all of our shows, and they work directly with the four operators to set the required shots. They have a lot of trust in us to know what elements are needed in the frame, and sometimes the only direction we receive is the type of shot they want. I work on a center camera, which is commonly referred to as a “master” camera; however, it’s not uncommon to have a master, a close-up and two- or three-shots all in the same scene. Each scene is shot beginning to end with all the coverage set, and a final edit is done in post. We do have someone cutting a live edit that feeds to the audience so they can follow along.

Our process is very fast, and our DP usually only sees the lighting when we start blocking shots with stand-ins. Steve spends a lot of time at the monitor and constantly switches between all four cameras — he’s looking at composition and lighting, and setting the final look with our video controller. Generally, when the actors are on set we roll the cameras. It’s a pretty high-pressure way to work for a product that will potentially be seen by millions of people around the world, but I love it and can’t imagine doing anything else.

Main Image Caption: Stephen Campanelli


The SOC, which celebrates 40 years in 2019, is an international organization that aims to bring together camera operators and crew. The Society also hosts an annual Lifetime Achievement Awards show, publishes the magazine Camera Operator and has a charitable commitment to The Vision Center at Children’s Hospital Los Angeles.


The ASC: Mentoring and nurturing diversity

Cynthia Pusheck, ASC, co-chairs the ASC Vision Committee along with John Simmons, ASC. Together they focus on encouraging and supporting the advancement of underrepresented cinematographers, their crews and other filmmakers. They hope their efforts inspire others in the industry to drive positive change by hiring talent that better reflects society.

In addition to her role on the ASC Vision Committee, Pusheck is a VP of the ASC board. She became a member in 2013. Her credits include Sacred Lies, Good Girls Revolt, Revenge and Brothers & Sisters. She is currently shooting Limetown for Facebook Watch.

To find out more about their work, we reached out to Pusheck.

Can you talk about what the ASC Vision Committee has done since its inception? What it hopes to accomplish?
The ASC Vision Committee was formed in January 2016 as a way for the ASC to actively support those who face unique hurdles as they build their cinematography careers. We’ve held three full-day diversity events, and some individual panel discussions.

We’ve also awarded a number of scholarships to the ASC Master Class and will continue awarding a handful each year. Our mentorship program is getting off the ground now with many ASC members offering to give time to young DPs from underrepresented groups. There’s a lot more that John Simmons (my co-chair) and our committee members want to accomplish, and with the support of the ASC staff, board members and president, we will continue to push things forward.

(L-R) Diversity Day panel: Rebecca Rhine, Dr. Stacy Smith, Alan Caso, Natasha Foster-Owens, Xiomara Comrie, Tema Staig, Sarah Caplan.

The word “progress” has always been part of the ASC mission statement. So, with the goal of progress in mind, we redesigned an ASC red lapel pin and handed it out at the ASC Awards earlier this year (#ASCVision). We wanted to use it to call attention to the work of our committee and to encourage our own community of cinematographers and camera people to do their part. If directors of photography and their department heads (camera, grip and set lighting) hire with inclusivity in mind, then we can change the face of the industry.

What do you think is contributing to more females becoming interested in camera crew careers? What are you seeing in terms of tangible developments?
Gender inequality in this industry has certainly gotten a lot of attention in the last few years, which is fantastic, but despite all that attention, the actual facts and figures don’t show as much change as you’d think.

The percentage of women or people of color shooting movies and TV shows hasn’t really changed much. There certainly is a lot more “content” getting produced for TV, which has been great for many of us, and it’s a very exciting time. But we still have a long way to go.

What’s very hopeful, though, is that more producers and studios are really pushing for inclusivity. That means hiring more women and people of color in positions of leadership, and encouraging their crews to bring more underrepresented crew members onto the production.

Currently we’re also seeing more young female DPs getting some really good shooting opportunities very early in their careers. That didn’t happen so much in the past, and I think that continues to motivate more young women to consider the camera department, or cinematography, as a viable career path.

We also have to remember that it’s not just about getting more women on set, it’s about having our sets look like society at large. The ultimate goal should be that everyone has a fair chance to succeed in this industry.

How can women looking to get into this part of the industry find mentors?
The union (Local 600) and now the ASC both have mentorship programs. The union’s program is great for those coming up the ranks looking for help or advice as they build their careers.

For example, an assistant can find another assistant, or an operator, to help them navigate the next phase of their career and give them advice. The ASC mentorship program is aimed more for young cinematographers or operators from underrepresented groups who may benefit from the support of an experienced DP.

Another way to find a mentor is by contacting someone whom you admire directly. Many women would be surprised to find that if they reach out and request a coffee or phone call, often that person will try and find time for them.

My advice would be to do your homework about the person you’re contacting and be specific in your questions and your goals. Asking broad questions like “How do I get a job?” or “Will you hire me?” won’t get you very far.

What do you think will create the most change? What are the hurdles that still must be overcome?
Bias and discrimination, whether conscious or unconscious, is still a problem on our sets. It may have lessened in the last 25 years, but we all continue to hear stories about crew members (at all levels) who behave badly, make inappropriate comments or just have trouble working for women or people of color. These are all unnecessary stresses for those trying to get hired and build their careers.


Behind the Camera: Feature Film DPs

By Karen Moltenbrey

The responsibilities of a director of photography (DP) span far more than cinematography. Perhaps they are best known for their work behind the camera capturing the action on set, but that is just one part of their multi-faceted job. Well before they step onto the set, they meet with the director, at times working hand-in-hand to determine the overall look of the project. They also make a host of technical selections, such as the type of camera and lenses they will use as well as the film stock if applicable – crucial decisions that will support the director’s vision and make it a reality.

Here we focus on two DPs for a pair of recent films with specialized demands and varying aesthetics, as they discuss their workflows on these projects as well as the technical choices they made concerning equipment and the challenges each project presented.

Hagen Bogdanski: Papillon
The 2018 film Papillon, directed by Michael Noer, is a remake of the 1973 classic. Set in the 1930s, it follows two inmates who must serve, and survive, time in a French Guiana penal colony. The safecracker nicknamed Papillon (Charlie Hunnam) is serving a life sentence and offers protection to wealthy inmate Louis Dega (Rami Malek) in exchange for financing Papillon’s escape.

“We wanted to modernize the script, the whole story. It is a great story but it feels aged. To bring it to a new, younger audience, it had to be modernized in a more radical way, even though it is a classic,” says Hagen Bogdanski, the film’s DP, whose credits include the film The Beaver and the TV series Berlin Station, among others. To that end, he notes, “we were not interested in mimicking the original.”

This was done in a number of ways. First, through the camera work, using a semi-documentary style. The director has a history of shooting documentaries and, therefore, the crew shot with two cameras at all times. “We also shot the rehearsals,” notes Bogdanski, who was brought onto the project and given nearly five weeks of prep before shooting began. Although this presented a lot of potential risk for Bogdanski, the film “came out great in the end. I think it’s one of the reasons the look feels so modern, so spontaneous.”

In the film, the main characters face off against the harsh environment of their prison island. But to film such a landscape required the cinematographer and crew to also contend with these trying conditions. They shot on location outdoors for the majority of the feature, using just one physical structure: the prison. Also helping to define the film’s aesthetic was the lighting, which, as is typical with Bogdanski’s films, is as natural as possible without large artificial sources.

Most of the movie was shot in Montenegro, near sun-drenched Greece and Albania. Bogdanski does not mince words: “The locations were difficult.”

Weather seemed to impact Bogdanski the most. “It was very remote, and if it’s raining, it’s really raining. If it’s getting dark, it’s dark, and if it’s foggy, there is fog. You have to deal with a lot of circumstances you cannot control, and that’s always a bit of a nightmare for any cinematographer,” he says. “But, what is good about it is that you get the real thing, and you get texture, layers, and sometimes it’s better when it rains than when the sun is shining. Most of the time we were lucky with the weather and circumstances. The reality of location shooting adds quite heavily to the look and to the whole texture of the movie.”

The location shooting also affected this DP’s choice of cameras. “The footprint [I used] was as small as possible because we basically visited abandoned locales. Therefore, I chose as small a kit — lenses, cameras and lights — as possible,” Bogdanski points out. “Because [the camera] was handheld, every pound counted.” In this regard, he used ARRI Alexa Mini cameras and one Alexa SXT, and only shot with Zeiss Ultra Prime lenses — “no big zooms, no big filters, nothing,” he adds.

The prison build was on a remote mountain. On the upside, Bogdanski could shoot 360 degrees there without requiring the addition of CGI later. On the downside, the crew had to get up the mountain. A road was constructed to transport the gear and for the set construction, but even so, the trek was not easy. “It took two hours or longer each day from our hotel. It was quite an adventure,” he says.

As for the lighting, Bogdanski tried to shoot when the light was good, taking advantage of the location’s natural light as much as possible — within his documentary style. When this was not enough, LEDs were used. “Again, small footprint, smaller lens, smaller electrical power, smaller generators….” The night scenes were especially challenging because the nights were very short, no longer than five to six hours. When artificial rain had to be used, shooting was “a little painful” due to the size of the set, requiring the use of more traditional lighting sources, such as large Tungsten light units.

According to Bogdanski, filming Papillon followed what he calls an “eclectic” workflow, akin to the European method of filming whereby rehearsal occurred in the morning and was quite long, as the director rehearsed with the actors. Then, scenes were shot in script order, on the first take without technical rehearsals. “From there, we tried to cover the scene in handheld mode with two cameras in a kind of mash-up. We did pick up the close-ups and all that, but always in a very spontaneous and quick way,” says Bogdanski.

Looking back, Bogdanski describes Papillon as a “modern-period film”: a period look, without looking “period.” “It sounds a bit Catch-22, which it is, in my opinion, but that’s what we aimed for, a film that plays basically in the ’40s and ’50s, and later in the ’60s,” he says.

During the time since the original film was made in 1973, the industry has witnessed quite a technical revolution in terms of film equipment, providing the director and DP on the remake with even more tools and techniques at their disposal to leave their own mark on this classic for a new generation.

Nancy Schreiber: Mapplethorpe
Award-winning cinematographer Nancy Schreiber, ASC, has a resume spanning episodic television (The Comeback), documentaries (Eva Hesse) and features (The Nines). Her latest film, Mapplethorpe, paints an unflinching portrait of controversial-yet-revered photographer Robert Mapplethorpe, who died at the age of 42 from AIDS-related complications in 1989. Mapplethorpe, whose daring work influenced popular culture, rose to fame in the 1970s with his black-and-white photography.

In the early stages of planning the film, Schreiber worked with director Ondi Timoner and production designer Jonah Markowitz while they were still in California prior to the shoot in New York, where Mapplethorpe (played by The Crown’s Matt Smith) lived and worked at the height of his popularity.

“We looked at a lot of reference materials — books and photographs — as Ondi and I exchanged look books. Then we homed in on the palette, the color of my lights, the set dressing and wardrobe, and we were off to the races,” says Schreiber. Shooting began in mid-July 2017.

Mapplethorpe is a period piece that spans three decades, all of which have a slightly different feel. “We kept the ’60s and into the ’70s quite warm in tone,” as this is the period when he first meets Patti Smith, his girlfriend at the time, and picks up a camera, explains Schreiber. “It becomes desaturated but still warm tonally when he and Patti visit his parents back home in Queens while the two are living at the Chelsea Hotel. The look progresses until it’s very much on the cool blue/gray side, almost black and white, in the later ’70s and ’80s.” During that time period, Mapplethorpe is successful, with an enormous studio, photographically exploring male body parts like no other person has ever done, while continuing to shoot portraits of the rich and famous.

Schreiber opted to use film, Super 16, rather than digital to capture the life of this famed photographer. “He shot in film, and we felt that format was true to his photography,” she notes. Despite Mapplethorpe’s penchant for mostly shooting in black and white, neither Timoner nor Schreiber considered using that format for the feature, mostly because the ’60s through ’80s in New York had very distinctive color palettes. They felt, however, that film in and of itself was very “textural and beautiful,” whereas you have to work a little harder with digital to make it look like film — even though new ways of adding grain to digital have become quite sophisticated. “Yet, the grain of Super 16 is so distinctive,” she says.

In addition, Kodak had just opened a lab in New York in the spring of 2017, facilitating their ability to shoot film by having it processed quickly nearby.

Schreiber used an ARRI Arriflex 416 camera for the project; when possible, she used two. She also had a set of Zeiss 35mm Super Speed lenses, along with two zoom lenses she used only occasionally for outdoor shots. “The Super Speeds were terrific. They’re vintage and were organic to the look of this period.”

She also used a light meter faithfully. Although Schreiber occasionally uses light meters when shooting digital, it was not optional for shooting film. “I had to use it for every shot, although after a couple of days, I was pretty good at guessing [by eyeing it],” Schreiber points out, “as I used to do when we only shot film.”

Soon after ARRI introduced the Arriflex 416 – which is small and lightweight – the industry started moving to digital, prompting ARRI to roll out the now-popular Alexa. “But the [Arriflex 416] camera really caught on for those still shooting Super 16, as they do for the series The Walking Dead,” Schreiber says, adding that she was able to get her pair from the TCS Technological Cinevideo Services rental house in New York.

“I had owned an Aaton, a French camera that was very popular in the 1980s and ’90s. But today, the 416 is very much in demand, resembling the shape of my Aaton, both of which are ergonomic, fitting nicely on your shoulder. There were numerous scenes in the car, and I could just jump in the car with this very small camera, much smaller than the digital cameras we use on movies; it was so flexible and easy to work with,” recalls Schreiber.

As for the lenses, “again, I chose the Super Speed primes not only because they were vintage, but because I needed the speed of those T1.3 lenses, since film requires more light.” She tested other lenses at TCS, but those were her favorites.

While Schreiber has used film on some commercials and music videos, it had been some time since she had used it for an entire movie. “I had forgotten how freeing it is, how you can really move. There are no cables to worry about. Although, we did transmit to a tiny video village,” she says. “We didn’t always have two cameras [due to cost], so I needed to move fast and get all the coverage the editor needed. We had 19 days, and we were limited in how long we could shoot each day; our budget was small and we couldn’t afford overtime.” At times, though, she was able to hire a Steadicam or B operator who really helped move them along, keeping the camera fluid and getting extra coverage. Timoner also shot a bit of Super 8 along the way.

There was just one disadvantage to using film: The stocks are slow. As Schreiber explains, she used a 500 ASA stock, so she needed very fast lenses and a fair amount of light to compensate. “That worked OK for me on Mapplethorpe because there was a different sense of lighting in the 1970s, and films seemed more ‘lit.’ For example, I might use backlight or hair light, which I never would do for [a film set in] present day,” she says. “I rated that stock at 400 to get rich blacks, which also slightly minimized the grain. The day-interior stock was 250, which I rated at 200. We are so used to shooting at 800 or 1280 ISO these days. It was an adjustment.”

Schreiber on set with “Mapplethorpe” director Ondi Timoner.

Shooting with film was also more efficient for Schreiber. “We had monitors for the video village, but we were standard def, old-school, which is not an exact representation. So, I could move quickly to get enough coverage, and I never looked at a monitor except when we had Steadicam. What you see is not what you get with an SD tap. I was trusted to create the imagery as I saw fit. I think many people today are used to seeing the digital image on the monitor as what the final film will look like and may be nervous about waiting for the processing and transfer, not trusting the mystery or mystique of how celluloid will look.”

To top things off, Schreiber was backed by an all-female A camera team. “I know how hard it is for women to get work,” she adds. “There are so many competent women working behind the camera these days, and I was happy to hire them. I remember how challenging it was when I was a gaffer or started to shoot.”

As for costs, digital camera equipment is more expensive than Super 16 film equipment, yet there were processing and transfer costs associated with getting the film into the edit suite. So, when all was said and done, film was indeed more expensive to use, but not by much.

“I am really proud that we were able to do the movie in 19 days with a very limited budget, in New York, covering many periods,” concludes Schreiber. “We had a great time, and I am happy I was able to hire so many women in my departments. Women are still really under-represented, and we must demonstrate that there is not a scarcity of talent, just a lack of exposure and opportunity.”

Mapplethorpe is expected in theaters this October.


Karen Moltenbrey is a longtime writer and editor in the CG and post industries.


Atomos Ninja V records 4K 10-bit from new Nikon mirrorless cameras  

The new Nikon Z6 and Z7 mirrorless cameras output a full-frame 10-bit 4K N-Log signal, which the new Atomos Ninja V 4K HDR monitor/recorder can record and display in HDR.

The Nikon Z6 and Z7 have sensors that output 4K images over HDMI, ready for conversion to HDR by Atomos. The Atomos Ninja V records the output to production-ready 10-bit Apple ProRes or Avid DNx formats.

Atomos supports Nikon Log, Apple ProRes recording and HDR monitoring from the Z series cameras. The tiny 5-inch Ninja V makes a nice go-to monitor for these full-frame mirrorless cameras, making the setup ideal for corporate, news, documentary and nature films, or B-roll for Hollywood productions.

The Z6 and Z7 offer Nikon’s N-Log gamma, a brand-new Log gamma designed by Nikon to get the most out of the cameras’ sensors and wide dynamic range. Atomos helped map N-Log to HDR on its devices, and its engineers have developed specific presets for it. Setup is automatic — simply plug the Ninja V into the camera. The Ninja V can show 10-plus stops of dynamic range onscreen to allow users to make accurate exposure and color decisions. The recorder can receive timecode and be triggered directly from the cameras.

The Ninja V costs $695, excluding SSD and batteries.

“It’s fantastic to push technology barriers with our friends at Nikon,” says Atomos CEO Jeromy Young. “Combining the new Nikon and our Ninja V HDR monitor/recorder gives filmmakers exactly what they have been asking for — a compact full-frame 4K 10-bit recording system at [this] price point.”


Review: OConnor camera assistant bag

By Brady Betzel

After years and years of gear acquisition, I often forget to secure proper bags and protection for my equipment. I’ve used everything from Pelican cases to the cheapest camera bags, and a truly high-quality bag will extend the life of your equipment.

In this review I am going to go over a super-heavy-duty assistant camera bag by OConnor, which is part of the Vitec Group. While the Vitec Group provides many different products — from LED lighting to robotic camera systems — OConnor is typically known for its professional fluid heads and tripods. This camera bag is made to fit not only their products, but also other gear, such as pan bars and ARRI plates. The OConnor AC bag is a no-nonsense camera and accessory bag with Velcro-secured, repositionable inserts that will accommodate most cameras and accessories you have.

As soon as I opened the box and touched the AC bag, I could tell it was high quality. The bag’s exterior is waterproof and easily wipeable. More importantly, there is an internal water- and dust-proof liner that allows the lid to be hinged open, keeping equipment close at hand, while the liner stays fully zipped. This internal waterproofing is resistant up to a 1.2m/4ft column of water. Once I got past the quality of the materials, my second inspection focused on the zippers. If I have a camera bag with bad zippers or snaps, it usually gets given away or tossed, but the AC bag has strong, smooth-gliding zippers.

On the lid and inside the front pockets are extremely tough, see-through mesh pockets for everything from batteries to memory cards. On the front is a business card/label holder. Around the outside are multiple pockets with fixing points for carabiner hooks. In addition, there are D-rings for the included leather strap if you want to carry the bag over your shoulder instead of using the handles. The bag comes with five dividers that Velcro to the inside, including two right-angle dividers. The dividers are made to securely tie down all OConnor heads and accessories. Finally, the AC bag comes with a separate pouch for quick access on set.

Summing Up
In the end, the OConnor AC bag is a well-made and roomy bag that will protect your camera gear and accessories from dust as well as water for $375. The inside measures 18×12×10.5 inches, the outside measures 22×14.5×10.5 inches, and the bag has been designed to fit inside a Pelican 1620 case. You can check out the OConnor AC bag on their website and find a dealer in your area.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.


Review: The Litra Torch for pro adventure lighting

By Brady Betzel

If you are on Instagram, you’ve definitely seen your fair share of “adventure” photography and video. Typically, it’s those GoPro-themed action-adventure shots of someone cliff diving off a million-mile-high waterfall. I definitely get jealous. Nonetheless, one thing I love about GoPro cameras is their size. They are small enough to fit in your pocket, and they will reliably produce a great image. Where those actioncams suffer is low-light performance. While it is getting better every day, you just can’t pull a reliably clean and noise-free image from a camera sensor that small. This is where actioncam lights come into play as a perfect companion, including the Litra Torch.

The Litra Torch is an 800-lumen, 1.5-by-1.5-inch magnetic light. I first started seeing the tiny-light trend on Instagram, where people were shooting slow-shutter photos at night while painting certain objects with a tiny bit of light. Check out Litra on Instagram (@litragear) to see some of the incredible images people are producing with this tiny light. I saw an action sports photographer showing off some incredible nighttime pictures using the GoPro Hero. He mentioned in the post that he was using the Litra Torch, so I immediately contacted Litra, and here I am reviewing the light. Litra sent me the Litra Paparazzi Bundle, which retails for $129.99. The bundle includes the Litra Torch, along with a filter kit and cold shoe mount.

The Litra Torch has four modes, all accessible by clicking the button on top of the light: 800-lumen brightness, 450 lumens, 100 lumens and flashing. The Torch has a consistent color temperature of 5700K; essentially, the light is a crisp white, right between blue and yellow. The rechargeable lithium-ion battery can be charged via the micro USB cable and will last 30 minutes or more, depending on the brightness selected. With a backup battery attached, you could be going for hours.

Over a month of intermittent use, I only charged it once. One night I had to check out something under the hood of my car and used the Litra Torch to see what I was doing. It is very bright, and when I placed the light onto the car I realized it was magnetic! Holy cow. Why doesn’t GoPro put magnets into their cameras for mounting? The Torch also has two ¼-20 camera screw mounts, so you can mount it just about anywhere. The construction of the Torch is amazing — it is drop-proof, waterproof and made of a highly resilient aluminum. You can feel the high quality of the components the first time you touch the Torch.

In addition to the Torch itself, the cold shoe mount and the diffuser, the Paparazzi Bundle comes with the photo filter kit. The kit includes five frames for mounting the color filters onto the Torch; three sets of Rosco tungsten 4600K filters; three sets of Rosco tungsten 3200K filters; one white diffuser filter; and one each of a red, yellow and green color filter. Essentially, they give you a cheap way to change white balance temperatures, plus some awesome color filters to play around with. I can really see the benefit of having at least two, if not three, Litra Torches in your bag with the filter sets; you can easily set up a properly lit product shoot or even a headshot session with nothing more than three tiny Torch lights.

Putting It To The Test
To test out the light in action, I asked my son to set up a Lego scene for me. One hour later I had some Lego models to help me out. I always love seeing people’s Lego scenes on Instagram, so I figured this would also be a good way to show off the light and the extra color filters sent in the Paparazzi Bundle. One thing I discovered is that I would love a slide-in filter holder built onto the light; it would definitely help me avoid wasting time popping filters into frames.

All in all, this light is awesome. The only problem is I wish I had three so I could do a full three-point lighting setup. However, with some natural light and one Litra Torch I had enough to pull off some cool lighting. I really liked the Torch as a colored spotlight; you can get that blue or red shade on different objects in a scene quickly.

Summing Up
In the end, the Litra Torch is an amazing product. In the future, I would really love to see multiple white balance temperatures built into the Torch without having to use photo filters. Another exciting, but probably expensive, prospect would be adding a Bluetooth connection and multiple colors. Better yet, make this light a full-color-spectrum, app-enabled light… oh wait, they just announced the Litra Pro on Kickstarter. You should definitely check that out as well, with its advanced options and color profile.

I am spoiled by all of those at-home lights, like the LIFX brand, that change to any color you want, so I’m greedy and want those in a sub-$100 light. But those are just wishes — the Litra Torch is a must-have for your toolkit, in my opinion. From mounting it on top of my Canon DSLR using the cold shoe mount, to sticking it in unique places with its magnetic back, to attaching it to a tripod via the screw mount — the Litra Torch is a mind-melting game changer for anyone who has had to lug around a 100-pound light kit, which makes the new Litra Pro Kickstarter so enticing.

Check out their website for more info on the Torch and new Litra Pro, as well as a bunch of accessories. This is a must-have for any shooter looking to carry a tiny but powerful light anywhere, especially for summer and the outdoors!


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.


Our Virtual Color Roundtable

By Randi Altman

The number of things you can do with color in today’s world is growing daily. It’s not just about creating a look anymore, it’s using color to tell or enhance a story. And because filmmakers recognize this power, they are getting colorists involved in the process earlier than ever before. And while the industry is excited about HDR and all it offers, this process also creates its own set of challenges and costs.

To find out what those in the trenches are thinking, we reached out to makers of color gear as well as hands-on colorists with the same questions, all in an effort to figure out today’s trends and challenges.

Company 3 Senior Colorist Stephen Nakamura
Company 3 is a global group of creative studios specializing in color and post services for features, TV and commercials. 

How has the finishing of color evolved most recently?
By far, the most significant change in the work that I do is the requirement to master for all the different exhibition mediums. There’s traditional theatrical projection at 14 footlamberts (fL) and HDR theatrical projection at 30fL. There’s IMAX. For home video, there’s UHD and different flavors of HDR. Our task with all of these is to master the movie so it feels and looks the way it’s supposed to feel and look on all the different formats.

There’s no one-size-fits-all approach. The colorist’s job is to work with the filmmakers and make those interpretations. At Company 3 we’re always creating custom LUTs. There are other techniques that help us get where we need to be to get the most out of all these different display types, but there’s no substitute for taking the time and interpreting every shot for the specific display format.
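Custom LUTs like these are, at their simplest, lookup tables sampled along the signal range. As a rough illustration of the mechanism only (not Company 3's actual pipeline — the table values below are made up), a 1D LUT with linear interpolation might look like this:

```python
def apply_lut_1d(x, lut):
    """Map a normalized signal value (0.0-1.0) through a 1D LUT,
    linearly interpolating between table entries."""
    x = min(max(x, 0.0), 1.0)      # clamp to the table's domain
    pos = x * (len(lut) - 1)       # fractional index into the table
    i = int(pos)
    if i >= len(lut) - 1:          # exact top of range
        return lut[-1]
    frac = pos - i
    return lut[i] + (lut[i + 1] - lut[i]) * frac

# A made-up 5-point curve that lifts the midtones slightly
lut = [0.0, 0.22, 0.55, 0.80, 1.0]
print(apply_lut_1d(0.5, lut))  # lands exactly on the middle entry: 0.55
```

Real grading LUTs are usually 3D (a 33x33x33 cube is common) and interpolate across all three color channels at once, but the lookup-and-interpolate principle is the same.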

How has laser projection and HDR impacted the work? And do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
Not too long ago, a cinematographer could expose an image specifically for one display format — a film print projected at 14fL. They knew exactly where they could place their highlights and shadows to get a precise look onscreen. Today, they’re thinking in terms of the HDR version, where if they don’t preserve detail in the blacks and whites it can really hurt the quality of the image in some of the newer display methods.

I work frequently with Dariusz Wolski (Sicario: Day of the Soldado, All the Money in the World). We’ve spoken about this a lot, and he’s said that when he started shooting features, he often liked to expose things right at the edge of underexposure because he knew exactly what the resulting print would be like. But now, he has to preserve the detail and fine-tune it with me in post because it has to work in so many different display formats.

There are also questions about how the filmmakers want to use the different ways of seeing the images. Sometimes they really like the qualities of the traditional theatrical standard and don’t want the HDR version to look very different; other times they want to make the most of the dynamic range. If we have more dynamic range, more light, to work with, it means that in essence we have a larger “canvas” to work on. But you need to take the time to individually treat every shot if you want to get the most out of that “canvas.”

Where do you see the industry moving in the near future?
The biggest change I expect to see is the development of even brighter, higher-contrast exhibition mediums. At NAB, Sony unveiled this wall of LED panels that are stitched together without seams and can display up to 1000 nits. It can be the size of a screen in a movie theater. If that took off, it could be a game changer. If theatrical exhibition gets better with brighter, higher-contrast screens, I think the public will enjoy it, provided that the images are mastered appropriately.

Sicario: Day of the Soldado

What is the biggest challenge you see for the color grading process now and beyond?
As there are more formats, there will be more versions of the master. From P3 to Rec.709 to HDR video in PQ — they all translate color information differently. It’s not just the brightness and contrast but the individual colors. If there’s a specific color blue the filmmakers want for Superman’s suit, or red for Spiderman, or whatever it is, there are multiple layers of challenges involved in maintaining those across different displays. Those are things you have to take a lot of care with when you get to the finishing stage.
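The formats really do translate brightness differently at the transfer-function level. As one concrete example, HDR's PQ curve (SMPTE ST 2084) maps absolute luminance in nits to signal values non-linearly; here is a sketch of the inverse EOTF (luminance to signal) using the constants published in the standard:

```python
def pq_encode(nits):
    """SMPTE ST 2084 inverse EOTF: absolute luminance (nits) -> PQ signal (0-1)."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    y = min(max(nits, 0.0), 10000.0) / 10000.0  # PQ is defined up to 10,000 nits
    yp = y ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

# SDR diffuse white (100 nits) lands around half the PQ signal range,
# leaving the upper code values for highlights
print(round(pq_encode(100), 3))    # ~0.508
print(round(pq_encode(10000), 3))  # 1.0
```

A Rec.709 gamma curve, by contrast, is relative to display peak rather than absolute, which is part of why a grade cannot simply be re-tagged from one format to the other.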

What’s the best piece of work you’ve seen that you didn’t work on?
I know it was 12 years ago now, but I’d still say 300, which was colored by Company 3 CEO Stefan Sonnenfeld. I think that was enormously significant. Everyone who has seen that movie is aware of the graphic-novel-looking imagery that Stefan achieved in color correction working with Zack Snyder and Larry Fong.

We could do a lot in a telecine bay for television, but a lot of people still thought of digital color correction for feature films as an extension of the timing process from the photochemical world. But the look in 300 would be impossible to achieve photo-chemically, and I think that opened a lot of people’s minds about the power of digital color correction.

Alt Systems Senior Product Specialist Steve MacMillian
Alt Systems is a systems provider, integrating compositing, DI, networking and storage solutions for the media and entertainment industry.

How has the finishing of color evolved most recently?
Traditionally, there has been a huge difference between the color finishing process for television production versus cinematic release. It used to be that a target format was just one thing, and finishing for TV was completely different from finishing for the cinema.

Colorists working on theatrical films will spend most of their efforts on grading for projection, and only afterward do a detailed trim pass to make a significantly different version for the small screen. Television colorists, who are usually under much tighter schedules, will often only be concerned with making Rec.709 look good on a standard broadcast monitor. Unless there is a great deal of care to preserve the color and dynamic range of the digital negative throughout the process, the Rec.709 grade will not be suitable for translation to other expanded formats like HDR.

Now, there is an ever-growing number of distribution formats with different color and brightness requirements. And with the expectation of delivering to all of these on ever-tighter production budgets, it has become important to use color management techniques so that the work is not duplicated. If done properly, this allows for one grade to service all of these requirements with the least amount of trimming needed.

How has laser projection and HDR impacted the work?
HDR display technology, in my opinion, has changed everything. The biggest impact on color finishing is the need for monitoring in both HDR and SDR in different color spaces. Also, there is a much larger set of complex delivery requirements, along with the need for greater technical expertise and capabilities. Much of this complexity can be reduced by having the tools that make the various HDR image transforms and complex delivery formats as automatic as possible.

Color management is more important than ever. Efficient and consistent workflows are needed for dealing with multiple sources with unique color sciences, integrating visual effects and color grading while preserving the latitude and wide color gamut of the image.

The color toolset should support remapping to multiple deliverables in a variety of color spaces and luminance levels, and include support for dynamic HDR metadata systems like Dolby and HDR10+. As HDR color finishing has evolved, so has the way it is delivered to studios. Most commonly it is delivered in an HDR IMF package. It is common that Rec.2020 HDR deliverables be color constrained to the P3 color volume and also that Light Level histograms and HDR QC reports be delivered.
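The light-level reports mentioned here typically mean MaxCLL (maximum content light level) and MaxFALL (maximum frame-average light level), the static metadata values carried with HDR10 deliverables. A simplified sketch of how they are computed, assuming pixel values are already linear light in nits:

```python
def content_light_levels(frames):
    """Compute (MaxCLL, MaxFALL) for a clip.

    frames: list of frames; each frame is a list of (r, g, b) pixel
    values expressed in linear light (nits).
    MaxCLL  = brightest per-pixel channel maximum anywhere in the clip.
    MaxFALL = highest per-frame average of those per-pixel maxima.
    """
    max_cll = 0.0
    max_fall = 0.0
    for frame in frames:
        pixel_maxima = [max(r, g, b) for (r, g, b) in frame]
        max_cll = max(max_cll, max(pixel_maxima))
        max_fall = max(max_fall, sum(pixel_maxima) / len(pixel_maxima))
    return max_cll, max_fall

# Two tiny "frames": one with a 1000-nit specular highlight
clip = [
    [(100, 100, 100), (1000, 900, 800)],   # bright frame
    [(10, 10, 10), (20, 20, 20)],          # dark frame
]
print(content_light_levels(clip))  # (1000, 550.0)
```

Production tools compute this per delivery spec from the mastered image, of course; the sketch only shows why the bright frame, not the whole clip, sets MaxFALL.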

Do you feel DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
Not as much as you would think. Two things are working against this. First, film and high-end digital cameras themselves have for some time been capturing latitude suitable for HDR production. Proper camera exposure is all that is needed to ensure that an image with a wide enough dynamic range is recorded. So from a capture standpoint, nothing needs to change.

The other is cost. There are currently only a small number of suitable HDR broadcast monitors, and most of these are extremely expensive and not designed well for the set. I’m sure HDR monitoring is being used on-set, but not as much as expected for productions destined for HDR release.

Also, it is difficult to truly judge HDR displays in a bright environment, and cinematographers may feel that monitoring in HDR is not needed full time. Traditionally with film production, cinematographers became accustomed to not being able to monitor accurately on-set, and they rely on their experience and other means of judging light and exposure. I think the main concern for cinematographers is the effect of lighting choices and apparent resolution, saturation and contrast when viewed in HDR.

Highlights in the background can potentially become distracting when displayed at 1000 nits versus being clamped at 100. Framing and lighting choices are informed by proper HDR monitoring. I believe we will see more HDR monitoring on-set as more suitable displays become available.

Colorfront’s Transkoder

Where do you see the industry moving in the near future?
Clearly HDR display technology is still evolving, and we will see major advances in HDR emissive displays for the cinema in the very near future. This will bring new challenges and require updated infrastructure for post as well as the cinema. It’s also likely that color finishing for the cinema will become more and more similar to the production of HDR for the home, with only relatively small differences in overall luminance and the ambient light of the environment.

Looking forward, standard dynamic range will eventually go away in the same way that standard definition video did. As we standardize on consumer HDR displays, and high-performance panels become cheaper to make, we may not need the complexity of HDR dynamic remapping systems. I expect that headset displays will continue to evolve and will become more important as time goes on.

What is the biggest challenge you see for the color grading process now and beyond?
We are experiencing a period of change that can be compared to the scope of change from SD to HD production, except it is happening much faster. Even if HDR in the home is slow to catch on, it is happening. And nobody wants their production to be dated as SDR-only. Eventually, it will be impossible to buy a TV that is not HDR-capable.

Aside from the changes in infrastructure, colorists used to working in SDR have some new skills to learn. I think it is a mistake to do separate grading versions for every major delivery format. Even though we have standards for HDR formats, they will continue to evolve, so post production must evolve too. The biggest challenge is meeting all of these different delivery requirements on budgets that are not growing as fast as the formats.

Northern Lights Flame Artist and Colorist Chris Hengeveld
NY- and SF-based Northern Lights, along with sister companies Mr. Wonderful for design, SuperExploder for composing and audio post, and Bodega for production, offers one-stop-shop services.

How has the finishing of color evolved most recently?
It’s interesting that you use the term “finishing of color.” In my clients’ world, finishing and color now go hand in hand. My commercial clients expect not only a great grade but seamless VFX work in finalizing their spots. Both of these now often take place with the same artist. Work has been pushed from straight finishing with greenscreen, product replacement and the like to a grade up to par with some of the higher-end coloring studios. Budget pressure is merging vastly separate disciplines into one final pass.

Clients now expect to have a rough look ready not only of the final VFX, but also of the color pass before they attend the session. I usually only do minor VFX tweaks when clients arrive. Sending QuickTimes back and forth between studio and client usually gets us to a place where our client, and their client, are satisfied with at least the direction if not the final composites.

Color, as a completely subjective experience, is best enjoyed with the colorist in the room. We do grade some jobs remotely, but my experience has clearly been that from both time and creativity standpoints, it’s best to be in the grading suite. Unfortunately, recently due to time constraints and budget issues, even higher-end projects are being evaluated on a computer/phone/tablet back at the office. This leads to more iterations and less “the whole is greater than the sum of the parts” mentality. Client interaction, especially at the grading level, is best enjoyed in the same room as the colorist. Often the final product is markedly better than what either could envision separately.

Where do you see the industry moving in the near future?
I see the industry continuing to coalesce around multi-divisional companies that are best suited to fulfill many clients’ needs at once. Most projects that come to us have diverse needs that center around one creative idea. We’re all just storytellers. We do our best to tell the client’s story with the best talent we offer, in a reasonable timeframe and at a reasonable cost.

The future will continue to evolve, putting more pressure on the editorial staff to deliver near perfect rough cuts that could become finals in the not-too-distant future.

Invisalign

The tools continue to level the playing field. More generalists will be trained in disciplines including video editing, audio mixing, graphic design, compositing and color grading. This is not to say that the future of singularly focused creatives is over. It’s just that those focused creatives are assuming more and more responsibilities. This is a continuation of the consolidation of roles that has been going on for several years now.

What is the biggest challenge you see for the color grading process now and beyond?
The biggest challenge going forward is both technical and budgetary. Many new formats have emerged, including the new ProRes RAW. New working color spaces have also emerged. Many of us work without on-staff color scientists and must find our way through the morass of HDR, ACES, Scene Linear and Rec.709. Working with materials that round trip in-house is vastly easier than dealing with multiple shops all with their own way of working. As we collaborate with outside shops, it behooves us to stay at the forefront of technology.

But truth be told, perhaps the biggest challenge is keeping the creative flow and putting the client’s needs first. Making sure the technical challenges don’t get in the way. Clients need to see a seamless view without technical hurdles.

What’s the best piece of work you’ve seen that you didn’t work on?
I am constantly amazed at the quality of work coming out of Netflix. Some of the series are impeccably graded. Early episodes of Bloodline, which was shot with the Sony F65, come to mind. The visuals were completely absorbing, both daytime and nighttime scenes.

Codex VP Business Development Brian Gaffney
Codex designs tools for color, dailies creation, archiving, review and networked attached storage. Their offerings include the new Codex ColorSynth with Keys and the MediaVault desktop NAS.

How has the finishing of color evolved most recently?
While it used to be a specialized suite in a post facility, color finishing has evolved tremendously over the last 10 years, with low-cost access to powerful systems like Resolve for everything from on-set commercial finishing to final DI color grading. These systems have evolved into more than just color. Now they are editorial, sound mixing and complete finishing platforms.

How has laser projection and HDR impacted the work?
Offering brighter images in the theater and the home, laser projection, OLED walls and HDR displays will certainly change the viewer’s experience, and they have helped create more work in post, offering up another pass for grading.

However, brighter images also show off image artifacts and can bring attention to highlights that may already be clipping. Shadow detail that was graded in SDR may now look milky in HDR. These new display mediums require that you spend more time optimizing the color correction for both display types. There is no magic one grade fits all.
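The "milky shadows" problem falls out of the arithmetic: if an SDR grade (0-100 nits) is naively stretched to an HDR peak, the shadows get lifted along with the highlights. A toy comparison, with a hypothetical knee-based rolloff standing in for the shot-by-shot trim pass a colorist would actually do:

```python
def naive_expand(nits_sdr, peak_sdr=100.0, peak_hdr=1000.0):
    """Flat gain from SDR peak to HDR peak: highlights pop, but a 2-nit
    shadow becomes 20 nits, which reads as milky on an HDR display."""
    return nits_sdr * peak_hdr / peak_sdr

def shadow_preserving_expand(nits_sdr, knee=50.0, peak_sdr=100.0, peak_hdr=1000.0):
    """Hypothetical trim: leave everything below the knee untouched and
    expand only the range above it toward the HDR peak."""
    if nits_sdr <= knee:
        return nits_sdr
    t = (nits_sdr - knee) / (peak_sdr - knee)  # 0-1 across the top range
    return knee + t * (peak_hdr - knee)

print(naive_expand(2))               # 20.0   -> lifted, milky shadow
print(shadow_preserving_expand(2))   # 2.0    -> shadow left alone
print(shadow_preserving_expand(100)) # 1000.0 -> highlight reaches HDR peak
```

Real SDR-to-HDR trims are far more nuanced than a single knee, but this illustrates why a flat gain is not an option and why a second grading pass exists at all.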

Do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
I think cinematographers are still figuring this out. Much like color correction between SDR and HDR, lighting for the two is different. A window that was purposely blown out in SDR, to hide a lighting rig outside, may show up in HDR, exposing the rig itself. Color correction might be able to correct for this, but unless a cinematographer can monitor in HDR on-set, these issues will come up in post. To do it right, lighting optimization between the two spaces is required, plus SDR and HDR monitoring on-set and near-set and in editorial.

Where do you see the industry moving in the near future?
It’s all about content. With the traditional studio infrastructure and broadcast television market changing to Internet Service Providers (ISPs), the demand for content, both original and HDR remastered libraries, is helping prop up post production and is driving storage- and cloud-based services.

Codex’s ColorSynth and MediaVault

In the long term, if the competition in this space continues and the demand for new content keeps expanding, traditional post facilities will become “secure data centers” and managed service providers. With cloud-based services, the talent no longer needs to be in the facility with the client. Shared projects with realtime interactivity from desktop and mobile devices will allow more collaboration among global-based productions.

What is the biggest challenge you see for the color grading process now and beyond?
Project management — sharing color set-ups among different workstations. Monitoring the color with properly calibrated displays, in both SDR and HDR and in support of multiple deliverables, is always a challenge. New display technologies, like laser projection and new Samsung and Sony videowalls, may not be cost effective for the creative community to access for final grading. Only certain facilities may wind up specializing in this type of grading experience, limiting global access for directors and cinematographers who want to fully visualize what their product will look like on these new display mediums. It’s a cost that may not get the needed ROI, so in the near future many facilities may not be able to support the full demand of deliverables properly.

Blackmagic Director of Sales/Operations Bob Caniglia
Blackmagic creates DaVinci Resolve, a solution that combines professional offline and online editing, color correction, audio post production and visual effects in one software tool.

How has the finishing of color evolved most recently?
The ability to work in 8K, and in whatever flavor of HDR you need, is here. But if you are talking evolution, it is about the ability to collaborate with everyone in the post house, and the ability to do high-quality color correction anywhere. Editors, colorists, sound engineers and VFX artists should not be kept apart or kept from being able to collaborate on the same project at the same time.

New collaborative workflows will speed up post production because you will no longer need to import, export or translate projects between different software applications.

How has laser projection and HDR impacted the work?
The most obvious impact has been on the need for colorists to be using software that can finish a project in whatever HDR format the client asks for. That is the same with laser projection. If you do not use software that is constantly updating to whatever new format is introduced, being able to bid on HDR projects will be hard.

HDR is all about more immersive colors. Any colorist should be ecstatic to be able to work with images that are brighter, sharper and with more data. This should allow them to be even more creative with telling a story with color.

Do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
As for cinematographers, HDR gives viewers a whole new level of image detail. But that hyper reality could draw the viewer away from the intended target in a shot. The beautiful details shining back on a coffee pot in a tracking shot may not be worth worrying about in SDR, but in HDR every shot will create more work for the colorist to make sure the viewer doesn’t get distracted by the little things. For DPs, it means they are going to have to be much more aware of lighting, framing and planning the impact of every possible item and shadow in an image.

Where do you see the industry moving in the near future?
Peace in our time amongst all of the different post silos, because those silos will finally be open. And there will be collaboration between all parts of the post workflow. Everyone — audio, VFX, editing and color correction — can work together on the same project seamlessly.

For example, in our Resolve tool, post pros can move among all of those disciplines. This is what we see happening with colorists and post houses right now, as each member of the post team can be much more creatively flexible because anyone can explore new toolsets. And with new collaboration tools, multiple assistants, editors, colorists, sound designers and VFX artists can all work on the same project at the same time.

Resolve 15

For a long-term view, you will always have true artists in each of the post areas. People who have mastered the craft and can separate themselves as being color correction artists. What is really going to change is that everyone up and down the post workflow at larger post houses will be able to be much more creative and efficient, while small boutique shops and freelancers can offer their clients a full set of post production services.

What is the biggest challenge you see for the color grading process now and beyond?
Speed and flexibility. Because with everyone now collaborating and the colorist being part of every part of the post process, you will be asked to do things immediately… and in any format. So if you are not able to work in real time or with whatever footage format is thrown at you, they will find someone who can.

This also comes with the challenge of changing the old notion that the colorist is one of the last people to touch a project. You will be asked to jump in early and often. Because every client would love to show early edits that are graded to get approvals faster.

FilmLight CEO Wolfgang Lempp
FilmLight designs, creates and manufactures color grading systems, image processing applications and workflow tools for the film and television industry.

How has the finishing of color evolved recently?
When we started FilmLight 18 years ago, color management was comparatively simple: Video looked like video, and digital film was meant to look like film. And that was also the starting point for the DCI — the digital cinema standard tried to make digital projection look exactly like conventional cinema. This understanding lasted for a good 10 years, and even ACES today is very much built around film as the primary reference. But now we have an explosion of new technologies, new display devices and new delivery formats.

There are new options in resolution, brightness, dynamic range, color gamut, frame rate and viewing environments. The idea of a single deliverable has gone: There are just too many ways of getting the content to the viewer. That is certainly affecting the finishing process — the content has to look good everywhere. But there is another trend visible, too, which here in the UK you can see best on TV. The color and finishing tools are getting more powerful and the process is getting more productive. More programs than ever before are getting a professional color treatment before they go out, and they look all the better for it.

Either way, there is more work for the colorist and finishing house, which is of course something we welcome.

How has laser projection and HDR impacted the work?
Laser projection and HDR for cinema and TV are examples of what I described above. We have the color science and the tools to move comfortably between these different technologies and environments, in that the color looks “right,” but that is not the whole story.

The director and DP will choose to use a format that will best suit their story, and will shoot for their target environment. In SDR, you might have a bright window in an interior scene, for example, which will shape the frame but not get in the way of the story. But in HDR, that same window will be too bright, obliterate the interior scene and distract from the story. So you would perhaps frame it differently, or light up the interior to restore some balance. In other words, you have to make a choice.

HDR shouldn’t be an afterthought, it shouldn’t be a decision made after the shoot is finished. The DP wants to keep us on the edge of our seats — but you can’t be on the edge in HDR and SDR at the same time. There is a lot that can be done in post, but we are still a long way from recreating the multispectral, three-dimensional real world from the output of a camera.

HDR, of course, looks fantastic, but the industry is still learning how to shoot for best effect, as well as how to serve all the distribution formats. It might well become the primary mastering format soon, but SDR will never go away.

Where do you see the industry moving in the future?
For me, it is clear that as we have pushed resolution, frame rate, brightness and color gamut, it has affected the way we tell stories. Less is left to the imagination. Traditional “film style” gave a certain pace to the story, because there was the expectation that the audience was having to interpret, to think through to fill in the black screen in between.

Now technology has made things more explicit and more immersive. We now see true HDR cinema technology emerging with a brightness of 600 nits and more. Technology will continue to surge forward, because that is how manufacturers sell more televisions or projectors — or even phones. And until there is a realistic simulation of a full virtual reality environment, I don’t see that process coming to a halt. We have to be able to master for all these new technologies, but still ensure compatibility with existing standards.

What is the biggest challenge for color grading now and in the future?
Color grading technology is very much unfinished business. There is so much that can be done to make it more productive, to make the content look better and to keep us entertained.

Blackboard

As much as we might welcome all the extra work for our customers, generating an endless stream of versions for each program is not what color grading should be about. So it will be interesting to see how this problem will be solved. Because one way or another, it will have to be. But while this is a big challenge, it hopefully isn’t what we put all our effort into over the coming years.

The real challenge is to understand what makes us appreciate certain images over others. How composition and texture, how context, noise and temporal dynamics — not just color itself — affect our perception.

It is interesting that film as a capture medium is gaining popularity again, especially large-format capture. It is also interesting that the “film look” is still precious when it comes to color grading. It puts all the new technology into perspective. Filmmaking is storytelling. Not just a window to the world outside, replaced by a bigger and clearer window with new technology, but a window to a different world. And the colorist can shape that world to a degree that is limited only by her imagination.

Olympusat Entertainment Senior DI Colorist Jim Wicks
A colorist since 2007, Jim has been a senior DI colorist at Olympusat Entertainment since 2011. He has color restored hundreds of classic films and is very active in the color community.

How has the finishing of color evolved most recently?
The phrase I’m keying in on in your question is “most recently.” I believe the role of a colorist has been changing exponentially for the last several years, maybe longer. I would say that we are becoming, if we haven’t already, more like finishing artists. Color is now just one part of what we do. Because technologies are changing more rapidly than at any time I’ve witnessed, we now have a lot to understand and comprehend in addition to just color. There is ACES, HDR, changing color spaces, integrating VFX workflows into our timelines, laser projection and so on. The list isn’t endless, but it’s growing.

How has laser projection and HDR impacted the work?
For the time being, they do not impact my work. I am currently required to deliver in Rec.709. However, within that confine I am grading a wider range of media than ever before, such as 2K and 4K uncompressed DPX; Phantom Digital Video Files; Red Helium 8K in the IPP2 workspace; and much more. Laser projection and HDR are things I continue to study by attending symposiums, or wherever I can find that information. I believe laser projection and HDR are important to know now. When the opportunity to work with them is available to me, I plan to be ready.

Do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
Of course! At the very heart of every production, the cinematographer is the creator and author of the image. It is her creative vision. The colorist is the protector of that image. The cinematographer entrusts us with her vision. In this respect, the colorist needs to be in sync with the cinematographer as never before. As cinematographers move because of technology, so we move. It’s all about the deliverable and how it will be displayed. Changing technology is no excuse for the colorist and the cinematographer not to be on the same page.

Where do you see the industry moving in the near future and the long-range future?
In the near future: HDR, laser projection, 4K and larger and larger formats.

In the long-range future: I believe we only need to look to the past to see the changes that are inevitably ahead of us.

Technological changes forced film labs, telecine and color timers to change and evolve. In the nearly two decades since O Brother, Where Art Thou? we no longer color grade movies the way we did back when the Coen classic was released in 2000. I believe it is inevitable: Change begets change. Nothing stays the same.

In keeping with the types of changes that came before, it is only a matter of time before today’s colorist is forced to change and evolve just as those before us were forced to do so. In this respect I believe AI technology is a game-changer. After all, we are moving towards driverless cars. So, if AI advances the way we have been told, will we need a human colorist in the future?

What is the biggest challenge you see for the color grading process now and beyond?
Not to sound like a “get off my lawn rant,” but education is the biggest challenge, and it’s a two-fold problem. Firstly, at many fine film schools in the US color grading is not taught as a degree-granting course, or at all.

Secondly, the glut of for-profit websites that teach color grading courses have no standardized curriculum, which wouldn’t be a problem, but at present there is no way to measure how much anyone actually knows. I have personally encountered individuals who claim to be colorists and yet do not know how to color grade. As a manager I have interviewed them — their resumes look strong, but their skills are not there. They can’t do the work.

What’s the best piece of work you’ve seen that you didn’t work on?
Just about anything shot by Roger Deakins. I am a huge fan of his work. Mitch Paulson and his team at Efilm did great work on protecting Roger’s vision for Blade Runner 2049.

Colorist David Rivero
This Madrid-born colorist is now based in China. He color grades and supervises the finishing of feature films and commercials, normally all versions, and often the trailers associated with them.

How has the finishing of color evolved most recently?
The line between strictly color grading and finishing is getting blurrier by the year. Although it is true there is still a clearer separation in the commercial world, on the film side the colorist has become the “de facto” finishing or supervising finishing artist. I think it is another sign of the bigger role color grading is starting to play in post.

In the last two to three years I’ve noticed that fewer clients are looking at it as an afterthought, or as simply “color matching.” I’ve seen how the very same people went from a six- to seven-day DI schedule five years ago to a 20-day schedule now. The idea that spending a relatively small amount of extra time and budget on the final step can get you a far superior result is finally sinking in.

The tools and technology are finally moving into a “modern age” of grading:
– HDR is a game changer on the image-side of things, providing a noticeable difference for the audience and a different approach on our side on how to deal with all that information.

– The eventual acceptance by all color systems of what were traditionally compositing or VFX tools is also a turning point, although controversial. Many think that colorists should focus on grading. However, I think that rather than colorists becoming compositors, it is the color grading concept and mission that is (still) evolving.

How has laser projection and HDR impacted the work? And do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
Well, on my side of the world (China), the laser and HDR technologies are just starting to get to the public. Cinematographers are not really changing how they work yet, as it is a very small fraction of the whole exhibition system.

As for post, it requires a more careful way of handling the image, as it needs higher quality plates, compositions, CG, VFX, a more careful grade, and you can’t get away with as many tricks as you did when it was just SDR. The bright side is the marvelous images, and how different they can be from each other. I believe HDR is totally compatible with every style you could do in SDR, while opening the doors to new ones. There are also different approaches on shooting and lighting for cinematographers and CG artists.

Goldbuster

The biggest challenge it has created has been on the exhibition side in China. Although Dolby cinemas (Vision+Atmos) are controlled and require a specific pass and DCP, there are other laser projection theaters that show the same DCP being delivered to common (xenon lamp) theaters. This creates a frustrating environment. For example, during the 3D grading, you not only need to consider the very dark theaters with 3FL-3.5FL, but also the new laser rooms that are racking up their lamps to 7FL-8FL to show off why they charge higher ticket prices.

Where do you see the industry moving in the near future and the long-range future?
I hope to see the HDR technologies settling and becoming the new standard within the next five to six years, and using this as the reference master from which all other deliveries are created. I also expect all these relatively new practices and workflows (involving ACES, EXRs with the VFX/CG passes, non-LUT deliveries) to become more standardized and controlled.

In the long term, I could imagine two main changes happening, closely related to each other:
– The concept of grading and colorist, especially in films or long formats, evolving in importance and relationship within the production. I believe the separation or independence between photography and grading will get wider (and necessary) as tools evolve and the process is more standardized. We might get into something akin to how sound editors and sound mixers relate and work together on the sound.

– The addition of (serious) compositing in essentially all the main color systems is the first step towards the possibilities of future grading. A feature like the recent FaceRefinement in Resolve is one of the things I dreamed about five or six years ago.

What is the biggest challenge you see for the color grading process now and beyond?
Nowadays one of the biggest challenges is possibly the multi-mastering environment, with several versions on different color spaces, displays and aspect ratios. It is becoming easier, but it is still more painful than it should be.

Shrinking margins also hurt the whole industry. We all work thanks to those margins, and cutting budgets while expecting the same results is not something that is going to happen.

What’s the best piece of work you’ve seen that you didn’t work on?
The Revenant, Mad Max, Fury and 300.

Carbon Colorist Aubrey Woodiwiss
Full-service creative studio Carbon has offices in New York, Chicago and Los Angeles.

How has the finishing of color evolved most recently?
It is always evolving, and the tools are becoming ever more powerful, and camera formats are becoming larger with more range and information in them. Probably the most significant evolution I see is a greater understanding of color science and color space workflows.

How has laser projection and HDR impacted the work? And do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
These elements impact how footage is viewed and dealt with in post. As far as I can see, it isn’t affecting how things are shot.

Where do you see the industry moving in the near future? What about in the long-range future?
I see formats becoming larger, viewing spaces and color gamuts becoming wider, and more streaming- and laptop-based technologies and workflows.

What is the biggest challenge you see for the color grading process now and beyond?
The constant challenge is integrating the space you traditionally color grade in to how things are viewed outside of this space.

What’s the best piece of work you’ve seen that you didn’t work on?
Knight of Cups, directed by Terrence Malick with cinematography by Emmanuel Lubezki.

Ntropic Colorist Nick Sanders
Ntropic creates and produces work for commercials, music videos, and feature films as well as experiential and interactive VR and AR media. They have locations in San Francisco, Los Angeles and New York City.

How has the finishing of color evolved most recently?
SDR grading in Rec.709 and 2.4 Gamma is still here, still looks great, and will be prominent for a long time. However, I think we’re becoming more aware of how exciting grading in HDR is, and how many creative doors it opens. I’ve noticed a feeling of disappointment when switching from an HDR to an SDR version of a project, and wondered for a second if I’m accidentally viewing the ungraded raw footage, or if my final SDR grade is actually as flat as it appears to my eyes. There is a dramatic difference between the two formats.

HDR is incredible because you can make the highlights blisteringly hot, saturate a color to nuclear levels or keep things mundane and save those heavier-handed tools in your pocket for choice moments in the edit where you might want some extra visceral impact.

How has laser projection and HDR impacted the work? And do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
In one sense, cinematographers don’t need to do anything differently. Colorists are able to create high-quality SDR and HDR interpretations of the exact same source footage, so long as it was captured in a high-bit-depth raw format and exposed well. We’re even seeing modern HDR reimaginings of classic films. Movies as varied in subject matter as Saving Private Ryan and the original Blade Runner are coming back to life because the latitude of classic film stocks allows it. However, HDR has the power to greatly exaggerate details that may have otherwise been subtle or invisible in SDR formats, so some extra care should be taken in projects destined for HDR.

Extra contrast and shadow detail mean that noise is far more apparent in HDR projects, so ISO and exposure should be adjusted on-set accordingly. Also, the increased highlight range has some interesting consequences in HDR. For example, large blown-out highlights, such as overexposed skies, can look particularly bad. HDR can also retain more detail and color in the upper ranges in a way that may not be desirable. An unremarkable, desaturated background in SDR can become a bright, busy and colorful background in HDR. It might prove distracting to the point that the DP may want to increase his or her key lighting on the foreground subjects to refocus our attention on them.

Panasonic “PvP”

Where do you see the industry moving in the near future? What about the long-range future?
I foresee more widespread adoption of HDR — in a way that I don’t with 3D and VR — because there’s no headset device required to feel and enjoy it. Having some HDR nature footage running on a loop is a great way to sell a TV in Best Buy. Where the benefits of another recent innovation, 4K, are really only detectable on larger screens and begin to deteriorate with the slightest bit of compression in the image pipeline, HDR’s magic is apparent from the first glance.

I think we’ll first start to see HDR and SDR orders on everything, then a gradual phasing out of the SDR deliverables as the technology becomes more ubiquitous, just like we saw with the standard definition transition to HD.

For the long-range, I wouldn’t be surprised to see a phasing out of projectors as LED walls become more common for theater exhibitions due to their deeper black levels. This would effectively blur the line between technologies available for theater and home for good.

What is the biggest challenge you see for the color grading process now and beyond?
The lack of a clear standard makes workflow decisions a little tricky at the moment. One glaring issue is that consumer HDR displays don’t replicate the maximum brightness of professional monitors, so there is a question of mastering one’s work for the present, or for the near future when that higher capability will be more widely available. And where does this evolution stop? 4,000 nits? 10,000 nits?

Maybe a more pertinent creative challenge in the crossover period is which version to grade first, SDR or HDR, and how to produce the other version. There are a couple of ways to go about it, from using LUTs to initiate and largely automate the conversion to starting over from scratch and regrading the source footage in the new format.
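One common starting point for that kind of automated conversion is a global tone-mapping curve that rolls HDR luminance off into the 100-nit SDR range before a colorist regrades shot by shot. Here is a minimal sketch in Python; the extended-Reinhard curve and the 1,000-nit peak are illustrative assumptions, not a description of any particular facility’s LUT:

```python
import numpy as np

def hdr_to_sdr_nits(nits, hdr_peak=1000.0, sdr_peak=100.0):
    """Roll HDR luminance (in nits) off into the SDR range using an
    extended-Reinhard curve: shadows pass through nearly linearly,
    highlights compress smoothly, and hdr_peak maps exactly to sdr_peak."""
    L = np.asarray(nits, dtype=float) / sdr_peak   # 1.0 == SDR reference white
    Lw = hdr_peak / sdr_peak                       # HDR peak on the same scale
    Lout = L * (1.0 + L / Lw**2) / (1.0 + L)       # extended Reinhard roll-off
    return Lout * sdr_peak

# A 1,000-nit HDR highlight lands at the 100-nit SDR ceiling,
# while near-black values are almost untouched.
mapped = hdr_to_sdr_nits([1.0, 100.0, 500.0, 1000.0])
```

In practice a curve like this (or a hand-tuned variant) would be baked into a 3D LUT and applied as a first pass, which is exactly why many colorists treat the LUT as a starting point rather than a finished grade.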

What’s the best piece of work you’ve seen that you didn’t work on?
Chef’s Table on Netflix was one of the first things I saw in HDR; I still think it looks great!

Main Image: Courtesy of Jim Wicks.

Netflix’s Lost in Space: mastering for Dolby Vision HDR, Rec.709

There is a world of difference between Netflix’s ambitious science-fiction series Lost in Space (recently renewed for another 10 episodes) and the beloved but rather low-tech, tongue-in-cheek 1960s show most fondly remembered for the repartee between persnickety Dr. Smith and the rather tinny-looking Robot. This series, starring Molly Parker, Toby Stephens and Parker Posey (in a very different take on Dr. Smith), is a very modern, VFX-intensive adventure show with more deeply wrought characters and elaborate action sequences.

Siggy Ferstl

Colorist Siggy Ferstl of Company 3 devoted a significant amount of his time and creative energy to the 10-episode season over the five and a half months it was in the facility. While Netflix’s approach of dropping all 10 episodes at once, rather than the traditional series schedule of an episode a week, fuels excitement and binge-watching among viewers, it also requires a different kind of workflow, with cross-boarded shoots spanning multiple episodes and different parts of episodes coming out of editorial for color grading throughout the story arc. “We started on episode one,” Ferstl explains, “but then we’d get three and portions of six and back to four, and so on.”

Additionally, the series was mastered both for Dolby Vision HDR and Rec.709, which added facets to the grading process beyond those of shows delivered exclusively for Rec.709.

Ferstl’s grading theater also served as a hub where the filmmakers, including co-producer Scott Schofield, executive producer Zack Estrin and VFX supervisor Jabbar Raisani could see iterations of the many effects sequences as they came in from vendors (Cinesite, Important Looking Pirates and Image Engine, among others).

Ferstl himself made use of some new tools within Resolve to create a number of effects that might once have been sent out of house or completed during the online conform. “The process was layered and very collaborative,” says Ferstl. “That is always a positive thing when it happens but it was particularly important because of this series’ complexity.”

The Look
Shot by Sam McCurdy, the show’s aesthetic was designed, “to have a richness and realness to the look,” Ferstl explains. “It’s a family show but it doesn’t have that vibrant and saturated style you might associate with that. It has a more sophisticated kind of look.”

One significant alteration to the look involves changes to the environment of the planet onto which the characters crash land. The filmmakers wanted the exteriors to look less Earthlike with foliage a bit reddish, less verdant than the actual locations. The visual effects companies handled some of the more pronounced changes, especially as the look becomes more extreme in later episodes, but for a significant amount of this work, Ferstl was able to affect the look in his grading sessions — something that until recently would likely not have been achievable.

Ferstl, who has always sought out and embraced new technology to help him do his job, made use of some features that were then brand new to Resolve 14. In the case of the planet’s foliage, he made use of the Color Compressor tool within the OpenFX tab on the color corrector. “This allowed me to take a range of colors and collapse that into a single vector of color,” he explains. “This lets you take your selected range of colors, say yellows and greens in this case, and compress them in terms of hue, saturation and luminance.” Sometimes touted as a tool to give colorists more ability to even out flesh tones, Ferstl applied the tool to the foliage and compressed the many shades of green into a narrower range prior to shifting the resulting colors to the more orange look.

“With foliage you have light greens and darker greens and many different ranges within the color green,” Ferstl explains. “If we’d just isolated those ranges and turned them orange individually, it wouldn’t give us the same feel. But by limiting the range and latitude of those greens in the Color Compressor and then changing the hue we were able to get much more desirable results.” Of course, Ferstl also used multiple keys and windows to isolate the foliage that needed to change from the elements of the scenes that didn’t.
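The idea behind that kind of hue compression can be sketched in a few lines. This is not Resolve’s actual implementation, just an illustration of the concept Ferstl describes: hues inside a chosen range are pulled toward a single center before the whole group is shifted:

```python
import numpy as np

def compress_hues(hues, center, width, strength):
    """Pull hues (degrees, 0-360) lying within +/- width of `center`
    toward `center`. strength=1.0 collapses the range to one hue;
    strength=0.0 leaves it untouched."""
    h = np.asarray(hues, dtype=float)
    d = (h - center + 180.0) % 360.0 - 180.0     # signed angular distance
    mask = np.abs(d) <= width
    out = h.copy()
    out[mask] = (center + d[mask] * (1.0 - strength)) % 360.0
    return out

# Collapse a spread of greens (100-140 degrees) toward a single green,
# then shift the compressed group toward orange (~30 degrees).
greens = np.array([100.0, 115.0, 130.0, 140.0])
compressed = compress_hues(greens, center=120.0, width=40.0, strength=1.0)
orange = (compressed - 90.0) % 360.0
```

Compressing first and shifting second is the point of the technique: shifting each green individually would preserve their spread and, as Ferstl notes, would not give the same unified feel.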

He also made use of the Camera Shake function, which was particularly useful in a scene in the second episode in which an extremely heavy storm of sharp hail-like objects hits the planet, endangering many characters. The storm itself was created at the VFX houses, but the additional effect of camera shake on top of that was introduced and fine-tuned in the grade. “I suggested that we could add the vibration, and it worked very well,” he recalls. By doing the work during color grading sessions, Ferstl and the filmmakers in the session could see that effect as it was being created, in context and on the big screen, and could fine-tune the “camera movement” right then and there.

Fortunately, the colorist notes, the production afforded the time to go back and revise color decisions as more episodes came into Company 3. “The environment of the planet changes throughout. But we weren’t coloring episodes one after the other. It was really like working on a 10-hour feature.

“If we start at episode one and jump to episode six,” Ferstl notes, “exactly how much should the environment have changed in-between? So it was a process of estimating where the look should land but knowing we could go back and refine those decisions if it proved necessary once we had the surrounding episodes for context.”

Dolby Vision Workflow
As most people reading this know, mastering in high dynamic range (Dolby Vision in this case) opens up the possibility of working within a significantly expanded contrast range and a wider color gamut than the Rec.709 standard used for traditional HD. Lost in Space was mastered concurrently for both, which required Ferstl to use Dolby’s workflow. This involves making all corrections for the HDR version and then allowing the Dolby hardware/software to analyze the images and bring them into the Rec.709 space, where the colorist does a standard dynamic range pass.

Ferstl, who worked with two Sony X-300 monitors, one calibrated for Rec.709 and the other for HDR, explains, “Everyone is used to looking at Rec. 709. Most viewers today will see the show in Rec.709 and that’s really what the clients are most concerned with. At some point, if HDR becomes the dominant way people watch television, then that will probably change. But we had to make corrections in HDR and then wait for the analysis to show us what the revised image looked like for standard dynamic range.”

He elaborates that while the Dolby Vision spec allows the brightest whites to read at 4,000 nits, he and the filmmakers preferred to limit that to 1,000 nits. “If you let highlights go much further than we did,” he says, “some things can become hard to watch. They become so bright that visual fatigue sets in after too long. So we’d sometimes take the brightest portions of the frame and slightly clamp them,” holding the brightest areas of the frame to levels below the maximum the spec allows.
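The clamp Ferstl describes is usually a soft knee rather than a hard cut, so highlight detail rolls off instead of flattening. A hypothetical sketch, where the knee and peak values are illustrative rather than the show’s actual settings:

```python
import numpy as np

def soft_clamp(nits, knee=800.0, peak=1000.0):
    """Pass luminance below `knee` through unchanged; ease values above
    it asymptotically toward `peak`, so nothing ever exceeds the target
    and the transition at the knee stays smooth (matched slope of 1)."""
    x = np.asarray(nits, dtype=float)
    over = np.maximum(x - knee, 0.0)
    span = peak - knee
    return np.where(x <= knee, x, knee + span * (1.0 - np.exp(-over / span)))
```

A hard clip would flatten every highlight above the threshold into one featureless value; the exponential ease keeps relative detail in the rolled-off region, which is why two bright areas still read as different intensities after clamping.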

“Sometimes HDR can be challenging to work with and sometimes it can be amazing,” he allows. Take the vast vistas and snowcapped mountains we first see when the family starts exploring the planet. “You have so much more detail in the snow and an amazing range in the highlights, more than you could ever display in Rec.709,” he says.

“In HDR, the show conveys the power and majesty of these vast spaces beyond what viewers are used to seeing. There are quite a few sections that lend themselves to HDR,” he continues. But as with all such tools, it’s not always appropriate to the story to use the extremes of that dynamic range. Some highlights in HDR can pull the viewer’s attention to a portion of the frame in a way that simply can’t be replicated in Rec. 709 and, likewise, a bright highlight from a practical or a reflection in HDR can completely overpower an image that tells the story perfectly in standard dynamic range. “The tools can re-map an image mathematically,” Ferstl notes, “but it still requires artists to interpret an image’s meaning and feel from one space to the other.”

That brings up another question: How close do you want the HDR and the Rec.709 to look to each other when they can look very different? Overall, the conclusion of all involved on the series was to constrain the levels in the HDR pass a bit in order to keep the two versions in the same ballpark aesthetically. “The more you let the highlights go in HDR,” he explains, “the harder it is to compress all that information for the 100-nit version. If you look at scenes with the characters in space suits, for example, they have these small lights that are part of their helmets and if you just let those go in HDR, those lights become so distracting that it becomes hard to look at the people’s faces.”

Such decisions were made in the grading theater on a case by case basis. “It’s not like we looked at a waveform monitor and just said, ‘let’s clamp everything above this level,’” he explains, “it was ultimately about the feeling we’d get from each shot.”