
Our Virtual Production Roundtable

By Randi Altman

Evolve or die. That old adage, while very dramatic, fits well with the state of our current production workflows. While most productions are now shot digitally, the warmth of film is still in the back of pros’ minds. Camera makers and directors of photography often look for ways to retain that warmth in digital, whether it’s through lighting, vintage lenses, color grading, newer technology or all of the above.

There is also the question of setting looks on-set and how 8K and HDR are affecting the picture and workflows. And let’s not forget shooting for OTT series. There is a lot to cover!

In an effort to get a variety of perspectives, we reached out to a few cinematographers and some camera manufacturers to talk trends and technology. Enjoy!

Claudio Miranda, ASC

Claudio Miranda is a Chilean cinematographer who won an Oscar for his work on Life of Pi. He also worked on The Curious Case of Benjamin Button, the first movie nominated for a cinematography Oscar that was shot entirely on digital. Other films include Oblivion, Tomorrowland and the upcoming Top Gun: Maverick.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
Seems like everyone is shooting large format. Chris Nolan and Quentin Tarantino shot 65mm film for their last projects. New digital cameras such as the Alexa LF and Sony Venice cater to this demand. People seem to like the shallow depth of field of these larger format lenses.

How is HDR affecting the way things are being shot these days? Are productions shooting/monitoring HDR on-set?
For me, too much grain in HDR can be distracting. This must be moderated in the camera acquisition format choice and DI. Panning in a high-contrast environment can cause painful strobing. This can be helped in the DI and set design. HDR done well is more important than 8K or even 3D.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? Or are latitude and framerate more important currently?
8K can be important for VFX plates. For me, creatively it is not important, 4K is enough. The positive of 8K is just more K. The downside is that I would rather the camera companies focus on dynamic range, color latitude, sensitivity and the look and feel of the captured image instead of trying to hit a high K number. Also, there are storage and processing issues.

Can you talk about how shooting streaming content, for OTTs like Netflix/Amazon, has changed production practices, and workflows, if at all?
I have not shot for a streaming service. I do think we need to pay attention to all deliverables and make adjustments accordingly. In the DI, I am there for the standard cinema pass, HDR pass, IMAX pass, home video pass and other formats that arise.

Is the availability of all those camera resolutions a help or a hindrance?
I choose the camera that will fit the job. It is my job in prep to test and pick the camera that best serves the movie.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
On set, I am able to view HDR or 709. I test the pipeline and make sure the LUT is correct and make modifications if needed. I do not play with many LUTs on set, I normally just have one. I treat the camera like a film stock. I know I will be there in the DI to finalize the look. On set is not the place for futzing with LUTs on the camera. My plate is full enough as it is.

If not already covered, how has production changed in the last two years?
I am not sure production has changed, but there are many new tools to use to help make work more efficient and economical. I feel that I have always had to be mindful of the budget, no matter how large the show is. I am always looking for new solutions.

Daryn Okada, ASC
Daryn Okada is known for his work on films such as Mean Girls, Anna Karenina and Just Like Heaven. He has also worked on many TV series, such as Scandal, Grey’s Anatomy and Castle. He served as president of the ASC from 2006 to 2009.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses? 

Modern digital cinema cameras, with the proper workflows and techniques, can achieve a level of quality that lets a story’s visual identity evolve in parallel with the explorations once done by shooting on film. Larger image sensors, state-of-the-art lenses and mining historic optics enable cinematographers to use their experience and knowledge of the past to paint rich visual experiences for today’s audience.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?
HDR is a creative and technical medium just as shooting and projecting 65mm film would be. It’s up to the director and the cinematographer to decide how to orchestrate the use of HDR for their particular story.

Can you address 8K? What are the positives, and the negatives? Do we just have too many Ks these days? Or are latitude and framerate more important currently?
8K is working its way into production much as 65mm and 35mm VistaVision did, by providing more technical resolution for use in VFX or special-venue exhibition. The enormous amount of data, and the cost of handling it, must be justified by its financial return and by whether it benefits a particular story. Latitude and color depth are paramount to creating a motion picture’s palette and texture. Trying to use a format just because it’s technically possible may be distracting to an audience’s acceptance of a story or creative concept.

Can you talk about how shooting streaming content, for OTTs like Netflix/Amazon, has changed production practices, and workflows, if at all?

I think the delivery specifications of OTT have generally raised the bar, making 4K and wide color gamut the norm. Cinematographers who have spent years photographing features are accustomed to creating images with detail for a big screen and a wide color palette. It’s a natural creative process to shoot for 4K and HDR in that respect.

Is the availability of all those camera resolutions a help or a hindrance?
Having the best imaging available is always welcomed. Even if a camera is not fully exploited technically, subtler, richer images are possible through the smoother transitions and blending of color, contrast and detail that come from originating at higher resolutions and with a wider color range.

Can you talk about color management from the sensor/film to the screen? How do you ensure correct color management from the set into dailies and post, the DI and final delivery?
As cinematographers, we are still involved in workflows for dailies and post production to ensure that everyone’s creative efforts on the final production are maintained for the immediate viewer and preserved for audiences in the future.

How has production changed over the last two years?
There are more opportunities to produce content with creative, high-quality cinematography thanks to advancements in cameras and cost-effective computing speed, combined with the demands of high-quality displays and projection.

Vanja Černjul, ASC
This New York-based DP recently worked on the huge hit Crazy Rich Asians. In addition to feature film work, Černjul has shot TV shows (The Deuce’s season 1 finale and two seasons of Marco Polo), as well as commercials for Panasonic and others.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
One interesting trend I noticed is the comeback of image texture. In the past, cinematographers used to expose film stock differently according to the grain texture they desired. Different exposure zones within the same frame had different grain character, which produced additional depth of the image. We lost that once we switched to digital. Crude simulations of film grain, such as overall filters, couldn’t produce the dimensionality we had with film.

Today, I am noticing new ways of bringing the texture back as a means of creative expression. The first one comes in the form of new, sophisticated post production tools designed to replicate the three-dimensional texturing that occurs naturally when shooting film, such as the realtime texturing tool LiveGrain. Monitoring the image on the set with a LiveGrain texture applied can impact lighting, filtration or lens choices. There are also new ways to manipulate texture in-camera. With the rise of super-sensitive, dual-native ISO sensors we can now shoot at very low-light levels and incorporate so-called photon shot noise into the image. Shot noise has organic character, very much like film grain.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?

The creative potential of HDR technology is far greater than that of added resolution. Unfortunately, it is hard for cinematographers to take full advantage of HDR because it is still far from being the standard way the audience sees our images. We can’t have two completely different looks for a single project, and we have to make sure the images are working on SDR screens. In addition, it is still impractical to monitor in HDR on the set, which makes it difficult to adjust lighting and lens choices to expanded dynamic range. Once HDR screens become a standard, we will be able to really start creatively exploring this new territory.

Crazy Rich Asians

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? Or are latitude and framerate more important currently?
Additional resolution adds more available choices regarding relationship of optical systems and aspect ratios. I am now able to choose lenses for their artifacts and character regardless of the desired aspect ratio. I can decide to shoot one part of the film in spherical and the other part in anamorphic and crop the image to the project’s predetermined aspect ratio without fear of throwing away too much information. I love that freedom.

Can you talk about how shooting streaming content, for OTTs like Netflix/Amazon, has changed production practices and workflows, if at all?
For me, the only practical difference between shooting high-quality content for cable or streaming is the fact that Netflix demands their projects to be captured in true 4K RAW. I like the commitment to higher technical standards, even though this may be an unwelcome restriction for some projects.

Is the availability of all those camera resolutions a help or a hindrance?
I like choices. As large format lenses become more available, shooting across formats and resolutions will become easier and simpler.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
The key for correct color management from the set to final color grading is in preproduction. It is important to take the time to do proper tests and establish the communication between DIT, the colorist and all other people involved as early as possible. This ensures that original ideas aren’t lost in the process.

Adjusting and fine-tuning the LUT to the lenses, lighting gels and set design and then testing it with the colorist is very important. Once I have a bulletproof LUT, I light and expose all the material for it specifically. If this part of the process is done correctly, the time in final color grading can be spent on creative work rather than on fixing inconsistencies.

I am very grateful for ACES workflow, which offers long-overdue standardization. It is definitely a move in the right direction.
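For readers curious what that standardization looks like in practice, here is a minimal sketch using the OpenColorIO Python bindings, which are commonly used to implement ACES pipelines. The config path is a placeholder, and standard OCIO role names are used instead of hard-coded color space names, since those vary between configs:

```python
# Minimal sketch of an OpenColorIO/ACES-style conversion. Assumes an ACES
# OCIO config exists on disk; "aces_config.ocio" is a placeholder path.
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromFile("aces_config.ocio")  # hypothetical path

# Roles ("compositing_log", "scene_linear") resolve to whatever color spaces
# the config maps them to, so the script isn't tied to one config's names.
processor = config.getProcessor("compositing_log", "scene_linear")
cpu = processor.getDefaultCPUProcessor()

# Transform a single 18% gray pixel from the log working space to scene-linear.
print(cpu.applyRGB([0.18, 0.18, 0.18]))
```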

How has production changed over the last two years?
With all the amazing post tools that are becoming more available and affordable, I am seeing negative trends: further cutting of preproduction time and a lack of creative discipline on the set. I sincerely hope this is just a temporary confusion due to recalibration of the process.

Kate Reid, DP
Kate Reid is a UK-based DP working in TV and film. Her recent work includes the TV series Hanna (Amazon) and Marcella 2 (Netflix), as well as additional photography on the final season of Game of Thrones for HBO. She is currently working on Press for the BBC.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
Large format cameras are being used increasingly on drama productions to satisfy the requirement for additional resolution from certain distribution platforms. And, of course, the choice to use large format cameras in drama gives DPs another aesthetic tool: choosing whether the increased depth-of-field falloff, clarity in the image and so on enhance the particular story they wish to portray on screen.

Like many other DPs, I have always enjoyed using older lenses to help make the digital image softer, more organic and less predictable. But the larger format cameras now mean that much of this older glass, designed for 35mm-sized sensors, may not cover the increased sensor size, so newer lenses designed for the larger format cameras may become popular by necessity, alongside older large format glass that is enjoying a renaissance.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?
I have yet to shoot a show that requires HDR delivery. It hasn’t yet become the default in drama production in the UK.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? Or are latitude and frame rate more important currently?
I don’t inherently find an ultra sharp image attractive. Through older glass and diffusion filters on the lens, I am usually looking to soften and break down my image, so I personally am not all about the extra Ks. How the camera’s sensor reproduces color and handles highlights and shadows is of more interest to me, and I believe has more impact on the picture.

Of primary importance is how practical a camera is to work with — size and how comfortable the camera is to handle would supersede excessive resolution — as the first requirement of any camera has got to be whether it allows you to achieve the shots you have in mind, because a story isn’t told through its resolution.

Can you talk about how shooting streaming content for OTTs, like Netflix/Amazon, has changed production practices, and workflows, if at all?
The major change is the requirement by Netflix for true 4K resolution, determining which cameras cinematographers are allowed to shoot on. For many cinematographers the Arri Alexa was their digital camera of choice, which was excluded by this rule, and therefore we have had to look to other cameras for such productions. Learning a new camera, its sensor, how it handles highlights, produces color, etc., and ensuring the workflow through to the post facility is something that requires time and testing, which has certainly added to a DP’s workload.

From a creative perspective, however, I found shooting for OTTs (I shot two episodes of the TV series Hanna, made by Working Title TV and NBC Universal for Amazon) more liberating than making a series for broadcast television, as there is a different idea and expectation around what the audience wants to watch and enjoy in terms of storytelling. This allowed for a more creative way of filming.

Is the availability of all those camera resolutions a help or a hindrance?
Where work is seen now can vary from a mobile phone screen to a digital billboard in Times Square, so it is good for DPs to have a choice of cameras and their respective resolutions so we can use the best tool for each job. It only becomes a hindrance if you let the technology lead your creative process rather than assist it.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
Ideally, I will have had the time and opportunity to shoot tests during prep and then spend half a day with the show’s colorist to create a basic LUT I can work with on set. In practice, I have always found that I tweak this LUT during the first days of production with the DIT, and this is what serves me throughout the rest of the show.

I usually work with just one LUT that will be some version of a modified Rec. 709 (unless the look of the show drastically requires something else). It should then be straightforward in that the DIT can attach a LUT to the dailies, and this is the same LUT applied by editorial, so that exactly what you see on set is what is being viewed in the edit.

However, where this fails is that the dailies uploaded to FTP sites — for viewing by the execs, producers and other people who have access to the work — are usually very compressed and low resolution, so they bear little resemblance to how the work looked on set or looks in the edit. This is really unsatisfying, as for months key members of production are not seeing an accurate reflection of the picture. Of course, when you get into the grade this can be restored, but it’s dangerous if those viewing the dailies in this way have grown accustomed to something that is a pale comparison of what was shot on set.
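As a purely illustrative aside, the mechanical step of attaching that LUT to dailies (so the edit at least starts from the on-set look) can happen at transcode time. A hypothetical sketch using ffmpeg’s lut3d filter, with invented file names; in practice this is usually handled by dedicated dailies software:

```python
# Hypothetical example: bake the show LUT (.cube) into compressed dailies with
# ffmpeg's lut3d filter, so editorial sees the same look monitored on set.
# File names are placeholders.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "A001_C003_0815XY.mov",
    "-vf", "lut3d=show_lut.cube",      # apply the on-set 3D LUT
    "-c:v", "libx264", "-crf", "18",   # compressed dailies codec
    "-c:a", "copy",
    "A001_C003_0815XY_dailies.mov",
], check=True)
```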

How has production changed over the last two years?
There is less differentiation between film and television in how productions are being made and, critically, where they are being seen by audiences, especially with online platforms now making award-winning feature films. The high production values of Netflix and Amazon’s biggest shows have seen UK television dramas pushing to up their game, which does put pressure on productions, shooting schedules and HODs, as the budgets to help achieve this aren’t there yet.

So, from a ground-level perspective, for DPs working in drama this looks like more pressure to produce work of the highest standard in less time. However, it’s also a more exciting place to be working, as the ideas about how you film something for television versus cinema no longer need apply. The perceived ideas of what an audience is interested in, or expects, are being blown out of the water by the success of new original online content, which flies in the face of more traditional storytelling. Broadcasters are noticing this and, hopefully, this will lead to more exciting and cinematic mainstream television in the future.

Blackmagic’s Bob Caniglia
In addition to its post and broadcast tools, Blackmagic offers many different cameras, including the Pocket Cinema Camera, Pocket Cinema Camera 4K, Micro Studio Camera 4K, Micro Cinema Camera, Studio Camera, Studio Camera 4K, Ursa Mini Pro, Ursa Mini 4.6K and Ursa Broadcast.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
Lens freedom is on everyone’s mind right now… having the freedom to shoot in any style. This is leading to things like projects being shot on 50-year-old glass because the DP liked the feel of a commercial from back in the ’60s.

We actually just had a customer test out actual lenses that were used on The Godfather, The Shining and Casablanca, and it was amazing to see the mixing of those with a new digital cinema camera. And so many people are asking for a camera to work with anamorphic lenses. The trend is really that people expect their camera to be able to handle whatever look they want.

For large format, I would say that both Hollywood and indie filmmakers are using it more often. Or, at least, they’re trying to get the general large format look by using anamorphic lenses for a shallow depth of field.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?
Right now, HDR is definitely more of a concern for DPs in Hollywood, but also with indie filmmakers and streaming service content creators. Netflix and Hulu have some amazing HDR shows right now. And there is plenty of choice when it comes to the different HDR formats and shooting and monitoring on set. All of that is happening every day, while 8K still needs the industry to catch up with the various production tools.

As for impacting shooting, HDR is about more immersive color, and a DP needs to plan for it. It gives viewers a whole new level of image detail, so DPs have to be much more aware of every surface and lighting decision so that the viewer doesn’t get distracted. Attention to detail gets even higher in HDR, and DPs and colorists will need to keep a close eye on every shot, including when an image in a side-view mirror’s reflection is just a little too sharp and needs a tweak.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? Or are latitude and framerate more important currently?
You can never have enough Ks! Seriously. It is not just about getting a beautiful 8K TV; it is about giving the production and post pros on a project as much data as possible. More data means more room to be creative, and it is great for things like keying.

Latitude and framerate are important as well, and I don’t think any one is more important than another. For viewers, the beauty will be in large displays. You’re already seeing 8K displays in Times Square, and though you may not need 8K on your phone, 8K on the side of a building or highway will be very impactful.

I do think one of the ways 8K is changing production practices is that people are going to be much more storage conscious. Camera manufacturers will need to continue to improve workflows as the images get larger in an effort to maximize storage efficiencies.

Can you talk about how shooting streaming content, for OTTs like Netflix/Amazon, has changed production practices, and workflows, if at all?
For streaming content providers, shoots have definitely been impacted and are forcing productions to plan for shooting in a wider number of formats. Luckily, companies like Netflix have been very good about specifying up front the cameras they approve and which formats are needed.

Is the availability of all those camera resolutions a help or a hindrance?
While it can be a bit overwhelming, it does give creatives some options, especially if they have a smaller delivery size than the acquisition format. For instance, if you’re shooting in 4K but delivering in HD, you can do dynamic zooms from the 4K image that look like an optical zoom, or you can get a tight shot and wide shot from the same camera. That’s a real help on a limited budget of time and/or money.
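For a rough sense of how that punch-in works in practice, the sketch below crops a centered HD window out of a UHD master, which yields roughly a 2x tighter framing with no upscaling. File names and codec choice are hypothetical:

```python
# Rough sketch: pull an HD "punch-in" out of a UHD (3840x2160) master using
# ffmpeg's crop filter, which centers the crop window by default. Cropping a
# 1920x1080 window gives roughly a 2x tighter shot without upscaling.
# File names are placeholders.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "wide_master_uhd.mov",
    "-vf", "crop=1920:1080",                 # centered HD window from the UHD frame
    "-c:v", "prores_ks", "-profile:v", "3",  # an editorial-friendly mezzanine codec
    "tight_shot_hd.mov",
], check=True)
```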

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
Have the production and post people plan together from the start, and create the look everyone should be working toward right up front.

Set the LUTs you want before a single shot is done and manage the workflow from camera to final post. Also, choose post software that can bring color correction on-set, near-set and off-set. That lets you collaborate remotely. Definitely choose a camera that works directly with any post software, and avoid transcoding.

How has production changed in the last two years?
Beyond the rise of HDR, one of the other big changes is that more productions are thinking live and streaming more than ever before. CNN’s Anderson Cooper now does a daily Facebook Live show. AMC has the live Talking Dead-type formats for many of their shows. That trend is going to keep happening, so cinematographers and camera people need to be thinking about being able to jump from scripted to live shooting.

Red Digital Cinema’s Graeme Nattress
Red Digital Cinema manufactures professional digital cameras and accessories. Red’s DSMC2 camera offers three sensor options — Gemini 5K S35, Helium 8K S35 and Monstro 8K VV.

Can you talk about some camera trends you’ve been seeing?
Industry camera trends continue to push image quality in all directions. Sensors are getting bigger, with higher resolutions and more dynamic range. Filmmakers continue to innovate, making new and amazing images all the time, which drives our fascination for advancing technology in service to the creative.

How is HDR affecting the way things are being shot these days?
One of the benefits of a primary workflow based on RAW recording is that HDR is not an added extra, but a core part of the system. Filmmakers do consider HDR important, but there’s some concern that HDR doesn’t always look appealing, and that it’s not always an image quality improvement. Cinematography has always been about light and shade and how they are controlled to shape the image’s emotional or storytelling intent. HDR can be a very important tool in that it greatly expands the display canvas to work on, but a larger canvas doesn’t mean a better picture. The increased display contrast of HDR can make details more visible, and it can also make motion judder more apparent. Thus, more isn’t always better; it’s about how you use what you have.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? What’s more important, resolution or dynamic range?
Without resolution, we don’t have an image. Resolution is always going to be an important image parameter. What we must keep in mind is that camera resolution is based on input resolution to the system, and that can — and often will — be different to the output resolution on the display. Traditionally, in video the input and output resolutions were one and the same, but when film was used — which had a much higher resolution than a TV could display — we were taking a high-resolution input and downsampling it to the display, the TV screen.

As with any sampled system, in a digital cinema camera there are some properties we seek to protect and others to diminish. We want a high level of detail, but we don’t want sharpening artifacts and we don’t want aliasing. The only way to achieve that is through a high-resolution sensor, properly filtered (optical low-pass), that can see a large amount of real, un-enhanced detail. So yes, 8K can give you lots of fine detail should you want it, but the imaging benefits extend beyond downsampling to 4K or 2K. 8K makes for an incredibly robust image: noise is reduced, and what noise remains takes on more of a texture, which is much more aesthetically pleasing.
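A toy illustration of why that oversampling tames noise: averaging non-overlapping 2x2 blocks of pixels, which is effectively what an 8K-to-4K style downsample does, combines four largely independent noise samples and cuts random noise roughly in half. The numbers below are synthetic, not camera data:

```python
# Toy sketch: a 2x2 block average (8K -> 4K style downsample) reduces
# independent random noise by roughly sqrt(4) = 2x. Synthetic values only;
# the frame size here is scaled down for convenience.
import numpy as np

rng = np.random.default_rng(0)
frame = 0.18 + rng.normal(0.0, 0.02, size=(2160, 4096))  # noisy flat-gray frame

# Average each non-overlapping 2x2 block to halve the resolution.
half = frame.reshape(1080, 2, 2048, 2).mean(axis=(1, 3))

print("full-res noise std:   ", frame.std())  # ~0.020
print("downsampled noise std:", half.std())   # ~0.010
```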

One challenge of 8K is an increase in the amount of sensor data to be recorded, but that can be addressed through quality compression systems like RedCode.

Addressing dynamic range is very important because dynamic range and resolution work together to produce the image. It’s easy to think that high resolutions have a negative impact upon dynamic range, but improved pixel design means you can have dynamic range and resolution.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
Color management is vitally important and so much more than just keeping color control from on-set through to delivery. Now, with the move to HDR and an increasing amount of mobile viewing, we have a wide variety of displays, all with their own characteristics and color gamuts. Color management allows content creators to display their work at maximum quality without compromise. Red cameras help in multiple ways. On camera, one can monitor in both SDR and HDR simultaneously with the new IPP2 image processing pipeline’s output independence, which also allows you to apply color via CDL and a creative 3D LUT in such a way that those decisions are represented correctly on different monitor types.

In post and grading, the benefits of output independence continue, but now it’s critical that scene colors, which can so easily go out of gamut, are dealt with tastefully. Through the metadata support in the RedCode format, all the creative decisions taken on set follow through to dailies and post, but never get in the way of producing the correct image output, be it for VFX, editorial or grading.

Panavision’s Michael Cioni 
Panavision designs and manufactures high-precision camera systems, including both film and digital cameras, as well as lenses and accessories for the motion picture and television industries.

Can you talk about some camera trends you’ve been seeing?
With the evolution of digital capture, one of the most interesting things I’ve noticed in the market is the new trends emerging from the optics side of cinematography. At a glance, it can appear as if there is a desire for older or vintage lenses based on the increasing resolution of large format digital cameras. While resolution is certainly a factor, I’ve noticed the larger driver of vintage glass is the quality of sensors, not the resolution itself. As sensors increase in resolution, they simultaneously show improvements in clarity, low-light capability, color science and signal-to-noise ratio.

The compounding effect of all these elements is improving images far beyond what was possible with analog film technology, which explains why the same lens behaves differently on film, S35 digital capture and large-format digital capture. As these looks continue to become popular, Panavision is responding through our investments both in restoring classic lenses and in designing new lenses with classic characteristics and textures that are optimized for large format photography on super sensors.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look?
Creating images is not always about what component is better, but rather how they elevate images by working in concert. HDR images are a tool that increases creative control alongside high resolution and 16-bit color. These components work really well together because a compelling image can make use of more dynamic range, more color and more clarity. Its importance is only amplified by the amalgamation of high-fidelity characteristics working together to increase overall image flexibility.

Today, the studios are still settling into an HDR world because only a few groups, led by OTT, are able to distribute in HDR to wide audiences. On-set tools capable of HDR, 4K and 16-bit color are still in their infancy and currently cost-prohibitive. 4K/HDR on the set is going to become a standard practice by 2021. 4K wireless transmitters are the first step — they are going to start coming online in 2019. Smaller OLED displays capable of 750 nits+ will follow in 2020, creating an excellent way to monitor higher quality images right on set. In 2021, editorial will start to explore HDR and 4K during the offline process. By 2024, all productions will be HDR from set to editorial to post to mobile devices. Early adopters that work out the details today will find themselves ahead of the competition and having more control as these trends evolve. I recommend cinematographers embrace the fundamentals of HDR, because understanding the tools and trends will help prevent images from appearing artificial or overdone.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? What’s more important, resolution or dynamic range?
One of the reasons we partnered with Red is that the Monstro 8K VV sensor makes no sacrifice in dynamic range while still maintaining ultra-high smoothness at 16 bits. The beauty of technology like this is that we can finally start to have the best from all worlds — dynamic range, resolution, bit depth, magnification, speed and workflow — without having to make quality sacrifices. When cinematographers have all these elements together, they can create images never seen before, and 8K is as much a part of that story as any other element.

One important way to view 8K is not solely as a thermometer for high-resolution sharpness. A sensor with 35 million pixels is necessary in order to increase the image size, similar to trends in professional photography. 8K large format creates a larger, more magnified image with a wider field of view and less distortion, like the difference in images captured by 70mm film. The biggest positive I’ve noticed is that DXL2’s 8K large-format Red Monstro sensor is so good in terms of quality that it isn’t impacting images themselves. Lower quality sensors can add a “fingerprint” to the image, which can distort the original intention or texture of a particular lens.

With sensors like Monstro capable of such high precision, the lenses behave exactly as the lens maker intended. The same Panavision lenses on a lower grade sensor, or even on 35mm film, are exhibiting characteristics that we weren’t able to see before. This is breathing new life into lenses that simply didn’t perform this way until Monstro and large format.

Is the availability of so many camera formats a help or a hindrance?
You don’t have to look far to identify individuals who are easily fatigued by having too many choices. Some of these individuals cope with choices by finding ways to regulate them, and they feel fewer choices means more stability and perhaps more control (creative and economic). As an entrepreneur, I find the opposite to be true: I believe regulating our world, especially with regards to the arts and sciences, is a recipe for protecting the status quo. I fully admit there are situations in which people are fatigued by too many complex choices.

I find that the failure is not of the technology itself; rather, it’s the fault of the manufacturers who have not provided the options in easy-to-consume ways. Having options is exactly what creatives need in order to explore something new and improved. But it’s also up to manufacturers to deliver the message in ways everyone can understand. We’re still learning how to do that, and with each generation the process changes a bit. And while I am not always certain which are the best ways to help people understand all the options, I am certain that the pursuit of new art will motivate us to go out of our comfort zones and try something previously thought not possible.

Have you encountered any examples of productions that have shot streaming content (i.e. for Netflix/Amazon) and had to change production practices and workflows for this format/deliverable?
Netflix and Amazon are exceptional examples of calculated risk takers. While most headlines discuss their investment in the quantity of content, I find the most interesting investment they make is in relationships. Netflix and Amazon are heavily invested in standards groups, committees, outreach, panels and constant communication. The model of the past and present (incumbent studios) is content creators with technology divisions. The model of the future (Netflix, Amazon, Hulu, Apple, Google and YouTube) is technology companies with the ability to create content. And technology companies approach problems from a completely different angle: they don’t just embrace the technology, they help invent it. In this new technological age, those who lead and those who follow will likely be determined by the tools and techniques used to deliver. What I call “The Netflix Effect” is the impact Netflix has on traditional groups and how they have all had to strategically pivot based on Netflix’s impact.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
The DXL2 has an advanced color workflow. In collaboration with LiveGrade by Pomfort, the DXL2 can capture looks wirelessly from DITs in the form of CDLs and LUTs, which are not only saved into the camera’s metadata but also baked into in-camera proxy files in the form of Apple ProRes or Avid DNx. These files contain visual references of the exact looks viewed on monitors and can be delivered directly to post houses, or even editors. This improves creative control because it eliminates the guesswork in the application of external color decisions and streamlines it back to the camera, where the core database is kept with all the other camera information. This metadata can be traced throughout the post pipeline, which also streamlines the process for all entities that come in contact with camera footage.
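For reference, the CDL being passed around in a pipeline like this is a deliberately small transform: per-channel slope, offset and power, plus a single saturation value. The sketch below implements the standard ASC CDL math with arbitrary example values; it is an illustration of the published formula, not Panavision’s or Pomfort’s implementation:

```python
# Minimal sketch of the standard ASC CDL math:
# per channel, out = (in * slope + offset) ** power, then a single saturation
# adjustment using Rec. 709 luma weights. Values below are arbitrary examples.
import numpy as np

REC709_LUMA = np.array([0.2126, 0.7152, 0.0722])

def apply_asc_cdl(rgb, slope, offset, power, sat):
    rgb = np.asarray(rgb, dtype=np.float64)
    # Clamp negatives before the power function, as the CDL spec expects.
    sop = np.clip(rgb * slope + offset, 0.0, None) ** power
    luma = np.sum(sop * REC709_LUMA, axis=-1, keepdims=True)
    return luma + sat * (sop - luma)

# One pixel pushed through an example grade (not from any real show).
print(apply_asc_cdl([0.18, 0.20, 0.16],
                    slope=[1.10, 1.00, 0.95],
                    offset=[0.01, 0.00, -0.01],
                    power=[1.00, 1.00, 1.05],
                    sat=0.9))
```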

How has production changed over the last two years?
Sheesh. A lot!

ARRI’s Stephan Ukas-Bradley
The ARRI Group manufactures and distributes motion picture cameras, digital intermediate systems and lighting equipment. Their camera offerings include the Alexa LF, Alexa Mini, Alexa 65, Alexa SXT W and the Amira.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
Large format opens some new creative possibilities, using a shallow depth of field to guide the audience’s view and provide a wonderful bokeh. It also conveys a perspective truer to the human eye, resulting in a seemingly increased dimensional depth. The additional resolution combined with our specially designed large format Signature Primes result in beautiful and emotional images.

Old and vintage lenses can enhance a story. For instance, Gabriel Beristain, ASC, used Bausch & Lomb Super Baltars on the Starz show Magic City, and Bradford Young used detuned DNA lenses in conjunction with the Alexa 65 on Solo: A Star Wars Story. Certain characteristics like flares, reflections, distortions and focus fall-off are very difficult to recreate organically in post, so vintage lenses provide an easy way to create a unique look for a specific story and a way for the director of photography to maintain creative control.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?
Currently, things are not done much differently on set when shooting HDR versus SDR. While it would be very helpful to monitor in both modes on set, HDR reference monitors are still very expensive and very few productions have the luxury to do that. One has to be aware of certain challenges when shooting for an HDR finish. High-contrast edges can result in a more pronounced stutter/strobing effect when panning the camera, and windows that are blown out in SDR might retain detail in the HDR pass, so all of a sudden a ladder or a grip stand is visible.

In my opinion, HDR is more important than higher resolution. HDR is resolution-independent in regard to viewing devices like phones and tablets, gives the viewer a perceived increase in sharpness and is more immersive than increased resolution. Also, let’s not forget that we are working in the motion picture industry and that we are either capturing moving objects or moving the camera, and with that introducing motion blur. Higher resolution only makes sense to me in combination with higher frame rates, and that in turn will start a discussion about aesthetics, as it may look hyper-real compared to traditional 24fps capture. Resolution is one aspect of overall image quality, but in my opinion extended dynamic range, signal/noise performance, sensitivity, color separation and color reproduction are more important.

Can you talk about how shooting streaming content for OTTs, like Netflix/Amazon, has changed production practices and workflows, if at all?
Shooting streaming content has really not changed production practices or workflows. At ARRI, we offer very flexible and efficient workflows and we are very transparent documenting our ARRIRAW file formats in SMPTE RDD 30 (format) and 31 (processing) and working with many industry partners to provide native file support in their products.

Is the availability of all those camera resolutions a help or a hindrance?
I would look at all those different camera types and resolutions as different film stocks and recommend that creatives shoot their own tests and select the camera system based on what suits their project best.

We offer the ARRI Look Library for the Amira, Alexa Mini and Alexa SXT (SUP 3.0), which is a collection of 87 looks, each of them available in three different intensities provided in Rec. 709 color space. Those looks can either be recorded or used only for monitoring. These looks travel with the picture, embedded in the metadata of the ARRIRAW file, QuickTime Atom or HD-SDI stream in the form of the actual LUT and ASC CDL. One can also create a look dynamically on set, feeding the look back to the camera and having the ASC CDL values embedded in the same way.

More commonly, one would record in either ARRIRAW or ProRes LogC while applying a standard Rec. 709 look for monitoring. The “C” in LogC stands for Cineon, a film-like response very much like that of a scanned film image. Colorists and post pros are very familiar with film, and color grading LogC images is easy and quick.
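For a numerical sense of that Cineon-like response, the sketch below encodes scene-linear values with the LogC (v3) curve using the constants ARRI publishes for EI 800. Treat the numbers as approximate and verify them against ARRI’s current documentation before relying on them:

```python
# Rough sketch of ARRI's LogC (v3) encoding for EI 800, mapping scene-linear
# values to the 0-1 log signal. Constants are taken from ARRI's published
# white paper for this one exposure index; treat them as approximate.
import math

CUT, A, B = 0.010591, 5.555556, 0.052272
C, D = 0.247190, 0.385537
E, F = 5.367655, 0.092809

def linear_to_logc_ei800(x):
    # Logarithmic segment above the cut point, linear toe below it.
    if x > CUT:
        return C * math.log10(A * x + B) + D
    return E * x + F

# 18% gray lands around 0.39 on the LogC scale at EI 800.
print(round(linear_to_logc_ei800(0.18), 3))
```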

How has production changed over the last two years?
I don’t have the feeling that production has changed a lot in the past two years, but with the growing demand from OTTs and increased production volume, it is even more important to have a reliable and proven system with flexible workflow options.

Main Image: DP Kate Reid.

DITs: Maintaining Order on Set

By Karen Moltenbrey

The DIT, or digital imaging technician, can best be described as that important link between on-set photography and post production. Part of the camera crew, the DIT works with the cinematographer and post production on the workflow, camera settings, signal integrity and image acquisition. Much more than a data wrangler, a DIT ensures the technical quality control, devises creative solutions involving photography technology and sees that the original camera data and metadata are backed up regularly.

Years ago, the DIT’s job was to solve issues as the industry transitioned from film to digital. But today, with digital being so complex and involving many different formats, this job is more vital than ever, sweating the technical stuff so that the DP and others can focus on their work for a successful production. In fact, one DIT interviewed for this piece notes that the job today focuses less on fine-tuning the live look than it did in the past. One reason for that is the many available tools that enable the files to be shaped more carefully in post.

The DITs interviewed here note that the workflow usually changes from production to production. “If you ask 10 different DITs what they do, they would probably give you 10 different answers,” says one. Still, the focus remains the same: to assist the DP and others, ensuring that everyone and everything is working in concert.

And while some may question whether a production needs the added expense of a DIT, perhaps a better question would be whether they can afford not to have one.

Here, two DITs discuss their ever-changing workflows for this important job.

Michele deLorimier 
Veteran DIT Michele deLorimier describes the role of a digital imaging technician as a problem solver. “It’s like doing puzzles — multiple, different-size puzzles that have to be sorted out,” she says. “It always involves problem solving, from trying to fix the director’s iPhone to the tech parameter settings in the cameras to the whole computer having to be torn apart and put back together. All the while, shooting has not stopped and footage is accumulating.”

There are often multiple cameras, and the footage needs to be downloaded and QC’d, and cards erased and sent back into rotation in order to continue shooting. “So, I guess the greatest tool on the cart is the complete computer workstation, and if it is having a problem, it requires high-gear, intense problem solving,” she adds.

And through it all, deLorimier and her fellow DITs must keep their cool and come up with a solution — and fast.

deLorimier has been working as a DIT for many years now. She honed her problem-solving skills working at live concerts, where she had to be fast on her feet while working with live control of multiple cameras through remote control units and paint boxes. “I’d sit at a switcher, with a stack of monitors and one big monitor, and keep the look consistent — black levels, paint controls — on all cameras, live.”

Later, this segued into setting up and controlling on- and off-board videotape and data-recorder digital cinema cameras on set for commercial film production.

“I just kind of fell into [DIT work] because of what I had done, and then it just continued to evolve,” says deLorimier. With the introduction of digital cinema cameras, DITs with a film and video background were needed during the transition period — spawning the term “digital imaging technician.”

“It went from being tape-based, where you’re creating and baking in a look while you’re shooting, to tape-based where you’re shooting sort of a flat pass and creating a timeline of looks you’re delivering alongside the videotape. And then to data recording, delivering files and additionally honing the look after the footage is ingested,” she says.

Among the equipment deLorimier uses is a reference-grade monitor "that must be calibrated properly," she says, a way to objectively assess exposure, such as a waveform monitor, and some method of objectively assessing color, such as a vectorscope. That is the base-level equipment. For commercials, efficient hardware and software are needed for downloading, manipulating and QC'ing the footage, color correcting it and creating deliverables for post.

deLorimier prefers Flanders Scientific monitors — she has six for various tasks: a pair of 25 inch, a 24 inch, a pair of 21 inch and a 17 inch — as well as a Leader waveform monitor/vectorscope.

“We’re using wireless video a lot these days so we can move around freely and the cables aren’t all over the ground to trip on,” she says. “That part of the chain can have the incorrect setting, so it’s important to ensure that everything is [set at] baseline and that what you are adding to it — usually some form of a LUT to the livestream — is baseline too.” This starts with settings in the camera and then anything the video signal chain might touch.

Then there are the various software packages, drivers, readers, cables and power management, which change and get updated regularly. Thus, deLorimier stresses that any software change should be tested and updated during prep to ensure compatibility. “There are unexpected things that you can’t prep for. There are times when you show up at a shoot and will be told, ‘We shot some drone footage yesterday,’ and it’s with a camera that you had no control over the settings,” she says. “So, the more you can prep for, the higher the rate of success you will have.”

Over the years, deLorimier has worked on a variety of productions, from features to TV commercials, with each type of project requiring a different setup. Preparing for a commercial usually entails physically prepping equipment and putting pieces together, as well as checking its end-to-end signal chain, from camera settings, through distribution of the video signal, to the final destination for monitoring and data delivery.

A day before this interview, deLorimier finished a Subaru commercial, shooting in Sequoia National Forest for the first few days, then Griffith Park and some areas around LA. Before that was a multi-unit job for a Nike spot that was filmed in numerous cities over the course of five days. For that project, each of the DITs for the A, B and C units had to coordinate with one another for consistency, ensuring that the cameras would be set up the same way, that they had the same specs and were delivering a similar look. “We were shooting with big projectors onto buildings and screens, and the cameras needed to sync to the projectors in some instances,” deLorimier explains.

According to deLorimier, it is unusual for the work of a DIT not to be physical. “We’re on the move a lot,” she says, crediting her past concert experience for her ability to adjust to adverse and unexpected conditions. “And we are not working in a controlled environment, but we do our best under the constraints we have and always try to keep post in mind.”

She recalls one physically demanding job that required three consecutive nights of shooting in the rain near Santa Barbara, to film a train coming down the tracks. Part of the crew was on one side of the tracks, and part on the other. And deLorimier was in a cornfield with her carts, computer system and monitors, inside a tent to keep dry. “They kept calling me to come to B camera. But I was also remotely setting up live looks inside my tent.

“I had a headlamp on because I had to deal with cables and stuff in my tent, and at one point illuminated by my headlamp, I could see that there were at least 45 snails crawling up the inside of my tent and cart. I was getting mud on my glasses and in my eyes. Then my whole cart, which was pretty heavy, started tipping and tilting, and I was bracing myself and my feet were starting to get sucked into the mud in the mole holes that were filling with rainwater. I couldn’t even call for help because it took both of my hands to hold up the cart, and the snails were everywhere! And, through it all, they kept calling on the walkie-talkie, ‘Michele, B camera needs you. The train’s coming.’”

Insofar as acquisition formats are concerned, deLorimier says that it’s higher resolution and almost always raw files for commercials these days. “A minimum of 4K is almost mandatory across the board,” she notes. And if the project is shooting with Red Digital Cinema cameras, it is between 6K and 8K, as the team she works with mostly use Red Monstros or ARRIRAW. She also works with Phantom Cine raw files.

“The higher data rates have definitely given me more gray hairs,” says deLorimier with a smile. “There’s no downtime. There’s always six or seven balls in the air, and there’s very little room for error or any fixing on set. This is also why the prep day is vital; so much can be worked out and pre-run during the prep, and this pays off for production during the shoot.”

Francesco Luigi Giardiello
Francesco Luigi Giardiello defines his role as that of an on-set workflow supervisor, as opposed to a traditional DIT. “Over the last five to 10 years, I have been designing a workflow that basically extends from set to post production, focusing on the whole pipeline so we don’t have to throw away what has been done on set,” he says.

Giardiello has been designing a pipeline based on a white balance match, which he says is quite unusual in the business because everything gets done through a simplified and more standardized color grading. “We designed something that goes a bit deeper into the color science and works with the Academy’s ACES workflow, trying to establish a common working colorspace, common color pipeline and a common method to control and manipulate colors. This — across any possible camera or source media used in production — is to provide balanced and consistent footage to the DI and visual effects teams. This allows the CG to be applied without having to spend time on balancing and tweaking the color of the shots.”

The Thor team (L-R): Francesco Giardiello, Kramer Morgenthau ASC (DP), Fabio Ferrantini (data manager).

This is important, especially today, where people are shooting with different digital systems. Or maybe even film and digital cameras, plus different lenses, so the shots look very different, even with the same lighting conditions. To this end, Giardiello’s role as DIT would be to grade or match everything so it all looks the same.

“Normally this gets done by using color tools, some of which are more sophisticated than others. When the tools are too sophisticated, they are intractable in the workflow and, therefore, become useless after leaving the set. When they are too ‘simple,’ like CDLs, often they are insufficient in correctly balancing the shots. And, because they are applied during a stage of the pipeline where the cinematographer’s look is introduced, they end up lost or often convolute the pipeline,” he notes. “We designed a system where the color balance occurs before any other color grading, and then the color grading is applied just as a look.”

Giardiello is currently in production on Marvel Studios’ Spider-Man: Far from Home, scheduled for release July 5, 2019. Not his first trip into the Marvel universe, he has worked on Thor: The Dark World, in addition to a number of episodic TV series and other big VFX productions, including Jurassic World and Aladdin. “You are the ambassador of the post production and VFX work,” he explains. “You have to foresee any technical issues and establish a workflow that will facilitate that work. So, doing my job without being on set would be a complete waste of time. Sure, I can work in the studios and post production facilities to design workflows that will work without a DIT, but the problem is that things happen on set because that’s where decisions get made.”

As Giardiello points out, the other departments, such as camera and VFX, even the cinematographers, have different priorities and different jobs to fulfill. So, they’re not necessarily spending the time to ensure that every camera, every lens and every setting is in line with a consistent workflow to match the others. “They tend to shoot with whatever camera or medium they think is best and then expect that VFX or post will be able to fit that into an existing workflow.”

On average, Giardiello spends a few weeks of prep to design a project’s workflow, probably longer than producers and production companies would like. But, he believes that the more you plan, the less you have to deal with on set and in post production. When a shoot is finished, he will spend a week or two with the post facility, more to facilitate the handoff than to fix major issues.

Jurassic World was shot with 6K Arri Alexa 65s and the 8K Red Digital Cinema Helium camera, but the issue with high-resolution cameras is the amount of data they generate. “When you start shooting 4, 5, 6 or 8 terabytes a day, you have to make sure you are on set as a data point and that post production is capable of handling all this incoming data,” Giardiello advises. To this end, he had been working with Pinewood Digital to streamline a workflow for moving the data from set to post, whereby rather than sending the original mags to post, his group packaged up the data into very fast, very secure Codex Digital SLEDs.

The most important challenge on a VFX-oriented film, Giardiello says, is the color pipeline, as large studios like Marvel, Disney, Warner Bros. and Universal are focused on making sure that the so-called “digital negatives,” or raw footage, arrive in post and VFX well balanced and don’t require a lot of fixing before those departments can begin their work. “So, having balanced footage has been, and still is, one of the biggest concerns for any major studio when it comes to managing color from set to post production,” he notes.

So, for the last few years, this issue has been handled through the in-camera white balance with a system developed by Giardiello. “We changed the white balance on every single camera, using that to match every single shot before it gets to post production. So when it arrives in front of a VFX compositor and the DI suite, the average color and density of every single shot is consistent,” he adds.

Francesco Giardiello’s rig on Jurassic World.

Giardiello’s workflow is one that he has designed and developed over a five-year period, and it shines particularly when it comes to facilitating VFX interaction with the action footage. “If you have to spend weeks fixing things in VFX on a big job like Jurassic World, Aladdin or Spider-Man, we’re talking about losing thousands of dollars every day,” he points out.

The work entails using a range of tools, some of which are designed for each new job. One tool that has been used on Giardiello’s last few films modifies the metadata for Red cameras to match them with that of the Alexa camera. Meanwhile, on set he uses Filmlight’s Prelight for light grading or to design CDLs. Probably the most important tool for dealing with RAW footage, he maintains, is Codex Digital’s Codex Production Suite. “It allows us to streamline the cloning and backup processes, to perform a visual QC near set and to access the metadata of raw footage and change it (when it is not changed in-camera).

“When those files get to post production in [Filmlight’s] Daylight, which is mostly used these days to process rushes, Daylight doesn’t recognize that change as an actual change, but as something that the DIT does on set in-camera,” Giardiello says.

In addition, he uses the new SSD SLED designed by Codex, which offers encryption — an important feature for studios like Marvel or Sony. Then, on set, he uses BoxIO LUT boxes from Flanders Scientific, as well as Flanders monitors, either DM240s (LCDs) or DM250s (OLEDs), depending on the type of project.

Over the years, Giardiello has often worked with the same DPs, but in the past three years, his major clients instead have been studios: Universal, Marvel and Warner Bros. “But my boss is still the DP,” he adds.

During the past 12 years, Giardiello has witnessed an evolution in the role of DIT and expects this to continue, particularly as media continues to converge and merge — from cinema or television to mobile devices. “So yeah, I would say our job has changed and is going to change, but I think it’s more important now than it was 10 years ago, and obviously it’s going to be even more important in the next 10 years.”


Karen Moltenbrey is a longtime writer and editor in the CG and post industries.


Behind the Camera: Television DPs

By Karen Moltenbrey

Directors of photography on television series have their work cut out for them. Most collaborate early on with the director on a signature “look.” Then they have to make sure that aesthetic is maintained with each episode and through each season, should they continue on the series past the pilot. Like film cinematographers, their job entails a wide range of responsibilities aside from the camera work. Once shooting is done, they are often found collaborating with the colorists to ensure that the chosen look is maintained throughout the post process.

Here we focus on two DPs working on two popular television series — one drama, one sitcom — both facing unique challenges inherent in their current projects as they detail their workflows and equipment choices.

Ben Kutchins: Ozark
Lighting is a vital aspect of the look of the Netflix family crime drama Ozark. Or, perhaps more accurately, the lack of lighting.

Ben Kutchins (left) on set with actor/director Jason Bateman.

“I’m going for a really naturalistic feel,” says DP Ben Kutchins. “My hope is that it never feels like there’s a light or any kind of artificial lighting on the actors or lighting the space. Rather, it’s something that feels more organic, like sunlight or a lamp that’s on in the room, but still offers a level of being stylized and really leans into the darkness… mining the shadows for the terror that goes along with Ozark.”

Ozark, which just kicked off its second season, focuses on financial planner Marty Byrde, who relocates his family from the Chicago suburbs to a summer resort area in the Missouri Ozarks. After a money laundering scheme goes awry, he must pay off a debt to a Mexican drug lord by moving millions of the cartel’s money from this seemingly quiet place, or die. But, trouble is waiting for them in the Ozarks, as Marty is not the only criminal operating there, and he soon finds himself in much deeper than he ever imagined.

“It’s a story about a family up against impossible odds, who constantly fear for their safety. There is always this feeling of imminent threat. We’re trying to invoke a heightened sense of terror and fear in the audience, similar to what the characters might be feeling,” explains Kutchins. “That’s why a look that creates a vibe of fear and danger is so important. We want it to feel like there is danger lurking around every corner — in the shadows, in the trees behind the characters, in the dark corners of the room.”

In summary, the look of the show is dark — literally and figuratively.

“It is pretty extreme by typical television standards,” Kutchins concedes. “We’ve embraced an aesthetic and are having fun pushing its boundaries, and we’re thrilled that it stands out from a pretty crowded market.”

According to Kutchins, there are numerous examples where the actor disappears into the shadows and then reappears moments later in a pool of light, falling in and out of shadow. For instance, a character may turn off a light and plunge the room into complete darkness, and you do not see that character again until they reappear, lit by moonlight coming through a window or silhouetted against it.

“We’re not spending a lot of time trying to fill in the shadows. In fact, we spend most of our time creating more shadows than exist naturally,” he points out.

Jason Bateman, who plays Marty, is also an executive producer and directed the first two and last two episodes of Season 1. Early on, he, along with Kutchins and Pepe Avila del Pino, who shot the pilot, hashed out the desired look for the show, leaning into a very cyan and dark color palette — and leaning in pretty strongly. “Most people think of [this area as] the South, where it’s warm and bright, sweaty and hot. We just wanted to lean into something more nuanced, like a storm was constantly brewing,” Kutchins explains. “Jason really pushed that aesthetic hard across every department.”

Alas, that was made even more difficult since the show was mainly shot outdoors in the Atlanta area, and a good deal of work went into reacting to Mother Nature and transforming the locations to reflect the show’s Ozark mountain setting. “I spent an immense amount of time and effort killing direct sunlight, using a lot of negative fill and huge overheads, and trying to get rid of that direct, harsh sun,” says Kutchins. “Also, there are so many windows inside the Byrde house that it’s essentially like shooting an exterior location; there’s not a lot of controlled light, so you again are reacting and adapting.”

Kutchins shoots the series on a Panasonic VariCam, which he typically underexposes by a stop or two, mining the darker part of the sensor, “the toe of the exposure curve.” And by doing so, he is able to bring out the dirtier, more naturalistic, grimy parts of the image, rather than something that looks clean and polished. “Something that has a little bit of texture to it, some grit and grain, something that’s evocative of a memory, rather than something that looks like an advertisement,” he says.

To further achieve the look, Kutchins uses an in-camera LUT that mimics old Fuji film stock. “Then we take that into post,” he says, giving kudos to his colorist, Company 3’s Tim Stipan, who he says has been invaluable in helping to develop the “vibe” of the show. “As we moved along through Season 1 and into Season 2, he’s been instrumental in enhancing the footage.”

A lot of Kutchins’ work occurs in post, as the raw images captured on set are so different from the finals. In the digital intermediate, significant time is spent darkening parts of the frame, brightening small sections and working to draw the viewer in. “I want people to be leaning on the edge of their seat, kind of wanting to look inside of the screen and poke their head in for a look around,” Kutchins says. “So I do a lot of vignetting and darkening of the edges, and darkening specific things that I think are distracting.”

Nevertheless, there is a delicate balance he must maintain. “I talk about the darkness of Ozark, but I am trying to ride that fine line of how dark it can be but still be something that’s pleasant to watch. You know, where you’re not straining to see the actor’s face, where there’s just enough information there and the frame is just balanced enough so your eyes feel comfortable looking at it,” he explains. “I spend a lot of time creating a focal point in the frame for your eyes to settle on — highlighting certain areas and letting some areas go black, leaving room for mystery in every frame.”

When filming, Kutchins and his crew use Steadicams, cranes, dollies and handheld. He also uses Cooke Optics’ S4 lenses, which he tends to shoot wide open, “to let the flaws and character of the lenses shine through.”

Before selecting the Panasonic VariCam, Kutchins and his group tested other cameras. Because of Netflix’s requirement for 4K, that immediately ruled out the ARRI Alexa, which is Kutchins’ preferred camera. “But the Panasonic ended up shining,” he adds.

In Ozark, the urban family is pitted against nature, and thus, the natural elements around them need to feel dangerous, Kutchins points out. “There’s a line in the first season about how people drown in the lake all the time. The audience should always feel that; when we are at the water’s edge, that someone could just slip in and disappear forever,” he says. “So, the natural elements play a huge role in the inspiration for the lighting and the feel of the show.”

Jason Blount: The Goldbergs
A polar opposite to Ozark in almost every way, The Goldbergs is a single-camera sitcom set in the ’80s about a caring but grumpy dad, an overbearing mother and three teens — the oldest, a popular girl; the middle one, who fancies himself a gifted athlete and strives to be popular; and the youngest, a geek who is obsessed with filmmaking, as he chronicles his life and that of his family on film. The series is created and executive-produced by Adam F. Goldberg and is based on his own life and childhood, which he indeed captured on film while growing up.

The series is filmed mostly on stage, with the action taking place within the family home or at the kids’ schools. For the most part, The Goldbergs is an up-lit, broad comedy. The colors are rich, with a definite nod to the vibrant palette of the ’80s. “Our colorist, Scott Ostrowsky [from Level 3], has been grading the show from day one. He knows the look of the show so well that by the time I sit with him, there are very few changes that have to be made,” says Blount.

The Goldbergs began airing in 2013 and is now entering its sixth season. And the series’ current cinematographer, Jason Blount, has been involved since the start, first serving as the A camera/Steadicam operator before assuming the role of DP for the Season 1 finale — for a total of 92 episodes now and counting.

As this was a Sony show for ABC, the plan was to shoot with a Sony PMW-F55 CineAlta 4K digital camera, but at the time, it did not record at a fast enough frame rate for some of the high-speed work the production wanted. So, they ended up using the ARRI Alexa for Season 1. Blount took over as DP full time from Season 2 onward, and the decision was made to switch to the F55 for Season 2, as the frame rate issue had been resolved.

“The look of the show had already been established, and I wanted to make sure that the transition between cameras was seamless,” says Blount. “Our show is all about faces and seeing the comedy. From the onset, I was very happy with the Sony F55. The way the camera renders skin tone, the lack of noise in the deep shadows and the overall user-friendly nature of the camera impressed me from the beginning.”

Blount points to one particular episode where the F55 really shined. “The main character was filming a black-and-white noir-style home movie. The F55 handled the contrast beautifully. The blacks were rich and the highlights held onto detail very well,” he says. “We had a lot of smoke, hard light directly into the lens, and really pushed the limits of the sensor. I couldn’t have been happier with the results.”

In fact, the camera has proved its mettle winter, spring, summer and fall. “We’ve used it in the dead of winter, at night in the rain and during day exterior [shots] at the height of summer when it’s been over 100 degrees. It’s never skipped a beat.”

Blount also commends Keslow Camera in Los Angeles, which services The Goldbergs’ cameras. In addition, the rental house has accessorized the F55 camera body with extra bracketry and integrated power ports for more ease of use.

Due to the fast pace at which the show is filmed — often covering 10-plus pages of script a day — Blount uses Angenieux Optimo zoom lenses. “The A camera has a full set of lightweight zooms covering 15mm to 120mm, and the B camera always has the [Optimo] 24-290,” he says. “The Optimo lenses and F55 are a great combination, making it easy to move fast and capture beautiful images.”

Blount points out that he also does all the Steadicam work on the show, and with the F55 being so lightweight, compact and versatile, it makes for a “very comfortable camera in Steadicam mode. It’s perfect to use in all shooting modes.”

The Goldbergs’ DP always shoots with two cameras, sometimes three depending on the scene or action. And, there is never an issue of the cameras not matching, according to Blount. “I’m not a big fan of the GoPro image in the narrative world, and I own a Sony a7S. It’s become my go-to camera for mounts or tight space work on the show, and works perfectly with the F55.”

And, there is something to say for consistency, too. “Having used the same camera and lens package for the past five seasons has made it easy to keep the look consistent for The Goldbergs,” says Blount. “At the beginning of this season, I looked at shooting with the new Sony Venice. It’s a fantastic-looking camera, and I love the options, like the variable ND filters, more color temperature options and the dual ISO, but the limit of 60fps at this stage was a deal-breaker for me; we do a fair amount of 72fps and 120fps.”

“If only the F55 had image stabilization to take out the camera shake when the camera operators are laughing so hard at the actors’ performances during some scenes. Then it would be the perfect camera!” he says with a laugh of his own.


Karen Moltenbrey is a longtime writer and editor in the CG and post industries.


Q&A: Camera Operators

By Randi Altman

Camera operators might not always get the glory, but they certainly do get the job done. Working hand in hand with DPs and directors, these artists make sure the camera is in the right place for the right shot, and so much more. As one of the ops we spoke to says, the camera operator is the “protector of the frame.”

We reached out to three different camera operators, all of whom are members of the Society of Camera Operators (SOC), to find out more about their craft and how their job differs from some of the others on set.

Lisa Stacilauskas

Lisa Stacilauskas, SOC
What is the role of the camera operator? What is the camera operator accountable for on set?
The role of the camera operator varies quite a bit depending on the format. I work primarily in scripted television on “single camera” comedies. Don’t let the name “single camera” fool you. It’s meant to differentiate the shooting format from multicam, but these days most single camera shows shoot with two or three cameras. The show I work on, American Housewife, uses three cameras. I am the C camera operator.

In the most basic sense, the camera operator is responsible for the movement of the camera and the inclusion or exclusion of what is in frame. It takes a team of craftspeople to accomplish this. My immediate team includes a 1st and 2nd camera assistant and a dolly grip. Together we get the camera where it needs to be to get the shots and to tell the story as efficiently as possible.

In a larger sense, the camera operator is a storyteller. It is my responsibility to know the story we are trying to tell and assist the director in attaining their vision of that story. As C camera operator, I think about how the scene will come together in editing so I know which pieces of coverage to get.

Another big part of my job is keeping lighting equipment and abandoned water bottles out of my shot. The camera operator is the “protector of the frame”!

How do you typically work with the DP?
The DP is the head of the camera department. Each DP has nuances in the way they work with their operators. Some DPs tell you exactly where to put your camera and what focal length your lenses should be. Others give you an approximate position and an indication of the size (wide, medium, close-up) and let you work it out with the actors or stand-ins.

American Housewife

How does the role of the camera operator and a DP differ?
The DP is in charge of camera and lighting. Officially, I have no responsibility for lighting. However, it’s very important for a camera operator to think like a DP; to know and pay attention to the lighting. Additionally, especially when shooting digitally, once the blocking is determined, camera operators stay on set, working with all the other departments to prepare a shot while the DP is at the monitors evaluating the lighting and/or discussing setups with the director.

What is the relationship between the operator and the director?
The relationship between the operator and director can vary depending on the director and the DP. Some directors funnel all instructions through the DP and only come to you with minor requests once the shots have already been determined.

If the director comes to me directly without going through the DP, it is my responsibility to let the DP know of the requested shot, especially if she/he hasn’t lit for it! Sometimes you are a mediator between the two and hopefully steer them closer to being on the same page. It can be a tough spot to be in if the DP and director have different visions.

Can you talk about recent projects you’ve worked on?
I’m currently working on Season 3 of American Housewife. The C camera position was a day-playing position at the beginning of Season 1, but DP Andrew Rawson loves to work with three cameras, and really knows how to use all three efficiently. Once production saw how much time we saved them, they brought us on full time. Shooting quickly and efficiently is especially important on American Housewife because three of our five principal actors are minors, whose hours on set are restricted by law.

During a recent hiatus, I operated B camera on a commercial with the DP operating A camera. It seemed like the DP appreciated the “extra set of eyes.”

Prior to American Housewife, I worked on several comedies operating the B camera, including Crazy Ex-Girlfriend (Season 1), Teachers (Season 1) and Playing House (Season 2).

Stephen Campanelli, SOC
What is the role of the camera operator? What is the camera operator accountable for on set?
The role of the camera operator on the set is to physically move the camera around to tell the story. The camera operator is accountable for what the director and the director of photography interpret to be the story that needs to be told visually by the camera.

Stephen Campanelli (center) on the set of American Sniper.

As a camera operator, you listen to their input, and sometimes have your own input and opinion to make the shot better or to convey the story point from a different view. It is the best job on set in my opinion, as you get to physically move the camera to tell great stories and work with amazing actors who give you their heart and soul right in front of you!

How do you typically work with the DP?
I have been very fortunate in my career to have worked with some very collaborative DPs. After 24 years of working with Clint Eastwood, I have absorbed so much of his directing style and visual nature that, working closely with the DP, we have created the Eastwood style of filmmaking. When I am doing other films with other DPs, we always talk about conveying the story in the truest, most visual way without letting the camera get in the way of a good story. That is one of the most important things to remember: A camera operator is not to bring attention to the camera, but to bring attention to the story!

How does the role of the camera operator and a DP differ?
A DP usually is in charge of the lighting and the look of the entire motion picture. Some DPs also operate the camera, but that is a lot of work on both sides. A camera operator is very essential, as he or she can rehearse with the actors or stand-ins while the director of photography can concentrate solely on the lighting.

What is the relationship between the operator and the director?
As I mentioned earlier, my relationship with Clint Eastwood has been a very close one, as he works closely with the camera operator rather than the director of photography. We have an incredible bond where very few words are spoken, but we each know how to tell the story once we read the script. On some films, the director and the DP are the ones that work together closely to cohesively set the tone for the movie and to tell the story, and the camera operator interprets that and physically moves the camera with the collaboration of both the director and director of photography.

Can you talk about recent projects you’ve worked on?
I recently wrapped my 22nd movie with Clint Eastwood as his camera operator; it is called The Mule. We filmed in Atlanta, New Mexico and Colorado. It is a very good script, and Clint is back in front of the camera again, acting in it. It also stars Bradley Cooper, Laurence Fishburne and Michael Pena. [Editor’s note: He recently worked on A Star is Born, also with Bradley Cooper.]

Recently, I moved up to directing. In 2015, I directed a movie called Momentum, and this year I directed an award-winning film called Indian Horse that was a big hit in Canada and will soon be released in the United States.

Jamie Hitchcock

Jamie Hitchcock, SOC
What is the role of the camera operator? What is the camera operator accountable for on set?
The role of a camera operator is to compose an assigned shot and physically move the camera if necessary to perform that shot or series of shots as many times as needed to achieve the final take. The camera operator is responsible for maintaining the composition of the shot while also scanning the frame for anything that shouldn’t be there. The camera operator is responsible for communicating to all departments about elements that should or should not be in the frame.

How do you typically work with the DP?
The way a camera operator works with a director of photography varies depending on the type of project they are working on. On a feature film, episodic or commercial, the director of photography is very involved. The DP will set each shot and the operator then repeats it as many times as necessary. On a variety show, live show or soap opera, the DP is usually a lighting designer/director, and the director works with the camera operators to set the shots. On multi-camera sitcoms, the shots are usually set by the director… with the camera operator. When the production requires a complicated scene or location, the DP will become more actively involved with the selection of the shots.

How does the role of the camera operator and a DP differ?
The roles of the DP and operator are quite different, yet the goal is the same. The DP is involved with the pre-production process, lighting, running the set, managing the crew and the post process. The operator is involved on set, working shot by shot. Ultimately, both the DP and operator are responsible for the final image the viewing audience will see.

What is the relationship between the operator and the director?
The relationship between the operator and the director, like that of the operator and DP, varies depending on the type of project. On a feature-type project, the director may be only using one camera. On a sports or variety program the director might be looking at 15 or more cameras. In all cases, the director is counting on the operators to perform their assigned shots each time. Ultimately, when a shot or take is complete, the director is the person who decides to move on or do it again, and they trust the operator to tell them if the shot was good or not.

CBS’s Mom

Can you talk about recent projects you’ve worked on?
I am currently working on The Big Bang Theory and Mom for CBS. Both are produced by Chuck Lorre Productions and Warner Bros. Television. Steven V. Silver, ASC, is the DP for both shows. Both shows use four cameras and are taped in front of a studio audience. We work in what I like to call the “Desilu-type” format because all four cameras are on a J.L. Fisher dolly, with a 1st assistant working the lens and a dolly grip physically moving the camera. This format was perfected by Desi Arnaz for I Love Lucy and still works well for this type of show.

The working relationship with the DP and director on our shows falls somewhere in the middle. Mark Cendroski and Jamie Widdoes direct almost all of our shows, and they work directly with the four operators to set the required shots. They have a lot of trust in us to know what elements are needed in the frame, and sometimes the only direction we receive is the type of shot they want. I work on a center camera, which is commonly referred to as a “master” camera; however, it’s not uncommon to have a master, a close-up and two or three shots all in the same scene. Each scene is shot beginning to end with all the coverage set, and a final edit is done in post. We do have someone cutting a live edit that feeds to the audience so they can follow along.

Our process is very fast, and our DP usually only sees the lighting when we start blocking shots with stand-ins. Steve spends a lot of time at the monitor and constantly switches between all four cameras — he’s looking at composition and lighting, and setting the final look with our video controller. Generally, when the actors are on set we roll the cameras. It’s a pretty high-pressure way to work for a product that will potentially be seen by millions of people around the world, but I love it and can’t imagine doing anything else.

Main Image Caption: Stephen Campanelli


The SOC, which celebrates 40 years in 2019, is an international organization that aims to bring together camera operators and crew. The Society also hosts an annual Lifetime Achievement Awards show, publishes the magazine Camera Operator and has a charitable commitment to The Vision Center at Children’s Hospital Los Angeles.


The ASC: Mentoring and nurturing diversity

Cynthia Pusheck, ASC, co-chairs the ASC Vision Committee along with John Simmons, ASC. Working together, they focus on encouraging and supporting the advancement of underrepresented cinematographers, their crews and other filmmakers. They hope their efforts inspire others in the industry to help drive positive change by hiring talent that better reflects society.

In addition to her role on the ASC Vision Committee, Pusheck is a VP of the ASC board. She became a member in 2013. Her credits include Sacred Lies, Good Girls Revolt, Revenge and Brothers & Sisters. She is currently shooting Limetown for Facebook Watch.

To find out more about their work, we reached out to Pusheck.

Can you talk about what the ASC Vision Committee has done since its inception? What it hopes to accomplish?
The ASC Vision Committee was formed in January 2016 as a way for the ASC to actively support those who face unique hurdles as they build their cinematography careers. We’ve held three full-day diversity events, and some individual panel discussions.

We’ve also awarded a number of scholarships to the ASC Master Class and will continue awarding a handful each year. Our mentorship program is getting off the ground now with many ASC members offering to give time to young DPs from underrepresented groups. There’s a lot more that John Simmons (my co-chair) and our committee members want to accomplish, and with the support of the ASC staff, board members and president, we will continue to push things forward.

(L-R) Diversity Day panel: Rebecca Rhine, Dr. Stacy Smith, Alan Caso, Natasha Foster-Owens, Xiomara Comrie, Tema Staig, Sarah Caplan.

The word “progress” has always been part of the ASC mission statement. So, with the goal of progress in mind, we redesigned an ASC red lapel pin and handed it out at the ASC Awards earlier this year (#ASCVision). We wanted to use it to call attention to the work of our committee and to encourage our own community of cinematographers and camera people to do their part. If directors of photography and their department heads (camera, grip and set lighting) hire with inclusivity in mind, then we can change the face of the industry.

What do you think is contributing to more females becoming interested in camera crew careers? What are you seeing in terms of tangible developments?
Gender inequality in this industry has certainly gotten a lot of attention over the last few years, which is fantastic. But despite all that attention, the actual facts and figures don’t show as much change as you’d think.

The percentage of women or people of color shooting movies and TV shows hasn’t really changed much. There certainly is a lot more “content” getting produced for TV, and that has been great for many of us, and it’s a very exciting time. But, we have a long way to go still.

What’s very hopeful, though, is that more producers and studios are really pushing for inclusivity. That means hiring more women and people of color in positions of leadership, and encouraging their crews to bring more underrepresented crew members onto the production.

Currently we’re also seeing more young female DPs getting some really good shooting opportunities very early in their careers. That didn’t happen so much in the past, and I think that continues to motivate more young women to consider the camera department, or cinematography, as a viable career path.

We also have to remember that it’s not just about getting more women on set, it’s about having our sets look like society at large. The ultimate goal should be that everyone has a fair chance to succeed in this industry.

How can women looking to get into this part of the industry find mentors?
The union (Local 600) and now also the ASC have mentorship programs. The union’s program is great for those coming up the ranks looking for help or advice as they build their careers.

For example, an assistant can find another assistant, or an operator, to help them navigate the next phase of their career and give them advice. The ASC mentorship program is aimed more at young cinematographers or operators from underrepresented groups who may benefit from the support of an experienced DP.

Another way to find a mentor is by contacting someone whom you admire directly. Many women would be surprised to find that if they reach out and request a coffee or phone call, often that person will try and find time for them.

My advice would be to do your homework about the person you’re contacting and be specific in your questions and your goals. Asking broad questions like “How do I get a job?” or “Will you hire me?” won’t get you very far.

What do you think will create the most change? What are the hurdles that still must be overcome?
Bias and discrimination, whether conscious or unconscious, are still a problem on our sets. That may have lessened in the last 25 years, but we all continue to hear stories about crew members (at all levels) who behave badly, make inappropriate comments or just have trouble working for women or people of color. These are all unnecessary stresses for those trying to get hired and build their careers.


Behind the Camera: Feature Film DPs

By Karen Moltenbrey

The responsibilities of a director of photography (DP) span far more than cinematography. Perhaps they are best known for their work behind the camera capturing the action on set, but that is just one part of their multi-faceted job. Well before they step onto the set, they meet with the director, at times working hand-in-hand to determine the overall look of the project. They also make a host of technical selections, such as the type of camera and lenses they will use as well as the film stock if applicable – crucial decisions that will support the director’s vision and make it a reality.

Here we focus on two DPs for a pair of recent films with specialized demands and varying aesthetics, as they discuss their workflows on these projects as well as the technical choices they made concerning equipment and the challenges each project presented.

Hagen Bogdanski: Papillon
The 2018 film Papillon, directed by Michael Noer, is a remake of the 1973 classic. Set in the 1930s, it follows two inmates who must serve, and survive, time in a French Guiana penal colony. The safecracker nicknamed Papillon (Charlie Hunnam) is serving a life sentence and offers protection to wealthy inmate Louis Dega (Rami Malek) in exchange for financing Papillon’s escape.

“We wanted to modernize the script, the whole story. It is a great story but it feels aged. To bring it to a new, younger audience, it had to be modernized in a more radical way, even though it is a classic,” says Hagen Bogdanski, the film’s DP, whose credits include the film The Beaver and the TV series Berlin Station, among others. To that end, he notes, “we were not interested in mimicking the original.”

This was done in a number of ways. First, through the camera work, using a semi-documentary style. The director has a history of shooting documentaries and, therefore, the crew shot with two cameras at all times. “We also shot the rehearsals,” notes Bogdanski, who was brought onto the project and given nearly five weeks of prep before shooting began. Although shooting this way presented a lot of potential risk for Bogdanski, the film “came out great in the end. I think it’s one of the reasons the look feels so modern, so spontaneous.”

In the film, the main characters face off against the harsh environment of their prison island. But to film such a landscape required the cinematographer and crew to also contend with these trying conditions. They shot on location outdoors for the majority of the feature, using just one physical structure: the prison. Also helping to define the film’s aesthetic was the lighting, which, as is typical with Bogdanski’s films, is as natural as possible without large artificial sources.

Most of the movie was shot in Montenegro, near sun-drenched Greece and Albania. Bogdanski does not mince words: “The locations were difficult.”

Weather seemed to impact Bogdanski the most. “It was very remote, and if it’s raining, it’s really raining. If it’s getting dark, it’s dark, and if it’s foggy, there is fog. You have to deal with a lot of circumstances you cannot control, and that’s always a bit of a nightmare for any cinematographer,” he says. “But, what is good about it is that you get the real thing, and you get texture, layers, and sometimes it’s better when it rains than when the sun is shining. Most of the time we were lucky with the weather and circumstances. The reality of location shooting adds quite heavily to the look and to the whole texture of the movie.”

The location shooting also affected this DP’s choice of cameras. “The footprint [I used] was as small as possible because we basically visited abandoned locales. Therefore, I chose as small a kit — lenses, cameras and lights — as possible,” Bogdanski points out. “Because [the camera] was handheld, every pound counted.” In this regard, he used ARRI Alexa Mini cameras and one Alexa SXT, and only shot with Zeiss Ultra Prime lenses – “no big zooms, no big filters, nothing,” he adds.

The prison build was on a remote mountain. On the upside, Bogdanski could shoot 360 degrees there without requiring the addition of CGI later. On the downside, the crew had to get up the mountain. A road was constructed to transport the gear and for the set construction, but even so, the trek was not easy. “It took two hours or longer each day from our hotel. It was quite an adventure,” he says.

As for the lighting, Bogdanski tried to shoot when the light was good, taking advantage of the location’s natural light as much as possible — within his documentary style. When this was not enough, LEDs were used. “Again, small footprint, smaller lens, smaller electrical power, smaller generators….” The night scenes were especially challenging because the nights were very short, no longer than five to six hours. When artificial rain had to be used, shooting was “a little painful” due to the size of the set, requiring the use of more traditional lighting sources, such as large Tungsten light units.

According to Bogdanski, filming Papillon followed what he calls an “eclectic” workflow, akin to the European method of filming whereby rehearsal occurred in the morning and was quite long, as the director rehearsed with the actors. Then, scenes were shot in script order, on the first take without technical rehearsals. “From there, we tried to cover the scene in handheld mode with two cameras in a kind of mash-up. We did pick up the close-ups and all that, but always in a very spontaneous and quick way,” says Bogdanski.

Looking back, Bogdanski describes Papillon as a “modern-period film”: a period look, without looking “period.” “It sounds a bit Catch-22, which it is, in my opinion, but that’s what we aimed for, a film that plays basically in the ’40s and ’50s, and later in the ’60s,” he says.

During the time since the original film was made in 1973, the industry has witnessed quite a technical revolution in terms of film equipment, providing the director and DP on the remake with even more tools and techniques at their disposal to leave their own mark on this classic for a new generation.

Nancy Schreiber: Mapplethorpe
Award-winning cinematographer Nancy Schreiber, ASC, has a resume spanning episodic television (The Comeback), documentaries (Eva Hesse) and features (The Nines). Her latest film, Mapplethorpe, paints an unflinching portrait of controversial-yet-revered photographer Robert Mapplethorpe, who died at the age of 42 from AIDS-related complications in 1989. Mapplethorpe, whose daring work influenced popular culture, rose to fame in the 1970s with his black-and-white photography.

In the early stages of planning the film, Schreiber worked with director Ondi Timoner and production designer Jonah Markowitz while they were still in California prior to the shoot in New York, where Mapplethorpe (played by The Crown’s Matt Smith) lived and worked at the height of his popularity.

“We looked at a lot of reference materials — books and photographs — as Ondi and I exchanged look books. Then we honed in on the palette, the color of my lights, the set dressing and wardrobe, and we were off to the races,” says Schreiber. Shooting began mid-July 2017.

Mapplethorpe is a period piece that spans three decades, all of which have a slightly different feel. “We kept the ’60s and into the ’70s quite warm in tone,” as this is the period when he first meets Patti Smith, his girlfriend at the time, and picks up a camera, explains Schreiber. “It becomes desaturated but still warm tonally when he and Patti visit his parents back home in Queens while the two are living at the Chelsea Hotel. The look progresses until it’s very much on the cool blue/gray side, almost black and white, in the later ’70s and ’80s.” During that time period, Mapplethorpe is successful, with an enormous studio, photographically exploring male body parts like no other person has ever done, while continuing to shoot portraits of the rich and famous.

Schreiber opted to use film, Super 16, rather than digital to capture the life of this famed photographer. “He shot in film, and we felt that format was true to his photography,” she notes. Despite Mapplethorpe’s penchant for mostly shooting in black and white, neither Timoner nor Schreiber considered using that format for the feature, mostly because the ’60s through ’80s in New York had very distinctive color palettes. They felt, however, that film in and of itself was very “textural and beautiful,” whereas you have to work a little harder with digital to make it look like film — even though new ways of adding grain to digital have become quite sophisticated. “Yet, the grain of Super 16 is so distinctive,” she says.

In addition, Kodak had just opened a lab in New York in the spring of 2017, facilitating their ability to shoot film by having it processed quickly nearby.

Schreiber used an ARRI Arriflex 416 camera for the project; when possible, she used two. She also had a set of Zeiss 35mm Super Speed lenses, along with two zoom lenses she used only occasionally for outdoor shots. “The Super Speeds were terrific. They’re vintage and were organic to the look of this period.”

She also used a light meter faithfully. Although Schreiber occasionally uses light meters when shooting digital, it was not optional for shooting film. “I had to use it for every shot, although after a couple of days, I was pretty good at guessing [by eyeing it],” Schreiber points out, “as I used to do when we only shot film.”

Soon after ARRI had introduced the Arriflex 416 – which is small and lightweight – the industry started moving to digital, prompting ARRI to roll out the now-popular Alexa. “But the [Arriflex 416] camera really caught on for those still shooting Super 16, as they do for the series The Walking Dead,” Schreiber says, adding that she was able to get her pair from the TCS Technological Cinevideo Services rental house in New York.

“I had owned an Aaton, a French camera that was very popular in the 1980s and ’90s. But today, the 416 is very much in demand, resembling the shape of my Aaton, both of which are ergonomic, fitting nicely on your shoulder. There were numerous scenes in the car, and I could just jump in the car with this very small camera, much smaller than the digital cameras we use on movies; it was so flexible and easy to work with,” recalls Schreiber.

As for the lenses, “again, I chose the Super Speed Primes not only because they were vintage, but because I needed the speed of the 1.3 lens since film requires more light.” She tested other lenses at TCS, but those were her favorites.

While Schreiber has used film on some commercials and music videos, it had been some time since she had used it for an entire movie. “I had forgotten how freeing it is, how you can really move. There are no cables to worry about. Although, we did transmit to a tiny video village,” she says. “We didn’t always have two cameras [due to cost], so I needed to move fast and get all the coverage the editor needed. We had 19 days, and we were limited in how long we could shoot each day; our budget was small and we couldn’t afford overtime.” At times, though, she was able to hire a Steadicam or B operator who really helped move them along, keeping the camera fluid and getting extra coverage. Timoner also shot a bit of Super 8 along the way.

There was just one disadvantage to using film: The stocks are slow. As Schreiber explains, she used a 500 ASA stock. Therefore, she needed very fast lenses and a fair amount of light in order to compensate. “That worked OK for me on Mapplethorpe because there was a different sense of lighting in the 1970s, and films seemed more ‘lit.’ For example, I might use backlight or hair light, which I never would do for [a film set in] present day,” she says. “I rated that stock at 400 to get rich blacks; that also slightly minimized the grain. The day-interior stock was 250, which I rated at 200. We are so used to shooting at 800 or 1280 ISO these days. It was an adjustment.”
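
For reference, rating a stock below its box speed amounts to a fractional stop of extra exposure, and the arithmetic on the ratings Schreiber mentions is a one-liner (a quick illustrative sketch, using stops = log2 of box speed over exposure index):

# Quick arithmetic on the exposure-index choices above: rating a stock below
# its box speed gives log2(box_speed / rating) stops of extra exposure.
from math import log2

for box_speed, rating in [(500, 400), (250, 200)]:
    stops = log2(box_speed / rating)
    print(f"{box_speed} ASA rated at {rating}: about {stops:.2f} stop of extra exposure")

That works out to roughly a third of a stop in each case; on negative stock, the extra exposure builds a slightly denser negative, which is where the richer blacks and the finer apparent grain she describes come from.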

Schreiber on set with “Mapplethorpe” director Ondi Timoner.

Shooting with film was also more efficient for Schreiber. “We had monitors for the video village, but we were standard def, old-school, which is not an exact representation. So, I could move quickly to get enough coverage, and I never looked at a monitor except when we had Steadicam. What you see is not what you get with an SD tap. I was trusted to create the imagery as I saw fit. I think many people today are used to seeing the digital image on the monitor as what the final film will look like and may be nervous about waiting for the processing and transfer, not trusting the mystery or mystique of how celluloid will look.”

To top things off, Schreiber was backed by an all-female A camera team. “I know how hard it is for women to get work,” she adds. “There are so many competent women working behind the camera these days, and I was happy to hire them. I remember how challenging it was when I was a gaffer or started to shoot.”

As for costs, digital camera equipment is more expensive than Super 16 film equipment, yet there were processing and transfer costs associated with getting the film into the edit suite. So, when all was said and done, film was indeed more expensive to use, but not by much.

“I am really proud that we were able to do the movie in 19 days with a very limited budget, in New York, covering many periods,” concludes Schreiber. “We had a great time, and I am happy I was able to hire so many women in my departments. Women are still really under-represented, and we must demonstrate that there is not a scarcity of talent, just a lack of exposure and opportunity.”

Mapplethorpe is expected in theaters this October.


Karen Moltenbrey is a longtime writer and editor in the CG and post industries.