Category Archives: Cameras

Red Ranger all-in-one camera system now available

Red Digital Cinema has made its new Red Ranger all-in-one camera system available to select Red authorized rental houses. Ranger includes Red’s cinematic full-frame 8K sensor Monstro in an all-in-one camera system, featuring three SDI outputs (two mirrored and one independent) allowing two different looks to be output simultaneously; wide-input voltage (11.5V to 32V); 24V and 12V power outs (two of each); one 12V P-Tap port; integrated 5-pin XLR stereo audio input (Line/Mic/+48V Selectable); as well as genlock, timecode, USB and control.

Ranger is capable of handling heavy-duty power sources and boasts a larger fan for quieter and more efficient temperature management. The system is currently shipping in a gold mount configuration, with a v-lock option available next month.

Ranger captures 8K RedCode RAW up to 60fps full-format, as well as Apple ProRes or Avid DNxHR formats at 4K up to 30fps and 2K up to 120fps. It can simultaneously record RedCode RAW plus Apple ProRes or Avid DNxHD or DNxHR at up to 300MB/s write speeds.
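
To get a rough sense of what those write speeds mean for media planning, the arithmetic below sketches recording minutes per terabyte. This is an illustrative calculation, not a Red specification; sustained rates in practice depend on the chosen format, resolution and frame rate.

```python
# Rough media-planning math for the write speeds quoted above.
# Illustrative only: real-world throughput depends on format,
# resolution and frame rate.

MAX_WRITE_MB_S = 300  # quoted ceiling for simultaneous RAW + ProRes/DNx

def minutes_per_tb(write_mb_s: float, tb: float = 1.0) -> float:
    """Minutes of recording that fit in `tb` terabytes at a given write rate."""
    seconds = (tb * 1_000_000) / write_mb_s  # decimal TB: 1 TB = 1,000,000 MB
    return seconds / 60

print(f"{minutes_per_tb(MAX_WRITE_MB_S):.1f} minutes per TB at 300MB/s")
```

At the full 300MB/s rate, roughly 55 minutes of material fits on each terabyte of media, which is why high-capacity magazines matter at these data rates.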

To enable an end-to-end color management and post workflow, Red’s enhanced image processing pipeline (IPP2) is also included in the system.

Ranger ships complete, including:
• Production top handle
• PL mount with supporting shims
• Two 15mm LWS rod brackets
• Red Pro Touch 7.0-inch LCD with 9-inch arm and LCD/EVF cable
• LCD/EVF adaptor A and LCD/EVF adaptor D
• 24V AC power adaptor with 3-pin 24V XLR power cable
• Compatible Hex and Torx tools

Shooting, posting New Republic’s indie film, Sister Aimee

After a successful premiere at the Sundance Film Festival, New Republic Studios’ Sister Aimee screened at this month’s SXSW. The movie tells the story of an infamous American evangelist of the 1920s, Sister Aimee Semple McPherson, who gets caught up in her lover’s dreams of Mexico and finds herself on a road trip toward the border.

Sister Aimee shot at the newly renovated New Republic Studios near Austin, Texas, over two and a half weeks. “Their crew used our 2,400-square-foot Little Bear soundstage, our 3,000-square-foot Lone Wolf soundstage, our bullpen office space and numerous exterior locations in our backlot,” reports New Republic Studios president Mindy Raymond, adding that the Sister Aimee production also had access to two screening rooms with 5.1 surround sound, HDMI hookups to 4K monitors and theater-style leather chairs to watch dailies. The film also hit the road, shooting in the New Mexico desert.

L-R: Directors Samantha Buck, Marie Schlingmann at SXSW. Credit: Harrison Funk

Co-written and co-directed by Samantha Buck and Marie Schlingmann, the movie takes some creative license with the story of Aimee. “We don’t look for factual truth in Aimee’s journey,” they explain. “Instead we look for a more timeless truth that says something about female ambition, the female quest for immortality and, most of all, the struggle for women to control their own narratives. It becomes a story about storytelling itself.”

The film, shot by cinematographer Carlos Valdes-Lora in 3.2K ProRes 4444 XQ on an ARRI Alexa Mini, was posted at Dallas- and Austin-based Charlieuniformtango.

We reached out to the DP and the post team to find out more.

Carlos, why did you choose the package of the Alexa and Cooke Mini S4 Primes?
Carlos Valdes-Lora: In early conversations with the directors, we all agreed that we didn’t want Sister Aimee to feel like a traditional period movie. We didn’t want to use softening filters or vintage lenses. We aimed instead for clear images, deep focus and a rich color palette that remains grounded in the real world. We felt that this would lend the story a greater sense of immediacy and draw the viewer closer to the characters. Following that same thinking, we worked very extensively with the 25mm and 32mm, especially in closeups and medium closeups, emphasizing accessibility.

The Cooke Mini S4s are a beautiful and affordable set (relative to our other options). We like the way they give deep dimensionality and warmth to faces, and how they create a slightly lower-contrast image compared to the other modern lenses we looked at. They quickly became the right choice for us, striking the right balance between quality, size and value.

The shoot at New Republic. Credit: Cruz Garcia

The Cookes paired with the Alexa Mini gave us a lightweight camera system with a very contained footprint, and we needed to stay fast and lean due to our compressed shooting schedule and often tight shooting quarters. The Chapman Cobra dolly was a big help in that regard as well.

What was the workflow to post like?
Charlieuniformtango producers Bettina Barrow, Katherine Harper, David Hartstein: Post took place primarily between Charlieuniformtango’s Dallas and Austin offices. Post strategizing started months before the shoot, and active post truly began when production began in July 2018.

Tango’s Evan Linton handled dailies brought in from the shoot, working alongside editor Katie Ennis out of Tango’s Austin studio, to begin assembling a rough cut as shooting continued. Ennis continued to cut at the studio through August with directors Schlingmann and Buck.

Editorial then moved back to the directors’ home state of New York to finish the cut for Sundance. (Editor Ennis, who four-walled out of Tango Austin for the first part of post, went to New York with the directors, working out of a rented space.)

VFX and audio work started early at Tango, with continuously updated timelines coming from editorial. The team worked to have certain locked shots finished for the Sundance submission, while saving much of the cleanup and other CG-heavy shots for the final picture lock.

Tango audio engineer Nick Patronella also tackled dialogue edit, sound design and mix for the submission out of the Dallas studio.

Can you talk about the VFX?
Barrow, Harper, Hartstein: The cut was locked in late November, and the heavy lifting really began. With delivery looming, Tango’s Flame artists Allen Robbins, Joey Waldrip, David Hannah, David Laird, Artie Peña and Zack Smith divided the effects shots, which ranged from environmental and period-specific cleanup to beauty work such as de-aging, crowd simulation and CG sign creation.

(L-R) Tango’s Artie Peña, Connor Adams, Allen Robbins in one of the studio’s Flame suites.

3D artist Connor Adams used Houdini, Mixamo and Maya to create CG elements and crowds, with final comps done in Nuke and sent to Flame for final color. Over 120 VFX shots were handled in total, and Flame was the go-to for effects. Color and much of the effects work happened simultaneously. It was a nice workflow, as the project didn’t have major VFX needs that would have impacted color.

What about the color grade?
Barrow, Harper, Hartstein: Directors Buck and Schlingmann and DP Valdes-Lora worked with Tango colorist Allen Robbins to craft the final look of the film, with the color grade also done in Flame. The trio had prepped to shoot for a Kodachrome-style look, especially for the exteriors, but really throughout. They found important reference in selections of Robert Capa photographs.

Buck, Schlingmann and Valdes-Lora responded mostly to Kodachrome’s treatment of blues, browns, tans, greens and reds (while staying true to skin tone), but also to its gamma values, not being afraid of deep shadows and contrast wherever appropriate. Valdes-Lora wanted to avoid lighting and exposing to a custom LUT on set that reflected this Kodachrome look, in case they wanted to change course during the process. With the help of Tango, however, they discovered that dialing back the Capa look grounded the film a little more and made the characters “feel” more accessible. The roots of the inspiration remained in the image, but a little more naturalism, a little more softness, served the story better.

Because of that, they monitored on set with Alexa 709, which Valdes-Lora felt still left enough latitude when exposing. Production designer Jonathan Rudak (another regular collaborator with the directors) was on the same page during prep in terms of reflecting this Capa color style, and the practical team did what they could to make sure the set elements complemented the approach.

What about the audio post?
Barrow, Harper, Hartstein: With the effects and color almost complete, the team headed to Skywalker Ranch for a week of final dialogue edit, mix, sound design and Foley, led by Skywalker’s Danielle Dupre, Kim Foscato and E. Larry Oatfield. The team was also able to simultaneously approve color sections in Skywalker’s Stag Theater, allowing for an ultra-efficient schedule. With the final mix in hand, the film was mastered just after Christmas so that DCP production could begin.

Another shot of the Sister Aimee set at New Republic. Credit: Cruz Garcia.

Since a portion of the film was musical, how complex was the audio mix?
Skywalker sound mixer Dupre: The musical number was definitely one of the most challenging but rewarding scenes to design and mix. It was such a strong creative idea that played so deeply into the main character. The challenge was in striking a balance between tying it into the realism of the film while also leaning into the grandiosity of the musical to really sell the idea.

It was really fun to play with a combination of production dialogue and studio recordings to see how we could make it work. It was also really rewarding to create a soundscape that starts off minimally and simply and transitions to Broadway scale almost undetectably — one of the many exciting parts to working with creative and talented filmmakers.

What was the biggest challenge in post?
Barrow, Harper, Hartstein: Finishing a film in five to six weeks during the holidays was no easy feat. Luckily, we were able to have our directors hands-on for all final color, VFX and mix. Collaborating in the same room is always the best when you have no time to spare. We had a schedule where each day was accounted for — and we stuck to it almost down to the hour.


DP Chat: Madam Secretary’s Learan Kahanov

By Randi Altman

Cinematographer Learan Kahanov’s love of photography started at an early age, when he would stage sequences and scenes with his Polaroid camera, lining up the photos to create a story.

He took that love of photography and turned it into a thriving career, working in television, features and commercials. He currently works on the CBS drama Madam Secretary, where he was initially hired as the A-camera operator and additional DP. He shot 12 episodes and tandem units before taking over the show fully in Season 3. The New York-shot, Washington, DC-set show stars Téa Leoni as the US Secretary of State, following her struggle to balance her work and personal life.

We recently reached out to Kahanov to find out more about his path, as well as his workflow, on Madam Secretary.

Learan Kahanov on set with director Rob Greenlea.

Can you talk about your path to cinematography?
My mother is a sculptor and printmaker, and when I was in middle school, she went back to get a degree in fine arts with a minor in photography. This essentially meant I was in tow, on many a weeknight, to the darkroom so she could do her printing and, in turn, I learned as well.

I shot mostly black and white all through middle school and high school. I would often use my mother’s art studio to shoot the models who posed for the drawing class she taught. Around the same time, I developed a growing fascination with animal behavior and strove to become a wildlife photographer, until I realized I didn’t have the patience to sit in a tree for days to get the perfect shot.

I soon turned my attention to videography while working at a children’s museum, teaching kids how to use the cameras and how to make short movies. I decided to pursue cinematography officially in high school. I eventually found myself at NYU film school, based on my photography portfolio. As soon as I got to New York City, I started working on indie films as an electrician and gaffer, shooting every student film and indie project I could.

What inspires you artistically? And how do you simultaneously stay on top of advancing technology that serves your vision?
I could list artists or filmmakers whose work I gravitate to, but the main thing I learned from my mother about art is that it’s about a feeling. Whether it’s being affected by a beautifully photographed image of a woman in a commercial or getting sucked into the visuals in a wildlife documentary, if you can evoke a feeling or create an emotion, you have made art.

Madam Secretary

I am always looking at things around me, and I’m always aware of how light falls on the world around me. Or how the shape of everyday objects and places change depending on the time, the weather or just my mood at the moment.

My vision of a project is always born out of the story, so the key for me is to always use technology (new or old) to support that story. Sometimes the latest in LED technology is the right tool for the job, sometimes it’s a bare light bulb attached to the underside of a white, five-gallon paint bucket (a trick Gaffer Jack Coffin and I use quite often). I think the balance between vision and technology is a two-way street — the key is to recognize when the technology serves your vision or the other way around.

What new technology has changed the way you work?
In the area of lighting, I have found that no matter what new tools come onto the scene, I still hold true to my go-to lighting techniques that I have preferred for years.

A perfect example would be my love for book lights — a book light is a bounced light that then goes through another layer of diffusion, which is perfect for lighting faces. Whether I am using an old Mole Richardson 5K tungsten unit or the newer ARRI S60 SkyPanels, the concept and end result are basically the same.

That being said, for location work the ARRI LED SkyPanels have become one of the go-to units on my current show, Madam Secretary. The lights’ high output, low power consumption, ease of matching existing location color sources and quick effects make them an easy choice for dealing with the faster-paced TV production schedule.

On-set setup

One other piece of gear I have found myself calling for on a daily basis, since my key grip Ted Lehane introduced me to it, is a diffusion material called Magic Cloth, produced by The Rag Place. This material can work as a bounce as well as a diffusion, and you can light directly through it. It produces a very soft light, as it’s fairly thick, but it does not change the color temperature of the source light. This new material, in conjunction with new LED technology, has created some interesting opportunities for my team.

Many DPs talk about the latest digital sensor, camera support (drone/gimbals, etc.) or LED lighting, but sometimes it’s something very simple, like finding a new diffusion material that can really change the look and the way I work. In fact, I think gripology in general often gets overlooked in the current affairs of filmmaking where everything seems to need to be “state of the art.”

What are some of your best practices or rules that you try to follow on each job?
I have one hard and fast rule in any project I shoot: support the story! I like to think of myself as a filmmaker first, using cinematography as a way to contribute to the filmmaking process. That being said, we can create lots of “rules” and have all the “go-to practices” to create beautiful images, but if what you are doing doesn’t advance the story, or at the very least create the right mood for the scene, then you are just taking a picture.

There are definite things I do because I simply prefer how they look, but if something doesn’t make sense for the scene or the movie (based on the director’s vision and mine), I will adjust what I do to make sure I am always supporting the story. There are definitely times when a balance is needed. We don’t create in a bubble, as there are all the other factors to consider, like budget, time, shooting conditions, etc. It’s this need/ability to be both technician and artisan that excites me the most about my job.

Can you explain your ideal collaboration with the director when setting the look of a project?
When working in episodic TV, there is a different director for every episode, essentially every eight days. Even when I have a repeat director, I have to adapt quickly to each director’s style. This goes beyond just being a chameleon from a creative standpoint: I need to quickly establish trust and a shorthand to help the director put their stamp on their episode, all while staying within the already established look of the show.

Madam Secretary

I have always considered myself not an “idea man” but rather a “make-the-idea-better” man. I say this because being able to collaborate with a director and not just see their vision, but also enhance it and take it a step further (and see their excitement in the process), is completely fulfilling.

Tell us about Madam Secretary. How would you describe the overarching look of the show? How early did you get involved in the production?
I have been a part of Madam Secretary since the beginning, minus the pilot. I was hired as the A camera operator and as an additional DP. Jonathan Brown, ASC, shot the pilot and was the DP for the first two seasons. He was also one of our directors for the first three seasons. In addition to shooting tandem/2nd unit days and filling on scout days, I was the DP whenever Jonathan directed. So while I didn’t create the initial look of the show, I worked closely with Jonathan as the seasons went on until I officially took over in the third season.

Since I took over (and during my episodes), I felt an obligation to hold true to the original look and intent of the show, while also adding my personal touch and allowing the show’s look to evolve with the series. The show does give us opportunities every week to create something new. While the recurring sets/locations do have a relatively set look, every episode takes us to new parts of the world and to new events.

It gives the director, production team and me an opportunity to create different looks and aesthetics to differentiate it from Madam Secretary’s life in DC. While it’s a quick schedule to prep, research and create new looks for convincing foreign locations every episode (we shoot 99% of the show in New York), it is a challenge that brings creativity and excitement to the job that I really enjoy.

Learan Kahanov on set with Hillary Clinton for the episode E Pluribus Unum.

Can you talk about what you shoot on and what lenses you use, etc.?
The show is currently shooting on Alexa SXTs with Leica Summicron Prime lenses and Fujinon Cabrio zooms. One of the main things I did when I officially took over the show was to switch to the Leica primes. We did some testing with Téa Leoni and Tim Daly on our sets to see how the lenses treated skin tones.

Additionally, we wanted to see how they reacted to the heavy backlight and to the blown-out windows we have on many of our sets. We all agreed that the lenses were sharp, but also realized that they created a softer feel on our actors’ faces, had a nice focus fall-off and handled the highlights really well. They are flexible enough to help me create different looks while still retaining a consistency for the show. The lenses have an interesting flare characteristic that sometimes makes controlling them difficult, but it all adds to the current look of the show and has yet to be limiting.

You used a Blackmagic Pocket Cinema camera for some specialized shots. Can you describe those?
The show has many scenes that entail some specialized shots that need a small but high-res camera that has an inherently different feel from the Alexa. These shots include webcam and security camera footage. There are also many times when we need to create body/helmet cam footage to emulate images recorded from military/police missions that then were played back in the president’s situation room. That lightweight, high-quality camera allows for a lot of flexibility. We also employ other small cameras like GoPro and DJI Osmo, as well as the Sony A7RII with PL mount.

Madam Secretary

Any challenging scenes that you are particularly proud of?
I don’t think there is an episode that goes by without some type of challenge, but one in particular that I was really happy with took place on a refugee boat in the middle of the Mediterranean Sea.

The scene was set at night where refugees were making a harrowing trip from the north coast of Libya to France. Since we couldn’t shoot on the ocean at night, we brought the boat and a storm into the studio.

Our production designer and art department cut a real boat in half and brought it onto the stage. Drew Jiritano and his special effects team then placed the boat on a gimbal and waterproofed the stage floor so we could place rain towers and air cannons to simulate a storm in the middle of the sea.

Using a technocrane, handheld cameras and interactive lighting, we created a great scene that immersed the audience in a realistic depiction of the dramatic journey that happens more often than most Americans realize.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 


Blackmagic offers next-gen Ursa Mini Pro camera, other product news

Blackmagic has introduced the Ursa Mini Pro 4.6K G2, a second-generation Ursa Mini Pro camera featuring fully redesigned electronics and a new Super 35mm 4.6K image sensor with 15 stops of dynamic range that combine to support high-frame-rate shooting at up to 300 frames per second.

In addition, the Ursa Mini Pro 4.6K G2 supports Blackmagic RAW and features a new USB-C expansion port for direct recording to external disks. Ursa Mini Pro 4.6K G2 is available now for $5,995 from Blackmagic resellers worldwide.

The new user interface

Key Features:
• Digital film camera with 15 stops of dynamic range
• Super 35mm 4.6K sensor with Blackmagic Design Generation 4 Color Science
• Supports project frame rates up to 60fps and off-speed slow motion recording up to 120fps in 4.6K, 150fps in 4K DCI and 300fps in HD Blackmagic RAW
• Interchangeable lens mount with EF mount included as standard. Optional PL, B4 and F lens mounts available separately
• High-quality 2-, 4- and 6-stop neutral density (ND) filters with IR compensation designed to specifically match the colorimetry and color science of Blackmagic URSA Mini Pro 4.6K G2
• Fully redundant controls including external controls that allow direct access to the most important camera settings such as external power switch, ND filter wheel, ISO, shutter, white balance, record button, audio gain controls, lens and transport control, high frame rate button and more
• Built-in dual C-Fast 2.0 recorders and dual SD/UHS-II card recorders allow unlimited duration recording in high quality
• High-speed USB-C expansion port for recording directly to an external SSD or flash disk
• Lightweight and durable magnesium alloy body
• LCD status display for quickly checking timecode, shutter and lens settings, battery, recording status and audio levels
• Support for Blackmagic RAW files in constant bitrate 3:1, 5:1, 8:1 and 12:1 or constant quality Q0 and Q5 as well as ProRes 4444 XQ, ProRes 4444, ProRes 422 HQ, ProRes 422, ProRes 422 LT, ProRes 422 Proxy recording at 4.6K, 4K, Ultra HD and HD resolutions
• Features all standard connections, including dual XLR mic/line audio inputs with phantom power, 12G-SDI output for monitoring with camera status graphic overlay and separate XLR 4-pin power output for viewfinder power, headphone jack, LANC remote control and standard 4-pin 12V DC power connection
• Built-in high-quality stereo microphones for recording sound
• Offers a four-inch foldout touchscreen for on-set monitoring and menu settings
• Includes full copy of DaVinci Resolve color grading and editing software
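
To make the constant-bitrate ratios in the feature list concrete, the sketch below estimates per-frame and per-second data sizes. The 4608x2592 raster and 12 bits per photosite are assumptions for illustration only, not figures from Blackmagic.

```python
# Back-of-the-envelope sizes for the constant-bitrate Blackmagic RAW ratios.
# Assumptions (not from the article): a 4608x2592 "4.6K" raster and 12 bits
# of raw sensor data per photosite before compression.

WIDTH, HEIGHT, BITS_PER_PHOTOSITE = 4608, 2592, 12

def mb_per_frame(ratio: float) -> float:
    """Approximate compressed frame size in megabytes at a given ratio."""
    raw_bytes = WIDTH * HEIGHT * BITS_PER_PHOTOSITE / 8
    return raw_bytes / ratio / 1_000_000

def mb_per_second(ratio: float, fps: float) -> float:
    """Approximate sustained data rate for a given ratio and frame rate."""
    return mb_per_frame(ratio) * fps
```

Under these assumed numbers, 3:1 works out to roughly 6MB per frame and 12:1 to about 1.5MB, which is why the higher ratios matter for the camera’s high-frame-rate modes.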

Additional Blackmagic news:
– Blackmagic adds Blackmagic RAW to Blackmagic Pocket Cinema Camera 4K
– Blackmagic intros DeckLink Quad HDMI recorder
– Blackmagic updates DeckLink 8K Pro
– Blackmagic announces long-form recording on Blackmagic Duplicator 4K


Colorist Christopher M. Ray talks workflow for Alexa 65-shot Alpha

By Randi Altman

Christopher M. Ray is a veteran colorist with a varied resume that includes many television and feature projects, including Tomorrowland, Warcraft, The Great Wall, The Crossing, Orange Is the New Black, Quantico, Code Black and Alpha. These projects have taken Ray all over the world, including remote places throughout North America, Europe, Asia and Africa.

We recently spoke with Ray, who is on staff at Burbank’s Picture Shop, to learn more about his workflow on the feature film Alpha, which focuses on a young man trying to survive alone in the wilderness after he’s left for dead during his first hunt with his Cro-Magnon tribe.

Ray was dailies colorist on the project, working with supervising DI colorist Maxine Gervais. Gervais of Technicolor won an HPA Award for her work on Alpha in the Outstanding Color Grading — Feature Film category.

Let’s find out more….

Chris Ray and Maxine Gervais at the HPA Awards.

How early did you get involved in Alpha?
I was approached about working on Alpha right before the start of principal photography. From the beginning I knew that it was going to be a groundbreaking workflow. I was told we would be working with the ARRI Alexa 65 camera, mainly out of an on-set color grading trailer, and that we would be using FilmLight’s Daylight software.

Once I was on board, our main focus was to design a comprehensive workflow that could accommodate on-set grading and the Daylight software while adapting to the ever-changing challenges that the industry brings. Being involved from the start was actually a huge perk for me. It gave us the time we needed to design and really fine-tune the extensive workflow.

Can you talk about working with the final colorist Maxine Gervais and how everyone communicated?
It was a pleasure working with Maxine. She’s really dialed in to the demands of our industry. She was able to fly to Vancouver for a few days while we were shooting the hair/makeup tests, which is how we were able to form in-person communication. We were able to sit down and discuss creative approaches to the feature right away, which I appreciated as I’m the type of person that likes to dive right in.

At the film’s conception, we set in motion a plan to incorporate a Baselight Linked Grade (BLG) color workflow from FilmLight. This would allow my color grades in Daylight to transition smoothly into Maxine’s Baselight software. We knew from the get-go that there would be several complicated “day for night” scenes that Maxine and I would want to bring to fruition right away. Using the BLG workflow, I was able to send her single ARRIRAW frames that gave the “day for night” look we were searching for. She was then able to send them back to me via a BLG file. Even in remote locations, it was easy for me to access the BLG grade files via the Internet.

[Maxine Gervais weighs in on working with Ray: “Christopher was great to work with. As the workflow on the feature was created from scratch, he implemented great ideas. He was very keen on the whole project and was able to adapt to the ever-changing challenges of the show. It is always important to have on-set color dialed in correctly, as it can be problematic if it is not accurately established in production.”]

How did you work with the DP? What direction were you given?
Being on set, it was very easy for DP Martin Gschlacht to come over to the trailer and view the current grade I was working on. Like Maxine, Martin already had a very clear vision for the project, which made it easy to work with him. Oftentimes, he would call me over on set and explain his intent for the scene. We would brainstorm ways I could assist him in making his vision come to life. Audiences rarely see raw camera files, or realize how much color can influence the story being told.

It also helps that Martin is a master of aesthetic. The content being captured was extremely striking; he has this natural intuition about what look is needed for each environment that he shoots. We shot in lush rain forests in British Columbia and arid badlands in Alberta, which each inspired very different aesthetics.

Whenever I had a bit of down time, I would walk over to set and just watch them shoot, like a fly on the wall quietly observing and seeing how the story was unfolding. As a colorist, it’s so special to be able to observe the locations on set. Seeing the natural desaturated hues of dead grass in the badlands or the vivid lush greens in the rain forest with your own eyes is an amazing opportunity many of us don’t get.

You were on set throughout? Is that common for you?
We were on set throughout the entire project as a lot of our filming locations were in remote areas of British Columbia and Alberta, Canada. One of our most demanding shooting locations included the Dinosaur Provincial Park in Brooks, Alberta. The park is a UNESCO World Heritage site that no one had been allowed to film at prior to this project. I needed to have easy access to the site in order to easily communicate with the film’s executive team and production crew. They were able to screen footage in their trailer and we had this seamless back-and-forth workflow. This also allowed them to view high-quality files in a comfortable and controlled environment. Also, the ability to flag any potential issues and address them immediately on set was incredibly valuable with a film of such size and complexity.

Alpha was actually the first time I worked in an on-set grading trailer. In the past I usually worked out of the production office. I have heard of other films working with an on-set trailer, but I don’t think I would say that it is overly common. Sometimes, I wish I could be stationed on set more often.

The film was shot mostly with the Alexa 65, but included footage from other formats. Can you talk about that workflow?
The film was mostly shot on the Alexa 65, but there were also several other formats it was shot on. For most of the shoot there was a second unit that was shooting with Alexa XT and Red Weapon cameras, with a splinter unit shooting B-roll footage on Canon 1D, 5D and Sony A7S. In addition to these, there were units in Iceland and South Africa shooting VFX plates on a Red Dragon.

By the end of the shoot, there were several different camera formats and over 10 different resolutions. We used the 6.5K Alexa 65 resolution as the master resolution and mapped all the others into it.
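
Mapping mixed sources into a single master raster typically means computing a uniform scale per format. The sketch below shows the idea; the 6560x3100 pixel dimensions are an assumption for illustration, since the article only calls the master "6.5K".

```python
# Sketch of conforming mixed-resolution sources into one master raster.
# Assumption: a 6560x3100 "6.5K" master raster; the article does not
# specify exact pixel dimensions.

MASTER_W, MASTER_H = 6560, 3100

def fit_scale(src_w: int, src_h: int) -> float:
    """Uniform scale that fits a source frame inside the master raster
    without cropping (letterboxing/pillarboxing as needed)."""
    return min(MASTER_W / src_w, MASTER_H / src_h)

def fitted_size(src_w: int, src_h: int) -> tuple[int, int]:
    """Pixel dimensions of the source after scaling into the master."""
    s = fit_scale(src_w, src_h)
    return round(src_w * s), round(src_h * s)
```

For example, a 4K DCI plate (4096x2160) would scale up by roughly 1.44x under these assumed dimensions, filling the master's height.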

The Alexa 65 camera cards were backed up to 8TB “sled” transfer drives using a Codex Vault S system. The 8TB transfer drives were then sent to the trailer where I had two Codex Vault XL systems — one was used for ingesting all of the footage into my SAN and the second was used to prepare footage for LTO archival. All of the other unit footage was sent to the trailer via shuttle drives or Internet transfer.

After the footage was successfully ingested to the SAN with checksum verification, it was ready to be colored, processed and then archived. We had eight LTO6 decks running 24/7, as the main focus was to archive the enormous amounts of high-res camera footage we were receiving. The Alexa 65 alone generated about 2.8TB per hour for each camera.
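
That 2.8TB-per-hour figure adds up quickly across cameras, which is what kept eight decks busy. As a back-of-the-envelope sketch of the archive sizing, assuming an LTO-6 native capacity of 2.5TB per tape (an illustrative figure, not from the article), with hypothetical camera counts and hours:

```python
import math

# Back-of-the-envelope archive sizing from the 2.8TB/hour figure above.
# Camera count and shooting hours are hypothetical; the 2.5TB LTO-6
# native capacity is an assumption, not from the article.

ALEXA65_TB_PER_HOUR = 2.8
LTO6_NATIVE_TB = 2.5

def lto6_tapes_needed(cameras: int, hours_rolled: float) -> int:
    """Tapes required to archive one copy of the Alexa 65 footage."""
    total_tb = cameras * hours_rolled * ALEXA65_TB_PER_HOUR
    return math.ceil(total_tb / LTO6_NATIVE_TB)

# e.g. two bodies rolling 8 hours: 2 * 8 * 2.8 = 44.8TB, i.e. 18 tapes per copy
```

Multiply by however many protection copies a studio requires and the round-the-clock LTO operation described above starts to look unavoidable.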

Had you worked with Alexa 65 footage previously?
Many times. A few years ago, I was in China for seven months working on The Great Wall, which was one of the first films to shoot with the Alexa 65. I had a month of in-depth pre-production with the camera, testing, shooting and honing its technology. Working very closely with ARRI and Codex technicians during this time, I was able to design the most efficient workflow possible. Even as the shoot progressed, I continued to communicate closely with both companies. As new challenges arose, we developed and implemented solutions that kept production running smoothly.

The workflow we designed for The Great Wall was very close to the workflow we ended up using on Alpha, so it was a great advantage that I had previous experience working in-depth with the camera.

What were some of the challenges you faced on this film?
To be honest, I love a challenge. As colorists, we are thrown into tricky situations every day. I am thankful for these challenges; they improve my craft and make me more efficient at problem solving. One of the largest challenges I faced on this particular project was coordinating so many different units, given the sheer volume of footage and the dozens of format types needed.

We had to be accessible around the clock, most of us working 24 hours a day. Needless to say, I made great friends with the transportation driving team and the generator operators. I think they would agree that my grading trailer was one of their largest challenges on the film since I constantly needed to be on set and my work was being imported/exported in such high resolutions.

In the end, as I was watching this absolutely gorgeous film in the theater it made sense. Working those crazy hours was absolutely worth it — I am thankful to have worked with such a cohesive team and the experience is one I will never forget.


DP Petr Hlinomaz talks about the look of Marvel’s The Punisher

By Karen Moltenbrey

For antiheroes like Frank Castle, the lead character in the Netflix series Marvel’s The Punisher, morality comes in many shades of gray. A vigilante hell-bent on revenge, the Marine veteran used whatever lethal means possible — kidnapping, murder, extortion — to exact revenge on those responsible for the deaths of his family. However, Castle soon found that the criminal conspiracy that set him on this destructive path ran far deeper than initially imagined, and he had to decide whether to embrace his role as the Punisher and help save other victims, or retreat to a more solitary existence.

Alas, in the end, the decision to end the Punisher’s crusade was made not by Frank Castle nor by the criminal element he sought to exact justice upon. Rather, it was made by Netflix, which recently announced it was cancelling all of its live-action Marvel shows, a mere month after Season 2 was released and while many fans were still watching the season’s action play out.

Petr Hlinomaz

The Punisher is dark and intense, as is the show itself. The overall aesthetic is dim and gritty to match the action, yet rich and beautiful at the same time. This is the world initially envisioned by Marvel and then brought to life on screen late in Season 1 by director of photography Petr Hlinomaz under the direction of showrunner Steve Lightfoot.

The Punisher is based on the Marvel Comics character by the same name, and the story is set in the Marvel Cinematic Universe, meaning it shares DNA with the films and other TV shows in the franchise. There is a small family resemblance, but The Punisher is not considered a spin-off of Marvel’s Daredevil, despite the introduction of the Punisher (played by Jon Bernthal) on Season 2 of that series, for which Hlinomaz served as a camera operator and tandem DP. Therefore, there was no intent to match the shows’ cinematic styles.

“The Punisher does not have any special powers like other Marvel characters possess; therefore, I felt that the photographic style should be more realistic, with strong compositions and lighting resembling Marvel’s style,” Hlinomaz says. “It’s its own show. In the Marvel universe, it is not uncommon for characters to go from one show to another and then another after that.”

Establishing the Look
It seems that Hlinomaz followed somewhat in the characters’ footsteps himself, joining The Punisher crew and taking on the role of DP after the first 10 episodes. He consulted with Lightfoot to find out what he liked as far as framing, look, camera movement and lighting were concerned, and built upon the look of those initial 10 episodes to finish out the last three episodes of Season 1. Then Hlinomaz enhanced that aesthetic for Season 2.

Hlinomaz was assisted by Francis Spieldenner, a Marvel veteran familiar with the property, who served as A camera/Steadicam operator on both seasons, shot tandems and was the DP on two episodes of Season 2 (209 and 211).

“Steve and I had some discussions regarding the color of lighting for certain scenes in Season 2, but he pretty much gave me the freedom of devising the look and camera movement for the show on my own,” recalls Hlinomaz. “I call this look ‘Marvel Noir,’ which is low light and colorful. I never use the normal in-camera color temperature settings (for instance, 3,200K for night and 5,600K for day). I always look for different settings that fit the location and feel of the scene, and build the lighting from there. My approach is very source-oriented, and I do not like cheating in lighting when shooting scenes.”

According to Hlinomaz, the look they were striving for was a mix of Taxi Driver and The Godfather, but darker and more raw. “We primarily used wide-angle lenses to place our characters into our sets and scenery and to see geographically where they are. At times we strived to be inside the actors’ head.” They also used Jason Bourne films as a guideline, “making Jon (the Punisher) and all our characters feel small in the large NYC surroundings,” he adds. “The stunt sequences move fast, continuously and are brutally real.”

In terms of color, Hlinomaz uses very low light with a dark, colorful palette. This complements New York City, which is colorful, while the city’s multitude of lights and colors “provide a spectacular base for the filming.” The show highlights various locations throughout the city. “We felt the look is very fitting for this show, the Punisher being an earnest human being in the beginning of his life, but after joining the force is troubled by his past, PTSD and his family being brutally slaughtered, and in turn, he is brutal and ruthless to ‘bad people,’” explains Hlinomaz.

For instance, in a big fight scene in Season 1, Episode 11 at Micro’s hideout, Hlinomaz showed the top portion of the space to its fullest extent. “It looks dark, mysterious. We used a mixture of top, side and uplighting to make the space look interesting, with lots of color temperature mixes,” he says. “There was a plethora of leftover machinery and huge transformers and generators that were no longer in use, and stairwells that provided a superb backdrop for this sequence.”

The Workflow
For the most part, Hlinomaz has just one day to prep for an episode with the director, and that is often during the technical scout day. “Aside from reading the script and exchanging a few emails, that is the only prep we get,” he says.

During the technical scout, a discussion takes place with the director concerning how the scenes should look and feel. “We discuss lighting and grip, set dressing, blocking, shooting direction, time of day, where we light from, where the sun should be and so on, along with any questions concerning the locations for the next episodes,” he says.

During the scout and rehearsal, Hlinomaz looks for visually stimulating backgrounds, camera angles and shots that will enhance and propel the story line.

When they start shooting the episode, the group rehearses the scene, discusses the most efficient or suitable blocking for the scene and which lenses to use. During the shoot, Hlinomaz takes stills that will be used by the colorists as reference for the lighting, density, color and mood. When the episode is cut and roughly colored, he then will view the episode at the lab (Company 3 in New York) and make notations. Those notes are then provided to the post producer and colorist Tony D’Amore (from Encore) for the final color pass and Lightfoot’s approval.

The group employs HDR, “which, in a way, is hard because you always have to protect for overexposure on sources within the frame,” adds Hlinomaz. In fact, D’Amore has credited Hlinomaz, the directors and Lightfoot with devising unique lighting scenarios that highlighted the HDR aspect of the show in Season 2.

Tools of the Trade
The Punisher’s main unit uses two cameras – “we have crew to cover two at all times,” Hlinomaz says. That number increases to three or more as needed for certain sequences, though there are times when just one camera is used for certain scenes and shots.

According to Hlinomaz, Netflix and Marvel only shoot with Red 4K cameras and up. For the duration of The Punisher shoot, the crew only carried four “Panavised” Red cameras. “We shot 4K but frequently used the 5K and 6K settings to go a bit wider with the [Panavision] Primo lenses, or for a tilt and swing lens special look,” he says, adding that he has used Red cameras for the past four years and is still impressed with the color rendering of the Red sensors. Prior to shooting the series, he tested Zeiss Ultra Prime lenses, Leica Summilux lenses, along with Panavision Primos; Hlinomaz chose the Primos for their 3D rendering of the subjects.

The lens set ranged from 10mm to 150mm; there was also an 11:1 zoom lens that was used sparingly. It all depended on the shot. In Episode 13, when Frank finally shoots and kills hitman Billy Russo (aka Jigsaw), Hlinomaz used an older 12mm lens with softer edges to simulate Billy’s state as he is losing a lot of blood. “It looked great, somewhat out of focus along the edges as Frank approaches; then, when Frank steps closer for the kill, he comes into clear focus,” Hlinomaz explains.

In fact, The Punisher was shot using the same type of camera and lenses as the second season of the now-cancelled Marvel/Netflix series Luke Cage (Hlinomaz served as a DP on Luke Cage Season 2 and a camera operator for four episodes of Season 1). In addition to wide-angle lenses, the show also used more naturalistic lighting, similar to The Punisher.

Hlinomaz details another sequence pertaining to his choice of cameras and lenses on The Punisher, whereby he used 10mm and 14mm lenses for a fight scene inside an elevator. Spieldenner, the A cam operator, was inside the elevator with the performers. “We didn’t pull any walls for that, only the ceilings were pulled for one overhead shot when Frank flips a guy over his shoulder,” explains Hlinomaz. “I did not want to pull any walls; when you do, it feels like the camera is on the outside, especially if it’s a small space like that elevator.”

On-Set Challenges
A good portion of the show is filmed outdoors — approximately two-thirds of the series — which always poses an additional challenge due to constantly changing weather conditions, particularly in New York. “When shooting exteriors, you are in the elements. Night exteriors are better than day exteriors because you have more control, unless the day provides constant lighting — full sun or overcast, with no changes. Sometimes it’s impractical or prohibitive to use overhead cover to block out the sun; then you just have to be quick and make smart decisions on how to shoot a scene with backlight on one side and front fill that feels like sunlight on the other, and make it cut and look good together,” explains Hlinomaz.

As he noted earlier, Hlinomaz is a naturalist when it comes to lighting, meaning he uses existing source-driven lighting. “I like simplicity. I use practicals, sun and existing light to give and drive our light direction,” he further adds. “We use every possible light, from big HMIs all the way down to the smallest battery-driven LED lights. It all depends on a given shot, location, sources and where the natural or existing light is coming from. On the other hand, sometimes it is just a bounce card for a little fill or nothing extra to make the shot look great.”

All The Punisher sets, meanwhile, have hard ceilings. “That means with our use of lower camera angles and wide lenses, we are seeing everything, including the ceilings, and are not pulling bits of ceilings and hanging any lights up from the grid. All lighting is crafted from the floor, driven by sources, practicals, floor bounces, windows and so on,” says Hlinomaz. “My feeling is that this way, the finished product looks better and more natural.”

Most of Season 1’s crew returned for Season 2, so they were familiar with the dark and gritty style, which made things easier on Hlinomaz. The season begins with the Punisher somewhere in the Midwest before agent Madani brings Frank back to New York, although all the filming took place throughout New York.

One of the more challenging sequences this season, according to Hlinomaz, was an ambulance chase that was filmed in Albany, New York. For the shoot, they used a 30-foot Louma crane and Edge arm from Action Camera cars, and three to four Red cameras. For the actual ambulance drop, they placed four additional cameras. “We had to shoot many different passes with stunts as well as the actors, in addition to the Edge arm pass. It was quite a bit of work,” he says. Of course, it didn’t help that when they arrived in Albany to start filming, they encountered a rain delay, but “we used the time to set up the car and ambulance rigs and plan to the last detail how to approach our remaining days there.” For the ambulance interior, they shot on a greenscreen stage with two ambulances — one on a shaky drive simulation rig and the other mounted 20 feet or so high on a teeter rig that simulated the drop of the highway as it tilted forward until it was pointing straight to the ground.

“If I remember correctly, we spent six days total on that sequence,” says Hlinomaz.

The second season of The Punisher was hard work, but a fun and rewarding experience, Hlinomaz contends. “It was great to be surrounded from top to bottom with people working on this show who wanted to be there 100 percent, and that dedication and our hard work is evident, I believe, in the finished season,” he adds.

As Hlinomaz waited for word on Season 3 of The Punisher, he lent his talents to Jessica Jones, also set in the Marvel Cinematic Universe — and sadly also receiving the same ultimate fate — as Hlinomaz stepped in to help shoot Episode 305, with the new Red DSMC2 Gemini 5K S35 camera. “I had a great experience there and loved the new camera. I am looking forward to using it on my next project,” he adds.


Karen Moltenbrey is a veteran VFX and post writer.


Color plays big role in the indie thriller Rust Creek

In the edge-of-your-seat thriller Rust Creek, confident college student Sawyer (Hermione Corfield) loses her way while driving through very rural Appalachia and quickly finds herself in a life-or-death struggle with some very dangerous men. The modestly-budgeted feature from Lunacy Productions — a company that encourages female filmmakers in top roles — packs a lot of power with virtually no pyrotechnics using well-thought-out filmmaking techniques, including a carefully planned and executed approach to the use of color throughout the film.

Director Jen McGowan and DP Michelle Lawler

Director Jen McGowan, cinematographer Michelle Lawler and colorist Jill Bogdanowicz of Company 3 collaborated to help express Sawyer’s character arc through the use of color. For McGowan, successful filmmaking requires thorough prep. “That’s where we work out, ‘What are we trying to say and how do we illustrate that visually?’” she explains. “Film is such a visual medium,” she adds, “but it’s very different from something like painting because of the element of time. Change over time is how we communicate story, emotion and theme as filmmakers.”

McGowan and Lawler developed the idea that Sawyer is lost, confused and overwhelmed as her dire situation becomes clear. Lawler shot most of Rust Creek handholding an ARRI Alexa Mini (with Cooke S4s) following Sawyer as she makes her way through the late autumn forest. “We wanted her to become part of the environment,” Lawler says. “We shot in winter and everything is dead, so there was a lot of brown and orange everywhere with zero color separation.”

Production designer Candi Guterres pushed that look further, rather than fighting it, with choices about costumes and some of the interiors.

“They had given a great deal of thought to how color affects the story,” recalls colorist Bogdanowicz, who sat with both women during the grading sessions (using Blackmagic’s DaVinci Resolve) at Company 3 in Santa Monica. “I loved the way color was so much a part of the process, even subtly, of the story arc. We did a lot in the color sessions to develop this concept where Sawyer almost blends into the environment at first and then, as the plot develops and she finds inner strength, we used tonality and color to help make her stand out more in the frame.”

Lawler explains that the majority of the film was shot on private property deep in the Kentucky woods, without the use of any artificial light. “I prefer natural light where possible,” she says. “I’d add some contrast to faces with some negative fill and maybe use little reflectors to grab a rake of sunlight on a rock, but that was it. We had to hike to the locations and we couldn’t carry big lights and generators anyway. And I think any light I might have run off batteries would have felt fake. We only had sun about three days of the 22-day shoot, so generally I made use of the big ‘silk’ in the sky and we positioned actors in ways that made the best use of the natural light.”

In fact, the weather was beyond bad, it was punishing. “It would go from rain to snow to tornado conditions,” McGowan recalls. “It dropped to seven degrees and the camera batteries stopped working.”

“The weather issues can’t be overstated,” Lawler adds, describing conditions on the property they used for much of the exterior location. “Our base camp was in a giant field. The ground would be frozen in the morning and by afternoon there would be four feet of mud. We dug trenches to keep craft services from flooding.”

The budget obviously didn’t provide for waiting around for the elements to change, David Lean-style. “Michelle and I were always mindful when shooting that we would need to be flexible when we got to the color grading in order to tie the look together,” McGowan explains. “I hate the term ‘fix it in post.’ It wasn’t about fixing something; it was about using post to execute what was intended.”

Jill Bogdanowicz

“We were able to work with my color grading toolset to fine tune everything shot by shot,” says Bogdanowicz. “It was lovely working with the two of them. They were very collaborative but were very clear on what they wanted.”

Bogdanowicz also adapted a film emulation LUT, which was based on the characteristics of a Fujifilm print stock and added in a subtle hint of digital grain, via a Boris FX Sapphire plug-in, to help add a unifying look and filmic feel to the imagery. At the very start of the process, the colorist recalls, “I showed Jen and Michelle a number of ‘recipes’ for looks and they fell in love with this one. It’s somewhat subtle and elegant and it made ‘electric’ colors not feel so electric but has a film-style curve with strong contrast in the mids and shadows you can still see into.”

McGowan says she was quite pleased with the work that came out of the color theater. “Color is not one of the things audiences usually pick up on, but a lot of people do when they see Rust Creek. It’s not highly stylized, and it certainly isn’t a distracting element, but I’ve found a lot of people have picked up on what we were doing with color and I think it definitely helped make the story that much stronger.”

Rust Creek is currently streaming on Amazon Prime and Google.


Helicopter Film Services intros Titan ultra-heavy lifting drone

Helicopter Film Services (HFS) has launched an ultra-heavy lift drone that incorporates a large, capable airframe paired with the ARRI SRH-3. Known as the Titan, the drone’s ARRI SRH-3 stabilized head enables easy integration of existing ARRI lens motors and other functionality directly with the ARRI Alexa 65 and LF cameras.

HFS developed the large drone in response to requests from some legendary DPs and VFX supervisors to enable filmmakers to fly large-format digital or 35mm film packages.

“We have trialed other heavy-lift machines, but all of them have been marginal in terms of performance when carrying the larger cameras and lenses that we’re asked to fly,” says Alan Perrin, chief UAV pilot at HFS. “What we needed, and what we’ve designed, is a system that will capably and safely operate with the large-format cameras and lenses that top productions demand.”

The Titan combines triple redundancy on flight controls and double redundancy on power supply and ballistic recovery into an aircraft that can deploy and operate easily on any production involving a substantial flight duration. The drone can easily fly a 35mm film package, such as an ARRI 435 with a 400-foot magazine.

Here are some specs:
• Optimized for large-format digital and 35mm film cameras
• Max payload up to 30 kilograms
• Max take-off mass — 80 kilograms
• Redundant flight control systems
• Ballistic recovery system (parachute)
• Class-leading stability
• Flight duration up to 15 minutes (subject to payload weight and configuration)
• HD video downlink
• Gimbal: ARRI SRH-3 or Movi XL

Final payload-proving flights are taking place now, and the company is in the process of planning first use on major productions. HFS is also exploring the ability to fly a new 65mm film camera on the Titan.


SciTech Medallion Recipient: A conversation with Curtis Clark, ASC

By Barry Goch

The Academy of Motion Picture Arts and Sciences has awarded Curtis Clark, ASC, the John A. Bonner Medallion “in appreciation for outstanding service and dedication in upholding the high standards of the Academy.” The presentation took place in early February and just prior to the event, I spoke to Clark and asked him to reflect on the transition from film to digital cinema and his contributions to the industry.

Clark’s career as a cinematographer includes features, TV and commercials. He is also the chair of the ASC Motion Imaging Technology Council that developed the ASC CDL.

Can you reflect on the changes you’ve seen over your career and how you see things moving ahead in the future?
Once upon a time, life was an awful lot simpler. I look back nostalgically on when it was all film-based, when the cinematographer could follow up on the look of dailies and follow through with any photographic testing that helped home in on the desired look. It had its photochemical limitations; its analog image structure was not as malleable or tonally expansive as the digital canvas we have now.

Do you agree that Kodak’s Cineon helped us to this digital revolution — the hybrid film/digital imaging system where you would shoot on film, scan it and then digitally manipulate it before going back out to film via a film recorder?
That’s where the term digital intermediate came into being, and it was an eye opener. I think at the time not everyone fully understood the ramifications of the sort of impact it was making. Kodak created something very potent and led the way in terms of methodologies, or how to arrive at integration of digital into what was then called a hybrid imaging system, combining digital and film together.

The DCI (Digital Cinema Initiative) was created to establish digital projection standards. Without a standard we’d potentially be creating chaos in terms of how to move forward. For the studios, distributors and exhibitors, it would be a nightmare. Can you talk about that?
In 2002, I had been asked to form a technology committee at the ASC to explore these issues: how the new emerging digital technologies were impacting the creative art form of cinematography and of filmmaking, and also to help influence the development of these technologies so they best serve the creative intent of the filmmaker.

DCI proposed that for digital projection to be considered ready for primetime, its image quality needed to be at least as good as, if not better than, a print from the original negative. I thought this was a great commitment that the studios were making. For them to say digital projection was going to be judged against a film print projection from the original camera negative of the exact same content was a fantastic decision. Here was a major promise of a solution that would give digital cinema image projection an advantage since most people saw release prints from a dupe negative.

Digital cinema had just reached the threshold of being able to do 2K digital cinema projection. At that time, 4K digital projection was emerging, but it was a bit premature in terms of settling on that as a standard. So you had digital cinema projection and the emergence of a sophisticated digital intermediate process that could create the image quality you wanted from the original negative, but projected on a digital projection.

In 2004, the Michael Mann film Collateral was shot with the Grass Valley Viper FilmStream and the Sony F900 and F950 cameras, the latest generation of digital motion picture cameras: basically video cameras that were becoming increasingly sophisticated, with better dynamic range and tonal contrast, supporting 24fps and other multiple frame rates, but 24p was the key.
These cameras were used in the most innovative and interesting manner, because Mann combined film with digital, using the digital for the low-light level night scenes and then using film for the higher-light level day exterior scenes and day interior scenes where there was no problem with exposure.

Because of the challenge of shooting the night scenes, they wanted to shoot at such low light levels that film would potentially be a bit degraded in terms of grain and fog levels. If you had to overrate the negative, you needed to underexpose and overdevelop it, which was not desirable, whereas the digital cameras thrived in lower light levels. Also, you could shoot at a stop that gave you better depth of field. At the time, it was a very bold decision. But looking back on it historically, I think it was the inflection point that brought the digital motion picture camera into the limelight as a possible alternative to shooting on film.

That’s when they decided to do the Camera Assessment Series tests, which evaluated all the different digital cinema cameras available at the time?
Yeah, with the idea being that we’d never compare two digital cameras together, we’d always compare the digital camera against a film reference. We did that first Camera Assessment Series, which was the first step in the direction of validating the digital motion picture camera as viable for shooting motion pictures compared with shooting on film. And we got part way there. A couple of the cameras were very impressive: the Sony F35, the Panavision Genesis, the Arri D21 and the Grass Valley Viper were pretty reasonable, but this was all still mainly within a 2K (1920×1080) realm. We had not yet broached that 4K area.

A couple of years later, we decided to do this again. It was called the Image Control Assessment Series, or ICAS, and it was shot at Warner Bros. We shot scenes in a café: a daylight interior and then a nighttime exterior. Both scenes had a dramatically large range of contrast and different colors in the image. It was the big milestone. The new Arri Alexa was used, along with the Sony F65 and the then-latest versions of the Red cameras.

So we had 4K projection and 4K cameras, and we introduced the use of ACES (Academy Color Encoding System) color management. We were really at the point where all the key components that we needed were beginning to come together. This was the first instance where these digital workflow components were all used in a single significant project test. We used film as our common benchmark reference: how do these cameras compare with film? That was the key thing. In other words, could we consider them to be ready for prime time? The answer was yes. We did that project in conjunction with the PGA and a company called Revelations Entertainment, which is Morgan Freeman’s company. Lori McCreary, his partner, was one of the producers who worked with us on this.

So filmmakers started using digital motion picture cameras instead of film. And with digital cinema having replaced film print as a distribution medium, these new generation digital cameras started to replace film as an image capture medium. Then the question was would we have an end-to-end digital system that would become potentially viable as an alternative to shooting on film.

L to R: Josh Pines, Steve MacMillan, Curtis Clark and Dhanendra Patel.

Part of the reason you are getting this acknowledgement from the Academy is your dedication to the highest quality of image and your respect for the artistry, from capture through delivery. Can you talk about your role in look management from on-set through delivery?
I think we all need to be on the same page; it’s one production team whose objective is maintaining the original creative intent of the filmmakers. That includes the director and cinematographer working with an editor and a production designer. Making a film is a collective team effort, but the overall vision is typically established by the director in collaboration with the cinematographer and production designer. The cinematographer is tasked with capturing that vision with lighting, camera composition, movement, lens choices: all those elements that are part of the process of creative filmmaking. The extremely sophisticated cameras you shoot with today, like the Sony F65 or Venice, the Panavision Millennium DXL, an Arri or the latest versions of the Red camera, all have the ability to reproduce high dynamic range, wide color gamut and high resolution. All that raw image data is inherently there, and the creative canvas has certainly been expanded.

So if you’re using these creative tools to tell your story, to advance your narrative, then you’re doing it with imagery defined by the potential of what these technologies are able to do. In the modern era, people aren’t seeing dailies at the same time or seeing them together under controlled circumstances. The viewing process has become fragmented. When everyone had to come together to view projected dailies, there was a certain camaraderie, and constructive contributions made the filmmaking process more effective. If something wasn’t what it should be, everyone could see exactly what it was and make a correction if needed.

But now, we have a more dispersed production team at every stage of the production process, from the initial image capture through to dailies, editorial, visual effects and final color grading. We have so many different people in disparate locations working on the production who don’t seem to be as unified, sometimes, as we were when it was all film-based analog shooting. On the other hand, it’s now far easier and simpler to integrate visual effects into your workflow. As Cineon showed when it first emerged, you could do digital effects as opposed to optical effects, and that was a big deal.

So coming back to the current situation, particularly with the most advanced forms of imaging, which include high dynamic range and wider color gamut (wider than even P3, out to Rec. 2020), it is essential to have a color management system like ACES that actually has enough color gamut to contain any color space you capture and want to be able to manipulate.

Can you talk about the challenges you overcame, and how that fits into the history of cinema as it relates to the Academy recognition you received?
As a cinematographer, working on feature films or commercials, I kept thinking, if I’m fortunate enough to be able to manage the dailies and certainly the final color grading, there are these tools called lift gain gamma, which are common to all the different color correctors. But they’re all implemented differently. They’re not cross-platform-compatible, so the numbers from a lift gain gamma — which is the primary RGB grading — from one color corrector will not translate automatically to another color corrector. So I thought, we should have a cross-platform version of that, because that is usually seen as the first step for grading.

That’s about as basic as you can get, and it was designed as a cross-platform implementation, so that everyone who applies the ASC CDL in a compatible color grading system, whether on a DaVinci, Baselight, Lustre or whatever you were using, gets the same, transferable results.
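The math the ASC CDL standardizes is compact: each channel runs through slope, offset and power, followed by a saturation adjustment weighted by Rec. 709 luma. The function name and sample values below are illustrative, but the per-channel transform follows the published CDL definition.

```python
def asc_cdl(rgb, slope, offset, power, sat=1.0):
    """Apply the ASC CDL: out = clamp(in * slope + offset) ** power
    per channel, then a saturation step around Rec. 709 luma."""
    graded = []
    for v, s, o, p in zip(rgb, slope, offset, power):
        v = max(0.0, min(1.0, v * s + o))  # clamp before the power function
        graded.append(v ** p)
    # Rec. 709 luma weights, as used by the CDL saturation step
    luma = 0.2126 * graded[0] + 0.7152 * graded[1] + 0.0722 * graded[2]
    return [max(0.0, min(1.0, luma + sat * (v - luma))) for v in graded]

# Identity values (slope 1, offset 0, power 1, sat 1) leave the image untouched,
# which is what makes the numbers portable: the same ten values reproduce the
# same grade on any compliant system.
print(asc_cdl([0.5, 0.4, 0.3], slope=[1, 1, 1], offset=[0, 0, 0], power=[1, 1, 1]))
```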

You could transport those numbers from a setup on set using a dailies creation tool, like ColorFront for example. You could then use the ASC CDL during the shoot, not while you’re actually shooting but with the DIT, to establish a chosen look that could then be applied to dailies and used for VFX.

Then when you make your way into the final color grading session with the final cut — or whenever you start doing master color grading going back to the original camera source — you would have these initial grading corrections as a starting point as references. This now gives you the possibility of continuing on that color grading process using all the sophistication of a full color corrector, whether it’s power windows or secondary color correction. Whatever you felt you needed to finalize the look.

I was advocating this in the ASC Technology Committee, as it was called, since renamed the Motion Imaging Technology Council (MITC). We needed a solution like this, and there was a group of us who got together and decided that we would do it. There were plenty of people who were skeptical: “Why would you do something like that when we already have lift gain gamma? Why would any of the manufacturers of the different color grading systems integrate this into their systems? Would it not impinge upon their competitive advantage? If they had a system that people were used to using, and their own lift gain gamma worked perfectly well for them, why would they want to use the ASC CDL?”

We live in a much more fragmented post world, and I saw that becoming even more so with the advances of digital. The ASC CDL would be a “look unifier” that would establish initial look parameters. You would be able to have control over the look at every stage of the way.

I’m assuming that the cinematographer would work with the director and editor, and they would assess certain changes that probably should be made. We’re now looking at cut sequences, and what we had thought would be most appropriate when we were shooting is now in the context of an edit, so there may need to be some changes and adjustments.

Were you involved in ACES? Was it a similar impetus for ACES coming about? Or was it just spawned because visual effects movies became so big and important with the advent of digital filmmaking?
It was a bit of both, including productions without VFX. So I would say that initially it was driven by the fact that there really should be a standardized color management system. Let me give you an example of what I’m talking about. When we were all photochemical and basically shooting with Kodak stock, we were working with film-based Kodak color science.

It’s a color science that everybody knew and understood, even if they didn’t understand it from an engineering photochemical point of view, they understood the effects of it. It’s what helps enable the look and the images that we wanted to create.

That was a color management system that was built into film. That color science system could have been adapted into the digital world, but Kodak resisted that because of the threat to negatives. If you apply that film color science to digital cameras, then you’re making digital cameras look more like film and that could pose a threat to the sale of color film negative.

So that’s really where the birth of ACES came about — to create a universal, unified color management system that would be appropriate anywhere you shot and with the widest possible color gamut. And it supports any camera or display technology because it would always have a more expanded (future-proofing) capability within which digital camera and display technologies would work effectively and efficiently, but also accurately, reliably and predictably.

Very early on, my ASC Technology Committee (now called Motion Imaging Technology Council) got involved with ACES development and became very excited about it. It was the missing ingredient needed to be able to make the end-to-end digital workflow the success that we thought that it could become. Because we no longer could rely on film-based color science, we had to either replicate that or emulate it with a color management system that could accommodate everything we wanted to do creatively. So ACES became that color management system.

So, in addition to becoming the first cross-platform primary color grading tool, the ASC CDL became the first official ACES look modification transform. ACES is not a color grading tool, it’s a color management system, so you have to pair it with color grading tools. You have the color management with ACES, the color grading with the ASC CDL, and the combination of the two is the look management system, because it takes both to make that work. And it’s not that the ASC CDL is the only tool you use for color grading, but it has the portable, cross-platform ability to control the color grading from dailies through visual effects up to the final color grade, when you’re then working with a sophisticated color corrector.

What do you see for the future of cinematography and the merging of the worlds of post and on-set work, and what do you see as the challenges in future integrations between maintaining the creative intent and the metadata?
We’re very involved in metadata at the moment. Metadata is a crucial part of making all this work, as you well know. In fact, we worked on the common 3D LUT format with the Academy. So there is a common 3D LUT format that would again have cross-platform consistency and predictability, and its functionality and scope of use would be better understood if everyone were using it. It’s a work in progress. Metadata is critical.
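Whatever the container format, evaluating a 3D LUT comes down to the same math: trilinear interpolation over a cube of lattice samples, which is what makes a common interchange format feasible. A minimal sketch (the function name and the 17-point lattice below are illustrative):

```python
import numpy as np

def apply_lut3d(rgb, lut):
    """Sample a 3D LUT (shape N x N x N x 3, domain [0, 1]) at one RGB
    value using trilinear interpolation over the 8 surrounding entries."""
    n = lut.shape[0]
    pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * (n - 1)
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    f = pos - lo  # fractional distance along each axis
    out = np.zeros(3)
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                # Weight of this corner of the surrounding lattice cell
                w = ((f[0] if dr else 1 - f[0]) *
                     (f[1] if dg else 1 - f[1]) *
                     (f[2] if db else 1 - f[2]))
                idx = (hi[0] if dr else lo[0],
                       hi[1] if dg else lo[1],
                       hi[2] if db else lo[2])
                out += w * lut[idx]
    return out

# An identity LUT maps every color to itself
n = 17
grid = np.linspace(0, 1, n)
identity = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
print(apply_lut3d([0.25, 0.5, 0.75], identity))
```

A real grading LUT simply replaces the identity lattice with sampled output colors; the interpolation is unchanged, which is why a LUT baked on one system can be evaluated consistently on another.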

I think as we expand the canvas and the palette of the possibility of image making, you have to understand what these technologies are capable of doing, so that you can incorporate them into your vision. So if you’re saying my creative vision includes doing certain things, then you would have to understand the potential of what they can do to support that vision. A very good example in the current climate is HDR.

That’s very controversial in a lot of ways, because the set manufacturers really would love to have everything just jump off the screen to make it vibrant and exciting. However, from a storytelling point of view, it may not be appropriate to push HDR imagery where it distracts from the story.
Well, it depends on how it’s done and how you are able to use that extended dynamic range when you have your bright highlights. And you can use foreground background relationships with bigger depth of field for tremendous effect. They have a visceral presence, because they have a dimensionality when, for example, you see the bright images outside of a window.

When you have an extended dynamic range of scene tones that could add dimensional depth to the image, you can choreograph and stage the blocking for your narrative storytelling with the kind of images that take advantage of those possibilities.

So HDR needs to be thought of as something that’s integral to your storytelling, not just something that’s there because you can do it. That’s when it can become a distraction. When you’re on set, you need a reference monitor that is able to show and convey all the different tonal and color elements that you’re working with to create your look, from HDR to wider color gamut, whatever that may be, so that you feel comfortable that you’ve made the correct creative decision.

With virtual production techniques, you can incorporate some of that into your live-action shooting on set with that kind of compositing, just like James Cameron started with Avatar. If you want to do that with HDR, you can. The sky is the limit in terms of what you can do with today’s technology.

So these things are there, but you need to be able to pull them all together into your production workflow to make sure that you can comfortably integrate in the appropriate way at the appropriate time. And that it conforms to what the creative vision for the final result needs to be and then, remarkable things can happen. The aesthetic poetry of the image can visually drive the narrative and you can say things with these images without having to be expositional in your dialogue. You can make it more of an experientially immersive involvement with the story. I think that’s something that we’re headed toward, that’s going to make the narrative storytelling very interesting and much more dynamic.

Certainly, especially with the advancements of consumer technology, better panels, the high dynamic range developments, and Dolby Vision and Atmos audio coming into the home. It’s really an amazing time to be involved in the industry; it’s so fun and challenging.

It’s a very interesting time, and a learning curve needs to happen. That’s what’s driven me from the very beginning and why I think our ASC Motion Imaging Technology Council has been so successful in its 16 years of continuous operation, influencing the development of some of these technologies in very meaningful ways. But always with the intent that these new imaging technologies are there to better serve the creative intent of the filmmaker. The technology serves the art. It’s not about the technology per se, it’s about the technology as the enabling component of the art. It enables the art to happen, and it expands its scope and possibility to broader canvases, with wider color gamuts, in ways that have never been experienced or possible before.


Barry Goch is a Finishing Artist at The Foundation and a Post Production Instructor at UCLA Extension. You can follow him on Twitter at @gochya.

Red intros LCD touch monitor for DSMC2 cameras

Red Digital Cinema has introduced the DSMC2 Touch 7-inch Ultra-Brite LCD monitor to its line of camera accessories. It offers an optically-bonded touchscreen with Gorilla Glass that allows for what the company calls “intuitive ways to navigate menus, adjust camera parameters and review .R3D clips directly out of the camera.”

The monitor offers a brighter high-definition viewing experience for recording and reviewing footage on DSMC2 camera systems, even in direct sunlight. Its 1920×1200 display panel provides 2,200 nits of brightness to overcome viewing difficulties in bright outdoor environments, along with a high pixel density (323ppi) and a 1200:1 contrast ratio.

The Ultra-Brite display mounts to Red’s DSMC2 Brain or other 1/4-20 mounting surfaces, and provides a LEMO connection to the camera, making it an ideal monitoring option for gimbals, cranes, and cabled remote viewing. Shooters can use a DSMC2 LEMO Adaptor A in conjunction with the Ultra-Brite display for convenient mounting options away from the DSMC2 camera Brain.

Check out a demo of the new monitor, priced at $3,750, here.