

DP Chat: Autumn Durald Arkapaw on The Sun Is Also a Star

By Randi Altman

Autumn Durald Arkapaw always enjoyed photography and making films with friends in high school, so it was inevitable that her path would lead to cinematography.

“After a genre course in college where we watched Raging Bull and Broadway Danny Rose, I was hooked. From that day on, I wanted to find out who was responsible for photographing a film. After I found out it was an actual job, I set out to become a DP. I immediately started learning what the job entailed and also started applying to film schools with my photography portfolio.”

The Sun Is Also a Star

Her credits are vast and include James Franco’s Palo Alto, the indie film One & Two, and music videos for the Jonas Brothers and Arcade Fire. Most recently she worked on Emma Forrest’s feature film Untogether, Max Minghella’s feature debut Teen Spirit and director Ry Russo-Young’s The Sun Is Also a Star, which follows two young people who hit it off immediately and spend one magical day enjoying each other and the chaos that is New York City.

We recently reached out to Durald Arkapaw to find out more about these films, her workflow and more.

You’ve been busy with three films out this year — Untogether, Teen Spirit and The Sun Is Also a Star. What attracts you to a project?
I’m particular when it comes to choosing a narrative project. I have mostly worked with friends in the past and continue to do so. When making feature films, I throw myself into it. So it’s usually the relationship with the director and their vision that draws me first to a project.

Tell us about The Sun Is Also a Star. How would you describe the overall look of the film?
Director Ry Russo-Young and I wanted the film to feel grounded and not like the usual overlit/precious versions of these films we’ve all encountered. We wanted it to have texture and darks and lights, and the visuals to have a soulfulness. It was important that the world we created felt like an authentic and emotional environment.

Autumn Durald Arkapaw

How early did you get involved in the production? What were some of the discussions about conveying the story arc visually?
Ry and I met early on before she left for prep in New York. She shared with me her passion for wanting to make something new in this genre. That was always the basis for me when I thought about the story unfolding over one day and the arc of these characters. It was important for us to have the light show their progression through the city, but also have it highlight their love.

How did you go about choosing the right camera and lenses to achieve the look?
Ry was into anamorphic before I signed on, so it was already alluring to me once she sent me her look book and visual inspirations. I tend to shoot mostly in the Panavision anamorphic format, so my love goes deep for this medium. As for our camera, the ARRI Alexa Mini was our first choice since it renders a filmic texture, which is very important to me.

Any challenging scene or scenes that you are particularly proud of?
One of my favorite scenes/shots in the film is when Daniel (Charles Melton) sees Natasha (Yara Shahidi) for the first time in Grand Central Station. We had a Scorpio 23-foot telescopic crane on the ground floor. It is a beautiful shot that pulls out, booms down from Daniel’s medium shot in the glass staircase windows, swings around the opposite direction and pushes in while also zooming in on a 12:1 into an extreme closeup of Natasha’s face. We only did two takes and we nailed it on the first one. My focus puller, Ethan Borsuk, is an ace, and so is my camera operator, Andrew Fletcher. We all celebrated that one.

The Sun Is Also a Star

Were you involved in the final color grading? What’s important to you about the collaboration between DP and colorist?
Yes, we did final color at Company 3 in New York. Drew Geary was our DI colorist. I do a lot of color on set, and I like to use the on-set LUT for the final as well. So, it’s important that my colorist and I share the same taste. I also like to work fast. Drew was fantastic to work with and added a lot to the overall look and feel.

What inspires you artistically?
Talented, inspiring, hardworking people. Because filmmaking is a team effort, and those around me inspire me to make better art.

How do you stay on top of advancing technology that serves your vision?
Every opportunity I get to shoot is an opportunity to try something new and tell a story differently. Working with directors that like to push the envelope is always a plus. Since I work a lot in commercials, that always affords me the occasion to try new technology and have fun with it.

Has any recent or new technology changed the way you work, looking back over the past few years?
I recently wrapped a film where we shot a few scenes with the iPhone. It’s something I would never have considered in the past, but the technology has come a long way. Granted, the film is about a YouTube star, but I was happily surprised at how decent some of the footage turned out.

What are some of your best practices or rules you try to follow on each job?
Always work fast and always make it look the best you can while, most importantly, telling the story.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Perpetual Grace’s DPs, colorist weigh in on show’s gritty look

You don’t have to get very far into watching the Epix series Perpetual Grace LTD to realize just how ominous this show feels. It begins with the opening shots, and by the time you’ve spent a few minutes with the dark, mysterious characters who populate this world — and gathered hints of the many schemes within schemes that perpetuate the story — the show’s tone is clear. With its black-and-white flashbacks and occasional gritty flash-forwards, Perpetual Grace gets pretty dark, and the action goes in directions you won’t see coming.

This bizarre show revolves around James (Westworld’s Jimmi Simpson), who gets caught up in what initially seems like a simple con that quickly gets out of control. Sir Ben Kingsley, Jacki Weaver, Chris Conrad and Luis Guzmán also star as an assortment of strange and volatile characters.

The series comes from the minds of executive producer Steve Conrad, who also served in that role on Amazon’s quirky drama Patriot, and Bruce Terris, who was both a writer and a first AD on that show.

These showrunners developed the look with other Patriot veterans: cinematographers James Whitaker and Nicole Hirsch Whitaker, who incorporated colorist Sean Coleman’s input before commencing principal photography.

Coleman left his grading suite at Company 3 in Santa Monica to spend several days at the series’ New Mexico location. While there he worked with the DPs to build customized LUTs for them to use during production. This meant that everyone on set could get a strong sense of how lighting, costumes, sets and locations would read with the show’s signature looks applied.

The Whitakers on set

“I’ve never been able to work with the final colorist this way,” says Whitaker, who also alternated directing duties with Conrad. “It was great having him there on set where we could talk about the subtleties of color. What should the sky look like? What should blood look like? Faces? Clothes?” Using Resolve, Coleman made two LUTs — “the main one for the color portions and a different one specifically for the black-and-white parts.”

The main look of the show is inspired by film noir and Western movie tropes, all with a tip of the hat to Roger Deakins’ outstanding work on The Assassination of Jesse James by the Coward Robert Ford. “For me,” says Whitaker, “it’s about strong contrast, deep blacks and desert colors … the moodier the better. I don’t love very blue skies, but we wanted to keep some tonality there.”

“It’s a real sweaty, gritty, warm, nicotine-stained kind of thing,” Coleman elaborates.

“When we showed up in New Mexico,” Whitaker recalls, “all these colors did exist at various times of the day, and we just leaned into them. When you have landscapes with big, blue skies, strong greens and browns, you could lean into that and make it overly saturated. We went the other way, holding the brown earth tones but pulling out some of the color, which is always better for skin tones.”

The LUTs, Whitaker notes, offer a lot more flexibility than the DPs would have if they used optical filters. Beyond being nondestructive, a LUT also allows for a lot more complexity. “If you think about a ‘sepia’ or ‘tobacco’ filter or something like that, you think of an overall wash that goes across the entire frame, and I get immediately bored by that. It’s tricky to do something that feels like it’s from a film a long time ago without dating the project you’re working on now; you want a lot of flexibility to get [the imagery] where you want it to go.”
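For readers curious about the mechanics, here is a minimal sketch of the idea in Python with NumPy. It is an illustration only (the show’s looks were built as far richer 3D LUTs in Resolve, and these per-channel curves are invented for the example): a LUT simply remaps pixel values through a table, so the original footage is untouched and the look can be swapped or refined at any time, unlike a filter baked in at the lens.

```python
import numpy as np

def apply_lut(frame, lut_r, lut_g, lut_b):
    """Remap each channel of an 8-bit RGB frame through its own 256-entry table.

    The source frame is never modified, so the grade is nondestructive:
    swap in different tables and the same original pixels re-grade instantly.
    """
    out = np.empty_like(frame)
    out[..., 0] = lut_r[frame[..., 0]]
    out[..., 1] = lut_g[frame[..., 1]]
    out[..., 2] = lut_b[frame[..., 2]]
    return out

# Toy "nicotine-stained" curves: shaping each channel separately is what
# gives a LUT more complexity than the uniform wash of a sepia filter.
x = np.linspace(0.0, 1.0, 256)
lut_r = (255 * x ** 0.85).astype(np.uint8)  # lift the warm tones
lut_g = (255 * x ** 1.00).astype(np.uint8)  # leave greens alone
lut_b = (255 * x ** 1.25).astype(np.uint8)  # pull the blues down

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)  # stand-in for a camera frame
graded = apply_lut(frame, lut_r, lut_g, lut_b)
```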

The series was shot from November through February, often in brutally cold environments. Almost the entire series (the present-day scenes and the black-and-white flashbacks) was shot on ARRI Alexa cameras in a 2.0:1 aspect ratio. DIT Ryan Kunkleman, a frequent Whitaker/Hirsch Whitaker collaborator, applied and controlled the LUTs so the set monitors reflected their effect on the look.

The flash-forwards, which usually occur in very quick spurts, were shot on a 16mm Bolex camera using Kodak’s 7203 (50D) and 7207 (250D) color negative film, which was pushed two stops in processing to enhance the grain that Coleman would lean into in post.

Final color was done at Company 3’s Santa Monica facility, with Coleman working primarily alongside the Whitakers. “We enhanced the noir look with the strong, detailed blacks,” says Coleman. Even though a lot of the show exudes dry desert heat, it was actually shot over a particularly cold winter in New Mexico. “Things were sometimes kind of cold-looking, so sometimes we’d twist things a bit. We also added some digital ‘grain’ to sort of muck it up a little.”
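What “digital grain” means in practice varies by colorist, and Coleman’s exact Resolve treatment isn’t described here, so the following is only a hedged sketch of one common approach: overlaying a soft, monochrome noise field on the frame.

```python
import numpy as np

def add_grain(frame, strength=6.0, seed=0):
    """Overlay Gaussian noise on an 8-bit RGB frame to emulate film grain.

    One shared noise field drives all three channels, so the grain reads as
    monochrome texture (like film) rather than as colored pixel noise.
    """
    rng = np.random.default_rng(seed)
    noise = rng.normal(0.0, strength, frame.shape[:2]).astype(np.float32)
    out = frame.astype(np.float32) + noise[..., None]
    return np.clip(out, 0, 255).astype(np.uint8)
```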

For the black and white, Coleman took the color material in Resolve and isolated just the blue channel in order to manipulate it independently of the red and green, “to make it more inky,” he says. “Normally, you might just drain the color out, but you can really go further than that if you want a strong black-and-white look. When you adjust the individual channel, you affect the image in a way that’s similar to the effect of shooting black-and-white film through a yellow filter. It helps us make darker skies and richer blacks.”
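Coleman’s point is easy to see in a toy channel mixer, sketched below in Python with NumPy (the weights are invented for illustration; his actual node tree is certainly more sophisticated). Weighting blue low in the mono mix darkens skies the way a yellow filter does on black-and-white stock, where a plain desaturation would just average the channels and flatten them.

```python
import numpy as np

def channel_mix_bw(frame, w_r=0.5, w_g=0.4, w_b=0.1):
    """Black-and-white conversion via a channel mixer instead of desaturation.

    Down-weighting blue (w_b) mimics a yellow filter on B&W film: blue skies
    render darker and blacks get inkier. Roughly equal weights reproduce the
    flat "just drain the color out" result Coleman describes avoiding.
    """
    img = frame.astype(np.float32)
    mono = w_r * img[..., 0] + w_g * img[..., 1] + w_b * img[..., 2]
    return np.clip(mono, 0, 255).astype(np.uint8)
```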

Sean Coleman

“We’ve booked a whole lot of hours together, and that provides a level of comfort,” says Hirsch Whitaker about her and Whitaker’s work with Coleman. “He does some wonderful painting [in Resolve] that helps make a character pop in the frame or direct the viewer’s eye to a specific part of the frame. He really enjoys the collaborative element of color grading.”

Whitaker seconds that emotion: “As a cinematographer, I look at color grading a bit like working on set. It’s not a one-person job. It takes a lot of people to make these images.”



Glassbox’s virtual camera toolset for Unreal, Unity, Maya

Virtual production software company Glassbox Technologies has released its virtual camera plugin DragonFly out of private beta for public use. DragonFly offers professional virtual cinematography tools to filmmakers and content creators, allowing users to view character performances and scenes within computer-generated virtual environments in realtime, through the camera’s viewfinder, an external monitor or an iPad.

Available for Unreal Engine, Unity 3D and Autodesk Maya, DragonFly delivers a complete virtual cinematography workflow that allows filmmakers and content creators to make and test creative decisions faster and earlier in the process, whittling down production costs on projects of all scopes and sizes.

This off-the-shelf toolkit allows users to create everything from previz to postviz without the need for large teams of operators, costly hardware or proprietary tools. It is platform-agnostic and fits seamlessly into any workflow out of the box. Users can visualize and explore a CG virtual environment, then record, bookmark, create snapshots and replicate real camera movement as seamlessly as conducting a live-action shoot.

“Virtual production holds great potential for creators, but there were no off-the-shelf filming solutions available that worked out of the box,” notes co-founder/CPO Mariana Acuña. “In response, we made DragonFly: a virtual window that allows users to visualize complex sets, environments and performances through a viewfinder. Without the need for a big stage or mocap crew, it brings greater flexibility to the production and post pipeline for films, animation, immersive content, games and realtime VFX.”

The product was developed in collaboration with top Hollywood visualization and production studios, including The Third Floor, for best-in-class results.

“Prior to DragonFly, each studio created its own bespoke virtual production workflow, which is costly and time-consuming per project. DragonFly makes realtime virtual production usable for all creators,” says Evelyn Cover, global R&D manager for The Third Floor. “We were excited to collaborate with the Glassbox team to develop and test DragonFly in all kinds of production scenarios from previz to post, with astounding success.”

Glassbox’s second virtual production solution, BeeHive, a multi-platform, multi-user tool for collaborative virtual scene syncing, editing and review, is currently in beta and slated to launch later this summer.

DragonFly is now available for purchase and can be downloaded as a free 15-day trial from the Glassbox website. Licensing options include a permanent license for $750 (including $250 for the first year of support and updates) and an annual rental option for $420 a year.


Remembering ARRI’s Franz Wieser

By Randi Altman

Franz Wieser passed away last week, and the world is worse for it. I’ve known Franz for over 20 years, going back to when he was still based in ARRI’s Blauvelt, New York, office and I was editor of Post Magazine.

We would meet in the city from time to time for an event or a meal. In fact, he introduced me to a hidden gem of a restaurant just off Washington Square Park that has become one of my favorites. It reminds me of him — warm, friendly and welcoming.

I always laugh when I remember him telling me about when his car broke down here in New York. Even though he had his hazard lights on and it was clear his car wasn’t cooperating, people kept driving by and giving him the finger. He was bemused but incredulous, which made it even funnier.

Then he moved to LA and I saw him less… a quick hello at trade shows a couple of times a year. When I think of Franz, I remember his smile first and how soft spoken and kind he was.

He touched many over the years and their stories are similar to mine.

“I have known Franz for nearly two decades, but it was during the earliest days of ARRI’s digital era that we truly connected,” shares Gary Adcock, an early ARRI digital adopter, writer and industry consultant. “We got together after one of the director of photography conferences I chaired at NAB to talk about ARRI’s early D20 and D21 digital cameras. Franz was just a great person, always a kind word, always wanting to know how your family and friends were. It will be that kindness that I will miss the most.”

“This is such sad news,” says Andy Shipsides, CTO at Burbank’s AbelCine. “Franz was a dear friend and will be greatly missed. He was an amazing person and brought fun and levity to his work every day. I had lunch with him several months ago, and I feel lucky to have shared that time with him. Franz was truly a delightful person. He took me out when I first moved to LA to welcome me to the city, which I will always remember. He always had a smile on his face, and his positive energy was contagious. He will be very much missed, a big loss for our industry.”

ARRI sent out the following about Franz.

It is with great sadness that we share the news of the passing of Franz Wieser, VP of marketing at ARRI Inc.

Franz Wieser grew up in Rosenheim in Bavaria, Germany. He was originally hired by ARRI CT in nearby Stephanskirchen, where ARRI’s lighting factory is situated. Franz started at ARRI with an internship under Volker Bahnemann, a member of the supervisory board of the ARRI Group, at what was then called Arriflex Corporation in Blauvelt, NY, and spent some time doing market research in New York and California.

In July 1994, Franz accepted a position as marketing manager at Arriflex under Volker Bahnemann and relocated to New York. Franz had a distinguished 25-year career in marketing for Arriflex and ARRI Inc., leading to his most recent position as VP of marketing, based in the ARRI Burbank office. His contributions spanned the marketing of ARRI film and digital camera systems, as well as analog and digital lighting fixtures. He also built lasting relationships with the American Society of Cinematographers (ASC) and many others in the film and television industry. His ability to connect with people, his friendliness and reliability, and his deep understanding of the film industry were outstanding. He was a highly valued member of the global marketing network and a wonderful person and colleague.

Glenn Kennel, president and CEO of ARRI Inc., says, “Franz will be remembered by his colleagues and many friends in the industry as a friend and mentor, willing to listen and help. He always had a smile on his face and a gracious approach.”

We are deeply saddened by his untimely loss and will remember him well. Our deepest sympathy goes out to his wife and his parents.


Hobo Films’ Howard Bowler on new series The System

By Randi Altman

Howard Bowler, the founder of New York City-based audio post house Hobo Audio, has launched Hobo Films, a long-form original content development company.

Howard Bowler’s many faces

Bowler is also the founder and president of Green Point Creative, a marijuana-advocacy branding agency focused on the war on drugs and changing drug laws. And it is this topic that inspired Hobo Films’ first project, a dramatic series called The System. It features actress Lolita Foster from Netflix’s Orange Is the New Black.

Bowler has his hand in many things these days, and with those paths colliding, what better time to reach out to find out more?

After years working in audio post, what led you to want to start an original long-form production arm?
I’ve always wanted to do original scripted content and have been collecting story ideas for years. As our audio post business has grown, it’s provided us a platform to develop this related, exciting and creative business.

You are president/founder of Green Point Creative. Can you tell us more about that initiative?
Green Point Creative is an advocacy platform that was born out of personal experience. After an arrest followed by release (not me), I researched the history of marijuana prohibition. What I found was shocking. Hobo VP Chris Stangroom and I started to produce PSAs through Green Point to share what we had learned. We brought in Jon Mackey to aid in this mission, and he’s since moved up the ranks of Hobo into production management. The deeper we explored this topic, the more we realized there was a much larger story to tell and one that couldn’t be told through PSAs alone.

You wrote the script for the show The System. Can you tell our readers what the show is about?
The show follows the experiences of a white father raising his biracial son, set against the backdrop of the war on drugs. The tone of the series is a cross between Marvel Comics and Schindler’s List: what happens to these kids in the face of a nefarious system that has them in its grip, how they get out, how they fight back.

What about the shoot? How involved were you on set? What cameras were used? Who was your DP?
I was very involved the whole time, working with director Michael Cruz. We had to change lines of the script on set if we felt they weren’t working, so everyone had to be flexible. Our DP was David Brick, an incredible talent, driven and dedicated. He shot on the Red camera, and the footage is stunning.

Can you talk about working with the director?
I met Michael Cruz when we worked together at Grey, a global advertising agency headquartered in NYC. I told him back then that he was born to direct original content. At the time he didn’t believe me, but he does now.

L-R: DP David Brick and director Mike Cruz on set

Mike’s directing style is subtle but powerful; he knows how to frame a shot and get the performance. He also knows how to build a formidable crew. You’ve got to have a dedicated team in place to pull these things off.

What about the edit and the post? Where was that done? What gear was used?
Hobo is a natural fit for this type of creative project and is handling all the audio post as well as the music score that is being composed by Hobo staffer and musician Oscar Convers.

Mike Cruz tapped the resources of his company, Drum Agency, to handle the first phase of editing, and they pulled together the rough cuts. For the final edit, we connected with Oliver Parker. Ollie was just coming off two seasons of London Kills, a police thriller that’s been released to great reviews. His extraordinary editing elevated the story in ways I hadn’t predicted. All editing was done on Avid Media Composer.

The color grade was done by Juan Salvo at TheColourSpace using Blackmagic Resolve. [Editor’s Note: We reached out to Salvo to find out more. “We got the original 8K Red files from editorial and conformed them on our end. The look was really all about realism. There’s a little bit of stylized lighting in some scenes, and some mixed-temperature lights as well. Mostly, the look was about finding a balance between some of the more stylistic elements and the very naturalist, almost cinéma vérité tone of the series.

“I think ultimately we tried to make it true-to-life with a little bit of oomph. A lot of it was about respecting and leaning into the lighting that DP Dave Brick developed on the shoot. So during the dialogue scenes, we tend to have more diffuse light that feels really naturalist and just lets the performances take center stage, and in some of the more visual scenes we have some great set piece lighting — police lights and flashlights — that really drive the style of those shots.”]

Where can people see The System?
Click here to view the first five minutes of the pilot and learn more about the series.

Any other shows in the works?
Yes, we have several properties in development, and to help move these projects forward, we’ve brought on Tiffany Jackman to lead these efforts. She’s a gifted producer who spent 10 years honing her craft at various agencies, as well as working on a number of films. With her aboard, we can now create an ecosystem that connects all the stories.


All Is True director Kenneth Branagh

By Iain Blair

Five-time Oscar nominee Ken Branagh might be the biggest Shakespeare fan in the business. In fact, it’s probably fair to say that the actor/director/producer/screenwriter largely owes his fame and fortune to the Bard. For the past 30 years he’s directed (and often starred in) dozens of theatrical productions, as well as feature film adaptations of Shakespeare’s works, starting with 1989’s Henry V. That film won him two Oscar nominations: Best Actor and Best Director. He followed it with Much Ado About Nothing, Othello, Hamlet (which earned him a Best Adapted Screenplay Oscar nod), Love’s Labour’s Lost and As You Like It.

Ken Branagh and Iain Blair

So it was probably only a matter of time before the Irish star jumped at the chance to play Shakespeare himself in the new film All Is True, a fictionalized look at the final years of the playwright. Set in 1613, the film finds Shakespeare acknowledged as the greatest writer of the age, but disaster strikes when his renowned Globe Theatre burns to the ground. Devastated, Shakespeare returns to Stratford, where he must face a troubled past and a neglected family — wife Anne (Judi Dench) and two daughters, Susanna (Lydia Wilson) and Judith (Kathryn Wilder). The large ensemble cast also includes Ian McKellen as the Earl of Southampton.

I sat down with Branagh — whose credits include directing such non-Shakespeare movies as Thor, Cinderella and Murder on the Orient Express and acting in Dunkirk and Harry Potter and the Chamber of Secrets — to talk about making the film and his workflow.

You’ve played many of Shakespeare’s characters in film or on stage. Was it a dream come true to finally play the man himself, or was it intimidating?
It was a dream come true, as I feel like he’s been a guide and mentor since I discovered him at school. And, rather like a dog, he’s given me unconditional love ever since. So I was happy to return some. It’s easy to forget that he was just a guy. He was amazing and a genius, but first and foremost he was a human being.

What kind of film did you hope to make?
A chamber piece, a character piece that took him out of his normal environment. I didn’t want it to be the predictable romp inside a theater, full of backstage bitching and all that sort of theatricality. I wanted to take him away from that and put him back in the place he was from, and I also wanted to load the front part of the movie with silence instead of tons of dialogue.

How close do you feel it gets to the reality of his final years?
I think it’s very truthful about Stratford. It was a very litigious society, and some of the scenes — like the one where John Lane stands up in church and makes very public accusations — all happened. His son Hamnet’s death was unexplained, and Shakespeare did seem to be very insecure in some areas. He wanted money and success, and he lived in a very volatile world. If he was supposed to be this returning hero coming back to the big house and a warm welcome from his family, whom he hadn’t seen much of over the past two decades, it didn’t quite happen that way. No, he was this absentee dad and husband, and the town had an ambivalent relationship with him; it wasn’t a peaceful retirement at all.

The film is visually gorgeous, and all the candlelit scenes reminded me of Barry Lyndon.
I’m so glad you said that, as DP Zac Nicholson and I were partly inspired by that film and that look, and we used only candlelight and no additional lights for those scenes. Painters like Vermeer and Rembrandt were our inspiration for the day and night scenes, respectively.

Clint Eastwood told me, “Don’t ever direct and star in a movie unless you’re a sucker for punishment — it’s just too hard.” So how hard was it?
(Laughs) He’s right. It is very hard, and a lot of work, but it’s also a big privilege. But I had a lot of great help — the crew and people like Judi and Ian. They had great suggestions, and you listen to every tidbit they have to offer. I don’t know how Clint does it, but I do a lot of listening and stealing. The directing and acting are so interlinked to me, and I love directing as I get to watch Ian and Judi work, and they’re such hard workers. Judi literally gets to the set before anyone else, and she’s pacing up and down and getting ready to defend Anne Hathaway. She has this huge empathy for her characters, which you feel so much, and here she was giving voice to a woman who could not read or write.

Where did you post?
We were based at Longcross Studios, where we did Murder on the Orient Express and the upcoming Artemis Fowl. We did most of it there, and then we ended up at The Post Republic, which has facilities in London and Berlin, for the final finishing. Then we did all the final mixing at Twickenham with the great re-recording mixer Andy Nelson and his team; it was my second picture with him in that role. I am completely present throughout and completely involved in the final mix.

Do you like the post process?
I love it. It’s the place where I understood, right from my first film, that it could make — in terms of performance — a good one bad, a good one great, a bad one much better. The power of change in post is just amazing to me, and realizing that anything is possible if you have the imagination. So the way you juxtapose the images you’ve collected — and the way a scene from the third act might actually work better in the first act — is so huge in post. That fluidity was a revelation to me, and you can have these tremendous eureka moments in post that can be beautiful and so inspiring.

Can you talk about working with editor Una Ni Dhongaile, who cut The Crown and won a BAFTA for Three Girls?
She’s terrific. She wasn’t on the set but we talked a lot during the shoot. I like her because she really has an opinion. She’s definitely not a “yes” person, but she’s also very sensitive. She also gets very involved with the characters and protects you as a director. She won’t let you cut too soon or too deep, and she encourages you to take a moment to think about stuff. She’s one of those editors who has this special kind of intuition about what the film needs, in addition to all her technical skills and intellectual understanding of what’s going on.

What were the big editing challenges?
We did a lot of very long takes and used the very best of them, and despite the very painterly style, we didn’t want the film to feel too static. We didn’t want to cut falsely or artificially just to affect the pace, but to let it flow naturally so every minute was earned. We also didn’t want to be afraid of holding a particular shot for a long time. We definitely needed pauses and rests, and Shakespeare is musical in his poetry and the way he juxtaposes fast and slow moments. So all those decisions were critical and needed mulling as well as executing.

Talk about the importance of sound and music, as it’s a very quiet film.
It’s absolutely critical in a world like this, where light and sound play huge roles and are so utterly different from our own modern understanding of them. The aural space you can offer an audience here was a big chance to adventure back in time, when the world was far more sparsely populated. Especially in a little place like Stratford, silence played a big role as well. You’re offering a hint of the outside world, and the aural landscape is really the bedrock for all the introspection and thoughtfulness this movie deals with.

Patrick Doyle’s music has this gossamer approach — that was the word we used. It was like a breath, so that the whole sound experience invited the audience into the meditative world of Shakespeare. We wanted them to feel the seasons pass, the wind in the trees, and how much more was going on than just the man thinking about his past. It was the experience of returning home and being with this family again, so you’d hear a creak of a chair and it would interrupt his thoughts. So we worked hard on every little detail like that.

Where did you do the grading and coloring?
Post Republic in their North London facility, and again, I’m involved every step of the way.

Did making this film change your views about Shakespeare the man?
Yes, and it was an evolving thing. I’ve always been drawn to his flawed humanity, so it seemed real to be placing this man in normal situations and have him be right out of his comfort zone at the start of the film. So you have this acclaimed, feted and busy playwright, actor, producer and stage manager suddenly back on the dark side of the moon, which Stratford was back then. It was a small town, a three-day trip from London, and it must have been a shock. It was candlelight and recrimination. But I think he was a man without pomp. His colleagues most often described him as modest and gentle, so I felt a vulnerability that surprised me. I think that’s authentic to the man.

What’s next for you?
Disney’s Artemis Fowl, the fantasy-adventure based on the books, which will be released on May 29, and then I start directing Death on the Nile for Fox, which starts shooting late summer.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and The Boston Globe.


Showrunner: Eric Newman of Netflix’s Narcos: Mexico

By Iain Blair

Much like the drugs that form the dark heart of Narcos: Mexico, the hit Netflix crime drama is full of danger, chills and thrills — and is highly addictive. It explores the origins of the modern, ultra-violent drug war by going back to its roots, beginning at a time when the Mexican trafficking world was a loose and disorganized confederation of independent growers and dealers. But that all changed with the rise of the Guadalajara Cartel in the 1980s, as Félix Gallardo (Diego Luna) — the real-life former Sinaloan police officer turned drug lord — takes the helm, unifying traffickers in order to build an empire.

L-R: Director José Padilha and producer Chris Brancato bookend Eric Newman on the set of Narcos, Season 1.

The show also follows DEA agent Kiki Camarena (Michael Peña), who moves his wife and young son from California to Guadalajara to take on a new post. He quickly learns that his assignment will be more challenging than he ever could have imagined. As Kiki garners intelligence on Félix and becomes more entangled in his mission, a tragic chain of events unfolds, affecting the drug trade and the war against it for years to come.

Narcos showrunner, writer and executive producer Eric Newman is a film and television veteran whose resume includes the Academy Award-nominated Children of Men, as well as Dawn of the Dead, The Last Exorcism and Bright. After more than 20 years in the movie industry, Newman transitioned into television as an executive producer on Hemlock Grove for Netflix. It was his curiosity about the international drug trade that led him to develop and executive produce his passion project Narcos, and Newman assumed showrunning responsibilities at the end of its first season. Narcos: Mexico initially started out as the fourth season of Narcos before Netflix decided to make it a stand-alone series.

I recently spoke with Newman about making the show, his involvement in post and another war that’s grabbed a lot of headlines — the one between streaming platforms and traditional cinema.

Do you like being a showrunner?
Yeah! There are aspects of it I really love. I began toward the end of the first season and there was this brief period where I tried not to be the showrunner, even though it was my show. I wasn’t really a writer — I wasn’t in the WGA — so I had a lot of collaborators, but I still felt alone in the driver’s seat. It’s a huge amount of work, from the writing to the shoot and then post, and it never really ends. It’s exhausting but incredibly rewarding.

What are the big challenges of running this show?
If I’d known more about TV at the time, I might have been far more frightened than I was (laughs). The big one is dealing with all the people and personalities involved. We have anywhere between 200 and 400 people working on the show at any given time, so it’s tricky. But I love working with actors, I think I’m a good listener, and any major problems are usually human-oriented. And then there’s all the logistics and moving parts. We began the series shooting in Colombia and then moved the whole thing to Mexico, so that was a big challenge. But the cast and crew are so great, we’re like a big family at this point, and it runs pretty smoothly now.

How far along are you with the second season of Narcos: Mexico?
We’re well into it, and while it’s called Season Two, the reality for us is that it’s the fifth season of a long, hard slog.

This show obviously deals with a lot of locations. How difficult is it when you shoot in Mexico?
It can be hard and grueling. We’re shooting entirely in Mexico — nothing in the States. We shot in Colombia for three years and we went to Panama once, and now we’re all over Mexico — from Mexico City to Jalisco, Puerto Vallarta, Guadalajara, Durango and so on.

It’s also very dangerous subject matter, and one of your location scouts was murdered. Do you worry about your safety?
That was a terrible incident, and I’m not sure whoever shot him even knew he was a location scout on our show. The reality is that a number of incredibly brave journalists, who had nowhere near the protection we have, had already shared these stories — and many were killed for it. So in many ways we’re late to the party.

Of course, you have to be careful anywhere you go, but that’s true of every city. You can find trouble in LA or New York if you are in the wrong place. I don’t worry about the traffickers we depict, as they’re mainly all dead now or in jail, and they seem OK with the way they’re depicted… that it’s pretty truthful. I worry a little bit more about the police and politicians.

Where do you post and do you like the post process?
I absolutely love post, and I think it’s a deeply underrated and under-appreciated aspect of the show. We’ve pulled off far more miracles in post than in any of the writing and shooting. We do all the post at Lantana in LA with the same great team that we’ve had from the start, including post producer Tim King and associate post producer Tanner King.

When we began the series in Colombia, we were told that Netflix didn’t feel comfortable having the footage down there for editing because of piracy issues, and that worked for me. I like coming back to edit and then going back down to Mexico to shoot. We shoot two episodes at a time and cut two at a time. I’m in the middle of doing fixes on Episode 2 and we’re about to lock Episode 3.

Talk about editing. You have several editors, I assume because of the time factor. How does that work?
We have four full-time editors — Iain Erskine, Garret Donnelly, Monty DeGraff and Jon Otazua — who each take separate episodes, plus we have one editor dedicated to the archival package, which is a big part of the show. We’ve also promoted two assistant editors, which I’m very proud of. That’s a nice part of being on a show that’s run for five years; you can watch people grow and move them up the ladder.

You have a huge cast and a lot of moving pieces in each episode. What are the big editing challenges?
We have a fair amount of coverage to sort through, and it’s always about telling the story and the pacing — finding the right rhythm for each scene.

This show has a great score and great sound design. Where do you mix, and can you talk about the importance of sound and music?
We do all the mixing at Technicolor, and we have a great team that includes supervising sound editor Randle Akerson and supervising ADR editor Thomas Whiting. (The team also includes sound effects editors Dino R. DiMuro and Troy Prehmus, dialogue editor David Padilla, music editor Chris Tergesen, re-recording mixers Pete Elia and Kevin Roache and ADR mixer Judah Getz.)

It’s all so crucial. All you have to do is look at a rough edit without any sound or music, and it’s just so depressing. I come from a family of composers, so I really appreciate this part of post. Composer Gustavo Santaolalla has done a fantastic job, and the music’s changed a bit since we moved to Mexico. I’m fairly involved with all of it. I get a final playback, and maybe I’ll have a few notes, but generally the team has got it right.

In 2017, you and producer Bryan Unkeless formed Screen Arcade, a production company based at Netflix with deals for features and television. I heard you have a new movie you’re producing for Netflix, PWR, with Jamie Foxx and Joseph Gordon-Levitt?
It’s all shot, and we’re just heading into the director’s cut. We’re posting in New York and have our editorial offices there. Netflix is so great to partner with. They care as much about the quality of image and sound as any studio I’ve ever worked with — and I’ve worked with everyone. In terms of the whole process and deliverables, there’s no difference.

It’s interesting because there’s been a lot of pushback against Netflix and other streaming platforms from the studios, purists and directors like Steven Spielberg. Where do you see the war for cinema’s future going?
I think it’ll be driven entirely by audience viewing habits, as it should be. Some of my all-time favorite movies — The Bridge on the River Kwai, Taxi Driver, Sunset Boulevard, Barry Lyndon — I never saw in a movie house.

Cinema exhibition is a business. They want Black Panther and Star Wars, so it’s a commerce argument, not a creative one. With all due respect to Spielberg, no one can dictate viewing habits. Maybe for now they can deny Netflix and streaming platforms Academy Awards, but not forever.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and The Boston Globe.


DP Chat: The Man in the High Castle’s Gonzalo Amat

By Randi Altman

Amazon’s The Man in the High Castle is based on the 1962 Philip K. Dick novel, which asks the question: “What would it look like if the Germans and Japanese had won World War II?” It takes a look at the Nazi and Japanese occupation of portions of the United States and the world. But it’s a Philip K. Dick story, so you know there is more to it than that… like an alternate reality.

The series will premiere its fourth and final season this fall on the streaming service. We recently reached out to cinematographer Gonzalo Amat, who was kind enough to talk to us about workflow and more.

How did you become interested in cinematography?
Since I was very young, I have had a strong interest in photography and have been shooting stills for as long as I can remember. Then, when I was maybe 10 or 12 years old, I discovered that movies also had a photographic aspect. I didn’t think about doing it until I was already in college studying communications, and that is when I decided to make it my career.

What inspires you artistically? And how do you simultaneously stay on top of advancing technology?
Artistically, I get inspiration from a lot of sources, such as photography, film, literature, painting or any visual medium. I try to curate what I consume, though. I believe that everything we feed our brain somehow shows up in the work we do, so I am very careful about consuming films, books and photography that feed the story I will be working on. I think any creation can be inspiration, from a film masterpiece to a picture drawn by a kid, to music, performance art, historical photographs or testimonies.

About staying on top: I read trade magazines and stay educated through seminars and courses, but at some point, it’s also about using those tools. So I try to test the tools instead of reading about them. Almost any rental house or equipment company will let you try newer tools. If I’m shooting, we try to schedule a test of a particular piece of equipment we want to use during a light day.

What new technology has changed the way you work?
The main new technology would be the migration of most projects to digital. That has changed the way we work on set and collaborate with the directors, since everyone can now see, on monitors, something closely resembling the final look of the project.

A lot of people think this is a bad thing, but for me, it actually allows clearer communication about the concrete aspects of a sometimes very personal vision. Terms like dark, bright or colorful are very subjective, so having a reference is a good point from which to continue the conversation.

Also, digital technology has helped us use more available light on interiors and less light on exterior nights. Still, it hasn’t reached the latitude of film, where you could just let the windows burn. It’s trickier for exterior day shots, where I think you end up needing more control. I would also say that the evolution of visual effects as a more invisible tool has helped us achieve a lot more from a storytelling perspective and has affected the way we shoot scenes in general.

What are some of your best practices, or rules you try to follow on each job?
Each project is different, so I try to learn how that particular project will be. But there are some time-tested rules that I try to implement. The main one is to always go for the story; every answer is always in the script. Another main rule is communication: being open about questions, even if they seem silly. It’s always good to ask.

Another rule is listening to ideas. The people on my team are very experienced and often have solutions to the problems that come up. If you are open to ideas, more ideas will come, and people will do their jobs with more intention and commitment. Gratitude, respect, collaboration, communication and being conscious about safety are all important parts of my process.

Gonzalo Amat on set

Explain your ideal collaboration with the director when setting the look of a project.
Every director is different, so I look at each new project as an opportunity to learn. As a DP, you have to learn and adapt, since throughout your career you will be asked for different levels of involvement. Because of my interest in storytelling, I personally prefer a bit more of a hands-off approach from directors, talking more about story and concepts, collaborating on setting up the shots to cover a scene, and the same with lighting: talking moods and concepts that get polished once we are on set. Some directors will be very specific, and that is a challenge, because you have to deliver what is inside their heads and hopefully make it better. I still enjoy that challenge, because it also makes you work for someone else’s vision.

Ideally, developing the look of a project comes from reading the script together and watching movies and references together. This is when you can say “dark like this” or “moody like this” because visual concepts are very subjective, and so is color. From then on, it’s all about breaking up the script and the visual tone and arc of the story, and subsequently all the equipment and tools for executing the ideas. Lots of meetings as well as walking the locations with just the director and DP are very useful.

How would you describe the overarching look of the show?
Basically, the main visual concept of this project is based on film noir, and our main references were The Conformist and Blade Runner. As we went along, we added more character-based visual ideas inspired by projects like In the Mood for Love and, for framing, The Insider.

The main idea is to visually portray the worlds of the characters through framing and lighting. Sometimes, we play it the way the script tells us; sometimes we counterpoint visually what it says, so we can make the audience respond in an emotional way. I see cinematography as the visual music that makes people respond emotionally to different moods. Sometimes it’s more subtle and sometimes more obvious. We prefer to not be very intrusive, even though it’s not a “realist” project.

How early did you get involved in the production?
I start four or five weeks before the season. Even if I’m not doing the first episode, I will still be there to prepare new sets and do some tests for new equipment or characters. Preparation is key in a project like this, because once production starts, time is very limited.

Did you start out on the pilot? Did the look change from season to season at all?
James Hawkinson did the pilot, and I came in when the series got picked up. He set up the main visual concepts, and when it went to series, I adapted some of the requirements from the studio and the notes from Ridley Scott into the style we see now.

The look has been evolving from season to season, as we feel we can be bolder with the visual language of the show. If you look at the pilot all the way to the end of Season 3, or Season 4, which is filming, you can definitely see a change, even though it still feels like the same project — the language has been polished and distilled. I think we have reached the sweet spot.

Does the look change at all when the timelines shift?
Yes, all of the timelines require a different look and approach to lighting and camera use. The art design and wardrobe change as well, so we combine all those subtle changes to give each world, place and timeline a different feel. We have lots of conceptual meetings, where we develop the look and feel of each timeline and place. Once these concepts are established, the team gets to work constructing the sets and needed visual elements, and then we go from there.

This is a period piece. How did that affect the look, if at all?
We have tried to give it a specific and unique look that still feels tied to the time period. So, yes, the fact that this happens in our own version of the ’60s has determined the look, feeling and language of the series. We base our aesthetics on what the real world was like in 1945, the point from which our story diverges to form this alternate world.

The 1960s of the story are not the real 1960s, because there is no USA and no free Europe, which means most of the music and wardrobe doesn’t look like the 1960s we know. There are many Nazi and Japanese design elements in the visuals that distinguish the show from a regular 1960s look, but it still feels period.

How did you go about choosing the right camera and lenses for this project?
Because we had a studio mandate to finish in 4K, the Red One with Zeiss Master Prime lenses was chosen for the pilot, so when I came on we inherited that tech. We stuck with it for the first season, but after a few months of shooting we adapted the lens list, filters and lighting. For Season 2, we pushed to change to an ARRI Alexa camera, so we ended up adjusting all the equipment around this new camera and its characteristics — such as needing less light, so we ended up with less lighting equipment.

We also added classic Mitchell Diffusion Filters and some zooms. Lighting and grip equipment have been evolving toward less and less equipment since we light less and less. It’s a constant evolution. We also looked at some different lens options in the season breaks, but we haven’t added them because we don’t want to change our budget too much from season to season, and we use them as required.

Any challenging scenes that you are particularly proud of in Season 3?
I think the most challenging scene was the one in the Nebenwelt tunnel set. We had numerous meetings about what this tunnel was as a concept and then, based on that concept, had to find a way to execute it visually. We wanted to make sure that the look of the scene matched the concepts of quantum physics within the story.

I wanted to achieve lighting that felt almost like plasma. We decided to put a mirror at the end of the tunnel with circle lighting right above it. We then created the effect of the space travel with a blast of light — lightning strikes from an elaborate setup that collectively used more than a million watts. It was a complex setup, but fortunately we had a lot of very talented people come together to execute it.

What’s your go-to gear (camera, lens, mount/accessories) — things you can’t live without?
On this project, I’d say it’s the 40mm lens. I don’t think this project would have the same vibe without this lens. Then, of course, I love the Technocrane, but we don’t use it every day, for budgetary and logistical reasons.

For other projects, I would say the ARRI Alexa camera and the 40mm and handheld accessories. You can do a whole movie with just those two; I have done it, and it’s liberating. But if I had an unlimited budget, I would love to use a Technocrane every day with a stabilized remote head.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 


Collaboration company Pix acquires Codex

Pix has reached an agreement to acquire London-based Codex, in a move that will enable both companies to deliver a range of new products and services, from streamlined camera capture to post production finishing.

The Pix System is a collaboration tool that provides industry pros with secure access to production content on mobile devices, laptops or TVs from offices, homes or while traveling. The company won an Oscar for its technology in 2019.

Codex products include recorders and media processing systems that transfer digital files and images from the camera to post, and tools for color dynamics, dailies creation, archiving, review and digital asset management.

“Our clients have relied on Pix to protect their material and ideas throughout all phases of production. In Codex, we found a group that similarly values relationships with attention to critical details,” explains Pix founder/CEO Eric Dachs. “Codex will retain its distinct brand and culture, and there is a great deal we can do together for the benefit of our clients and the industry.”

Over the years, Pix and Codex have seen wide industry adoption, with a proven record of delivering value to their clients. Introduced in 2003, Pix soon became a trusted and widely used secure communication and content management provider. The Pix System enables creative continuity and reduces project risk by ensuring that ideas are accurately shared, stored and preserved throughout the entire production process.

“Pix and Codex are complementary, trusted brands used by leading creatives, filmmakers and studios around the world,” says Codex managing director Marc Dando. “The integration of both services into one simplified workflow will deliver the industry a fast, secure, global collaborative ecosystem.”

With the acquisition of Codex, Pix will expand its servicing reach across the globe. Pix founder Dachs will remain as CEO, and Dando will take on the role of chief design officer at Pix, with a focus on existing and new products.

NAB 2019: postPerspective Impact Award winners

postPerspective has announced the winners of our Impact Awards from NAB 2019. Seeking to recognize debut products with real-world applications, the postPerspective Impact Awards are voted on by an anonymous judging body made up of respected industry artists and pros (to whom we are very grateful). It’s working pros who are going to be using these new tools — so we let them make the call.

It was fun watching the user ballots come in and discovering which products most impressed our panel of post and production pros. There are no entrance fees for our awards. All that is needed is the ability to impress our voters with products that have the potential to make their workdays easier and their turnarounds faster.

We are grateful for our panel of judges, which grew even larger this year. NAB is exhausting for all, so their willingness to share their product picks and takeaways from the show isn’t taken for granted. These men and women truly care about our industry and sharing information that helps their fellow pros succeed.

To be successful, you can’t operate in a vacuum. We have found that companies that listen to their users, and make changes and additions accordingly, are the ones that earn the respect and business of working pros. They aren’t just providing tools they think are needed; they are actively asking for feedback. So, congratulations to our winners, and keep listening to what your users are telling you — good or bad — because it makes a difference.

The Impact Award winners from NAB 2019 are:

• Adobe for Creative Cloud and After Effects
• Arraiy for DeepTrack with The Future Group’s Pixotope
• ARRI for the Alexa Mini LF
• Avid for Media Composer
• Blackmagic Design for DaVinci Resolve 16
• Frame.io
• HP for the Z6/Z8 workstations
• OpenDrives for Apex, Summit, Ridgeview and Atlas

(All winning products reflect the latest version of the product, as shown at NAB.)

Our judges also provided quotes on specific projects and trends that they expect will have an impact on their workflows.

Said one, “I was struck by the predicted impact of 5G. Verizon is planning to have 5G in 30 cities by the end of the year. The improved performance could reach 20x speeds. This will enable more leverage of cloud technology.

“Also, AI/ML is said to be the single most transformative technology in our lifetime. Impact will be felt across the board, from personal assistants, medical technology, eliminating repetitive tasks, etc. We already employ AI technology in our post production workflow, which has saved tens of thousands of dollars in the last six months alone.”

Another echoed those thoughts on AI and the cloud as well: “AI is growing up faster than anyone can reasonably productize. It will likely be able to do more than first thought. Post in the cloud may actually start to take hold this year.”

We hope that postPerspective’s Impact Awards give those who weren’t at the show, or who were unable to see it all, a starting point for their research into new gear that might be right for their workflows. Another way to catch up? Watch our extensive video coverage of NAB.

NAB 2019: An engineer’s perspective

By John Ferder

Last week I attended my 22nd NAB, and I’ve got the Ross lapel pin to prove it! This was a unique NAB for me. I attended my first 20 NABs with my former employer, and most of those had me setting up the booth visits for the entire contingent of my co-workers and making sure the vendors knew we were at each booth and were ready to go. Thursday was my “free day” to go wandering through the equipment, cables, connectors, test gear and everything else on my list.

This year, I’m part of a new project, so I went with a shopping list and a rough schedule with the vendors we needed to see. While I didn’t get everywhere I wanted to go, the three days were very full and very rewarding.

Beck Video IP panel

Sessions and Panels
I also got the opportunity to attend the technical sessions on Saturday and Sunday. I spent my time at the BEITC in the North Hall and the SMPTE Future of Cinema Conference in the South Hall. Beck TV gave an interesting presentation on constructing the IP-based facilities of the future. While SMPTE ST 2110 has been completed and issued, there are still implementation issues, as NMOS is still being developed. Today’s systems are, and will be for the time being, hybrid facilities. The decision to be made is whether a facility will be built on an IP routing switcher core with gateways to SDI, or on an SDI routing switcher core with gateways to IP.

Although more expensive, building around an IP core would be more efficient and future-proof. Fiber infrastructure design, test equipment and finding engineers who are proficient in both IP and broadcast (the “Purple Squirrels”) are large challenges as well.

A lot of attention was also paid to cloud production and distribution, both in the BEITC and the FoCC. One such presentation, at the FoCC, was on VFX in the cloud with an eye toward the development of 5G. Nathaniel Bonini of BeBop Technology reported that BeBop has a new virtual studio partnership with Avid, and that the cloud allows tasks to be performed in a “massively parallel” way. He expects that 5G mobile technology will facilitate virtualization of the network.

VFX in the Cloud panel

Ralf Schaefer, of the Fraunhofer Heinrich-Hertz Institute, expressed his belief that all devices will attach to the cloud via 5G, resulting in no cables and no mobile storage media. For AR/VR distribution, 5G will render the scene in the network and transmit it directly to the viewer. Denise Muyco of StratusCore provided a link to a virtual workplace: https://bit.ly/2RW2Vxz. She felt that 5G would speed up the collaboration process between artist and client, making it nearly “friction-free.” While there are always security concerns, 5G would also help prosumer creators provide more content.

Chris Healer of The Molecule stated that 5G should help compress VFX and production workflows, enable cloud computing to work better and perhaps provide realtime feedback for near-perfect scene shots, showing line composites of VR renders to production crews in remote locations.

The Floor
I was very impressed with a number of manufacturers this year. Ross Video demonstrated new capabilities of Inception and OverDrive. Ross also showed its new Furio SkyDolly three-wheel rail camera system. In addition, 12G single-link capability was announced for Acuity, Ultrix and other products.

ARRI AMIRA (Photo by Cotch Diaz)

ARRI showed a cinematic multicam system built using the AMIRA camera with a DTS FCA fiber camera adapter back and a base station controllable by Sony RCP1500 or Skaarhoj RCP. The Sony panel will make broadcast-centric people comfortable, but I was very impressed with the versatility of the Skaarhoj RCP. The system is available using either EF, PL, or B4 mount lenses.

During the show, I learned from one of the manufacturers that one of my favorite OLED evaluation monitors is going to be discontinued. This was bad news for the new project I’ve embarked on. Then we came across the Plura booth in the North Hall, where Plura was showing a new OLED monitor, the PRM-224-3G. It is a 24.5-inch diagonal OLED featuring two 3G/HD/SD-SDI and three analog inputs, built-in waveform monitors and vectorscopes, LKFS audio measurement, PQ and HLG support, 10-bit color depth, 608/708 closed caption monitoring and more, at a very attractive price.

Sony showed the new HDC-3100/3500 3xCMOS HD cameras with global shutter. These have an upgrade program to UHD/HDR with an optional processor board and signal format software, as well as a 12G-SDI extension kit. There is an optional single-mode fiber connector kit to extend the maximum distance between camera and CCU to 10 kilometers. The CCUs work with the established 1000/1500 series of remote control panels and master setup units.

Sony’s HDC-3100/3500 3xCMOS HD camera

Canon showed its new line of 4K UHD lenses. One of my favorite lenses has been the HJ14ex4.3B HD wide-angle portable lens, which I have installed in many of the studios I’ve worked in. Canon showed the CJ14ex4.3B at NAB, and I was even more impressed with it. The 96.3-degree horizontal angle of view is stunning, and the minimization of chromatic aberration is carried over and perhaps improved from the HJ version. It features correction data that support the BT.2020 wide color gamut, and it works with the existing zoom and focus demand controllers for earlier lenses, so it’s easily integrated into existing facilities.
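
As a quick sanity check on that angle-of-view figure, the standard pinhole formula gets you there. The 9.59mm image width assumed below is the nominal active width of a 2/3-inch broadcast sensor, which is my assumption, not a number from Canon:

```python
import math

def horizontal_aov_deg(sensor_width_mm, focal_length_mm):
    """Horizontal angle of view via the standard pinhole-camera formula."""
    return 2 * math.degrees(math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Assuming the nominal 9.59mm image width of a 2/3-inch broadcast sensor:
print(horizontal_aov_deg(9.59, 4.3))  # ~96.2 degrees, matching the quoted 96.3
```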

Foot Traffic
The official total of registered attendees was 91,460, down from 92,912 in 2018. The Evertz booth was actually easy to walk through at 10 a.m. on Monday, which I found surprising given the breadth of interesting new products and technologies Evertz had to show this year. The South Hall had the big crowds, but Wednesday seemed emptier than usual, almost like a Thursday.

The NAB announced that next year’s exhibition will begin on Sunday and end on Wednesday. That change might boost overall attendance, but I wonder how adversely it will affect the attendance at the conference sessions themselves.

I still enjoy attending NAB every year, seeing the new technologies and meeting with colleagues and former co-workers and clients. I hope that next year’s NAB will be even better than this year’s.

Main Image: Barbie Leung.


John Ferder is the principal engineer at John Ferder Engineer, currently Secretary/Treasurer of SMPTE, an SMPTE Fellow, and a member of IEEE. Contact him at john@johnferderengineer.com.

NAB 2019: A cinematographer’s perspective

By Barbie Leung

As an emerging cinematographer, I always wanted to attend an NAB show, and this year I had my chance. I found that no amount of research can prepare you for the sheer size of the show floor, not to mention the backrooms, panels and after-hours parties. As a camera operator as well as a cinematographer who is invested in the post production and exhibition end of the spectrum, I found it absolutely impossible to see everything I wanted to or catch up with all the colleagues and vendors I wanted to. This show is a massive and draining ride.

Panasonic EVA1

There was a lot of buzz in the ether about 5G technology. The consensus seems to be that fast, accurate 5G will be the tipping point in implementing a lot of the tech that’s been talked about for years but hasn’t quite taken off yet, including autonomous vehicles and 8K streaming stateside.

It’s hard to deny the arrival of 8K technology while staring at the detail and textures on an 80-inch Sharp 8K professional display. Every roof tile, every wave in the ocean is rendered in rich, stunning detail.

In response to the resolution race, on the image-capture end of things, ARRI had already announced and started taking orders for the Alexa Mini LF — its long-awaited entry into the large-format game — in the week before NAB.

Predictably, at NAB we saw many lens manufacturers highlighting full-frame coverage. Canon introduced its Sumire Prime lenses, while Fujinon announced the Premista 28-100mm T2.9 full-format zoom.

Sumire Prime lenses

Camera folks, including many ASC members, are embracing large format capture for sure, but some insist the appeal lies not so much in the increased resolution, but rather in the depth and overall image quality.

Meanwhile, back in 35mm-sensor land, Panasonic continues its energetic push of the EVA1 camera. Aside from presentations at its booth emphasizing “cinematic” images from this compact 5.7K camera, Panasonic has done a subtle but not-too-subtle job of disseminating the EVA1 throughout the trade show floor. At the Atomos booth, you’ll find director/cinematographers like Elle Schneider presenting work shot on the EVA1 with an Atomos recorder, the camera balanced on a Ronin-S, and if you stop by Tiffen you’ll find an EVA1 being flown next to the Alexa Mini.

I found a ton of motion control at the show, from Shotover’s new compact B1 gyro-stabilized camera system to the affable folks at Arizona-based Defy, who showed off their Dactylcam Pro, an addictively smooth-to-operate cable-suspension rig. The Bolt Cinebot demoed its high-speed robotic arms, complete with a spinning hologram.

Garrett Brown at the Tiffen booth.

All this new gimbal technology is an ever-evolving game changer. Steadicam inventor Garrett Brown was on hand at the Tiffen booth to show the new M2 sled, which has motors elegantly built into the base. He enthusiastically heralded that camera operators can go faster and more “dangerously” than ever. There was so much motion control that it vied for attention alongside all the talk of 5G, 8K and LED lighting.

Some veterans expressed that this year’s show felt “less exciting” than shows of the past eight to 10 years. There were fewer big product launch announcements, perhaps because in past years companies were sometimes unable to fulfill the rush of post-NAB orders for 12 or even 18 months. Vendors have become more conservative with what they hype and more careful with what they promise.

For a new attendee like me, there was more than enough new tech to explore. Above all else, NAB is really about the people you meet. The tech will be new next year, but the relationships you start and build at NAB are meant to last a career.

Main Image: ARRI’s Alexa Mini LF.


Barbie Leung is a New York-based cinematographer and camera operator working in independent film and branded content. Her work has played Sundance, the Tribeca Film Festival and Outfest. You can follow her on Instagram at @barbieleungdp.

Colorfront at NAB with 8K HDR, product updates

Colorfront, which makes on-set dailies and transcoding systems, has rolled out new 8K HDR capabilities and updates across its product lines. The company has also deepened its technology partnership with AJA and entered into a new collaboration with Pomfort to bring more efficient color and HDR management on-set.

Colorfront Transkoder is a post workflow tool for handling UHD, HDR camera, color and editorial/deliverables formats, with recent customers such as Sky, Pixelogic, The Picture Shop and Hulu. With a new HDR GUI, Colorfront’s Transkoder 2019 performs realtime decompression/de-Bayer/playback of Red and Panavision DXL2 8K R3D material, displayed on a Samsung 82-inch Q900R QLED 8K Smart TV in HDR at full 8K resolution (7680 x 4320). The de-Bayering process is optimized through Nvidia GeForce RTX graphics cards with the Turing GPU architecture (also available in Colorfront On-Set Dailies 2019), with 8K video output (up to 60p) via AJA Kona 5 video cards.
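
Some rough arithmetic shows why this kind of playback leans on GPU horsepower. Using only the frame size and rate quoted above (the calculation is mine, not Colorfront’s):

```python
# Back-of-envelope throughput for realtime 8K playback. The frame size and
# rate come from the text above; the arithmetic is mine, not Colorfront's.
width, height, fps = 7680, 4320, 30
pixels_per_second = width * height * fps
print(f"{pixels_per_second / 1e9:.2f} Gpixels/s to de-Bayer")  # ~1.00
```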

“8K TV sets are becoming bigger, as well as more affordable, and people are genuinely awestruck when they see 8K camera footage presented on an 8K HDR display,” said Aron Jaszberenyi, managing director, Colorfront. “We are actively working with several companies around the world originating 8K HDR content. Transkoder’s new 8K capabilities — across on-set, post and mastering — demonstrate that 8K HDR is perfectly accessible to an even wider range of content creators.”

Powered by a re-engineered version of Colorfront Engine and featuring the HDR GUI and 8K HDR workflow, Transkoder 2019 supports camera/editorial formats including Apple ProRes RAW, Blackmagic RAW, ARRI Alexa LF/Alexa Mini LF and Codex HDE (High Density Encoding).

Transkoder 2019’s mastering toolset has been further expanded to support Dolby Vision 4.0 as well as Dolby Atmos for the home with IMF and Immersive Audio Bitstream capabilities. The new Subtitle Engine 2.0 supports CineCanvas and IMSC 1.1 rendering for preservation of content, timing, layout and styling. Transkoder can now also package multiple subtitle language tracks into the timeline of an IMP. Further features support fast and efficient audio QC, including solo/mute of individual tracks on the timeline, and a new render strategy for IMF packages enabling independent audio and video rendering.

Colorfront also showed the latest versions of its On-Set Dailies and Express Dailies products for motion pictures and episodic TV production. On-Set Dailies and Express Dailies both now support ProRes RAW, Blackmagic RAW, ARRI Alexa LF/Alexa Mini LF and Codex HDE. As with Transkoder 2019, the new version of On-Set Dailies enables realtime 8K HDR workflows, supporting a set-to-post pipeline from HDR playback through QC and rendering of HDR deliverables.

In addition, AJA Video Systems has released v3.0 firmware for its FS-HDR realtime HDR/WCG converter and frame synchronizer. The update introduces enhanced coloring tools together with several other improvements for broadcast, on-set, post and pro AV HDR production developed by Colorfront.

A new, integrated Colorfront Engine Film Mode offers an ACES-based grading and look creation toolset with ASC Color Decision List (CDL) controls, built-in LOOK selection including film emulation looks, and variable Output Mastering Nit Levels for PQ, HLG Extended and P3 colorspace clamp.
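
For readers unfamiliar with the ASC CDL mentioned above, it is a deliberately simple, published transform: per-channel slope, offset and power, followed by an overall saturation adjustment. Here is a minimal sketch of that generic math, the published formula rather than Colorfront’s implementation:

```python
import numpy as np

def apply_cdl(rgb, slope, offset, power, saturation=1.0):
    """Minimal ASC CDL: per-channel slope/offset/power, then saturation.
    rgb is a float array shaped (..., 3). Generic published math only,
    not Colorfront's implementation."""
    out = np.clip(rgb * slope + offset, 0.0, None) ** power
    luma = out @ np.array([0.2126, 0.7152, 0.0722])  # Rec.709 luma weights
    return luma[..., None] + saturation * (out - luma[..., None])
```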

Since launching in 2018, FS-HDR has been used on a wide range of TV and live outside broadcast productions, as well as motion pictures including Paramount Pictures’ Top Gun: Maverick, shot by Claudio Miranda, ASC.

Colorfront licensed its HDR Image Analyzer software to AJA for AJA’s HDR Image Analyzer in 2018. A new version of AJA HDR Image Analyzer is set for release during Q3 2019.

Finally, Colorfront and Pomfort have teamed up to integrate their respective HDR-capable on-set systems. This collaboration, harnessing Colorfront Engine, will include live CDL reading in ACES pipelines between Colorfront On-Set/Express Dailies and Pomfort LiveGrade Pro, giving motion picture productions better control of HDR images while simplifying their on-set color workflows and dailies processes.

NAB 2019: First impressions

By Mike McCarthy

There is always a slew of new product announcements during the week of NAB, and this year was no different. As a Premiere editor, the developments from Adobe are usually the ones most relevant to my work and life. Similar to last year, Adobe released its software updates a week before NAB, rather than announcing them at the show for eventual release months later.

The biggest new feature in the Adobe Creative Cloud apps is After Effects’ new “Content Aware Fill” for video. This will use AI to generate image data to automatically replace a masked area of video, based on surrounding pixels and surrounding frames. This functionality has been available in Photoshop for a while, but the challenge of bringing that to video is not just processing lots of frames but keeping the replaced area looking consistent across the changing frames so it doesn’t stand out over time.

The other key part to this process is mask tracking, since masking the desired area is the first step in that process. Certain advances have been made here, but based on tech demos I saw at Adobe Max, more is still to come, and that is what will truly unlock the power of AI that they are trying to tap here. To be honest, I have been a bit skeptical of how much AI will impact film production workflows, since AI-powered editing has been terrible, but AI-powered VFX work seems much more promising.
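
To make the temporal-consistency problem concrete, here is a toy sketch: inpaint each frame independently, then blend the filled region with the previous frame’s fill to tame flicker. It uses OpenCV’s classic, non-AI inpainting and is nothing like Adobe’s actual algorithm; it only illustrates why naive per-frame fills stand out over time:

```python
import cv2
import numpy as np

def naive_video_fill(frames, masks, blend=0.5):
    """Toy per-frame inpaint with a crude temporal blend. frames are 8-bit
    BGR images; masks are 8-bit single-channel, nonzero where filled."""
    prev_fill, out = None, []
    for frame, mask in zip(frames, masks):
        filled = cv2.inpaint(frame, mask, 3, cv2.INPAINT_TELEA)
        if prev_fill is not None:
            region = mask.astype(bool)
            mixed = blend * filled[region] + (1 - blend) * prev_fill[region]
            filled[region] = mixed.astype(np.uint8)
        prev_fill = filled
        out.append(filled)
    return out
```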

Adobe’s other apps got new features as well, with Premiere Pro adding Free-Form bins for visually sorting through assets in the project panel. This affects me less, as I do more polishing than initial assembly when I’m using Premiere. They also improved playback performance for Red files, acceleration with multiple GPUs and certain 10-bit codecs. Character Animator got a better puppet rigging system, and Audition got AI-powered auto-ducking tools for automated track mixing.

Blackmagic
Elsewhere, Blackmagic announced a new version of Resolve, as expected. Blackmagic RAW is supported on a number of new products, but I am not holding my breath to use it in Adobe apps anytime soon, similar to ProRes RAW. (I am just happy to have regular ProRes output available on my PC now.) They also announced a new 8K HyperDeck product that records quad 12G-SDI to HEVC files. While I don’t think 8K will replace 4K television or cinema delivery anytime soon, there are legitimate markets that need 8K-resolution assets. Surround video and VR would be one, as would live background screens instead of greenscreen for composite shots: there is no image replacement in post, as everything is captured in-camera, and your foreground objects are accurately “lit” by the screens. I expect my next major feature will be produced with that method, but the resolution wasn’t there for the director to use that technology on the one I am working on now (enter 8K…).

AJA
AJA was showing off the new Ki Pro Go, which records up to four separate HD inputs to H.264 on USB drives. I assume this is intended for dedicated ISO recording of every channel of a live-switched event or any other multicam shoot. Each channel can record up to 1080p60 at 10-bit color to H.264 files in MP4 or MOV at up to 25Mb/s.

HP
HP had one of their existing Z8 workstations on display, demonstrating the possibilities that will be available once Intel releases their upcoming DIMM-based Optane persistent memory technology to the market. I have loosely followed the Optane story for quite a while, but had not envisioned this impacting my workflow at all in the near future due to software limitations. But HP claims that there will be options to treat Optane just like system memory (increasing capacity at the expense of speed) or as SSD drive space (with DIMM slots having much lower latency to the CPU than any other option). So I will be looking forward to testing it out once it becomes available.

Dell
Dell was showing off their relatively new 49-inch double-wide curved display. The 4919DW has a resolution of 5120×1440, making it equivalent to two 27-inch QHD displays side by side. I find that 32:9 aspect ratio to be a bit much for my tastes, with 21:9 being my preference, but I am sure there are many users who will want the extra width.

Digital Anarchy
I also had a chat with the people at Digital Anarchy about Transcriptive, their Premiere Pro-integrated audio transcription engine. Having spent the last three months editing a movie split between English and Mandarin dialogue that needs to be fully subtitled in both directions, I can see the value in their toolset. It harnesses online AI transcription engines and integrates the results back into your Premiere sequence, creating an accurate script as you edit the processed clips. In my case, I would still have to handle the translations separately once I had the Mandarin text, but this would allow our non-Mandarin-speaking team members to edit the Mandarin assets in the movie. And it will be even more useful when it comes to creating explicit closed captioning and subtitles, which we have been doing manually on our current project. I may post further info on that product once I have had a chance to test it out myself.

Summing Up
There were three halls of other products to look through and check out, but overall, I was a bit underwhelmed at the lack of true innovation I found at the show this year.

Full disclosure, I was only able to attend for the first two days of the exhibition, so I may have overlooked something significant. But based on what I did see, there isn’t much else that I am excited to try out or that I expect to have much of a serious impact on how I do my various jobs.

It feels like most of the new things we are seeing are commoditized versions of products that were truly innovative when first released, now just slightly more fleshed out with each iteration.

There seems to be much less pioneering of truly new technology and more repackaging of existing technologies into other products. I used to come to NAB to see all the flashy new technologies and products, but now it feels like the main thing I am doing there is a series of annual face-to-face meetings, and that’s not necessarily a bad thing.

Until next year…


Mike McCarthy is an online editor/workflow consultant with over 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

Sony’s NAB updates — a cinematographer’s perspective

By Daniel Rodriguez

With its NAB offerings, Sony once again showed that they have a firm presence in nearly every stage of production, be it motion picture, broadcast media or short form. The company continues to keep up to date with the current demands while simultaneously preparing for the inevitable wave of change that seems to come faster and faster each year. While the introduction of new hardware was kept to a short list this year, many improvements to existing hardware and software were released to ensure Sony products — both new and existing — still have a firm presence in the future.

The ability to easily access, manipulate, share and stream media has always been a priority for Sony. This year at NAB, Sony continued to demonstrate its IP Live, SR Live, XDCAM Air and Media Backbone Hive platforms, which give users the opportunity to manage media all over the globe. IP Live enables remote production: the core processing hardware stays in a central facility while users access it from anywhere. This extends to 4K and HDR/SDR streaming as well, which is where SR Live comes into play. SR Live allows a native 4K HDR signal to be processed into full HD and regular SDR signals, and a core improvement is the ability to adjust the conversion curves during a live broadcast to correct any issues that arise in converting HDR signals to SDR.
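
For a sense of the kind of curve such a conversion works with, below is the reference BT.2100 HLG opto-electrical transfer function, which maps linear scene light to an HLG signal. This is only the published reference curve, not Sony’s SR Live processing:

```python
import numpy as np

# BT.2100 HLG OETF constants
A, B, C = 0.17883277, 0.28466892, 0.55991073

def hlg_oetf(e):
    """Reference BT.2100 HLG OETF: linear scene light e in [0,1] to a
    non-linear signal in [0,1]."""
    e = np.asarray(e, dtype=float)
    log_part = A * np.log(np.maximum(12 * e - B, 1e-12)) + C
    return np.where(e <= 1 / 12, np.sqrt(3 * e), log_part)

print(hlg_oetf([0.0, 1 / 12, 0.5, 1.0]))  # [0.0, 0.5, ~0.87, 1.0]
```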

For other media, including XDCAM-based cameras, XDCAM Air allows for the wireless transfer and streaming of most media through QoS services, and turns almost any easily accessible camera with wireless capabilities into a streaming tool.

Media Backbone Hive allows users to access their media anywhere they want. Rather than just being an elaborate cloud service, Media Backbone Hive allows internal Adobe Cloud-based editing, accepts nearly every file type, allows a user to embed metadata and makes searching simple with keywords and phrases that are spoken in the media itself.

For the broadcast market, Sony introduced the HDC-5500 4K HDR three-CMOS-sensor camcorder, which the company is calling its “flagship” camera in this market. Offering 4K HDR and high frame rates, the camera also offers a global shutter — which is essential for dealing with strobing from lights — and can capture fast action without the infamous rolling-shutter skew. The camera allows for 4K output over 12G-SDI, enabling 4K monitoring and HDR, and as these outputs become the norm, the HDC-5500 will surely be a hit with users, especially with the addition of the global shutter.

Sony is very much a company that likes to focus on the longevity of their previous releases… cameras especially. Sony’s FS7 is a camera that has excelled in its field since its introduction in 2014, and to this day is an extremely popular choice for short form, narrative and broadcast media. Like other Sony camera bodies, the FS7 allows for modular builds and add-ons, and this is where the new CBK-FS7BK ENG Build-Up Kit comes in. Sporting a shoulder mount and ENG viewfinder, the kit includes an extension in the back that allows for two wireless audio inputs, RAW output, streaming and file transfer via Wireless LAN or 4G/LTE connection, as well as QoS streaming (only through XDCAM Air) and timecode input. This CBK-FS7BK ENG Build-Up Kit turns the FS7 into an even more well-rounded workhorse.

The Sony Venice is Sony’s flagship cinema camera, replacing the F65, which remains a brilliant and popular camera that popped up as recently as last year’s Annihilation. The Venice takes a leap further, entering the full-frame VistaVision market. Boasting top-of-the-line specs and a smaller, more modular build than the F65, the camera isn’t exactly a new release — it came out in November 2017 — but Sony has secured longevity in its flagship camera at a time when other manufacturers are just releasing their own VistaVision-sensored cameras and smaller alternatives.

Sony recently released a firmware update for the Venice that adds X-OCN XT — their highest-quality form of compressed 16-bit RAW — along with two new imager modes, allowing the camera to sample 5.7K 16:9 in full frame and 6K 2.39:1 in full width, as well as 4K output over 6G/12G-SDI and wireless remote control with the CBK-WA02. Since the Venice is small enough to be placed on harder-to-reach mounts, wireless control is quickly becoming a feature that many camera assistants need. Newer anamorphic desqueeze modes for 1.25x, 1.3x, 1.5x and 1.8x have also been added, which is huge, since older and newer lenses are constantly being created and revisited — such as the Technovision 1.5x, made famous by Vittorio Storaro on Apocalypse Now (1979), and the Cooke Full Frame Anamorphic 1.8x. With VistaVision full frame now an easily accessible way of filming, new forms of lensing are becoming common, so anamorphic systems are no longer limited to 1.3x and 2x. It’s reassuring to see Sony look out for storytellers who may want to employ less common anamorphic desqueeze sizes.
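
The desqueeze arithmetic itself is simple: the delivered aspect ratio is the captured aspect ratio multiplied by the lens’s squeeze factor. A quick sketch, with hypothetical capture dimensions:

```python
def desqueezed_aspect(width_px, height_px, squeeze):
    """Delivered aspect ratio after anamorphic desqueeze: the horizontal
    dimension is stretched by the lens's squeeze factor."""
    return (width_px * squeeze) / height_px

# Hypothetical 4:3 capture area: 2x glass yields the classic ~2.66:1 frame,
# while 1.5x glass on the same area gives 2:1.
print(desqueezed_aspect(4096, 3072, 2.0))  # ~2.67
print(desqueezed_aspect(4096, 3072, 1.5))  # 2.0
```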

As larger resolutions and higher frame rates become the norm, Sony has introduced the new Sony SxS Pro X cards. A follow-up to the hugely successful SxS Pro+ cards, these new cards boast an impressive transfer speed of 10Gbps (1,250MB/s) in 120GB and 240GB capacities. This is a huge step up from the previous SxS Pro+ cards, which offered a read speed of 3.5Gbps and a write speed of 2.8Gbps. Probably the most exciting part of the new cards is the corresponding SBAC-T40 card reader, which can offload a full 240GB card in 3.5 minutes.
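
That offload figure checks out on the back of an envelope; the 90% link-efficiency number below is my assumption, not Sony’s:

```python
def offload_minutes(capacity_gb, link_gbps, efficiency=0.9):
    """Rough card-offload time: capacity in gigabytes over a link rated in
    gigabits per second. The 90% efficiency figure is an assumption."""
    return capacity_gb * 8 / (link_gbps * efficiency) / 60

# ~3.6 minutes for a 240GB card over a 10Gbps link, in line with the
# quoted 3.5-minute offload for the SBAC-T40 reader.
print(offload_minutes(240, 10))
```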

Sony’s newest addition to the Venice camera is the Rialto extension system. Using the Venice’s modular build, the Rialto is a hardware extension that allows you to remove the main body’s sensor block and install it in a smaller unit, tethered nine or 18 feet by cable back to the main body. Very reminiscent of the design of ARRI’s Alexa M, the Rialto goes further by being an extension of its main system rather than a standalone system, which may bring its own issues. The Rialto lets users reach spots that would otherwise prove difficult with the actual Venice body, and its lightweight design allows it to be mounted nearly anywhere. Where other camera bodies designed to be smaller end up heavy once outfitted with accessories such as batteries and wireless transmitters, the Rialto can easily be rigged for aerials, handheld and Steadicam. Though some may question why you wouldn’t just get a smaller body from another camera company, the Rialto isn’t a solution to the size of the Venice body — which is already very small, especially compared to the previous F65 — but simply another tool to get the most out of the Venice system, particularly since you’re not sacrificing any features or frame rates. The Rialto is currently being used on James Cameron’s Avatar sequels, as its smaller body allows him to employ two simultaneously for true 3D recording while giving all the options of the Venice system.

With innovations in broadcast and motion picture production, there is a constant drive to push boundaries and make capture and distribution instant. Creating a huge network for distribution, streaming, capture and storage not only secures Sony’s position as the powerhouse it already is, but also ensures its presence in the ever-changing future.


Daniel Rodriguez is a New York-based director and cinematographer. Having spent years working for companies such as Light Iron, Panavision and ARRI Rental, he currently works as a freelance cinematographer, filming narrative and commercial work throughout the five boroughs.

 

Atomos’ new Shogun 7: HDR monitor, recorder, switcher

The new Atomos Shogun 7 is a seven-inch HDR monitor, recorder and switcher that offers an all-new 1500-nit, daylight-viewable, 1920×1200 panel with a 1,000,000:1 contrast ratio and 15+ stops of dynamic range displayed. It also offers ProRes RAW recording and realtime Dolby Vision output. Shogun 7 will be available in June 2019, priced at $1,499.

The Atomos screen uses a combination of advanced LED and LCD technologies that together offer deeper, better blacks the company says rival OLED screens, “but with the much higher brightness and vivid color performance of top-end LCDs.”

A new 360-zone backlight is combined with this new screen technology and controlled by the Dynamic AtomHDR engine to show millions of shades of brightness and color. It allows Shogun 7 to display 15+ stops of real dynamic range on-screen. The panel, says Atomos, is also incredibly accurate, with ultra-wide color and 105% of DCI-P3 covered, allowing for the same on-screen dynamic range, palette of colors and shades that your camera sensor sees.

Atomos and Dolby have teamed up to create Dolby Vision HDR “live” — a tool that allows you to see HDR live on set and carry your creative intent from the camera through into HDR post. Dolby has optimized its target-display HDR processing algorithm, which Atomos runs inside the Shogun 7. It brings realtime, automatic frame-by-frame analysis of Log or RAW video and processes it for optimal HDR viewing on a Dolby Vision-capable TV or monitor over HDMI. Connect Shogun 7 to a Dolby Vision TV and AtomOS 10 automatically analyzes the image, queries the TV and applies the right color and brightness profiles for the maximum HDR experience on the display.

Shogun 7 records images up to 5.7Kp30, 4Kp120 or 2Kp240 slow motion from compatible cameras, in RAW/Log or HLG/PQ over SDI/HDMI. Footage is stored directly to AtomX SSDmini or approved off-the-shelf SATA SSD drives. There are recording options for Apple ProRes RAW and ProRes, Avid DNx and Adobe CinemaDNG RAW codecs. Shogun 7 has four SDI inputs plus an HDMI 2.0 input, with both 12G-SDI and HDMI 2.0 outputs. It can record ProRes RAW in up to 5.7Kp30, 4Kp120 DCI/UHD and 2Kp240 DCI/HD, depending on the camera’s capabilities. Also, 10-bit 4:2:2 ProRes or DNxHR recording is available up to 4Kp60 or 2Kp240. The four SDI inputs enable the connection of most quad-link, dual-link or single-link SDI cinema cameras. Pixels are preserved with data rates of up to 1.8Gb/s.

In terms of audio, Shogun 7 eliminates the need for a separate audio recorder. Users can add 48V stereo mics via an optional balanced XLR breakout cable, or select mic or line input levels, plus record up to 12 channels of 24/96 digital audio from HDMI or SDI. Monitoring selected stereo tracks is via the 3.5mm headphone jack. There are dedicated audio meters, gain controls and adjustments for frame delay.

Shogun 7 features the latest version of the AtomOS 10 touchscreen interface, first seen on the Ninja V. The new body has a Ninja V-like exterior, with ARRI anti-rotation mounting points on the top and bottom of the unit to ensure secure mounting.

AtomOS 10 on Shogun 7 offers the full range of monitoring tools, including waveform, vectorscope, false color, zebras, RGB parade, focus peaking, pixel-to-pixel magnification, audio level meters and blue-only for noise analysis.

Shogun 7 can also be used as a portable touchscreen-controlled multi-camera switcher with asynchronous quad-ISO recording. Users can switch up to four 1080p60 SDI streams, record each plus the program output as a separate ISO, then deliver ready-for-edit recordings with marked cut-points in XML metadata straight to your NLE. The current Sumo19 HDR production monitor-recorder will also gain the same functionality in a free firmware update.

There is asynchronous switching, plus genlock in and out to connect to existing AV infrastructure. Once the recording is over, users can import the XML file into an NLE and the timeline populates with all the edits in place. XLR audio from a separate mixer or audio board is recorded within each ISO, alongside two embedded channels of digital audio from the original source. The program stream always records the analog audio feed as well as a second track that switches between the digital audio inputs to match the switched feed.
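
As a sketch of what that XML round trip amounts to: the switcher only needs to log which input was live at which frame, and the NLE can rebuild the timeline from those switch points. The event list and field names below are hypothetical; Atomos’s actual XML schema isn’t documented here:

```python
def cuts_to_spans(events, end_frame):
    """events: (frame, input_id) switch points in ascending order. Returns
    clip spans an edit list could be built from. Hypothetical structure,
    not Atomos's actual XML schema."""
    spans = []
    for (start, src), nxt in zip(events, events[1:] + [(end_frame, None)]):
        spans.append({"input": src, "in": start, "out": nxt[0]})
    return spans

events = [(0, "cam1"), (240, "cam3"), (610, "cam2")]
print(cuts_to_spans(events, 900))
# [{'input': 'cam1', 'in': 0, 'out': 240}, {'input': 'cam3', ...}, ...]
```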

DP Chat: The Village cinematographer William Rexer

By Randi Altman

William Rexer is a cinematographer who has worked on documentaries, music videos, commercials and narratives — both comedies and dramas. He’s frequently collaborated with writer/director Ed Burns (Friends With Kids, Newlyweds, Summertime). Recently, he’s directed photography on several series including The Get Down, The Tick, Sneaky Pete and the new NBC drama The Village.

He sat down with us to answer some questions about his love of cinematography, his process and The Village, which follows a diverse group of people living in the same apartment building in Brooklyn.

The set of The Village. Photo: Peter Kramer

How did you become interested in cinematography?
When I was a kid, my mother had a theater company and my father was an agent/producer. I grew up sleeping backstage. When I was a teen, I was running a followspot (light) for Cab Calloway. I guess there was no escaping some job in this crazy business!

My father would check out 16mm movies from the New York City public library — Chaplin, Keaton — and that would be our weekend night entertainment. When I was in 8th grade, an art cinema started in my hometown; it is now called the Cinema Arts Center in Huntington, New York. It showed cinema from all over the world, including Bergman, Fellini, Jasny. I began to see the world through films and fell in love.

What inspires you artistically?
I love going to the movies, the theater and art galleries. Films like Roma and Cold War make me have faith in the world. What mostly inspires me is checking out what my peers are up to. Tim Ives, ASC, and Tod Campbell are two friends that I love to watch. Very impressive guys. David Mullen, ASC, and Eric Moynier are doing great work on Mrs. Maisel. I guess I would say watching my peers and their work inspires me.

NBC’s The Village

How do you stay on top of advancing technology tools for achieving your vision on set or in post?
The cameras and post workflow change every few months. I check in with the rental houses to stay on top of gear. Panavision, Arri Rental, TCS, Keslow and Abel are great resources. I also stay in touch with post houses. My friends at Harbor and Technicolor are always willing to help create LUTs, evaluate cameras and lenses.

Has any recent or new technology changed the way you work?
The introduction of the Red One MX and the ARRI D-20 changed a lot of things. They made shooting high-quality images affordable and cleaner for the environment, put 35mm-size sensors out there and gave a lot of young people a chance to create.

The introduction of large-format cameras, the Red Monstro 8K VV, the ARRI LF and 65, and the Sony Venice have made my life more interesting. All these sensors are fantastic, and the new color spaces we get to work with like Red’s IPP2 are truly astounding. I like having control of depth of field and controlling where the audience looks.

What are some of your best practices or rules you try to follow on each job?
I try my best to shoot tests, create a LUT in the test phase and take the footage through the entire process to see how it holds up. I make sure that all my monitors are calibrated to match at the post house; that gets us all on the same page. Then I’ll adjust the LUT after a few days of shooting in the field, using the LUT as a film stock and lighting to it. I watch dailies, give notes and try to get in with the colorist/timer and work with them.

Will Rexer (center) with showrunner Mike Daniels and director Minkie Spiro. Photo: Jennifer Rhoades

Tell us about The Village. How would you describe the general look of the show?
The look of The Village sits somewhere between romantic realism and magical realism. It is a world that could be. Our approach was to walk the line between the real and the possible — warm, inviting and full of potential.

Can you talk about your collaboration with the showrunner when setting the look of a project?
Mike Daniels, Minkie Spiro, Jessica Rhoades and I looked at a ton of photographs and films to find our look. The pilot designer Ola Maslik and the series designer Neil Patel created warm environments for me.

How early did you get involved in the production?
I had three weeks of prep for the pilot, and I worked with Minkie and Ola finding locations and refining the look.

How did you go about choosing the right camera and lenses to achieve the look?
The show required a decent amount of small gimbal work, so we chose the Red Monstro 8K VV using Red’s IPP2 color space. I love the camera: great look, great functionality. And my team has customized the accessories to make our work on set effortless.

We used the Sigma Cine PL Primes with 180mm Leica R, Nikon 200 T2, Nikkor Zero Optik 58mm T1.2, Angenieux HR 25-250mm and some other special optics. I looked at other full-frame lenses but really liked the Sigma lenses and their character. These lenses are a nice mix of roundness and warmth and consistency.

What was your involvement with post? Who supported your vision from dailies through final grade? Have you worked with this facility and/or colorists on past projects?
Dailies were through Harbor Picture Company. I love these guys. I have worked with Harbor since they started, and they are total pros. They have helped me create LUTs for many projects, including Public Morals.

The final post for The Village was done in LA at NBC/Universal. Craig Budrick has done a great job coloring the show. I do wish that I could be in the room, but that’s not always possible.

What’s most satisfying to you about this show?
I am very proud of the show and its message. It’s a romantic vision of the world. TV and cinema often go to the dark side. I like going there, but I do think we need to be reminded of our better selves and our potential.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Review: MZed.com’s Directing Color With Ollie Kenchington

By Brady Betzel

I am constantly looking to educate myself, no matter the source — or subject. Whether I’m learning how to make a transition in Adobe After Effects from an eSports editor on YouTube or watching Warren Eagles teach color correction in Blackmagic’s DaVinci Resolve on FXPHD.com, I’m always beefing up my skills. I even learn from bad tutorials — they teach you what not to do!

But when you come across a truly remarkable learning experience, it is only fair to share it with the rest of the world. Last year I saw an ad for an MZed.com course called “Directing Color With Ollie Kenchington” and was immediately interested. These days you can find pretty much any technical tutorial you can dream of on YouTube, but truly professional, theory-based series on the level of higher education are very hard to come by. Even the ones you pay for aren’t always worth the price of admission, which is a huge letdown.

Ollie sharing his wisdom.

Once I gained access to MZed.com I wanted to watch every educational series they had. From lighting techniques with ASC member Shane Hurlbut to the ARRI Amira Camera Primer, there are over 150 hours of education available from industry leaders. However, I found my way to Directing Color…

I am often asked if I think people should go to college or a film school. My answer? If you have the money and time, you should go to college followed by film school (or do both together, if the college offers it). Not only will you learn a craft, but you will most likely spend hundreds of hours studying and visualizing the theory behind it. For example, when someone asks me about the science behind camera lenses, I can confidently answer them thanks to my physics class based on lenses and optics from California Lutheran University (yes, a shameless plug).

In my opinion, a two-, four- or even 10-year education allows me to live in the grey. I am comfortable arguing both sides of a debate, as well as the options in between — the grey. My post-high school education really allowed me to recognize and thrive in the nuances of debate. It left me playing devil’s advocate maybe a little too much, but also having civil and productive discussions with others without being demeaning or nasty — something we are actively missing these days. So if living in the grey is for you, I really think a college education supplemented by online or film school education is valuable (assuming you decide, as I did, that the debt is worth it).

However, I know that is not an option for everyone since it can be very expensive — trust me, I know. I am almost done paying off my undergraduate fees while still paying off my graduate ones, which I am still two or three classes away from finishing. That being said, Directing Color With Ollie Kenchington is the only online education series I have seen so far that is on the same level as some of my higher education classes. Not only is the content beautifully shot and color corrected, but Ollie gives confident and accessible lessons on how color can be used to draw the viewer’s attention to multiple parts of the screen.

Ollie Kenchington is a UK-based filmmaker who runs Korro Films. From the trailer of his Directing Color series, you can immediately see the beauty of Ollie’s work and know that you will be in safe hands. (You can read more about his background here.)

The course raises the online education bar and will elevate the audience’s idea of professional insight. The first module, “Creating a Palette,” covers the thinking behind creating a color palette for a small catering company. You may even want to start with the last bonus module, “Ox & Origin,” to get a look at what Ollie will be creating throughout the seven modules and roughly an hour and a half of content.

While Ollie goes over “looks,” the beauty of this course is that he walks through his internal thought processes, including deciding on palettes based on color theory. He didn’t just choose teal and orange because they look good; he chose his color palette based on complementary colors.

Throughout the course Ollie covers some technical knowledge, including calibrating monitors and cameras, white balancing and shooting color charts to avoid incorrect color balance in post. This is so important: if you skip these simple steps, your color correction session will be much harder, and wasting time fixing incorrect color balance takes time away from the fun of color grading. All of this is done through easily digestible modules that range from two to 20 minutes.
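
The white-balance point is worth making concrete: once a grey card or chart is in frame, neutralizing it becomes a measurement rather than a guess. A minimal sketch of the idea:

```python
import numpy as np

def grey_card_gains(patch_rgb):
    """Per-channel gains that neutralize a grey-card patch. patch_rgb is a
    float array of sampled pixels shaped (..., 3). Illustrative only."""
    means = patch_rgb.reshape(-1, 3).mean(axis=0)
    return means.mean() / means  # scale each channel toward the grey average

# A warm-biased patch gets red pulled down and blue pushed up:
print(grey_card_gains(np.array([[0.6, 0.5, 0.4]])))  # ~[0.83, 1.0, 1.25]
```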

The modules include Creating a Palette; Perceiving Color; Calibrating Color; Color Management; Deconstructing Color 1 – 3 and the Bonus Module Ox & Origin.

Without giving away all of the content in Ollie’s course, my favorite modules are the on-set ones. Maybe it’s because I am not on set that often, but I found the “thinking out loud” about color helpful. Knowing why reds represent blood, and raise your heart rate a little bit, is fascinating. He even goes through practical examples of color use in films such as Whiplash.

In the final “Deconstructing Color” modules, Ollie goes into a color bay (complete with practical candle backlighting) and dives into Blackmagic’s DaVinci Resolve. He takes the course full circle: because he took the time to set up proper lighting on set, even for a scene he had to rush through, he can now go into Resolve, add light to different sides of someone’s face and focus on other parts of his commercial.

Summing Up
I want to watch every tutorial MZed.com has to offer, from “Philip Bloom’s Cinematic Masterclass” to Ollie’s other course, “Mastering Color.” Unfortunately, as of this review, you have to pay an additional fee to watch “Mastering Color.” It seems an unfortunate trend in online education to charge a subscription fee and then charge more when an extra-special class comes along, but this class will supposedly be released to standard subscribers in due time.

MZed.com has two subscription models: MZed Pro, which is $299 for one year of streaming the standard courses, and MZed Pro Premium for $399. This includes the standard courses for one year and the ability to choose one “Premium” course.

“Philip Bloom’s Cinematic Masterclass” was the Premium course I was signed up for initially, but you can choose between it and “Mastering Color.” You will not be disappointed with either. Even MZed’s first course, “How to Photograph Everyone,” is chock-full of lighting and positioning instruction that can be applied to many aspects of videography.

I was really impressed with Directing Color With Ollie Kenchington, and if the other courses are this good, MZed.com will definitely become a permanent bookmark for me.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Atomos offering Shinobi SDI camera-top monitor

On the heels of its successful Shinobi launch in March, Atomos has introduced the Atomos Shinobi SDI, a super-lightweight, 5-inch HD-SDI and 4K HDMI camera-top monitor. Its color-accurate calibrated display makes it a suitable compact HDR and SDR reference monitor. It targets the professional video creator who uses or owns a variety of cameras and camcorders and needs the flexibility of SDI or HDMI and accurate high-brightness and HDR monitoring, without requiring external recording capability.

Shinobi SDI features a compact, durable body combined with an ultra-clear, ultra-bright, daylight-viewable 1000-nit display. The anti-reflection, anti-fingerprint screen has a pixel density of 427PPI (pixels per inch) and is factory calibrated for color accuracy, with the option for in-field calibration providing ongoing accuracy. Thanks to the HD-SDI input and output, plus a 4K HDMI input, it can be used in most productions.

This makes Shinobi SDI a useful companion for high-end cinema and production cameras, ENG cameras, handheld camcorders and any other HD-SDI-equipped source.

“Our most requested product in recent times has been a stand-alone SDI monitor. We are thrilled to be bringing the Atomos Shinobi SDI to market for professional video and film creators,” says Jeromy Young, CEO of Atomos.

ARRI’s new Alexa Mini LF offers large-format sensor in small footprint

Offering a large-format sensor in a small form factor, ARRI has introduced its new Alexa Mini LF camera, which combines the compact size and low weight of the Alexa Mini with the large-format Alexa LF sensor. According to the company, it “provides the best overall image quality for large-format shooting” and features three internal motorized FSND filters, 12V power input, extra power outputs, a new Codex Compact Drive and a new MVF-2 high-contrast HD viewfinder.

The new Alexa Mini LF cameras are scheduled to start shipping in mid-2019.

ARRI’s large-format camera system, launched in 2018, is based around a 4.5K version of the Alexa sensor that is twice the size and offers twice the resolution of Alexa cameras in 35 format. This allows for large-format looks while retaining the Alexa sensor’s natural colorimetry, pleasing skin tones and low noise, and it is well suited to HDR and wide color gamut workflows.

Alexa Mini LF now joins the existing system elements: the high-speed capable Alexa LF camera; ARRI Signature Prime lenses; LPL lens mount and PL-to-LPL adapter; and Lens Data System LDS-2. The combined feature sets and form factors of ARRI’s two large-format cameras encompass all on-set requirements.

The Alexa Mini LF is built for use in challenging professional conditions. It features a hard-wearing carbon body and a wide temperature range of -4°F to +113°F. Each Alexa Mini LF is put through a rigorous stress test before leaving the ARRI factory and is then supported by ARRI’s global service centers.

While Alexa Mini LF is compatible with almost all Alexa Mini accessories, the company says it brings significant enhancements to the Mini camera design. Among them are extra connectors, including regulated 12V and 24V accessory power; a new 6-pin audio connector; built-in microphones; and improved WiFi.

Six user buttons are now in place on the camera’s operating side, and the camera and viewfinder each have their own lock button, while user access to the recording media, and VF and TC connectors, has been made easier.

Alexa Mini LF allows internal recording of MXF/ARRIRAW or MXF/Apple ProRes in a variety of formats and aspect ratios, and features the new Compact Drive recording media from Codex, an ARRI technology partner. This small, lightweight drive offers 1TB of recording capacity. It comes with a USB-C Compact Drive reader that can be used without any extra software or licenses on Mac or Windows computers. In addition, a Compact Drive adapter can be used in any dock that accepts SXR Capture Drives, potentially more than doubling download speeds.

Another development from Codex is Codex High Density Encoding (HDE), which uses sophisticated, loss-less encoding to reduce ARRIRAW file sizes by around 40% during downloading or later in the workflow. This lowers storage costs, shortens transfer times and speeds up workflows.

HDE is free for use with Codex Capture or Compact Drives, openly shared and fast: ARRIRAW Open Gate 4.5K can be encoded at 24fps on a modern MacBook Pro.

ARRI’s new MVF-2 viewfinder for the Alexa Mini LF features the same high-contrast HD OLED display, color science and ARRICAM eyepiece as the Alexa LF’s EVF-2 viewfinder, allowing optimal judgment of focus, dynamic range and color on set.

In addition, the MVF-2 features a large, four-inch flip-out monitor that can display the image or the camera control menu. The MVF-2 can be used on either side of the camera and connects via a new CoaXPress VF cable that has a reach of up to 10m for remote camera operations. It features a refined user interface, a built-in eyepiece lens heater for de-fogging and a built-in headphones connector.

Sony’s RX0 II ultra-compact, rugged camera with 4K, flip-up screen

Sony has added to its camera lineup with the launch of the light and compact RX0 II — what some might call a “GoPro on steroids,” with a higher price tag of approximately $700. It will ship in April. Building upon the waterproof, dustproof, shockproof, crushproof and ultra-compact qualities of the original RX0, the new model offers internal 4K recording, an adjustable LCD screen that tilts upward 180 degrees and downward 90 degrees and the ability to work underwater, as well as new image stabilization solutions for video recording.

At the heart of the RX0 II sits a 1.0-type stacked 15.3-megapixel Exmor RS CMOS image sensor and an advanced Bionz X image processing engine that offer enhanced color reproduction. It has been optimized for both stills and movie shooting across a wide sensitivity range of ISO 80-12800. The Zeiss Tessar T 24mm F4 fixed wide-angle lens has a shortened minimum focusing distance of 20cm.

Measuring 59mm x 40.5mm x 35mm and weighing 132g, the RX0 II fits easily into a pocket. It is waterproof up to 10 meters deep, dustproof, shockproof from drops of up to two meters and crushproof under up to 200kg of force.

The RX0 II offers 4K 30p internal movie recording with full pixel readout and no pixel binning, which allows it to collect approximately 1.7 times the amount of data required for 4K video. By oversampling this data, the appearance of moiré and jaggies is reduced to deliver smooth, high-quality 4K footage with detail and depth. Using the recently introduced Sony Imaging Edge mobile application, this footage can be transferred to a smartphone, edited and shared easily across social networks.
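
A quick check of that oversampling figure, assuming a 16:9 UHD target (the sensor’s exact active video area isn’t given here, so this only shows the arithmetic behind the claim):

```python
# Sanity check of the "~1.7x the data required for 4K" claim.
uhd_pixels = 3840 * 2160        # ~8.3 MP needed for a UHD frame
sampled = uhd_pixels * 1.7      # ~14.1 MP read from the 15.3 MP sensor
print(uhd_pixels / 1e6, sampled / 1e6)
```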

The RX0 II introduces in-body electronic stabilization for steady footage, even when shot handheld. This can be enhanced even further when footage is exported to a smartphone or tablet running the Movie Edit add-on, where the additional information captured during filming can be processed to produce a video with gimbal-like smoothness.

An additional new feature that can also be accessed via the Sony Movie Edit add-on is Intelligent Framing, where the selected subject is kept in the center of the frame and image distortion is corrected in a final edit. Depending on where the video will be shared, a variety of aspect ratios can then be selected.

Additional movie features of the RX0 II include super-slow-motion recording at up to 1,000fps, uncompressed 4K HDMI output and simultaneous proxy movie recording. Users can use Picture Profile, S-Log 2 and Timecode/User Bit functions to make sure their final result exactly matches their creative vision.

The RX0 II can also be used as a multi-camera option. Up to five RX0 II cameras can be controlled wirelessly using Sony Imaging Edge Mobile application and between five and 50 cameras can be controlled via an access point (scheduled for summer 2019). The RX0 II is also compatible with the Camera Control Box CCB-WD1, which enables up to 100 cameras to be connected and controlled in a wired multi-camera setup.

DP Tom Curran on Netflix’s Tidying Up With Marie Kondo

By Iain Blair

Forget all the trendy shows about updating your home décor or renovating your house. What you really need to do is declutter. And the guru of decluttering is Marie Kondo, the Japanese star of the hot Netflix show Tidying Up With Marie Kondo.

The organizational expert became a global star when her first book, 2014’s “The Life-Changing Magic of Tidying Up: The Japanese Art of Decluttering and Organizing,” was translated into English, becoming a New York Times bestseller. Her follow-up was 2016’s “Spark Joy: An Illustrated Master Class on the Art of Organizing and Tidying Up.”

Tom Curran

Clearly, people everywhere need to declutter, and Kondo’s KonMari Method is the answer for those who have too much stuff. As she herself puts it, “My mission is to organize the world and spark joy in people’s lives. Through this partnership with Netflix, I am excited to spread the KonMari Method to as many people as possible.”

I recently spoke with Tom Curran, the cinematographer of the Kondo show. His extensive credits include Ugly Delicious for Netflix, Fish My City for National Geographic and 9 Months for Facebook, which is hosted by Courteney Cox. Curran has an Emmy on his mantle for ABC Sports’ Iditarod Sled Dog Race.

Let’s start with the really important stuff. Do you have too much clutter? Has Marie’s philosophy helped you?
(Laughs). It has! I think we all have too much stuff. To be honest, I was a little skeptical at first about all this. But as I spent time with her and educated myself, I began to realize just how much there is to it. I think that it particularly applies to the US, where we all have so much and move so quickly.

In her world, you come to a pause and evaluate all of that, and it’s really quite powerful. And if you follow all of her steps, you can’t do it quickly. It forces you to slow down and take stock. My wife is an editor, and we’re both always so busy, but now we take little pockets of time to attack different parts of the house and the clutter we have. It’s been really powerful and helpful to us.

Why do you think her method and this show have resonated so much with people everywhere?
Americans tend to get so busy and locked into routines, and Japan’s culture is very different. I’ve worked there quite a bit, and she brings this whole other quality to the show. She’s very thoughtful and kind. I think the show does a good job of showing that, and you really feel it. An awful lot of current TV can be a little sharp and mean, and there’s something old-fashioned about this, and audiences really respond. She doesn’t pass judgment on people’s messy houses — she just wants to help.

You’re well-known for shooting in extreme conditions and locations all over the world. How did this compare?
It was radically different in some ways. Instead of vast and bleak landscapes, like Antarctica, you’re shooting the interiors of people’s homes in LA. Working with EP Hend Baghdady and showrunner Bianca Barnes-Williams, we set out to redefine how to showcase these homes. We used some of the same principles, like how to incorporate these characters into their environment and weave the house into the storyline. That was our main goal.

What were the challenges of shooting this show?
A big one was keeping ourselves out of the shot, which isn’t so easy in a small space. Also, keeping Marie central to all the storytelling. I’ve done several series before, shooting in people’s homes, like Little People, Big World, where we stayed in one family’s home for many years. With this show the crew was walking into their homes for a far shorter time, and none of them were actors. They were baring their souls.

Cleaning up all their clutter before we arrived was contrary to what the show’s all about, so you’re seeing all the ugly. My background’s in cinéma vérité, and a lot of this was stripping back the way these types of unscripted shows are usually done — with multiple cameras. We did use multiple cameras, but often it was just one, as you’re in a tiny room, where there’s no space for another, and we’re shooting wide since the main character in most stories was the home.

As well as being a DP you’re also the owner of Curran Camera, Inc. Did you supply all the camera gear for this through your company?
Sometimes I supply equipment for a series, sometimes not. It all depends on what the project needs. On this, when Hend, Bianca and I began discussing different camera options, I felt it wasn’t a series we could shoot on prime lenses, but we wanted the look that primes would bring. We ended up working with Fujinon Cabrio Cine Zooms and Canon cameras, which gave us a really filmic look, and we got most of our gear from T-stop Camera Rentals in LA. In fact, the Fujinon Cabrio 14-35mm became the centerpiece of the storytelling in the homes because of its wide lens capture — which was crucial for scenes with closets and small rooms and so on.

I assume all the lighting was a big challenge?
You’re right. It was a massive undertaking because we wanted to follow all the progress in each home. And we didn’t want it to be a dingy, rough-looking show, especially since Marie represented this bright light that’d come into people’s homes and then it would get brighter and brighter. We ended up bringing in all the lighting from the East Coast, which was the only place I could source what I needed.

For Marie’s Zen house we had a different lighting package with dozens of small fresnels because the space was so calm and still. For the homes and all the movement, we used about 80 Flex lights — paper-thin LED lights that are easily dimmable and quick to install and take down. Even though we had a pretty small crew, we were able to achieve a pretty consistent look.

How did the workflow operate? How did you deal with dailies?
Our post supervisor Joe Eckardt was pretty terrific, and I’d spend a lot of time going through all the dailies and then give a big download to the crew once a week. We had six to eight camera operators and three crews with two cameras each, plus additional people some days. We had so much footage, and what ended up on screen is just a fraction of what we shot. We had a lot of cards at the end of every day, and they’d be loaded into the post system, and then a team of 16 editors would start going through it all. Since this was the first season, we were kind of doing it on the fly and trying different techniques to see what worked best.

Color correction and the mix was handled by Margarita Mix. How involved were you in post and the look of the show?
I was very involved, especially early on. Even in the first month or so we started to work on the grade a bit to get some patterns in place; that helped carry us through. We set out to capture a really naturalistic look, and a lot of the homes were very cramped, so we had to keep the wrong lighting look looking wrong, so to speak. I’m pretty happy with what we were able to do. (Margarita Mix’s Troy Smith was the colorist.)

How important is post to you as a DP?
It’s hard to overstate. I’d say it’s not just a big piece of the process, it is the process. When we’re shooting, I only really think about three things: One, what is the story we’re trying to tell? Two, how can we best capture that, particularly with non-actors? How do you create an environment of complete trust where they basically just forget about you? How do we capture Marie doing her thing and not break the flow, since she’s this standup performer? Three, how do we give post what they need? If we’re not giving editorial the right coverage, we’re not doing our job. That last one is the most important to me — since I’m married to an editor, I’m always so aware of post.

The first eight shows aired in January. When is the next season?
We’ve had some light talks about it, and I assume since it’s so popular we’ll do more, but nothing’s finalized yet. I hope we do more. I love this show.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Red Ranger all-in-one camera system now available

Red Digital Cinema has made its new Red Ranger all-in-one camera system available to select Red authorized rental houses. Ranger includes Red’s cinematic full-frame 8K sensor Monstro in an all-in-one camera system, featuring three SDI outputs (two mirrored and one independent) allowing two different looks to be output simultaneously; wide-input voltage (11.5V to 32V); 24V and 12V power outs (two of each); one 12V P-Tap port; integrated 5-pin XLR stereo audio input (Line/Mic/+48V Selectable); as well as genlock, timecode, USB and control.

Ranger is capable of handling heavy-duty power sources and boasts a larger fan for quieter and more efficient temperature management. The system is currently shipping in a gold mount configuration, with a v-lock option available next month.

Ranger captures 8K RedCode RAW up to 60fps full-format, as well as Apple ProRes or Avid DNxHR formats at 4K up to 30fps and 2K up to 120fps. It can simultaneously record RedCode RAW plus Apple ProRes or Avid DNxHD or DNxHR at up to 300MB/s write speeds.
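Those numbers translate directly into media budgets. A rough calculation, assuming continuous recording at the quoted 300MB/s ceiling (actual rates depend on format, resolution and frame rate):

```python
# Storage ceiling implied by the Ranger's quoted 300 MB/s write speed.
write_speed_mb_s = 300
seconds_per_hour = 3600

tb_per_hour = write_speed_mb_s * seconds_per_hour / 1_000_000
print(f"At {write_speed_mb_s} MB/s, an hour of continuous recording "
      f"is about {tb_per_hour:.2f} TB")    # ~1.08 TB per hour
```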

To enable an end-to-end color management and post workflow, Red’s enhanced image processing pipeline (IPP2) is also included in the system.

Ranger ships complete, including:
• Production top handle
• PL mount with supporting shims
• Two 15mm LWS rod brackets
• Red Pro Touch 7.0-inch LCD with 9-inch arm and LCD/EVF cable
• LCD/EVF adaptor A and LCD/EVF adaptor D
• 24V AC power adaptor with 3-pin 24V XLR power cable
• Compatible Hex and Torx tools

Shooting, posting New Republic’s Indie film, Sister Aimee

After a successful premiere at the Sundance Film Festival, New Republic Studios’ Sister Aimee screened at this month’s SXSW. The movie tells the story of an infamous American evangelist of the 1920s, Sister Aimee Semple McPherson, who gets caught up in her lover’s dreams of Mexico and finds herself on a road trip toward the border.

Sister Aimee shot at the newly renovated New Republic Studios near Austin, Texas, over two and a half weeks. “Their crew used our 2,400-square-foot Little Bear soundstage, our 3,000-square-foot Lone Wolf soundstage, our bullpen office space and numerous exterior locations in our backlot,” reports New Republic Studios president Mindy Raymond, adding that the Sister Aimee production also had access to two screening rooms with 5.1 surround sound, HDMI hookups to 4K monitors and theater-style leather chairs to watch dailies. The film also hit the road, shooting in the New Mexico desert.

L-R: Directors Samantha Buck, Marie Schlingmann at SXSW. Credit: Harrison Funk

Co-written and co-directed by Samantha Buck and Marie Schlingmann, the movie takes some creative license with the story of Aimee. “We don’t look for factual truth in Aimee’s journey,” they explain. “Instead we look for a more timeless truth that says something about female ambition, the female quest for immortality and, most of all, the struggle for women to control their own narratives. It becomes a story about storytelling itself.”

The film, shot by cinematographer Carlos Valdes-Lora at 3.2K ProRes 4444 XQ on an ARRI Alexa Mini, was posted at Dallas and Austin-based Charlieuniformtango.

We reached out to the DP and the post team to find out more.

Carlos, why did you choose the package of the Alexa and Cooke Mini S4 Primes?
Carlos Valdes-Lora: In early conversations with the directors, we all agreed that we didn’t want Sister Aimee to feel like a traditional period movie. We didn’t want to use softening filters or vintage lenses. We aimed instead for clear images, deep focus and a rich color palette that remains grounded in the real world. We felt that this would lend the story a greater sense of immediacy and draw the viewer closer to the characters. Following that same thinking, we worked very extensively with the 25mm and 32mm, especially in closeups and medium closeups, emphasizing accessibility.

The Cooke Mini S4s are a beautiful and affordable set (relative to our other options). We like the way they give deep dimensionality and warmth to faces, and how they create a slightly lower contrast image compared to the other modern lenses we looked at. They quickly became the right choice for us, striking the right balance between quality, size and value.

The Cookes paired with the Alexa Mini gave us a lightweight camera system with a very contained footprint, and we needed to stay fast and lean due to our compressed shooting schedule and often tight shooting quarters. The Chapman Cobra dolly was a big help in that regard as well.

What was the workflow to post like?
Charlieuniformtango producers Bettina Barrow, Katherine Harper, David Hartstein: Post took place primarily between Charlieuniformtango’s Dallas and Austin offices. Post strategizing started months before the shoot, and active post truly began when production started in July 2018.

Tango’s Evan Linton handled dailies brought in from the shoot, working alongside editor Katie Ennis out of Tango’s Austin studio, to begin assembling a rough cut as shooting continued. Ennis continued to cut at the studio through August with directors Schlingmann and Buck.

Editorial then moved back to the directors’ home state of New York to finish the cut for Sundance. (Editor Ennis, who four-walled out of Tango Austin for the first part of post, went to New York with the directors, working out of a rented space.)

VFX and audio work started early at Tango, with continuously updated timelines coming from editorial. The team worked to have certain locked shots finished for the Sundance submission, while saving much of the cleanup and other CG-heavy shots for the final picture lock.

Tango audio engineer Nick Patronella also tackled dialogue edit, sound design and mix for the submission out of the Dallas studio.

Can you talk about the VFX?
Barrow, Harper, Hartstein: The cut was locked in late November, and the heavy lifting really began. With delivery looming, Tango’s Flame artists Allen Robbins, Joey Waldrip, David Hannah, David Laird, Artie Peña and Zack Smith divided up the effects shots, which ranged from environmental and period-specific cleanup to beauty work such as de-aging, crowd simulation, CG sign creation and more.

(L-R) Tango’s Artie Peña, Connor Adams, Allen Robbins in one of the studio’s Flame suites.

3D artist Connor Adams used Houdini, Mixamo and Maya to create CG elements and crowds, with final comps done in Nuke and sent to Flame for final color. Over 120 VFX shots were handled in total, and Flame was the go-to for effects. Color and much of the effects happened simultaneously, which made for a nice workflow since the project didn’t have major VFX needs that would have impacted color.

What about the color grade?
Barrow, Harper, Hartstein: Directors Buck and Schlingmann and DP Valdes-Lora worked with Tango colorist Allen Robbins to craft the final look of the film — with the color grade also done in Flame. The trio had prepped for a Kodachrome-style look, especially for the exteriors but really for the film overall. They found important reference in selections of Robert Capa photographs.

Buck, Schlingmann and Valdes-Lora responded mostly to Kodachrome’s treatment of blues, browns, tans, greens and reds (while staying true to skin tones), but also to its gamma values, not being afraid of deep shadows and contrast wherever appropriate. Valdes-Lora wanted to avoid lighting and exposing to a custom LUT on set that reflected this Kodachrome look, in case they wanted to change course during the process. With the help of Tango, however, they discovered that dialing back the Capa look grounded the film a little more and made the characters “feel” more accessible. The roots of the inspiration remained in the image, but a little more naturalism and a little more softness served the story better.

Because of that, they monitored on set with Alexa 709, which Valdes-Lora felt he could expose for while still leaving enough room in the grade. Production designer Jonathan Rudak (another regular collaborator with the directors) was on the same page during prep in terms of reflecting this Capa color style, and the practical team did what they could to make sure the set elements complemented the approach.

What about the audio post?
Barrow, Harper, Hartstein: With the effects and color almost complete, the team headed to Skywalker Ranch for a week of final dialogue edit, mix, sound design and Foley, led by Skywalker’s Danielle Dupre, Kim Foscato and E. Larry Oatfield. The team was also able to simultaneously approve color sections in Skywalker’s Stag Theater, allowing for an ultra-efficient schedule. With the final mix in hand, the film was mastered just after Christmas so that DCP production could begin.

Since a portion of the film was musical, how complex was the audio mix?
Skywalker sound mixer Dupre: The musical number was definitely one of the most challenging but rewarding scenes to design and mix. It was such a strong creative idea that played so deeply into the main character. The challenge was in striking a balance between tying it into the realism of the film while also leaning into the grandiosity of the musical to really sell the idea.

It was really fun to play with a combination of production dialogue and studio recordings to see how we could make it work. It was also really rewarding to create a soundscape that starts off minimally and simply and transitions to Broadway scale almost undetectably — one of the many exciting parts to working with creative and talented filmmakers.

What was the biggest challenge in post?
Barrow, Harper, Hartstein: Finishing a film in five to six weeks during the holidays was no easy feat. Luckily, we were able to have our directors hands-on for all final color, VFX and mix. Collaborating in the same room is always the best when you have no time to spare. We had a schedule where each day was accounted for — and we stuck to it almost down to the hour.

 

DP Chat: Madam Secretary’s Learan Kahanov

By Randi Altman

Cinematographer Learan Kahanov’s love of photography started at an early age, when he would stage sequences and scenes with his Polaroid camera, lining up the photos to create a story.

He took that love of photography and turned it into a thriving career, working in television, features and commercials. He currently works on the CBS drama Madam Secretary — where he was initially hired to be the A-camera operator and additional DP. He shot 12 episodes and tandem units, then took over the show fully in Season 3. The New York-shot, Washington, DC-set show stars Téa Leoni as the US Secretary of State, following her struggle to balance her work and personal life.

We recently reached out to Kahanov to find out more about his path, as well as his workflow, on Madam Secretary.

Learan Kahanov on set with director Rob Greenlea.

Can you talk about your path to cinematography?
My mother is a sculptor and printmaker, and when I was in middle school, she went back to get a degree in fine arts with a minor in photography. This essentially meant I was in tow, on many a weeknight, to the darkroom so she could do her printing and, in turn, I learned as well.

I shot mostly black and white all through middle school and high school. I would often use my mother’s art studio to shoot the models who posed for the drawing class she taught. Around the same time, I developed a growing fascination with animal behavior and strove to become a wildlife photographer, until I realized I didn’t have the patience to sit in a tree for days to get the perfect shot.

I soon turned my attention to videography while working at a children’s museum, teaching kids how to use the cameras and how to make short movies. I decided to pursue cinematography officially in high school. I eventually found myself at NYU film school, based on my photography portfolio. As soon as I got to New York City, I started working on indie films as an electrician and gaffer, shooting every student film and indie project I could.

What inspires you artistically? And how do you simultaneously stay on top of advancing technology that serves your vision?
I could list artists or filmmakers whose work I gravitate to, but the main thing I learned from my mother about art is that it’s about a feeling. Whether it’s being affected by a beautifully photographed image of a woman in a commercial or getting sucked into the visuals in a wildlife documentary, if you can evoke a feeling and/or create an emotion, you have made art.

Madam Secretary

I am always looking at things around me, and I’m always aware of how light falls on the world around me. Or how the shape of everyday objects and places change depending on the time, the weather or just my mood at the moment.

My vision of a project is always born out of the story, so the key for me is to always use technology (new or old) to support that story. Sometimes the latest in LED technology is the right tool for the job, sometimes it’s a bare light bulb attached to the underside of a white, five-gallon paint bucket (a trick gaffer Jack Coffin and I use quite often). I think the balance between vision and technology is a two-way street — the key is to recognize when the technology serves your vision or the other way around.

What new technology has changed the way you work?
In the area of lighting, I have found that no matter what new tools come onto the scene, I still hold true to the go-to lighting techniques I have preferred for years.

A perfect example would be my love for book lights — a book light is a bounced light that then goes through another layer of diffusion, which is perfect for lighting faces. Whether I am using an old Mole Richardson 5K tungsten unit or the newer ARRI S60 SkyPanels, the concept and end result are basically the same.

That being said, for location work the ARRI LED SkyPanels have become one of the go-to units on my current show, Madam Secretary. The lights’ high output, low power consumption, ease of matching existing location color sources and quick effects make them an easy choice for dealing with the faster-paced TV production schedule.

On-set setup

One other piece of gear that I have found myself calling for on a daily basis, since my key grip Ted Lehane introduced me to it, is a diffusion material called Magic Cloth, which is produced by The Rag Place. This material can work as a bounce as well as a diffusion, and you can light directly through it. It produces a very soft light, as it’s fairly thick, but it does not change the color temperature of the source light. This new material, in conjunction with new LED technology, has created some interesting opportunities for my team.

Many DPs talk about the latest digital sensor, camera support (drone/gimbals, etc.) or LED lighting, but sometimes it’s something very simple, like finding a new diffusion material that can really change the look and the way I work. In fact, I think gripology in general often gets overlooked in the current affairs of filmmaking where everything seems to need to be “state of the art.”

What are some of your best practices or rules that you try to follow on each job?
I have one hard and fast rule in any project I shoot: support the story! I like to think of myself as a filmmaker first, using cinematography as a way to contribute to the filmmaking process. That being said, we can create lots of “rules” and have all the “go-to practices” to create beautiful images, but if what you are doing doesn’t advance the story, or at the very least create the right mood for the scene, then you are just taking a picture.

There are definite things I do because I simply prefer how they look, but if something doesn’t make sense for the scene or the move (based on the director’s vision and my own), I will adjust what I do to make sure I am always supporting the story. There are definitely times when a balance is needed. We don’t create in a bubble, as there are all the other factors to consider, like budget, time, shooting conditions, etc. It’s this need/ability to be both technician and artisan that excites me the most about my job.

Can you explain your ideal collaboration with the director when setting the look of a project?
When working in episodic TV, every episode — essentially every eight days — there is a different director. Even when I have a repeat director, I have to adapt quickly to each director’s style. This goes beyond just being a chameleon from a creative standpoint — I need to quickly establish trust and a shorthand to help the director put their stamp on their episode, all while staying within the already established look of the show.

Madam Secretary

I have always considered myself not an “idea man” but rather a “make-the-idea-better” man. I say this because being able to collaborate with a director and not just see their vision, but also enhance it and take it a step further (and see their excitement in the process), is completely fulfilling.

Tell us about Madam Secretary. How would you describe the overarching look of the show? How early did you get involved in the production?
I have been a part of Madam Secretary since the beginning, minus the pilot. I was hired as the A camera operator and as an additional DP. Jonathan Brown, ASC, shot the pilot and was the DP for the first two seasons. He was also one of our directors for the first three seasons. In addition to shooting tandem/2nd unit days and filling in on scout days, I was the DP whenever Jonathan directed. So while I didn’t create the initial look of the show, I worked closely with Jonathan as the seasons went on until I officially took over in the third season.

Since I took over (and during my episodes), I felt an obligation to hold true to the original look and intent of the show, while also adding my personal touch and allowing the show’s look to evolve with the series. The show does give us opportunities every week to create something new. While the recurring sets/locations do have a relatively set look, every episode takes us to new parts of the world and to new events.

It gives the director, production team and me an opportunity to create different looks and aesthetics to differentiate these settings from Madam Secretary’s life in DC. While it’s a quick schedule to prep, research and create new looks for convincing foreign locations every episode (we shoot 99% of the show in New York), it is a challenge that brings a creativity and excitement to the job that I really enjoy.

Learan Kahanov on set with Hillary Clinton for the episode E Pluribus Unum.

Can you talk about what you shoot on and what lenses you use, etc.?
The show is currently shooting on Alexa SXTs with Leica Summicron prime lenses and Fujinon Cabrio zooms. One of the main things I did when I officially took over the show was to switch to the Leica primes. We did some testing with Téa Leoni and Tim Daly on our sets to see how the lenses treated skin tones.

Additionally, we wanted to see how they reacted to the heavy backlight and to the blown-out windows we have on many of our sets. We all agreed that the lenses were sharp, but also realized that they created a softer feel on our actors’ faces, had a nice focus fall-off and handled the highlights really well. They are flexible enough to help me create different looks while still retaining a consistency for the show. The lenses have an interesting flare characteristic that sometimes makes controlling them difficult, but it all adds to the current look of the show and has yet to be limiting.

You used a Blackmagic Pocket Cinema camera for some specialized shots. Can you describe those?
The show has many scenes that entail some specialized shots that need a small but high-res camera that has an inherently different feel from the Alexa. These shots include webcam and security camera footage. There are also many times when we need to create body/helmet cam footage to emulate images recorded from military/police missions that then were played back in the president’s situation room. That lightweight, high-quality camera allows for a lot of flexibility. We also employ other small cameras like GoPro and DJI Osmo, as well as the Sony A7RII with PL mount.

Madam Secretary

Any challenging scenes that you are particularly proud of?
I don’t think there is an episode that goes by without some type of challenge, but one in particular that I was really happy with took place on a refugee boat in the middle of the Mediterranean Sea.

The scene was set at night, with refugees making a harrowing trip from the north coast of Libya to France. Since we couldn’t shoot on the ocean at night, we brought the boat and a storm into the studio.

Our production designer and art department cut a real boat in half and brought it onto the stage. Drew Jiritano and his special effects team then placed the boat on a gimbal and waterproofed the stage floor so we could place rain towers and air cannons to simulate a storm in the middle of the sea.

Using a technocrane, handheld cameras and interactive lighting, we created a great scene that immersed the audience in a realistic depiction of the dramatic journey that happens more often than most Americans realize.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Blackmagic offers next-gen Ursa Mini Pro camera, other product news

Blackmagic has introduced the Ursa Mini Pro 4.6K G2, a second-generation Ursa Mini Pro camera featuring fully redesigned electronics and a new Super 35mm 4.6K image sensor with 15 stops of dynamic range that combine to support high-frame-rate shooting at up to 300 frames per second.

In addition, the Ursa Mini Pro 4.6K G2 supports Blackmagic RAW and features a new USB-C expansion port for direct recording to external disks. Ursa Mini Pro 4.6K G2 is available now for $5,995 from Blackmagic resellers worldwide.

The new user interface

Key Features:
• Digital film camera with 15 stops of dynamic range
• Super 35mm 4.6K sensor with Blackmagic Design Generation 4 Color Science
• Supports project frame rates up to 60fps and off-speed slow motion recording up to 120fps in 4.6K, 150fps in 4K DCI and 300fps in HD Blackmagic RAW
• Interchangeable lens mount with EF mount included as standard. Optional PL, B4 and F lens mounts available separately
• High-quality 2-, 4- and 6-stop neutral density (ND) filters with IR compensation designed to specifically match the colorimetry and color science of Blackmagic URSA Mini Pro 4.6K G2
• Fully redundant controls including external controls that allow direct access to the most important camera settings such as external power switch, ND filter wheel, ISO, shutter, white balance, record button, audio gain controls, lens and transport control, high frame rate button and more
• Built-in dual CFast 2.0 recorders and dual SD/UHS-II card recorders allow unlimited duration recording in high quality
• High-speed USB-C expansion port for recording directly to an external SSD or flash disk
• Lightweight and durable magnesium alloy body
• LCD status display for quickly checking timecode, shutter and lens settings, battery, recording status and audio levels
• Support for Blackmagic RAW files in constant bitrate 3:1, 5:1, 8:1 and 12:1 or constant quality Q0 and Q5 as well as ProRes 4444 XQ, ProRes 4444, ProRes 422 HQ, ProRes 422, ProRes 422 LT, ProRes 422 Proxy recording at 4.6K, 4K, Ultra HD and HD resolutions
• Features all standard connections, including dual XLR mic/line audio inputs with phantom power, 12G-SDI output for monitoring with camera status graphic overlay and separate XLR 4-pin power output for viewfinder power, headphone jack, LANC remote control and standard 4-pin 12V DC power connection
• Built-in high-quality stereo microphones for recording sound
• Offers a four-inch foldout touchscreen for on-set monitoring and menu settings
• Includes full copy of DaVinci Resolve color grading and editing software

Additional Blackmagic news:
– Blackmagic adds Blackmagic RAW to Blackmagic Pocket Cinema Camera 4K
– Blackmagic intros DeckLink Quad HDMI recorder
– Blackmagic updates DeckLink 8K Pro
– Blackmagic announces long-form recording on Blackmagic Duplicator 4K

Colorist Christopher M. Ray talks workflow for Alexa 65-shot Alpha

By Randi Altman

Christopher M. Ray is a veteran colorist with a varied resume that includes many television and feature projects, including Tomorrowland, Warcraft, The Great Wall, The Crossing, Orange Is the New Black, Quantico, Code Black and Alpha. These projects have taken Ray all over the world, including remote places throughout North America, Europe, Asia and Africa.

We recently spoke with Ray, who is on staff at Burbank’s Picture Shop, to learn more about his workflow on the feature film Alpha, which focuses on a young man trying to survive alone in the wilderness after he’s left for dead during his first hunt with his Cro-Magnon tribe.

Ray was dailies colorist on the project, working with supervising DI colorist Maxine Gervais. Gervais of Technicolor won an HPA Award for her work on Alpha in the Outstanding Color Grading — Feature Film category.

Let’s find out more….

Chris Ray and Maxine Gervais at the HPA Awards.

How early did you get involved in Alpha?
I was approached about working on Alpha right before the start of principal photography. From the beginning I knew that it was going to be a groundbreaking workflow. I was told that we would be working with the ARRI Alexa 65 camera, mainly out of an on-set color grading trailer, using FilmLight’s Daylight software.

Once I was on board, our main focus was to design a comprehensive workflow that could accommodate on-set grading and the Daylight software while adapting to the ever-changing challenges that the industry brings. Being involved from the start was actually a huge perk for me. It gave us the time we needed to design and really fine-tune the extensive workflow.

Can you talk about working with the final colorist Maxine Gervais and how everyone communicated?
It was a pleasure working with Maxine. She’s really dialed in to the demands of our industry. She was able to fly to Vancouver for a few days while we were shooting the hair/makeup tests, which gave us a chance to communicate in person. We were able to sit down and discuss creative approaches to the feature right away, which I appreciated, as I’m the type of person who likes to dive right in.

At the film’s conception, we set in motion a plan to incorporate a Baselight Linked Grade (BLG) color workflow from FilmLight. This would allow my color grades in Daylight to transition smoothly into Maxine’s Baselight software. We knew from the get-go that there would be several complicated “day for night” scenes that Maxine and I would want to bring to fruition right away. Using the BLG workflow, I was able to send her single ARRIRAW frames that gave that “day for night” look we were searching for. She was then able to send them back to me as BLG files. Even in remote locations, it was easy for me to access the BLG grade files via the Internet.

[Maxine Gervais weighs in on working with Ray: “Christopher was great to work with. As the workflow on the feature was created from scratch, he implemented great ideas. He was very keen on the whole project and was able to adapt to the ever-changing challenges of the show. It is always important to have on-set color dialed in correctly, as it can be problematic if it is not accurately established in production.”]

How did you work with the DP? What direction were you given?
Being on set, it was very easy for DP Martin Gschlacht to come over to the trailer and view the current grade I was working on. Like Maxine, Martin already had a very clear vision for the project, which made it easy to work with him. Oftentimes, he would call me over on set and explain his intent for the scene. We would brainstorm ways I could assist him in making his vision come to life. Audiences rarely see raw camera files, or realize how much color can influence the story being told.

It also helps that Martin is a master of aesthetics. The content being captured was extremely striking; he has this natural intuition about what look is needed for each environment that he shoots. We shot in lush rain forests in British Columbia and arid badlands in Alberta, which each inspired very different aesthetics.

Whenever I had a bit of down time, I would walk over to set and just watch them shoot, like a fly on the wall quietly observing and seeing how the story was unfolding. As a colorist, it’s so special to be able to observe the locations on set. Seeing the natural desaturated hues of dead grass in the badlands or the vivid lush greens in the rain forest with your own eyes is an amazing opportunity many of us don’t get.

You were on set throughout? Is that common for you?
We were on set throughout the entire project, as a lot of our filming locations were in remote areas of British Columbia and Alberta, Canada. One of our most demanding shooting locations was Dinosaur Provincial Park in Brooks, Alberta. The park is a UNESCO World Heritage site that no one had been allowed to film at prior to this project. I needed easy access to the site in order to communicate with the film’s executive team and production crew. They were able to screen footage in their trailer, and we had this seamless back-and-forth workflow. This also allowed them to view high-quality files in a comfortable and controlled environment. Also, the ability to flag any potential issues and address them immediately on set was incredibly valuable on a film of such size and complexity.

Alpha was actually the first time I worked in an on-set grading trailer. In the past I usually worked out of the production office. I have heard of other films working with an on-set trailer, but I don’t think I would say that it is overly common. Sometimes, I wish I could be stationed on set more often.

The film was shot mostly with the Alexa 65, but included footage from other formats. Can you talk about that workflow?
The film was mostly shot on the Alexa 65, but several other formats were used as well. For most of the shoot there was a second unit shooting with Alexa XT and Red Weapon cameras, with a splinter unit shooting B-roll footage on Canon 1D, 5D and Sony A7S. In addition to these, there were units in Iceland and South Africa shooting VFX plates on a Red Dragon.

By the end of the shoot, there were several different camera formats and over 10 different resolutions. We used the 6.5K Alexa 65 resolution as the master resolution and mapped all the others into it.
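To illustrate what mapping mixed formats into one master raster involves, here is a small Python sketch. It is not the production’s actual conform tooling; the 6560x3100 master dimensions approximate the Alexa 65 frame, and the example source resolutions are typical figures for the cameras named above:

```python
MASTER_W, MASTER_H = 6560, 3100   # approximate 6.5K Alexa 65 raster

def fit_to_master(src_w, src_h):
    """Scale a source frame to fit inside the master raster,
    preserving aspect ratio, and center it."""
    scale = min(MASTER_W / src_w, MASTER_H / src_h)
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    x_off = (MASTER_W - out_w) // 2
    y_off = (MASTER_H - out_h) // 2
    return scale, (out_w, out_h), (x_off, y_off)

for name, w, h in [("Alexa XT 2.8K", 2880, 1620),
                   ("Red Weapon 8K", 8192, 4320)]:
    scale, size, offset = fit_to_master(w, h)
    print(f"{name}: scale {scale:.3f} -> {size} at offset {offset}")
```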

The Alexa 65 camera cards were backed up to 8TB “sled” transfer drives using a Codex Vault S system. The 8TB transfer drives were then sent to the trailer where I had two Codex Vault XL systems — one was used for ingesting all of the footage into my SAN and the second was used to prepare footage for LTO archival. All of the other unit footage was sent to the trailer via shuttle drives or Internet transfer.

After the footage was successfully ingested to the SAN with a checksum verification, it was ready to be colored, processed and then archived. We had eight LTO6 decks running 24/7, as the main focus was to archive the enormous amount of high-res camera footage we were receiving. The Alexa 65 alone generated about 2.8TB per hour for each camera.
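For readers curious what that checksum verification step amounts to, here is a minimal sketch of the core idea: hash each file after the copy and compare it against a manifest written at offload time. Dedicated on-set tools do this with faster hash algorithms and full reporting; this version uses only the Python standard library:

```python
import hashlib
from pathlib import Path

def file_hash(path, algo="md5", chunk=8 * 2**20):
    """Hash a file in 8MB chunks so memory use stays flat
    even on multi-gigabyte camera files."""
    h = hashlib.new(algo)
    with Path(path).open("rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def verify(manifest, root):
    """manifest maps relative paths to expected hex digests.
    Returns the paths whose current hash does not match."""
    return [rel for rel, expected in manifest.items()
            if file_hash(Path(root) / rel) != expected]
```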

Had you worked with Alexa 65 footage previously?
Many times. A few years ago, I was in China for seven months working on The Great Wall, which was one of the first films to shoot with the Alexa 65. I had a month of in-depth pre-production with the camera: testing, shooting and honing the camera’s technology. Working very closely with ARRI and Codex technicians during this time, I was able to design the most efficient workflow possible. Even as the shoot progressed, I continued to communicate closely with both companies. As new challenges arose, we developed and implemented solutions that kept production running smoothly.

The workflow we designed for The Great Wall was very close to the workflow we ended up using on Alpha, so it was a great advantage that I had previous experience working in-depth with the camera.

What were some of the challenges you faced on this film?
To be honest, I love a challenge. As colorists, we are thrown into tricky situations every day. I am thankful for these challenges; they improve my craft and enable me to become more efficient at problem solving. One of the biggest challenges I faced on this particular project was working with so many different units, given the number of units shooting, the sheer size of the footage and the dozens of formats involved.

We had to be accessible around the clock, most of us working 24 hours a day. Needless to say, I made great friends with the transportation driving team and the generator operators. I think they would agree that my grading trailer was one of their largest challenges on the film since I constantly needed to be on set and my work was being imported/exported in such high resolutions.

In the end, as I was watching this absolutely gorgeous film in the theater it made sense. Working those crazy hours was absolutely worth it — I am thankful to have worked with such a cohesive team and the experience is one I will never forget.

DP Petr Hlinomaz talks about the look of Marvel’s The Punisher

By Karen Moltenbrey

For antiheroes like Frank Castle, the lead character in the Netflix series Marvel’s The Punisher, morality comes in many shades of gray. A vigilante hell-bent on revenge, the Marine veteran used whatever lethal means possible — kidnapping, murder, extortion — to exact revenge on those responsible for the deaths of his family. However, Castle soon found that the criminal conspiracy that set him on this destructive path ran far deeper than initially imagined, and he had to decide whether to embrace his role as the Punisher and help save other victims, or retreat to a more solitary existence.

Alas, in the end, the decision to end the Punisher’s crusade was made not by Frank Castle nor by the criminal element he sought to exact justice upon. Rather, it was made by Netflix, which recently announced it was cancelling all its live-action Marvel shows. The news came a mere month after Season 2 was released, while many fans were still watching the season’s action play out.

Petr Hlinomaz

The Punisher character is dark and intense, as is the show itself. The overall aesthetic is dim and gritty to match the action, yet rich and beautiful at the same time. This is the world initially envisioned by Marvel and then brought to life on screen late in Season 1 by director of photography Petr Hlinomaz under the direction of showrunner Steve Lightfoot.

The Punisher is based on the Marvel Comics character by the same name, and the story is set in the Marvel Cinematic Universe, meaning it shares DNA with the films and other TV shows in the franchise. There is a small family resemblance, but The Punisher is not considered a spin-off of Marvel’s Daredevil, despite the introduction of the Punisher (played by Jon Bernthal) on Season 2 of that series, for which Hlinomaz served as a camera operator and tandem DP. Therefore, there was no intent to match the shows’ cinematic styles.

“The Punisher does not have any special powers like other Marvel characters possess; therefore, I felt that the photographic style should be more realistic, with strong compositions and lighting resembling Marvel’s style,” Hlinomaz says. “It’s its own show. In the Marvel universe, it is not uncommon for characters to go from one show to another and then another after that.”

Establishing the Look
It seems that Hlinomaz followed somewhat in the characters’ footsteps himself, later joining The Punisher crew and taking on the role of DP after the first 10 episodes. He sussed out what Lightfoot liked as far as framing, look, camera movement and lighting were concerned, and built upon the look of those initial 10 episodes to finish out the last three episodes of Season 1. Then Hlinomaz enhanced that aesthetic on Season 2.

Hlinomaz was assisted by Francis Spieldenner, a Marvel veteran familiar with the property, who in both Season 1 and Season 2 functioned as A camera/Steadicam operator and shot tandems, in addition to serving as DP on two episodes (209 and 211) of Season 2.

“Steve and I had some discussions regarding the color of lighting for certain scenes in Season 2, but he pretty much gave me the freedom of devising the look and camera movement for the show on my own,” recalls Hlinomaz. “I call this look ‘Marvel Noir,’ which is low light and colorful. I never use the normal in-camera color temperature settings (for instance, 3,200K for night and 5,600K for day). I always look for different settings that fit the location and feel of the scene, and build the lighting from there. My approach is very source-oriented, and I do not like cheating in lighting when shooting scenes.”

According to Hlinomaz, the look they were striving for was a mix of Taxi Driver and The Godfather, but darker and more raw. “We primarily used wide-angle lenses to place our characters into our sets and scenery and to see geographically where they are. At times we strived to be inside the actors’ heads.” They also used the Jason Bourne films as a guideline, “making Jon (the Punisher) and all our characters feel small in the large NYC surroundings,” he adds. “The stunt sequences move fast, continuously and are brutally real.”

In terms of color, Hlinomaz uses very low light with a dark, colorful palette. This complements New York City, which is colorful, while the city’s multitude of lights and colors “provide a spectacular base for the filming.” The show highlights various locations throughout the city. “We felt the look is very fitting for this show, the Punisher being an earnest human being in the beginning of his life, but after joining the force is troubled by his past, PTSD and his family being brutally slaughtered, and in turn, he is brutal and ruthless to ‘bad people,’” explains Hlinomaz.

For instance, in a big fight scene in Season 1, Episode 11 at Micro’s hideout, Hlinomaz showed the top portion of the space to its fullest extent. “It looks dark, mysterious. We used a mixture of top, side and uplighting to make the space look interesting, with lots of color temperature mixes,” he says. “There was a plethora of leftover machinery and huge transformers and generators that were no longer in use, and stairwells that provided a superb backdrop for this sequence.”

The Workflow
For the most part, Hlinomaz has just one day to prep for an episode with the director, and that is often during the technical scout day. “Aside from reading the script and exchanging a few emails, that is the only prep we get,” he says.

During the technical scout, a discussion takes place with the director concerning how the scenes should look and feel. “We discuss lighting and grip, set dressing, blocking, shooting direction, time of day, where we light from, where the sun should be and so on, along with any questions concerning the locations for the next episodes,” he says.

During the scout and rehearsal, Hlinomaz looks for visually stimulating backgrounds, camera angles and shots that will enhance and propel the story line.

When they start shooting the episode, the group rehearses the scene, discusses the most efficient or suitable blocking for the scene and which lenses to use. During the shoot, Hlinomaz takes stills that will be used by the colorists as reference for the lighting, density, color and mood. When the episode is cut and roughly colored, he then will view the episode at the lab (Company 3 in New York) and make notations. Those notes are then provided to the post producer and colorist Tony D’Amore (from Encore) for the final color pass and Lightfoot’s approval.

The group employs HDR, “which, in a way, is hard because you always have to protect for overexposure on sources within the frame,” adds Hlinomaz. In fact, D’Amore has credited Hlinomaz, the directors and Lightfoot with devising unique lighting scenarios that highlighted the HDR aspect of the show in Season 2.

Tools of the Trade
The Punisher’s main unit uses two cameras – “we have crew to cover two at all times,” Hlinomaz says. That number increases to three or more as needed for certain sequences, though there are times when just one camera is used for certain scenes and shots.

According to Hlinomaz, Netflix and Marvel only shoot with Red 4K cameras and up. For the duration of The Punisher shoot, the crew carried just four “Panavised” Red cameras. “We shot 4K but frequently used the 5K and 6K settings to go a bit wider with the [Panavision] Primo lenses, or for a tilt and swing lens special look,” he says, adding that he has used Red cameras for the past four years and is still impressed with the color rendering of the Red sensors. Prior to shooting the series, he tested Zeiss Ultra Prime and Leica Summilux lenses along with Panavision Primos; Hlinomaz chose the Primos for their 3D rendering of the subjects.
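The “go a bit wider” comment is worth unpacking: a higher-resolution setting on the same sensor reads a physically larger window, so the same lens covers a wider field of view. A quick calculation using approximate window widths for a Dragon-class 6K sensor (check the camera’s spec sheet for exact figures):

```python
import math

SENSOR_WIDTH_MM = {"4K": 20.5, "5K": 25.6, "6K": 30.7}  # approximate

def horizontal_fov(sensor_width_mm, focal_mm):
    """Horizontal angle of view in degrees for a given focal length."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

for mode, width in SENSOR_WIDTH_MM.items():
    print(f"{mode} window with a 24mm lens: "
          f"{horizontal_fov(width, 24):.0f} degrees horizontal")
# roughly 46 degrees at 4K, 56 at 5K, 65 at 6K from the same lens
```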

The lens set ranged from 10mm to 150mm; there was also an 11:1 zoom lens that was used sparingly. It all depended on the shot. In Episode 13, when Frank finally shoots and kills hitman Billy Russo (aka Jigsaw), Hlinomaz used an older 12mm lens with softer edges to simulate Billy’s state as he is losing a lot of blood. “It looked great, somewhat out of focus along the edges as Frank approaches; then, when Frank steps closer for the kill, he comes into clear focus,” Hlinomaz explains.

In fact, The Punisher was shot using the same type of camera and lenses as the second season of the now-cancelled Marvel/Netflix series Luke Cage (Hlinomaz served as a DP on Luke Cage Season 2 and a camera operator for four episodes of Season 1). In addition to wide-angle lenses, the show also used more naturalistic lighting, similar to The Punisher.

Hlinomaz details another sequence pertaining to his choice of cameras and lenses on The Punisher, whereby he used 10mm and 14mm lenses for a fight scene inside an elevator. Spieldenner, the A cam operator, was inside the elevator with the performers. “We didn’t pull any walls for that, only the ceilings were pulled for one overhead shot when Frank flips a guy over his shoulder,” explains Hlinomaz. “I did not want to pull any walls; when you do, it feels like the camera is on the outside, especially if it’s a small space like that elevator.”

On-Set Challenges
A good portion of the show is filmed outdoors — approximately two-thirds of the series — which always poses an additional challenge due to constantly changing weather conditions, particularly in New York. “When shooting exteriors, you are in the elements. Night exteriors are better than day exteriors because you have more control, unless the day provides constant lighting — full sun or overcast, with no changes. Sometimes it’s impractical or prohibitive to use overhead cover to block out the sun; then you just have to be quick and make smart decisions on how to shoot a scene with backlight on one side and front fill that feels like sunlight on the other, and make it cut and look good together,” explains Hlinomaz.

As he noted earlier, Hlinomaz is a naturalist when it comes to lighting, meaning he uses existing source-driven lighting. “I like simplicity. I use practicals, sun and existing light to drive our light direction,” he further adds. “We use every possible light, from big HMIs all the way down to the smallest battery-driven LED lights. It all depends on a given shot, location, sources and where the natural or existing light is coming from. On the other hand, sometimes it is just a bounce card for a little fill, or nothing extra at all, to make the shot look great.”

All The Punisher sets, meanwhile, have hard ceilings. “That means with our use of lower camera angles and wide lenses, we are seeing everything, including the ceilings, and are not pulling bits of ceilings and hanging any lights up from the grid. All lighting is crafted from the floor, driven by sources, practicals, floor bounces, windows and so on,” says Hlinomaz. “My feeling is that this way, the finished product looks better and more natural.”

Most of Season 1’s crew returned for Season 2, so they were familiar with the dark and gritty style, which made things easier on Hlinomaz. The season begins with the Punisher somewhere in the Midwest before agent Madani brings Frank back to New York, although all the filming took place throughout New York.

One of the more challenging sequences this season, according to Hlinomaz, was an ambulance chase that was filmed in Albany, New York. For the shoot, they used a 30-foot Louma crane and Edge arm from Action Camera cars, and three to four Red cameras. For the actual ambulance drop, they placed four additional cameras. “We had to shoot many different passes with stunts as well as the actors, in addition to the Edge arm pass. It was quite a bit of work,” he says. Of course, it didn’t help that when they arrived in Albany to start filming, they encountered a rain delay, but “we used the time to set up the car and ambulance rigs and plan to the last detail how to approach our remaining days there.” For the ambulance interior, they shot on a greenscreen stage with two ambulances — one on a shaky drive simulation rig and the other mounted 20 feet or so high on a teeter rig that simulated the drop of the highway as it tilted forward until it was pointing straight to the ground.

“If I remember correctly, we spent six days total on that sequence,” says Hlinomaz.

The second season of The Punisher was hard work, but a fun and rewarding experience, Hlinomaz contends. “It was great to be surrounded from top to bottom with people working on this show who wanted to be there 100 percent, and that dedication and our hard work is evident, I believe, in the finished season,” he adds.

As Hlinomaz waited for word on Season 3 of The Punisher, he lent his talents to Jessica Jones, also set in the Marvel Cinematic Universe — and sadly also receiving the same ultimate fate — as Hlinomaz stepped in to help shoot Episode 305, with the new Red DSMC2 Gemini 5K S35 camera. “I had a great experience there and loved the new camera. I am looking forward to using it on my next project,” he adds.


Karen Moltenbrey is a veteran VFX and post writer.

Color plays big role in the indie thriller Rust Creek

In the edge-of-your-seat thriller Rust Creek, confident college student Sawyer (Hermione Corfield) loses her way while driving through very rural Appalachia and quickly finds herself in a life-or-death struggle with some very dangerous men. The modestly budgeted feature from Lunacy Productions — a company that encourages female filmmakers in top roles — packs a lot of power with virtually no pyrotechnics, using well-thought-out filmmaking techniques, including a carefully planned and executed approach to the use of color throughout the film.

Director Jen McGowan and DP Michelle Lawler

Director Jen McGowan, cinematographer Michelle Lawler and colorist Jill Bogdanowicz of Company 3 collaborated to help express Sawyer’s character arc through the use of color. For McGowan, successful filmmaking requires thorough prep. “That’s where we work out, ‘What are we trying to say and how do we illustrate that visually?’” she explains. “Film is such a visual medium,” she adds, “but it’s very different from something like painting because of the element of time. Change over time is how we communicate story, emotion and theme as filmmakers.”

McGowan and Lawler developed the idea that Sawyer is lost, confused and overwhelmed as her dire situation becomes clear. Lawler shot most of Rust Creek handholding an ARRI Alexa Mini (with Cooke S4s) following Sawyer as she makes her way through the late autumn forest. “We wanted her to become part of the environment,” Lawler says. “We shot in winter and everything is dead, so there was a lot of brown and orange everywhere with zero color separation.”

Production designer Candi Guterres pushed that look further, rather than fighting it, with choices about costumes and some of the interiors.

“They had given a great deal of thought to how color affects the story,” recalls colorist Bogdanowicz, who sat with both women during the grading sessions (using Blackmagic’s DaVinci Resolve) at Company 3 in Santa Monica. “I loved the way color was so much a part of the process, even subtly, of the story arc. We did a lot in the color sessions to develop this concept where Sawyer almost blends into the environment at first and then, as the plot develops and she finds inner strength, we used tonality and color to help make her stand out more in the frame.”

Lawler explains that the majority of the film was shot on private property deep in the Kentucky woods, without the use of any artificial light. “I prefer natural light where possible,” she says. “I’d add some contrast to faces with some negative fill and maybe use little reflectors to grab a rake of sunlight on a rock, but that was it. We had to hike to the locations and we couldn’t carry big lights and generators anyway. And I think any light I might have run off batteries would have felt fake. We only had sun about three days of the 22-day shoot, so generally I made use of the big ‘silk’ in the sky and we positioned actors in ways that made the best use of the natural light.”

In fact, the weather was beyond bad: it was punishing. “It would go from rain to snow to tornado conditions,” McGowan recalls. “It dropped to seven degrees and the camera batteries stopped working.”

“The weather issues can’t be overstated,” Lawler adds, describing conditions on the property they used for much of the exterior location. “Our base camp was in a giant field. The ground would be frozen in the morning and by afternoon there would be four feet of mud. We dug trenches to keep craft services from flooding.”

The budget obviously didn’t provide for waiting around for the elements to change, David Lean-style. “Michelle and I were always mindful when shooting that we would need to be flexible when we got to the color grading in order to tie the look together,” McGowan explains. “I hate the term ‘fix it in post.’ It wasn’t about fixing something, it was about using post to execute what was intended.”

Jill Bogdanowicz

“We were able to work with my color grading toolset to fine tune everything shot by shot,” says Bogdanowicz. “It was lovely working with the two of them. They were very collaborative but were very clear on what they wanted.”

Bogdanowicz also adapted a film-emulation LUT based on the characteristics of a Fujifilm print stock, and added a subtle hint of digital grain via a Boris FX Sapphire plug-in, to help give the imagery a unifying look and filmic feel. At the very start of the process, the colorist recalls, “I showed Jen and Michelle a number of ‘recipes’ for looks and they fell in love with this one. It’s somewhat subtle and elegant, and it made ‘electric’ colors not feel so electric, but it has a film-style curve with strong contrast in the mids and shadows you can still see into.”
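
Neither the exact Fujifilm emulation nor Sapphire’s grain engine is documented here, but the two ingredients Bogdanowicz describes can be sketched. Below is a minimal, purely illustrative Python/NumPy version of a film-style S-curve (strong contrast in the mids, with a toe that keeps shadows readable) plus subtle grain; all function names and constants are invented for the example, not taken from the film’s actual grade.

```python
import numpy as np

def film_s_curve(x, contrast=6.0):
    """Illustrative S-curve: more contrast through the mids, with a toe
    and shoulder that roll off smoothly so shadows compress without
    going fully black ("shadows you can still see into")."""
    s = 1.0 / (1.0 + np.exp(-contrast * (x - 0.5)))   # logistic curve
    lo = 1.0 / (1.0 + np.exp(contrast * 0.5))          # value at x = 0
    hi = 1.0 / (1.0 + np.exp(-contrast * 0.5))         # value at x = 1
    return (s - lo) / (hi - lo)                        # remap to 0..1

def add_grain(rgb, strength=0.02, seed=7):
    """Subtle monochromatic grain (same value on all three channels),
    attenuated in deep shadows and highlights so it reads as texture
    rather than noise."""
    rng = np.random.default_rng(seed)
    luma = rgb.mean(axis=-1, keepdims=True)
    noise = rng.normal(0.0, strength, luma.shape)
    return np.clip(rgb + noise * (0.25 + luma * (1.0 - luma)), 0.0, 1.0)

frame = np.random.rand(270, 480, 3)   # stand-in for a decoded frame
graded = add_grain(film_s_curve(frame))
```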

McGowan says she was quite pleased with the work that came out of the color theater. “Color is not one of the things audiences usually pick up on, but a lot of people do when they see Rust Creek. It’s not highly stylized, and it certainly isn’t a distracting element, but I’ve found a lot of people have picked up on what we were doing with color and I think it definitely helped make the story that much stronger.”

Rust Creek is currently streaming on Amazon Prime and Google.

Helicopter Film Services intros Titan ultra-heavy lifting drone

Helicopter Film Services (HFS) has launched an ultra-heavy-lift drone that pairs a large, capable airframe with the ARRI SRH-3 stabilized remote head. Known as the Titan, the drone’s SRH-3 head enables easy integration of existing ARRI lens motors and other functionality directly with the ARRI Alexa 65 and LF cameras.

HFS developed the large drone in response to requests from some legendary DPs and VFX supervisors to enable filmmakers to fly large-format digital or 35mm film packages.

“We have trialed other heavy-lift machines, but all of them have been marginal in terms of performance when carrying the larger cameras and lenses that we’re asked to fly,” says Alan Perrin, chief UAV pilot at HFS. “What we needed, and what we’ve designed, is a system that will capably and safely operate with the large-format cameras and lenses that top productions demand.”

The Titan combines triple-redundant flight controls, a double-redundant power supply and a ballistic recovery system in an aircraft that can deploy and operate easily on any production requiring substantial flight duration. The drone can easily fly a 35mm film package, such as an ARRI 435 with a 400-foot magazine.

Here are some specs:
• Optimized for large-format digital and 35mm film cameras
• Max payload up to 30 kilograms
• Max take-off mass — 80 kilograms
• Redundant flight control systems
• Ballistic recovery system (parachute)
• Class-leading stability
• Flight duration up to 15 minutes (subject to payload weight and configuration)
• HD video downlink
• Gimbal: ARRI SRH-3 or Movi XL

Final payload-proving flights are taking place now, and the company is in the process of planning first use on major productions. HFS is also exploring the ability to fly a new 65mm film camera on the Titan.

SciTech Medallion Recipient: A conversation with Curtis Clark, ASC

By Barry Goch

The Academy of Motion Picture Arts and Sciences has awarded Curtis Clark, ASC, the John A. Bonner Medallion “in appreciation for outstanding service and dedication in upholding the high standards of the Academy.” The presentation took place in early February, and just prior to the event I spoke to Clark, asking him to reflect on the transition from film to digital cinema and his contributions to the industry.

Clark’s career as a cinematographer includes features, TV and commercials. He is also the chair of the ASC Motion Imaging Technology Council, which developed the ASC CDL.

Can you reflect on the changes you’ve seen over your career and how you see things moving ahead in the future?
Once upon a time, life was an awful lot simpler. I look back nostalgically on when it was all film-based, and the responsibilities of the cinematographer included following up on the look of dailies and following through with any photographic testing that helped home in on the desired look. It had its photochemical limitations; its analog image structure was not as malleable or tonally expansive as the digital canvas we have now.

Do you agree that Kodak’s Cineon helped us to this digital revolution — the hybrid film/digital imaging system where you would shoot on film, scan it and then digitally manipulate it before going back out to film via a film recorder?
That’s where the term digital intermediate came into being, and it was an eye-opener. I think at the time not everyone fully understood the ramifications of the impact it was making. Kodak created something very potent and led the way in terms of methodologies for integrating digital into what was then called a hybrid imaging system, combining digital and film.

The DCI (Digital Cinema Initiatives) was created to establish digital projection standards. Without a standard, we’d potentially be creating chaos in terms of how to move forward. For the studios, distributors and exhibitors, it would be a nightmare. Can you talk about that?
In 2002, I had been asked to form a technology committee at the ASC to explore these issues: how the new emerging digital technologies were impacting the creative art form of cinematography and of filmmaking, and also to help influence the development of these technologies so they best serve the creative intent of the filmmaker.

DCI proposed that for digital projection to be considered ready for primetime, its image quality needed to be at least as good as, if not better than, a print from the original negative. I thought this was a great commitment that the studios were making. For them to say digital projection was going to be judged against a film print projection from the original camera negative of the exact same content was a fantastic decision. Here was a major promise of a solution that would give digital cinema image projection an advantage since most people saw release prints from a dupe negative.

Digital cinema had just reached the threshold of being able to do 2K digital cinema projection. At that time, 4K digital projection was emerging, but it was a bit premature to settle on that as a standard. So you had digital cinema projection and the emergence of a sophisticated digital intermediate process that could create the image quality you wanted from the original negative, now shown through a digital projector.

In 2004, the Michael Mann film Collateral was shot with the Grass Valley Viper FilmStream and the Sony F900 and F950 cameras, the latest generation of digital motion picture cameras — basically video cameras that were becoming increasingly sophisticated, with better dynamic range and tonal contrast, offering 24fps and other multiple frame rates, but 24p was the key.
These cameras were used in the most innovative and interesting manner, because Mann combined film with digital, using digital for the low-light-level night scenes and film for the higher-light-level day exteriors and day interiors, where there was no problem with exposure.

Because of the challenge of shooting the night scenes, they wanted to shoot at such low light levels that film would potentially be a bit degraded in terms of grain and fog levels. If you had to overrate the negative, you needed to underexpose and overdevelop it, which was not desirable, whereas the digital cameras thrived in lower light levels. Also, you could shoot at a stop that gave you better depth of field. At the time, it was a very bold decision. But looking back on it historically, I think it was the inflection point that brought the digital motion picture camera into the limelight as a possible alternative to shooting on film.

That’s when they decided to do the Camera Assessment Series tests, which evaluated all the different digital cinema cameras available at the time?
Yeah, with the idea being that we’d never compare two digital cameras against each other; we’d always compare the digital camera against a film reference. We did that first Camera Assessment Series, which was the first step toward validating the digital motion picture camera as viable for shooting motion pictures compared with shooting on film. And we got part way there. A couple of the cameras were very impressive: the Sony F35, the Panavision Genesis, the ARRI D21 and the Grass Valley Viper were all pretty reasonable, but this was all still mainly within a 2K (1920×1080) realm. We had not yet broached the 4K area.

A couple of years later, we decided to do this again. It was called the Image Control Assessment Series, or ICAS, and it was shot at Warner Bros. We shot scenes in a café: a daylight interior and then a nighttime exterior. Both scenes had a dramatically large range of contrast and different colors in the image. It was the big milestone. The new ARRI Alexa was used, along with the Sony F65 and the then-latest versions of the Red cameras.

So we had 4K projection and 4K cameras, and we introduced the use of ACES (Academy Color Encoding System) color management. We were really at the point where all the key components we needed were beginning to come together. This was the first instance where these digital workflow components were all used in a single significant project test, with film as our common benchmark reference: how are these cameras in relation to film? That was the key thing. In other words, could we consider them ready for prime time? The answer was yes. We did that project in conjunction with the PGA and a company called Revelations Entertainment, which is Morgan Freeman’s company. Lori McCreary, his partner, was one of the producers who worked with us on this.

So filmmakers started using digital motion picture cameras instead of film. And with digital cinema having replaced film print as a distribution medium, these new-generation digital cameras started to replace film as an image capture medium. The question then was whether we would have an end-to-end digital system that could become viable as an alternative to shooting on film.

L to R: Josh Pines, Steve MacMillan, Curtis Clark and Dhanendra Patel.

Part of the reason you are getting this acknowledgement from the Academy is your dedication to the highest image quality and your respect for the artistry, from capture through delivery. Can you talk about your role in look management from on-set through delivery?
I think we all need to be on the same page; it’s one production team whose objective is maintaining the original creative intent of the filmmakers. That includes the director and cinematographer, and working with an editor and a production designer. Making a film is a collective team effort, but the overall vision is typically established by the director in collaboration with the cinematographer and production designer. The cinematographer is tasked with capturing that with lighting, camera composition, movement, lens choices: all the elements that are part of the process of creative filmmaking. Once you start shooting with these extremely sophisticated cameras, like the Sony F65 or Venice, the Panavision Millennium DXL, an ARRI or the latest versions of the Red camera, all of which can reproduce high dynamic range, wide color gamut and high resolution, all that raw image data is inherently there and the creative canvas has certainly been expanded.

So if you’re using these creative tools to tell your story, to advance your narrative, then you’re doing it with imagery defined by the potential of what these technologies are able to do. In the modern era, people aren’t seeing dailies at the same time, and they’re not seeing them together under controlled circumstances. The viewing process has become fragmented. When everyone had to come together to view projected dailies, there was a certain camaraderie and a constructive give-and-take that made the filmmaking process more effective. If something wasn’t what it should be, everyone could see exactly what it was and make a correction if needed.

But now we have a more dispersed production team at every stage of the production process, from the initial image capture through to dailies, editorial, visual effects and final color grading. We have so many different people in disparate locations working on the production, and they don’t always seem to be as unified as we were when it was all film-based analog shooting. But it’s now far easier and simpler to integrate visual effects into your workflow. As Cineon indicated when it first emerged, you could do digital effects as opposed to optical effects, and that was a big deal.

So coming back to the current situation, particularly now with the most advanced forms of imaging, which include high dynamic range and color gamuts wider than even P3, such as Rec 2020, you need a color management system like ACES that actually has enough color gamut to contain any color space you capture and want to manipulate.

Can you talk about the challenges you overcame, and how that fits into the history of cinema as it relates to the Academy recognition you received?
As a cinematographer working on feature films or commercials, I kept thinking: if I’m fortunate enough to manage the dailies and certainly the final color grading, there are these tools called lift, gain and gamma, which are common to all the different color correctors. But they’re all implemented differently. They’re not cross-platform-compatible, so the numbers from a lift-gain-gamma grade (the primary RGB grading) on one color corrector will not translate automatically to another color corrector. So I thought we should have a cross-platform version of that, because it is usually seen as the first step in grading.

That’s about as basic as you can get, and it was designed to be a cross-platform implementation, so that for everybody who installs and applies the ASC CDL in a compatible color grading system, whether on a DaVinci, Baselight, Lustre or whatever you were using, the results would be the same and transferable.

You could transport those numbers from one setup to another. On set, using a dailies creation tool like ColorFront, for example, you could use the ASC CDL to establish your dailies look during the shoot (not while you’re actually shooting, but with the DIT), setting a chosen look that could then be applied to dailies and used for VFX.

Then when you make your way into the final color grading session with the final cut — or whenever you start doing master color grading going back to the original camera source — you would have those initial grading corrections as a starting point and reference. This gives you the possibility of continuing that color grading process using all the sophistication of a full color corrector, whether it’s power windows or secondary color correction. Whatever you felt you needed to finalize the look.
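
The math the ASC CDL standardizes is deliberately compact: per channel, a slope, offset and power, followed by a single saturation control computed with Rec. 709 luma weights. Here is a minimal NumPy sketch of that published transform (the sample values are arbitrary, not from any real grade):

```python
import numpy as np

# Rec. 709 luma weights, as used by the ASC CDL saturation step
LUMA = np.array([0.2126, 0.7152, 0.0722])

def apply_asc_cdl(rgb, slope, offset, power, sat=1.0):
    """Apply the ASC CDL: per-channel slope/offset/power (SOP),
    then one overall saturation control. rgb is float, shape (..., 3)."""
    rgb = np.asarray(rgb, dtype=np.float64)
    sop = np.clip(rgb * slope + offset, 0.0, None) ** power
    luma = (sop * LUMA).sum(axis=-1, keepdims=True)
    return luma + (sop - luma) * sat

# An identity CDL (slope 1, offset 0, power 1, sat 1) leaves pixels alone
print(apply_asc_cdl([0.18, 0.18, 0.18],
                    slope=[1.0, 1.0, 1.0],
                    offset=[0.0, 0.0, 0.0],
                    power=[1.0, 1.0, 1.0]))
```

Because those ten numbers fully describe the correction, they can ride along from the DIT’s cart to the VFX house to the finishing suite, which is precisely the portability Clark describes.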

I was advocating this in the ASC Technology Committee, as it was called then; it has since been renamed the Motion Imaging Technology Council (MITC). We needed a solution like this, and a group of us got together and decided we would do it. There were plenty of people who were skeptical: “Why would you do something like that when we already have lift, gain and gamma? Why would any of the manufacturers of the different color grading systems integrate this into their system? Would it not impinge upon their competitive advantage? If they had a system that people were used to using, and their own lift gain gamma worked perfectly well for them, why would they want to use the ASC CDL?”

We live in a much more fragmented post world, and I saw that becoming even more so with the advances of digital. The ASC CDL would be a “look unifier” that would establish initial look parameters. You would be able to have control over the look at every step of the way.

I’m assuming the cinematographer would work with the director and editor, and they would assess certain changes that probably should be made because we’re now looking at cut sequences. What we had thought would be most appropriate when we were shooting is now in the context of an edit, and there may need to be some changes and adjustments.

Were you involved in ACES? Was it a similar impetus for ACES coming about? Or was it just spawned because visual effects movies became so big and important with the advent of digital filmmaking?
It was a bit of both, including productions without VFX. I would say that initially it was driven by the fact that there really should be a standardized color management system. Let me give you an example of what I’m talking about. When we were all photochemical and basically shooting with Kodak stock, we were working with film-based Kodak color science.

It’s a color science that everybody knew and understood; even if they didn’t understand it from an engineering, photochemical point of view, they understood the effects of it. It’s what helped enable the look and the images we wanted to create.

That was a color management system that was built into film. That color science could have been adapted into the digital world, but Kodak resisted that because of the threat to negative sales. If you apply that film color science to digital cameras, then you’re making digital cameras look more like film, and that could pose a threat to the sale of color negative film.

So that’s really where the birth of ACES came about: to create a universal, unified color management system that would be appropriate anywhere you shot and have the widest possible color gamut. It supports any camera or display technology because it always has expanded (future-proofing) headroom within which digital camera and display technologies can work effectively and efficiently, but also accurately, reliably and predictably.

Very early on, my ASC Technology Committee (now called the Motion Imaging Technology Council) got involved with ACES development and became very excited about it. It was the missing ingredient needed to make the end-to-end digital workflow the success we thought it could become. Because we could no longer rely on film-based color science, we had to either replicate or emulate it with a color management system that could accommodate everything we wanted to do creatively. ACES became that color management system.

So, in addition to becoming the first cross-platform primary color grading tool, the ASC CDL became the first official ACES look modification transform. ACES is not a color grading tool, it’s a color management system, so you have to pair color grading tools with color management. You have the color management with ACES and the color grading with the ASC CDL, and the combination of the two is the look management system; it takes both to make it work. It’s not that the ASC CDL is the only tool you use for color grading, but it has the portable, cross-platform ability to control the color grading from dailies through visual effects up to the final grade, when you’re working with a sophisticated color corrector.
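
That pairing is visible in today’s color-managed tools. As a rough illustration, OpenColorIO, the open source color management library that ships ACES configs, models the CDL as a first-class transform. A sketch using the OCIO v2 Python bindings might look like this; the grade numbers are arbitrary, and a bare “raw” config stands in for a full ACES config:

```python
import PyOpenColorIO as OCIO  # OpenColorIO v2 Python bindings

# The ten CDL numbers, e.g. as agreed with the DIT at the dailies stage
cdl = OCIO.CDLTransform()
cdl.setSlope([1.05, 1.00, 0.95])
cdl.setOffset([0.00, 0.00, 0.01])
cdl.setPower([1.00, 1.00, 1.00])
cdl.setSat(0.9)

# Bake the transform into a processor and push pixels through it.
# A real ACES pipeline would load an ACES OCIO config and apply the
# CDL as a look between the input and output transforms.
config = OCIO.Config.CreateRaw()
cpu = config.getProcessor(cdl).getDefaultCPUProcessor()
print(cpu.applyRGB([0.18, 0.18, 0.18]))
```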

What do you see for the future of cinematography and the merging of the worlds of post and on-set work? And what do you see as the challenges of future integration between maintaining the creative intent and the metadata?
We’re very involved in metadata at the moment. Metadata is a crucial part of making all this work, as you well know. In fact, we worked with the Academy on the common 3D LUT format, which again would have cross-platform consistency and predictability. Its functionality and its scope of use would be better understood if everyone were using it. It’s a work in progress. Metadata is critical.
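
The payload of a common LUT format is simple: a lattice of output colors that each application should sample the same way, typically with trilinear interpolation. The NumPy sketch below shows just that core sampling math; it is not a parser for the Academy/ASC Common LUT Format itself.

```python
import numpy as np

def sample_3d_lut(rgb, lut):
    """Trilinearly interpolate a 3D LUT.
    rgb: floats in [0, 1], shape (..., 3)
    lut: lattice of output colors, shape (N, N, N, 3),
         indexed as lut[r_index, g_index, b_index]."""
    n = lut.shape[0]
    pos = np.clip(np.asarray(rgb, dtype=np.float64), 0.0, 1.0) * (n - 1)
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    f = pos - lo                      # fractional position in the cell

    out = np.zeros(pos.shape)
    for corner in range(8):           # blend the 8 surrounding points
        idx = [(hi if (corner >> axis) & 1 else lo)[..., axis]
               for axis in range(3)]
        w = np.ones(pos.shape[:-1])
        for axis in range(3):
            w = w * (f[..., axis] if (corner >> axis) & 1
                     else 1.0 - f[..., axis])
        out += w[..., None] * lut[idx[0], idx[1], idx[2]]
    return out

# A 17x17x17 identity LUT should return its input unchanged
grid = np.linspace(0.0, 1.0, 17)
identity = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
print(sample_3d_lut([0.2, 0.5, 0.8], identity))  # ~[0.2, 0.5, 0.8]
```

A common format matters because two applications that disagree on even this much (lattice order, input scaling, interpolation) will render the same file differently, which is exactly the cross-platform inconsistency Clark describes.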

I think as we expand the canvas and the palette of the possibility of image making, you have to understand what these technologies are capable of doing, so that you can incorporate them into your vision. So if you’re saying my creative vision includes doing certain things, then you would have to understand the potential of what they can do to support that vision. A very good example in the current climate is HDR.

That’s very controversial in a lot of ways, because the set manufacturers really would love to have everything just jump off the screen to make it vibrant and exciting. However, from a storytelling point of view, it may not be appropriate to push HDR imagery to where it distracts from the story.
Well, it depends on how it’s done and how you are able to use that extended dynamic range when you have your bright highlights. You can use foreground/background relationships with bigger depth of field to tremendous effect. They have a visceral presence, because they have a dimensionality when, for example, you see the bright images outside of a window.

When you have an extended dynamic range of scene tones that could add dimensional depth to the image, you can choreograph and stage the blocking for your narrative storytelling with the kind of images that take advantage of those possibilities.

So HDR needs to be thought of as something that’s integral to your storytelling, not just something that’s there because you can do it. That’s when it can become a distraction. When you’re on set, you need a reference monitor that is able to show and convey all the different tonal and color elements you’re working with to create your look, from HDR to wider color gamut, whatever that may be, so that you feel comfortable you’ve made the correct creative decision.

With virtual production techniques, you can incorporate some of that into your live-action shooting on set with that kind of compositing, just like James Cameron started with Avatar. If you want to do that with HDR, you can. The sky is the limit in terms of what you can do with today’s technology.

So these things are there, but you need to be able to pull them all together into your production workflow to make sure that you can comfortably integrate in the appropriate way at the appropriate time. And that it conforms to what the creative vision for the final result needs to be and then, remarkable things can happen. The aesthetic poetry of the image can visually drive the narrative and you can say things with these images without having to be expositional in your dialogue. You can make it more of an experientially immersive involvement with the story. I think that’s something that we’re headed toward, that’s going to make the narrative storytelling very interesting and much more dynamic.

Certainly, especially with the advancements in consumer technology: better panels, the high dynamic range developments, and Dolby Vision and Atmos audio coming into the home. It’s really an amazing time to be involved in the industry; it’s so fun and challenging.

It’s a very interesting time, and a learning curve needs to happen. That’s what’s driven me from the very beginning, and it’s why I think our ASC Motion Imaging Technology Council has been so successful in its 16 years of continuous operation, influencing the development of some of these technologies in very meaningful ways, but always with the intent that these new imaging technologies better serve the creative intent of the filmmaker. The technology serves the art. It’s not about the technology per se; it’s about the technology as the enabling component of the art. It enables the art to happen, and it expands its scope and possibility to broader canvases with wider color gamuts in ways that have never been experienced or possible before.


Barry Goch is a Finishing Artist at The Foundation and a Post Production Instructor at UCLA Extension. You can follow him on Twitter at @gochya.

Red intros LCD touch monitor for DSMC2 cameras

Red Digital Cinema has introduced the DSMC2 Touch 7-inch Ultra-Brite LCD monitor to its line of camera accessories. It offers an optically bonded touchscreen with Gorilla Glass that allows for what the company calls “intuitive ways to navigate menus, adjust camera parameters and review .R3D clips directly out of the camera.”

The monitor offers a brighter high-definition viewing experience for recording and viewing footage on DSMC2 camera systems, even in direct sunlight. The 1920×1200 display panel provides 2,200 nits of brightness to overcome viewing difficulties in bright outdoor environments, along with a high pixel density (323ppi) and a 1200:1 contrast ratio.

The Ultra-Brite display mounts to Red’s DSMC2 Brain or other 1/4-20 mounting surfaces, and provides a LEMO connection to the camera, making it an ideal monitoring option for gimbals, cranes, and cabled remote viewing. Shooters can use a DSMC2 LEMO Adaptor A in conjunction with the Ultra-Brite display for convenient mounting options away from the DSMC2 camera Brain.

Check out a demo of the new monitor, priced at $3,750, here.

Lucid and Eys3D partner on VR180 depth camera module

EYS3D Microelectronics Technology, the company behind embedded camera modules in some top-tier AR/VR headsets, has partnered with AI startup Lucid, which will power EYS3D’s next-generation depth-sensing camera module, Axis. This means a single, small, handheld device can capture accurate 3D depth maps with up to a 180-degree field of view at high resolution, allowing content creators to scan, reconstruct and output precise 3D point clouds.

This new camera module, which was demoed for the first time at CES, will give developers, animators and game designers a way to transform the physical world into a virtual one, ramping up content for 3D, VR and AR, all with superior performance in resolution and field of view at a lower cost than some technologies currently available.

A device that captures the environment exactly as you perceive it, but enhanced with precise depth, distance and understanding, could help eliminate the boundaries between what you see in the real world and what you can create in the VR and AR world. This is what the Lucid-powered EYS3D Axis camera module aims to bring to content creators, as they gain the “super power” of transforming anything in their vision into a 3D object or scene that others can experience, interact with and walk in.

What was previously possible only with eight to 16 high-end DSLR cameras and expensive software or depth sensors is now combined into one tiny camera module with stereo lenses paired with IR sensors. Axis will cover up to a 180-degree field of view while providing millimeter-accurate 3D in point cloud or depth map format. The device provides a simple plug-and-play experience through USB 3.1 Gen1/2 and supported Windows and Linux software suites, allowing users to develop their own depth applications, such as 3D-reconstructing an entire scene, scanning faces into 3D models or just determining how far away an object is.
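
The vendor SDKs aren’t shown here, but the last step described, turning a depth map into a point cloud, is standard projective geometry. Below is a minimal NumPy sketch using an idealized pinhole model; the intrinsics are made-up calibration values, and a real 180-degree module would need a fisheye model instead of this rectilinear one.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth map (meters, shape H x W) into an
    (H*W, 3) point cloud with the pinhole camera model, where
    (fx, fy) are focal lengths in pixels and (cx, cy) is the
    principal point from calibration."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# Toy example: a flat wall 2 m away seen by a 640x480 camera
cloud = depth_to_point_cloud(np.full((480, 640), 2.0),
                             fx=525.0, fy=525.0, cx=319.5, cy=239.5)
```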

Lucid’s AI-enhanced 3D/depth solution, known as 3D Fusion Technology, is currently deployed in many devices, such as 3D cameras, robots and mobile phones, including the Red Hydrogen One, which just launched through AT&T and Verizon nationwide.

EYS3D’s new depth camera module powered by Lucid will be available in Q3 2019.

Review: iOgrapher Multi Case for mobile filmmaking

By Brady Betzel

Thanks to the amazing iPhone X, Google Pixel and Samsung Galaxy, almost everyone has a high-end video camera on their person at all times, and this is helping to spur on mobile filmmaking and vlogging.

From YouTube to Instagram to movies like Unsane (Steven Soderbergh) or Tangerine (Sean Baker) — and regardless of whether you think a $35,000 camera setup tells a story better than a $1,000 cell phone (looking at you, Apple iPhone XS Max) — mobile filmmaking is here to stay and will only get better.

iOgrapher’s latest release is the iOgrapher Multi Case, a compact mobile filmmaking mounting solution that works with today’s most popular phones. iOgrapher has typically created solutions that were tied to the mobile device being used for filmmaking, such as an iPhone, the latest Samsung Galaxy phones, iPads or even action cameras like a GoPro Hero 7 Black.

With the new iOgrapher Multi Case you can fit any mobile device that measures more than 5 ½” x 2 ¼” and less than 6 ½” x 3 ⅜”. Unfortunately, you won’t be fitting an iPad or a GoPro in the iOgrapher Multi Case, but don’t fret! iOgrapher makes rigs for those as well. On the top of the Multi Case are two cold shoe mounts for lights, microphones or any other device, like a GoPro. To mount things with ¼”-20 screw mounts in the cold shoes, you will need a cold shoe to ¼”-20 adapter, which is available on iOgrapher’s accessory page. You can also find these at Monoprice or Amazon for real cheap.

And if you are looking for more mounting options, you may want to order some extra cold shoe adapters, which can be mounted on the handles of the iOgrapher Multi Case in the additional ¼”-20 screw mounts. The mounts on the handles are great for adding additional lighting or microphones. I’ve even found that if you are going to be doing some behind-the-scenes filming, or need another angle for your shooting, a small camera like a GoPro can be easily mounted and angled. With all this mounting, you should assume that you are going to be using the iOgrapher on a sturdy tripod. Just for fun, I mounted the iOgrapher Multi Case onto a GoPro 3-Way Grip, which can also be used as a light tripod. It wasn’t exactly stable, but it worked. I wouldn’t suggest using it for more than an emergency shooting situation though.

On the flip side (all pun intended), the iOgrapher can be solidly mounted vertically using the ¼”-20 screw mounts on the handles. With Instagram making headway with vertical video in its Instagram Stories, iOgrapher took that idea and built it into the Multi Case, further cementing the grumbling of the old folks who just don’t get vertical video.

Testing
I tried out both a Samsung Galaxy S8+ and an iPhone 7+, with their cases on, inside the iOgrapher Multi Case. Both fit. The iPhone 7+ was stretching the boundaries of the Multi Case, but it did fit and worked well. Phones are inserted into the Multi Case via a spring-loaded bottom piece: you push the bottom of the mobile device into the covered corner slots until the top (or the left side, if you are shooting vertically) can be secured under the opposite edge of the case. It’s really easy.

I was initially concerned with the spring loading of the case; I wasn’t sure if the springs would be resilient enough to handle the constant pulling in and out of the phones, but the springs are high quality and held up beautifully. I even tried inserting my mobile phones tons of times and didn’t notice any issues with the springs or my phones.

Take care when inserting your phone into the Multi Case if you have a protective shield on the screen of your device. If you aren’t extra careful it can pull or snag on the cover — especially with the tight fit of a case. Just pay attention and there will be nothing to worry about. The simple beauty of the iOgrapher is that with a wider grip of your filmmaking device, you have a larger area to distribute any shaking coming from your hands, essentially helping stabilize your filmmaking without the need for a full-fledged gimbal.

If you accidentally drop your iOgrapher you may get a scratch, but for the most part they are built sturdy and can withstand punishment, whether it’s from your four-year-old or from the weather. If you want to get a little fancy, you can buy affordable lights like the Litra Torch (check out my review) to attach to the cold shoe mounts, or even a Rode microphone (don’t forget the TRS-to-TRRS adapter if you are plugging into an iPhone), and you are off and running.

Summing Up
I have been really intrigued with iOgrapher’s products since day one. They are an affordable and sturdy way to jump into filmmaking using cameras everyone carries with them every day: their phones.

Whether you are a high school student looking to get steady and professional mobile video, or a journalist looking for a quick way to make the most of your shots with just a phone, light, mic and tripod mount, the iOgrapher Multi Case will unlock your mobile filmmaking potential.

The iOgrapher Multi Case is a very durable protective case for your mobile filmmaking devices, and a steal at $79.99. If you are a parent looking for an inexpensive way to tease out your child’s interest in video, take a look at www.iographer.com and grab a few accessories, like a Manfrotto light and a Rode VideoMicro, to add some subtle lighting and pick up the best quality audio.

Make sure to check out iOgrapher creator Dave Basulto’s demo of the iOgrapher Multi Case, including trying out the fit of different phones.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Catching up with Aquaman director James Wan

By Iain Blair

Director James Wan has become one of the biggest names in Hollywood thanks to the $1.5 billion-grossing Fast & Furious 7, as well as the Saw, Conjuring and Insidious films — three of the most successful horror franchises of the last decade.

Now the Malaysian-born, Australian-raised Wan, who also writes and produces, has taken on the challenge of bringing Aquaman and Atlantis to life. The origin story of half-surface dweller, half-Atlantean Arthur Curry stars Jason Momoa in the title role. Amber Heard plays Mera, a fierce warrior and Aquaman’s ally throughout his journey.

James Wan and Iain Blair

Additional cast includes Willem Dafoe as Vulko, counsel to the Atlantean throne; Patrick Wilson as Orm, the present King of Atlantis; Dolph Lundgren as Nereus, King of the Atlantean tribe Xebel; Yahya Abdul-Mateen II as the revenge-seeking Manta; and Nicole Kidman as Arthur’s mom, Atlanna.

Wan’s team behind the scenes included such collaborators as Oscar-nominated director of photography Don Burgess (Forrest Gump), his five-time editor Kirk Morri (The Conjuring), production designer Bill Brzeski (Iron Man 3), visual effects supervisor Kelvin McIlwain (Furious 7) and composer Rupert Gregson-Williams (Wonder Woman).

I spoke with the director about making the film, dealing with all the effects, and his workflow.

Aquaman is definitely not your usual superhero. What was the appeal of doing it? 
I didn’t grow up with Aquaman, but I grew up with other comic books, and I always was well aware of him as he’s iconic. A big part of the appeal for me was he’d never really been done before — not on the big screen and not really on TV. He’s never had the spotlight before. The other big clincher was this gave me the opportunity to do a world-creation film, to build a unique world we’ve never seen before. I loved the idea of creating this big fantasy world underwater.

What sort of film did you set out to make?
Something that was really faithful and respectful to the source material, as I loved the world of the comic book once I dove in. I realized how amazing this world is and how interesting Aquaman is. He’s bi-racial, half-Atlantean, half-human, and he feels he doesn’t really fit in anywhere at the start of the film. But by the end, he realizes he’s the best of both worlds and he embraces that. I loved that. I also loved the fact it takes place in the ocean so I could bring in issues like the environment and how we treat the sea, so I felt it had a lot of very cool things going for it — quite apart from all the great visuals I could picture.

Obviously, you never got the Jim Cameron post-Titanic memo — never, ever shoot in water.
(Laughs) I know, but to do this we unfortunately had to get really wet, as over two-thirds of the film is set underwater. The crazy irony of all this is that when people are underwater they don’t look wet. It’s only when you come out of the sea or pool that you’re glossy and dripping.

We did a lot of R&D early on, and decided that shooting underwater looking wet wasn’t the right look anyway, plus they’re superhuman and are able to move in water really fast, like fish, so we adopted the dry-for-wet technique. We used a lot of special rigs for the actors, along with bluescreen, and then combined all that with a ton of VFX for the hair and costumes. Hair is always a big problem underwater, as like clothing it behaves very differently, so we had to do a huge amount of work in post in those areas.

How early on did you start integrating post and all the VFX?
It’s that kind of movie where you have to start post and all the VFX almost before you start production. We did so much prep, just designing all the worlds and figuring out how they’d look, and how the actors would interact with them. We hired an army of very talented concept artists, and I worked very closely with my production designer Bill Brzeski, my DP Don Burgess and my visual effects supervisor Kelvin McIlwain. We went to work on creating the whole look and trying to figure out what we could shoot practically with the actors and stunt guys and what had to be done with VFX. And the VFX were crucial in dealing with the actors, too. If a body didn’t quite look right, they’d just replace them completely, and the only thing we’d keep was the face.

It almost sounds like making an animated film.
You’re right, as over 90% of it was VFX. I joke about it being an animated movie, but it’s not really a joke. It’s no different from, say, a Pixar movie.

Did you do a lot of previs?
A lot, with people like Third Floor, Day For Nite, Halon, Proof and others. We did a lot of storyboards too, as they are quicker if you want to change a camera angle, or whatever, on the fly. Then I’d hand them off to the previs guys and they’d build on those.

What were the main technical challenges in pulling it all together on the shoot?
We shot most of it Down Under, near Brisbane. We used all nine of Village Roadshow Studios’ soundstages, including the new Stage 9, as we had over 50 sets, including the Atlantis Throne Room and Coliseum. The hardest thing in terms of shooting it was just putting all the actors in the rigs for the dry-for-wet sequences; they’re very cumbersome and awkward, and the actors are also in these really outrageous costumes, and it can be quite painful at times for them. So you can’t have them up there too long. That was hard. Then we used a lot of newish technology, like virtual production, for scenes where the actors are, say, riding creatures underwater.

We’d have it hooked up to the cameras so you could frame a shot and actually see the whole environment and the creature the actor is supposed to be on — even though it’s just the actors and bluescreen and the creature is not there. And I could show the actors — look, you’re actually riding a giant shark — and also tell the camera operator to pan left or right. So it was invaluable in letting me adjust performance and camera setups as we shot, and all the actors got an idea of what they were doing and how the VFX would be added later in post. Designing the film was so much fun, but executing it was a pain.

The film was edited by Kirk Morri, who cut Furious 7, and worked with you on the Insidious and The Conjuring films. How did that work?
He wasn’t on set but he’d visit now and again, especially when we were shooting something crazy and it would be cool to actually see it. Then we’d send dailies and he’d start assembling, as we had so much bluescreen and VFX stuff to deal with. I’d hop in for an hour or so at the end of each day’s shoot to go over things as I’m very hands on — so much so that I can drive editors crazy, but Kirk puts up with all that.

I like to get a pretty solid cut from the start. I don’t do rough assemblies. I like to jump straight into the real cut, and that was so important on this because every shot is a VFX shot. So the sooner you can lock the shot, the better, and then the VFX teams can start their work. If you keep changing the cut, then you’ll never get your VFX shots done in time. So we’d put the scene together, then pass it to previs, so you don’t just have actors floating in a bluescreen, but they’re in Atlantis or wherever.

Where did you do the post?
We did most of it back in LA on the Warner lot.

Do you like the post process?
I absolutely love it, and it’s very important to my filmmaking style. For a start, I can never give up editing and tweaking all the VFX shots. They have to pull it away from me, and I’d say that my love of all the elements of the post process — editing, sound design, VFX, music — comes from my career in suspense movies. Getting all the pieces of post right is so crucial to the end result and success of any film. This post was creatively so much fun, but it was long and hard and exhausting.

James Wan

All the VFX must have been a huge challenge.
(Laughs) Yes, as there are over 2,500 VFX shots and we had everyone working on it — ILM, Scanline, Base, Method, MPC, Weta, Rodeo, Digital Domain, Luma — anyone who had a computer! Every shot had some VFX, even the bar scene where Arthur’s with his dad. That was a set, but the environment outside the window was all VFX.

What was the hardest VFX sequence to do?
The answer is, the whole movie. The trench sequence was hard, but Scanline did a great job. Anything underwater was tough, and then the big final battle was super-difficult, and ILM did all that.

Did the film turn out the way you hoped?
For the most part, but like most directors, I’m never fully satisfied.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Inside the mind and workflow of a 14-year-old filmmaker

By Brady Betzel

From editing to directing, I have always loved how mentoring and teaching is a tradition that lives on in this industry. When I was an assistant editor, my hope was that the editors would let me watch them work, or give me a chance to edit. And a lot of the time I got that opportunity.

Years ago I worked with an editor named Robb McPeters, who edited The Real Housewives of New York City. I helped cut a few scenes, and Robb was kind enough to give me constructive feedback. This was the first time I edited a scene that ran on TV. I was very excited, and very appreciative of his feedback. Taking the time to guide younger assistant editors who have their eye on advancement makes you feel good — something I’ve learned firsthand.

As I’ve become a “professional” editor I have been lucky enough to mentor assistant editors, machine room operators, production assistants and anyone else that was interested in learning post. I have found mentoring to be very satisfying, but also integral to the way post functions. Passing on our knowledge helps the community move forward.

Even though I only had a couple of little scenes to cut for Robb, the direction I received helped make me the kind of editor I am today. Throughout the years I was lucky enough to encounter more editors like Robb, and I took all of the advice I could.

Last year, I heard that Robb’s son, Griffin, had made his first film at 13 years old, Calling The Shots. Then a few months ago I read an article about Griffin making a second film, at 14 years old, The Adventure of T.P. Man and Flusher. Griffin turns 15 in February and hopes to make a film a year until he turns 18.

It makes sense that someone who has been such a good mentor has produced a son with such a passion for filmmaking. I can see the connection between fatherhood and mentorship, especially between an editor and an assistant. And seeing Robb foster his son’s love for filmmaking, I realized I wanted to be able to do that with my sons. That’s when I decided to reach out to find out more.

CAN YOU TALK ABOUT YOUR MOST RECENT FILM?
The Adventure of T.P. Man and Flusher is really a story of adventure, friendship and finding love. After learning that his best friend Jim (Sam Grossinger) has attempted suicide, Tom (Adam Simpson) enlists the help of the neighborhood kingpin, Granddaddy (Blake Borders). Their plan is to sneak Jim out of the hospital for one last adventure before his disconnected parents move him off to Memphis. On the way they encounter a washed-up ’90s boy-band star and try to win the hearts of their dream girls.

Tom realizes that this adventure will not fix his friend, but their last night together does evolve into the most defining experience of their lives.

HOW DID YOU COME UP WITH THE IDEA FOR THIS FILM?
The Adventure of T.P. Man and Flusher is a feature film that I wrote while in 8th grade. I saved every penny I could earn and then begged my parents to let me use money from my college savings. They knew how important this film was to me, so they agreed. This is my second feature, and I wanted to do everything better, from the script to casting. I was able to cast professional actors and some of my schoolmates.

I shot in 4K UHD using my Sony A7RIII. I then brought the footage into the iMac and transcoded it into CineForm 720p files, which allowed me to edit natively on the family iMac in Adobe Premiere. We have a cabin in Humboldt County, which is where I assemble my rough cuts.
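
That proxy step is easy to script. As a rough sketch (not necessarily Griffin’s actual workflow), a small Python wrapper around FFmpeg could produce 720p CineForm editing files like this, assuming an FFmpeg build with the CineForm (cfhd) encoder enabled; the file names are hypothetical:

```python
import subprocess

# Transcode a 4K UHD camera original into a 720p CineForm proxy.
subprocess.run([
    "ffmpeg",
    "-i", "A7R3_clip_4k.mp4",   # hypothetical camera original
    "-vf", "scale=-2:720",      # downscale to 720p, preserve aspect
    "-c:v", "cfhd",             # GoPro CineForm HD video
    "-c:a", "pcm_s16le",        # uncompressed audio for editing
    "proxy_720p.mov",
], check=True)
```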

I spent hours and hours this summer in my grandfather’s workshop editing the footage. Day after day, my mom and sister would go swimming at the river, pick berries and do all the lazy summer day stuff, and I would walk down to the shop to cut so that I could finish a version of my scene.

Once I finished my director’s cut, I would show the assembly to my parents, and they would start giving me ideas on what was working and what wasn’t. I am currently polishing the movie, adding visual effects (in After Effects) and sound design, and doing a color grade in Adobe SpeedGrade. I’ll also do the final 5.1 surround sound mix in Adobe Audition to deliver for distribution.

WHERE DID YOU GET THE IDEA FOR THE FILM?
In 8th grade, a classmate attempted suicide and it affected me very deeply. I wondered if other kids were having this type of depression. After doing some research, I realized that many kids suffer from deep depression. In fact, in 2016, adolescents and young adults aged 15 to 24 had a suicide rate of 13.15 per 100,000. That amazed and saddened me. I felt that I had to do something about it. I took my ideas and headed to our cabin in the woods to write the script over my winter break.

I was so obsessed with this story that I wrote a 120-page script.

CAN YOU TALK ABOUT PRODUCING?
It was a lot of scheduling, scheduling and scheduling. Locking locations, permits, insurance, and did I mention scheduling?

I think there was some begging in there too. “Please let us use. Please can we…” My school, SCVi, was extremely helpful with getting me insurance. It was heartwarming to see how many people wanted to help. We even got support from companies, including Wooden Nickel, which donated an entire lighting package.

WHAT ABOUT AS A DIRECTOR?
As the director I really wanted to push the fantastical, and sometimes dark and lonely, world these characters were living in. Of course, because I wrote the script I already had an idea of what I wanted to capture in each scene, but I put it to paper with shot lists and overhead camera placements. That way I had a visual reference to show how I wanted to film, from day one to the end.

Rehearsals with the actors were key with such a tight shooting schedule. Right from the start the cast responded to me as their director, which surprised me because I had just turned 14. Every question came to me for approval, so everything represented my vision.

My dad was on set as my cinematographer, supporting me every step of the way. We have a great way of communicating. Most of the time we were on the same page, but if we were not, he deferred to me. I took my hits when I was wrong and then learned from them.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT MAKING THIS FILM?
This was a true, small-budget, independent film that I made at 14 years old. Our production office was my mom and dad and myself. Three people usually don’t make films. Even though I am young, my parents trusted the weight of the film to me. It is my film. This means I did a little of everything all of the time, from pulling costumes to stocking the make-up kit to building my own 4K editing system.

We had no grips, no electric, no PAs. If we needed water or craft service, it was me, my dad and my mom. If a scene needed to be lit, my dad and I lit everything ourselves, and we were the last ones loading costumes, extension cords and equipment. Post was all the same ordeal.

WHAT WAS YOUR FAVORITE PART?
I really love everything about filmmaking. I love crafting a story, having to plan and think of how to capture a scene: how to show something that isn’t necessarily in front of your eyes. I love talking out my ideas. My mom teases me that I even make movies in my sleep, because she saw me in the hall going to the bathroom the other night and I mumbled, “Slow pan on Griffin going to bathroom.”

But post is really where the movie comes together. I like seeing what works for a scene. Which reaction is better? What music or sound effects help tell the story? Music design is also very personal to me. I listen to songs for hours to find the perfect one for a scene.

WHAT’S YOUR LEAST FAVORITE?
Having to cut some really great scenes that I know an actor is looking forward to seeing in that first screening. It is a really hard decision to remove good work. I even cut my grandmother from my first film. Now that’s hard!

WHAT CAMERAS AND PRODUCTION EQUIPMENT DO YOU USE?
For recording I use the Sony A7RIII with various lenses, recording to a Ninja Flame at 10-bit 4K. For sound I use a Rode NTG2 on a boom and three lav mics. For lighting we used a few Aputure LED lights and a Mole-Richardson 2K Baby Junior.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
I am not much of a night person. I get really tired around 9:30pm. In fact, I still have a bedtime of 10:00pm. I would say my best work is done in the time I have after school until my bedtime. I edit every chance I get. I do have to break for dinner, and I might watch one half of an episode of The Office. Other than that I am in the bay from 3:30 to 10:00pm every day.

CAN YOU THINK OF ANOTHER JOB YOU MIGHT WANT SOMEDAY?
No, not really. I enjoy taking people on emotional rides, creating a presentation that evokes personal feelings and using visuals to take my audience somewhere else. With all that said, if I couldn’t do this I would probably build professional haunted houses. Is that a real job?

IT’S STILL VERY EARLY, BUT HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
My parents have this video of me reaching for the camera on the way to my first day of pre-school saying, “I want the camera, I want to shoot.”

When I was younger, silent films mesmerized me. I grew up wanting to be Buster Keaton. The defining moment was seeing Jaws. I watched it at five and then realized what being a filmmaker was: making a mosaic of images (as Hitchcock said of editing). I began trying to create. At 11 and 12 I made shorts; at 13 I made my first full-length feature film. The stress and hard work did not even faze me; I was excited by it.

CAN YOU TALK ABOUT YOUR FIRST FILM?
Calling the Shots, which is now available on Amazon Prime, was an experiment to see if I could make a full-length film. A test flight, if you will. With T.P. Man I really got to step behind the camera and experience an entirely different side of directing that I didn’t get with my first film, since I was the lead actor in that one.

I also love the fact that all the music, sound design and graphics were done with my own hands, alone most of the time, in my editing suite. My dad designed it for me. I have two editing systems that I bounce back and forth between. I can set the lighting in the room, watch on a big 4K monitor and mix in 5.1 surround. Some kids have tree forts. I have my editing bay.

FINALLY, DO YOU GET STRESSED OUT FROM THE PROCESS?
I don’t allow myself to stress out about any of these things. The way I look at it is that I have a very fun and hard job. I try to keep things in perspective — there are no lives in danger here. I do my best work when I am relaxed. But if there is a tough moment, I walk away, take a bike ride or watch a movie. Watching others work inspires me to make my movies better.

Most importantly, I brainstorm about my next project. This helps me keep a perspective that this project will soon be over and I should enjoy it while I can and make it the best I possibly can.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

DP Chat: No Activity cinematographer Judd Overton

By Randi Altman

Judd Overton, who grew up in the Australian Outback, knew he wanted to be a DP before he even knew exactly what that was, spending a lot of his time watching and re-watching movies on VHS tapes. When he was young, a documentary film crew came to his town. “I watched as the camera operator was hanging off the side of my motorbike filming as we charged over sand dunes. I thought that was a pretty cool job!”

No Activity

The rest, as they say, is history. Overton’s recent work includes the Netflix comedy series The Letdown and No Activity, which is a remake of the Australian comedy series of the same name. It stars Patrick Brammall and Tim Meadows and is produced by CBS Television Studios in association with Funny or Die, Jungle and Gary Sanchez Productions. It streams on CBS All Access.

We recently reached out to Overton, who also just completed the documentary Lessons from Joan, about one of the first female British theater directors, Joan Littlewood.

What inspires you artistically? And how do you simultaneously stay on top of advancing technology that serves your vision?
What I love about what I do is being able to see things, and show the world to audiences in a way people haven’t seen before. I always keep abreast of technology, but for me the technology really needs to service the story. I choose particular equipment in order to capture the emotion of the piece.

What new technology has changed the way you work (looking back over the past few years)?
The greatest change in my world is the high-quality, high-ISO cameras now on the market. This has meant being able to shoot in a much less obtrusive way, shooting and lighting to create footage that is far closer to reality.

The use of great-quality LED lighting is something I’m really enjoying. The ability to create and capture any color and control it from your iPhone opens the floodgates for some really creative lighting.

Judd Overton

Can you describe your ideal collaboration with the director when setting the look of a project?
Every director is different, it’s a role and relationship I fill as required. Some directors like to operate the camera themselves. In that case, I oversee the lighting. Some directors just want to work with the actors, so my job then involves more responsibilities for coverage, camera movement and selecting locations.

I try to be open to each new experience and make creative guidelines for a project in collaboration with the director and producers, trying to preempt obstacles before they strike.

Tell us about the CBS All Access show No Activity. Can you describe the overall look of the show and what you and the director/producers wanted to achieve?
I shot the pilot for the original No Activity five years ago. Trent O’Donnell (writer/director, co-creator) wanted to make a series out of simple two-hander (two-actor) scenes.

We decided to use the police procedural drama genre because we knew the audience would fill in gaps with their own knowledge. In a show where very little happens, the mood and style become far more important.

How early did you get involved in the production?
I’ve been involved since the show was conceptualized. We shot the pilot in a parking lot in one of Sydney’s seedier areas. We fought off a lot of rats.

No Activity

How did you go about choosing the right camera and lenses for this project?
I had to shoot three cameras, as the show is heavily improvised. Beyond my main cameras with zoom lenses, I chose the best camera for each sequence. We used Blackmagic Ursa Pro and Micro cameras for a lot of our rigged positions. I also used Panasonic cameras for our available-light work, and even an ARRI 65 for some projection plates.

Were there any scenes that you are particularly proud of?
The scene I had the most fun with was the siege, which plays over the last two episodes of Season 2. We dusted off and fired up two 1930s carbon arc lights. Carbon arcs are what all the old Hollywood films used before HMIs; they are a true 5,600 Kelvin daylight source.

My gaffer’s father actually made these units, and they were refurbished for Quentin Tarantino’s film Once Upon a Time in Hollywood. We used them as searchlights for our nighttime siege, and the bright beams and plumes of smoke rising really gave the scene an epic scale.

What’s your go-to gear — things you can’t live without?
Communication is everything, and the latest toy in my toy box is HME headsets. They allow me to have constant communication with my camera operators, grips and electrics, essential when you're running five cameras across multiple units.

Director Barry Jenkins on latest, If Beale Street Could Talk

By Iain Blair

If they handed out Oscars for shots of curling cigarette smoke, Barry Jenkins’ follow-up to his Oscar-winning Moonlight would win hands down. If Beale Street Could Talk looks certain to be an awards show darling, already picking up three Golden Globe nods — Best Drama Motion Picture, Best Screenplay for Jenkins and Best Supporting Actress for Regina King.

Based on the 1974 novel by writer and civil rights activist James Baldwin, it tells the story of a young black couple — Clementine “Tish” Rivers (KiKi Layne) and Alonzo “Fonny” Hunt (Stephan James) — who grow up together in Harlem and get engaged. But their romantic dreams soon begin to dissolve under the harsh glare of white authority and racism when Fonny is falsely accused of rape and thrown in jail, just as Tish realizes she is pregnant with their child.

While the couple is the focus of the film, the family drama also features a large ensemble cast that includes King as Tish’s mother and Colman Domingo as her father, along with Michael Beach, Brian Tyree Henry, Diego Luna, Pedro Pascal and Dave Franco.

Behind the camera, Jenkins reteamed with Moonlight cinematographer James Laxton, editors Nat Sanders and Joi McMillion, and composer Nick Britell.

I spoke with Jenkins about making the film and workflow.

Our writer Iain Blair with Barry Jenkins

It’s always a challenge to adapt an acclaimed novel for the screen. How tough was this one?
It was extremely tough, especially since I love James Baldwin so much. Every step of the way you’re deciding at which point you have to be completely faithful to the material and then where it’s OK to break away from the text and make it your own for the movie version.

I first read the novel around 2010, and in 2013 I went to Europe to get away and write the screenplay. I also wrote one for Moonlight, which then ended up happening first. This was a harder project to get made. Moonlight was smaller and more controllable. And this is told from a female’s perspective, so there were a lot of challenges.

What sort of film did you set out to make?
I wanted to take the energy of the novel and its lush romantic sensuality, and then pair it with the more biting, bitter social commentary of Baldwin’s non-fiction work. I see film as a very malleable art form, and I felt I could build it. So at times it could be extremely lush and beautiful — even distractingly so — but then it could turn very dark and angry, and contain all of that.

The film was shot by your go-to cinematographer James Laxton. Talk about the look you wanted and how you got it.
There are a lot of cinema references in Moonlight, but we couldn’t find many for this period set in this sort of neighborhood. There are nods to great directors and stylists, like Douglas Sirk and Hou Hsiao-hsien, but we ended up paying more attention to stills. We studied the work of the great photographers Roy DeCarava and Gordon Parks. I wanted it to look lush and beautiful.

You shot on location, and it’s a period piece. How hard was that?
It was pretty challenging because I’m the kind of guy — and James is too — where we like to have the freedom to point the camera anywhere and just shoot. But when you’re making a period film in New York, which is changing so fast every damn day, you just don’t have that freedom. So it was very constricting, and our production designer Mark Friedberg had to be very inventive and diligent about all the design.

Where did you post?
We split it between New York and LA. We cut the whole film in LA at a little place in Silver Lake called Fancy Post and did all the sound mixing at Formosa. Then we moved to New York, since the composer lives there, and did the DI at Technicolor PostWorks New York with colorist Alex Bickel, who also did Moonlight. We spent a lot of time getting the look just right — all the soft colors. We chose to shoot on the Alexa 65, which is unusual for a small drama, but we loved the intimacy it gave us.

You reteamed with your go-to editors Nat Sanders, who’s cut all three of your films, and Joi McMillion, who cut Moonlight with Nat. Tell us how it worked this time.
Fancy Post is essentially a house, so they each had their own bedroom, and I’d come in each day and check on their progress. Both of them were at film school with me, and we all work really well together, and I love the editing process.

Can you talk about the importance of music and sound in the film?
Sound has always been so important to me, ever since film school. One of my professors there was Richard Portman, who really developed the overlapping, multi-track technique with Robert Altman. I'll always remember one of the first things he said to us about the importance of sound: a movie is 50 percent image and 50 percent sound, not 95 percent image and 5 percent sound. So that's how I approach it.

We had a fantastic sound team: supervising sound editor Onnalee Blank and re-recording mixer Matt Waters. They usually do these huge projects with dragons and so on, like Game of Thrones, but they also do small dramas like this. They came on very late, but did incredible, really detailed work with all the dialogue. And there’s a lot of dialogue and conversation, most of it in interiors, and then there’s the whole soundscape that they built up layer by layer, which takes us back in time to the 1970s. They mixed all the dialogue so it comes from the front of the room, but we also created what we called “the voice of God” for all of Tish’s voiceovers.


In this story she really functions as the voice of James Baldwin, and while the voiceovers are in her head, we surround the audience with them. That was the approach. Just as with Moonlight, I feel that a film’s soundscape is beholden to the mental states and consciousness of the main characters, and not necessarily to a genre or story form. So in this, composer Nick Britell and I both felt that the sound of the film is paced by how Tish and Fonny are feeling. That opened it up in so many ways. Initially, we thought we’d have a pure jazz score, since it suited the era and location, but as we watched the actors working it evolved into this jazz chamber orchestra kind of thing.

This is obviously not a VFX-driven piece, but the VFX must have played a role in the final look. What was involved?
Crafty Apes in LA and Phosphene and Significant Others in New York did it all. We had some period work, cleanup and some augmentation, but we didn't use any greenscreens on set. The big thing was that New York in the '70s was much grittier and dirtier, so all the graffiti on the subway cars was VFX. I hadn't really worked much with visual effects before, but I loved it.

There’s been so much talk in Hollywood about the lack of diversity — in front of and behind the camera. Do you see much improvement since we last spoke?
Well, look at all the diverse films out last year and now this year — Green Book, The Hate U Give, Black Panther, Widows, BlacKkKlansman — with black directors and casts. So there has been change, and I think Moonlight was part of a wave, increasing visibility around this issue. There’s more accountability now, and we’re in the middle of a cycle that is continuing. Change is a direction, not a destination.

Barry Jenkins on set.

We’re heading into awards season. How important are they for a film like this?
Super important. Look, Moonlight would not have had the commercial success it had if it hadn’t been for all the awards attention and talk.

Where do you keep your Oscar?
I used to keep it on the floor behind my couch, but I got so much shit about keeping it hidden that now it sits up high on a speaker. I’m very proud of it.

What’s next?
I’m getting into TV. I’m doing a limited series for Amazon called The Underground Railroad, and we’re in pre-production. I’ve got a movie thing on the horizon, but my focus is on this right now.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Review: GoPro Hero 7 Black action camera

By Brady Betzel

Every year GoPro offers a new iteration of its camera. One of the biggest past upgrades was from the Hero 4 to the Hero 5, which brought an updated body style, waterproofing without external housing and basic stabilization. That was one of the biggest… until now.

The Hero 7 Black is by far the best upgrade GoPro users have seen, especially if you are sitting on a Hero 5 or earlier. I’ll tell you up front that the built-in stabilization (called Hypersmooth) alone is worth the Hero 7 Black’s $399 price tag, but there are a ton of other features that have been upgraded and improved.

There are three versions of the Hero 7: Black for $399, Silver for $299 and White for $199. The White is the lowest-priced Hero 7 and includes features like 1080p video recording at 60fps, a built-in battery, waterproofing to 33 feet without extra housing, standard video stabilization, 2x slow-mo (1440p/1080p @ 60fps), video recording up to 40Mb/s (1440p), two-mic audio recording, 10MP photos and 15/1 burst photos. From that you can surmise that the Hero 7 White is as basic as it gets; GoPro even skipped 24fps video recording, ProTune and a front LCD display. But that doesn't mean the Hero 7 White is a throwaway. What I love about the latest update to the Hero line is the simplicity of operating the menus. In previous generations, the GoPro Hero menus were difficult to use and would often cause me to fumble shots. The Hero 7 menu has been streamlined for a much simpler mode-selection process, making the Hero 7 White a basic and relatively affordable waterproof GoPro.

The Hero 7 Silver can be purchased for $299 and has everything the Hero 7 White has, plus some extras, including 4K video recording at 30fps up to 60Mb/s, 10MP photos with wide dynamic range to bring out details in the highlights and shadows, and GPS location tagging to show you where your videos and photos were taken.

The Hero 7 Black
The Hero 7 Black is the big gun in the GoPro Hero 7 lineup. For anyone who wants to shoot multiple frame rates, harness a flat picture profile using ProTune for extended range when color correcting, record ultra-smooth video without an external gimbal or post processing, or shoot RAW photos, the Hero 7 Black is for you.

The Hero 7 Black has all of the features of the White and Silver plus a bunch more, including the front-facing LCD display. One of the biggest still-photo upgrades is the ability to shoot 12MP photos with SuperPhoto. SuperPhoto is essentially a “make my image look like the GoPro photos on Instagram” setting: an auto-image processor that will turn good photos into awesome photos. In practice it's an HDR mode that pulls extra latitude out of the shadows and highlights while also applying noise reduction.
Beyond SuperPhoto, the Hero 7 has burst rates from 3/1 up to 30/1; a timelapse photo function with intervals ranging from 0.5 seconds to 60 seconds; the ability to shoot RAW photos in GPR format alongside JPG; 4K video at 60fps, 30fps and 24fps in wide mode, as well as 30fps and 24fps in SuperView mode (essentially ultra-wide angle); 2.7K wide video up to 120fps and down to 24fps in linear view (no wide-angle warping); and resolutions all the way down to 720p wide at 240fps.

The Hero 7 records in both MP4 H.264/AVC and H.265/HEVC formats at up to 78Mb/s (4K). The Hero 7 Black has a bunch of additional modes, including Night Photo, Looping, Timelapse Photo, Timelapse Video, Night Lapse Photo, 8x Slow-Mo and Hypersmooth stabilization. It has Wake on Voice commands, as well as live streaming to Facebook Live, Twitch, Vimeo and YouTube. It also features TimeWarp video (which I'll talk more about later); a GP1 processor created by GoPro; advanced metadata that the GoPro app uses to create videos of just the good parts (like smiling photos); ProTune; Karma compatibility; dive-housing compatibility; three-mic stereo audio; RAW audio captured in WAV format; the ability to plug in an external mic with the optional 3.5mm mic adapter cable; and HDMI video output via a micro HDMI cable.
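
To put those bitrates in perspective, here is a bit of back-of-the-envelope Python. This is my own illustration, not a GoPro figure; real file sizes vary with codec efficiency and audio tracks.

    # Rough storage math for the Hero 7 bitrates (illustrative only;
    # actual file sizes vary with codec efficiency and audio tracks).
    def gb_per_minute(bitrate_mbps: float) -> float:
        """Convert a bitrate in megabits/sec to gigabytes per minute."""
        megabits_per_minute = bitrate_mbps * 60
        return megabits_per_minute / 8 / 1000  # bits -> bytes, MB -> decimal GB

    for label, mbps in [("Hero 7 White, 1440p", 40),
                        ("Hero 7 Silver, 4K30", 60),
                        ("Hero 7 Black, 4K60", 78)]:
        per_min = gb_per_minute(mbps)
        print(f"{label}: {per_min:.2f} GB/min, ~{64 / per_min:.0f} min on a 64GB card")

At the Black's top rate that works out to roughly 0.6GB per minute, or a little under two hours on a 64GB card.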

I really love the GoPro Hero 7 and consider it a must-buy if you are on the edge about upgrading an older GoPro camera.

Out of the Box
When I opened the GoPro Hero 7 Black I was immediately relieved that it has the same dimensions as the Hero 5 and 6, since I have access to the GoPro Karma drone, Karma gimbal and various accessories. (As a side note, the Hero 7 White and Silver are not compatible with the Karma drone or gimbal.) I quickly plugged in the Hero 7 Black to charge, which took only half an hour; when fully drained, the Hero 7 takes a little under two hours to charge.

I was excited to try the new built-in stabilization feature Hypersmooth, as well as the new stabilized in-camera timelapse creator, TimeWarp. I received the Hero 7 Black around Halloween so I took it to an event called “Nights of the Jack” at King Gillette Ranch in Calabasas, California, near Malibu. It took place after dark and featured lit-up jack-o-lanterns, so I figured I could test out the TimeWarp, Hypersmooth and low-light capabilities in one fell swoop.

The results were really incredible. I used a clamp mount to attach it to the kids' wagon and just hit record. When I stopped recording, the GoPro finished processing the TimeWarp video and I was ready to view it or share it. Overall, the quality of the video and the low-light recording were pretty good — not great, but good. You can check out the video on YouTube.

The stabilization was mind-blowing, especially considering it is electronic image stabilization (EIS), which is software-based, rather than optical stabilization, which is hardware-based. Hardware stabilization is typically preferred to software, but GoPro's EIS is incredible. For most shooting scenarios the built-in stabilization will be amazing — everyone who watches your clips will think you are using a hardware gimbal. It's that good.

The Hero 7 Black has a few options for TimeWarp mode to keep the video length down — you can choose from different speeds: 2x, 5x, 10x, 15x and 30x. For example, 2x will take one minute of footage and turn it into 30 seconds, and 30x will take five minutes of footage and turn it into 10 seconds. Think of TimeWarp as a stabilized timelapse. In terms of resolution, you can choose a 16:9 or 4:3 aspect ratio at 4K, 1440p or 1080p. I always default to 1080 when posting to Instagram or Twitter, since you can't really see the 4K difference there, and it saves my data bits and bytes for better image fidelity.
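
The math behind those speed settings is simple division. As a quick sketch of my own (not a GoPro tool), here is how clip length falls out of the multiplier:

    def timewarp_length(shoot_seconds: float, speed: int) -> float:
        """Playback length of a TimeWarp clip for a given speed multiplier
        (the camera offers 2, 5, 10, 15 and 30)."""
        return shoot_seconds / speed

    print(timewarp_length(60, 2))    # one minute at 2x   -> 30.0 seconds
    print(timewarp_length(300, 30))  # five minutes at 30x -> 10.0 seconds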

If you're wondering why you would use TimeWarp over Timelapse, there are a couple of differences. TimeWarp will create a smooth video when walking, riding a bike or generally moving around because of the Hypersmooth stabilization. Timelapse acts more like a camera taking pictures at a set interval to show a passage of time (say, from day to night) and will play back a little choppier. Check out a sample day-to-night timelapse I filmed using the Hero 7 Black set to Timelapse on YouTube.

So beyond TimeWarp, what else is different? Well, just plain shooting 4K at 60fps — you can now enable EIS where you couldn't on the GoPro Hero 6 Black. It's a giant benefit for anyone shooting 4K in the palm of their hand who wants to slow that 4K down by 50% and retain smooth motion with stabilization already done in-camera. This is a huge perk in my mind. The image processing is very close to what the Hero 6 produces and quite a bit better than what the Hero 5 produces.

When taking still images, the low-light ability is pretty incredible. With the new SuperPhoto setting you get that signature high saturation and contrast with noise reduction. It's a great setting, although I noticed the subject in focus can't be moving too fast or you will get some purple fringing. Used under the right circumstances, SuperPhoto is the next iteration of HDR.

I was surprised how much I used the GoPro Hero 7 Black's auto-rotating menu when the camera was held vertically. The Hero 6 could shoot vertically, but with the auto-rotating menu the Hero 7 Black encourages more vertical photos and videos. I found myself taking more vertical photos, especially outdoors — getting a lot more sky in the shots, which adds an interesting perspective.

Summing Up
In the end, the GoPro Hero 7 Black is a must-buy if you are looking for the latest and greatest action-cam or are on the fence about upgrading from the Hero 5 or 6. The Hypersmooth video stabilization is incredible. If you want to take it a step further, combining it with a Karma gimbal will give you a silky smooth shot.

I really fell in love with the TimeWarp function. Whether you are a prosumer filming your family at Disneyland or shooting a show in the forest, a quick TimeWarp is a great way to film some dynamic b-roll without any post processing.

Don't forget the Hero 7 Black has voice control for hands-free operation. On the outside, the Hero 7 Black is actually black in color, unlike the Hero 6 (which is gray), and has the number “7” labeled on it so it's easy to find in your case.

I would really love for GoPro to make these cameras charge wirelessly on a mat like my Galaxy phone. The GoPro action cameras would be great to just throw on a wireless charger, which could double as a file-transfer station. It gets cumbersome to remove a bunch of tiny memory cards or use a bunch of cables to connect your cameras, so why not make it wireless?! I'm sure they are thinking of things like that, but focusing on stabilization was the right move in my opinion.

If GoPro can continue to make focused and powerful updates to its cameras, it will be around for a long time — and the Hero 7 is the right way to start.

Check out GoPro’s website for more info, including accessories like the Travel Kit, which features a little mini tripod/handle (called “Shorty”), a rubberized cover with a lanyard and a case for $59.99.

If you need the ultimate protection for your GoPro Hero 7 Black, look into GoPro Plus, which, for $4.99 a month, gives you VIP support, automatic cloud backup, access for editing on your phone from anywhere and no-questions-asked replacement of up to two cameras per year of the same model when something goes wrong. Compare all the new GoPro Hero 7 models on GoPro's website.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Director Peter Farrelly gets serious with Green Book

By Iain Blair

Director, producer and writer Peter Farrelly is best known for the classic comedies he made with his brother Bob: Dumb and Dumber; There’s Something About Mary; Shallow Hal; Me, Myself & Irene; The Three Stooges; and Fever Pitch. But for all their over-the-top, raunchy and boundary-pushing comedy, those movies were always sweet-natured at heart.

Peter Farrelly

Now Farrelly has taken his gift for heartfelt comedy and put his stamp on a very different kind of film, Green Book, a racially charged feel-good drama inspired by a true friendship that transcended race, class and the 1962 Mason-Dixon line.

Starring Oscar-nominee Viggo Mortensen and Oscar-winner Mahershala Ali, it tells the fact-based story of the ultimate odd couple: Tony Lip, a bouncer from The Bronx, and Dr. Don Shirley, a world-class black pianist. Lip is hired to drive and protect the worldly and sophisticated Shirley during a 1962 concert tour from Manhattan to the Deep South, where they must rely on the titular “Green Book” — a travel guide to safe lodging, dining and business options for African-Americans during the segregation era.

Set against the backdrop of a country grappling with the valor and volatility of the civil rights movement, the two men are confronted with racism and danger as they challenge long-held assumptions, push past their seemingly insurmountable differences and embrace their shared humanity.

The film also features Linda Cardellini as Tony Vallelonga's wife, Dolores, along with Dimiter D. Marinov and Mike Hatton as two-thirds of The Don Shirley Trio. The film was co-written by Farrelly, Nick Vallelonga and Brian Currie, and it reunites Farrelly with editor Patrick J. Don Vito, with whom he worked on the Movie 43 segment “The Pitch.” Farrelly also collaborated for the first time with cinematographer Sean Porter (read our interview with him), production designer Tim Galvin and composer Kris Bowers.

I spoke with Farrelly about making the film, his workflow and the upcoming awards season. After its Toronto People’s Choice win and Golden Globe nominations (Best Director, Best Musical or Comedy Motion Picture, Best Screenplay, Best Actor for Mortensen, Best Supporting Actor for Ali), Green Book looks like a very strong Oscar contender.

You told me years ago that you’d love to do a more dramatic film at some point. Was this a big stretch for you?
Not so much, to be honest. People have said to me, “It must have been hard,” but the hardest film I ever made was The Three Stooges… for a bunch of reasons. True, this was a bit of a departure for me in terms of tone, and I definitely didn't want it to get too jokey — I tend to get jokey, so it could easily have gone like that. But right from the start we were very clear that the comedy would come naturally from the characters and how they interacted and spoke and moved, and so on, not from jokes.

So a lot of the comedy is quite nuanced, and in the scene where Tony starts talking about “the orphans” and Don explains that it’s actually about the opera Orpheus, Viggo has this great reaction and look that wasn’t in the script, and it’s much funnier than any joke we could have made there.

What sort of film did you set out to make?
A drama about race and race relations set in a time when it was very fraught, with light moments and a hopeful, uplifting ending.

It has some very timely themes. Was that part of the appeal?
Absolutely. I knew that it would resonate today, although I wish it didn’t. What really hooked me was their common ground. They really are this odd couple who couldn’t be more different — an uneducated, somewhat racist Italian bouncer, and this refined, highly educated, highly cultured doctor and classically trained pianist. They end up spending all this time together in a car on tour, and teach each other so much along the way. And at the end, you know they’ll be friends for life.

Obviously, casting the right lead actors was crucial. What did Viggo and Mahershala bring to the roles?
Well, for a start they’re two of the greatest actors in the world, and when we were shooting this I felt like an observer. Usually, I can see a lot of the actor in the role, but they both disappeared totally into these characters — but not in some method-y way where they were staying in character all the time, on and off the set. They just became these people, and Viggo couldn’t be less like Tony Lip in real life, and the same with Mahershala and Don. They both worked so hard behind the scenes, and I got a call from Steven Spielberg when he first saw it, and he told me, “This is the best buddy movie since Butch Cassidy and the Sundance Kid,” and he’s right.

It’s a road picture, but didn’t you end up shooting it all in and around New Orleans?
Yes, we did everything there apart from one day in northern New Jersey to get the fall foliage, and a day of exteriors in New York City with Viggo for all the street scenes. Louisiana has everything, from rolling hills to flats. We also found all the venues and clubs they play in, along with mansions and different looks that could double for places like Pennsylvania, Ohio, Indiana, Iowa, Missouri, Kentucky and Tennessee, as well as the Carolinas and the Deep South.

We shot for just 35 days, and Louisiana has great and very experienced crews, so we were able to work pretty fast. Then for scenes like Carnegie Hall, we used CGI in post, done by Pixel Magic, and we were also amazingly lucky when it came to the snow scenes set in Maryland at the end. We were all ready to use fake snow when it actually started snowing and sticking. We got a good three, four inches, which they told us hadn’t happened in a decade or two down there.

Where did you post?
We did most of the editing at my home in Ojai, and the sound at FotoKem, where we also did the DI with colorist Walter Volpatto.

Do you like the post process?
I love it. My favorite part of filmmaking is the editing. Writing is the hardest part, pulling the script together. And I always have fun on the shoot, but you’re always having to make sure you don’t screw up the script. So when you get to the edit and post, all the hard work is done in that sense, and you have the joy of watching the movie find its shape as you cut and add in the sound and music.

What were the big editing challenges, given there’s such a mix of comedy and drama?
Finding that balance was the key, but this film actually came together so easily in the edit compared with some of the movies I’ve done. I’ll never forget seeing the first assembly of There’s Something About Mary, which I thought was so bad it made me want to vomit! But this just flowed, and Patrick did a beautiful job.

Can you talk about the importance of music and sound in the film?
It was a huge part of the film and we had a really amazing pianist and composer in Kris Bowers, who worked a lot with Mahershala to make his performance as a musician as authentic as possible. And it wasn’t just the piano playing — Mahershala told me right at the start, “I want to know just how a pianist sits at the piano, how he moves.” So he was totally committed to all the details of the role. Then there’s all the radio music, and I didn’t want to use all the obvious, usual stuff for the period, so we searched out other great, but lesser-known songs. We had great music supervisors, Tom Wolfe and Manish Raval, and a great sound team.

We’re already heading into the awards season. How important are awards to you and this film?
Very important. I love the buzz about it because that gets people out to see it. When we first tested it, we got 100%, and the studio didn’t quite believe it. So we tested again, with “a tougher” audience, and got 98%. But it’s a small film. Everyone took pay cuts to make it, as the budget was so low, but I’m very proud of the way it turned out.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Red upgrades line with DSMC2 Dragon-X 5K S35 camera

Red Digital Cinema has further simplified its product line with the DSMC2 Dragon-X 5K S35 camera. Red also announced the DSMC2 Production Module and DSMC2 Production Kit, which are coming in early 2019. More on that in a bit.

The DSMC2 Dragon-X camera uses the Dragon sensor technology found in many of Red’s legacy cameras with an evolved sensor board to enable Red’s enhanced image processing pipeline (IPP2) in camera.

In addition to IPP2, the Dragon-X provides 16.5 stops of dynamic range, as well as 5K resolution up to 96fps in full format and 120fps at 5K 2.4:1. Consistent with the rest of Red’s DSMC2 line-up, Dragon-X offers 300MB/s data transfer speeds and simultaneous recording of Redcode RAW and Apple ProRes or Avid DNxHD/HR.

The new DSMC2 Dragon-X is priced at $14,950 and is also available as a fully configured kit priced at $19,950. The kit includes: 480GB Red Mini-Mag; Canon lens mount; Red DSMC2 Touch LCD 4.7-inch monitor; Red DSMC2 outrigger handle; Red V-Lock I/O expander; two IDX DUO-C98 batteries with VL-2X charger; G-Technology ev Series Red Mini-Mag reader; Sigma 18-35mm F1.8 DC HSM Art lens; and Nanuk heavy-duty camera case.
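
For a sense of scale (my own worst-case arithmetic, not a published Red record time), the kit's 480GB Mini-Mag would fill in under half an hour if data were written at the media's full 300MB/s:

    def record_minutes(card_gb: float, write_mb_per_s: float) -> float:
        """Minutes a card holds at a sustained write rate. Actual Redcode
        record times vary with resolution, frame rate and compression;
        this is only the worst case at the media's rated ceiling."""
        return card_gb * 1000 / write_mb_per_s / 60  # decimal GB -> MB -> minutes

    print(record_minutes(480, 300))  # ~26.7 minutes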

Both the camera and kit are available now at red.com or through Red’s authorized dealers.

Red also announced the new DSMC2 Production Module. Designed for pro shooting configurations, this accessory mounts directly to the DSMC2 camera body and incorporates an industry standard V-Lock mount with integrated battery mount and P-Tap for 12V accessories. The module delivers a comprehensive array of video, XLR audio, power and communication connections, including support for 3-pin 24V accessories. It has a smaller form factor and is more lightweight than Red’s RedVolt Expander with a battery module.

The DSMC2 Production Module is available to order for $4,750 and is expected to ship in early 2019. It will also be available as part of a DSMC2 Production Kit that includes the DSMC2 Production Module and DSMC2 production top plate. The DSMC2 Production Kit is available to order for $6,500 and is also expected to ship in early 2019.

Scarlet-W owners can upgrade to DSMC2 Dragon-X for $4,950 through Red authorized dealers or directly from Red.

DP Chat: Green Book’s Sean Porter

Sean Porter has worked as a cinematographer on features, documentaries, short films and commercials. He was nominated for a Film Independent Spirit Award for Best Cinematography for his work on It Felt Like Love, and his credits include 20th Century Women, Green Room, Rough Night and Kumiko, the Treasure Hunter.

His most recent collaboration was with director Peter Farrelly on Green Book, which is currently in theaters. Set in 1962, the film follows Italian-American bouncer/bodyguard Tony Lip (Academy Award-nominee Viggo Mortensen) and world-class black pianist Dr. Don Shirley (Academy Award-winner Mahershala Ali) on a concert tour from Manhattan to the Deep South. They must rely on “The Green Book” to guide them to the few establishments that were then safe for African-Americans. Confronted with racism and danger — as well as unexpected humanity and humor — they are forced to set aside differences to survive and thrive on the journey of a lifetime.

Green Book director Peter Farrelly (blue windbreaker) with DP Sean Porter (right, brown jacket).

Porter chose the Alexa Mini mounted with Leica Summilux-C lenses to devise the look for Green Book. End-to-end post services were provided by FotoKem, from dailies at its New Orleans site to final color and deliverables in Burbank.

We spoke to him recently about his rise to director of photography and his work on Green Book:

How did you become interested in cinematography?
My relationship with cinematography, and really filmmaking, developed over many years during my childhood. I didn’t study fine art or photography in school, but discovered it later as many others do. I went in through the front door when I was probably 12 or so, and it’s been a long road.

I'm the oldest of four — two brothers and a sister. We grew up in a small town about an hour outside of Seattle, and we had a modest yard that butted up against the “back woods.” It was an event when the neighborhood kids got on bikes and rode a half mile or so to the only small convenience store around. There wasn't much to do there, so we naturally had to be pretty inventive in our play. We'd come home from school and put on the TV, and at the time Movie Magic was airing on The Discovery Channel. That show honestly was a huge inspiration, not only to me but to my brothers as well, who are also visual artists. It was right before Jurassic Park changed the SFX landscape — it was a time when everything was still done photographically, by hand. There were episodes showing how these films achieved all sorts of amazing images using rather practical tools and old-school artistry.

My dad was always keen on technology and he had various camcorders throughout the years, beginning with the VHS back when the recorder had to be carried separately. As the cameras became more compact and easier to use, my brothers and I would make all kinds of films, trying to emulate what we had seen on the show. We were experimenting with high-level concepts at a very young age, like forced perspective, matte paintings, miniatures (with our “giant” cat as the monster) and stop motion.

I picked up the technology bug and by the time I was in middle school I was using our family’s first PC to render chromakeys — well before I had access to NLEs. I was conning my teachers into letting me produce “video” essays instead of writing them. Later we moved closer to Seattle and I was able to take vocational programs in media production and went on to do film theory and experimental video at the University of Washington, where I think I started distilling my focus as a cinematographer.

What inspires you artistically? And how do you simultaneously stay on top of advancing technology that serves your vision?
As I mentioned earlier, I didn't discover film via fine art or photography, so I didn't have that foundation of image making and color theory. I learned it all just by doing and paying attention to what I responded to. I didn't have famous artists to lean on. You could say it was much more grassroots. My family loved popular films, especially American comedies and action-adventure. We watched things like Spies Like Us, Star Wars, Indiana Jones and The Princess Bride. It was all pure entertainment, of course. I wasn't introduced to Bergman or Fellini until much, much later. As we got older, my film language expanded and I started watching films by Lynch and Fincher. I will say that those popular '90s films had a great combination of efficient storytelling and technical craft that I still resonate with to this day. It's very much a sort of “blue-collar” film language.

Staying on top of the technology oscillates between an uncontrollable obsession and an unbearable chore. I’ve noticed over the years that I’m becoming less and less invigorated by the tech — many of the new tools are invaluable, but I love relying on my team to filter out the good from the hype so I can focus on how best to tell the story. Some developments you simply can’t ignore; I remember the day I saw footage in class from a Panasonic DVX100. It changed everything!

What new technology has changed the way you work (looking back over the past few years)?
I feel like the digital cameras, while continuing to get better, have slowed down a bit. There was such a huge jump between the early 2000s and the late 2000s. There’s no question digital acquisition has changed the way we make images — and it’s up for debate if it’s been a wholly positive shift. But generally, it’s been very empowering for filmmakers, especially on smaller budgets. It’s given me and my peers the chance to create cinema-quality images on projects that couldn’t afford to shoot on 16mm or 35mm. And over the last five years, the gap between digital and film has diminished, even vanished for many of us.

But if I had to single out one development it’s probably been LEDs over the last two or three years. Literally, five years ago it was all HMI and Kino Flos, and now I don’t remember the last time I touched a Kino. Sometimes we go entire jobs without firing up an HMI. The LEDs have gotten much better recently, and the control we have on set is unprecedented. It makes you wonder how we did it before!

What are some of your best practices or rules you try to follow on each job?
Every time I start a new project, I say to myself, “This time I’m going to get my shit together.” I think I’m going to get organized, develop systems, databases, Filemaker apps, whatever, and streamline the process so I can be more efficient. I’ll have a method for combining scouting photos with storyboards and my script notes so everything is in one place and I can disseminate information to relevant departments. Then I show up at prep and realize the same thing I realize every movie: They are all so, so different.

It’s an effort in futility to think you can adopt a “one-size-fits-all” mentality to preproduction. It just doesn’t work. Some directors storyboard every shot. Some don’t even make shot lists. Some want to previs every scene during the scouting process using stand-ins, others won’t even consider blocking until the actors are there, on the day. So I’ve learned that the efficiency is found in adaptation. My job is to figure out how to get inside my director’s head, see things the way they are seeing them and help them get those ideas into actions and decisions. There’s no app for that, unfortunately! I suppose I try to really listen, and not just to the words my director uses to describe things, but to the subtext and what is between the lines. I try to understand what’s really important to them so I can protect those things and fight for them when the pressure to compromise starts mounting.

Linda Cardellini as Dolores Vallelonga and Viggo Mortensen as Tony Vallelonga in “Green Book,” directed by Peter Farrelly.

On a more practical note, I read many years ago about a DP who would stand on the actor’s mark and look back toward the camera — just to be aware of what sort of environment they were putting the talent in. Addressing a stray glare or a distracting stand might make a big difference to the actor’s experience. I try to do that as often as I can.

Explain your ideal collaboration with the director when setting the look of a project.
It’s hard to reduce such an array of possible experiences down to an “ideal,” as an ideal situation for one film might not be ideal for another depending on the experience the director wants to create on set. I’ve had many different, even conflicting, “processes” with my directors because it suited that specific collaboration. Again, it’s about adapting, being a chameleon to their process. It’s not about coming in and saying, “This is the best way to do this.”

I remember with one director, we basically locked ourselves in her apartment for three days and just watched films. We'd pause them now and then and discuss a shot or a scene, but a lot of the time it was just about being together, experiencing this curated body of work and creating a visual foundation for us to work from. With another director, we didn't really watch any films at all, but we did lots and lots of testing. Camera tests, lens tests, lighting tests, filter tests, makeup and SFX tests. And we'd go into a DI suite and look at everything and talk about what was working and what wasn't. He was also a DP, so I think that technical, hands-on approach made sense to him. I think I tested every commercially available fluorescent tube on the market to find the right color for that film. I'll admit, as convenient as it would be to have a core strategy to work from, I think I would tire of it. I love walking onto a film and saying, “OK, how are we going to do this?”

Tell us about Green Book. How would you describe the overarching look of the film that you and Peter Farrelly wanted to achieve?
I think, maybe more than I want to admit, that the look of my films is a culmination of the restraints imparted either by myself or by production. You're only going to have a certain amount of time and money for each scene, so calculations and compromises must be made there. You have to work with the given location, the time of day and how the art department is going to dress it, so that adds a critical layer. Peter wanted to work a certain way with his actors and have lots of flexibility, so you adapt your process to make that work. Then you give yourself certain creative constraints, and somewhere in between all those things pushing on each other, the look of the film emerges.

That sounds a little arbitrary, and Pete and I did have some discussions about how it should look, but they were broad conversations. Honesty and authenticity were very important to Pete. He didn't want things to ever look or feel disingenuous. My very first conversation with him after I was hired was about the car work. He was getting pressure to shoot it all on stage with LED screens. I was honest with him. I told him he'd probably get more time with his actors and more predictable results on stage, but that he'd get more realism in the look and the performances by dragging the entire company out onto the open road and battling the elements.

So we shot all the car work practically, save for a few specific night scenes. I took his words to heart and tried to shape the look out of what was authentic to the time. My gaffer and I researched what lighting fixtures were used then — it wasn't like it is now, with hundreds of different light sources. Back then it was basically tungsten, fluorescent, neon, mercury and sodium. We limited our palette to those colors and tuned all our fixtures accordingly. I also avoided stylistic choices that would have made the film feel dated or “affected” — the production design, wardrobe and MCU departments did all of that. Pete and I wanted the story to feel just as relevant now as it did then, so I kept the images clean and largely unadulterated.

How early did you get involved in the production?
I came on about five weeks before shooting. I prepped for one week and then we were all sent home! Some negotiations had stalled production and for several weeks I didn’t know if we would start up again. I’m very grateful everyone made it work so we could make the film.

How did you go about choosing the right camera and lenses for Green Book?
While 35mm would have been a great choice aesthetically for the film, there were some real production advantages to shooting digitally. As we were shooting all the car work practically, my priority was to get as much of the coverage inside the car as possible in one go. Changing lighting conditions, road conditions and tight schedules prohibited me from shooting an angle, then pulling over and re-rigging the camera. We had up to three Alexa Mini cameras inside the car at once, and many times that was all the coverage planned for the scene, save for a couple of cutaways. This allowed us to get multi-page scenes done very efficiently while maintaining light continuity, keeping the realism of the landscapes and capturing those happy (and sometimes sad) accidents.

I chose some very clean, very fast, and very portable lenses: the Leica Summilux-Cs. I used to shoot stills with various Leica film cameras and developed an affinity for the way the lenses rendered. They are always sharp, but there’s some character to the fall off and the micro-contrast that always make faces look great. I had shot many of my previous films with vintage lenses with lots of character and could have easily gone that route, but as I mentioned, I was more interested in removing abstractions — finding something more modern yet still classic and utilitarian.

Any challenging scenes that you are particularly proud of?
Not so much a particular scene, but a spanning visual idea. Many times, when you start a film, you'll have some cool visual arc you want to try to employ, and along the way various time, location or schedule constraints eventually break it all down. Then you're left with a few disparate elements that don't connect the way you wanted them to. Knowing I would face those same challenges, but having a bit more resources than on some of my other films, I aimed low but held my ground: I wanted the color of the streetlights to work on a spectrum, shifting between safety and danger depending on the scene or where things were heading in the story.

I broke the film down by location and worked with my gaffer to decide where the environment would be majority sodium (safe/familiar/hopeful) and where it would be mercury (danger/fear/despair). It sounds very rudimentary but when you try to actually pull it off with so many different locations, it can get out of hand pretty quickly. And, of course, many scenes had varying ratios of those colors. I was pleased that I was able to hold onto the idea and not have it totally disintegrate during the shoot.

What’s your go-to gear (camera, lens, mount/accessories) — things you can’t live without?
Go-to tools change from job to job, but the one I rely on more than any is my crew. Their ideas, support and positive energy keep me going in the darkest of hours! As for the nuts and bolts — lately I rarely do a job without SkyPanels and LiteMats. For my process on set, I've managed to get rid of just about everything except my light meter and my digital still camera. The still camera is a very fast way to line up shots, and I can send images to my iPad and immediately communicate framing ideas to all departments. It saves a lot of time and guesswork!

Main Image: Sean Porter (checkered shirt) on set of Green Book, pictured with director Peter Farrelly.

Steve McQueen on directing Widows

By Iain Blair

British director/writer/producer Steve McQueen burst onto the international scene in 2013 when his harrowing 12 Years a Slave dominated awards season, winning an Academy Award, a Golden Globe, a BAFTA and a host of others. His directing was also recognized with many nominations and awards.

Now McQueen, who also helmed the 2011 feature Shame (Michael Fassbender, Carey Mulligan) is back with the film Widows.

A taut thriller, 20th Century Fox’s Widows is set in contemporary Chicago in a time of political and societal turmoil. When four armed robbers are killed in a botched heist, their widows — with nothing in common except a debt left behind by their dead husbands’ criminal activities — take fate into their own hands to forge a future on their own terms.

With a screenplay by Gillian Flynn and McQueen himself — and based on the old UK television miniseries of the same name — the film stars, among others, Viola Davis, Michelle Rodriguez, Colin Farrell, Brian Tyree Henry, Daniel Kaluuya, Carrie Coon, Jon Bernthal, Robert Duvall and Liam Neeson.

The production team includes Academy Award-nominated editor Joe Walker (12 Years a Slave), Academy Award-winning production designer Adam Stockhausen (The Grand Budapest Hotel) and director of photography Sean Bobbitt (12 Years a Slave).

I spoke with McQueen, whose credits also include 2008’s Hunger, about making the film and his love of post.

This isn’t just a simple heist movie, is it?
No, it isn’t. I wanted to make an all-encompassing movie, an epic in a way, about how we live our daily lives and how they’re affected by politics, race, gender, religion and corruption, and do it through this story. I remember watching the TV series as a kid and how it affected me — how strong all these women were — and I decided to change the location from London to Chicago, which is really an under-used city in movies, and make it a more contemporary view of all these issues.

You assembled a great cast, led by Oscar-winner Viola Davis. What did she bring to the table?
So much weight and gravitas. She’s like an iceberg. There’s so much hidden depth in everything she does, and there’s this well of meaning and emotion she brings to the role, and then everyone has to step up to that.

What were the main technical challenges in pulling it all together?
The big one was logistics and dealing with all the Chicago locations. We had over 60 locations, all over the city, and 81 speaking parts. So there was a lot of planning, and if one thing got stuck it threw off the whole schedule. It would have been almost impossible to reschedule some of the scenes.

How tough was the shoot?
Pretty tough. They’re always grueling, and when you’re writing a script you don’t always think about how many night shoots you’re going to face, and you forget about this big machine you have to bring with you to all the locations. Trying to make any quick change or adjustment is like trying to turn the Titanic. It takes a while.

How early on did you start integrating post and all the VFX?
From day one. You have to when you have a big production with a set release date, so we began cutting and assembling while I shot.

Where did you post?
In Amsterdam, where I live, and then we finished it off in London.

Do you like the post process?
I love it. It's my favorite part, as you have civilized hours — 9 till 5 or whatever — and you're in total control. You're not having to deal with 40 or 50 people. It's just you and the editor in a dark room, actually making the film.

Joe Walker has cut all of your films, including Hunger and Shame, as well Blade Runner 2049, Arrival and Sicario. Can you talk about working with him?
He wasn't on set, and we had someone else assembling stuff, as Joe was still finishing up Blade Runner. He came in when I got back to Amsterdam. Joe and I go way back to 2007, when we did Hunger, and we always work very closely together. I sit right next to him, and I'm there for every single cut, dissolve, whatever. I'm very present. I'm not one of those directors who comes in, gives some notes and then disappears. I don't know how you do that. I love editing and finding the pace and rhythm. What makes Joe such a great editor is that he started off in music, so he has a great sense of how to work with sound.

What were the big editing challenges?
There are all these intertwined stories and characters, so it’s about finding the right balance and tone and rhythm. The whole opening sequence is all about pulling the audience in and then grabbing them with a caress and then a slap — and another caress and slap — as we set up the story and the main characters. Then there are so many parts to the story that it’s like this big Swiss watch: all these moving parts and different functions. But you always go back to the widows. A script isn’t a film, it’s a guide, so you’re feeling your way in the edit, and seeing what works and what doesn’t. The whole thing has to be cohesive, one thing. That’s your goal.

What about the visual effects?
They were all done by One Of Us and Outpost VFX (both in the UK), but the VFX were all about enhancing stuff, not dazzling the audience. The aim was always for realism, not fantasy.

Talk about the importance of sound and music.
They're huge for me, and it's interesting, as a lot of the movie has no sound or music. At the beginning, there's just this one chord on a violin when we get to the title card, and that's it. There's no music for two-thirds of the movie, and then we only have some ambient music, Procol Harum's “Whiter Shade of Pale” and a Van Morrison song. That's why all the sound design is so important. When the women lose their husbands, I didn't want it to be hammy and tug at your heartstrings. I wanted you to feel that pain and that grief and that journey. When they start to act and take control of their lives, that's when the music and sound kick in, almost like this muscular drive. Our supervising sound editor James Harrison did a great job with all that. We did all the mixing in Atmos at De Lane Lea in London.

Where did you do the DI and how important is it to you?
We did it at Company 3 London with colorist Tom Poole, and it’s very important. We shot on film, and our DP Sean and I spent a lot of time just talking about the palette and the look. When you’re shooting in over 60 locations, it’s not so much about putting your own stamp and look on them, but about embracing what they offer you visually and then tweaking it.

For the warehouse scenes, there was a certain mood and it had crappy tungsten lighting, so we changed it a bit to feel more tactile, and it was the same with most of the locations. We’d play with the palette and the visual mood, which the DI allows you to do so well.

Did the film turn out the way you hoped?
(Laughs) I always hope it turns out better than I hoped or imagined, as your imagination can only take you so far. What’s great is when you go beyond that and come up with something cooler than you could have imagined. That’s what I always want.

What’s next?
I’ve got a few things cooking on the stove, and I should finish writing something in the next few months and then start it next year.

All Images Courtesy of 20th Century Fox/Merrick Morton


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

DP Chat: Polly Morgan, ASC, BSC

Cinematographer Polly Morgan, who became an active member of the ASC in July, had always been fascinated with films, but she got the bug for filmmaking as a teenager growing up in Great Britain, when a film crew shot at her family's farmhouse.

“I was fixated by the camera and cranes that were being used, and my journey toward becoming a cinematographer began.”

We reached out to Morgan recently to talk about her process and about working on the FX show Legion.

What inspires you artistically? And how do you simultaneously stay on top of advancing technology that serves your vision?
I am inspired by the world around me. As a cinematographer, you learn to look at life in a unique way, noticing elements that you might not have been aware of before: reflections, bounced light, colors, atmosphere and so much more. When I have time off, I love to travel and experience different cultures and environments.

I spend my free time reading various periodicals to stay on top of the latest developments in technology. Various publications, such as the ASC's magazine, help to not only highlight new tools but also people's experiences with them. The filmmaking community is united by this exploration, and there are many events where we are able to get together and share our thoughts on a new piece of equipment. I also try to visit different vendors to see demos of new advances in technology.

Has any recent or new technology changed the way you work?
Live on-set grading has given me more control over the final image when I am not available for the final DI. Over the last two years, I have worked more on episodic television, and I am often unable to go and sit with the colorist to do the final grade, as I am working on another project. Live grading enables me to get specific with adjustments on the set, and I feel confident that with good communication, these adjustments will be part of the final look of the project.

How do you go about choosing the right camera and lenses to achieve the right look for a story?
I like to vary my choice of camera and lenses depending on what story I am telling.
When it comes to cameras, resolution is an important factor depending on how the project is going to be broadcast and if there are specific requirements to be met from the distributor, or if we are planning to do any unique framing that might require a crop into the sensor.

Also, ergonomics play a part. Am I doing a handheld show, or mainly one in studio mode? Or are there any specifications that make the camera unique that will be useful for that particular project? For example, I used the Panasonic VariCam when I needed an extremely sensitive sensor for night driving around downtown Los Angeles. Lenses are chosen for contrast and resolution and speed. Also, sometimes size and weight play a part, especially if we are working in tight locations or doing lots of handheld.

What are some best practices, or rules, you try to follow on each job?
Every job is different, but I always try to root my work in naturalism to keep it grounded. I feel like a relatable story can have the most impact on its viewer, so I want to make images that the audience can connect with and be drawn into emotionally. As cinematographers, we want our work to be invisible yet always support and enhance the narrative.

On set, I always ensure a calm and pleasant working environment. We work long and bizarre hours, and the work is demanding, so I always strive to make it an enjoyable and safe experience for everyone.

Explain your ideal collaboration with the director when setting the look of a project.
It is always my aim to get a clear idea of what the director is imagining when they describe a certain approach. As we are all so different, it is really about establishing a language that can be a shorthand on set and help me to deliver exactly what they want. It is invaluable to look at references together, whether that is art, movies, photography or whatever.

As well as the “look,” I feel it is important to talk about pace and rhythm and how we will choose to represent that visually. The ebb and flow of the narrative needs to be photographed, and sometimes directors want to do that in the edit, or sometimes we express it through camera movement and length of shots. Ideally, I will always aim to have a strong collaboration with a director during prep and build a solid relationship before production begins.

How do you typically work with a colorist?
This really varies from project to project, depending if I am available to sit in during the final DI. Ideally, I would work with the colorist from pre-production to establish and build the look of the show. I would take my camera tests to the post house and work on building a LUT together that would be the base look that we work off while shooting.

I like to have an open dialogue with them during the production stage so they are aware and involved in the evolution of the images.

During post, this dialogue continues as VFX work starts to come in and we start to bounce the work between the colorist and the VFX house. Then in the final grade, I would ideally be in the room with both the colorist and the director so we can implement and adjust the look we have established from the start of the show.

Tell us about FX’s Legion. How would you describe the general look of the show?
Legion is a love letter to art. It is inspired by anything from modernist pop art to old Renaissance masters. The material is very cerebral, and there are many mental planes or periods of time to express visually, so it is a very imaginative show. It is a true exploration of color and light and is a very exciting show to be a part of.

How early did you get involved in the production?
I got involved with Legion starting in Season 2. I work alongside Dana Gonzales, ASC, who established the look of the show in Season 1 with creator Noah Hawley. My work begins during the production stage, working with various directors as they prep and shoot their individual episodes.

Any challenging scenes that you are particularly proud of how they turned out?
Most of the scenes in Legion take a lot of thought to figure out… contextually as well as practically. In Season 2, Episode 2, a lot of the action takes place out in the desert. After a full day, we still had a night shoot to complete with very little time. Instead of taking time to try to light the whole desert, I used one big soft overhead and then lit the scene with flashlights on the characters' guns and the headlights of the trucks. I added blue streak filters to create multiple horizontal blue flares from each on-camera source (headlights and flashlights), which provided a very striking lighting approach.

FX’s Legion, Season 2, Episode 2

With the limited hours available, we didn’t have enough time to complete all the coverage we had planned so, instead, we created one very dynamic camera move that started overhead looking down at the trucks and then swooped down as the characters ran out to approach the mysterious object in the scene. We followed the characters in the one move, ending in a wide group shot. With this one master, we only ended up needing a quick reverse POV to complete the scene. The finished product was an inventive and exciting scene that was a product of limitations.

What’s your go-to gear (camera, lens, mount/accessories you can’t live without)?
I don’t really have any go-to gear except a light meter. I vary the equipment I use depending on what story I am telling. LED lights are becoming more and more useful, especially when they are color- and intensity-controllable and battery-operated. When you need just a little more light, these lights are quick to throw in and often save the day!

iOgrapher now offering Multi Case for Androids and iOS phones

iOgrapher has debuted the iOgrapher Multi Case rig for mobile filmmaking. It's the company's first non-iOS offering. An early pioneer of mobile media filmmaking cases for iOS devices, iOgrapher is now targeting mobile filmmakers with a flexible design that supports recent-model iOS and Android mobile phones of all sizes.

The iOgrapher Multi Case features:

• Slide-in design for a strong and secure fit
• The ability to attach lighting and mics for higher quality mobile video production
• Flexible mount options for any standard tripod in landscape or portrait mode
• ¼-20 screw mounts on handles to attach accessories
• Standard protective cases for your phone can be used — filmmakers no longer need to remove protective cases to use the iOgrapher Multi Case
• It works with Moment Lenses. Users do not need to remove Moment Lens cases or lenses to use the iOgrapher Multi Case
• The Multi Case is designed to work with iPhone 6 and later models, and has been tested to work with popular Samsung, Google Pixel, LG and Motorola phones.

With the launch of the Multi Case, iOgrapher is introducing a new design. The capabilities and mounting options have evolved as a result of customer reviews and feedback, as well as real-world use cases from professional broadcasters, filmmakers, pro-sport coaches and training facilities.

The iOgrapher Multi Case is available for pre-order and is priced at $79. It will ship at the end of November.

Timecode Systems’ timecode-over-bluetooth solution

Timecode Systems has introduced UltraSync Blue, which uses the company’s new patented timecode sync and control protocol. UltraSync Blue transmits timecode to a recording device over Bluetooth with sub-frame accuracy. This enables timecode to be transmitted wirelessly from UltraSync Blue directly into the media file of a connected device.

“The beauty of this solution is that the timecode is embedded directly into a timecode track, so there is no need for any additional conversion software; the metadata is in the right format to be automatically recognized by professional NLEs,” reports Paul Scurrell, CEO of Timecode Systems. “This launches a whole new era for multicamera video production in which content from prosumer and consumer audio and video has the potential to be combined, aligned and edited together with ease and efficiency, and with the same high level of accuracy as footage from top-end, professional recording devices.”
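
The idea is easy to see in miniature: once every device stamps its media with the same running timecode, syncing in post reduces to simple frame arithmetic. The Python sketch below is a toy illustration (assuming non-drop-frame 25fps and invented function names), not Timecode Systems' actual protocol.

```python
# Toy sketch (not Timecode Systems' protocol): embedded timecode is what
# lets an NLE auto-align clips. Assumes non-drop-frame 25fps.
FPS = 25

def tc_to_frames(tc: str) -> int:
    """Convert an 'HH:MM:SS:FF' timecode string to an absolute frame count."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * FPS + ff

def sync_offset(camera_start: str, audio_start: str) -> int:
    """How many frames to slip the audio clip so both start together."""
    return tc_to_frames(camera_start) - tc_to_frames(audio_start)

# Camera rolled 5s12f after the audio recorder: slip audio by 137 frames.
print(sync_offset("10:15:03:12", "10:14:58:00"))  # -> 137
```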

The device itself measures just 55mmx43mmx17mm, weighs only 36g, and costs $179 US, making it small enough to fit neatly into a pocket during filming and affordable enough to be used on any type of production, from documentaries, news gathering, and reality shows to wedding videos and independent films.

By removing the restrictions of a wired connection, crews not only benefit from the convenience of being cable-free, but also from even more versatility in how they can sync content. One feature of UltraSync Blue is the ability to use a single unit to sync up to four recording devices shooting in close range over Bluetooth — a great option for small shoots and interviews, and also for content captured for vlogs and social media.

However, as filming is not always this simple, especially in the professional world, UltraSync Blue is also designed to work seamlessly with the rest of the Timecode Systems product range. For more complicated shoots, sprawling filming locations and recording using a variety of professional equipment, UltraSync Blue can be connected to devices over Bluetooth and then synced over robust, long-range RF to other camera and audio recorders using Timecode Systems units. This also includes any equipment containing a Timecode Systems OEM sync module, such as the AtomX Sync module that was recently launched by Atomos for the new Ninja V.

“With more and more prosumer and consumer cameras and sound recorders coming with built-in Bluetooth technology, we saw an opportunity to use this wireless connectivity to exchange timecode metadata,” Scurrell adds. “By integrating a robust Bluetooth Low Energy chip into UltraSync Blue, we’ve been able to create a simple, low-cost timecode sync product that has the potential to work with any camera or sound recording device with Bluetooth connectivity.”

Timecode Systems is now working with manufacturers and app developers to adopt its new super-accurate timing protocol into their Bluetooth-enabled products. At launch, both the MAVIS professional camera app and Apogee MetaRecorder app (both for iPhone) are already fully compatible, allowing — for the first time — sound and video recorded on iPhone devices to be synchronized over the Timecode Systems network.

“It’s been an exciting time for sync technology. In the past couple of years, we’ve seen some massive advancements not only in terms of reducing the size and cost of timecode solutions, but also with solutions becoming more widely compatible with more consumer-level devices such as GoPro and DSLR cameras,” Scurrell explains. “But there was still no way to embed frame-accurate timecode into sound and video recordings captured on an iPhone; this was the biggest thing missing from the market. UltraSync Blue, in combination with the MAVIS and MetaRecorder apps, fills this gap.”

Zoom Corporation is working on new releases of its H3-VR Handy Recorder and F8n MultiTrack Field Recorder. When released later this year, both of these Zoom sound recorders will have the ability to receive timecode over Bluetooth from the UltraSync Blue.

Timecode Systems is now taking orders for UltraSync Blue and will be shipping in October 2018.

New CFast 2.0 card for ARRI Alexa Mini and Amira cameras

ARRI has introduced the ARRI Edition AV Pro AR 256 CFast 2.0 card by Angelbird, which has been designed and certified for use in the ARRI Alexa Mini and Amira camera systems and can be used for ProRes and MXF/ARRIRAW recording. (Support for new CFast 2.0 cards is currently not planned for ALEXA XT, SXT(W) and LF cameras.)

ARRI has worked closely with Angelbird Technologies, based in Vorarlberg, Austria. Angelbird is no stranger to film production, and some of their gear can be found at ARRI Rental European locations.

For the ARRI Edition CFast card, the Angelbird team developed an ARRI-specific card that uses a combination of thermally conductive material and so-called underfill to provide superior heat dissipation from the chips and to secure the electronic components against mechanical damage.

The result, according to ARRI, is a rock-solid 256 GB CFast 2.0 card with stable recording performance all the way across the storage space. The ARRI Edition AV PRO AR 256 memory card is available from ARRI and other sales channels offering ARRI products.

GoPro introduces new Hero7 camera lineup

GoPro’s new Hero7 lineup includes the company’s flagship Hero7 Black, which comes with a timelapse video mode, live streaming and improved video stabilization. The new video stabilization, HyperSmooth, allows users to capture professional-looking, gimbal-like stabilized video without a motorized gimbal. HyperSmooth also works underwater and in high-shock and wind situations where gimbals fail.

With Hero7 Black, GoPro is also introducing a new form of video called TimeWarp. TimeWarp Video applies a high-speed, “magic-carpet-ride” effect, transforming longer experiences into short, flowing videos. Hero7 Black is the first GoPro to live stream, enabling users to automatically share in realtime to Facebook, Twitch, YouTube, Vimeo and other platforms internationally.

Other Hero7 Black features:

  • SuperPhoto – Intelligent scene analysis for professional-looking photos via automatically applied HDR, Local Tone Mapping and Multi-Frame Noise Reduction
  • Portrait Mode – Native vertical-capture for easy sharing to Instagram Stories, Snapchat and others
  • Enhanced Audio – Re-engineered audio captures increased dynamic range, new microphone membrane reduces unwanted vibrations during mounted situations
  • Intuitive Touch Interface – 2-inch touch display with simplified user interface enables native vertical (portrait) use of camera
  • Face, Smile + Scene Detection – Hero7 Black recognizes faces, expressions and scene-types to enhance automatic QuikStory edits on the GoPro app
  • Short Clips – Restricts video recording to 15- or 30-second clips for faster transfer to phone, editing and sharing.
  • High Image Quality – 4K/60 video and 12MP photos
  • Ultra Slo-Mo – 8x slow motion in 1080p240
  • Waterproof – Waterproof without a housing to 33ft (10m)
  • Voice Control – Verbal commands are hands-free in 14 languages
  • Auto Transfer to Phone – Photos and videos move automatically from camera to phone when connected to the GoPro app for on-the-go sharing
  • GPS Performance Stickers – Users can track speed, distance and elevation, then highlight them by adding stickers to videos in the GoPro app

The Hero7 Black is available now on pre-order for $399.

Panavision, Sim, Saban Capital agree to merge

Saban Capital Acquisition Corp., a publicly traded special purpose acquisition company, Panavision and Sim Video International have agreed to combine their businesses to create a premier global provider of end-to-end production and post production services to the entertainment industry. Under the terms of the business combination agreement, Panavision and Sim will become wholly owned subsidiaries of Saban Capital Acquisition Corp. Upon completion, Saban Capital Acquisition Corp. will change its name to Panavision Holdings Inc. and is expected to continue to trade on the Nasdaq stock exchange. Kim Snyder, president and chief executive officer of Panavision, will serve as chairman and chief executive officer. Bill Roberts, chief financial officer of Panavision, will serve in that role for the combined company.

Panavision designs, manufactures and provides high-precision optics and camera technology for the entertainment industry and is a leading global provider of production equipment and services. Sim is a leading provider of production and post production solutions with facilities in Los Angeles, Vancouver, Atlanta, New York and Toronto.

“This acquisition will leverage the best of Panavision’s and Sim’s resources by providing comprehensive products and services to best address the ever-adapting needs of content creators globally,” says Snyder.

“We’re combining the talent and integrated services of Sim with two of the biggest names in the business, Panavision and Saban,” adds James Haggarty, president and CEO of Sim. “The resulting scale of the new combined enterprise will better serve our clients and help shape the content-creation landscape.”

The respective boards of directors of Saban Capital Acquisition Corp., Panavision and Sim have unanimously approved the merger with completion subject to Saban Capital Acquisition Corp. stockholder approval, certain regulatory approvals and other customary closing conditions. The parties expect that the process will be completed in the first quarter of 2019.

Our Virtual Production Roundtable

By Randi Altman

Evolve or die. That old adage, while very dramatic, fits well with the state of our current production workflows. While most productions are now shot digitally, the warmth of film is still in the back of pros’ minds, and camera makers and directors of photography often look for ways to retain that warmth in digital, whether it’s through lighting, vintage lenses, color grading, newer technology or all of the above.

There is also the question of setting looks on-set and how 8K and HDR are affecting the picture and workflows. And let’s not forget shooting for OTT series. There is a lot to cover!

In an effort to get a variety of perspectives, we reached out to a few cinematographers and some camera manufacturers to talk trends and technology. Enjoy!

Claudio Miranda, ASC

Claudio Miranda is a Chilean cinematographer who won an Oscar for his work on Life of Pi. He also worked on The Curious Case of Benjamin Button, the first movie nominated for a cinematography Oscar that was shot entirely on digital. Other films include Oblivion, Tomorrowland and the upcoming Top Gun: Maverick.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
Seems like everyone is shooting large format. Chris Nolan and Quentin Tarantino shot 65mm film for their last projects. New digital cameras such as the Alexa LF and Sony Venice cater to this demand. People seem to like the shallow depth of field of these larger format lenses.

How is HDR affecting the way things are being shot these days? Are productions shooting/monitoring HDR on-set?
For me, too much grain in HDR can be distracting. This must be moderated in the camera acquisition format choice and DI. Panning in a high-contrast environment can cause painful strobing. This can be helped in the DI and set design. HDR done well is more important than 8K or even 3D.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? Or are latitude and framerate more important currently?
8K can be important for VFX plates. For me, creatively it is not important, 4K is enough. The positive of 8K is just more K. The downside is that I would rather the camera companies focus on dynamic range, color latitude, sensitivity and the look and feel of the captured image instead of trying to hit a high K number. Also, there are storage and processing issues.

Can you talk about how shooting streaming content, for OTTs like Netflix/Amazon, has changed production practices, and workflows, if at all?
I have not shot for a streaming service. I do think we need to pay attention to all deliverables and make adjustments accordingly. In the DI, I am there for the standard cinema pass, HDR pass, IMAX pass, home video pass and other formats that arise.

Is the availability of all those camera resolutions a help or a hindrance?
I choose the camera that will fit the job. It is my job in prep to test and pick the camera that best serves the movie.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
On set, I am able to view HDR or 709. I test the pipeline and make sure the LUT is correct, and I make modifications if needed. I do not play with many LUTs on set; I normally just have one. I treat the camera like a film stock. I know I will be there in the DI to finalize the look. On set is not the place for futzing with LUTs on the camera. My plate is full enough as it is.

If not already covered, how has production changed in the last two years?
I am not sure production has changed, but there are many new tools to use to help make work more efficient and economical. I feel that I have always had to be mindful of the budget, no matter how large the show is. I am always looking for new solutions.

Daryn Okada, ASC
Daryn Okada is known for his work on films such as Mean Girls, Anna Karenina and Just Like Heaven. He has also worked on many TV series, such as Scandal, Grey’s Anatomy and Castle. He served as president of the ASC from 2006 to 2009.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses? 

Modern digital cinema cameras, used with the proper workflows and techniques, can achieve a level of quality that lets a story’s visual identity evolve in ways that parallel explorations in shooting on film. Larger image sensors, state-of-the-art lenses and mining historic optics enable cinematographers to use their experience and knowledge of the past to paint rich visual experiences for today’s audience.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?
HDR is a creative and technical medium just as shooting and projecting 65mm film would be. It’s up to the director and the cinematographer to decide how to orchestrate the use of HDR for their particular story.

Can you address 8K? What are the positives, and the negatives? Do we just have too many Ks these days? Or are latitude and framerate more important currently?
8K is working its way into production like 65mm and 35mm VistaVision did, by providing more technical resolution for use in VFX or special-venue exhibition. The enormous amount of data, and the cost to handle it, must be justified by the financial return and by whether it benefits a particular story. Latitude and color depth are paramount to creating a motion picture’s palette and texture. Trying to use a format just because it’s technically possible may be distracting to an audience’s acceptance of a story or creative concept.

Can you talk about how shooting streaming content, for OTTs like Netflix/Amazon, has changed production practices, and workflows, if at all?

I think the delivery specifications of OTT have generally raised the bar, making 4K and wide color gamut the norm. For cinematographers who have spent years photographing features, we are accustomed to creating images with detail for a big screen and a wide color palette. It’s a natural creative process to shoot for 4K and HDR in that respect.

Is the availability of all those camera resolutions a help or a hindrance?
Having the best imaging available is always welcome. Even if a camera is not fully exploited technically, subtler images are possible through the smoother transitions and blending of color, contrast and detail that come from originating at higher resolutions and a wider color range.

Can you talk about color management from the sensor/film to the screen? How do you ensure correct color management from the set into dailies and post, the DI and final delivery?
As cinematographers, we are still involved in workflows for dailies and post production to ensure that everyone’s creative efforts on the final production are maintained for the immediate viewer and preserved for audiences in the future.

How has production changed over the last two years?
There are more opportunities to produce content with creative high-quality cinematography thanks to advancements in cameras and cost-effective computing speed combined with demands of high quality displays and projection.

Vanja Černjul, ASC
This New York-based DP recently worked on the huge hit Crazy Rich Asians. In addition to feature film work, Černjul has shot TV shows (The Deuce’s Season 1 finale and two seasons of Marco Polo), as well as commercials for Panasonic and others.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
One interesting trend I noticed is the comeback of image texture. In the past, cinematographers used to expose film stock differently according to the grain texture they desired. Different exposure zones within the same frame had different grain character, which produced additional depth of the image. We lost that once we switched to digital. Crude simulations of film grain, such as overall filters, couldn’t produce the dimensionality we had with film.

Today, I am noticing new ways of bringing the texture back as a means of creative expression. The first one comes in the form of new, sophisticated post production tools designed to replicate the three-dimensional texturing that occurs naturally when shooting film, such as the realtime texturing tool LiveGrain. Monitoring the image on the set with a LiveGrain texture applied can impact lighting, filtration or lens choices. There are also new ways to manipulate texture in-camera. With the rise of super-sensitive, dual-native ISO sensors we can now shoot at very low-light levels and incorporate so-called photon shot noise into the image. Shot noise has organic character, very much like film grain.
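
To see why shot noise behaves so organically, note that photon arrival follows a Poisson distribution, so the noise scales with the signal itself: brighter areas are proportionally cleaner. The short numpy sketch below simulates that behavior; it is a toy illustration with made-up exposure numbers, not LiveGrain or any camera's actual processing.

```python
import numpy as np

# Minimal sketch of photon shot noise: Poisson-distributed photon
# arrival makes low-exposure images grainier in a signal-dependent,
# film-like way. All numbers are invented for demonstration.
rng = np.random.default_rng(0)

def expose(scene_linear: np.ndarray, photons_at_unity: float) -> np.ndarray:
    """Simulate photon capture; lower photons_at_unity = more shot noise."""
    photons = rng.poisson(scene_linear * photons_at_unity)
    return photons / photons_at_unity  # back to normalized linear light

gray_card = np.full((256, 256), 0.18)      # an 18% gray patch
print(expose(gray_card, 100.0).std())      # ~0.042: visible, organic grain
print(expose(gray_card, 10000.0).std())    # ~0.004: nearly clean
```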

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?

The creative potential of HDR technology is far greater than that of added resolution. Unfortunately, it is hard for cinematographers to take full advantage of HDR because it is still far from being the standard way the audience sees our images. We can’t have two completely different looks for a single project, and we have to make sure the images are working on SDR screens. In addition, it is still impractical to monitor in HDR on the set, which makes it difficult to adjust lighting and lens choices to expanded dynamic range. Once HDR screens become a standard, we will be able to really start creatively exploring this new territory.

Crazy Rich Asians

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? Or are latitude and framerate more important currently?
Additional resolution adds more available choices regarding relationship of optical systems and aspect ratios. I am now able to choose lenses for their artifacts and character regardless of the desired aspect ratio. I can decide to shoot one part of the film in spherical and the other part in anamorphic and crop the image to the project’s predetermined aspect ratio without fear of throwing away too much information. I love that freedom.

Can you talk about how shooting streaming content, for OTTs like Netflix/Amazon, has changed production practices and workflows, if at all?
For me, the only practical difference between shooting high-quality content for cable or streaming is the fact that Netflix demands that their projects be captured in true 4K RAW. I like the commitment to higher technical standards, even though this may be an unwelcome restriction for some projects.

Is the availability of all those camera resolutions a help or a hindrance?
I like choices. As large format lenses become more available, shooting across formats and resolutions will become easier and simpler.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
The key to correct color management from the set to final color grading is in preproduction. It is important to take the time to do proper tests and establish communication between the DIT, the colorist and all the other people involved as early as possible. This ensures that original ideas aren’t lost in the process.

Adjusting and fine-tuning the LUT to the lenses, lighting gels and set design and then testing it with the colorist is very important. Once I have a bulletproof LUT, I light and expose all the material for it specifically. If this part of the process is done correctly, the time in final color grading can be spent on creative work rather than on fixing inconsistencies.

I am very grateful for ACES workflow, which offers long-overdue standardization. It is definitely a move in the right direction.

How has production changed over the last two years?
With all the amazing post tools that are becoming more available and affordable, I am seeing negative trends of further cutting of preproduction time, and lack of creative discipline on the set. I sincerely hope this is just a temporary confusion due to recalibration of the process.

Kate Reid, DP
Kate Reid is a UK-based DP working in TV and film. Her recent work includes the TV series Hanna (Amazon) and Marcella 2 (Netflix), as well as additional photography on the final season of Game of Thrones for HBO. She is currently working on Press for the BBC.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
Large format cameras are being used increasingly on drama productions to satisfy the requirement for additional resolution from certain distribution platforms. And, of course, the choice to use large format cameras in drama brings with it another aesthetic that DPs now have as a tool: choosing whether a shallower depth of field, a different focus fall-off, added clarity in the image, etc., enhances the particular story they wish to portray on screen.

Like many other DPs, I have always enjoyed using older lenses to help make the digital image softer, more organic and less predictable. But much of this older glass, designed for 35mm-sized sensors, may not cover the increased sensor size of large format cameras, so newer lenses designed for the larger formats may become popular by necessity, alongside older large format glass that is enjoying a renaissance.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?
I have yet to shoot a show that requires HDR delivery. It hasn’t yet become the default in drama production in the UK.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? Or are latitude and frame rate more important currently?
I don’t inherently find an ultra sharp image attractive. Through older glass and diffusion filters on the lens, I am usually looking to soften and break down my image, so I personally am not all about the extra Ks. How the camera’s sensor reproduces color and handles highlights and shadows is of more interest to me, and I believe has more impact on the picture.

Of primary importance is how practical a camera is to work with — size and how comfortable the camera is to handle would supersede excessive resolution — as the first requirement of any camera has got to be whether it allows you to achieve the shots you have in mind, because a story isn’t told through its resolution.

Can you talk about how shooting streaming content for OTTs, like Netflix/Amazon, has changed production practices, and workflows, if at all?
The major change is the requirement by Netflix for true 4K resolution, determining which cameras cinematographers are allowed to shoot on. For many cinematographers the Arri Alexa was their digital camera of choice, which was excluded by this rule, and therefore we have had to look to other cameras for such productions. Learning a new camera, its sensor, how it handles highlights, produces color, etc., and ensuring the workflow through to the post facility is something that requires time and testing, which has certainly added to a DP’s workload.

From a creative perspective, however, I found shooting for OTTs (I shot two episodes of the TV series Hanna made by Working Title TV and NBC Universal for Amazon) has been more liberating than making a series for broadcast television as there is a different idea and expectation around what the audience wants to watch and enjoy in terms of storytelling. This allowed for a more creative way of filming.

Is the availability of all those camera resolutions a help or a hindrance?
Where work is seen now can vary from a mobile phone screen to a digital billboard in Times Square, so it is good for DPs to have a choice of cameras and their respective resolutions so we can use the best tool for each job. It only becomes a hindrance if you let the technology lead your creative process rather than assist it.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
Ideally, I will have had the time and opportunity to shoot tests during prep and then spend half a day with the show’s colorist to create a basic LUT I can work with on set. In practice, I have always found that I tweak this LUT during the first days of production with the DIT, and this is what serves me throughout the rest of the show.

I usually work with just one LUT that will be some version of a modified Rec. 709 (unless the look of the show drastically requires something else). It should then be straightforward in that the DIT can attach a LUT to the dailies, and this is the same LUT applied by editorial, so that exactly what you see on set is what is being viewed in the edit.
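
For readers who wonder what "attaching a LUT" actually does to the pixels, here is a minimal Python sketch of the underlying math: a 3D LUT is an n×n×n table of output RGB triples sampled over the input cube, and each pixel is looked up with trilinear interpolation. This illustrates the general technique only, not any particular DIT cart, camera or NLE implementation.

```python
import numpy as np

# Apply a 3D LUT to an array of RGB pixels via trilinear interpolation.
def apply_lut3d(rgb: np.ndarray, table: np.ndarray) -> np.ndarray:
    n = table.shape[0]
    pos = np.clip(rgb, 0.0, 1.0) * (n - 1)   # position in LUT-grid units
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    f = pos - lo                              # fractional part per channel
    out = np.zeros_like(rgb)
    for corner in range(8):                   # 8 corners of the grid cell
        pick = [(corner >> axis) & 1 for axis in range(3)]
        idx = [hi[..., a] if p else lo[..., a] for a, p in enumerate(pick)]
        w = np.prod([f[..., a] if p else 1 - f[..., a]
                     for a, p in enumerate(pick)], axis=0)
        out += w[..., None] * table[idx[0], idx[1], idx[2]]
    return out

# An identity LUT leaves pixels unchanged; a graded LUT would not.
g = np.linspace(0.0, 1.0, 17)
identity = np.stack(np.meshgrid(g, g, g, indexing="ij"), axis=-1)
print(apply_lut3d(np.array([[0.25, 0.5, 0.75]]), identity))  # ~ the input
```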

However, where this fails is that the dailies uploaded to FTP sites — for viewing by the execs, producers and other people who have access to the work — are usually very compressed and low resolution, so they bear little resemblance to how the work looked on set or looks in the edit. This is really unsatisfying, as for months key members of production are not seeing an accurate reflection of the picture. Of course, when you get into the grade this can be restored, but it’s dangerous if those viewing the dailies this way have grown accustomed to something that is a pale comparison of what was shot on set.

How has production changed over the last two years?
There is less differentiation between film and television in how productions are being made and, critically, where they are being seen by audiences, especially with online platforms now making award-winning feature films. The high production values we’ve seen on Netflix and Amazon’s biggest shows have pushed UK television dramas to up their game, which does put pressure on productions, shooting schedules and HODs, as the budgets to help achieve this aren’t there yet.

So, from a ground-level perspective, for DPs working in drama this looks like more pressure to produce work of the highest standard in less time. However, it’s also a more exciting place to be working, as the ideas about how you film something for television versus cinema no longer need apply. The perceived ideas of what an audience is interested in, or expects, are being blown out of the water by the success of new original online content, which flies in the face of more traditional storytelling. Broadcasters are noticing this and, hopefully, this will lead to more exciting and cinematic mainstream television in the future.

Blackmagic’s Bob Caniglia
In addition to its post and broadcast tools, Blackmagic offers many different cameras, including the Pocket Cinema Camera, Pocket Cinema Camera 4K, Micro Studio Camera 4K, Micro Cinema Camera, Studio Camera, Studio Camera 4K, Ursa Mini Pro, Ursa Mini 4.6K and Ursa Broadcast.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
Lens freedom is on everyone’s mind right now… having the freedom to shoot in any style. This is bringing about things like seeing projects shot on 50-year-old glass because the DP liked the feel of a commercial back in the ‘60s.

We just had a customer test out lenses that were actually used on The Godfather, The Shining and Casablanca, and it was amazing to see them mixed with a new digital cinema camera. And so many people are asking for a camera that works with anamorphic lenses. The trend is really that people expect their camera to be able to handle whatever look they want.

For large format use, I would say that both Hollywood and indie filmmakers are using it more often. Or, at least, they’re trying to get the general large format look by using anamorphic lenses to get a shallow depth of field.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?
Right now, HDR is definitely more of a concern for DPs in Hollywood, but also with indie filmmakers and streaming-service content creators. Netflix and Hulu have some amazing HDR shows right now. And there is plenty of choice when it comes to the different HDR formats and shooting and monitoring on set. All of that is happening every day, while 8K still needs the industry to catch up with the various production tools.

As for impacting shooting, HDR is about more immersive colors, and a DP needs to plan for it, since it gives viewers a whole new level of image detail. DPs have to be much more aware of every surface or lighting impact so that the viewer doesn’t get distracted. Attention to detail gets even higher in HDR, and DPs and colorists will need to keep a close eye on every shot, including when an image in a sideview mirror’s reflection is just a little too sharp and needs a tweak.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? Or are latitude and framerate more important currently?
You can never have enough Ks! Seriously. It is not just about getting a beautiful 8K TV, it is about giving the production and post pros on a project as much data as possible. More data means more room to be creative, and is great for things like keying.

Latitude and framerate are important as well, and I don’t think any one is more important than another. For the viewers, the beauty will be in large displays; you’re already seeing 8K displays in Times Square, and though you may not need 8K on your phone, 8K on the side of a building or highway will be very impactful.

I do think one of the ways 8K is changing production practices is that people are going to be much more storage conscious. Camera manufacturers will need to continue to improve workflows as the images get larger in an effort to maximize storage efficiencies.

Can you talk about how shooting streaming content, for OTTs like Netflix/Amazon, has changed production practices, and workflows, if at all?
For streaming content providers, shoots have definitely been impacted and are forcing productions to plan for shooting in a wider number of formats. Luckily, companies like Netflix have been very good about specifying up front the cameras they approve and which formats are needed.

Is the availability of all those camera resolutions a help or a hindrance?
While it can be a bit overwhelming, it does give creatives some options, especially if they have a smaller delivery size than the acquisition format. For instance, if you’re shooting in 4K but delivering in HD, you can do dynamic zooms from the 4K image that look like an optical zoom, or you can get a tight shot and wide shot from the same camera. That’s a real help on a limited budget of time and/or money.
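
A quick sketch makes the idea concrete: a UHD frame holds four HD frames' worth of pixels, so one take can yield both a "wide" (the downsampled full frame) and a "tight" (a native-resolution crop). The Python below is illustrative only; the function names and framing choices are invented.

```python
import numpy as np

# Shooting UHD (3840x2160), delivering HD (1920x1080): two shots, one take.
def wide_shot(frame_uhd: np.ndarray) -> np.ndarray:
    """Full frame, box-filtered 2x down to 1920x1080."""
    h, w, c = frame_uhd.shape
    return frame_uhd.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

def tight_shot(frame_uhd: np.ndarray, x: int, y: int) -> np.ndarray:
    """Native-resolution 1920x1080 crop: a 2x 'optical-looking' punch-in."""
    return frame_uhd[y:y + 1080, x:x + 1920]

frame = np.random.rand(2160, 3840, 3)          # stand-in UHD frame
print(wide_shot(frame).shape, tight_shot(frame, 960, 540).shape)
# (1080, 1920, 3) (1080, 1920, 3)
```

Animating the crop window from frame to frame is what produces the "dynamic zoom" Caniglia mentions.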

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
Have the production and post people plan together from the start, and create the look everyone should be working toward right up front.

Set the LUTs you want before a single shot is done and manage the workflow from camera to final post. Also, choose post software that can bring color correction on-set, near-set and off-set. That lets you collaborate remotely. Definitely choose a camera that works directly with any post software, and avoid transcoding.

How has production changed in the last two years?
Beyond the rise of HDR, one of the other big changes is that more productions are thinking live and streaming more than ever before. CNN’s Anderson Cooper now does a daily Facebook Live show. AMC has the live Talking Dead-type formats for many of their shows. That trend is going to keep happening, so cinematographers and camera people need to be thinking about being able to jump from scripted to live shooting.

Red Digital Cinema’s Graeme Nattress
Red Digital Cinema manufactures professional digital cameras and accessories. Red’s DSMC2 camera offers three sensor options — Gemini 5K S35, Helium 8K S35 and Monstro 8K VV.

Can you talk about some camera trends you’ve been seeing?
Industry camera trends continue to push image quality in all directions. Sensors are getting bigger, with higher resolutions and more dynamic range. Filmmakers continue to innovate, making new and amazing images all the time, which drives our fascination for advancing technology in service to the creative.

How is HDR affecting the way things are being shot these days?
One of the benefits of a primary workflow based on RAW recording is that HDR is not an added extra, but a core part of the system. Filmmakers do consider HDR important, but there’s some concern that HDR doesn’t always look appealing, and that it’s not always an image quality improvement. Cinematography has always been about light and shade and how they are controlled to shape the image’s emotional or storytelling intent. HDR can be a very important tool in that it greatly expands the display canvas to work on, but a larger canvas doesn’t mean a better picture. The increased display contrast of HDR can make details more visible, and it can also make motion judder more apparent. Thus, more isn’t always better; it’s about how you use what you have.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? What’s more important, resolution or dynamic range?
Without resolution, we don’t have an image. Resolution is always going to be an important image parameter. What we must keep in mind is that camera resolution is based on input resolution to the system, and that can — and often will — be different to the output resolution on the display. Traditionally, in video the input and output resolutions were one and the same, but when film was used — which had a much higher resolution than a TV could display — we were taking a high-resolution input and downsampling it to the display, the TV screen.

As with any sampled system, in a digital cinema camera there are some properties we seek to protect and others to diminish. We want a high level of detail, but we don’t want sharpening artifacts and we don’t want aliasing. The only way to achieve that is through a high-resolution sensor, properly filtered (optical low-pass), that can see a large amount of real, un-enhanced detail. So yes, 8K can give you lots of fine detail should you want it, but the imaging benefits extend beyond downsampling to 4K or 2K: an 8K capture makes for an incredibly robust image in which noise is reduced, and what noise remains takes on more of a texture, which is much more aesthetically pleasing.
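
The noise reduction Nattress describes is simple statistics: averaging N uncorrelated samples divides the noise standard deviation by the square root of N. The numpy sketch below is a back-of-envelope check of that claim on a synthetic noise patch, not Red's actual downsampling pipeline.

```python
import numpy as np

# One 2x2 block of an "8K" image becomes one "4K" pixel: four samples
# averaged per output pixel, so sigma drops by sqrt(4) = 2.
rng = np.random.default_rng(1)
noise_8k_patch = rng.normal(0.0, 1.0, size=(512, 512))   # pure sensor noise
noise_4k_patch = noise_8k_patch.reshape(256, 2, 256, 2).mean(axis=(1, 3))
print(noise_8k_patch.std())   # ~1.00
print(noise_4k_patch.std())   # ~0.50
```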

One challenge of 8K is an increase in the amount of sensor data to be recorded, but that can be addressed through quality compression systems like RedCode.

Addressing dynamic range is very important because dynamic range and resolution work together to produce the image. It’s easy to think that high resolutions have a negative impact upon dynamic range, but improved pixel design means you can have dynamic range and resolution.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
Color management is vitally important and so much more than just keeping color control from on-set through to delivery. Now with the move to HDR and an increasing amount of mobile viewing, we have a wide variety of displays, all with their own characteristics and color gamuts. Color management allows content creators to display their work at maximum quality without compromise. Red cameras help in multiple ways. On camera, one can monitor in both SDR and HDR simultaneously with the new IPP2 image processing pipeline’s output independence, which also allows you to color via CDL and creative 3D LUT in such a way as to have those decisions represented correctly on different monitor types.

In post and grading, the benefits of output independence continue, but now it’s critical that scene colors, which can so easily go out of gamut, are dealt with tastefully. Through the metadata support in the RedCode format, all the creative decisions taken on set follow through to dailies and post, but never get in the way of producing the correct image output, be it for VFX, editorial or grading.

Panavision’s Michael Cioni 
Panavision designs and manufactures high-precision camera systems, including both film and digital cameras, as well as lenses and accessories for the motion picture and television industries.

Can you talk about some camera trends you’ve been seeing?
With the evolution of digital capture, one of the most interesting things I’ve noticed in the market are new trends emerging from the optics side of cinematography. At a glance, it can appear as if there is a desire for older or vintage lenses based on the increasing resolution of large format digital cameras. While resolution is certainly a factor, I’ve noticed the larger contributor to vintage glass is driven by the quality of sensors, not the resolution itself. As sensors increase in resolution, they simultaneously show improvements in clarity, low-light capability, color science and signal-to-noise ratio.

The compounding effect of all these elements is improving images far beyond what was possible with analog film technology, which explains why the same lens behaves differently on film, S35 digital capture and large format digital capture. As these looks continue to become popular, Panavision is responding through our investments both in restoring classic lenses and in designing new lenses with classic characteristics and textures that are optimized for large format photography on super sensors.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look?
Creating images is not always about what component is better, but rather how they elevate images by working in concert. HDR images are a tool that increases creative control alongside high resolution and 16-bit color. These components work really well together because a compelling image can make use of more dynamic range, more color and more clarity. Its importance is only amplified by the amalgamation of high-fidelity characteristics working together to increase overall image flexibility.

Today, the studios are still settling into an HDR world because only a few groups, led by OTT, are able to distribute in HDR to wide audiences. On-set tools capable of HDR, 4K and 16-bit color are still in their infancy and currently cost-prohibitive. 4K/HDR on the set is going to become a standard practice by 2021. 4K wireless transmitters are the first step — they are going to start coming online in 2019. Smaller OLED displays capable of 750 nits+ will follow in 2020, creating an excellent way to monitor higher quality images right on set. In 2021, editorial will start to explore HDR and 4K during the offline process. By 2024, all productions will be HDR from set to editorial to post to mobile devices. Early adopters that work out the details today will find themselves ahead of the competition and having more control as these trends evolve. I recommend cinematographers embrace the fundamentals of HDR, because understanding the tools and trends will help prevent images from appearing artificial or overdone.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? What’s more important, resolution or dynamic range?
One of the reasons we partnered with Red is because the Monstro 8K VV sensor makes no sacrifice in dynamic range while still maintaining ultra high smoothness at 16 bits. The beauty of technology like this is that we can finally start to have the best from all worlds — dynamic range, resolution, bit depth, magnification, speed and workflow — without having to make quality sacrifices. When cinematographers have all these elements together, they can create images previously never seen before, and 8K is as much part of that story as any other element.

One important way to view 8K is not solely as a thermometer for high-resolution sharpness. A sensor with 35 million pixels is necessary in order to increase the image size, similar to trends in professional photography. 8K large format creates a larger, more magnified image with a wider field of view and less distortion, like the difference in images captured by 70mm film. The biggest positive I’ve noticed is that DXL2’s 8K large-format Red Monstro sensor is so good in terms of quality that it isn’t impacting images themselves. Lower quality sensors can add a “fingerprint” to the image, which can distort the original intention or texture of a particular lens.

With sensors like Monstro capable of such high precision, the lenses behave exactly as the lens maker intended. The same Panavision lenses on a lower grade sensor, or even on 35mm film, exhibit characteristics that we weren’t able to see before. This is literally breathing new life into lenses that didn’t perform this way until Monstro and large format.

Is the availability of so many camera formats a help or a hindrance?
You don’t have to look far to identify individuals who are easily fatigued by having too many choices. Some of these individuals cope with choices by finding ways to regulate them, and they feel fewer choices means more stability and perhaps more control (creative and economic). As an entrepreneur, I find the opposite to be true: I believe regulating our world, especially with regards to the arts and sciences, is a recipe for protecting the status quo. I fully admit there are situations in which people are fatigued by too many complex choices.

I find that the failure is not of the technology itself; rather, it’s the fault of the manufacturers who have not provided the options in easy-to-consume ways. Having options is exactly what creatives need in order to explore something new and improved. But it’s also up to manufacturers to deliver the message in ways everyone can understand. We’re still learning how to do that, and with each generation the process changes a bit. And while I am not always certain which are the best ways to help people understand all the options, I am certain that the pursuit of new art will motivate us to go out of our comfort zones and try something previously thought not possible.

Have you encountered any examples of productions that have shot streaming content (i.e. for Netflix/Amazon) and had to change production practices and workflows for this format/deliverable?
Netflix and Amazon are exceptional examples of calculated risk takers. While most headlines discuss their investment in the quantity of content, I find the most interesting investment they make is in relationships. Netflix and Amazon are heavily invested in standards groups, committees, outreach, panels and constant communication. The model of the past and present (incumbent studios) is content creators with technology divisions. The model of the future (Netflix, Amazon, Hulu, Apple, Google and YouTube) is technology companies with the ability to create content. And technology companies approach problems from a completely different angle: they don’t just embrace the technology, they help invent it. In this new technological age, those who lead and those who follow will likely be determined by the tools and techniques used to deliver. What I call “The Netflix Effect” is the impact Netflix has on traditional groups, all of which have had to strategically pivot in response.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
The DXL2 has an advanced color workflow. In collaboration with LiveGrade by Pomfort, the DXL2 can capture looks wirelessly from DITs in the form of CDLs and LUTs, which are not only saved into the metadata of the camera, but also baked into in-camera proxy files in the form of Apple ProRes or Avid DNx. These files now contain visual references of the exact looks viewed on monitors and can be delivered directly to post houses, or even editors. This improves creative control because it eliminates the guesswork in the application of external color decisions and streamlines it back to the camera, where the core database is kept with all the other camera information. This metadata can be traced throughout the post pipeline, which also streamlines the process for all entities that come in contact with camera footage.
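
For context, the ASC CDL that these systems pass around is a small, standardized piece of math: per-channel slope, offset and power, followed by a global saturation. Here is a minimal Python sketch of that standard transform; the grade values are invented for illustration, and this is not Panavision's or Pomfort's implementation.

```python
import numpy as np

# Standard ASC CDL: SOP (slope/offset/power) per channel, then a global
# saturation computed against Rec. 709 luma weights.
def asc_cdl(rgb: np.ndarray, slope, offset, power, sat: float) -> np.ndarray:
    out = np.clip(rgb * slope + offset, 0.0, None) ** power  # SOP per channel
    luma = out @ np.array([0.2126, 0.7152, 0.0722])          # Rec. 709 weights
    return luma[..., None] + sat * (out - luma[..., None])   # saturation

mid_gray = np.array([[0.18, 0.18, 0.18]])
print(asc_cdl(mid_gray,
              slope=np.array([1.10, 1.00, 0.95]),    # invented grade values
              offset=np.array([0.01, 0.00, -0.01]),
              power=np.array([1.00, 1.00, 1.05]),
              sat=0.9))
```

Because the whole grade is just these ten numbers, it travels easily in camera metadata and survives the trip from set to dailies to the DI.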

How has production changed over the last two years?
Sheesh. A lot!

ARRI‘s Stephan Ukas-Bradley
The ARRI Group manufactures and distributes motion picture cameras, digital intermediate systems and lighting equipment. Their camera offerings include the Alexa LF, Alexa Mini, Alexa 65, Alexa SXT W and the Amira.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
Large format opens some new creative possibilities, using a shallow depth of field to guide the audience’s view and provide a wonderful bokeh. It also conveys a perspective truer to the human eye, resulting in a seemingly increased dimensional depth. The additional resolution, combined with our specially designed large format Signature Primes, results in beautiful and emotional images.

Old and vintage lenses can enhance a story. For instance, Gabriel Beristain, ASC, used Bausch & Lomb Super Baltars on the Starz show Magic City, and Bradford Young, ASC, used detuned DNA lenses in conjunction with the Alexa 65 on Solo: A Star Wars Story. Certain characteristics like flares, reflections, distortions and focus fall-off are very difficult to recreate organically in post, so vintage lenses provide an easy way to create a unique look for a specific story and a way for the director of photography to maintain creative control.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?
Currently, things are not done much differently on set when shooting HDR versus SDR. While it would be very helpful to monitor in both modes on set, HDR reference monitors are still very expensive, and very few productions have the luxury to do that. One has to be aware of certain challenges when shooting for an HDR finish: high-contrast edges can result in a more pronounced stutter/strobing effect when panning the camera, and windows that are blown out in SDR might retain detail in the HDR pass, so all of a sudden a ladder or a grip stand is visible.

In my opinion, HDR is more important than higher resolution. HDR is resolution-independent in regard to viewing devices like phones and tablets, gives the viewer a perceived increase in sharpness and is more immersive than increased resolution. Also, let’s not forget that we are working in the motion picture industry and that we are either capturing moving objects or moving the camera, and with that introducing motion blur. Higher resolution only makes sense to me in combination with higher frame rates, and that in return will start a discussion about aesthetics, as it may look hyper-real compared to traditional 24fps capture. Resolution is one aspect of overall image quality, but in my opinion extended dynamic range, signal/noise performance, sensitivity, color separation and color reproduction are more important.

Can you talk about how shooting streaming content for OTTs, like Netflix/Amazon, has changed production practices and workflows, if at all?
Shooting streaming content has really not changed production practices or workflows. At ARRI, we offer very flexible and efficient workflows, and we are very transparent in documenting our ARRIRAW file formats in SMPTE RDD 30 (format) and RDD 31 (processing), working with many industry partners to provide native file support in their products.

Is the availability of all those camera resolutions a help or a hindrance?
I would look at all those different camera types and resolutions as different film stocks and recommend that creatives shoot their own tests and select the camera system based on what suits their project best.

We offer the ARRI Look Library for the Amira, Alexa Mini and Alexa SXT (SUP 3.0), a collection of 87 looks, each available in three different intensities and provided in Rec. 709 color space. Those looks can either be recorded or used only for monitoring. These looks travel with the picture, embedded in the metadata of the ARRIRAW file, QuickTime Atom or HD-SDI stream in the form of the actual LUT and ASC CDL. One can also create a look dynamically on set, feeding the look back to the camera and having the ASC CDL values embedded in the same way.

More commonly, one would record in either ARRIRAW or ProRes LogC while applying a standard Rec. 709 look for monitoring. The “C” in LogC stands for Cineon, a film-like response very much like that of a scanned film image. Colorists and post pros are very familiar with film, and color grading LogC images is easy and quick.
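
For the curious, the LogC curve itself is a simple piecewise function of scene-linear light. The sketch below uses ARRI's published constants for the classic LogC (v3) curve at EI 800; the constants vary with exposure index, so treat this as an illustration and consult ARRI's LogC white paper for production use.

```python
import math

# LogC (v3) encode at EI 800: logarithmic above a small linear toe.
CUT, A, B = 0.010591, 5.555556, 0.052272
C, D, E, F = 0.247190, 0.385537, 5.367655, 0.092809

def lin_to_logc(x: float) -> float:
    """Scene-linear reflectance (18% gray = 0.18) to normalized LogC signal."""
    return C * math.log10(A * x + B) + D if x > CUT else E * x + F

# Mid-gray lands around 0.39 on the curve, a Cineon-like placement that
# is part of why graded LogC feels familiar to colorists raised on film.
print(round(lin_to_logc(0.18), 3))   # ~0.391
```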

How has production changed over the last two years?
I don’t have the feeling that production has changed a lot in the past two years, but with the growing demand from OTTs and increased production volume, it is even more important to have a reliable and proven system with flexible workflow options.

Main Image: DP Kate Reid.