

Hobo Films’ Howard Bowler on new series The System

By Randi Altman

Howard Bowler, the founder of New York City-based audio post house Hobo
Audio, has launched Hobo Films, a long-form original content development company.

Howard Bowler’s many faces

Bowler is also the founder and president of Green Point Creative, a marijuana-advocacy branding agency focused on the war on drugs and changing drug laws. And it is this topic that inspired Hobo Films’ first project, a dramatic series called The System. It features actress Lolita Foster from Netflix’s Orange Is The New Black.

Bowler has his hand in many things these days, and with those paths colliding, what better time to reach out to find out more?

After years working in audio post, what led you to want to start an original long-form production arm?
I’ve always wanted to do original scripted content and have been collecting story ideas for years. As our audio post business has grown, it’s provided us a platform to develop this related, exciting and creative business.

You are president/founder of Green Point Creative. Can you tell us more about that initiative?
Green Point Creative is an advocacy platform that was born out of personal experience. After an arrest followed by release (not me), I researched the history of marijuana prohibition. What I found was shocking. Hobo VP Chris Stangroom and I started to produce PSAs through Green Point to share what we had learned. We brought in Jon Mackey to aid in this mission, and he’s since moved up the ranks of Hobo into production management. The deeper we explored this topic, the more we realized there was a much larger story to tell and one that couldn’t be told through PSAs alone.

You wrote the script for the show The System. Can you tell our readers what the show is about?
The show’s storyline plots the experiences of a white father raising his biracial son, set against the backdrop of the war on drugs. The tone of the series is a cross between Marvel Comics and Schindler’s List. It follows what happens to these kids in the face of a nefarious system that has them in its grip, and how they get out and fight back.

What about the shoot? How involved were you on set? What cameras were used? Who was your DP?
I was very involved the whole time working with the director Michael Cruz. We had to change lines of the script on set if we felt they weren’t working, so everyone had to be flexible. Our DP was David Brick, an incredible talent, driven and dedicated. He shot on the Red camera and the footage is stunning.

Can you talk about working with the director?
I met Michael Cruz when we worked together at Grey, a global advertising agency headquartered in NYC. I told him back then that he was born to direct original content. At the time he didn’t believe me, but he does now.

L-R: DP David Brick and director Mike Cruz on set

Mike’s directing style is subtle but powerful; he knows how to frame a shot and get the performance. He also knows how to build a formidable crew. You’ve got to have a dedicated team in place to pull these things off.

What about the edit and the post? Where was that done? What gear was used?
Hobo is a natural fit for this type of creative project and is handling all the audio post as well as the music score that is being composed by Hobo staffer and musician Oscar Convers.

Mike Cruz tapped the resources of his company, Drum Agency, to handle the first phase of editing, and his team pulled together the rough cuts. For the final edit, we connected with Oliver Parker. Ollie was just coming off two seasons of London Kills, a police thriller that’s been released to great reviews. Oliver’s extraordinary editing elevated the story in ways I hadn’t predicted. All editing was done on Avid Media Composer.

The color grade was done by Juan Salvo at TheColourSpace using Blackmagic Resolve. [Editor’s Note: We reached out to Salvo to find out more. “We got the original 8K Red files from editorial and conformed that on our end. The look was really all about realism. There’s a little bit of stylized lighting in some scenes, and some mixed-temperature lights as well. Mostly, the look was about finding a balance between some of the more stylistic elements and the very naturalist, almost cinéma vérité tone of the series.

“I think ultimately we tried to make it true-to-life with a little bit of oomph. A lot of it was about respecting and leaning into the lighting that DP Dave Brick developed on the shoot. So during the dialogue scenes, we tend to have more diffuse light that feels really naturalist and just lets the performances take center stage, and in some of the more visual scenes we have some great set piece lighting — police lights and flashlights — that really drive the style of those shots.”]

Where can people see The System?
Click here to view the first five minutes of the pilot and learn more about the series.

Any other shows in the works?
Yes, we have several properties in development and to help move these projects forward, we’ve brought on Tiffany Jackman to lead these efforts. She’s a gifted producer who spent 10 years honing her craft at various agencies, as well as working on various films. With her aboard, we can now create an ecosystem that connects all the stories.

All Is True director Kenneth Branagh

By Iain Blair

Five-time Oscar-nominee Ken Branagh might be the biggest Shakespeare fan in the business. In fact, it’s probably fair to say that the actor/director/producer/screenwriter largely owes his fame and fortune to the Bard. For the past 30 years he’s directed (and often starred in) dozens of theatrical productions, as well as feature film adaptations of Shakespeare’s works, starting with 1989’s Henry V. That film won him two Oscar nominations: Best Actor and Best Director. He followed that with Much Ado About Nothing, Othello, Hamlet (which won him a Best Adapted Screenplay Oscar nod), Love’s Labour’s Lost and As You Like It.

Ken Branagh and Iain Blair

So it was probably only a matter of time before the Irish star jumped at the chance to play Shakespeare himself in the new film All Is True, a fictionalized look at the final years of the playwright. Set in 1613, Shakespeare is acknowledged as the greatest writer of the age, but disaster strikes when his renowned Globe Theatre burns to the ground. Devastated, Shakespeare returns to Stratford, where he must face a troubled past and a neglected family — wife Anne (Judi Dench) and two daughters, Susanna (Lydia Wilson) and Judith (Kathryn Wilder). The large ensemble cast also includes Ian McKellen as the Earl of Southampton.

I sat down with Branagh — whose credits include directing such non-Shakespeare movies as Thor, Cinderella and Murder on the Orient Express and acting in Dunkirk and Harry Potter and the Chamber of Secrets — to talk about making the film and his workflow.

You’ve played many of Shakespeare’s characters in film or on stage. Was it a dream come true to finally play the man himself, or was it intimidating?
It was a dream come true, as I feel like he’s been a guide and mentor since I discovered him at school. And, rather like a dog, he’s given me unconditional love ever since. So I was happy to return some. It’s easy to forget that he was just a guy. He was amazing and a genius, but first and foremost he was a human being.

What kind of film did you hope to make?
A chamber piece, a character piece that took him out of his normal environment. I didn’t want it to be the predictable romp inside a theater, full of backstage bitching and all that sort of theatricality. I wanted to take him away from that and put him back in the place he was from, and I also wanted to load the front part of the movie with silence instead of tons of dialogue.

How close do you feel it gets to the reality of his final years?
I think it’s very truthful about Stratford. It was a very litigious society, and some of the scenes — like the one where John Lane stands up in church and makes very public accusations — all happened. His son Hamnet’s death was unexplained, and Shakespeare did seem to be very insecure in some areas. He wanted money and success and he lived in a very volatile world. If he was supposed to be this returning hero coming back to the big house and a warm welcome from his family, whom he hadn’t seen much of the past two decades, it didn’t quite happen that way. No, he was this absentee dad and husband, and the town had an ambivalent relationship with him; it wasn’t a peaceful retirement at all.

The film is visually gorgeous, and all the candlelit scenes reminded me of Barry Lyndon.
I’m so glad you said that, as DP Zac Nicholson and I were partly inspired by that film and that look, and we used only candlelight and no additional lights for those scenes. Painters like Vermeer and Rembrandt were our inspiration for the day and night scenes, respectively.

Clint Eastwood told me, “Don’t ever direct and star in a movie unless you’re a sucker for punishment — it’s just too hard.” So how hard was it?
(Laughs) He’s right. It is very hard, and a lot of work, but it’s also a big privilege. But I had a lot of great help — the crew and people like Judi and Ian. They had great suggestions and you listen to every tidbit they have to offer. I don’t know how Clint does it, but I do a lot of listening and stealing. The directing and acting are so interlinked to me, and I love directing as I get to watch Ian and Judi work, and they’re such hard workers. Judi literally gets to the set before anyone else, and she’s pacing up and down and getting ready to defend Anne Hathaway. She has this huge empathy for her characters which you feel so much, and here she was giving voice to a woman who could not read or write.

Where did you post?
We were based at Longcross Studios, where we did Murder on the Orient Express and the upcoming Artemis Fowl. We did most of it there, and then we ended up at The Post Republic, which has facilities in London and Berlin, to do the final finishing. Then we did all the final mixing at Twickenham with the great re-recording mixer Andy Nelson and his team. It was my second picture with Andy as the re-recording mixer. I am completely present throughout and completely involved in the final mix.

Do you like the post process?
I love it. It’s the place where I understood, right from my first film, that it could make — in terms of performance — a good one bad, a good one great, a bad one much better. The power of change in post is just amazing to me, and realizing that anything is possible if you have the imagination. So the way you juxtapose the images you’ve collected — and the way a scene from the third act might actually work better in the first act — is so huge in post. That fluidity was a revelation to me, and you can have these tremendous eureka moments in post that can be beautiful and so inspiring.

Can you talk about working with editor Una Ni Dhongaile, who cut The Crown and won a BAFTA for Three Girls?
She’s terrific. She wasn’t on the set but we talked a lot during the shoot. I like her because she really has an opinion. She’s definitely not a “yes” person, but she’s also very sensitive. She also gets very involved with the characters and protects you as a director. She won’t let you cut too soon or too deep, and she encourages you to take a moment to think about stuff. She’s one of those editors who has this special kind of intuition about what the film needs, in addition to all her technical skills and intellectual understanding of what’s going on.

What were the big editing challenges?
We did a lot of very long takes and used only the very best of them, and despite the very painterly style we didn’t want the film to feel too static. We didn’t want to cut falsely or artificially just to affect the pace, but to let it flow naturally so every minute was earned. We also didn’t want to feel afraid of holding a particular shot for a long time. We definitely needed pauses and rests, and Shakespeare is musical in his poetry and the way he juxtaposes fast and slow moments. So all those decisions were critical and needed mulling as well as executing.

Talk about the importance of sound and music, as it’s a very quiet film.
It’s absolutely critical in a world like this, where light and sound play huge roles and are so utterly different from our modern understanding of them. The aural space you can offer an audience was a big chance to adventure back in time, when the world was far more sparsely populated, especially in a little place like Stratford, where silence played a big role as well. You’re offering a hint of the outside world, and the aural landscape is really the bedrock for all the introspection and thoughtfulness this movie deals with.

Patrick Doyle’s music has this gossamer approach — that was the word we used. It was like a breath, so that the whole sound experience invited the audience into the meditative world of Shakespeare. We wanted them to feel the seasons pass, the wind in the trees, and how much more was going on than just the man thinking about his past. It was the experience of returning home and being with this family again, so you’d hear a creak of a chair and it would interrupt his thoughts. So we worked hard on every little detail like that.

Where did you do the grading and coloring?
Post Republic in their North London facility, and again, I’m involved every step of the way.

Did making this film change your views about Shakespeare the man?
Yes, and it was an evolving thing. I’ve always been drawn to his flawed humanity, so it seemed real to be placing this man in normal situations and have him be right out of his comfort zone at the start of the film. So you have this acclaimed, feted and busy playwright, actor, producer and stage manager suddenly back on the dark side of the moon, which Stratford was back then. It was a small town, a three-day trip from London, and it must have been a shock. It was candlelight and recrimination. But I think he was a man without pomp. His colleagues most often described him as modest and gentle, so I felt a vulnerability that surprised me. I think that’s authentic to the man.

What’s next for you?
Disney’s Artemis Fowl, the fantasy-adventure based on the books, which will be released on May 29, and then I start directing Death on the Nile for Fox, which starts shooting late summer.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.


Showrunner: Eric Newman of Netflix’s Narcos: Mexico

By Iain Blair

Much like the drugs that form the dark heart of Narcos: Mexico, the hit Netflix crime drama is full of danger, chills and thrills — and is highly addictive. It explores the origins of the modern, ultra-violent drug war by going back to its roots, beginning at a time when the Mexican trafficking world was a loose and disorganized confederation of independent growers and dealers. But that all changed with the rise of the Guadalajara Cartel in the 1980s as Félix Gallardo (Diego Luna) — the real-life former Sinaloan police-officer-turned-drug lord — takes the helm, unifying traffickers in order to build an empire.

L-R: Director José Padilha and producer Chris Brancato bookend Eric Newman on the set of Narcos, Season 1.

The show also follows DEA agent Kiki Camarena (Michael Peña), who moves his wife and young son from California to Guadalajara to take on a new post. He quickly learns that his assignment will be more challenging than he ever could have imagined. As Kiki garners intelligence on Félix and becomes more entangled in his mission, a tragic chain of events unfolds, affecting the drug trade and the war against it for years to come.

Narcos showrunner, writer and executive producer Eric Newman is a film and television veteran whose resume includes the Academy Award-nominated Children of Men, as well as Dawn of the Dead, The Last Exorcism and Bright. After over 20 years in the movie industry, Newman transitioned into television as an executive producer on Hemlock Grove for Netflix. It was his curiosity about the international drug trade that led him to develop and executive produce his passion project Narcos, and Newman assumed showrunning responsibilities at the end of its first season. Narcos: Mexico initially started out as the fourth season of Narcos before Netflix decided to make it a stand-alone series.

I recently spoke with Newman about making the show, his involvement in post and another war that’s grabbed a lot of headlines — the one between streaming platforms and traditional cinema.

Do you like being a showrunner?
Yeah! There are aspects of it I really love. I began toward the end of the first season and there was this brief period where I tried not to be the showrunner, even though it was my show. I wasn’t really a writer — I wasn’t in the WGA — so I had a lot of collaborators, but I still felt alone in the driver’s seat. It’s a huge amount of work, from the writing to the shoot and then post, and it never really ends. It’s exhausting but incredibly rewarding.

What are the big challenges of running this show?
If I’d known more about TV at the time, I might have been far more frightened than I was (laughs). The big one is dealing with all the people and personalities involved. We have anywhere between 200 and 400 people working on the show at any given time, so it’s tricky. But I love working with actors, I think I’m a good listener, and any major problems are usually human-oriented. And then there’s all the logistics and moving parts. We began the series shooting in Colombia and then moved the whole thing to Mexico, so that was a big challenge. But the cast and crew are so great, we’re like a big family at this point, and it runs pretty smoothly now.

How far along are you with the second season of Narcos: Mexico?
We’re well into it, and while it’s called Season Two, the reality for us is that it’s the fifth season of a long, hard slog.

This show obviously deals with a lot of locations. How difficult is it when you shoot in Mexico?
It can be hard and grueling. We’re shooting entirely in Mexico — nothing in the States. We shot in Colombia for three years and we went to Panama once, and now we’re all over Mexico — from Mexico City to Jalisco, Puerto Vallarta, Guadalajara, Durango and so on.

It’s also very dangerous subject matter, and one of your location scouts was murdered. Do you worry about your safety?
That was a terrible incident, and I’m not sure whoever shot him even knew he was a location scout on our show. The reality is that a number of incredibly brave journalists, who had nowhere near the protection we have, had already shared these stories — and many were killed for it. So in many ways we’re late to the party.

Of course, you have to be careful anywhere you go, but that’s true of every city. You can find trouble in LA or New York if you are in the wrong place. I don’t worry about the traffickers we depict, as they’re mainly all dead now or in jail, and they seem OK with the way they’re depicted… that it’s pretty truthful. I worry a little bit more about the police and politicians.

Where do you post and do you like the post process?
I absolutely love post, and I think it’s a deeply underrated and under-appreciated aspect of the show. We’ve pulled off far more miracles in post than in any of the writing and shooting. We do all the post at Lantana in LA with the same great team that we’ve had from the start, including post producer Tim King and associate post producer Tanner King.

When we began the series in Colombia, we were told that Netflix didn’t feel comfortable having the footage down there for editing because of piracy issues, and that worked for me. I like coming back to edit and then going back down to Mexico to shoot. We shoot two episodes at a time and cut two at a time. I’m in the middle of doing fixes on Episode 2 and we’re about to lock Episode 3.

Talk about editing. You have several editors, I assume because of the time factor. How does that work?
We have four full-time editors — Iain Erskine, Garret Donnelly, Monty DeGraff and Jon Otazua — who each take separate episodes, plus we have one editor dedicated to the archival package, which is a big part of the show. We’ve also promoted two assistant editors, which I’m very proud of. That’s a nice part of being on a show that’s run for five years; you can watch people grow and move them up the ladder.

You have a huge cast and a lot of moving pieces in each episode. What are the big editing challenges?
We have a fair amount of coverage to sort through, and it’s always about telling the story and the pacing — finding the right rhythm for each scene.

This show has a great score and great sound design. Where do you mix, and can you talk about the importance of sound and music?
We do all the mixing at Technicolor, and we have a great team that includes supervising sound editor Randle Akerson and supervising ADR editor Thomas Whiting. (The team also includes sound effects editors Dino R. DiMuro and Troy Prehmus, dialogue editor David Padilla, music editor Chris Tergesen, re-recording mixers Pete Elia and Kevin Roache and ADR mixer Judah Getz.)

It’s all so crucial. All you have to do is look at a rough edit without any sound or music and it’s just so depressing. I come from a family of composers, so I really appreciate this part of post. Composer Gustavo Santaolalla has done a fantastic job, and the music’s changed a bit since we moved to Mexico. I’m fairly involved with all of it. I get a final playback and maybe I’ll have a few notes, but generally the team has got it right.

In 2017, you formed Screen Arcade with producer Bryan Unkeless, a production company based at Netflix with deals for features and television. I heard you have a new movie you’re producing for Netflix, PWR with Jamie Foxx and Joseph Gordon-Levitt?
It’s all shot, and we’re just headed into the Director’s Cut. We’re posting in New York and have our editorial offices there. Netflix is so great to partner with. They care as much about the quality of image and sound as any studio I’ve ever worked with — and I’ve worked with everyone. In terms of the whole process and deliverables, there’s no difference.

It’s interesting because there’s been a lot of pushback against Netflix and other streaming platforms from the studios, purists and directors like Steven Spielberg. Where do you see the war for cinema’s future going?
I think it’ll be driven entirely by audience viewing habits, as it should be. Some of my all-time favorite movies — The Bridge on the River Kwai, Taxi Driver, Sunset Boulevard, Barry Lyndon — I didn’t first see in a movie house.

Cinema exhibition is a business. They want Black Panther and Star Wars, so it’s a commerce argument, not a creative one. With all due respect to Spielberg, no one can dictate viewing habits, and maybe for now they can deny Netflix and the streaming platforms Academy Awards, but not forever.




DP Chat: The Man in the High Castle’s Gonzalo Amat

By Randi Altman

Amazon’s The Man in the High Castle is based on the 1962 Philip K. Dick novel, which asks the question: “What would it look like if the Germans and Japanese won World War II?” It takes a look at the Nazi and Japanese occupation of portions of the United States and the world. But it’s a Philip K. Dick story, so you know there is more to it than that… like an alternate reality.

The series will premiere its fourth and final season this fall on the streaming service. We recently reached out to cinematographer Gonzalo Amat, who was kind enough to talk to us about workflow and more.

How did you become interested in cinematography?
Since I was very young, I had a strong interest in photography and have been shooting stills for as long as I can remember. Then, when I was maybe 10 or 12 years old, I discovered that movies also had a photographic aspect. I didn’t think about doing it until I was already in college studying communications, and that is when I decided to make it my career.

What inspires you artistically? And how do you simultaneously stay on top of advancing technology?
Artistically, I get inspiration from a lot of sources, such as photography, film, literature, painting or any visual medium. I try to curate what I consume, though. I believe that everything we feed our brain somehow shows up in the work we do, so I am very careful about consuming films, books and photography that feed the story that I will be working on. I think any creation is inspiration. It can be all the way from a film masterpiece to a picture drawn by a kid, music, performance art, historical photographs or testimonies, too.

About staying on top: I read trade magazines and stay educated through seminars and courses, but at some point, it’s also about using those tools. So I try to test the tools instead of reading about them. Almost any rental place or equipment company will let you try newer tools. If I’m shooting, we try to schedule a test for a particular piece of equipment we want to use, during a light day.

What new technology has changed the way you work?
The main new technology would be the migration of most projects to digital. That has changed the way we work on set and collaborate with the directors, since everyone can now see, on monitors, something closely resembling the final look of the project.

A lot of people think this is a bad thing, but for me, it actually allows clearer communication about the concrete aspects of a sometimes very personal vision. Terms like dark, bright or colorful are very subjective, so having a reference is a good point from which to continue the conversation.

Also, digital technology has helped us use more available light on interiors and less light on exterior nights. Still, it hasn’t reached the latitude of film, where you could just let the windows burn. It’s trickier for exterior day shots, where I think you end up needing more control. I would also say that the evolution of visual effects as a more invisible tool has helped us achieve a lot more from a storytelling perspective and has affected the way we shoot scenes in general.

What are some of your best practices, or rules you try to follow on each job?
Each project is different, so I try to learn how that particular project will be. But there are some time-tested rules that I try to implement. The main line is to always go for the story; every answer is always in the script. Another main rule is communication. So being open about questions, even if they seem silly. It’s always good to ask.

Another rule is listening to ideas. People that end up being part of my team are very experienced and sometimes have solutions to problems that come up. If you are open to ideas, more ideas will come, and people will do their jobs with more intention and commitment. Gratitude, respect, collaboration, communication and being conscious about safety is important and part of my process.

Gonzalo Amat on set

Explain your ideal collaboration with the director when setting the look of a project.
Every director is different, so I look at each new project as an opportunity to learn. As a DP, you have to learn and adapt, since through your career you will be asked for different levels of involvement. Because of my interest in storytelling, I personally prefer a bit more of a hands-off approach from directors: talking more about story and concepts, collaborating on setting up the shots to cover a scene, and the same with lighting, talking moods and concepts that get polished once we are on set. Some directors will be very specific, and that is a challenge because you have to deliver what is inside their heads and hopefully make it better. I still enjoy this challenge, because it also makes you work for someone else’s vision.

Ideally, developing the look of a project comes from reading the script together and watching movies and references together. This is when you can say “dark like this” or “moody like this” because visual concepts are very subjective, and so is color. From then on, it’s all about breaking up the script and the visual tone and arc of the story, and subsequently all the equipment and tools for executing the ideas. Lots of meetings as well as walking the locations with just the director and DP are very useful.

How would you describe the overarching look of the show?
Basically, the main visual concept of this project is rooted in film noir, and our main references were The Conformist and Blade Runner. As we went along, we added some more character-based visual ideas inspired by projects like In the Mood for Love and The Insider for framing.

The main idea is to visually portray the worlds of the characters through framing and lighting. Sometimes, we play it the way the script tells us; sometimes we counterpoint visually what it says, so we can make the audience respond in an emotional way. I see cinematography as the visual music that makes people respond emotionally to different moods. Sometimes it’s more subtle and sometimes more obvious. We prefer to not be very intrusive, even though it’s not a “realist” project.

How early did you get involved in the production?
I start four or five weeks before the season. Even if I’m not doing the first episode, I will still be there to prepare new sets and do some tests for new equipment or characters. Preparation is key in a project like this, because once we start with the production the time is very limited.

Did you start out on the pilot? Did the look change from season to season at all?
James Hawkinson did the pilot, and I came in when the series got picked up. He set up the main visual concepts, and when it went to series I adapted some of the requirements from the studio and the notes from Ridley Scott into the style we see now.

The look has been evolving from season to season, as we feel we can be bolder with the visual language of the show. If you look at the pilot all the way to the end of Season 3, or Season 4, which is filming, you can definitely see a change, even though it still feels like the same project — the language has been polished and distilled. I think we have reached the sweet spot.

Does the look change at all when the timelines shift?
Yes, all of the timelines require a different look and approach with lighting and camera use. Also, the art design and wardrobe changes, so we combine all those subtle changes to give each world, place and timeline a different feel. We have lots of conceptual meetings, and we develop the look and feel of each timeline and place. Once these concepts are established, the team gets to work constructing the sets and needed visual elements, and then we go from there.

This is a period piece. How did that affect the look, if at all?
We have tried to give it a specific and unique look that still feels tied to the time period, so, yes, the fact that this happens in our own version of the ’60s has determined the look, feeling and language of the series. We base our aesthetics on what the real world was like in 1945, the point from which our story diverges to form this alternate world.

The 1960s of the story are not the real 1960s, because there is no USA and no free Europe, so most of the music and wardrobe doesn’t look like the 1960s we know. There are many Nazi and Japanese elements in the visuals that distinguish us from a regular 1960s look, but it still feels period.

How did you go about choosing the right camera and lenses for this project?
Because we had a studio mandate to finish in 4K, the Red One with Zeiss Master Prime lenses was chosen for the pilot, so when I came on we inherited that tech. We stuck with all of this for the first season, but after a few months of shooting we adapted the equipment list, filters and lighting. On Season 2, we pushed to change to an ARRI Alexa camera, so we ended up adjusting all the equipment around this new camera and its characteristics, such as needing less light, which meant less lighting equipment.

We also added classic Mitchell Diffusion Filters and some zooms. Lighting and grip equipment have been evolving toward less and less equipment since we light less and less. It’s a constant evolution. We also looked at some different lens options in the season breaks, but we haven’t added them because we don’t want to change our budget too much from season to season, and we use them as required.

Any challenging scenes that you are particularly proud of in Season 3?
I think the most challenging scene was the one in the Nebenwelt tunnel set. We had to have numerous meetings about what this tunnel was as a concept and then, based on the concept, find a way to execute it in a visual way. We wanted to make sure that the look of the scene matched the concepts of quantum physics within the story.

I wanted to achieve lighting that felt almost like plasma. We decided to put a mirror at the end of the tunnel with circle lighting right above it. We then created the effect of the space travel with a blast of light — lightning-style strikes from an elaborate setup that collectively drew more than a million watts. It was a complex setup, but fortunately we had a lot of very talented people come together to execute it.

What’s your go-to gear (camera, lens, mount/accessories) — things you can’t live without?
On this project, I’d say it’s the 40mm lens. I don’t think this project would have the same vibe without this lens. Then, of course, I love the Technocrane, but we don’t use it every day, for budgetary and logistical reasons.

For other projects, I would say the ARRI Alexa camera and the 40mm and handheld accessories. You can do a whole movie with just those two; I have done it, and it’s liberating. But if I had an unlimited budget, I would love to use a Technocrane every day with a stabilized remote head.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 


Collaboration company Pix acquires Codex

Pix has reached an agreement to acquire London-based Codex, in a move that will enable both companies to deliver a range of new products and services, from streamlined camera capture to post production finishing.

The Pix System is a collaboration tool that provides industry pros with secure access to production content on mobile devices, laptops or TVs from offices, homes or while traveling. The company won an Oscar for the technology in 2019.

Codex products include recorders and media processing systems that transfer digital files and images from the camera to post, and tools for color dynamics, dailies creation, archiving, review and digital asset management.

“Our clients have relied on Pix to protect their material and ideas throughout all phases of production. In Codex, we found a group that similarly values relationships with attention to critical details,” explains Pix founder/CEO Eric Dachs. “Codex will retain its distinct brand and culture, and there is a great deal we can do together for the benefit of our clients and the industry.”

Over the years, Pix and Codex have seen wide industry adoption, delivering a proven record of contributing value to their clients. Introduced in 2003, Pix soon became a trusted and widely used secure communication and content management provider. The Pix System enables creative continuity and reduces project risk by ensuring that ideas are accurately shared, stored, and preserved throughout the entire production process.

“Pix and Codex are complementary, trusted brands used by leading creatives, filmmakers and studios around the world,” says Codex managing director Marc Dando. “The integration of both services into one simplified workflow will deliver the industry a fast, secure, global collaborative ecosystem.”

With the acquisition of Codex, Pix will expand its servicing reach across the globe. Pix founder Dachs will remain as CEO, and Dando will take on the role of chief design officer at Pix, with a focus on existing and new products.


NAB 2019: postPerspective Impact Award winners

postPerspective has announced the winners of our Impact Awards from NAB 2019. Seeking to recognize debut products with real-world applications, the postPerspective Impact Awards are voted on by an anonymous judging body made up of respected industry artists and pros (to whom we are very grateful). It’s working pros who are going to be using these new tools — so we let them make the call.

It was fun watching the user ballots come in and discovering which products most impressed our panel of post and production pros. There are no entrance fees for our awards. All that is needed is the ability to impress our voters with products that have the potential to make their workdays easier and their turnarounds faster.

We are grateful for our panel of judges, which grew even larger this year. NAB is exhausting for all, so their willingness to share their product picks and takeaways from the show isn’t taken for granted. These men and women truly care about our industry and sharing information that helps their fellow pros succeed.

To be successful, you can’t operate in a vacuum. We have found that companies who listen to their users, and make changes/additions accordingly, are the ones who get the respect and business of working pros. They aren’t providing tools they think are needed; they are actively asking for feedback. So, congratulations to our winners and keep listening to what your users are telling you — good or bad — because it makes a difference.

The Impact Award winners from NAB 2019 are:

• Adobe for Creative Cloud and After Effects
• Arraiy for DeepTrack with The Future Group’s Pixotope
• ARRI for the Alexa Mini LF
• Avid for Media Composer
• Blackmagic Design for DaVinci Resolve 16
• Frame.io
• HP for the Z6/Z8 workstations
• OpenDrives for Apex, Summit, Ridgeview and Atlas

(All winning products reflect the latest version of the product, as shown at NAB.)

Our judges also provided quotes on specific projects and trends that they expect will have an impact on their workflows.

Said one, “I was struck by the predicted impact of 5G. Verizon is planning to have 5G in 30 cities by end of year. The improved performance could reach 20x speeds. This will enable more leverage using cloud technology.

“Also, AI/ML is said to be the single most transformative technology in our lifetime. Impact will be felt across the board, from personal assistants, medical technology, eliminating repetitive tasks, etc. We already employ AI technology in our post production workflow, which has saved tens of thousands of dollars in the last six months alone.”

Another echoed those thoughts on AI and the cloud as well: “AI is growing up faster than anyone can reasonably productize. It will likely be able to do more than first thought. Post in the cloud may actually start to take hold this year.”

We hope that postPerspective’s Impact Awards give those who weren’t at the show, or who were unable to see it all, a starting point for their research into new gear that might be right for their workflows. Another way to catch up? Watch our extensive video coverage of NAB.


NAB 2019: An engineer’s perspective

By John Ferder

Last week I attended my 22nd NAB, and I’ve got the Ross lapel pin to prove it! This was a unique NAB for me. I attended my first 20 NABs with my former employer, and most of those had me setting up the booth visits for the entire contingent of my co-workers and making sure that the vendors knew we were at each booth and were ready to go. Thursday was my “free day” to go wandering and looking for the equipment, cables, connectors and test gear on my list.

This year, I’m part of a new project, so I went with a shopping list and a rough schedule with the vendors we needed to see. While I didn’t get everywhere I wanted to go, the three days were very full and very rewarding.

Beck Video IP panel

Sessions and Panels
I also got the opportunity to attend the technical sessions on Saturday and Sunday. I spent my time at the BEITC in the North Hall and the SMPTE Future of Cinema Conference in the South Hall. Beck TV gave an interesting presentation on constructing IP-based facilities of the future. While SMPTE ST 2110 has been completed and issued, there are still implementation issues, as NMOS is still being developed. Today’s systems are, and for the time being will remain, hybrid facilities. The decision to be made is whether the facility will be built on an IP routing switcher core with gateways to SDI, or on an SDI routing switcher core with gateways to IP.

Although more expensive, building around an IP core would be more efficient and future-proof. Fiber infrastructure design, test equipment and finding engineers who are proficient in both IP and broadcast (the “Purple Squirrels”) are large challenges as well.

A lot of attention was also paid to cloud production and distribution, both in the BEITC and the FoCC. One such presentation, at the FoCC, was on VFX in the cloud with an eye toward the development of 5G. Nathaniel Bonini of BeBop Technology reported that BeBop has a new virtual studio partnership with Avid, and that the cloud allows tasks to be performed in a “massively parallel” way. He expects that 5G mobile technology will facilitate virtualization of the network.

VFX in the Cloud panel

Ralf Schaefer, of the Fraunhofer Heinrich-Hertz Institute, expressed his belief that all devices will be attached to the cloud via 5G, resulting in no cables and no mobile storage media. 5G for AR/VR distribution will render the scene in the network and transmit it directly to the viewer. Denise Muyco of StratusCore provided a link to a virtual workplace: https://bit.ly/2RW2Vxz. She felt that 5G would assist in the speed of the collaboration process between artist and client, making it nearly “friction-free.” While there are always security concerns, 5G would also help the prosumer creators to provide more content.

Chris Healer of The Molecule stated that 5G should help to compress VFX and production workflows, enable cloud computing to work better and perhaps provide realtime feedback for more perfect scene shots, showing line composites of VR renders to production crews in remote locations.

The Floor
I was very impressed with a number of manufacturers this year. Ross Video demonstrated new capabilities of Inception and OverDrive. Ross also showed its new Furio SkyDolly three-wheel rail camera system. In addition, 12G single-link capability was announced for Acuity, Ultrix and other products.

ARRI AMIRA (Photo by Cotch Diaz)

ARRI showed a cinematic multicam system built using the AMIRA camera with a DTS FCA fiber camera adapter back and a base station controllable by Sony RCP1500 or Skaarhoj RCP. The Sony panel will make broadcast-centric people comfortable, but I was very impressed with the versatility of the Skaarhoj RCP. The system is available using either EF, PL, or B4 mount lenses.

During the show, I learned from one of the manufacturers that one of my favorite OLED evaluation monitors is going to be discontinued. This was bad news for the new project I’ve embarked on. Then we came across the Plura booth in the North Hall. Plura was showing a new OLED monitor, the PRM-224-3G. It is a 24.5-inch diagonal OLED, featuring two 3G/HD/SD-SDI and three analog inputs, built-in waveform monitors and vectorscopes, LKFS audio measurement, PQ and HLG, 10-bit color depth, 608/708 closed caption monitoring, and more for a very attractive price.

Sony showed the new HDC-3100/3500 3xCMOS HD cameras with global shutter. These have an upgrade program to UHD/HDR with an optional processor board and signal format software, and a 12G-SDI extension kit as well. There is an optional single-mode fiber connector kit to extend the maximum distance between camera and CCU to 10 kilometers. The CCUs work with the established 1000/1500 series of remote control panels and master setup units.

Sony’s HDC-3100/3500 3xCMOS HD camera

Canon showed its new line of 4K UHD lenses. One of my favorite lenses has been the HJ14ex4.3B HD wide-angle portable lens, which I have installed in many of the studios I’ve worked in. They showed the CJ14ex4.3B at NAB, and I was even more impressed with it. The 96.3-degree horizontal angle of view is stunning, and the minimization of chromatic aberration is carried over and perhaps improved from the HJ version. It features correction data that support the BT.2020 wide color gamut. It works with the existing zoom and focus demand controllers for earlier lenses, so it’s easily integrated into existing facilities.

Foot Traffic
The official total of registered attendees was 91,460, down from 92,912 in 2018. The Evertz booth was actually easy to walk through at 10a.m. on Monday, which I found surprising given the breadth of interesting new products and technologies Evertz had to show this year. The South Hall had the big crowds, but Wednesday seemed emptier than usual, almost like a Thursday.

The NAB announced that next year’s exhibition will begin on Sunday and end on Wednesday. That change might boost overall attendance, but I wonder how adversely it will affect the attendance at the conference sessions themselves.

I still enjoy attending NAB every year, seeing the new technologies and meeting with colleagues and former co-workers and clients. I hope that next year’s NAB will be even better than this year’s.

Main Image: Barbie Leung.


John Ferder is the principal engineer at John Ferder Engineer, currently Secretary/Treasurer of SMPTE, an SMPTE Fellow, and a member of IEEE. Contact him at john@johnferderengineer.com.


NAB 2019: A cinematographer’s perspective

By Barbie Leung

As an emerging cinematographer, I always wanted to attend an NAB show, and this year I had my chance. I found that no amount of research can prepare you for the sheer size of the show floor, not to mention the backrooms, panels and after-hours parties. As a camera operator as well as a cinematographer who is invested in the post production and exhibition end of the spectrum, I found it absolutely impossible to see everything I wanted to or catch up with all the colleagues and vendors I wanted to. This show is a massive and draining ride.

Panasonic EVA1

There was a lot of buzz in the ether about 5G technology. The consensus seems to be that fast, accurate 5G will be the tipping point in implementing a lot of the tech that’s been talked about for years but hasn’t quite taken off yet, including the feasibility of autonomous vehicles and 8K streaming stateside.

It’s hard to deny the arrival of 8K technology while staring at the detail and textures on an 80-inch Sharp 8K professional display. Every roof tile, every wave in the ocean is rendered in rich, stunning detail.

In response to the resolution race on the image capture end of things, ARRI had already announced and started taking orders for the Alexa Mini LF — its long-awaited entry into the large-format game — in the week before NAB.

Predictably, at NAB we saw many lens manufacturers highlighting full-frame coverage. Canon introduced its Sumire Prime lenses, while Fujinon announced the Premista 28-100mm T2.9 full-format zoom.

Sumire Prime lenses

Camera folks, including many ASC members, are embracing large format capture for sure, but some insist the appeal lies not so much in the increased resolution, but rather in the depth and overall image quality.

Meanwhile, back in 35mm sensor land, Panasonic continues its energetic push of the EVA1 camera. Aside from presentations at their booth emphasizing “cinematic” images from this compact 5.7K camera, they’ve done a subtle but not-too-subtle job of disseminating the EVA1 throughout the trade show floor. If you’re at the Atomos booth, you’ll find director/cinematographers like Elle Schneider presenting work shot on the EVA1, recording to Atomos, balanced on a Ronin-S, and if you stop by Tiffen you’ll find an EVA1 being flown next to the Alexa Mini.

I found a ton of motion control at the show, from Shotover’s new compact B1 gyro-stabilized camera system to the affable folks at Arizona-based Defy, who showed off their Dactylcam Pro, an addictively smooth-to-operate cable-suspension rig. The Bolt Cinebot showed off high-speed robotic arms complete with a spinning hologram.

Garrett Brown at the Tiffen booth.

All this new gimbal technology is an ever-evolving game changer. Steadicam inventor Garrett Brown was on hand at the Tiffen booth to show the new M2 sled, which has motors elegantly built into the base. He enthusiastically heralded that camera operators can go faster and more “dangerously” than ever. There was so much motion control that it vied for attention alongside all the talk of 5G, 8K and LED lighting.

Some veterans of the show have said that this year’s show felt “less exciting” than shows of the past eight to 10 years. There were fewer big product launch announcements, perhaps because in past years companies were unable to fulfill the rush of post-NAB orders for new products for 12 or even 18 months. Vendors have been more conservative with what to hype, more careful with what to promise.

For a new attendee like me, there was more than enough new tech to explore. Above all else, NAB is really about the people you meet. The tech will be new next year, but the relationships you start and build at NAB are meant to last a career.

Main Image: ARRI’s Alexa Mini LF.


Barbie Leung is a New York-based cinematographer and camera operator working in independent film and branded content. Her work has played Sundance, the Tribeca Film Festival and Outfest. You can follow her on Instagram at @barbieleungdp.


Colorfront at NAB with 8K HDR, product updates

Colorfront, which makes on-set dailies and transcoding systems, has rolled out new 8K HDR capabilities and updates across its product lines. The company has also deepened its technology partnership with AJA and entered into a new collaboration with Pomfort to bring more efficient color and HDR management on-set.

Colorfront Transkoder is a post workflow tool for handling UHD, HDR camera, color and editorial/deliverables formats, with recent customers such as Sky, Pixelogic, The Picture Shop and Hulu. With a new HDR GUI, Colorfront’s Transkoder 2019 performs the realtime decompression/de-Bayer/playback of Red and Panavision DXL2 8K R3D material displayed on a Samsung 82-inch Q900R QLED 8K Smart TV in HDR and in full 8K resolution (7680×4320). The de-Bayering process is optimized through Nvidia GeForce RTX graphics cards with Turing GPU architecture (also available on Colorfront On-Set Dailies 2019), with 8K video output (up to 60p) using AJA Kona 5 video cards.
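For a sense of scale, the arithmetic behind the jump from 4K to 8K is easy to check. The frame rate and bit depth below are illustrative assumptions for an uncompressed signal, not Transkoder specs:

```python
# Rough arithmetic behind the "full 8K" claim: pixel counts and the
# data volume a realtime de-Bayer/playback pipeline has to sustain.

def pixels(w, h):
    return w * h

uhd_8k = pixels(7680, 4320)   # 33,177,600 pixels per frame
uhd_4k = pixels(3840, 2160)   #  8,294,400 pixels per frame

print(uhd_8k / uhd_4k)        # 8K carries 4x the pixels of UHD 4K

# Assumed 10-bit RGB at 60p, uncompressed, just to illustrate scale:
bits_per_sec = uhd_8k * 3 * 10 * 60
print(bits_per_sec / 1e9)     # ~59.7 Gb/s before any compression
```

In practice the R3D material stays compressed until the GPU de-Bayer stage, which is why the card and codec choices matter.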

“8K TV sets are becoming bigger, as well as more affordable, and people are genuinely awestruck when they see 8K camera footage presented on an 8K HDR display,” said Aron Jaszberenyi, managing director, Colorfront. “We are actively working with several companies around the world originating 8K HDR content. Transkoder’s new 8K capabilities — across on-set, post and mastering — demonstrate that 8K HDR is perfectly accessible to an even wider range of content creators.”

Powered by a re-engineered version of Colorfront Engine and featuring the HDR GUI and 8K HDR workflow, Transkoder 2019 supports camera/editorial formats including Apple ProRes RAW, Blackmagic RAW, ARRI Alexa LF/Alexa Mini LF and Codex HDE (High Density Encoding).

Transkoder 2019’s mastering toolset has been further expanded to support Dolby Vision 4.0 as well as Dolby Atmos for the home with IMF and Immersive Audio Bitstream capabilities. The new Subtitle Engine 2.0 supports CineCanvas and IMSC 1.1 rendering for preservation of content, timing, layout and styling. Transkoder can now also package multiple subtitle language tracks into the timeline of an IMP. Further features support fast and efficient audio QC, including solo/mute of individual tracks on the timeline, and a new render strategy for IMF packages enabling independent audio and video rendering.

Colorfront also showed the latest versions of its On-Set Dailies and Express Dailies products for motion pictures and episodic TV production. On-Set Dailies and Express Dailies both now support ProRes RAW, Blackmagic RAW, ARRI Alexa LF/Alexa Mini LF and Codex HDE. As with Transkoder 2019, the new version of On-Set Dailies supports real-time 8K HDR workflows, enabling a set-to-post pipeline from HDR playback through QC and rendering of HDR deliverables.

In addition, AJA Video Systems has released v3.0 firmware for its FS-HDR realtime HDR/WCG converter and frame synchronizer. The update introduces enhanced coloring tools together with several other improvements for broadcast, on-set, post and pro AV HDR production developed by Colorfront.

A new, integrated Colorfront Engine Film Mode offers an ACES-based grading and look creation toolset with ASC Color Decision List (CDL) controls, built-in LOOK selection including film emulation looks, and variable Output Mastering Nit Levels for PQ, HLG Extended and P3 colorspace clamp.

Since launching in 2018, FS-HDR has been used on a wide range of TV and live outside broadcast productions, as well as motion pictures including Paramount Pictures’ Top Gun: Maverick, shot by Claudio Miranda, ASC.

Colorfront licensed its HDR Image Analyzer software to AJA for AJA’s HDR Image Analyzer in 2018. A new version of AJA HDR Image Analyzer is set for release during Q3 2019.

Finally, Colorfront and Pomfort have teamed up to integrate their respective HDR-capable on-set systems. This collaboration, harnessing Colorfront Engine, will include live CDL reading in ACES pipelines between Colorfront On-Set/Express Dailies and Pomfort LiveGrade Pro, giving motion picture productions better control of HDR images while simplifying their on-set color workflows and dailies processes.

NAB 2019: First impressions

By Mike McCarthy

There are always a slew of new product announcements during the week of NAB, and this year was no different. As a Premiere editor, the developments from Adobe are usually the ones most relevant to my work and life. Similar to last year, Adobe released its software updates a week before NAB instead of announcing them at the show for release months later.

The biggest new feature in the Adobe Creative Cloud apps is After Effects’ new “Content Aware Fill” for video. This will use AI to generate image data to automatically replace a masked area of video, based on surrounding pixels and surrounding frames. This functionality has been available in Photoshop for a while, but the challenge of bringing that to video is not just processing lots of frames but keeping the replaced area looking consistent across the changing frames so it doesn’t stand out over time.

The other key part to this process is mask tracking, since masking the desired area is the first step in that process. Certain advances have been made here, but based on tech demos I saw at Adobe Max, more is still to come, and that is what will truly unlock the power of AI that they are trying to tap here. To be honest, I have been a bit skeptical of how much AI will impact film production workflows, since AI-powered editing has been terrible, but AI-powered VFX work seems much more promising.

Adobe’s other apps got new features as well, with Premiere Pro adding Free-Form bins for visually sorting through assets in the project panel. This affects me less, as I do more polishing than initial assembly when I’m using Premiere. They also improved playback performance for Red files, acceleration with multiple GPUs and certain 10-bit codecs. Character Animator got a better puppet rigging system, and Audition got AI-powered auto-ducking tools for automated track mixing.

Blackmagic
Elsewhere, Blackmagic announced a new version of Resolve, as expected. Blackmagic RAW is supported on a number of new products, but I am not holding my breath to use it in Adobe apps anytime soon, similar to ProRes RAW. (I am just happy to have regular ProRes output available on my PC now.) They also announced a new 8K Hyperdeck product that records quad 12G SDI to HEVC files. While I don’t think that 8K will replace 4K television or cinema delivery anytime soon, there are legitimate markets that need 8K resolution assets. Surround video and VR would be one, as would live background screening instead of greenscreening for composite shots. No image replacement in post, as it is capturing in-camera, and your foreground objects are accurately “lit” by the screens. I expect my next major feature will be produced with that method, but the resolution wasn’t there for the director to use that technology for the one I am working on now (enter 8K…).

AJA
AJA was showing off the new Ki Pro Go, which records up to four separate HD inputs to H.264 on USB drives. I assume this is intended for dedicated ISO recording of every channel of a live-switched event or any other multicam shoot. Each channel can record up to 1080p60 at 10-bit color to H.264 files in MP4 or MOV, at up to 25Mb/s.
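The quoted 25Mb ceiling translates into modest storage needs, which is part of the appeal of H.264 ISO recording. A back-of-the-envelope sketch, assuming the figure is megabits per second and 1 GB means 10^9 bytes (real H.264 file sizes vary with content, so treat this as an upper bound):

```python
# Storage math for ISO recording at an assumed 25 Mb/s per channel.
MBIT = 1_000_000

def gb_per_hour(mbps, channels=1):
    # megabits/s -> bytes/hour -> gigabytes/hour, times channel count
    return mbps * MBIT * 3600 / 8 / 1e9 * channels

print(gb_per_hour(25))              # 11.25 GB per channel-hour
print(gb_per_hour(25, channels=4))  # 45.0 GB/hour for all four inputs
```

At roughly 45 GB/hour for four channels, ordinary USB drives are a plausible recording target.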

HP
HP had one of their existing Z8 workstations on display, demonstrating the possibilities that will be available once Intel releases their upcoming DIMM-based Optane persistent memory technology to the market. I have loosely followed the Optane story for quite a while, but had not envisioned this impacting my workflow at all in the near future due to software limitations. But HP claims that there will be options to treat Optane just like system memory (increasing capacity at the expense of speed) or as SSD drive space (with DIMM slots having much lower latency to the CPU than any other option). So I will be looking forward to testing it out once it becomes available.

Dell
Dell was showing off their relatively new 49-inch double-wide curved display. The 4919DW has a resolution of 5120×1440, making it equivalent to two 27-inch QHD displays side by side. I find that 32:9 aspect ratio to be a bit much for my tastes, with 21:9 being my preference, but I am sure there are many users who will want the extra width.
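The “two 27-inch QHD displays side by side” equivalence is straightforward to verify from the numbers in the paragraph above:

```python
# Check the 4919DW's resolution claim and its quoted aspect ratio.
from math import gcd

w, h = 5120, 1440        # Dell 4919DW native resolution
qhd = (2560, 1440)       # one QHD (27-inch class) panel

# Two QHD panels side by side: double the width, same height.
assert (qhd[0] * 2, qhd[1]) == (w, h)

g = gcd(w, h)
print(f"{w // g}:{h // g}")   # 32:9, the ratio mentioned in the text
```

The same reduction gives 21:9 for the narrower ultrawides the author prefers (e.g. 3440×1440).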

Digital Anarchy
I also had a chat with the people at Digital Anarchy about their Premiere Pro-integrated Transcriptive audio transcription engine. Having spent the last three months editing a movie that is split between English and Mandarin dialogue, needing to be fully subtitled in both directions, I can see the value in their tool-set. It harnesses the power of AI-powered transcription engines online and integrates the results back into your Premiere sequence, creating an accurate script as you edit the processed clips. In my case, I would still have to handle the translations separately once I had the Mandarin text, but this would allow our non-Mandarin speaking team members to edit the Mandarin assets in the movie. And it will be even more useful when it comes to creating explicit closed captioning and subtitles, which we have been doing manually on our current project. I may post further info on that product once I have had a chance to test it out myself.
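As an illustration of the captioning step such tools automate, here is a minimal sketch that turns timed transcript segments into SubRip (SRT) subtitle text. The segment tuple format is my own assumption for the example, not Transcriptive’s actual API:

```python
# Convert (start_seconds, end_seconds, text) segments into SRT blocks.

def srt_time(seconds):
    # SRT timing uses HH:MM:SS,mmm with a comma before milliseconds.
    ms = round(seconds * 1000)
    h, rem = divmod(ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02}:{m:02}:{s:02},{ms:03}"

def to_srt(segments):
    blocks = []
    for i, (start, end, text) in enumerate(segments, 1):
        blocks.append(f"{i}\n{srt_time(start)} --> {srt_time(end)}\n{text}\n")
    return "\n".join(blocks)

print(to_srt([(0.0, 2.5, "Hello."), (2.5, 5.0, "Welcome back.")]))
```

The hard part a transcription engine solves is producing accurate text and timings in the first place; the formatting above is the easy last mile.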

Summing Up
There were three halls of other products to look through and check out, but overall, I was a bit underwhelmed at the lack of true innovation I found at the show this year.

Full disclosure, I was only able to attend for the first two days of the exhibition, so I may have overlooked something significant. But based on what I did see, there isn’t much else that I am excited to try out or that I expect to have much of a serious impact on how I do my various jobs.

It feels like most of the new things we are seeing are merely commoditized versions of products that may originally have been truly innovative when they were initially released, but now are just slightly more fleshed out versions over time.

There seems to be much less pioneering of truly new technology and more repackaging of existing technologies into other products. I used to come to NAB to see all the flashy new technologies and products, but now it feels like the main thing I am doing there is a series of annual face-to-face meetings, and that’s not necessarily a bad thing.

Until next year…


Mike McCarthy is an online editor/workflow consultant with over 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.