Audionamix – 7.1.20

Category Archives: Color Grading

Picture Shop buys The Farm Group

Burbank’s Picture Shop has acquired UK-based The Farm Group. The Farm Group was founded in 1998 and currently has four locations in London, as well as facilities in Manchester, Bristol and Los Angeles.

The Farm, London

The Farm also operates the in-house post production teams for BBC Sport in Salford, England; UKTV; and Fremantle Media. This deal marks Picture Shop’s second international acquisition, following the deal it made for Vancouver’s Finalé Post earlier this year.

The founders of The Farm, Nicky Sargent and Vikki Dunn, will stay involved in The Farm Group. In a joint statement, Sargent and Dunn said, “We are delighted that after 20 successful years, we have a new partner. Picture Shop is poised to expand in the international post market and provide the combination of technical, creative and professional excellence to the world’s content creators.”

The duo will also re-invest in the expanded Picture Head Group, which includes Picture Head and audio post company Formosa Group, in addition to Picture Shop.

L-R: The Farm Group’s Nicky Sargent and Vikki Dunn.

Bill Romeo, president of Picture Shop, says, “Based on the amount of content being created internationally, we felt it was important to have a presence worldwide and support our clients’ needs. The Farm, based on its reputation and creative talent, will be able to maintain the philosophy of Picture Shop. It is a perfect fit. Our clients will benefit from our collaborative efforts internationally, as well as benefit from our technology and experience. We will continue to partner and support our clients while maintaining our boutique feel.”

Recent work from The Farm Group includes BBC Two’s Summer of Rockets, Sky One’s Jamestown and Britain’s Got Talent.

 

Yesterday director Danny Boyle

By Iain Blair

Yesterday, everyone knew The Beatles. Today, only a struggling singer-songwriter in a tiny English seaside town remembers their songs. That’s the brilliant-yet-simple setup for Yesterday, the new rock ’n’ roll comedy from Academy Award-winning director Danny Boyle (Slumdog Millionaire, Trainspotting) and Oscar-nominated screenwriter Richard Curtis (Four Weddings and a Funeral, Love Actually, Notting Hill).

Danny Boyle on set with lead actor Himesh Patel

Jack Malik (Himesh Patel of BBC’s EastEnders) is the struggling singer-songwriter whose dreams of fame are rapidly fading, despite the fierce devotion and support of his childhood best friend/manager, Ellie (Lily James, Mamma Mia! Here We Go Again). But after a freak bus accident during a mysterious global blackout, Jack wakes up to discover that only he remembers The Beatles and their music, and his career is supercharged when he ditches his own mediocre songs and instead starts performing hit after hit by the Fab Four — as if he’d written them.

Yesterday co-stars Ed Sheeran and James Corden (playing themselves) and Emmy Award-winner Kate McKinnon as Jack’s Hollywood agent. Along with new versions of The Beatles’ most beloved hits, Yesterday features a seasoned group of collaborators, including DP Christopher Ross (Terminal, the upcoming Cats), editor Jon Harris (Kingsman: The Secret Service, 127 Hours), music producer Adem Ilhan (The Ones Below, In the Loop) and composer Daniel Pemberton (Steve Jobs, Spider-Man: Into the Spider-Verse).

I recently spoke with Boyle, whose eclectic credits include Shallow Grave, The Beach, A Life Less Ordinary, Trance, Steve Jobs, Sunshine and 127 Hours, about making the film and the workflow.

What was your first reaction when you read this script?
I was a big fan of Richard’s work, and we’d worked together on the opening ceremony for the 2012 London Olympics, when we did this Chariots of Fire spoof with Rowan Atkinson, and I casually said to him, “If you’ve ever got anything for me, send it over.” And he said, “Funnily enough, I do have a script that might suit you,” and he sent it over, and I was just overwhelmed when I read it. He’d managed to take these two fairly ordinary people and their love story, and then intertwine it, like a double helix, with this love letter to The Beatles, which is the whole texture and feeling of this film.

It comes across as this very uplifting and quite emotional film.
I’m glad you said that, as I thought this whole simple idea — and it’s not sci-fi, but it’s not really explained — of this global amnesia about The Beatles and all their songs was just so glorious and wonderful, and just like listening to one of their songs. It really moved me, and especially the scene at the end. That affected me in a very personal way.  It’s about the wonder of cinema and its relationship to time, and film is the only art form that really looks at time in such detail because film is time. And that relates directly to editing, where you’re basically compressing time, stretching it, speeding it up, freezing it — and even stopping it. No other art form can do that.

The other amazing aspect of film is that going to the movies is also an expression of time. The audience says, “I’m yours for the next two hours,” and in return you give them time that’s manipulated and squeezed and stretched, and even stopped. That’s pretty amazing, I think. That’s what I tried to do with this film, do something that brings back The Beatles and all that sense of pure joy in their music, and how it changed people’s lives forever.

Is it true that Jack is partly based on Ed Sheeran’s own life story?
It is, absolutely, and he’s good friends with Richard Curtis. Ed played all the little pubs and small festivals where we shot, very unsuccessfully when he started out. Then he was propelled into superstardom, and that also appeared to happen overnight. Where did all his great songs come from? Then, like in the film, Ed actually returned to his childhood sweetheart and they ended up getting married, and you go, “Wow! OK. That’s amazing.” So all that gave us the exoskeleton of the film. Ed’s also done some acting — he was in Game of Thrones and Bridget Jones’s Baby — and he wrote the song at the end, so it was really perfect that he was also in it.

What did Himesh bring to the role of Jack?
The only trepidation I had was when I began auditioning people for the part, as it was basically, “Come in and sing a couple of Beatles songs.” And some were probably better technically than Himesh, but I soon realized it was going to be far harder than I thought to get the right guy. We had great actors who weren’t great singers, and vice versa, and we didn’t want just a karaoke version of 17 songs.

And making it more complicated was that, unlike in the film, we all do remember The Beatles. But then Himesh walked in, played “Yesterday” and “Back in the USSR,” and even though I was oversaturated by The Beatles music at this point, they just grabbed me. He made them his own, as if they were his songs. He was also very modest with it as well, in his demeanor and approach. He doesn’t rethink the wheel. He says, “This is the song you’ve missed, and I’m bringing it back to you.” And that’s the quality he brings to his performance. There’s a genuine simplicity, but he’s also very funny and subtle. He doesn’t try and hijack The Beatles and lay on extra notes that you don’t need. He’s a very gentle guy, and he lets you see the song for what it is, the beauty of them.

Obviously, the music and sound were crucial in this, and usually films have the actors lipsync, but Himesh sang live?
Totally. He played and sang live — no dubs or pre-records. Early on I sat down with Simon Hayes, who won the Oscar for mixing Les Mis, and told him that’s what I wanted. It’s very difficult to do live recording well, but once Simon heard Himesh sing, he got it.

The songs in this help tell the story, and they’re as important as all the dialogue, so every time you hear Himesh, he’s playing and singing live. Then for all the big concerts, like at Wembley, we added extra musicians, which we overdubbed. So even if there were mistakes or problems with Himesh’s performances, we kept them, as you’ve got to believe it’s him and his songs. It had to be honest and true.

We screened the premiere in Dolby Vision Atmos in London, and it’s got such a fantastic range. The sound is so crisp and clean — and not just the effects, but all the dialogue, which is a big tribute to Simon. It’ll be so sad if we lose cinema to streaming on TV and watching films on tiny phones because we’ve now achieved a truly remarkable technical standard in sound.

Where did you do all the post?
We edited at a few places. We were based at Pinewood to start with, as I was involved with the Bond film, and then we moved to some offices in central London. Finally, we ended up at Working Title, where they have a great editing setup in the basement. Then as usual we did all the sound mixing at Pinewood with Glenn Freemantle and his team from Sound 24. They’ve done a lot of my films.

We did all the visual effects with my usual guy, VFX supervisor Adam Gascoyne over at Union Visual Effects in London. He’s done all my films for a very long time now, and they did a lot of stuff with crowd and audience work for the big shows. Plus, a lot of invisible stuff like extensions, corrections, cleanup and so on.

You also reteamed with editor Jon Harris, whose work on 127 Hours earned him an Oscar nom. What were the big editing challenges?
We had quite a few. There was this wonderful scene of Jack going on the James Corden show and playing “Something,” the George Harrison song, and we ultimately had to cut the whole thing. On its own, it was this perfect scene, but in the context of the film it came too late, and it was also too reminiscent of “Yesterday” and “The Long and Winding Road.”

The film just didn’t need it, and it was quite a long sequence, and it was really sad to cut it, but it just flowed better without it. Originally, we started the film with a much longer sequence showing Jack being unsuccessful, and once we tested that, it was immediately obvious that the audience understood it all very quickly. We just didn’t need all that, so we had to cut a lot of that. It’s always about finding the right rhythm and pace for the story you’re telling.

L-R: Iain Blair and Danny Boyle

Where was the DI done?
At Goldcrest with colorist Adam Glasman, who has worked a lot with DP Chris Ross. It was a very joyous film to make and I wanted it to look joyful too, with a summer spirit, but also with a hint of melancholy. I think Himesh has that too, and it doesn’t affect the joy, but it’s a sub-note. It’s like the English countryside, where we tried to capture all its beauty but also that feeling it’s about to rain all the time. It’s that special bittersweet feeling.

I assume Paul and Ringo gave you their blessing on this project?
Yeah, you have to get their agreement as they monitor the use of the songs, and Working Title made a great deal with them. It was very expensive, but it gave us the freedom to be able to change the songs in the edit at the last minute if need be, which we did a few times. We got beautiful letters back, very touching, and Paul was very funny as he gave us permission to use “Yesterday,” which we also used as the film title. He told us that his original lyric title was “Scrambled Eggs,” and if the film turned out to be a mess, we could just call it Scrambled Eggs instead.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.


DP Chat: Good Omens cinematographer Gavin Finney

By Randi Altman

London-born cinematographer Gavin Finney, BSC, has a wealth of television series and film experience under his belt, including Wolf Hall, The Fear and Hulu’s upcoming Four Weddings and a Funeral, a series based on the film of the same name. One of his most recent projects was the six-episode Amazon series Good Omens, starring Michael Sheen (Aziraphale) and David Tennant (Crowley) as an angel and a demon with a very long history, who are tasked with saving the world. It’s based on the book by Neil Gaiman and Terry Pratchett.

Finney was drawn to cinematography by his love of still photography and telling stories. He followed that passion to film school and fell in love with what could be done with moving images.

Let’s find out more about Finney and his work on Good Omens.

How would you describe the look of Good Omens? How did you work with the director and producers to achieve the look they wanted?
There is a progression through the story where things get increasingly strange as Adam (who our main characters believe is the antichrist) comes into his powers, and things in his head start manifesting themselves. It is also a 6,000-year-long buddy movie between an angel and a demon! There is Adam’s world — where everything is heightened and strangely perfect — and Aziraphale and Crowley’s world of heaven and hell. At some point, all these worlds intersect. I had to keep a lot of balls in the air in regard to giving each section its own look, but also making sure that when these worlds collide, it still makes sense.

Each era depicted in the series had a different design treatment — obviously in the case of costume and production design — but also in the way we shot each scene and the way they were lit. For instance, Neil Gaiman had always imagined the scene in the church during the Blitz in Episode 3 to be an homage to the film noir style of the time, and we lit and photographed it in that style. Ancient Rome was given the patina of an Alma-Tadema oil painting, and we shot Elizabethan London in an exact recreation of Shakespeare’s Globe Theatre. The ‘60s were shot mainly on our Soho set, but redressed with posters from that time, and we changed the lighting to use more neon and used bare bulbs for signage.

I also graded the dailies throughout production on DaVinci Resolve, adding film grain and different looks to different time periods to help anchor where we were in the story. Neil wanted heaven and hell to feel like two parts of the same celestial building, so heaven occupied the best penthouse offices, and hell was stuck in the damp, moldy basement where nothing works properly.

We found a huge empty building for the heaven set that had shiny metal flooring and white walls. I frosted all the windows and lit them from outside using 77 ARRI Skypanels linked to a dimmer desk so we could control the light over the day. We also used extremely wide-angle lenses such as the Zeiss rectilinear 8mm lens to make the space look even bigger. The hell set used a lot of old, slightly greenish fluorescent fittings, some of them flickering on and off. Slimy dark walls and leaking pipes were added into the mix.

For another sequence Neil and Douglas wanted an old-film look. To do this, ARRI Media in London constructed a hand-cranked digital camera out of an old ARRI D21 camera and connected it to an ARRI 435 hand-crank wheel and then to a Codex recorder. This gave us a realistic, organic vari-speed/vari-exposure look. I added a Lensbaby in a deliberately loose mount to emulate film weave and vignetting. In this way I was able to very accurately reproduce the old-style, hand-cranked black-and-white look of the first days of cinema.

How early did you get involved in the production?
I’d worked with the director Douglas Mackinnon a few times before (on Gentlemen’s Relish and The Flying Scotsman), and I’d wanted to work with him again a number of times but was never available. When I heard he was doing this project, I was extremely keen to get involved, as I loved the book and especially the kind of world that Neil Gaiman and Terry Pratchett were so good at creating. Fortunately, he asked me to join the team, and I dropped everything I was doing to come on board. I joined the show quite late and had to fly from London to Cape Town on an early scout the day after getting the job!

How did you go about choosing the right camera and lenses for this project?
We shot on Leica Summilux primes and ARRI Alura zooms (15.5-45mm and 45-250mm) with ARRI Alexa SXT and Alexa Mini cameras outputting UHD 4K files. The Alexa camera is very reliable, easy to work with, looks great and has very low noise in the color channels, which is useful for green/bluescreen work. It can also shoot at 120fps without cutting into the sensor size. We also had to make sure that both cameras and lenses were easily available in Cape Town, where we filmed after the UK section.

The Alexa output is also very flexible in the grade, and we knew we were going to be pushing the look in a number of directions in post. We also shot with the Phantom Flex 4K high-speed camera at 1,000fps for some scenes requiring ultra-slow motion, and for one particular sequence, a specially modified ARRI D21 that could be “hand-cranked” like an old movie camera.

You mentioned using Resolve on set. Is this how you usually work? What benefit did you get from doing this?
We graded the dailies on Blackmagic’s DaVinci Resolve with our DIT Rich Simpson. We applied different looks to each period of the story, often using a modified film emulation plugin. It’s very important to me that the dailies look great and that we start to establish a look early on that can inform the grade later.

Rich would bring me a variety of looks each day and we’d pick the one we liked for that day’s work. Rich was also able to export our selected looks and workflow to the South African DIT in Cape Town. This formed the starting point of the online grade done at Molinare on FilmLight Baselight under the hugely capable hands of Gareth Spensley. Gareth had a big influence on the look of the series and did some fantastic work balancing all the different day exteriors and adding some magic.

Any challenging scenes you are particularly proud of?
We had some very big sets and locations to light, and the constantly moving style of photography we employed is always a challenge to light — you have to keep all the fixtures out of shot, but also look after the actors and make sure the tone is right for the scene. A complicated rig was the Soho street set that Michael Ralph designed and built on a disused airbase. This involved four intersecting streets with additional alleyways, many shops and a main set — the bookshop belonging to Aziraphale.

This was a two-story composite set (the interior led directly to the exterior). Not only did we have to execute big crane moves that began looking down at the whole street section and then flew down and “through” the windows of the bookshop into an interior scene, but we also had to rig the set knowing that we were going to burn the whole thing down.

Another challenge was that we were filming in the winter and losing daylight at 3:30pm but needing to shoot day exterior scenes to 8pm or later. My gaffer (Andy Bailey) and I designed a rig that covered the whole set (involving eight cranes, four 18kW HMIs and six six-meter helium hybrid balloons) so that we could seamlessly continue filming daylight scenes as it got dark and went to full night without losing any time. We also had four 20×20-foot mobile self-lighting greenscreens that we could move about the set to allow for the CGI extensions being added later.

What inspires you artistically? And how do you simultaneously stay on top of advancing technology that serves your vision?
The script inspires me artistically. If I don’t love the story and can’t immediately “see” how it might look, I don’t do it. After that, I’m inspired by real life and the way changing light utterly transforms a scene, be it a landscape or an interior. I also visit art galleries regularly to understand how other people see, imagine and communicate.

What new technology has changed the way you work (looking back over the past few years)?
Obviously, digital cinematography has had a huge impact. I trained in film and spent the first 16 years of my career shooting film exclusively, but I was happy to embrace digital when it came in. I love keeping up with all the advances.

Lighting is also going digital with the advent of LED fixtures with on-board computers. I can now dial any gel color or mix my own at any dimmer level from an app on my phone and send it to dozens of fixtures. There is an incredible array of tools now at our disposal, and I find that very exciting and creatively liberating.

What are some of your best practices or rules you try to follow on each job?
I tend to work on quite long jobs — my last two shows shot for 109 and 105 days, respectively. So keeping to sensible hours is critical. Experienced producers who are concerned with the welfare, health and safety of their crew keep to 10 hours on camera, a one-hour lunch and five-day weeks only. Anything in excess of that results in diminishing returns and an exhausted and demoralized crew.

I also think prep time is incredibly important, and this is another area that’s getting squeezed by inexperienced producers to the detriment of the production. Prep time is a comparatively cheap part of the process but one that reaps huge dividends on the shoot. Being fully prepared, making the right location and set design choices, and having enough time to choose equipment and crew and work out lighting designs all make for a smooth-running shoot.

Explain your ideal collaboration with the director when setting the look of a project.
This goes back to having enough prep time. The more time there is to visit possible locations and simply talk through all the options for looks, style, movement and general approach, the better. I love working with visual directors who can communicate their ideas but who welcome input. I also like being able to ditch the plan on the day and go with something better if it suddenly presents itself. I like being pushed out of my comfort zone and challenged to come up with something wonderful and fresh.

What’s your go-to gear — things you can’t live without?
I always start a new production from scratch, and I like to test everything that’s available and proven in the field. I like to use a selection of equipment — often different cameras and lenses that I feel suit the aesthetic of the show. That said, I think ARRI Alexa cameras are reliable and flexible and produce very “easy to work with” images.

I’ve been using the Letus Helix Double and Infinity (provided by Riz at Mr Helix) with an Exhauss exoskeleton support vest quite a lot. It’s a very flexible tool that I can operate myself, and it produces great results. The Easyrig is also a great back-saver when doing a lot of handheld work, as the best cameras aren’t getting any lighter.

Apart from that, comfortable footwear and warm, waterproof clothing are essential!


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 


DP Chat: Catch-22’s Martin Ruhe, ASC

By Randi Altman

For the bibliophiles out there, you know Catch-22 as the 1961 book by Joseph Heller. Cinephiles might remember the 1970 film of the same name starring Alan Arkin. And for those who are familiar with the saying, but not its origins, a Catch-22 is essentially a no-win situation. The famous idiom comes from the book — specifically the main character, Captain John Yossarian, a World War II bombardier who finds himself needing to escape the war, but rules and regulations hold him back.

Martin Ruhe (right) on-set with George Clooney.

Now there is yet another Catch-22 to point to: Hulu’s miniseries, which stars Christopher Abbott, Kyle Chandler, Hugh Laurie and George Clooney. Clooney is also an executive producer, alongside Grant Heslov, Luke Davies, David Michôd, Richard Brown, Steve Golin and Ellen Kuras. The series was written by Davies and Michôd and directed by Clooney, Heslov and Kuras, who each directed two episodes. It was shot entirely in Italy.

We recently reached out to the show’s German-born DP, Martin Ruhe, ASC, to find out about his workflow on the series and how he became a cinematographer.

Tell us about Catch-22. How would you describe the look of the film that you and the directors wanted to achieve?
George was very clear — he wanted to push the look of the show toward something we don’t see very often these days in TV or films. He wanted to feel the heat of the Italian summer.

We also wanted to contrast the absurdity of what happens on the ground with the claustrophobia and panic of the aerial work. We ended up with a strong warm tone and a lot of natural light. And we move the camera as if we’re always with our hero (Abbott). Very often we travel with him in fluent camera moves, and then we contrast that with shaky handheld camera work in the air. It was good fun to have such a range to work with.

Were you given examples of the look that was wanted?
We looked at newsreel footage from the period and at stills and benefitted from production designer David Gropman’s research. Then I took stills when we did camera tests with our actors in costume. I worked on those on my computer until we got to a place we all liked.

Stefan Sonnenfeld at Company 3 did the grading for the show, and he loved the look. He gave us a LUT that we used for our dailies. Later, when we did the final grade, we added film grain and refined our look to what it is now.

How early did you get involved in the production?
I spoke with George Clooney and Grant Heslov for the first time four months before we started to shoot. I had eight weeks of prep.

How did you go about choosing the right camera and lenses for this project?
A lot of the scenes were happening in very small spaces. I did a lot of research on smaller cameras, and since we would have a lot of action scenes in those planes, I did not want to use any cameras with a rolling shutter.

I ended up using ARRI Alexa Minis with Cooke S4 lenses and also some Flare cameras by IO Industries, which could record 4K raw to Odyssey7Q recorders. We mounted those little ones on the planes whenever they were flying for real. We also used them for the parachute jump.

This is a period piece. How did that affect your choices?
The main effect was the choice of light sources when we shot interiors and night scenes. I love fluorescents, and they existed in the period, but just not in those camps and not in the streets of Rome at night. We used a lot of practicals and smaller sources, which we spread out in the little streets of a small town where we shot, called Viterbo (standing in for Rome).

Another thing I learned was that in those camps at night, lights were blacked out. That meant we were stuck with moonlight and general ambience for night scenes, which we created with HMI sources — sometimes direct if we needed to cover big areas, like when the air base gets attacked at night in Episode 5.

Any scenes that you are particularly proud of or found most challenging?
At the end of Episode 5, Yossarian’s plane loses both engines in combat and goes down. We see YoYo and the others escape the plane, while the pilot takes the plane over water and tries to land it. It’s a very dramatic scene.

We shot some exteriors of the real B-25 Mitchell over Sardinia. We mounted camera systems in a DC-3 and our second Mitchell to get the shots with the real planes. The damage to the engines and the additional planes were added in post. The interiors with our actors in the plane were shot at Cinecittà Studios in Rome. We had a fuselage of a real B-25 on a gimbal. The studio was equipped with a 360-degree screen and a giant top light.

In the plane, we shot with a hand-held ARRI Alexa Mini camera. It was only the actors, myself and my focus puller inside. We never altered the physical space of the plane but instead embraced the claustrophobia. We see all of the crew members getting out — only the pilot stays on board. There was so little physical space for our actors since the fuselage was rigged to the gimbal, and then we also had to create the lighting for them to jump into within a couple of feet of space.

Then, when Yossarian leaves the plane, we actually put a small camera on a stuntman while another stuntman in Yossarian’s wardrobe did a real jump. We combined that with some plate shots from a helicopter (with a 3D plane added in) and some shots of our actor on a rig on the backlot of Cinecittà.

It all worked out. It was always our goal to shoot as many real elements as we could and leave the rest to post.

Stepping away from Catch-22. How did you become interested in cinematography?
I grew up in a small town in western Germany. No one in my family had anything to do with film. I loved movies and wanted to work on them as a director. After a little journey, I got an internship at a camera rental in London. It was then I saw for the first time what cinematographers do. I loved it and knew that was it. Then I studied in Berlin, became a focus puller for a couple of years and started working as a DP on music videos, then commercials and then, a little later, films.

What inspires you artistically?
Photography and movies. There is a lot of good work out there by a lot of talented DPs. I love looking at photographers I like, as well as documentary stills like the ones you see in the World Press Photo contest each year. I love it when it is real. There are so many images around us every day, but if I don’t believe them (if they don’t seem real to me), they are just annoying.

Looking back over the last few years, what new technology has changed the way you work?
Maybe LED lighting and the high sensitivity of today’s digital cameras. You are so much freer in your choice of locations, days and, especially, night work because you can work with fewer lights.

What are some of your best practices or rules you try to follow on each job?
Keep it as simple as you can, and stay true to your vision.

Explain your ideal collaboration with the director when setting the look of a project.
I’m not sure there is just one way to go. After reading the script, you have an idea of what it can be, and then you start gathering information about where you will shoot and in what frame you will work.

Martin Ruhe behind the ARRI Alexa.

I love to spend time with my directors in prep — going to the locations, seeing them in different light, like mornings, noon or during night. Then I love to work with stills and sometimes also reference pictures to show what I think it can be and present a way we can get there. It’s always very important to leave some space for things to develop.

What’s your go-to gear — things you can’t live without?
I look for the right gear for each project. I like ARRI cameras, but I’ve also shot two movies with Panavision cameras.

I have shot movies in various countries, and the early ones didn’t have big budgets, so I tried to work with local crews and the gear that was available. The thing I like about that is you get to know different ways of doing things, and you might work with gear you would never have picked yourself. It keeps you flexible. When I start a project, I try to develop a feel for the story and the places it lives in. Once I have that feel, I start thinking about how, and decide what tools I’ll use.

Photo Credit: Philippe Antonello


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 


Yoomin Lee joins MPC London as senior colorist

Yoomin Lee has joined Moving Picture Company’s color team in London. Lee got her start working for some of Australia’s top post houses, including Frame Set & Match, The Lab and Cutting Edge, before joining Jogger Studios London in 2016.

While at Jogger, she worked on many campaigns, including those for Google, Valentino, FIFA and Samsung. A collaboration with director Anton Corbijn has seen her grade projects for Depeche Mode and U2, including the visuals for the latter’s The Joshua Tree Tour in 2017, which played across the world’s largest concert screen.

When asked what brings her inspiration, Lee says, “I get inspired by any visual art form, and often from nature, especially for light. I become more observant of how things are lit. Color grading is such a unique art form and technology, and it’s all about details and finesse. I find it very inspiring when I collaborate with creative people who are always eager to push the boundaries to achieve their craft.”

Lee will be working on FilmLight’s Baselight.

You can check out her work here.


Amazon’s Sneaky Pete: DP Arthur Albert on the look of Season 3

By Karen Moltenbrey

Crime has a way of finding Pete Murphy, or should we say Marius Josipovic (Giovanni Ribisi). Marius is a con man who assumed his cellmate’s identity when he was paroled from prison. His plan was twofold: first, pretend to be the still-incarcerated Pete, from whom the family has been estranged for the past 20 years, and hide out on their farm in Connecticut. Second, con the family out of money so he can pay back a brutal mobster (Bryan Cranston, who also produces).

Arthur Albert

Marius’s plan, however, is flawed. The family is lovable, quirky and broke. Furthermore, they are in the bail bond business and one of his “cousins” is a police officer — not ideal for a criminal. Ultimately, Marius starts to really care for the family while also discovering that his cover is not that safe.

Similar to how Marius’s plans on Sneaky Pete have changed, so has the show’s production on the current and final Season 3, which is streaming on Amazon now. This season, the story shifts from New York to California, in tandem with the storylines. Blake Masters also took over as showrunner, and cinematographer Arthur Albert (ER, The Blacklist, Breaking Bad, Better Call Saul) came on as director of photography, infusing his own aesthetic into the series.

“I asked Blake if he wanted me to maintain the look they had used previously, and he said he wanted to put his own stamp on it and raise the bar in every department. So, I had free rein to change the look,” notes Albert.

The initial look established for Sneaky Pete had a naturalistic feel, and the family’s bail office was lit with fluorescent lighting. Albert, in contrast, opted for a more cinematic look with portrait-style lighting. “It’s just an aesthetic choice,” he says. “The sets, designed by (Jonathan) Carlson, are absolutely brilliant, and I tried to keep them as rich and layered as possible.”

For Manhattan scenes, Masters wanted a mid-century, modern look. “I made New York moody and as interesting as I could — cooler, more contrasty,” says Albert. When the story shifts to Southern California, Masters asked for a bright, more vibrant look. “There’s a big location change. For this season, you want to feel that change. It’s a big decision for the whole family to pick up their operation and move it, so I wanted the overall look of the show to feel new and different.”

The edginess and feeling of danger, though, comes less from the lighting in this show and more from the camera movement. The use of Steadicam gives it a bit of a stalking feel, serving as a moving viewpoint.

When Albert first met with Masters, they discussed what they thought worked in previous episodes. They liked the ones that used handheld and close-up shots that were wide and close to the actor, but in the end they went with a more traditional approach used by Jon Avnet, who directed four of the 10 episodes this season.

Season 3 was primarily shot with two cameras (Albert’s son, Nick, served as second-unit DP and A-camera operator, and Jordan Keslow was the B-camera/Steadicam operator). A fan of Red cameras (Albert used an early incarnation for the last six episodes of ER), he employed Red’s DSMC2 with the new Gemini 5K S35 sensor for Season 3. The Gemini leverages dual sensitivity modes to provide greater flexibility for a variety of shooting environments.

The DP also likes the way it renders skin tones without requiring diffusion. “The color is really true and good, and the dynamic range is great. It held for really bright window areas and really dark areas, both with amazing range,” he says. The interiors of the sets were filmed on a stage in Los Angeles, and the exteriors were shot on location afterward. With the Gemini’s two settings (standard mode for well-lit conditions and a low-light setting), “You can shoot a room where you can barely see anyone, and it looks fully lit, or if it’s a night exterior where you don’t have enough time, money or space to light it, or in a big set space where suddenly you want to shoot high speed and you need more light. You just flip a switch, and you’ve got it. It was very clean with no noise.”

This capability came in handy for a shoot in Central Park at night. The area was heavily restricted in terms of using lights. Albert used the 3200 ISO setting and the entire skyline of 59th Street was visible — the clouds and how they reflected the light of the buildings, the detail of the night sky, the silhouettes of the buildings. In another similar situation, he used the low-light setting of the camera for a night sequence filmed in Grand Central Terminal. “It looked great, warm and beautiful; there is no way we could have lit that vast space at night to accommodate a standard ISO,” says Albert.

As far as lenses on Sneaky Pete, they used the Angenieux short zooms because they are lightweight and compact, can be put on a Steadicam and are easy to hold. “And I like the way they look,” Albert says. He also used the new Sigma prime lenses, especially when an extreme wide angle was needed, and was impressed with their sharpness and lack of distortion.

Throughout filming, the cinematographer relied on Red’s IPP2 (image processing pipeline) in-camera, which made for a more effective post process, as it is designed for the HDR workflow that Amazon requires for Sneaky Pete.

The color grade for the series was done at Level 3 Post by Scott Ostrowsky, who had also handled all the previous seasons of Sneaky Pete and with whom Albert had worked on The Night Shift and other projects. “He shoots a very cinematic look and negative. I know his style and was able to give him that look before he came into the suite. And when we did the reviews together, it was smooth and fast,” Ostrowsky says. “At times Sneaky Pete has a very moody look, and at times it has a very open look, depending on the environment we were shooting in. Some of the dramatic scenes are moody and low-light. Imagine an old film noir movie, only with color. It’s that kind of feel, where you can see through the shadows. It’s kind of inky and adds suspense and anticipation.”

Ostrowsky worked with the camera’s original negative — “we never created a separate stream,” he notes. “It was always from the camera neg, unless we had to send a shot out for a visual effects treatment.”

Sneaky Pete was shot in 5K, from which a 3840×2160 UHD image was extracted, and that is what Ostrowsky color graded. “So, if I needed to use some kind of window or key, it was all there for me,” he says. Arthur or Nick Albert would then watch the second pass with Ostrowsky, who would make any further changes, and then the producers would watch it, adding their notes. Ostrowsky worked in Blackmagic DaVinci Resolve.

“I want to make the color work for the show. I don’t want the color to distract from the show. The color should tell the story and help the story,” adds Ostrowsky.

While not every change has been for the best for Pete himself since Season 1, the production changes on Sneaky Pete’s last season appear to be working just fine.


Karen Moltenbrey is a veteran VFX and post writer.


Lenovo intros next-gen ThinkPads

Lenovo has launched the next generation of its ThinkPad P Series with the release of five new ThinkPads, including the ThinkPad P73, ThinkPad P53, ThinkPad P1 Gen 2 and ThinkPad P53s and P43s.

The ThinkPad P53 features the Nvidia Quadro RTX 5000 GPU with RT and Tensor cores, offering realtime raytracing and AI acceleration. It now features Intel Xeon and 9th Gen Core-class CPUs with up to eight cores (including the Core i9), up to 128GB of memory and 6TB of storage.

This mobile workstation also boasts a new OLED touch display with Dolby Vision HDR for superb color and some of the deepest black levels ever. Building on the innovation behind the ThinkPad P1 power supply, Lenovo is also maximizing the portability of this workstation with a 35 percent smaller power supply. The ThinkPad P53 is designed to handle everything from augmented reality and VR content creation to the deployment of mobile AI or ISV workflows. The ThinkPad P53 will be available in July, starting at $1,799.

At 3.74 pounds and 17.2mm thin, Lenovo’s thinnest and lightest 15-inch workstation — the ThinkPad P1 Gen 2 — includes the latest Nvidia Quadro Turing T1000 and T2000 GPUs. The ThinkPad P1 also features eight-core Intel 9th Gen Xeon and Core CPUs and an OLED touch display with Dolby Vision HDR.

The ThinkPad P1 Gen 2 will be available at the end of June starting at $1,949.

With its 17.3-inch Dolby Vision 4K UHD screen and a 35% smaller power adapter, Lenovo’s ThinkPad P73 offers users maximum workspace and mobility. Like the ThinkPad P53, it features Intel Xeon and Core processors and the most powerful Nvidia Quadro RTX graphics. The ThinkPad P73 will be available in August starting at $1,849.

The ThinkPad P43s features a 14-inch chassis and will be available in July starting at $1,499.

Rounding out the line is the ThinkPad P53s, which combines the latest Nvidia Quadro graphics and Intel Core processors — all in a thin and light chassis. The ThinkPad P53s will be available in June, starting at $1,499.

For the first time, Lenovo is adding new X-Rite Pantone Factory Color Calibration to the ThinkPad P1 Gen 2, ThinkPad P53 and ThinkPad P73. The unique factory color calibration profile is stored in the cloud to ensure more accurate recalibration. This profile allows for dynamic switching between color spaces, including sRGB, Adobe RGB and DCI-P3 to ensure accurate ISV application performance.
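For background on that color-space switching: converting between spaces such as sRGB and Display P3 amounts to a pair of 3x3 matrix transforms through the CIE XYZ connection space. Below is a minimal illustrative sketch, not X-Rite's or Lenovo's actual implementation; the matrices are the published D65 sRGB and Display P3 primaries, and gamma encoding/decoding is omitted for brevity.

```python
import numpy as np

# Linear-light RGB -> CIE XYZ matrices (D65 white point).
SRGB_TO_XYZ = np.array([
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
])
P3_TO_XYZ = np.array([
    [0.4866, 0.2657, 0.1982],
    [0.2290, 0.6917, 0.0793],
    [0.0000, 0.0451, 1.0439],
])

def srgb_to_display_p3(rgb_linear):
    """Convert a linear-light sRGB triple to linear Display P3."""
    xyz = SRGB_TO_XYZ @ np.asarray(rgb_linear, dtype=float)
    return np.linalg.inv(P3_TO_XYZ) @ xyz

# White maps to white, since both spaces share the D65 white point.
print(np.round(srgb_to_display_p3([1.0, 1.0, 1.0]), 3))
```

A real calibration profile layers a per-display correction (measured by the probe) on top of transforms like these.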

The entire ThinkPad portfolio is also equipped with advanced ThinkShield security features – from ThinkShutter to privacy screens to a self-healing BIOS that recovers when attacked or corrupted – to help protect users from every angle and give them the freedom to innovate fearlessly.


Rocketman director Dexter Fletcher on Elton John musical

By Iain Blair

The past year has been huge for British director Dexter Fletcher. He was instrumental in getting Bohemian Rhapsody across the finish line when he was brought in to direct the latter part of the production after Bryan Singer was fired. The result? A $903 million global smash that Hollywood never saw coming.

L-R: Dexter Fletcher and Iain Blair

Now he’s back with Rocketman, another film about another legendary performer and musician, Elton John. But while the Freddie Mercury film was a more conventional, family-friendly biopic that opted for a PG-13 rating and sidestepped a lot of the darker elements of the singer’s life, Rocketman fully embraces its R rating and dives headfirst into the sex, drugs and rock ‘n’ roll circus that was Elton’s life at the time. Hardly surprising, then, that the gay sex scenes have already been censored in Russia.

Conceived as an epic musical fantasy about Elton’s breakthrough years, the film follows the transformation of shy piano prodigy Reginald Dwight into international superstar Elton John. It’s set to Elton’s most beloved songs — performed by star Taron Egerton — and tells the story of how a small-town boy became one of the most iconic figures in pop culture.

Rocketman also stars Jamie Bell as Elton’s longtime lyricist and writing partner, Bernie Taupin; Richard Madden as Elton’s first manager John Reid; and Bryce Dallas Howard as Elton’s mother Sheila Farebrother.

Fletcher started as a child actor, appearing in such films as Bugsy Malone, The Elephant Man and The Bounty before graduating to adult roles in film (Lock, Stock and Two Smoking Barrels) and TV (Band of Brothers). He made his directing debut with 2012’s Wild Bill and has since made a diverse slate of films, including Eddie the Eagle.

I recently met up with him to talk about making the film and his workflow.

When you took this on, were you worried you’d now be seen as the go-to director for films about gay British glam rockers?
(Laughs) No, and I never set out to create this specific cinematic universe about gay British glam rockers. I don’t know how many more of them there are left that people want to see films about — maybe Marc Bolan. That would be the next obvious one.

Dexter Fletcher and Taron Egerton on the set of Rocketman.

What happened was that I was attached early on to direct Bohemian Rhapsody, but then Rocketman came up. While I was preparing that, Bohemian Rhapsody folks came back (after director Bryan Singer was fired during the shoot) and said they needed some help to get it done, so it was more of a coincidence, and Elton’s music is so different from Queen’s.

What sort of film did you set out to make?
Definitely an epic musical, and something that was different and imaginative. The first idea that came up was “based on a true fantasy,” and having done biopics before, like Eddie the Eagle, I know you can’t do them accurately. You simply can’t fit a life into two hours, and there’s always people who nitpick it and go, “He’s wearing the wrong shoes, and it missed this bit” and so on. A biopic isn’t a documentary, and you have to breathe creative life into it. The truth/fantasy element of this was far more important to me in telling Elton’s story than doing a by-the-numbers recreation of his career and life.

Casting was obviously crucial. What did Taron bring to the mix?
A great voice, a great performance, and Elton encouraged him to really make the performance his own. He kept saying, “Do your own thing, don’t just copy me. If they wanted just a copy of my music, they can just buy it. So make it original.”

Taron has this great innate ability of being vulnerable and confident at the same time, and that’s a great gift. He’s able to be very driven and focused and yet retain that inner vulnerability and able to show you both sides clashing, which I felt was very true of Elton.

So Elton gave you a pretty free hand?
Yeah, he did. He sat us down right at the start and said to Taron, “Don’t do an impression of me. No one wants that. Do you, honestly, and that’s what will convince an audience and draw them in.” He was right, and he was extremely giving and generous with us. He’s had this incredible life, and he’s putting it all out there on the big screen, which is pretty brave.

In Bohemian Rhapsody, Rami lipsynced, partly because Freddie Mercury famously had this amazing range that’s so hard to replicate. Taron sang all his vocals?
He did, every note. It was a prerequisite for the role, and he wanted to anyway. A musical is very different from a biopic.

You took an imaginative visual approach to the story, especially with all the fantasy elements. How did you collaborate on the look with DP George Richmond, who has shot all your films and whose credits include Tomb Raider and the Kingsman franchise?
He’s got a great eye, and we looked at a lot of the great ‘70s musicals, like The Rose, All That Jazz and Stardust, and there’s this dusty, raw, grainy quality to them. I really wanted to get that feel and texture as this is a period piece and I didn’t want it to look too modern.

So we studied the camera moves and lighting, and it was the same with all the costumes by Julian Day. Elton’s outrageous outfits were the starting point, but we pushed it a little further. So it’s about emotion and how you felt more than just recreating a look or costume or a chronological retelling, and I elaborated as much as possible.

How tough was the shoot?
Fairly grueling. We shot mainly at Bray Studios, outside London, where they shot all the old Hammer Film Productions horror films, and then did some stuff in London.

Where did you post?
We did it at Hireworks in London, which is above this beautiful old cinema, right in the heart of London. It was really conducive for the editing and what we were creating. Then we did all the sound and the mixing round the corner at Goldcrest.

Do you like the post process?
I love it, and I love it more and more now. It’s this whole side of filmmaking that I never really appreciated as an actor, as I didn’t have much to do with it. But now I can’t wait to get in the edit, and I’m there every day, and I’m very involved with every aspect of post.

Talk about editing with Chris Dickens. What were the big editing challenges?
I’d never worked with him before, though we’d met for another project. He was on the set with us at Bray where he had edit rooms set up, so that was very convenient. He’d start assembling and I could just walk over and check on it all. We had a rough assembly just two weeks after we finished shooting, and as I said, I’m very involved.

We’d discuss stuff, especially all the big musical numbers. Those were the big challenges, as you have to make the music and visuals all work together, and you’re working to a playback track. You’re also very exposed when it comes to changing the tempo or rhythm of a scene. You can’t just cut a few words, as you’re locked into the track. And everyone knows the songs. Songs like “Your Song” and “Don’t Let The Sun Go Down On Me” were done totally live, so it was pretty complicated. But Chris is very experienced with all that, as he cut Slumdog Millionaire. That end musical number there is a total triumph of editing.

Talk about the importance of sound and music. I assume Elton was quite involved and listened to everything?
He was very interested, of course, but he let us all do our own thing. Giles Martin, son of Beatles’ producer George Martin, was in charge of the music. He’s an amazing producer in his own right, and he also has a long history with Elton, who stayed at his house when Giles was young. So there’s a very strong relationship there, and that was key in terms of dealing with Elton’s music legacy.

Giles was the custodian of all that and was instrumental in re-imagining all of Elton’s songs in the most interesting ways, and Elton would listen to them and love it. For instance, at first we thought “Crocodile Rock” wouldn’t fit, but we re-did it in this elevated, really hard rock way, and it worked out so well. We recorded at various places, including Abbey Road and Air Studios and, of course, we spent a lot of time and detail on all the music and sound. Just like with ‘Bohemian Rhapsody,’ if you don’t get that right, the film’s not going to work.

Visual effects play a big role. What was involved?
Cinesite in Montreal did them all, with a few by Atomic Arts, and we had a lot of artists and VFX guys working on them. We had so many, from fireworks to scenes with people floating and all the crowd scenes at stadiums, where we had to recreate fans in the ‘70s. So some of that stuff was fairly standard, but they did a brilliant job. I quite like working with VFX, as it allows you to do so much more with a period piece like this.

Where did you do the DI, and how important is it to you?
We did it at Goldcrest with colorist Rob Pizzey, and it’s very important to myself and my DP George Richmond. As I said, we wanted to get a very particular look and texture to it, and George and I worked closely on that from the very outset. Then in the DI, Rob and George worked very closely on it. In fact, George was shooting in Boston at the time, but he flew back to do it and finalize the look. I’m learning so much from him about the DI, and I give my notes and they make all the changes and adjustments. I love the DI.

Did the film turn out the way you hoped?
Absolutely. I’m really happy with it. In fact, it’s turned out better than I hoped.

What’s next?
I don’t have anything lined up. I’m out of work!


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.


Blackmagic intros Teranex Mini SDI to DisplayPort 8K HDR monitor

Blackmagic’s Teranex Mini SDI to DisplayPort 8K HDR is an advanced 8K monitoring solution that lets you use the new Apple Pro Display XDR as a color-critical reference monitor on set and in post.

With dual on-screen scope overlays, HDR, 33-point 3D LUTs and monitor calibration that’s designed for the pro film and television market, the new Teranex Mini SDI to DisplayPort 8K HDR works with the new generation of monitors, like Apple’s just-announced Pro Display XDR. The Teranex Mini SDI to DisplayPort 8K HDR will be available in October for $1,295.

The Teranex Mini SDI to DisplayPort 8K HDR can use third-party calibration probes to accurately align connected displays for precise color. There are two on-screen scopes, each selectable between WFM, Parade, Vector and Histogram.

The front panel includes controls and a color display for input video, audio meters and the video standard indicator. The rear panel has Quad Link 12G-SDI for HD, Ultra HD and 8K formats. There are two DisplayPort connections for regular computer monitors or USB-C-style DisplayPort monitors, such as the Pro Display XDR. The built-in scaler will ensure the video input standard is scaled to the native resolution of the connected DisplayPort monitor. Customers can even connect both 2SI or Square Division inputs.

Teranex Mini SDI to DisplayPort 8K HDR makes it easy to work in 8K. Users need only connect an HDR-compatible DisplayPort monitor to enable HDR SDI monitoring. Static metadata PQ and Hybrid Log Gamma (HLG) formats in the VPID are handled according to the ST2108-1, ST2084 and ST425 standards.

Teranex Mini SDI to DisplayPort 8K HDR handles ST425, which defines two new bits in the VPID to indicate a transfer characteristic of SDR, HLG or PQ, while the ST2108-1 standard defines how to transport HDR static or dynamic metadata over SDI. There is also support for ST2082-10 for 12G-SDI as well as ST425 for 3G-SDI sources, and the unit supports both Rec.2020 and Rec.709 colorspaces and 100% of the DCI-P3 format.
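For context, the PQ (ST 2084) transfer function mentioned above maps a normalized signal value to absolute display luminance up to 10,000 nits. A minimal sketch of the PQ EOTF using the constants from the standard (illustrative only, not Blackmagic's code):

```python
# PQ (SMPTE ST 2084) EOTF constants, as defined in the standard.
M1 = 2610 / 16384          # ~0.1593
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    """Map a normalized PQ signal in [0, 1] to display luminance in cd/m^2."""
    p = signal ** (1 / M2)
    num = max(p - C1, 0.0)
    den = C2 - C3 * p
    return 10000.0 * (num / den) ** (1 / M1)

print(pq_eotf(1.0))  # full code value -> 10,000 nits peak
print(pq_eotf(0.0))  # -> 0 nits
```

Unlike PQ's absolute mapping, HLG is scene-referred and relative to the display's peak brightness, which is why the two need to be signaled differently in the VPID.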

Features include:
• Support for HDR via SDI and DisplayPort
• Two built-in scopes live overlaid on the monitor
• Film industry quality 33-point 3D LUTs
• Automatic monitor calibration support using color probes
• Advanced Quad Link 12G-SDI inputs for 8K
• Scales input video to the native monitor resolution
• Includes LCD for monitoring and menu settings
• Utility software included for Mac and Windows
• Supports latest 8K DisplayPort monitors and displays
• Can be used on a desktop or rack mounted
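For readers unfamiliar with the format, a 33-point 3D LUT stores one output color per node of a 33x33x33 grid and interpolates between nodes for every color in between. A rough sketch of the trilinear lookup, illustrating the general technique rather than Blackmagic's implementation:

```python
import numpy as np

def apply_3d_lut(rgb, lut):
    """Trilinearly interpolate an RGB triple (values in [0, 1]) through an
    N x N x N x 3 LUT -- the structure a 33-point LUT uses."""
    n = lut.shape[0]
    pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * (n - 1)
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    f = pos - lo  # fractional position inside the grid cell
    out = np.zeros(3)
    # Blend the 8 lattice points surrounding the input color.
    for corner in range(8):
        idx = [(hi if corner >> axis & 1 else lo)[axis] for axis in range(3)]
        w = 1.0
        for axis in range(3):
            w *= f[axis] if corner >> axis & 1 else 1.0 - f[axis]
        out += w * lut[idx[0], idx[1], idx[2]]
    return out

# Identity 33-point LUT: each node stores its own grid coordinate.
n = 33
grid = np.linspace(0.0, 1.0, n)
identity = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
print(apply_3d_lut([0.25, 0.5, 0.75], identity))  # returns ~the input color
```

A creative look is just a LUT whose nodes store graded colors instead of the identity.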

Whiskey Cavalier DPs weigh in on the show’s look, DITs

While ABC recently cancelled the freshman series Whiskey Cavalier, its on-set workflow is an interesting story to tell. The will-they-won’t-they drama featured FBI agent Will Chase (Scott Foley) and CIA operative Frankie Trowbridge (Lauren Cohan) — his codename is Whiskey Cavalier and hers is Fiery Tribune. The two lead an inter-agency team of spies who travel all over the world, periodically saving the world and each other, all while navigating friendship, romance and office politics.

David “Moxy” Moxness

Like many episodic television shows, Whiskey Cavalier used two cinematographers who alternated episodes so that the directors could work side-by-side with a cinematographer while prepping. David “Moxy” Moxness, CSC, ASC, shot the pilot. Moxness had previously worked on shows like Lethal Weapon, Fringe and Smallville and was just finishing another show when Warner Bros. sent him the pilot script.

“I liked it and took a meeting with director Peter Atencio,” explains Moxness. “We had a great meeting and seemed to be on the same page creatively. For me, it’s so much about collaborating on good shows with great people. Whiskey gave me that feeling.” Sid Sidell, ASC, a friend and colleague of Moxness’, was brought on as the second DP.

While Whiskey Cavalier’s plot has its two main characters traveling all over the world, principal photography took place in Prague. Neither cinematographer had worked there previously, although Moxness had passed through on vacation years before. While prepping and shooting the pilot, Moxness developed the look of the show with director Atencio. “Peter and I had the idea of using the color red when our lead character Will Chase was conflicted emotionally to trigger an emotional response for him,” he explains. “This was a combo platter of set dressing, costumes and lighting. We were very precise about not having the color red in frame other than these times. Also, when the team was on a mission, we kept to a cooler palette while their home base, New York, used warmer tones.”

This didn’t always prove to be straightforward. “You still have to adjust to location surroundings — when scouting for the pilot, I realized Prague still had mostly sodium vapor streetlights, which are not often seen in America anymore,” explains Moxness. “This color was completely opposite to what Peter and I had discussed regarding our nighttime palette, and we had a big car chase over a few nights and in different areas. I knew time and resources would in no way allow us to change or adjust this, and that I would have to work backwards from the existing tones. Peter agreed and we reworked that into our game. For our flashbacks, I shot 35mm 4-perf film with an ARRI IIC hand-cranked camera and Kowa lenses. That was fun! We continued all of these techniques and looks during the series.”

DITs
Mission, a UK-based DIT/digital services provider serving Europe, was brought on to work beside the cinematographers. Mission has an ever-expanding roster of DITs and digital dailies lab operators and works with cinematographers from preproduction onward, safeguarding their color decisions as a project moves from production into post.

Moxness and Sidell hadn’t worked with Mission before, but a colleague of Moxness’ had told him about his experience working with Mission on a project the year before. Intrigued, Moxness had been waiting for a chance to work with them.

“When Whiskey chose to shoot in Prague I immediately reached out to Mission’s managing director, Mark Purvis,” explains Moxness. “Mark was enthusiastic about setting us up on Whiskey. After a few conversations to get to know each other, Mark suggested DIT Nick Everett. Nick couldn’t have been a better match for me and our show.”

Interestingly, Sidell had often worked without a DIT before his time on Whiskey Cavalier. He says, “My thoughts on the DP/DIT relationship changed drastically on Whiskey Cavalier. By choice, before Whiskey, I did the majority of my work without a DIT. The opportunity to work alongside Nick Everett and his Mission system changed my view of the creative possibilities of working with a DIT.”

Gear
Whiskey Cavalier was shot with the ARRI Alexa Mini and primarily ARRI Master Prime lenses with a few Angenieux zooms. Both Moxness and Sidell had worked with the Mini numerous times before, finding it ideal for episodic television. The post workflow was simple. On set, Everett used Pomfort’s LiveGrade to set the look desired by the cinematographers. Final color was done at Picture Shop in Los Angeles by senior colorist George Manno.
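On-set look tools like LiveGrade commonly express looks as ASC CDL values, a standardized slope/offset/power transform plus saturation that downstream grading systems can reproduce. A minimal sketch of the CDL math (illustrative only; the numbers below are made up, not from the show):

```python
def apply_cdl(rgb, slope, offset, power, sat=1.0):
    """Apply ASC CDL: out = (in * slope + offset) ** power per channel,
    followed by a saturation adjustment around Rec.709 luma weights."""
    graded = []
    for v, s, o, p in zip(rgb, slope, offset, power):
        x = v * s + o
        graded.append(max(x, 0.0) ** p)  # clamp negatives before the power
    # Saturation step, per the ASC CDL definition.
    luma = 0.2126 * graded[0] + 0.7152 * graded[1] + 0.0722 * graded[2]
    return [luma + sat * (c - luma) for c in graded]

# Neutral CDL values leave the image untouched.
print(apply_cdl([0.4, 0.5, 0.6], slope=[1, 1, 1], offset=[0, 0, 0], power=[1, 1, 1]))
```

Because the parameters are just ten numbers, a DIT can hand the exact on-set look to a remote colorist without shipping any image data.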

Moxy (behind camera) and director/EP Peter Atencio (to his right) on the Prague set.

“There are a few inherent factors shooting episodic television that can, and often do, handcuff the DP with regards to maintaining their intended look,” says Moxness. “The shooting pace is very fast, and it is not uncommon for editorial, final color and sometimes even dailies to happen far away from the shooting location. Working with a properly trained and knowledgeable DIT allows the DP to create a desired look and get it into and down the post pipeline to maintain that look. Without a proper solid roadmap, others start to input their subjective vision, which likely doesn’t match that of the DP. When shooting, I feel a strong responsibility to put my thumbprint on the work as I was hired to do. If not, then why was I chosen over others?”

Since successfully working on Whiskey Cavalier in Prague, Mission has set up a local office in Prague, led by Mirek Sochor and dedicated to Mission’s expansion into Central Europe.

And Moxness will be heading back to Prague to shoot Amazon’s The Wheel of Time.

 

Behind the Title: Ntropic Flame artist Amanda Amalfi

NAME: Amanda Amalfi

COMPANY: Ntropic (@ntropic)

CAN YOU DESCRIBE YOUR COMPANY?
Ntropic is a content creator producing work for commercials, music videos and feature films as well as crafting experiential and interactive VR and AR media. We have offices in San Francisco, Los Angeles, New York City and London. Some of the services we provide include design, VFX, animation, editing, color grading and finishing.

WHAT’S YOUR JOB TITLE?
Senior Flame Artist

WHAT DOES THAT ENTAIL?
Being a senior Flame artist involves a variety of tasks that really span the duration of a project. From communicating with directors, agencies and production teams to helping plan out any visual effects that might be in a project (also being a VFX supervisor on set) to the actual post process of the job.

Amanda worked on this lipstick branding video for the makeup brand Morphe.

It involves client and team management (as you are often also the 2D lead on a project) and calls for a thorough working knowledge of the Flame itself, both in timeline management and that little thing called compositing. The compositing could cross multiple disciplines — greenscreen keying, 3D compositing, set extension and beauty cleanup to name a few. And it helps greatly to have a good eye for color and to be extremely detail-oriented.

WHAT MIGHT SURPRISE PEOPLE ABOUT YOUR ROLE?
How much it entails. Since this is usually a position that exists in a commercial house, we don’t have as many specialties as there would be in the film world.

WHAT’S YOUR FAVORITE PART OF THE JOB?
First is the artwork. I like that we get to work intimately with the client in the room to set looks. It’s often a very challenging position to be in — having to create something immediately — but the challenge is something that can be very fun and rewarding. Second, I enjoy being the overarching VFX eye on the project; being involved from the outset and seeing the project through to delivery.

WHAT’S YOUR LEAST FAVORITE?
We’re often meeting tight deadlines, so the hours can be unpredictable. But the best work happens when the project team and clients are all in it together until the last minute.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
The evening. I’ve never been a morning person so I generally like the time right before we leave for the day, when most of the office is wrapping up and it gets a bit quieter.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Probably a tactile art form. Sometimes I have the urge to create something that is tangible, not viewed through an electronic device — a painting or a ceramic vase, something like that.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I loved films that were animated and/or used 3D elements growing up and wanted to know how they were made. So I decided to go to a college that had a computer art program with connections in the industry and was able to get my first job as a Flame assistant in between my junior and senior years of college.

ANA Airlines

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Most recently I worked on a campaign for ANA Airlines. It was a fun, creative challenge on set and in post production. Before that I worked on a very interesting project for Facebook’s F8 conference featuring its AR functionality and helped create a lipstick branding video for the makeup brand Morphe.

IS THERE A PROJECT THAT YOU ARE MOST PROUD OF?
I worked on a spot for Vaseline that was a “through the ages” concept, and we had to create looks that would read as the 1880s, 1900, 1940s, 1970s and present day, in locations that varied from the Arctic to the building of the Brooklyn Bridge to a boxing ring. To start, we sent the digitally shot footage with our 3D and comps to a printing house and had it printed and re-digitized. This worked perfectly for the ’70s-era look. Then we did additional work to age it further for the other eras — though my favorite was the Arctic turn-of-the-century look.

NAME SOME TECHNOLOGY YOU CAN’T LIVE WITHOUT.
Flame… first and foremost. It really is the most inclusive software — I can grade, track, comp, paint and deliver all in one program. My monitors, a 4K Eizo and a color-calibrated broadcast monitor, are also essential.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Mostly Instagram.

DO YOU LISTEN TO MUSIC WHILE YOU WORK? 
I generally have music on with clients, so I will put on some relaxing music. If I’m not with clients, I listen to podcasts. I love How Did This Get Made and Conan O’Brien Needs a Friend.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Hiking and cooking are two great de-stressors for me. I love being in nature and working out and then going home and making a delicious meal.

Behind the Title: Editor and colorist Grace Novak

One of her favorite parts of the job is when she encounters a hard edit and it finally clicks and falls into place.

NAME: New York-based Grace Novak

WHAT’S YOUR JOB TITLE?
Editor and Colorist

WHAT DOES THAT ENTAIL?
I work with directors/clients to make their project come to life using an editing program. Then during the color process, I bring it even closer to their aesthetic vision.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
It can include a lot of not-so-creative work like troubleshooting and solving technical problems, especially when doing assistant color/edit work either for myself or for someone else.

WHAT’S YOUR FAVORITE PART OF THE JOB?
I love the great moment when you push through a hard edit and it finally clicks. I also love getting to collaborate with other great creators and filmmakers and working one-on-one in the editing room. I find it to be a great learning experience.

WHAT’S YOUR LEAST FAVORITE?
When nothing works and I don’t know why. But, luckily, once I figure it out (eventually, hours later sometimes) I’ve learned to solve a new issue.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
Definitely the mornings once I’ve had some coffee. I’m a morning person who is most active around the hours of 8-11. Once lunch hits, it can be hard not to want to take a good midday nap.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
When I was younger, for some reason, I told everyone I wanted to be a barber. I think that’s because I liked using scissors. Seriously, though, I’d probably be working with kids in some way or as an educator. I still hope to teach down the road.

WHY DID YOU CHOOSE THIS PROFESSION?
I knew I wanted a job where I could be creative, and with editing I can also be technically proficient. I love the combination of the two.

Dissonance

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I always knew I wanted to be involved with film, probably since I was 12. I remember starting to edit on Windows Movie Maker and being enamored with the effects. I especially liked the really awful and gaudy one that went through a gradient of colors. Don’t worry, I would never use something like that now.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
I’m working on a lot of short indie films right now including Dissonance, Bogalusa and Siren. I’m also an assistant editor on the feature film The Outside Story.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
Dissonance, a short experimental film that I’m currently coloring, is probably the project I’m most proud of, purely because of how far it pushed me as an artist, editor and collaborator.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
I follow a lot, but in the post world that includes postPerspective, BCPC and Jonny Elwyn.

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
If I can, I like to listen to podcasts. That’s probably my primary podcast listening time besides at the gym. Obviously, I can only do this during my color work. For music, I like tunes that are relaxing and not too upbeat. For podcasts, I like to listen to comedians or shows like Reply All, Blank Check and Reveal.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I like to read and play video games. I also started to do cross-stitch recently and it’s nice to find a way to use my hands that doesn’t involve a computer or a controller. I make sure to exercise a lot as well because I find that helps my stress levels like nothing else can.

Picture Shop acquires Vancouver-based Finalé

Picture Shop has acquired Finalé Post in Vancouver. Burbank-based Picture Shop, which provides finishing and VFX work for episodic television, including The Walking Dead, NCIS, Hawaii Five-0 and Chilling Adventures of Sabrina, had been looking to make an expansion into the Vancouver market. The company will be branded Finalé, a Picture Shop company.

“Having a Vancouver-based location has always been a strategy of ours, but it was very important to find the right company,” says Picture Shop president Bill Romeo. “We are thrilled to incorporate Finalé into the Picture Shop family. With the amount of content being produced, our goal is to always have strategic locations that support our clients’ needs but still maintain our company’s philosophy — creating an experience with the highest level of service and a creative partnership with our clients.”

Launched in 1988 by Finalé CEO and industry veteran Don Thompson, Finalé is located in the center of Vancouver and has served most major studios. Finalé offers a host of post production services, ranging from digital dailies and color, through 4K HDR finishing and editorial. It also offers mobile dailies and editorial rentals in Toronto and other major Canadian production centers. Finalé’s credits include Descendants 3, iZombie, Tomorrowland and The Magicians.

Main Image: (L-R) Picture Shop’s Tom Kendall and Robert Glass, Finalé’s Don Thompson, Picture Shop’s Bill Romeo and Finalé’s Andrew Jha.

 

Review: CyberPower PC workstation with AMD Ryzen

By Brady Betzel

With the influx of end users searching for alternatives to Mac Pros, as well as new ways to purchase workstation-level computing solutions, there is no shortage of opinions on what brands to buy and who might build it. Everyone has a cousin or neighbor that builds systems, right?

I’ve often heard people say, “I’ve never built a system or used (insert brand name here), but I know they aren’t good.” We’ve all run into people who are dubious by nature. I’m not so cynical, and when it comes to operating systems and computer brands, I consider myself Switzerland.

When looking for the right computer system, the main question you should ask is, “What do you need to accomplish?” Followed by, “What might you want to accomplish in the future?” I’m a video editor and colorist, so I need the system I build to work fluidly with Avid Media Composer, Blackmagic DaVinci Resolve and Adobe’s Premiere and After Effects. I also want my system to work with Maxon Cinema 4D in case I want to go a little further than Video Copilot’s Element 3D and start modeling in Cinema 4D. My main focus is video editing and color correction but I also need flexibility for other tools.

Lately, I’ve been reaching out to companies in the hopes of testing as many custom-built Windows-based PCs as possible. There have been many Mac OS-to-Windows transplants over the past few years, so I know pros are eager for options. One of the latest seismic shifts has come from the guys over at Greyscalegorilla moving away from Macs to PCs. In particular, I saw that one of the main head honchos over there, Nick Campbell (@nickvegas), went for a build complete with the 32-core Ryzen Threadripper workhorse. You can see the lineup of systems here. This really made me reassess my thoughts on AMD as a workstation-level processor, and while not everyone can afford the latest Intel i9 or AMD Threadripper processors, there are lower-end processors that will serve most people just fine. This is where custom-built PC makers like CyberPowerPC, which equip machines with AMD processors, come into play.

So why go with a company like CyberPowerPC? The prices for parts are usually competitive, and the entire build doesn’t cost much more than if you purchased the parts individually. Also, you deal with CyberPowerPC for warranty issues rather than with individual companies for different parts.

My Custom Build
In my testing of an AMD Ryzen 7 1700X-based system with a Samsung NVMe hard drive and 16GB of RAM, I was able to run all of the software I mentioned before. The best part was the price; the total was around $1,000! Not bad for someone editing and color correcting. Typically, those machines can run anywhere from $2,000 to $10,000. Although the parts in those more expensive systems are more complex and have double to triple the number of cores, some of that power is wasted. When on a budget, you will be hard-pressed to find a better deal than CyberPowerPC. If you build a system yourself, you might come close, but not by much.

While this particular build isn’t going to beat out AMD Threadripper- or Intel i9-based systems, the AMD Ryzen-based systems offer a decent bang for the buck. As I mentioned, I focus on video editing and color correcting, so I tested a simple one-minute UHD (3840×2160) 23.98 H.264 export. Using Premiere along with Adobe Media Encoder, I combined about 30 seconds of Red UHD footage with some UHD S-Log3/S-Gamut3 footage I shot on the Sony a7 III, creating a one-minute-long sequence.

I then exported it as an H.264 at a bitrate of around 10Mb/s. With only a 1D LUT on the Sony a7 III footage, the one-minute sequence took one minute and 13 seconds. With 10% resizes and a “simple” Gaussian blur added over all the clips, the sequence exported in one minute and four seconds. This is proof that the AMD GPU is working inside of Premiere and Media Encoder. Inside Premiere, I was able to play back the full-quality sequence on a second monitor without any discernible dropped frames.
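As a rough sanity check on those export settings, bitrate times duration gives the approximate size of the encoded video stream. This is a minimal sketch of that arithmetic (the function name is mine, and the estimate ignores audio and container overhead):

```python
# Approximate size of an encoded video stream in megabytes:
# bitrate (megabits/s) * duration (s), divided by 8 to convert bits to bytes.
# Ignores audio tracks and container overhead.
def h264_stream_size_mb(bitrate_mbps: float, duration_s: float) -> float:
    return bitrate_mbps * duration_s / 8

# The one-minute sequence above at ~10 Mb/s:
print(h264_stream_size_mb(10, 60))  # 75.0 -> roughly 75 MB of video data
```

So a one-minute UHD delivery at this bitrate stays comfortably small, which is why ~10Mb/s is a common target for online review copies.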

So when people tell you AMD isn’t Intel, technically they are right, but overall the AMD systems are performing at a high enough level that for the money you are saving, it might be worth it. In the end, with the right expectations and dollars, an AMD-based system like this one is amazing.

Whether you like to build your own computer or just don’t want to buy a big-brand system, custom-built PCs are a definite way to go. I might be a little partial, since I am comfortable opening up my system and changing parts around, but the newer cases allow for pretty easy adjustments. For instance, I installed a Blackmagic DeckLink and four SSD drives in a RAID-0 setup inside the box. Besides wishing for some more internal drive cages, I found it easy to find the cables and get into the wiring that CyberPowerPC had put together. And because CyberPowerPC caters mostly to the gaming market, there are plenty of RGB light options, including on the memory!

I was initially against the lighting, since any color cast could throw off color correction, but it was actually kind of cool and made my setup look a little more modern. It even got my creativity going.

Check out the latest AMD Ryzen processors and exciting improvements to the Radeon line of graphics cards on www.cyberpowerpc.com and www.amd.com. And, hopefully, I can get my hands on a sweet AMD Ryzen Threadripper 2990WX with 32 cores and 64 threads to really burn a hole in my render power.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

VFX house a52 launches a52 Color

Santa Monica-based visual effects studio a52 has launched a new custom-built space called a52 Color. It focuses on color grading and finishing. a52 Color is now home to a52 colorist Paul Yacono and new hire Daniel de Vue, who joins from London where he was head of color at Glassworks. a52 Color is able to offer clients access to combined or end-to-end services from its network of affiliated companies, which include Rock Paper Scissors, a52 VFX and Elastic.

“Color has been an offering within a52 with Paul Yacono for over half a decade, so it’s already an established part of the culture here,” explains executive producer Thatcher Peterson, who now runs a52 Color after coming over from a four-year stint as EP at The Mill. “And with Daniel joining us from London, the decision for a52 Color to become a separate entity thrusts our services and talent into their own spotlight.”

Yacono’s first major color project out of a52 was the Netflix series House of Cards, which proved that this boutique facility had the bandwidth to service high-volume 4K projects. Since that time, Yacono has established a body of work that ranges from ads for Target, Nike and BMW to the iconic title sequence for Game of Thrones. His latest work includes the feature documentaries Struggle: The Life and Art of Szukalski, 13th and Amanda Knox, the TV miniseries Five Came Back and spots for Toyota, Prada, Samsung and Lexus.

Danish colorist de Vue has worked for directors such as Martin Werner, Martin de Thurah, Andreas Nilsson and Wally Pfister, and crafted the mood for brands such as Nike, Principal Financial, Vans, Mercedes, Toyota, Adidas, H&M and Xbox. Recently he graded an Elliot Rausch-directed TUMI spot featuring Lenny Kravitz and Zoë Kravitz on a journey to their family’s Bahamian roots.

Equipped for theatrical and broadcast color grading, the studio boasts two suites outfitted with FilmLight Baselight grading systems and is set up for HDR with Dolby Vision certification. Remote grading services are also available throughout the US and internationally.

EP Peterson was at Company 3 for over 15 years, where he helped grow their core business from commercials to features and television.

As company founder Angus Wall, also an Oscar-winning editor for The Girl With the Dragon Tattoo, explains, “In adding high-end color and DI to our suite of companies, a52 Color completes our offerings for end-to-end, best of breed creative services.”

Goldcrest Post hires industry vet Dom Rom as managing director

Domenic Rom, a veteran of the New York post world, has joined Goldcrest Post as managing director. In this new role, he will oversee operations, drive sales and pursue growth strategies for Goldcrest, a provider of post services for film and television. Rom was most recently president/GM of Deluxe TV Post Production Services in LA.

“Domenic is a visionary leader who brings a client-centric approach toward facility management, and understands the industry’s changing dynamics,” says Goldcrest Films owner/executive director Nick Quested. “He inspires his team to perform at a peak level and deliver the quality services our clients expect.”

In his previous position, Rom led Deluxe’s global services for television, including its subsidiaries Encore and Level 3. Prior to that, he was managing director of Deluxe’s New York studio, which included East Coast operations for Encore, Company 3 and Method. He was SVP at Technicolor Creative Services for three years and an executive at Postworks for 11. Rom began his career as a colorist at DuArt Film Labs, eventually becoming executive VP in charge of its digital and film labs.

Rom says that he looks forward to working with Goldcrest Post’s management team, including head of production Gretchen McGowan and head of picture Jay Tilin. “We intend to be a very client-oriented facility,” he notes. “When clients walk in the door, they should feel at home, feel that this is their place. Jay and Gretchen both get that. We will work together very closely to ensure Goldcrest is a solid, responsive facility.”

He is also very happy about being back in New York City. “New York is my home,” says Rom. “When I decided to come back to the city, just walking around town made me feel alive again. The New York market is so tight, the energy so high, it just felt right. The people are real, the clients are amazing and the work is equal to anywhere in the world. I don’t regret a second of the past few years… I expanded my knowledge of other markets and made life-long friendships all over the world. At the end of the day, though, my family and my work family are in New York.”

Recent projects for Goldcrest include the Netflix series Russian Doll and the independent features Sorry to Bother You, The Miseducation of Cameron Post, Native Son and High Flying Bird.

Sugar Studios LA gets social for celebrity-owned Ladder supplement

Sugar Studios LA completed a social media campaign for Ladder perfect protein powder and clean energy booster supplements starring celebrity founders Arnold Schwarzenegger, LeBron James, DJ Khaled, Cindy Crawford and Lindsey Vonn. The playful ad campaign focuses on social media, foregoing the usual TV commercial push and pitching the protein powder directly to consumers.

One spot shows Arnold in the gym annoyed by a noisy dude on the phone, prompting him to turn up his workout soundtrack. Then DJ Khaled is scratching encouragement for LeBron’s workout until Arnold drowns them out with his own personal live oompah band.

The ads were produced and directed by longtime Schwarzenegger collaborator Peter Grigsby, while Sugar Studios’ editor Nico Alba (Chevrolet, Ferrari, Morongo Casino, Mattel) cut the project using Adobe Premiere. When asked about using random spot lengths, as opposed to traditional :15s, :30s, and :60s, Alba explains, “Because it’s social media, we’re not always bound to those segments of time anymore. Basically, it’s ‘find the story,’ and because there are no rules, it makes the storytelling more fun. It’s a process of honing everything down without losing the rhythm or the message and maintaining a nice flow.”

Nico Alba and Jijo Reed. Credit: David Goggin

“Peter Grigsby requested a skilled big-brand commercial editor on this campaign,” Reed says. “Nico was the perfect fit to create that rhythm and flow that only a seasoned commercial editor could bring to the table.”

“We needed a heavy-weight gym ambience to set the stage,” says Alba, who worked closely with sound design/mixers Bret Mazur and Troy Ambroff to complement his editing. “It starts out with a barrage of noisy talking and sounds that really irritate Arnold, setting up the dueling music playlists and the sonic payoff.”

The audio team mixed and created sound design with Avid Pro Tools Ultimate. Audio plugins called on include the Waves Mercury bundle, DTS Surround tools and iZotope RX7 Advanced.

The Sugar team also created a cinematic look to the spots, thanks to colorist Bruce Bolden, who called on Blackmagic DaVinci Resolve and a Sony BVM OLED monitor. “He’s a veteran feature film colorist,” says Reed, “so he often brings that sensibility to advertising spots as well, meaning rich blacks and nice, even color palettes.”

Storage used at the studio is Avid Nexis and Facilis Terrablock.

Tony Dustin joins Efilm as senior colorist

Tony Dustin has joined the Deluxe Creative Services team as senior colorist at Hollywood’s Efilm. He will also be doing some work for sister company Encore. With more than 20 years of experience in color grading, Dustin has worked across styles and genres, with a talent for revealing details in the darker palettes of many of his projects. He will be using Blackmagic DaVinci Resolve.

Dustin’s credits include the Netflix dramatic series Sense8, for which he was nominated for an HPA Award; Hulu horror series Castle Rock; Best Picture Academy Award-nominee Silver Linings Playbook, directed by David O. Russell; and Gran Torino, directed by Clint Eastwood.

Dustin’s first project for Efilm is the biographical drama Harriet, working with Oscar-winning cinematographer John Toll, with whom Dustin previously collaborated on Sense8.

He comes to Efilm from Technicolor, where he spent nearly 17 years. He’s also held various color-centric roles at Westwind Media and Efilm sister company Encore. Dustin got his start in post by discovering the color grading process through his work in the vault at Editel while attending college. Having spent many hours developing negatives in a photo lab as a youth, Dustin has a well-honed eye and deep appreciation for cinematic visuals.

Little’s dailies-to-ACES finishing workflow via FotoKem

FotoKem’s Atlanta and Burbank facilities both worked on the post production — from digital dailies through a full ACES finish — for Universal Pictures’ and Legendary Entertainment’s film, Little.

From producer Will Packer (Girls Trip, Night School, the Ride Along franchise) and director/co-writer Tina Gordon (Peeples, Drumline), Little tells the story of a tech mogul (Girls Trip’s Regina Hall) who is transformed into a 13-year-old version of herself (Marsai Martin) and must rely on her long-suffering assistant (Insecure’s Issa Rae) just as the future of her company is on the line.

Martin, who stars in the TV series Black-ish, had the idea for the film when she was 10 and acts as an executive producer on the film.

Principal photography for Little took place last summer in the Atlanta area. FotoKem’s Atlanta location provided digital dailies, with looks developed by FotoKem colorist Alastor Arnold alongside cinematographer Greg Gardiner (Girls Trip, Night School), who shot with Sony F55 cameras.

Cinematographer Greg Gardiner on set.

“Greg likes a super-clean look, which we based on Sony color science with a warm and cool variant and a standard hero LUT,” says Arnold. “He creates the style of every scene with his lighting and photography. We wanted to maximize his out-of-the-camera look and pass it through to the grading process.”

Responding to the sharp growth of production in Georgia, FotoKem entered the Atlanta market five years ago to offer on-the-ground support for creatives. “FotoKem Atlanta is an extension of our Burbank team with colorists and operations staff to provide the upfront workflow required for file-based dailies,” says senior VP Tom Vice of FotoKem’s creative services division.

When editor David Moritz and the editorial team moved to Los Angeles, FotoKem sent EDLs to its nextLAB dailies platform, the facility’s proprietary digital file management system, where shots for VFX vendors were transcoded as ACES EXR files with full color metadata. Non-VFX shots were also automatically pulled from nextLAB for conform. The online was completed in Blackmagic Resolve.
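nextLAB is proprietary, so the mechanics of its automated pull aren’t public, but the EDLs driving it follow the standard CMX3600 event format: an event number, a source reel name, channel, transition and source/record timecodes per line. As a purely illustrative sketch of the kind of parsing such a pull involves (the function name and sample EDL are invented, not from FotoKem’s system), here is a minimal reader that lists the unique source reels an EDL references:

```python
import re

# Matches a CMX3600-style EDL event line:
# event number, reel name, channel, transition, then source in/out timecodes.
EVENT_RE = re.compile(
    r"^(\d+)\s+(\S+)\s+(V|A|B|AA)\s+(C|D|W\d+|K)\s+"
    r"(\d{2}:\d{2}:\d{2}:\d{2})\s+(\d{2}:\d{2}:\d{2}:\d{2})"
)

def reels_in_edl(edl_text: str) -> list:
    """Return the unique source reel names referenced by an EDL, in order."""
    seen = []
    for line in edl_text.splitlines():
        m = EVENT_RE.match(line.strip())
        if m and m.group(2) not in seen:
            seen.append(m.group(2))
    return seen

# Hypothetical two-event EDL for illustration only.
sample = """TITLE: LITTLE_R1
001  A001C003 V  C  01:00:00:00 01:00:04:12 00:00:00:00 00:00:04:12
002  A002C011 V  C  02:10:05:00 02:10:08:00 00:00:04:12 00:00:07:12
"""
print(reels_in_edl(sample))  # ['A001C003', 'A002C011']
```

A conform system would then match each reel/timecode pair back to the original camera files, transcoding the events flagged for VFX vendors and handing the rest straight to the online.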

The DI and the film conform happened concurrently, with Arnold and Gardiner working together daily. “We had a full ACES pipeline, with high dynamic range and high bit rate, which both Greg and I liked,” Arnold says. “The film has a punchy, crisp chromatic look, but it’s not too contemporary in style or hyper-pushed. It’s clean and naturalistic with an extra chroma punch.”

Gordon was also a key part of the collaboration, playing an active role in the DI, working closely with Gardiner to craft the images. “She really got into the color aspect of the workflow,” notes Arnold. “Of course, she had a vision for the movie and fully embraced the way that color impacts the story during the DI process.”

Arnold’s first pass was for the theatrical grade and the second for the HDR10 grade. “What I like about ACES is the simplicity of transforming to different color spaces and working environments. And the HDR grade was a quicker process,” he says. “HDR is increasingly part of our deliverables, and we’re seeing a lot more ACES workflows lately, including work on trailers.”

FotoKem’s deliverables included a DCP, DCDM and DSM for the theatrical release; separations and .j2k files for HDR10 archiving; and ProRes QuickTime files for QC.

Review: Avid Media Composer Symphony 2018 v.12

By Brady Betzel

In February of 2018, we saw a seismic shift in the leadership at Avid. Chief executive officer Louis Hernandez Jr. was removed and subsequently replaced by Jeff Rosica. Once Rosica was installed, I think everyone who was worried Avid was about to be liquidated to the highest bidder breathed a sigh of temporary relief. Still unsure whether new leadership was going to right a tilting ship, I immediately wanted to see a new action plan from Avid, specifically on where Media Composer and Symphony were going.

Media Composer with Symphony

Not long afterward, I was happily reading about how Avid was learning from its past transgressions and listening to its clients. I heard Avid was touring the industry and listening to what customers and artists needed. Personally, I was asking myself if Media Composer with Symphony would ever be the finishing tool that Avid DS was. I’m happy to say, it’s starting to look that way.

It appears from the outside that Rosica is indeed the breath of fresh air Avid needed. At NAB 2019, Avid teased the next iteration of Media Composer, version 2019, with an overhauled interface and improvements such as a 32-bit float color pipeline complete with ACES color management and a way to deliver IMF packages; a new engine with distributed processing; and a whole new product called Media Composer|Enterprise, all of which will really help sell this new Media Composer. But the 2019 update is still coming, so until then I took a deep dive into Media Composer 2018 v12, which has many features editors, assistants and even colorists have been asking for: a new Avid Titler, shape-based color correction (with the Symphony option), new multicam features and more.

Titling
As an online editor who uses Avid Media Composer with the Symphony option about 60% of the time, titling is always a tricky subject for me. Avid has gone through some rough seas in dealing with the leaky hole known as the Avid Title Tool. The classic Avid Title Tool was basic but worked. However, if you aligned something in the Title Tool interface to title-safe zones, it might jump around once you closed the interface. Fonts wouldn’t always stay the same when working across PC and Mac OS platforms. The list goes on, and it is excruciatingly annoying.

Titler

Let’s take a look at some Avid history: In 2002, Avid tried to appease creators and introduced the, at the time, Windows-only titler Avid Marquee. While Marquee was well-intentioned, it was extremely difficult to understand if you weren’t interested in 3D lighting, alignment and all sorts of motion graphics stuff that not all editors want to spend time learning. So most people didn’t use it, and if they did, it took a while for anyone taking over the project to figure out what had been done.

In December of 2014, Avid leaned on the NewBlue Titler, which would work in projects at resolutions higher than 1920×1080. Unfortunately, many editors ran into very long renders at the end, and a lot of them bailed on it. Most decided to go out of house and create titles in Adobe Photoshop and After Effects. While this all reflects my own experience, I assume others feel the same.

In Avid Media Composer 2018, the company has introduced the Avid Titler, which in the Tools menu is labeled Avid Titler+. It works like an effect rather than as a rendered piece of media, the way the traditional Avid Title Tool’s alpha and fill layers did. This method is similar to how NewBlue or Marquee functioned. However, the Avid Titler lets you type directly on the record monitor; adding a title is as easy as marking an in and out point and clicking the T+ button on the timeline.

You can specify things like kerning, shadow, outlines, underlines, boxes, backgrounds and more. One thing I found peculiar was that under Face, the rotation settings rotate individual letters and not the entire word by default. I reached out to Avid and they are looking into making the entire word rotation option the default in the mini toolbar of Avid Titler. So stay tuned.

Also, you can map your fast-forward and rewind buttons to “Go To Next/Previous Event.” This lets you jump to the next edit in the timeline, but also to the next or previous keyframe when in the Effect Editor. Typically, you click on the scrub line in the record window and then use those shortcuts to jump to the next keyframe; in the Avid Titler, it would just start typing in the text box. Furthermore, when I wanted to jump out of Effect Editor mode and back into edit mode, I usually hit “y,” but that did not get me out of effects mode (Avid did mention it is working on updates to the Avid Titler that would solve this issue). The new Avid Titler definitely has some bugs and needed improvements, and they are being addressed, but it’s a decent start toward a modern title editor.

Shape-based color correction

Color
If you want advanced color correction built into Media Composer, then you are going to want the Symphony option. Media Composer with the Symphony option allows for more detailed color correction using secondary color corrections as well as some of the newer updates, including shape-based color correction. Before Resolve and Baselight became more affordable, Symphony was the gold standard for color correction on a budget (and even not on a budget, since it works so well in the same timeline the editors use). But what we are really here for is the Shapes addition in the 2018 v12 update.

With the Symphony option, you can now draw specific regions on the footage for your color correction to affect. It essentially works similarly to a layer-based system like Adobe Photoshop. You can draw shapes with the same familiar tools you are used to drawing with in the Paint or AniMatte tools and then just apply your brightness, saturation or hue swings in those areas only. On the color correction page you can access all of these tools on the right-hand side, including the softening, alpha view, serial mode and more.

When using the new shape-based tools you must point the drop-down menu to “CC Effect.” From there you can add a bunch of shapes on top of each other and they will play in realtime. If you want to lay a base correction down, you can specify it in the shape-based sidebar, then click shape and you can dial in the specific areas to your or your client’s taste. You can check off the “Serial Mode” box to have all corrections interact with one another or uncheck the box to allow for each color correction to be a little more isolated — a really great option to keep in mind when correcting. Unfortunately, tracking a shape can only be done in the Effect Editor, so you need to kind of jump out of color correction mode, track, and then go back. It’s not the end of the world, but it would be infinitely better if you could track efficiently inside of the color correction window. Avid could even take it further by allowing planar tracking by an app like Mocha Pro.

Shape-based color correction

The new shape-based corrector also has an alpha view mode identified by the infinity symbol. I love this! I often find myself making mattes in the Paint tool, but it can now be done right in the color correction tool. The Symphony option is an amazing addition to Media Composer if you need to go further than simple color correction but not dive into a full color correction app like Baselight or Resolve. In fact, for many projects you won’t need much more than what Symphony can do. Maybe a +10 on the contrast, +5 on the brightness and +120 on the saturation and BAM a finished masterpiece. Kind of kidding, but wait until you see it work.

Multicam
The final update I want to cover is multicam editing and improvements to editing group clips. I cannot emphasize enough how much time this would have saved me as an assistant editor back in the prehistoric Media Composer days… I mean, we had dongles, and I even dabbled with the Meridien box. Literally, days of grouping and regrouping could have been avoided with the Edit Group feature. But I did make a living fixing groups that were created incorrectly, so I guess this update is a Catch-22. Anyway, you can now edit groups in Media Composer by creating a group, right-clicking on it and selecting Edit Group. The group will open in the Record Monitor as a sequence, where you can move, nudge and even add cameras to a previously created group. Once you are finished, you can update the group and, if you wish, refresh any sequences that used it. One caveat: with mixed-frame-rate groups, Avid says committing to that sequence might produce undesirable effects.

Editing workspace

Cost of Entry
How much does Media Composer cost these days? While you can still buy it outright, it seems more practical to go monthly, since you automatically get updates, but the options can still be a little tricky. Do you need PhraseFind and/or ScriptSync? Do you need the Symphony option? Do you need to access shared storage? There are multiple options depending on your needs. If you want everything, then Media Composer Ultimate at $49 per month is what you want. If you want Media Composer and just one add-on, like Symphony, it will cost $19 per month plus $199 per year for the Symphony option. If you want to test the waters before jumping in, you can always try Media Composer First.
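As a back-of-the-envelope comparison using the subscription prices quoted above (illustrative only — Avid's actual pricing and tiers may change), the annual math works out like this:

```python
# Rough annual cost comparison based on the prices quoted in this article.
# Figures are illustrative; check Avid's site for current pricing.

ultimate_monthly = 49   # Media Composer | Ultimate, per month
mc_monthly = 19         # base Media Composer subscription, per month
symphony_yearly = 199   # Symphony option add-on, per year

ultimate_annual = ultimate_monthly * 12
mc_plus_symphony_annual = mc_monthly * 12 + symphony_yearly

print(ultimate_annual)           # 588
print(mc_plus_symphony_annual)   # 427
```

So if Symphony is the only add-on you need, the à la carte route comes out meaningfully cheaper per year than Ultimate.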

For a good breakdown of the Media Composer pricing structure, check out the page from KeyCode Media, a certified reseller. Another great link, with tons of information organized into easily digestible bites, is this one. Additionally, www.freddylinks.com is a great resource chock-full of everything else Avid, written by Avid technical support specialist Fredrik Liljeblad out of Sweden.

Group editing

Summing Up
In the end, I have used Media Composer with Symphony for over 15 years, and it is the most reliable nonlinear editor I have used for supporting multiple editors in a shared network environment. While Adobe Premiere Pro, Apple Final Cut Pro X and Blackmagic Resolve are offering fancy new features and collaboration modes, Avid always seems to hold stable when I need it the most. These new improvements, a UI overhaul (set to debut in May), new leadership from Rosica, and the confidence of Rosica's faithful employees all seem to be paying off and getting Avid back on the track it should have always been on.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Colorist Peter Doyle joins Warner Bros. De Lane Lea’s picture services division

World-renowned and respected supervising colorist Peter Doyle, whose large body of work includes The Lord of the Rings trilogy, has joined the new picture services division of London's Warner Bros. De Lane Lea (WBDLL). Doyle brings with him extensive technical and creative expertise acquired over a 40-year career.

Doyle has graded 12 of the 100 highest-grossing films of all time, including the Harry Potter film series. His recent credits include Darkest Hour (see our interview with him here), The Ballad of Buster Scruggs and both Fantastic Beasts films.

Doyle will be working alongside BAFTA-winning colorist Asa Shoul (Mission Impossible: Fallout, Baby Driver, Amazon's Tin Star), who joined WBDLL at the end of last year. The additions of Doyle and Shoul beef up WBDLL's picture division to match the studio's De Lane Lea sound facilities.

Speaking of joining the company, Doyle says, “I first worked with Warner Bros. on The Matrix in 1999. Since then, grading and delivering films to Warner Bros. for filmmakers such as Tim Burton, David Yates, Dick Zanuck and David Heyman has always felt like a partnership. Warner Bros. always brought tremendous passion to the projects and a deep desire to best represent the creative intent of the filmmakers. WBDLL represents a third-generation post facility; it’s been conceived with the philosophy that origination and delivery are part of the same process. It’s managed by a newly assembled crew that over the course of their careers have answered some of the most complex post production challenges the industry has devised. WBDLL is an environment and indeed a concept I feel London has needed for many years.”

The new facilities at WBDLL include two 4K HDR FilmLight Baselight X grading theatres, Autodesk Flame online suites, digital dailies facilities, dark fiber connectivity and a mastering and QC department. WBDLL has additional facilities based at Warner Bros. Studios Leavesden, including a 50-seat 4K screening room, 4K VFX review theater and in-facility and on-location digital dailies, offering clients a full end-to-end service.

WBDLL has been the choice for many large features including Dumbo, Wonder Woman, Three Billboards Outside Ebbing, Missouri, Fantastic Beasts, Early Man, Mission Impossible: Fallout and Outlaw King. Its roster of high-end TV clients include Netflix, Amazon, Starz, BBC and ITV.

Last year the company announced it was cementing its future in Soho by moving to the purpose-built Ilona Rose House in 2021, which is currently under construction.

Keep Me Posted adds senior colorist Aidan Stanford

Burbank’s Keep Me Posted, a FotoKem company specializing in creative and technical episodic post services, has brought on Aidan Stanford as senior colorist. He will work on episodic and feature projects.

With over 25 years of experience, Stanford’s work history ranges from photochemical color timing to digital color grading and includes DI, broadcast, commercials and shorts. His varied background includes color timing 65mm film for Lawrence of Arabia (IMAX 2002 restoration/release); the DI, HDR and all video deliverables for the Oscar-winning Get Out; and multiple seasons of Emmy Award-winning television series. His credits include the features Happy Death Day, Insidious: The Last Key and Benji and his episodic credits include Modern Family, Drunk History, You’re the Worst and Fresh Off the Boat.

At Keep Me Posted, Stanford will be working on Nucoda and Resolve.

“Aidan brings a deep knowledge of film, an artistic eye and a keen technical ability to help our creative customers bring their vision to reality,” says Mike Brodersen, FotoKem’s chief strategy officer. “His comprehensive skill set in combination with his expertise in color have made him a trusted collaborator with many filmmakers and showrunners.”

Collaboration company Pix acquires Codex

Pix has reached an agreement to acquire London-based Codex, in a move that will enable both companies to deliver a range of new products and services, from streamlined camera capture to post production finishing.

The Pix System is a collaboration tool that provides industry pros with secure access to production content on mobile devices, laptops or TVs, whether from offices, homes or while traveling. The company won an Oscar for its technology in 2019.

Codex products include recorders and media processing systems that transfer digital files and images from the camera to post, and tools for color dynamics, dailies creation, archiving, review and digital asset management.

“Our clients have relied on Pix to protect their material and ideas throughout all phases of production. In Codex, we found a group that similarly values relationships with attention to critical details,” explains Pix founder/CEO Eric Dachs. “Codex will retain its distinct brand and culture, and there is a great deal we can do together for the benefit of our clients and the industry.”

Over the years, Pix and Codex have seen wide industry adoption, delivering a proven record of contributing value to their clients. Introduced in 2003, Pix soon became a trusted and widely used secure communication and content management provider. The Pix System enables creative continuity and reduces project risk by ensuring that ideas are accurately shared, stored, and preserved throughout the entire production process.

“Pix and Codex are complementary, trusted brands used by leading creatives, filmmakers and studios around the world,” says Codex managing director Marc Dando. “The integration of both services into one simplified workflow will deliver the industry a fast, secure, global collaborative ecosystem.”

With the acquisition of Codex, Pix will expand its servicing reach across the globe. Pix founder Dachs will remain as CEO, and Dando will take on the role of chief design officer at Pix, with a focus on existing and new products.

NAB 2019: postPerspective Impact Award winners

postPerspective has announced the winners of our Impact Awards from NAB 2019. Seeking to recognize debut products with real-world applications, the postPerspective Impact Awards are voted on by an anonymous judging body made up of respected industry artists and pros (to whom we are very grateful). It’s working pros who are going to be using these new tools — so we let them make the call.

It was fun watching the user ballots come in and discovering which products most impressed our panel of post and production pros. There are no entrance fees for our awards. All that is needed is the ability to impress our voters with products that have the potential to make their workdays easier and their turnarounds faster.

We are grateful for our panel of judges, which grew even larger this year. NAB is exhausting for all, so their willingness to share their product picks and takeaways from the show isn’t taken for granted. These men and women truly care about our industry and sharing information that helps their fellow pros succeed.

To be successful, you can’t operate in a vacuum. We have found that companies who listen to their users, and make changes/additions accordingly, are the ones who get the respect and business of working pros. They aren’t providing tools they think are needed; they are actively asking for feedback. So, congratulations to our winners and keep listening to what your users are telling you — good or bad — because it makes a difference.

The Impact Award winners from NAB 2019 are:

• Adobe for Creative Cloud and After Effects
• Arraiy for DeepTrack with The Future Group’s Pixotope
• ARRI for the Alexa Mini LF
• Avid for Media Composer
• Blackmagic Design for DaVinci Resolve 16
• Frame.io
• HP for the Z6/Z8 workstations
• OpenDrives for Apex, Summit, Ridgeview and Atlas

(All winning products reflect the latest version of the product, as shown at NAB.)

Our judges also provided quotes on specific projects and trends that they expect will have an impact on their workflows.

Said one, “I was struck by the predicted impact of 5G. Verizon is planning to have 5G in 30 cities by end of year. The improved performance could reach 20x speeds. This will enable more leverage using cloud technology.

“Also, AI/ML is said to be the single most transformative technology in our lifetime. Impact will be felt across the board, from personal assistants, medical technology, eliminating repetitive tasks, etc. We already employ AI technology in our post production workflow, which has saved tens of thousands of dollars in the last six months alone.”

Another echoed those thoughts on AI and the cloud as well: “AI is growing up faster than anyone can reasonably productize. It will likely be able to do more than first thought. Post in the cloud may actually start to take hold this year.”

We hope that postPerspective’s Impact Awards give those who weren’t at the show, or who were unable to see it all, a starting point for their research into new gear that might be right for their workflows. Another way to catch up? Watch our extensive video coverage of NAB.

Colorist Andreas Brueckl on embracing ACES workflow

By Debra Kaufman

Senior colorist Andreas Brueckl has graded a wide range of projects, from feature films to over 1,000 commercials, in Europe, the Middle East and Asia. He began his career at Bavaria Film/Cinepost in Germany, then freelanced across Europe and the Middle East before landing at 1000Volt in Istanbul, where he was lead colorist for almost four years. In 2014, he moved to Pinewood Studios Malaysia, and he is currently senior colorist at FutureWorks in Mumbai, India.

Andreas Brueckl

With his cinematic grading approach, Brueckl was an early adopter of the ACES workflow. Since then he has published tutorials about ACES workflows and color grading. He spoke to postPerspective about adopting the ACES workflow and why he’s encouraging cinematographers and VFX houses to use it.

Tell me about how those first trials worked out.
In 2013, when I was working at 1000Volt in Istanbul, I played around with ACES color spaces, but I was so busy — working on as many as six TV commercials a day — that I didn’t really have the time to devote to learning something new. That changed when I started at Pinewood Studios in Malaysia in 2014. The Malaysian government really wanted to build up the film industry and attract international clients. They teamed up with Imagica from Japan to create a post department. I had this beautiful brand new 100-seat 4K grading theater and a new FilmLight Baselight. I graded my first feature there in the typical telecine way with a P3 timeline, and then I started from scratch with the same movie and graded it in ACES, learning along the way. After a week or so of working on it, my grade clearly looked way better in ACES.

How was the learning process?
I was used to starting from a log image, which is the way most of us DI colorists graded for many years — and was irritated that my image was suddenly so contrasty and saturated. Thankfully, Andy Minuth and Daniele Siragusano from FilmLight helped me to understand that a scene-referred color space isn't as limited as a display-referred color space. In other words, I wasn't losing information or limiting myself, and I could always dial it back to a more log-looking image if needed. Knowing this, I could achieve a "film-style" grading more readily. After a year of using ACES, and as Pinewood Malaysia started getting more and more Singaporean and Chinese clients, I made ACES tutorials with Chinese subtitles to help educate those clients.
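Brueckl's point that a scene-referred space "isn't as limited" can be sketched numerically. This toy Python example uses a made-up log curve (my own illustration, not an actual camera log or ACES transform): a scene-referred highlight round-trips through the log encode without loss, while clipping to a display-referred 0-1 range destroys it for good.

```python
import math

# Toy log curve mapping scene-linear values onto roughly [0, 1] code values.
# Purely illustrative -- not a real camera log or an ACES transform.
def log_encode(x):
    return (math.log2(x) + 8.0) / 16.0

def log_decode(y):
    return 2.0 ** (16.0 * y - 8.0)

highlight = 8.0  # scene-referred value well above display white (1.0)

# Scene-referred/log round trip: the highlight detail survives intact.
recovered = log_decode(log_encode(highlight))   # -> 8.0

# Display-referred clip: the same highlight is flattened to white forever.
clipped = min(highlight, 1.0)                   # -> 1.0
```

This is why a colorist can always "dial back" from a scene-referred grade: the information above display white is still there to pull from.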

Bazaar

Now that you’re working at FutureWorks, are you still using ACES?
In 2017, I signed on at FutureWorks in Mumbai, where we work on a wide range of content, including blockbuster movies, smaller movies, TV commercials and, more recently, lots of streaming TV from Amazon Prime and Netflix. We’ve really committed to ACES there. Hope Aur Hum and Bazaar are just two examples of how well ACES has worked. Besides always grading in ACES, we switched our entire VFX pipeline to ACES in combination with Baselight grade files. In-house, all of that was easy — and welcomed by our clients. I have cinematographers coming in asking if we’re grading in ACES. Some of them already know the benefits of ACES quite well, and others have just heard it is a new and very “filmic” approach to grading. So the DPs who haven’t tried ACES yet are keen to know everything about this new grading style.

How has switching to an ACES pipeline for visual effects worked out?
It was and still is a bit more work to convince VFX vendors to switch to ACES. They’re not concerned about ACES per se, but about the size of the OpenEXR files which, at uncompressed 4K, can go up to 50MB per frame. For that reason, they sometimes want to stick to the 10-bit DPX they’ve used for the past 10 years.

I found that communication is key to getting a VFX facility to embrace the ACES workflow. To make it easier, we meet with the compositing supervisors of all the VFX vendors and walk them through the process in Nuke, including how to use the Baselight plugin. It makes things super easy.

Hope Aur Hum

If there is no demand for uncompressed files, there’s nothing wrong with using OpenEXR Zip 1 or Piz compression, which actually yields smaller files than DPX renders. This year, I’m working on some of the biggest feature films and Netflix and Amazon shows in the Indian market. I’m making it clear from the beginning to all the vendors that we work in ACES and we go for an ACES VFX workflow. We’ve found that once we contact all the VFX houses and walk them through the process, they have no problem implementing the ACES workflows.
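The file-size concern the VFX vendors raise is easy to sanity-check with arithmetic. A rough sketch (assuming 4K DCI resolution and three RGB channels; real files add headers and metadata, and compressed EXRs come in well under these payload figures):

```python
# Back-of-the-envelope per-frame payload sizes for uncompressed 4K RGB.
# Illustrative only: real files carry headers, and EXR is usually
# compressed (e.g. ZIP or PIZ), which shrinks it further.

width, height = 4096, 2160
pixels = width * height

# OpenEXR half-float: 2 bytes per channel, 3 channels (R, G, B).
exr_half_bytes = pixels * 3 * 2

# DPX 10-bit: three 10-bit channels packed into one 32-bit word per pixel.
dpx_10bit_bytes = pixels * 4

print(exr_half_bytes / 2**20)   # ~50.6 MiB, matching the "up to 50MB" figure
print(dpx_10bit_bytes / 2**20)  # ~33.8 MiB
```

So uncompressed half-float EXR is only about 50 percent larger than 10-bit DPX, and with ZIP or PIZ compression the gap closes or reverses, which is Brueckl's point.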

What do you personally like about ACES?
First of all, ACES is not a plugin that only works on one platform — it is an entire system that connects all platforms. I explain to the DPs that I can mix my LMTs (Look Modification Transforms) to shape the look and play with the density in chosen areas. Essentially, I have the chance to mix my own digital film stock. ACES gives me a base look much faster than I could get from a log telecine timeline workflow, where I would have had to build up a time-consuming grade from a Log image.

As HDR grades become more popular, ACES is absolutely mandatory in my opinion. One big advantage of using ACES is the ability to get additional details in the highlights. Finally, ACES is the perfect workflow for deliveries to multiple platforms. With just a few adjustments, I can make deliverables in P3, Rec.709, HDR and so on without quality loss.

Main Image: Bazaar


Debra Kaufman has been writing about the intersection of technology and media/entertainment for nearly 30 years. She currently writes the daily newsletter for USC’s Entertainment Technology Center (www.etcentric.org).

The Kominsky Method’s post brain trust: Ross Cavanaugh and Ethan Henderson

By Iain Blair

As Bette Davis famously said, “Old age ain’t no place for sissies!” But Netflix’s The Kominsky Method proves that in the hands of veteran sitcom creator Chuck Lorre — The Big Bang Theory, Two and a Half Men and many others — there’s plenty of laughs to be mined from old age… and disease, loneliness and incontinence.

The show stars Michael Douglas as divorced, has-been actor and respected acting coach Sandy Kominsky and Alan Arkin as his longtime agent Norman Newlander. The story follows these bickering best friends as they tackle life’s inevitable curveballs while navigating their later years in Los Angeles, a city that values youth and beauty above all. Both comedic and emotional, The Kominsky Method won Douglas a Golden Globe.

Ethan Henderson and Ross Cavanaugh

The single-camera show is written by Al Higgins, David Javerbaum and Lorre, who also directed the first episode. Lorre, Higgins and Douglas executive produce the series, which is produced by Chuck Lorre Productions in association with Warner Bros. Television.

I recently spoke with associate producer Ross Cavanaugh and post coordinator Ethan Henderson about posting the show.

You are currently working on Season 2?
Ross Cavanaugh: Yes, and we’re moving along quite quickly. We’re already about three-quarters of the way through the season shooting-wise, out of the eight-show arc.

Where do you shoot, and what’s the schedule like?
Cavanaugh: We shoot mainly on the lot at Warner Bros. and then at various locations around LA. We start prepping each show one week before we start shooting, and then we get dailies the day after the first shooting day.

Our dailies lab is Picture Shop, which is right up the street in Burbank and very convenient for us. So getting footage from the set to them is quick, and they’re very fast at turning the dailies around. We usually get them by midnight the same day we drop them off, and then our editors start cutting fairly quickly after that.

Where do you do all the post?
Cavanaugh: Mainly at Picture Shop, who are very experienced in TV post work. They do all the post finishing and some of the VFX stuff — usually the smaller things, like beauty fixes and cleanup. They also do all the final color correction since DP Anette Haellmigk really wanted to work with colorist George Manno. They’ve been really great.

Ethan Henderson: We’re back and forth from the lot to Picture Shop, and once we get more heavily involved in all the post, I spend a lot of time there while we are onlining the show, coloring and doing the VFX drop-ins, and when we start the final deliverables process, since everything for Netflix comes out of there.

What are the big challenges of post production on this show, and how closely do you work with Chuck Lorre?
Cavanaugh: As with any TV show, you’re always on a very tight deadline, and there are a lot of moving parts to deal with very quickly. While our prolific showrunner Chuck Lorre is busy with all the projects he has going — especially with all the writing — he always makes time for us. He’s very passionate about the cut and is extremely on top of things.

I’d say the challenges on this show are actually fairly minimal. Basically, we ran a pretty tight ship on the first season, and now I’d say it’s a well-oiled machine. We haven’t had any big problems or surprises in post, which can happen.

Let’s talk about editing. You had two editors for Season 1 in Matthew Barbato and Gina Sansom. I assume that’s because of the time factor. How does that work?
Cavanaugh: Each editor has their own assistant editor — that was true in Season 1 (Matthew with Jack Cunningham and Gina with Barb Steele) and in Season 2 (Steven Lang with Romeo Rubio and Gina with Rahul Das). They cut separately and work on an odds-and-evens schedule, each doing every other episode. We all get together to watch screenings of the director’s cut, usually in the editorial bay.

What are the big editing challenges?
Cavanaugh: We have a pretty big cast, and there’s a ton of jokes and stuff going on all the time. In addition to Michael Douglas and Alan Arkin, the actors are so experienced. They give such great performances — there’s a lot of material for the editors to cut from. To be honest, the scripts are all so tight that I think one of the challenges is knowing when to cut out a joke, to serve the pacing of an episode.

This isn’t a VFX-driven show, but there are some visual effects shots. Can you explain?
Cavanaugh: We do a lot of driving scenes and use 24frame.com, which has this really good wraparound HD projection technology, so we pretty much shoot all our car scenes on the stage.

Henderson: Once in a while, we’ll pick up some exterior or establishing shots on a freeway using doubles in the cars. All the plates are picked ahead of time. Occasionally, for the sake of continuity, we’ll have to replace a plate in the background and put a different section of the plate in because too many cars ran by, and it didn’t match up in the edit.

That’s one of the things that comes up every so often. The other big thing is that both of the leads wear glasses, so reflections of crew and equipment can become an issue; we have to deal with all that and clean it up.

Cavanaugh: We don’t use many big VFX shots, and we can’t reveal much about what happens in the new season, but sometimes there’s stuff like the scene in Season 1 where one of the characters threw some firecrackers at Michael Douglas’ feet. We obviously weren’t going to throw real ones at Michael Douglas, although I think he’d have sucked it up if we’d done it that way! We were shooting in a residential neighborhood at night and couldn’t set off real ones because they are very loud, so we ended up doing it all with VFX. FuseFX handled the heavier VFX work.

Henderson: There was a big shot in the pilot where we did a lot of shot extensions in a restaurant where Sandy Kominsky (Douglas) and Nancy Travis’ character are having coffee. It was this big sweeping pan down over the city.

Can you talk about the importance of sound and music?
Cavanaugh: They both play a key role, and we have a great team that includes music editor Joe Deveau, supervising sound editor Lou Thomas, and sound mixers Yuri Reese and Bill Smith. The sound recording quality we get on set is always great, so that means we only need very minimal ADR. The whole sound mix is done here on the lot at Warners.

Our composer, Jeff Cardoni, worked with Chuck on Young Sheldon, and he’s really on top of getting all the new cues for the show. We basically have two versions of our main title sequence music cues — one is very bombastic and in-your-face, and the other is a bit more subtle — and it’s funny how it broke down in the first season. The guy who cut the pilot and the odd episodes went with the more bombastic version, while the second editor on the even episodes preferred the softer cues, so I’ll be curious to see how all that breaks down in the new season.

How important is all the coloring on this?
Cavanaugh: Very important. After we do all the online, we ship it over to George at Picture Shop and spend about a day and a half on it. The DP either comes in or gets a file, and she gives her notes. Then we’ll play it for Chuck. We’re in the HDR world with Dolby Vision, and it makes it look so beautiful — but then we have to do the standard pass on it as well.

I know you can’t reveal too much about the new season, but what can fans expect?
Henderson: They’re getting a continuation of these two characters’ journey together — growing old and everything that comes with that. I think it feels like a very natural extension of the first season.

Cavanaugh: In terms of the post process, I feel like we’re a Swiss watch now. We’re ticking along very smoothly. Sometimes post can be a nightmare and full of problems, so it’s great to have it all under control.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Colorfront at NAB with 8K HDR, product updates

Colorfront, which makes on-set dailies and transcoding systems, has rolled out new 8K HDR capabilities and updates across its product lines. The company has also deepened its technology partnership with AJA and entered into a new collaboration with Pomfort to bring more efficient color and HDR management on-set.

Colorfront Transkoder is a post workflow tool for handling UHD, HDR camera, color and editorial/deliverables formats, with recent customers such as Sky, Pixelogic, The Picture Shop and Hulu. With a new HDR GUI, Colorfront’s Transkoder 2019 performs the realtime decompression/de-Bayer/playback of Red and Panavision DXL2 8K R3D material displayed on a Samsung 82-inch Q900R QLED 8K Smart TV in HDR and in full 8K resolution (7680x4320). The de-Bayering process is optimized through Nvidia GeForce RTX graphics cards with Turing GPU architecture (also available on Colorfront On-Set Dailies 2019), with 8K video output (up to 60p) using AJA Kona 5 video cards.

“8K TV sets are becoming bigger, as well as more affordable, and people are genuinely awestruck when they see 8K camera footage presented on an 8K HDR display,” said Aron Jaszberenyi, managing director, Colorfront. “We are actively working with several companies around the world originating 8K HDR content. Transkoder’s new 8K capabilities — across on-set, post and mastering — demonstrate that 8K HDR is perfectly accessible to an even wider range of content creators.”

Powered by a re-engineered version of Colorfront Engine and featuring the HDR GUI and 8K HDR workflow, Transkoder 2019 supports camera/editorial formats including Apple ProRes RAW, Blackmagic RAW, ARRI Alexa LF/Alexa Mini LF and Codex HDE (High Density Encoding).

Transkoder 2019’s mastering toolset has been further expanded to support Dolby Vision 4.0 as well as Dolby Atmos for the home with IMF and Immersive Audio Bitstream capabilities. The new Subtitle Engine 2.0 supports CineCanvas and IMSC 1.1 rendering for preservation of content, timing, layout and styling. Transkoder can now also package multiple subtitle language tracks into the timeline of an IMP. Further features support fast and efficient audio QC, including solo/mute of individual tracks on the timeline, and a new render strategy for IMF packages enabling independent audio and video rendering.

Colorfront also showed the latest versions of its On-Set Dailies and Express Dailies products for motion pictures and episodic TV production. On-Set Dailies and Express Dailies both now support ProRes RAW, Blackmagic RAW, ARRI Alexa LF/Alexa Mini LF and Codex HDE. As with Transkoder 2019, the new version of On-Set Dailies supports real-time 8K HDR workflows to support a set-to-post pipeline from HDR playback through QC and rendering of HDR deliverables.

In addition, AJA Video Systems has released v3.0 firmware for its FS-HDR realtime HDR/WCG converter and frame synchronizer. The update introduces enhanced coloring tools together with several other improvements for broadcast, on-set, post and pro AV HDR production developed by Colorfront.

A new, integrated Colorfront Engine Film Mode offers an ACES-based grading and look creation toolset with ASC Color Decision List (CDL) controls, built-in LOOK selection including film emulation looks, and variable Output Mastering Nit Levels for PQ, HLG Extended and P3 colorspace clamp.

Since launching in 2018, FS-HDR has been used on a wide range of TV and live outside broadcast productions, as well as motion pictures including Paramount Pictures’ Top Gun: Maverick, shot by Claudio Miranda, ASC.

Colorfront licensed its HDR Image Analyzer software to AJA for AJA’s HDR Image Analyzer in 2018. A new version of AJA HDR Image Analyzer is set for release during Q3 2019.

Finally, Colorfront and Pomfort have teamed up to integrate their respective HDR-capable on-set systems. This collaboration, harnessing Colorfront Engine, will include live CDL reading in ACES pipelines between Colorfront On-Set/Express Dailies and Pomfort LiveGrade Pro, giving motion picture productions better control of HDR images while simplifying their on-set color workflows and dailies processes.

Color Chat: Light Iron’s Sean Dunckley

Sean Dunckley joined Light Iron New York’s studio in 2013, where he has worked on episodic television and feature films. He finds inspiration in many places, but most recently in the photography of Stephen Shore and Greg Stimac. Let’s find out more…

NAME: Sean Dunckley

COMPANY: LA- and NYC-based Light Iron

CAN YOU DESCRIBE YOUR COMPANY?
Light Iron is a Panavision company that offers end-to-end creative and technical post solutions. I color things there.

AS A COLORIST, WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I like to get involved early in the process. Some of the most rewarding projects are those where I get to work with the cinematographer from pre-production all the way through to the final DCP.

Ongoing advances in technology have really put the spotlight on the holistic workflow. As part of the Panavision ecosystem, we can offer solutions from start to finish, and that further strengthens the collaboration in the DI suite. We can help a production with camera and lens choices, oversee dailies and then bring all that knowledge into the final grade.

Recently, I had a client who was worried about the speed of his anamorphics at night. The cinematographer was much more comfortable shooting the faster spherical lenses, but the film and story called for the anamorphic look. In pre-production, I was able to show him how we can add some attributes of anamorphic lenses in post. That project ended up shooting a mix of anamorphic and spherical, delivering on both the practical and artistic needs.

Hulu’s Fyre Fraud doc.

WHAT SYSTEM DO YOU WORK ON?
Filmlight’s Baselight. It offers strong color management and paint tools, and the Blackboard 2 panel is very user-friendly.

ARE YOU SOMETIMES ASKED TO DO MORE THAN JUST COLOR ON PROJECTS?
Now that DI systems have expanded their tools, I can integrate last-minute fixes during the DI sessions without having to stop and export a shot to another application. Baselight’s paint tools are very strong and have allowed me to easily solve many client issues in the room. Many times, this has saved valuable time against strict deadlines.

WHAT’S YOUR FAVORITE PART OF THE JOB?
That’s easy. It is the first day of a new project. It feels like an artistic release when I am working with filmmakers to create style frames. I like to begin the process by discussing the goals of color with the film’s creative team.

I try to get their take on how color can best serve the story. After we talk, we play for a little while. I demonstrate looks inspired by their words and then form a color palette for the project. During this time, it is just as important to learn what the client doesn’t like as what they do like.

WHAT’S YOUR LEAST FAVORITE?
I think the hours can be tough at times. The deadlines we face often battle with the perfectionist in me.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Architecture is a field I would have loved to explore. It’s very similar, as it is equal parts technical and creative.

WHY DID YOU CHOOSE THIS PROFESSION?
I had always been interested in post. I used to cut skateboard videos with friends in high school. In film school, I pursued more of an editing route. After graduation, I got a job at a post house and quickly realized I wanted to deviate and dive into color.

Late Night with Emma Thompson. Photo by Emily Aragones

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Recent film titles I worked on include Late Night and Brittany Runs a Marathon, both of which got picked up at Sundance by Amazon.

Other recent projects include Amazon Studios’ Life Itself and the Fyre Fraud documentary on Hulu. Currently, I am working on multiple episodic series for different OTT studios.

The separation that used to exist between feature films, documentaries and episodics has diminished. Many of my clients bounce between all types of projects and aren’t confined to a single medium.

It’s a unique time to be able to color a variety of productions. Being innovative and flexible is the name of the game here at Light Iron, and we’ve always been encouraged to follow the client and not the format.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
It’s impossible to pick a single project. They are all my children!

WHERE DO YOU FIND INSPIRATION?
I go through phases but right now it’s mostly banal photography. Stephen Shore and Greg Stimac are two of my favorite artists. Finding beauty in the mundane has a lot to do with the shape of light, which is very inspiring to me as a colorist.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
I need my iPhone, Baselight and, of course, my golf course range finder.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
I follow Instagram for visuals, and I keep up with Twitter for my sports news and scores.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I have young children, so they make sure I leave those stresses back at the office, or at least until they go to bed. I also try to sneak in some golf whenever I can.

NAB 2019: First impressions

By Mike McCarthy

There are always a slew of new product announcements during the week of NAB, and this year was no different. As a Premiere editor, the developments from Adobe are usually the ones most relevant to my work and life. As it did last year, Adobe released its software updates a week before NAB instead of just announcing them for eventual release months later.

The biggest new feature in the Adobe Creative Cloud apps is After Effects’ new “Content Aware Fill” for video. This will use AI to generate image data to automatically replace a masked area of video, based on surrounding pixels and surrounding frames. This functionality has been available in Photoshop for a while, but the challenge of bringing that to video is not just processing lots of frames but keeping the replaced area looking consistent across the changing frames so it doesn’t stand out over time.
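Adobe hasn’t published its algorithm, but the temporal-consistency problem is easy to see in a toy sketch: the crudest possible “fill” just borrows the masked pixels from neighboring frames, which only works when the background barely moves. Everything below (the function name, the averaging scheme) is illustrative, not Adobe’s method:

```python
import numpy as np

def temporal_fill(frames, mask):
    """Crude temporal fill: replace the masked region of every frame
    with the average of the same region in its neighboring frames.
    frames: (T, H, W) float array; mask: (H, W) boolean array."""
    out = frames.copy()
    T = len(frames)
    for t in range(T):
        # Borrow from the previous and next frames, clamped at the ends.
        neighbors = [frames[max(t - 1, 0)], frames[min(t + 1, T - 1)]]
        out[t][mask] = np.mean(neighbors, axis=0)[mask]
    return out

# A 3-frame, 1x2-pixel "clip"; only the first pixel is masked out.
clip = np.array([[[1.0, 2.0]],
                 [[9.0, 2.0]],
                 [[5.0, 2.0]]])
mask = np.array([[True, False]])
filled = temporal_fill(clip, mask)  # middle frame's masked pixel becomes (1+5)/2
```

A real content-aware fill replaces this naive averaging with learned synthesis plus motion compensation, precisely so the patch doesn’t shimmer as the frames change.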

The other key part to this process is mask tracking, since masking the desired area is the first step in that process. Certain advances have been made here, but based on tech demos I saw at Adobe Max, more is still to come, and that is what will truly unlock the power of AI that they are trying to tap here. To be honest, I have been a bit skeptical of how much AI will impact film production workflows, since AI-powered editing has been terrible, but AI-powered VFX work seems much more promising.

Adobe’s other apps got new features as well, with Premiere Pro adding Free-Form bins for visually sorting through assets in the project panel. This affects me less, as I do more polishing than initial assembly when I’m using Premiere. They also improved playback performance for Red files, acceleration with multiple GPUs and certain 10-bit codecs. Character Animator got a better puppet rigging system, and Audition got AI-powered auto-ducking tools for automated track mixing.

Blackmagic
Elsewhere, Blackmagic announced a new version of Resolve, as expected. Blackmagic RAW is supported on a number of new products, but I am not holding my breath to use it in Adobe apps anytime soon, similar to ProRes RAW. (I am just happy to have regular ProRes output available on my PC now.) They also announced a new 8K Hyperdeck product that records quad 12G-SDI to HEVC files. While I don’t think that 8K will replace 4K television or cinema delivery anytime soon, there are legitimate markets that need 8K-resolution assets. Surround video and VR are one; another is live background screens instead of greenscreens for composite shots. There is no image replacement in post, since everything is captured in-camera, and your foreground objects are accurately “lit” by the screens. I expect my next major feature will be produced with that method, but the resolution wasn’t there for the director to use that technology on the one I am working on now (enter 8K…).

AJA
AJA was showing off the new Ki Pro Go, which records up to four separate HD inputs to H.264 on USB drives. I assume this is intended for dedicated ISO recording of every channel of a live-switched event or any other multicam shoot. Each channel can record up to 1080p60 at 10-bit color to H.264 files in MP4 or MOV, at bitrates up to 25Mb/s.

HP
HP had one of their existing Z8 workstations on display, demonstrating the possibilities that will be available once Intel releases their upcoming DIMM-based Optane persistent memory technology to the market. I have loosely followed the Optane story for quite a while, but had not envisioned this impacting my workflow at all in the near future due to software limitations. But HP claims that there will be options to treat Optane just like system memory (increasing capacity at the expense of speed) or as SSD drive space (with DIMM slots having much lower latency to the CPU than any other option). So I will be looking forward to testing it out once it becomes available.

Dell
Dell was showing off their relatively new 49-inch double-wide curved display. The 4919DW has a resolution of 5120×1440, making it equivalent to two 27-inch QHD displays side by side. I find that 32:9 aspect ratio to be a bit much for my tastes, with 21:9 being my preference, but I am sure there are many users who will want the extra width.

Digital Anarchy
I also had a chat with the people at Digital Anarchy about their Premiere Pro-integrated Transcriptive audio transcription engine. Having spent the last three months editing a movie that is split between English and Mandarin dialogue, needing to be fully subtitled in both directions, I can see the value in their tool-set. It harnesses the power of AI-powered transcription engines online and integrates the results back into your Premiere sequence, creating an accurate script as you edit the processed clips. In my case, I would still have to handle the translations separately once I had the Mandarin text, but this would allow our non-Mandarin speaking team members to edit the Mandarin assets in the movie. And it will be even more useful when it comes to creating explicit closed captioning and subtitles, which we have been doing manually on our current project. I may post further info on that product once I have had a chance to test it out myself.

Summing Up
There were three halls of other products to look through and check out, but overall, I was a bit underwhelmed at the lack of true innovation I found at the show this year.

Full disclosure, I was only able to attend for the first two days of the exhibition, so I may have overlooked something significant. But based on what I did see, there isn’t much else that I am excited to try out or that I expect to have much of a serious impact on how I do my various jobs.

It feels like most of the new things we are seeing are merely commoditized versions of products that may originally have been truly innovative when they were initially released, but now are just slightly more fleshed out versions over time.

There seems to be much less pioneering of truly new technology and more repackaging of existing technologies into other products. I used to come to NAB to see all the flashy new technologies and products, but now it feels like the main thing I am doing there is a series of annual face-to-face meetings, and that’s not necessarily a bad thing.

Until next year…


Mike McCarthy is an online editor/workflow consultant with over 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

Company 3 NY adds senior colorist Joseph Bicknell

Company 3 has added colorist Joseph Bicknell to its New York office. He has relocated from London, where he was co-founder and co-director of finishing house Cheat and worked on commercial campaigns and music videos, including campaigns for Nike, Mercedes and Audi and videos for A$AP Rocky and Skepta.

Bicknell started his career at age 15, working as a runner on London-based productions. After serving in nearly every aspect of production and post, he discovered his true passion lay in color grading, where artists can make creative choices quickly and see results instantly. He honed his skills first as a freelancer and then at Cheat.

He will be working on Blackmagic DaVinci Resolve. As with all Company 3 colorists, Bicknell is available at locations globally via remote color sessions.

Autodesk’s Flame 2020 features machine learning tools

Autodesk’s new Flame 2020 offers a new machine-learning-powered feature set with a host of new capabilities for Flame artists working in VFX, color grading, look development or finishing. This latest update will be showcased at the upcoming NAB Show.

Advancements in computer vision, photogrammetry and machine learning have made it possible to extract motion vectors, Z depth and 3D normals based on software analysis of digital stills or image sequences. The Flame 2020 release adds built-in machine learning analysis algorithms to isolate and modify common objects in moving footage, dramatically accelerating VFX and compositing workflows.

New creative tools include:
· Z-Depth Map Generator— Enables Z-depth map extraction analysis using machine learning for live-action scene depth reclamation. This allows artists doing color grading or look development to quickly analyze a shot and apply effects accurately based on distance from camera.
· Human Face Normal Map Generator— Since all human faces have common recognizable features (relative distance between eyes, nose, location of mouth) machine learning algorithms can be trained to find these patterns. This tool can be used to simplify accurate color adjustment, relighting and digital cosmetic/beauty retouching.
· Refraction— With this feature, a 3D object can now refract, distorting background objects based on its surface material characteristics. To achieve convincing transparency through glass, ice, windshields and more, the index of refraction can be set to an accurate approximation of real-world material light refraction.
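To make the Z-depth idea concrete, here is a hedged sketch (plain NumPy, not Flame’s implementation) of a depth-keyed grade: once you have a per-pixel depth map, an adjustment can be weighted by distance from camera, so an exposure change falls off the farther an object sits from the lens.

```python
import numpy as np

def depth_graded(image, depth, near_gain=1.0, far_gain=0.6):
    """Scale exposure by distance from camera using a per-pixel depth map.
    image: (H, W, 3) floats; depth: (H, W), smaller = nearer. Illustrative only."""
    rng = np.ptp(depth) or 1.0                     # avoid divide-by-zero on flat maps
    d = (depth - depth.min()) / rng                # normalize to 0 (near) .. 1 (far)
    gain = near_gain + (far_gain - near_gain) * d  # linear falloff with distance
    return image * gain[..., None]                 # broadcast over RGB channels

img = np.ones((2, 2, 3))                # flat gray plate
z = np.array([[0.0, 0.0],
              [1.0, 1.0]])              # bottom row is farther away
graded = depth_graded(img, z)           # bottom row darkens toward far_gain
```

In Flame the depth map comes from machine-learning analysis of the plate rather than being hand-supplied, but the downstream use — keying an adjustment off distance — is the same idea.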

Productivity updates include:
· Automatic Background Reactor— Immediately after modifying a shot, this mode is triggered, sending jobs to process. Accelerated, automated background rendering allows Flame artists to keep projects moving using GPU and system capacity to its fullest. This feature is available on Linux only, and can function on a single GPU.
· Simpler UX in Core Areas— A new expanded full-width UX layout for MasterGrade, Image surface and several Map user interfaces is now available, allowing for easier discoverability of and access to key tools.
· Manager for Action, Image, Gmask—A simplified list schematic view, Manager makes it easier to add, organize and adjust video layers and objects in the 3D environment.
· Open FX Support—Flame, Flare and Flame Assist version 2020 now include comprehensive support for industry-standard Open FX creative plugins such as Batch/BFX nodes or on the Flame timeline.
· Cryptomatte Support—Available in Flame and Flare, support for the Cryptomatte open source advanced rendering technique offers a new way to pack alpha channels for every object in a 3D rendered scene.

For single-user licenses, Linux customers can now opt for monthly, yearly and three-year single user licensing options. Customers with an existing Mac-only single user license can transfer their license to run Flame on Linux.
Flame, Flare, Flame Assist and Lustre 2020 will be available on April 16, 2019 at no additional cost to customers with a current Flame Family 2019 subscription. Pricing details can be found at the Autodesk website.

Atomos’ new Shogun 7: HDR monitor, recorder, switcher

The new Atomos Shogun 7 is a seven-inch HDR monitor, recorder and switcher that offers an all-new 1500-nit, daylight-viewable, 1920×1200 panel with a 1,000,000:1 contrast ratio and 15+ stops of dynamic range displayed. It also offers ProRes RAW recording and realtime Dolby Vision output. Shogun 7 will be available in June 2019, priced at $1,499.

The Atomos screen uses a combination of advanced LED and LCD technologies, which together offer deeper, better blacks that the company says rival OLED screens, “but with the much higher brightness and vivid color performance of top-end LCDs.”

A new 360-zone backlight is combined with this new screen technology and controlled by the Dynamic AtomHDR engine to show millions of shades of brightness and color. It allows Shogun 7 to display 15+ stops of real dynamic range on-screen. The panel, says Atomos, is also incredibly accurate, with ultra-wide color and 105% of DCI-P3 covered, allowing for the same on-screen dynamic range, palette of colors and shades that your camera sensor sees.

Atomos and Dolby have teamed up to create Dolby Vision HDR “live” — a tool that allows you to see HDR live on-set and carry your creative intent from the camera through into HDR post. Dolby has optimized its target display HDR processing algorithm, which Atomos has running inside the Shogun 7. It brings realtime automatic frame-by-frame analysis of the Log or RAW video and processes it for optimal HDR viewing on a Dolby Vision-capable TV or monitor over HDMI. Connect Shogun 7 to the Dolby Vision TV and AtomOS 10 automatically analyzes the image, queries the TV and applies the right color and brightness profiles for the maximum HDR experience on the display.

Shogun 7 records images up to 5.7kp30, 4kp120 or 2kp240 slow motion from compatible cameras, in RAW/Log or HLG/PQ over SDI/HDMI. Footage is stored directly to AtomX SSDmini or approved off-the-shelf SATA SSD drives. There are recording options for Apple ProRes RAW and ProRes, Avid DNx and Adobe CinemaDNG RAW codecs. Shogun 7 has four SDI inputs plus an HDMI 2.0 input, with both 12G-SDI and HDMI 2.0 outputs. It can record ProRes RAW in up to 5.7kp30, 4kp120 DCI/UHD and 2kp240 DCI/HD, depending on the camera’s capabilities. Also, 10-bit 4:2:2 ProRes or DNxHR recording is available up to 4Kp60 or 2Kp240. The four SDI inputs enable the connection of most quad-link, dual-link or single-link SDI cinema cameras. Pixels are preserved with data rates of up to 1.8Gb/s.

In terms of audio, Shogun 7 eliminates the need for a separate audio recorder. Users can add 48V stereo mics via an optional balanced XLR breakout cable, or select mic or line input levels, plus record up to 12 channels of 24/96 digital audio from HDMI or SDI. Monitoring selected stereo tracks is via the 3.5mm headphone jack. There are dedicated audio meters, gain controls and adjustments for frame delay.

Shogun 7 features the latest version of the AtomOS 10 touchscreen interface, first seen on the Ninja V. The new body of Shogun 7 has a Ninja V-like exterior with ARRI anti-rotation mounting points on the top and bottom of the unit to ensure secure mounting.

AtomOS 10 on Shogun 7 has the full range of monitoring tools, including Waveform, Vectorscope, False Color, Zebras, RGB parade, Focus peaking, Pixel-to-pixel magnification, Audio level meters and Blue only for noise analysis.

Shogun 7 can also be used as a portable touchscreen-controlled multi-camera switcher with asynchronous quad-ISO recording. Users can switch up to four 1080p60 SDI streams, record each plus the program output as a separate ISO, then deliver ready-for-edit recordings with marked cut-points in XML metadata straight to your NLE. The current Sumo19 HDR production monitor-recorder will also gain the same functionality in a free firmware update.

There is asynchronous switching, plus genlock in and out to connect to existing AV infrastructure. Once the recording is over, users can import the XML file into an NLE and the timeline populates with all the edits in place. XLR audio from a separate mixer or audio board is recorded within each ISO, alongside two embedded channels of digital audio from the original source. The program stream always records the analog audio feed, as well as a second track that switches between the digital audio inputs to match the switched feed.

DP Chat: The Village cinematographer William Rexer

By Randi Altman

William Rexer is a cinematographer who has worked on documentaries, music videos, commercials and narratives — both comedies and dramas. He’s frequently collaborated with writer/director Ed Burns (Friends With Kids, Newlyweds, Summertime). Recently, he’s directed photography on several series including The Get Down, The Tick, Sneaky Pete and the new NBC drama The Village.

He sat down with us to answer some questions about his love of cinematography, his process and The Village, which follows a diverse group of people living in the same apartment building in Brooklyn.

The set of The Village. Photo: Peter Kramer

How did you become interested in cinematography?
When I was a kid, my mother had a theater company and my father was an agent/producer. I grew up sleeping backstage. When I was a teen, I was running a followspot (light) for Cab Calloway. I guess there was no escaping some job in this crazy business!

My father would check out 16mm movies from the New York City public library — Chaplin, Keaton — and that would be our weekend night entertainment. When I was in 8th grade, an art cinema started in my hometown; it is now called the Cinema Arts Center in Huntington, New York. It showed cinema from all over the world, including Bergman, Fellini, Jasny. I began to see the world through films and fell in love.

What inspires you artistically?
I love going to the movies, the theater and art galleries. Films like Roma and Cold War make me have faith in the world. What mostly inspires me is checking out what my peers are up to. Tim Ives, ASC, and Tod Campbell are two friends that I love to watch. Very impressive guys. David Mullen, ASC, and Eric Moynier are doing great work on Mrs. Maisel. I guess I would say watching my peers and their work inspires me.

NBC’s The Village

How do you stay on top of advancing technology tools for achieving your vision on set or in post?
The cameras and post workflow change every few months. I check in with the rental houses to stay on top of gear. Panavision, Arri Rental, TCS, Keslow and Abel are great resources. I also stay in touch with post houses. My friends at Harbor and Technicolor are always willing to help create LUTs, evaluate cameras and lenses.

Has any recent or new technology changed the way you work?
The introduction of the Red One MX and the ARRI D-20 changed a lot of things. They made shooting high-quality images affordable and cleaner for the environment. It put 35mm size sensors out there and gave a lot of young people a chance to create.

The introduction of large-format cameras, the Red Monstro 8K VV, the ARRI LF and 65, and the Sony Venice have made my life more interesting. All these sensors are fantastic, and the new color spaces we get to work with like Red’s IPP2 are truly astounding. I like having control of depth of field and controlling where the audience looks.

What are some of your best practices or rules you try to follow on each job?
I try my best to shoot tests, create a LUT in the test phase and take the footage through the entire process to see how it holds up. I make sure that all my monitors are calibrated to match at the post house; that gets us all on the same page. Then, I’ll adjust the LUT after a few days of shooting in the field, treating the LUT like a film stock and lighting to it. I watch dailies, give notes and try to get in with the colorist/timer and work with them.

Will Rexer (center) with showrunner Mike Daniels and director Minkie Spiro. Photo: Jennifer Rhoades

Tell us about The Village. How would you describe the general look of the show?
The look of The Village is somewhere between romantic realism and magical realism. It is a world that could be. Our approach was to thread the line between the real and the possible: warm, inviting and full of potential.

Can you talk about your collaboration with the showrunner when setting the look of a project?
Mike Daniels, Minkie Spiro, Jessica Rhoades and I looked at a ton of photographs and films to find our look. The pilot designer Ola Maslik and the series designer Neil Patel created warm environments for me.

How early did you get involved in the production?
I had three weeks of prep for the pilot, and I worked with Minkie and Ola finding locations and refining the look.

How did you go about choosing the right camera and lenses to achieve the look?
The show required a decent amount of small gimbal work, so we chose the Red Monstro 8K VV using Red’s IPP2 color space. I love the camera: great look, great functionality. My team has customized the accessories to make our work on set effortless.

We used the Sigma Cine PL Primes with 180mm Leica R, Nikon 200 T2, Nikkor Zero Optik 58mm T1.2, Angenieux HR 25-250mm and some other special optics. I looked at other full-frame lenses but really liked the Sigma lenses and their character. These lenses are a nice mix of roundness and warmth and consistency.

What was your involvement with post? Who supported your vision from dailies through final grade? Have you worked with this facility and/or colorists on past projects?
Dailies were through Harbor Picture Company. I love these guys. I have worked with Harbor since they started, and they are total pros. They have helped me create LUTs for many projects, including Public Morals.

The final post for The Village was done in LA at NBC/Universal. Craig Budrick has done a great job coloring the show. I do wish that I could be in the room, but that’s not always possible.

What’s most satisfying to you about this show?
I am very proud of the show and its message. It’s a romantic vision of the world. TV and cinema often go to the dark side. I like going there, but I do think we need to be reminded of our better selves and our potential.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Review: Mzed.com’s Directing Color With Ollie Kenchington

By Brady Betzel

I am constantly looking to educate myself, no matter what the source — or subject. Whether it’s an eSports editor on YouTube showing how to make a transition in Adobe After Effects or Warren Eagles teaching color correction in Blackmagic’s DaVinci Resolve on FXPHD.com, I’m always beefing up my skills. I even learn from bad tutorials — they teach you what not to do!

But when you come across a truly remarkable learning experience, it is only fair to share it with the rest of the world. Last year I saw an ad for an MZed.com course called “Directing Color With Ollie Kenchington” and was immediately interested. These days you can find pretty much any technical tutorial you can dream of on YouTube, but truly professional, theory-based series on the level of higher education are very hard to come by. Even the ones you pay for aren’t always worth the price of admission, which is a huge letdown.

Ollie sharing his wisdom.

Once I gained access to MZed.com I wanted to watch every educational series they had. From lighting techniques with ASC member Shane Hurlbut to the ARRI Amira Camera Primer, there are over 150 hours of education available from industry leaders. However, I found my way to Directing Color…

I am often asked if I think people should go to college or a film school. My answer? If you have the money and time, you should go to college followed by film school (or do both together, if the college offers it). Not only will you learn a craft, but you will most likely spend hundreds of hours studying and visualizing the theory behind it. For example, when someone asks me about the science behind camera lenses, I can confidently answer them thanks to a physics class on lenses and optics at California Lutheran University (yes, a shameless plug).

In my opinion, a two-, four- or even 10-year education allows me to live in the grey. I am comfortable arguing for both sides of a debate, as well as the options in between — the grey. I feel like my post-high school education really allowed me to recognize and thrive in the nuances of debate. It leaves me playing devil’s advocate maybe a little too much, but also having civil and productive discussions with others without being demeaning or nasty — something we are actively missing these days. So if living in the grey is for you, I really think a college education supplemented by online or film school education is valuable (assuming you decide, as I did, that the debt is worth it).

However, I know that is not an option for everyone since it can be very expensive — trust me, I know. I am almost done paying off my undergraduate fees while still paying off my graduate ones, which I am still two or three classes away from finishing. That being said, Directing Color With Ollie Kenchington is the only online education series I have seen so far that is on the same level as some of my higher education classes. Not only is the content beautifully shot and color corrected, but Ollie gives confident and accessible lessons on how color can be used to draw the viewer’s attention to multiple parts of the screen.

Ollie Kenchington is a UK-based filmmaker who runs Korro Films. From the trailer of his Directing Color series, you can immediately see the beauty of Ollie’s work and know that you will be in safe hands. (You can read more about his background here.)

The course raises the bar for online education and will elevate audiences’ expectations of professional insight. The first module, “Creating a Palette,” covers the thinking behind creating a color palette for a small catering company. You may even want to start with the final bonus module, “Ox & Origin,” to get a look at what Ollie will be creating throughout the seven modules and roughly an hour and a half of content.

While Ollie goes over “looks,” the beauty of this course is that he walks through his internal thought process, including deciding on palettes based on color theory. He doesn’t just choose teal and orange because they look good; he chooses his color palette based on complementary colors.

Throughout the course Ollie covers some technical knowledge, including calibrating monitors and cameras, white balancing and shooting color charts to avoid incorrect color balance in post. This is important because if you skip these simple steps, your color correction session will be much harder, and time wasted fixing incorrect color balance is time taken away from the fun of color grading. All of this is delivered in easily digestible modules that range from two to 20 minutes.
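As a rough illustration of what fixing color balance in post means mechanically, here is a minimal gray-world white balance sketch in Python. This is an assumption-heavy toy, not what Ollie teaches: a chart-based workflow performs the same neutralization far more reliably by sampling the chart’s known gray patches instead of trusting the whole frame.

```python
import numpy as np

def gray_world_balance(img):
    """Gray-world white balance: assume the frame should average to
    neutral gray and scale each channel's mean to the overall mean.
    img: (H, W, 3) floats in the 0..1 range."""
    means = img.reshape(-1, 3).mean(axis=0)  # per-channel means
    gains = means.mean() / means             # per-channel correction gains
    return np.clip(img * gains, 0.0, 1.0)

# A frame with a red cast: the red channel runs hot.
warm = np.full((4, 4, 3), 0.5)
warm[..., 0] = 0.8
balanced = gray_world_balance(warm)  # all three channels now average the same
```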

The modules include Creating a Palette; Perceiving Color; Calibrating Color; Color Management; Deconstructing Color 1 – 3 and the Bonus Module Ox & Origin.

Without giving away the entire contents of Ollie’s catalog, my favorite modules in this course are the on-set modules. Maybe it’s because I am not on set that often, but I found the “thinking out loud” about color helpful. Learning why reds represent blood, and how they raise your heart rate a little, is fascinating. He even goes through practical examples of color use in films such as Whiplash.

In the final “Deconstructing Color” modules, Ollie goes into a color bay (complete with practical candle backlighting) and dives into Blackmagic’s DaVinci Resolve. He takes the course full circle: a scene he had to rush through on set gets relit in Resolve, with light added to different sides of someone’s face. And because he took the time to set up proper lighting on set, he can focus on other parts of his commercial in the grade.

Summing Up
I want to watch every tutorial MZed.com has to offer, from “Philip Bloom’s Cinematic Masterclass” to Ollie’s other course, “Mastering Color.” Unfortunately, as of this review, you have to pay an additional fee to watch the “Mastering Color” series. It seems to be an unfortunate trend in online education to charge a subscription and then charge extra when an especially good class comes along, but this one will supposedly be released to standard subscribers in due time.

MZed.com has two subscription models: MZed Pro, which is $299 for one year of streaming the standard courses, and MZed Pro Premium for $399. This includes the standard courses for one year and the ability to choose one “Premium” course.

“Philip Bloom’s Cinematic Master Class” was the Premium course I was signed up for initially, but you can choose between it and the “Mastering Color” course. You will not be disappointed with either. Even their first course, “How to Photograph Everyone,” is chock full of lighting and positioning instruction that can be applied to many aspects of videography.

I was really impressed with Directing Color With Ollie Kenchington, and if the other courses are this good, MZed.com will definitely become a permanent bookmark for me.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

VFX and color for new BT spot via The Mill

UK telco BT wanted a television spot that showcased the WiFi capabilities of its broadband hub and underlined its promise of “whole home coverage.” Sonny director Fredrik Bond visualized a fun and fast-paced spot for agency AMV BBDO, and The Mill London was brought on board to help with VFX and color. The spot is called Complete WiFi.

In the piece, the hero comes home to find his house full of soldiers, angels, dancers, fairies, a giant and a horse: characters from the myriad games and movies the family is watching simultaneously. Obviously, the look depends on multiple layers of compositing, which have to be carefully scaled to be convincing.

They also need to be very carefully color matched, with similar lighting applied, so all the layers sit together. In a traditional workflow, this would have meant a lot of loops between VFX and grading to get the best from each layer, and a certain amount of compromise as the colorist imposed changes on virtual elements to make the final grade.

To avoid this, and to speed progress, The Mill recently started using BLG for Flame, a FilmLight plugin that allows Baselight grades to be rendered identically within Flame — and with no back and forth to the color suite to render out new versions of shots. It means the VFX supervisor is continually seeing the latest grade and the colorist can access the latest Flame elements to match them in.

“Of course it was frustrating to grade a sequence and then drop the VFX on top,” explains VFX supervisor Ben Turner. “To get the results our collaborators expect, we were constantly pushing material to and fro. We could end up with more than a hundred publishes on a single job.”

With the BLG for Flame plugin, the VFX artist sees the latest Baselight grade automatically applied, either from FilmLight’s BLG format files or directly from a Baselight scene, even while the scene is still being graded — although Turner says he prefers to be warned when updates are coming.

This works because all systems have access to the raw footage. Baselight grades non-destructively, by building up layers of metadata that are imposed in realtime. The metadata includes all the grading information, multiple windows and layers, effects and relights, textures and more – the whole process. This information can be imposed on the raw footage by any BLG-equipped device (there are Baselight Editions software plugins for Avid and Nuke, too) for realtime rendering and review.
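To make that metadata-driven idea concrete, here is a minimal Python sketch of my own (an illustration only, not the actual BLG file format or any FilmLight API): the grade lives as an ordered stack of operations, and the raw pixels are transformed only at render time, so the source is never modified and any device holding the same metadata reproduces the grade identically.

```python
# Illustrative sketch of non-destructive, metadata-driven grading.
# The operation names and structure are hypothetical, not the BLG spec.

def apply_grade(pixel, grade):
    """Render one RGB pixel (0.0-1.0 floats) through a grade stack."""
    r, g, b = pixel
    for op in grade:
        if op["type"] == "exposure":        # multiply all channels
            gain = op["gain"]
            r, g, b = r * gain, g * gain, b * gain
        elif op["type"] == "saturation":    # blend toward luma (Rec. 709 weights)
            luma = 0.2126 * r + 0.7152 * g + 0.0722 * b
            s = op["amount"]
            r = luma + (r - luma) * s
            g = luma + (g - luma) * s
            b = luma + (b - luma) * s
    # clamp to legal range only at the end of the stack
    return tuple(max(0.0, min(1.0, c)) for c in (r, g, b))

raw = (0.5, 0.25, 0.125)                    # untouched camera value
grade = [                                    # the "metadata", not pixels
    {"type": "exposure", "gain": 1.2},
    {"type": "saturation", "amount": 0.8},
]
graded = apply_grade(raw, grade)
print(raw)      # the source value is unchanged
print(graded)
```

Because the grade is just data, shipping it to another machine (a Flame suite, a review room) costs almost nothing compared to rendering and transferring new versions of every shot.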

That is important because it also allows remote viewing. For this BT spot, director Bond was back in Los Angeles by the time post began. He sat in a calibrated room at The Mill in LA and could see the graded images at every stage. He could react quickly to the first animation tests.

“I can render a comp and immediately show it to a client with the latest grade from The Mill’s colorist, Dave Ludlam,” says Turner. “When the client really wants to push a certain aspect of the image, we can ensure through both comp and grade that this is done sympathetically, maintaining the integrity of the image.”

(L-R) VFX supervisor Ben Turner and colorist Dave Ludlam.

Turner admits that it means more to-ing and fro-ing, but that is a positive benefit. “If I need to talk to Dave then I can pop in and solve a specific challenge in minutes. By creating the CGI to work with the background, I know that Dave will never have to push anything too hard in the final grade.”

Ludlam agrees that this is a complete change, but extremely beneficial. “With this new process, I am setting looks but I am not committing to them,” he says. “Working together I get a lot more creative input while still achieving a much slicker workflow. I can build the grade and only lock it down when everyone is happy.

“It is a massive speed-up, but more importantly it has made our output far superior. It gives everyone more control and — with every job under huge time pressure — it means we can respond quickly.”

The spot was offlined by Patric Ryan from Marshall Street. Audio post was via 750mph with sound designers Sam Ashwell and Mike Bovill.

Behind the Title: Nice Shoes animator Yandong Dino Qiu

This artist/designer has taken to sketching people on the subway to keep his skills fresh and mind relaxed.

NAME: Yandong Dino Qiu

COMPANY: New York’s Nice Shoes

CAN YOU DESCRIBE YOUR COMPANY?
Nice Shoes is a full-service creative studio. We offer design, animation, VFX, editing, color grading and VR/AR, working with agencies, brands and filmmakers to help realize their creative vision.

WHAT’S YOUR JOB TITLE?
Designer/Animator

WHAT DOES THAT ENTAIL?
Helping our clients to explore different looks in the pre-production stage, while aiding them in getting as close as possible to the final look of the spot. There’s a lot of exploration and trial and error as we try to deliver beautiful still frames that inform the look of the moving piece.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
Not so much for the title, but for myself, design and animation can be quite broad. People may assume you’re only 2D, but the work also involves a lot of other skill sets, such as 3D lighting and rendering. It’s pretty close to a generalist role that requires you to know nearly every piece of software as well as to turn things around very quickly.

WHAT TOOLS DO YOU USE?
Photoshop, After Effects, Illustrator, InDesign — the full Adobe Creative Suite — and Maxon Cinema 4D.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Pitch and exploration. At that stage, all possibilities are open. The job is alive… like a baby. You’re seeing it form and helping to make new life. Before this, you have no idea what it’s going to look like. After this phase, everyone has an idea. It’s very challenging, exciting and rewarding.

WHAT’S YOUR LEAST FAVORITE?
Revisions. Especially toward the end of a project. Everything is set up. One little change will affect everything else.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
2:15pm. It’s right after lunch. You know you have the whole afternoon. The sun is bright. The mood is light. It’s not too late for anything.

Sketching on the subway.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I would be a Manga artist.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
La Mer. Frontline. Friskies. I’ve also been drawing during my commute every day, sketching the people I see on the subway. I’m trying to post every week on Instagram. I think it’s important for artists to keep to a routine. I started up with this at the beginning of 2019, and there’ve been about 50 drawings already. Artists need to keep their pen sharp all the time. By doing these sketches, I’m not only benefiting my drawing skills, but I’m improving my observation of shapes and compositions, which is extremely valuable for work. Being able to break down shapes and components is a key principle of design, and honing that skill helps me in responding to client briefs.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
TED-Ed What Is Time? We had a lot of freedom in figuring out how to animate Einstein’s theories in a fun and engaging way. I worked with our creative director Harry Dorrington to establish the look and then with our CG team to ensure that the feel we established in the style frames was implemented throughout the piece.

TED-Ed What Is Time?

The film was extremely well received. There was a lot of excitement at Nice Shoes when it premiered, and TED-Ed’s audience seemed to respond really warmly as well. It’s rare to see so much positivity in the YouTube comments.

NAME SOME TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My Wacom tablet for drawing and my iPad for reading.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I take time and draw for myself. I love that drawing and creating is such a huge part of my job, but it can get stressful and tiring only creating for others. I’m proud of that work, but when I can draw something that makes me personally happy, any stress or exhaustion from the work day just melts away.

FilmLight offers additions to Baselight toolkit

FilmLight will be at NAB showing updates to its Baselight toolkit, including T-Cam v2. This is FilmLight’s new and improved color appearance model, which allows the user to render an image for all formats and device types with confidence of color.

It combines with the Truelight Scene Looks and ARRI Look Library, now implemented within the Baselight software. “T-CAM color handling with the updated Looks toolset produces a cleaner response compared to creative, camera-specific LUTs or film emulations,” says Andrea Chlebak, senior colorist at Deluxe’s Encore in Hollywood. “I know I can push the images for theatrical release in the creative grade and not worry about how that look will translate across the many deliverables.”

FilmLight has added what it calls “a new approach to color grading” with the addition of Texture Blend tools, which allow the colorist to apply any color grading operation dependent on image detail. This gives the colorist fine control over the interaction of color and texture.

Other workflow improvements aimed at speeding the process include enhanced cache management; a new client view that displays a live web-based representation of a scene showing current frame and metadata; and multi-directory conform for a faster and more straightforward conform process.

The latest version of Baselight software also includes per-pixel alpha channels, eliminating the need for additional layer mattes when compositing VFX elements. Tight integration with VFX suppliers, including Foundry Nuke and Autodesk, means that new versions of sequences can be automatically detected, with the colorist able to switch quickly between versions within Baselight.

Shooting, posting New Republic’s indie film Sister Aimee

After a successful premiere at the Sundance Film Festival, New Republic Studios’ Sister Aimee screened at this month’s SXSW. The movie tells the story of an infamous American evangelist of the 1920s, Sister Aimee Semple McPherson, who gets caught up in her lover’s dreams of Mexico and finds herself on a road trip toward the border.

Sister Aimee shot at the newly renovated New Republic Studios near Austin, Texas, over two and a half weeks. “Their crew used our 2,400-square-foot Little Bear soundstage, our 3,000-square-foot Lone Wolf soundstage, our bullpen office space and numerous exterior locations in our backlot,” reports New Republic Studios president Mindy Raymond, adding that the Sister Aimee production also had access to two screening rooms with 5.1 surround sound, HDMI hookups to 4K monitors and theater-style leather chairs to watch dailies. The film also hit the road, shooting in the New Mexico desert.

L-R: Directors Samantha Buck, Marie Schlingmann at SXSW. Credit: Harrison Funk

Co-written and co-directed by Samantha Buck and Marie Schlingmann, the movie takes some creative license with the story of Aimee. “We don’t look for factual truth in Aimee’s journey,” they explain. “Instead we look for a more timeless truth that says something about female ambition, the female quest for immortality and, most of all, the struggle for women to control their own narratives. It becomes a story about storytelling itself.”

The film, shot by cinematographer Carlos Valdes-Lora at 3.2K ProRes 4444 XQ on an Arri Alexa Mini, was posted at Dallas and Austin-based Charlieuniformtango.

We reached out to the DP and the post team to find out more.

Carlos, why did you choose the package of the Alexa and Cooke Mini S4 Primes?
Carlos Valdes-Lora: In early conversations with the directors, we all agreed that we didn’t want Sister Aimee to feel like a traditional period movie. We didn’t want to use softening filters or vintage lenses. We aimed instead for clear images, deep focus and a rich color palette that remains grounded in the real world. We felt that this would lend the story a greater sense of immediacy and draw the viewer closer to the characters. Following that same thinking, we worked very extensively with the 25mm and 32mm, especially in closeups and medium closeups, emphasizing accessibility.

The Cooke Mini S4s are a beautiful and affordable set (relative to our other options). We like the way they give deep dimensionality and warmth to faces, and how they create a slightly lower-contrast image compared to the other modern lenses we looked at. They quickly became the right choice for us, striking the right balance between quality, size and value.

The Cookes paired with the Alexa Mini gave us a lightweight camera system with a very contained footprint, and we needed to stay fast and lean due to our compressed shooting schedule and often tight shooting quarters. The Chapman Cobra dolly was a big help in that regard as well.

What was the workflow to post like?
Charlieuniformtango producers Bettina Barrow, Katherine Harper, David Hartstein: Post took place primarily between Charlieuniformtango’s Dallas and Austin offices. Post strategizing started months before the shoot, and active post truly began when production began in July 2018.

Tango’s Evan Linton handled dailies brought in from the shoot, working alongside editor Katie Ennis out of Tango’s Austin studio, to begin assembling a rough cut as shooting continued. Ennis continued to cut at the studio through August with directors Schlingmann and Buck.

Editorial then moved back to the directors’ home state of New York to finish the cut for Sundance. (Editor Ennis, who four-walled out of Tango Austin for the first part of post, went to New York with the directors, working out of a rented space.)

VFX and audio work started early at Tango, with continuously updated timelines coming from editorial, working to have certain locked shots finished for the Sundance submission while saving much of the cleanup and other CG-heavy shots for the final picture lock.

Tango audio engineer Nick Patronella also tackled dialogue edit, sound design and mix for the submission out of the Dallas studio.

Can you talk about the VFX?
Barrow, Harper, Hartstein: The cut was locked in late November, and the heavy lifting really began. With delivery looming, Tango’s Flame artists Allen Robbins, Joey Waldrip, David Hannah, David Laird, Artie Peña and Zack Smith divided the effects shots, which ranged from environmental and period-specific cleanup to beauty work such as de-aging, crowd simulation and CG sign creation.

(L-R) Tango’s Artie Peña, Connor Adams and Allen Robbins in one of the studio’s Flame suites.

3D artist Connor Adams used Houdini, Mixamo and Maya to create CG elements and crowds, with final comps done in Nuke and sent to Flame for final color. Over 120 VFX shots were handled in total, and Flame was the go-to for effects. Color and much of the effects happened simultaneously. It was a nice workflow, as the project didn’t have major VFX needs that would have impacted color.

What about the color grade?
Barrow, Harper, Hartstein: Directors Buck and Schlingmann and DP Valdes-Lora worked with Tango colorist Allen Robbins to craft the final look of the film — with the color grade also done in Flame. The trio had prepped shooting for a Kodachrome-style look, especially for the exteriors, but really overall. They found important reference in selections of Robert Capa photographs.

Buck, Schlingmann and Valdes-Lora responded mostly to Kodachrome’s treatment of blues, browns, tans, greens and reds (while staying true to skin tone), but also to its gamma values, not being afraid of deep shadows and contrast wherever appropriate. Valdes-Lora wanted to avoid lighting/exposing to a custom LUT on set that would reflect this kind of Kodachrome look, in case they wanted to change course during the process. With the help of Tango, however, they discovered that dialing back the Capa look grounded the film a little more and made the characters “feel” more accessible. The roots of the inspiration remained in the image, but a little more naturalism, a little more softness, served the story better.

Because of that, they monitored on set with Alexa 709, which Valdes-Lora felt would still leave enough room when exposing. Production designer Jonathan Rudak (another regular collaborator with the directors) was on the same page during prep in terms of reflecting this Capa color style, and the practical team did what they could to make sure the set elements complemented this approach.

What about the audio post?
Barrow, Harper, Hartstein: With the effects and color almost complete, the team headed to Skywalker Ranch for a week of final dialogue edit, mix, sound design and Foley, led by Skywalker’s Danielle Dupre, Kim Foscato and E. Larry Oatfield. The team was also able to simultaneously approve color sections in Skywalker’s Stag Theater, allowing for an ultra-efficient schedule. With the final mix in hand, the film was mastered just after Christmas so that DCP production could begin.

Since a portion of the film was musical, how complex was the audio mix?
Skywalker sound mixer Dupre: The musical number was definitely one of the most challenging but rewarding scenes to design and mix. It was such a strong creative idea that played so deeply into the main character. The challenge was in striking a balance between tying it into the realism of the film while also leaning into the grandiosity of the musical to really sell the idea.

It was really fun to play with a combination of production dialogue and studio recordings to see how we could make it work. It was also really rewarding to create a soundscape that starts off minimally and simply and transitions to Broadway scale almost undetectably — one of the many exciting parts to working with creative and talented filmmakers.

What was the biggest challenge in post?
Barrow, Harper, Hartstein: Finishing a film in five to six weeks during the holidays was no easy feat. Luckily, we were able to have our directors hands-on for all final color, VFX and mix. Collaborating in the same room is always the best when you have no time to spare. We had a schedule where each day was accounted for — and we stuck to it almost down to the hour.

 

Goldcrest adds 4K theater and colorist Marcy Robinson

Goldcrest Post in New York City has expanded its picture finishing services, adding veteran colorist Marcy Robinson and unveiling a new, state-of-the-art 4K theater that joins an existing theater and other digital intermediate rooms. The moves are part of a broader strategy to offer film and television productions packaged post services encompassing editorial, picture finishing and sound.

Robinson brings experience working in features, television, documentaries, commercials and music videos. She has recently been working as a freelance colorist, collaborating with directors Noah Baumbach and Ang Lee. Her background also includes 10 years at the creative boutique Box Services, best known for its work in fashion advertising.

Robinson, who was recruited to Goldcrest by Nat Jencks, the facility’s senior colorist, says she was attracted by the opportunity to work on a diversity of high-quality projects. Robinson’s first projects for Goldcrest include the Netflix documentary The Grass is Greener and an advertising campaign for Reebok.

Robinson started out in film photography and operated a custom color photographic print lab for 13 years. She became a digital colorist after joining Box Services in 2008. As a freelance colorist, her credits include the features Billy Lynn’s Long Halftime Walk, De Palma and Frances Ha, the HBO documentary Suited, commercials for Steve Madden, Dior and Prada, and music videos for Keith Urban and Madonna.

Goldcrest’s new 4K theater is set up for the dual purposes of feature film and HDR television mastering. Its technical features include a Blackmagic DaVinci Resolve Linux Advanced color correction and finishing system, a Barco 4K projector, a Screen Research projection screen and Dolby-calibrated 7.1 surround sound.

Posting director Darren Lynn Bousman’s horror film, St. Agatha

Atlanta’s Moonshine Post helped create a total post production pipeline — from dailies to finishing — for the film St. Agatha, directed by Darren Lynn Bousman (Saw II, Saw III, Saw IV, Repo! The Genetic Opera).

The project, from producers Seth and Sara Michaels, was co-edited by Moonshine’s Gerhardt Slawitschka and Patrick Perry and colored by Moonshine’s John Peterson.

St. Agatha is a horror film that shot in the town of Madison, Georgia. “The house we needed for the convent was perfect, as the area was one of the few places that had not burned down during the Civil War,” explains Seth Michaels. “It was our first time shooting in Atlanta, and the number one reason was because of the tax incentive. But we also knew Georgia had an infrastructure that could handle our production.”

What the producers didn’t know during production was that Moonshine Post could handle all aspects of post; the studio was initially brought in only for dailies. With the opportunity to do a producer’s cut, they returned to Moonshine Post.

Time and budget dictated everything, and Moonshine Post was able to offer two editors working in tandem to edit a final cut. “Why not cut in collaboration?” suggested Drew Sawyer, founder of Moonshine Post and executive producer. “It will cut the time in half, and you can explore different ideas faster.”

“We quite literally split the movie in half,” reports Perry, who, along with Slawitschka, cut on Adobe Premiere. “It’s a 90-minute film, and there was a clear break. It’s a little unusual, I will admit, but almost always when we are working on something, we don’t have a lot of time, so splitting it in half works.”

Patrick Perry

Gerhardt Slawitschka

“Since it was a producer’s cut, when it came to us it was in Premiere, and it didn’t make sense to switch over to Avid,” adds Slawitschka. “Patrick and I can use both interchangeably, but prefer Premiere; it offers a lot of flexibility.”

“The editors, Patrick and Gerhardt, were great,” says Sara Michaels. “They watched every single second of footage we had, so when we recut the movie, they knew exactly what we had and how to use it.”

“We have the same sensibilities,” explains Gerhardt. “On long-form projects we take a feature in tandem, maybe split it in half or in reels. Or, on a TV series, each of us take a few episodes, compare notes, and arrive at a ‘group mind,’ which is our language of how a project is working. On St. Agatha, Patrick and I took a bit of a risk and generated a four-page document of proposed thoughts and changes. Some very macro, some very micro.”

Colorist John Peterson, a partner at Moonshine Post, worked closely with the director on final color using Blackmagic’s Resolve. “From day one, the first looks we got from camera raw were beautiful.” Typically, projects shot in Atlanta ship back to a post house in a bigger city, “and maybe you see it and maybe you don’t. This one became a local win, we processed dailies, and it came back to us for a chance to finish it here,” he says.

Peterson liked working directly with the director on this film. “I enjoyed having him in session because he’s an artist. He knew what he was looking for. On the flashbacks, we played with a variety of looks to decide which one we liked. We added a certain amount of film grain, and stylistically, for some scenes, we used heavy vignetting and heavy keys with isolation windows. Darren is a director, but he also knows the terminology, which gave me the opportunity to take his words and put them on the screen for him. At the end of the week, we had a successful film.”

John Peterson

The recent expansion of Moonshine Post, which included a partnership with audio company Bare Knuckles Creative and visual effects company Crafty Apes, “was necessary, so we could take on the kind of movies and series we wanted to work with,” explains Sawyer. “But we were very careful about what we took and how we expanded.”

They recently secured two AMC series, along with projects from Netflix. “We are not trying to do all the post in town, but we want to foster and grow the post production scene here so that we can continue to win people’s trust and solidify the Atlanta market,” he says.

Uncork’d Entertainment’s St. Agatha was in theaters and became available on-demand starting February 8. Look for it on iTunes, Amazon, Google Play, Vudu, Fandango Now, Xbox, Dish Network and local cable providers.

Review: Tangent Wave 2: Color Correction Surface

By Brady Betzel

Have you ever become frustrated while color correcting footage after a long edit due to having to learn a whole new set of shortcuts and keystrokes?

Whether you’re in Adobe Premiere, Avid Media Composer or Blackmagic Resolve, there are hundreds of shortcuts you can learn to become a highly efficient colorist. If you want to become the most efficient colorist you can be, you need an external hardware color panel (clearly we are talking to those who provide color as part of their job but not as their job). You may have seen professional color correction panels like the Blackmagic DaVinci panel or the FilmLight Blackboard 2 panel for Baselight. Those are amazing and take a long time of repetitive use to really master (think Malcolm Gladwell’s 10,000 Hour Rule). Not to mention they can cost $30,000 or more… yikes! So if you can’t quite justify $30,000 for a dedicated color correction panel, don’t fret. You still have options.

One of those options is the Tangent Wave, which is at the bottom end of the price range. Before I dig in, I need to note that it only works with Avid if you also use the FilmLight Baselight for Media Composer plugin. So Avid users, keep that in mind.

Tangent has one of the most popular sub-$3,500 sets of panels, used constantly by editing pros: Tangent Elements. I love the Tangent Elements panels, but at just under $3,500 they aren’t cheap, and I can understand how a lot of people could be put off — plus, they can take up your entire desktop real estate with four panels. Blackmagic sells its Mini panel for just under $3,000, but it only works with Resolve. So if you bounce around between apps, that one isn’t for you.

Tangent released the first-generation Wave panel around 2010, and it took another eight years to realize that people want color correction panels but don’t want to spend a lot of money. That’s when the company released the Tangent Wave 2. The original Tangent Wave was a great color correction panel, but in my opinion it was ergonomically inefficient. It was awkward — but at around $1,500 it was one of the only options that was semi-affordable.

In 2016, Tangent released the Tangent Ripple, which has a limited toolset, including three trackballs with dials, reset buttons and shift/alt buttons, costing around $350. You can read my review here. That’s a great price point but it is really limiting. If you are doing very basic color correction, like hue corrections and contrast moves, this is great. But if you want to dive into Power Windows, Hue Qualifiers or maybe even cycling through LUTs you need more. This is where the Tangent Wave 2 comes into play.

Tangent Wave 2
The Tangent Wave 2 works with the Tangent Mapper software, an app that helps customize the key and knob mapping if the application you are using lets you customize the keys. It just so happens that Premiere is customizable but Resolve is not (no matter what panel you are using, not just Tangent).

The Wave 2 is much more comfortable than the original Wave and has enough buttons to get 60% of the shortcuts in these apps. If you are in Premiere you can re-map keys and get where you want much faster than Resolve. However, Resolve’s mapping is set by Blackmagic and has almost everything you need. What it doesn’t have mapped is quickly accessible by keyboard or mouse.

If you’ve ever used the Element panels you will remember its high-grade components (which probably added to the price tag) — including the trackballs and dials. Everything feels very professional on the Elements, very close to the high-end Precision Panels or DaVinci Panels. The Wave 2’s components are on the lower end. They aren’t bad components, just cheaper. The trackballs are a little looser in their sockets, in fact don’t turn the panel over or your balls will fall out (or do it to someone else if you want to play a joke, just ask for the serial # on the bottom of the panel). The accuracy on the trackballs doesn’t feel as tight as the Elements, but is usable. The knobs and buttons feel much closer to the level of the Element panels. The overall plastic casing is much lighter and feels a lot cheaper.

However, for around $900 (at the time of writing this review), the Tangent Wave 2 is arguably the best deal for a color correction panel there is. Between the extremely efficient button layout and beautiful ice-white OLED display, you will be hard pressed to find a better product for the money. It is also around 15 inches wide, 11 inches deep and 2 inches tall, which allows you to keep your keyboards and mice on your desk, unlike the Elements, which can take up an entire desktop on their own.

Before you plug in your Wave 2 you should download the latest Tangent Hub and Mapper. Once you open the Mapper app you will understand the button and knob layout and how to customize the keys (unless you are using Resolve). In Premiere, I immediately started pressing buttons and turning knobs and found out that once inside of the Lumetri tabs the up and down arrows on the panel worked in the reverse of how my brain wanted them to work. I jumped into the Mapper app, reassigned the up and down arrows to the way I wanted to cycle through the Lumetri panels and without restarting I was up and running. It was awesome not to have to restart anything.

As you go, you will find that each NLE or color app has its own issues, and it might take a few tries to get your panel set up the way you want it. I really liked how a few recent LUTs I had installed in the Premiere LUT directory showed up on the panel’s OLED when cycling through LUTs. It was really helpful, and I didn’t have to use my mouse to click the drop-down LUT menu. When you go into the Creative Looks, you can cycle through those straight from the Wave 2, which is very helpful. Other than that, you can control almost every single thing in the Lumetri interface directly from the panel, including going full screen to review your color.

If you use Resolve 15, you will really like the Tangent Wave 2. I did notice that the panel worked much smoother and was way more responsive inside of Resolve than inside of Premiere. There could be a few reasons for that, but I work in and out of these apps almost daily and it definitely felt a little delayed in Premiere Pro.

Once you get into the nitty gritty of Resolve, you will be a little hamstrung when accessing items like the Hue vs. Hue curves. You can’t pinpoint hues on the curve window and adjust them straight from the Wave 2; that is where you will want to look at the Element panels. Another missing shortcut is Offset — there are only three trackballs, so you cannot access the fourth wheel, aka Offset. However, you can reach Offset through the knobs, and I actually found controlling Offset with the knobs oddly satisfying and more accurate than a trackball. It’s a different way of thinking, and I think I might like it.

Resolve’s GUI doesn’t mirror where you are on the Wave 2 panel, so I wasn’t always sure of my place: on the Resolve GUI I might have been in the Curves tab, while on the Wave 2 HUD I was on the Power Windows tab. If Tangent could sync the Wave 2 and the Resolve GUI so that they match, the Wave 2 would be a lot easier to use and far less confusing. I wouldn’t even call it an update; it’s a legitimate missing feature.

Summing Up
In the end, you will not find another traditional color correction panel that works with multiple applications and satisfies as many of a professional colorist’s requirements for around $900.

I love the Tangent Element panels, but at a fraction of their price, the Tangent Wave 2 is a great solution without spending what could be used as a down payment on a car.

Check out the Tangent Wave 2 on Tangent’s website.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Colorist Christopher M. Ray talks workflow for Alexa 65-shot Alpha

By Randi Altman

Christopher M. Ray is a veteran colorist with a varied resume that includes many television and feature projects, including Tomorrowland, Warcraft, The Great Wall, The Crossing, Orange Is the New Black, Quantico, Code Black and Alpha. These projects have taken Ray all over the world, including remote places throughout North America, Europe, Asia and Africa.

We recently spoke with Ray, who is on staff at Burbank’s Picture Shop, to learn more about his workflow on the feature film Alpha, which focuses on a young man trying to survive alone in the wilderness after he’s left for dead during his first hunt with his Cro-Magnon tribe.

Ray was dailies colorist on the project, working with supervising DI colorist Maxine Gervais. Gervais of Technicolor won an HPA Award for her work on Alpha in the Outstanding Color Grading — Feature Film category.

Let’s find out more….

Chris Ray and Maxine Gervais at the HPA Awards.

How early did you get involved in Alpha?
I was approached about working on Alpha right before the start of principal photography. From the beginning I knew that it was going to be a groundbreaking workflow. I was told that we would be working with the ARRI Alexa 65 camera, mainly in an on-set color grading trailer, and that we would be using FilmLight’s Daylight software.

Once I was on board, our main focus was to design a comprehensive workflow that could accommodate on-set grading and Daylight software while adapting to the ever-changing challenges that the industry brings. Being involved from the start was actually a huge perk for me. It gave us the time we needed to design and really fine-tune the extensive workflow.

Can you talk about working with the final colorist Maxine Gervais and how everyone communicated?
It was a pleasure working with Maxine. She’s really dialed in to the demands of our industry. She was able to fly to Vancouver for a few days while we were shooting the hair/makeup tests, which is how we were able to form in-person communication. We were able to sit down and discuss creative approaches to the feature right away, which I appreciated as I’m the type of person that likes to dive right in.

At the film’s conception, we set in motion a plan to incorporate a Baselight Linked Grade (BLG) color workflow from FilmLight. This would allow my color grades in Daylight to transition smoothly into Maxine’s Baselight software. We knew from the get-go that there would be several complicated “day for night” scenes that Maxine and I would want to bring to fruition right away. Using the BLG workflow, I was able to send her single ARRIRAW frames that gave the “day for night” look we were searching for. She was able to then send them back to me via a BLG file. Even in remote locations, it was easy for me to access the BLG grade files via the Internet.

[Maxine Gervais weighs in on working with Ray: “Christopher was great to work with. As the workflow on the feature was created from scratch, he implemented great ideas. He was very keen on the whole project and was able to adapt to the ever-changing challenges of the show. It is always important to have on-set color dialed in correctly, as it can be problematic if it is not accurately established in production.”]

How did you work with the DP? What direction were you given?
Being on set, it was very easy for DP Martin Gschlacht to come over to the trailer and view the current grade I was working on. Like Maxine, Martin already had a very clear vision for the project, which made it easy to work with him. Oftentimes, he would call me over on set and explain his intent for the scene. We would brainstorm ways I could assist him in making his vision come to life. Audiences rarely see raw camera files, or realize how much color can influence the story being told.

It also helps that Martin is a master of aesthetic. The content being captured was extremely striking; he has this natural intuition about what look is needed for each environment that he shoots. We shot in lush rain forests in British Columbia and arid badlands in Alberta, which each inspired very different aesthetics.

Whenever I had a bit of down time, I would walk over to set and just watch them shoot, like a fly on the wall quietly observing and seeing how the story was unfolding. As a colorist, it’s so special to be able to observe the locations on set. Seeing the natural desaturated hues of dead grass in the badlands or the vivid lush greens in the rain forest with your own eyes is an amazing opportunity many of us don’t get.

You were on set throughout? Is that common for you?
We were on set throughout the entire project as a lot of our filming locations were in remote areas of British Columbia and Alberta, Canada. One of our most demanding shooting locations was Dinosaur Provincial Park in Brooks, Alberta. The park is a UNESCO World Heritage site that no one had been allowed to film at prior to this project. I needed easy access to the site in order to communicate with the film’s executive team and production crew. They were able to screen footage in their trailer and we had this seamless back-and-forth workflow. This also allowed them to view high-quality files in a comfortable and controlled environment. Also, the ability to flag any potential issues and address them immediately on set was incredibly valuable with a film of such size and complexity.

Alpha was actually the first time I worked in an on-set grading trailer. In the past I usually worked out of the production office. I have heard of other films working with an on-set trailer, but I don’t think I would say that it is overly common. Sometimes, I wish I could be stationed on set more often.

The film was shot mostly with the Alexa 65, but included footage from other formats. Can you talk about that workflow?
The film was mostly shot on the Alexa 65, but several other formats were used as well. For most of the shoot there was a second unit shooting with Alexa XT and Red Weapon cameras, with a splinter unit shooting B-roll footage on Canon 1D, 5D and Sony A7S. In addition, there were units in Iceland and South Africa shooting VFX plates on a Red Dragon.

By the end of the shoot, there were several different camera formats and over 10 different resolutions. We used the 6.5K Alexa 65 resolution as the master resolution and mapped all the others into it.

The Alexa 65 camera cards were backed up to 8TB “sled” transfer drives using a Codex Vault S system. The 8TB transfer drives were then sent to the trailer where I had two Codex Vault XL systems — one was used for ingesting all of the footage into my SAN and the second was used to prepare footage for LTO archival. All of the other unit footage was sent to the trailer via shuttle drives or Internet transfer.

After the footage was successfully ingested to the SAN with a checksum verification, it was ready to be colored, processed, and then archived. We had eight LTO6 decks running 24/7, as the main focus was to archive the exorbitant amounts of high-res camera footage that we were receiving. Just the Alexa 65 alone was about 2.8TB per hour for each camera.
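The checksum verification Ray describes is conceptually simple: hash the source, hash the copy, and only trust the ingest if the two digests match. Here is a minimal Python sketch of that idea (the function name, file names and the choice of SHA-256 are illustrative assumptions; production tools such as the Codex Vault use their own verification schemes):

```python
import hashlib
import pathlib
import shutil


def ingest_with_checksum(src: pathlib.Path, dst: pathlib.Path) -> bool:
    """Copy a camera file to the SAN and confirm the copy is bit-identical."""

    def sha256(path: pathlib.Path) -> str:
        h = hashlib.sha256()
        with open(path, "rb") as f:
            # Hash in 1MB chunks so multi-terabyte camera files
            # never have to fit in memory.
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    shutil.copy2(src, dst)  # copy the file, preserving timestamps
    return sha256(src) == sha256(dst)
```

Only after this comparison succeeds would footage be released for grading and LTO archival; a mismatch means the transfer drive must be re-read before anything is deleted.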

Had you worked with Alexa 65 footage previously?
Many times. A few years ago, I was in China for seven months working on The Great Wall, which was one of the first films to shoot with the Alexa 65. I had a month of in-depth pre-production with the camera — testing, shooting and honing the camera’s technology. Working very closely with Arri and Codex technicians during this time, I was able to design the most efficient workflow possible. Even as the shoot progressed, I continued to communicate closely with both companies. As new challenges arose, we developed and implemented solutions that kept production running smoothly.

The workflow we designed for The Great Wall was very close to the workflow we ended up using on Alpha, so it was a great advantage that I had previous experience working in-depth with the camera.

What were some of the challenges you faced on this film?
To be honest, I love a challenge. As colorists, we are thrown into tricky situations every day. I am thankful for these challenges; they improve my craft and make me more efficient at problem solving. One of the biggest challenges I faced on this particular project was coordinating so many different units, given the sheer volume of footage and the dozens of formats involved.

We had to be accessible around the clock, most of us working 24 hours a day. Needless to say, I made great friends with the transportation driving team and the generator operators. I think they would agree that my grading trailer was one of their largest challenges on the film since I constantly needed to be on set and my work was being imported/exported in such high resolutions.

In the end, as I was watching this absolutely gorgeous film in the theater it made sense. Working those crazy hours was absolutely worth it — I am thankful to have worked with such a cohesive team and the experience is one I will never forget.

Autodesk cloud-enabled tools now work with BeBop post platform

Autodesk has enabled use of its software in the cloud — including 3DS Max, Arnold, Flame and Maya — and BeBop Technology will deploy the tools on its cloud-based post platform. The BeBop platform enables processing-heavy post projects, such as visual effects and editing, in the cloud on powerful and highly secure virtualized desktops. Creatives can process, render, manage and deliver media files from anywhere on BeBop using any computer and an Internet connection as modest as 20Mbps.

The ongoing deployment of Autodesk software on the BeBop platform mirrors the ways BeBop and Adobe work closely together to optimize the experience of Adobe Creative Cloud subscribers. Adobe applications have been available natively on BeBop since April 2018.

Autodesk software users will now also gain access to BeBop Rocket Uploader, which enables ingestion of large media files at incredibly high speeds for a predictable monthly fee with no volume limits. Additionally, BeBop Over the Shoulder (OTS) enables secure and affordable remote collaboration, review and approval sessions in real-time. BeBop runs on all of the major public clouds, including Amazon Web Services (AWS), Google Cloud Platform (GCP), and Microsoft Azure.

Lowepost offering Scratch training for DITs, post pros

Oslo, Norway-based Lowepost, which offers an online learning platform for post production, has launched an Assimilate Scratch Training Channel targeting DITs and post pros. This training includes an extensive series of tutorials that help guide a post pro or DIT through the features of an entire Scratch workflow. Scratch offers tools spanning dailies, conform, color grading, visual effects, compositing, finishing, VR and live streaming.

“We’re offering in-depth training of Scratch via comprehensive tutorials developed by Lowepost and Assimilate,” says Stig Olsen, manager of Lowepost. “Our primary goal is to make Scratch training easily accessible to all users and post artists for building their skills in high-end tools that will advance their expertise and careers. It’s also ideal for DaVinci Resolve colorists who want to add another excellent conform, finishing and VR tool to their tool kit.”

Lowepost is offering three months of free access to the Scratch training. The first tutorial, Scratch Essential Training, is also available now. A free 30-day trial of Scratch is available via their website.

Lowepost’s Scratch Training Channel is available for an annual fee of $59 (US).

Phil Azenzer returns to Encore as senior colorist

Industry veteran and senior colorist Phil Azenzer, one of Encore’s original employees, has returned to the company, bringing with him a credit list that includes TV and features. He was most recently with The Foundation.

When he first started at Encore he was a color assistant, learning the craft and building his client base. Over his post production career, Azenzer has collaborated with many notable directors including David Lynch, Steven Spielberg and David Nutter, as well as high-profile DPs such as Robert McLachlan and John Bartley.

His credits include The X-Files, Six Feet Under, Entourage, Big Love, Bates Motel, Bloodline and most recently, seasons four and five of Black-ish and seasons one and two of Grown-ish.

“Coming back to Encore is really a full circle journey for me, and it feels like coming home,” shared Azenzer. “I learned my craft and established my career here. I’m excited to be back at Encore, not just because of my personal history here, but because it’s great to be at an established facility with the visibility and reputation that Encore has. I’m looking forward to collaborating with lots of familiar faces.”

Azenzer is adept at helping directors and cinematographers create visual stories. With the flexibility to elevate to a variety of desired looks, he brings a veteran’s knowledge and skillset to projects requiring anything from subtle film noir palettes to hyper-saturated stylized looks. Upon departing Encore in 2001, Azenzer spent time at Technicolor and Post Group/io Film before returning to Encore from 2009-2011. Following his second stint at Encore, he continued work as a senior colorist at Modern Videofilm, NBC Universal and Sony.

While his main tool is Resolve, he has also worked with Baselight and Lustre.

Color plays big role in the indie thriller Rust Creek

In the edge-of-your-seat thriller Rust Creek, confident college student Sawyer (Hermione Corfield) loses her way while driving through very rural Appalachia and quickly finds herself in a life-or-death struggle with some very dangerous men. The modestly-budgeted feature from Lunacy Productions — a company that encourages female filmmakers in top roles — packs a lot of power with virtually no pyrotechnics using well-thought-out filmmaking techniques, including a carefully planned and executed approach to the use of color throughout the film.

Director Jen McGowan and DP Michelle Lawler

Director Jen McGowan, cinematographer Michelle Lawler and colorist Jill Bogdanowicz of Company 3 collaborated to help express Sawyer’s character arc through the use of color. For McGowan, successful filmmaking requires thorough prep. “That’s where we work out, ‘What are we trying to say and how do we illustrate that visually?’” she explains. “Film is such a visual medium,” she adds, “but it’s very different from something like painting because of the element of time. Change over time is how we communicate story, emotion and theme as filmmakers.”

McGowan and Lawler developed the idea that Sawyer is lost, confused and overwhelmed as her dire situation becomes clear. Lawler shot most of Rust Creek handholding an ARRI Alexa Mini (with Cooke S4s) following Sawyer as she makes her way through the late autumn forest. “We wanted her to become part of the environment,” Lawler says. “We shot in winter and everything is dead, so there was a lot of brown and orange everywhere with zero color separation.”

Production designer Candi Guterres pushed that look further, rather than fighting it, with choices about costumes and some of the interiors.

“They had given a great deal of thought to how color affects the story,” recalls colorist Bogdanowicz, who sat with both women during the grading sessions (using Blackmagic’s DaVinci Resolve) at Company 3 in Santa Monica. “I loved the way color was so much a part of the process, even subtly, of the story arc. We did a lot in the color sessions to develop this concept where Sawyer almost blends into the environment at first and then, as the plot develops and she finds inner strength, we used tonality and color to help make her stand out more in the frame.”

Lawler explains that the majority of the film was shot on private property deep in the Kentucky woods, without the use of any artificial light. “I prefer natural light where possible,” she says. “I’d add some contrast to faces with some negative fill and maybe use little reflectors to grab a rake of sunlight on a rock, but that was it. We had to hike to the locations and we couldn’t carry big lights and generators anyway. And I think any light I might have run off batteries would have felt fake. We only had sun about three days of the 22-day shoot, so generally I made use of the big ‘silk’ in the sky and we positioned actors in ways that made the best use of the natural light.”

In fact, the weather was beyond bad, it was punishing. “It would go from rain to snow to tornado conditions,” McGowan recalls. “It dropped to seven degrees and the camera batteries stopped working.”

“The weather issues can’t be overstated,” Lawler adds, describing conditions on the property they used for much of the exterior location. “Our base camp was in a giant field. The ground would be frozen in the morning and by afternoon there would be four feet of mud. We dug trenches to keep craft services from flooding.”

The budget obviously didn’t provide for waiting around for the elements to change, David Lean-style. “Michelle and I were always mindful when shooting that we would need to be flexible when we got to the color grading in order to tie the look together,” McGowan explains. “I hate the term ‘fix it in post.’ It wasn’t about fixing something, it was about using post to execute what was intended.”

Jill Bogdanowicz

“We were able to work with my color grading toolset to fine tune everything shot by shot,” says Bogdanowicz. “It was lovely working with the two of them. They were very collaborative but were very clear on what they wanted.”

Bogdanowicz also adapted a film emulation LUT, which was based on the characteristics of a Fujifilm print stock and added in a subtle hint of digital grain, via a Boris FX Sapphire plug-in, to help add a unifying look and filmic feel to the imagery. At the very start of the process, the colorist recalls, “I showed Jen and Michelle a number of ‘recipes’ for looks and they fell in love with this one. It’s somewhat subtle and elegant and it made ‘electric’ colors not feel so electric but has a film-style curve with strong contrast in the mids and shadows you can still see into.”
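A film-emulation LUT of the kind Bogdanowicz describes is, at its core, a lookup table sampled at a set of code values, with the grading system interpolating between samples. The sketch below is a simplified 1D illustration in Python (real print-stock emulations are dense 3D LUTs; the function name and the three-entry table are hypothetical, shown only to make the interpolation idea concrete):

```python
def apply_lut_1d(value: float, lut: list[float]) -> float:
    """Map a normalized code value [0, 1] through a 1D LUT
    using linear interpolation between the nearest samples."""
    n = len(lut) - 1
    x = min(max(value, 0.0), 1.0) * n  # clamp, then scale to LUT index space
    i = int(x)
    if i >= n:                         # exactly at (or above) the top entry
        return lut[n]
    frac = x - i                       # fractional distance to the next sample
    return lut[i] * (1.0 - frac) + lut[i + 1] * frac
```

A contrast-adding "S-curve" would simply be a LUT whose samples dip below the identity line in the shadows and rise above it in the highlights, which is how a film-style curve with strong contrast in the mids can be baked into a look.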

McGowan says she was quite pleased with the work that came out of the color theater. “Color is not one of the things audiences usually pick up on, but a lot of people do when they see Rust Creek. It’s not highly stylized, and it certainly isn’t a distracting element, but I’ve found a lot of people have picked up on what we were doing with color and I think it definitely helped make the story that much stronger.”

Rust Creek is currently streaming on Amazon Prime and Google.

SciTech Medallion Recipient: A conversation with Curtis Clark, ASC

By Barry Goch

The Academy of Motion Picture Arts and Sciences has awarded Curtis Clark, ASC, the John A. Bonner Medallion “in appreciation for outstanding service and dedication in upholding the high standards of the Academy.” The presentation took place in early February and just prior to the event, I spoke to Clark and asked him to reflect on the transition from film to digital cinema and his contributions to the industry.

Clark’s career as a cinematographer includes features, TV and commercials. He is also the chair of the ASC Motion Imaging Technology Council that developed the ASC CDL.

Can you reflect on the changes you’ve seen over your career and how you see things moving ahead in the future?
Once upon a time, life was an awful lot simpler. I look back on it nostalgically when it was all film-based, and the possibilities of the cinematographer included follow-up on the look of dailies and also follow through with any photographic testing that helped to hone in on the desired look. It had its photochemical limitations; its analog image structure was not as malleable or tonally expansive as the digital canvas we have now.

Do you agree that Kodak’s Cineon helped us to this digital revolution — the hybrid film/digital imaging system where you would shoot on film, scan it and then digitally manipulate it before going back out to film via a film recorder?
That’s where the term digital intermediate came into being, and it was an eye opener. I think at the time not everyone fully understood the ramifications of the impact it was making. Kodak created something very potent and led the way in terms of methodologies — how to integrate digital into what was then called a hybrid imaging system, combining digital and film.

The DCI (Digital Cinema Initiative) was created to establish digital projection standards. Without a standard we’d potentially be creating chaos in terms of how to move forward. For the studios, distributors and exhibitors, it would be a nightmare. Can you talk about that?
In 2002, I had been asked to form a technology committee at the ASC to explore these issues: how the new emerging digital technologies were impacting the creative art form of cinematography and of filmmaking, and also to help influence the development of these technologies so they best serve the creative intent of the filmmaker.

DCI proposed that for digital projection to be considered ready for primetime, its image quality needed to be at least as good as, if not better than, a print from the original negative. I thought this was a great commitment that the studios were making. For them to say digital projection was going to be judged against a film print projection from the original camera negative of the exact same content was a fantastic decision. Here was a major promise of a solution that would give digital cinema image projection an advantage since most people saw release prints from a dupe negative.

Digital cinema had just reached the threshold of being able to do 2K digital cinema projection. At that time, 4K digital projection was emerging, but it was a bit premature in terms of settling on that as a standard. So you had digital cinema projection and the emergence of a sophisticated digital intermediate process that could create the image quality you wanted from the original negative, but projected on a digital projection.

In 2004, the Michael Mann film Collateral was shot with the Grass Valley Viper FilmStream and the Sony F900 and F950 cameras — the latest generation of digital motion picture cameras, basically video cameras that were becoming increasingly sophisticated, with better dynamic range and tonal contrast, supporting 24fps among multiple frame rates. That 24p capability was the key.
These cameras were used in the most innovative and interesting manner, because Mann combined film with digital, using the digital for the low-light level night scenes and then using film for the higher-light level day exterior scenes and day interior scenes where there was no problem with exposure.

Because of the challenge of shooting the night scenes, they wanted to shoot at such low light levels that film would potentially be a bit degraded in terms of grain and fog levels. If you had to overrate the negative, you needed to underexpose and overdevelop it, which was not desirable, whereas the digital cameras thrived in lower light levels. Also, you could shoot at a stop that gave you better depth of field. At the time, it was a very bold decision. But looking back on it historically, I think it was the inflection point that brought the digital motion picture camera into the limelight as a possible alternative to shooting on film.

That’s when they decided to do the Camera Assessment Series tests, which evaluated all the different digital cinema cameras available at the time?
Yeah, with the idea being that we’d never compare two digital cameras together, we’d always compare the digital camera against a film reference. We did that first Camera Assessment Series, which was the first step in the direction of validating the digital motion picture camera as viable for shooting motion pictures compared with shooting on film. And we got part way there. A couple of the cameras were very impressive: the Sony F35, the Panavision Genesis, the Arri D21 and the Grass Valley Viper were pretty reasonable, but this was all still mainly within a 2K (1920×1080) realm. We had not yet broached that 4K area.

A couple years later, we decided to do this again. It was called the Image Control Assessment Series, or ICAS, and it was shot at Warner Bros. We shot scenes in a café — a daylight interior and then a nighttime exterior. Both scenes had a dramatically large range of contrast and different colors in the image. It was the big milestone. The new Arri Alexa was used, along with the Sony F65 and the then-latest versions of the Red cameras.

So we had 4K projection and 4K cameras and we introduced the use of ACES (Academy Color Encoding System) color management. So we were really at the point where all the key components that we needed were beginning to come together. This was the first instance where these digital workflow components were all used in a single significant project testing. Using film as our common benchmark reference — How are these cameras in relation to film? That was the key thing. In other words, could we consider them to be ready for prime time? The answer was yes. We did that project in conjunction with the PGA and a company called Revelations Entertainment, which is Morgan Freeman’s company. Lori McCreary, his partner, was one of the producers who worked with us on this.

So filmmakers started using digital motion picture cameras instead of film. And with digital cinema having replaced film print as a distribution medium, these new generation digital cameras started to replace film as an image capture medium. Then the question was would we have an end-to-end digital system that would become potentially viable as an alternative to shooting on film.

L to R: Josh Pines, Steve MacMillan, Curtis Clark and Dhanendra Patel.

Part of the reason you are getting this acknowledgement from the Academy is your dedication to the highest quality of image and your respect for the artistry, from capture through delivery. Can you talk about your role in look management from on-set through delivery?
I think we all need to be on the same page; it’s one production team whose objective is maintaining the original creative intent of the filmmakers. That includes the director and cinematographer, working with an editor and a production designer. Making a film is a collective team effort, but the overall vision is typically established by the director in collaboration with the cinematographer and a production designer. The cinematographer is tasked with capturing that with lighting, camera composition, movement, lens choices — all those elements that are part of the process of creative filmmaking. Once you start shooting with these extremely sophisticated cameras — like the Sony F65 or Venice, the Panavision Millennium DXL, an Arri or the latest versions of the Red camera, all of which can reproduce high dynamic range, wide color gamut and high resolution — all that raw image data is inherently there, and the creative canvas has certainly been expanded.

So if you’re using these creative tools to tell your story, to advance your narrative, then you’re doing it with imagery defined by the potential of what these technologies are able to do. In the modern era, people aren’t seeing dailies at the same time, not seeing them together under controlled circumstances. The viewing process has become fragmented. When everyone had to come together to view projected dailies, there was a certain camaraderie, and a flow of constructive contributions that made the filmmaking process more effective. So if something wasn’t what it should be, everyone could see exactly what it was and make a correction if needed.

But now, we have a more dispersed production team at every stage of the production process, from the initial image capture through to dailies, editorial, visual effects and final color grading. We have so many different people in disparate locations working on the production who don’t seem to be as unified, sometimes, as we were when it was all film-based analog shooting. But now, it’s far easier and simpler to integrate visual effects into your workflow. Like Cineon indicated when it first emerged, you could do digital effects as opposed to optical effects and that was a big deal.

So coming back to the current situation — particularly now, with the most advanced forms of imaging, which include high dynamic range and color gamuts wider than even P3, such as REC 2020 — it’s essential to have a color management system like ACES, one with a gamut large enough to contain any color space that you capture and want to be able to manipulate.

Can you talk about the challenges you overcame, and how that fits into the history of cinema as it relates to the Academy recognition you received?
As a cinematographer, working on feature films or commercials, I kept thinking, if I’m fortunate enough to be able to manage the dailies and certainly the final color grading, there are these tools called lift, gain and gamma, which are common to all the different color correctors. But they’re all implemented differently. They’re not cross-platform-compatible, so the numbers from a lift-gain-gamma grade — which is the primary RGB grading — on one color corrector will not translate automatically to another color corrector. So I thought, we should have a cross-platform version of that, because it is usually seen as the first step in grading.

That’s about as basic as you can get, and it was designed as a cross-platform implementation: everybody who installs and applies the ASC CDL in a compatible color grading system — whether DaVinci, Baselight, Lustre or whatever you were using — gets results that are the same and transferable.
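The portability Clark describes comes from the ASC CDL being a small, published transfer function rather than a vendor feature: per-channel slope, offset and power (the “SOP” triple), followed by a single saturation control. A minimal Python sketch of that math follows (the function name is illustrative, not any vendor’s API; the Rec. 709 luma weights used in the saturation step match the published CDL definition):

```python
def asc_cdl(rgb, slope, offset, power, sat=1.0):
    """Apply an ASC CDL-style correction to one normalized RGB pixel."""
    out = []
    for v, s, o, p in zip(rgb, slope, offset, power):
        v = v * s + o          # slope (gain) then offset (lift)
        v = max(v, 0.0) ** p   # clamp to zero before the power (gamma) step
        out.append(v)
    # Saturation: blend each channel toward Rec. 709 luma
    luma = 0.2126 * out[0] + 0.7152 * out[1] + 0.0722 * out[2]
    return tuple(luma + sat * (v - luma) for v in out)
```

Because the correction is just these ten numbers (three slopes, three offsets, three powers, one saturation), a look set with the DIT on set can travel through dailies, VFX and the final grade and reproduce identically on any system that implements the same math.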

You could transport those numbers from one setup to another. On set, using a dailies creation tool such as ColorFront, you could use the ASC CDL to establish your dailies look during the shoot — not while you’re actually shooting, but with the DIT — a chosen look that could then be applied to dailies and used for VFX.

Then when you make your way into the final color grading session with the final cut — or whenever you start doing master color grading going back to the original camera source — you would have these initial grading corrections as a starting point as references. This now gives you the possibility of continuing on that color grading process using all the sophistication of a full color corrector, whether it’s power windows or secondary color correction. Whatever you felt you needed to finalize the look.

I was advocating this in the ASC Technology Committee, as it was called then — it has since been renamed the Motion Imaging Technology Council (MITC). We needed a solution like this, and a group of us got together and decided that we would do it. There were plenty of people who were skeptical: “Why would you do something like that when we already have lift, gain and gamma? Why would any of the manufacturers of the different color grading systems integrate this into their system? Would it not impinge upon their competitive advantage? If people were used to their system and its own lift, gain and gamma worked perfectly well for them, why would they want to use the ASC CDL?”

We live in a much more fragmented post world, and I saw that becoming even more so with the advances of digital. The ASC CDL would be a “look unifier” that would establish initial look parameters. You would be able to have control over the look at every stage of the way.

I’m assuming that the cinematographer would work with the director and editor, and they would assess certain changes that probably should be made because we’re now looking at cut sequences and what we had thought would be most appropriate when we were shooting is now in the context of an edit and there may need to be some changes and adjustments.

Were you involved in ACES? Was it a similar impetus for ACES coming about? Or was it just spawned because visual effects movies became so big and important with the advent of digital filmmaking?
It was a bit of both, including productions without VFX. So I would say that initially it was driven by the fact that there really should be a standardized color management system. Let me give you an example of what I’m talking about. When we were all photochemical and basically shooting with Kodak stock, we were working with film-based Kodak color science.

It’s a color science that everybody knew and understood, even if they didn’t understand it from an engineering photochemical point of view, they understood the effects of it. It’s what helps enable the look and the images that we wanted to create.

That was a color management system that was built into film. That color science system could have been adapted into the digital world, but Kodak resisted that because of the threat to negatives. If you apply that film color science to digital cameras, then you’re making digital cameras look more like film and that could pose a threat to the sale of color film negative.

So that’s really where the birth of ACES came about — to create a universal, unified color management system that would be appropriate anywhere you shot, and with the widest possible color gamut. It supports any camera or display technology because it always has a more expanded (future-proofing) capability within which digital camera and display technologies can work effectively and efficiently, but also accurately, reliably and predictably.

Very early on, my ASC Technology Committee (now called Motion Imaging Technology Council) got involved with ACES development and became very excited about it. It was the missing ingredient needed to be able to make the end-to-end digital workflow the success that we thought that it could become. Because we no longer could rely on film-based color science, we had to either replicate that or emulate it with a color management system that could accommodate everything we wanted to do creatively. So ACES became that color management system.

So, in addition to becoming the first cross-platform primary color grading tool, the ASC CDL became the first official ACES look modification transform. Because ACES is not a color grading tool but a color management system, you still need color grading tools to work within that color management. So you have the color management with ACES and the color grading with the ASC CDL, and the combination of the two is the look management system, because it takes both to make that work. It’s not that the ASC CDL is the only tool you use for color grading, but it has the portable, cross-platform ability to control the color grading from dailies through visual effects up to the final color grade, when you’re then working with a sophisticated color corrector.
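For readers unfamiliar with what makes the CDL so portable: it is just a compact set of math operations — per-channel slope, offset and power, plus a single saturation value. A minimal Python sketch of the published formula (clamping values to the 0–1 range and using Rec. 709 luma weights for the saturation step, as the ASC CDL specification describes) might look like this:

```python
# Sketch of the ASC CDL transfer function: per-channel
# slope/offset/power, followed by a global saturation adjustment.
# Pixel values are assumed to be floats in the 0.0-1.0 range.

def clamp(x, lo=0.0, hi=1.0):
    return max(lo, min(hi, x))

def apply_cdl(rgb, slope, offset, power, saturation):
    # Per channel: out = clamp(in * slope + offset) ** power
    out = [clamp(c * s + o) ** p
           for c, s, o, p in zip(rgb, slope, offset, power)]
    # Saturation blends each channel toward Rec. 709 luma
    luma = 0.2126 * out[0] + 0.7152 * out[1] + 0.0722 * out[2]
    return [clamp(luma + saturation * (c - luma)) for c in out]

# Identity parameters pass a pixel through essentially unchanged
graded = apply_cdl([0.18, 0.18, 0.18], (1, 1, 1), (0, 0, 0), (1, 1, 1), 1.0)
```

Because identity values (slope 1, offset 0, power 1, saturation 1) leave the image untouched, the ten numbers can travel harmlessly as metadata from dailies through VFX to the final grade, which is exactly the cross-platform behavior described above.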

What do you see for the future of cinematography and the merging of the worlds of post and on-set work? And what do you see as the challenges for future integrations, in terms of maintaining both the creative intent and the metadata?
We’re very involved in metadata at the moment. Metadata is a crucial part of making all this work, as you well know. In fact, we worked on the common 3D LUT format with the Academy. It is something that, again, has cross-platform consistency and predictability, and its functionality and scope of use would be better understood if everyone were using it. It’s a work in progress. Metadata is critical.
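To make concrete what any common 3D LUT format ultimately describes, here is a minimal sketch of pushing one color through a 3D LUT with trilinear interpolation. The nested `lut[r][g][b]` lattice layout is an assumption for illustration only; real formats (the Academy/ASC Common LUT Format, .cube files and others) each define their own ordering and metadata:

```python
# Minimal sketch: apply a 3D LUT to one RGB value using trilinear
# interpolation. lut[r][g][b] -> (R, G, B); inputs in 0.0-1.0.

def apply_3d_lut(rgb, lut):
    n = len(lut)  # lattice size per axis, e.g. 17, 33 or 65
    idx, frac = [], []
    for c in rgb:
        x = min(max(c, 0.0), 1.0) * (n - 1)   # position in lattice space
        i = min(int(x), n - 2)                 # lower corner of the cell
        idx.append(i)
        frac.append(x - i)                     # fractional position in cell
    ri, gi, bi = idx
    rf, gf, bf = frac
    out = [0.0, 0.0, 0.0]
    # Blend the 8 lattice points surrounding the input color
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((rf if dr else 1 - rf) *
                     (gf if dg else 1 - gf) *
                     (bf if db else 1 - bf))
                node = lut[ri + dr][gi + dg][bi + db]
                for k in range(3):
                    out[k] += w * node[k]
    return out
```

The point of standardizing the file format is that every grading system sampling the same lattice this way produces the same color, which is where the cross-platform predictability comes from.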

I think as we expand the canvas and the palette of the possibility of image making, you have to understand what these technologies are capable of doing, so that you can incorporate them into your vision. So if you’re saying my creative vision includes doing certain things, then you would have to understand the potential of what they can do to support that vision. A very good example in the current climate is HDR.

That’s very controversial in a lot of ways, because the set manufacturers really would love to have everything just jump off the screen to make it vibrant and exciting. However, from a storytelling point of view, it may not be appropriate to push HDR imagery where it distracts from the story.
Well, it depends on how it’s done and how you are able to use that extended dynamic range when you have your bright highlights. You can use foreground/background relationships with bigger depth of field to tremendous effect. They have a visceral presence, because they have a dimensionality when, for example, you see the bright images outside of a window.

When you have an extended dynamic range of scene tones that could add dimensional depth to the image, you can choreograph and stage the blocking for your narrative storytelling with the kind of images that take advantage of those possibilities.

So HDR needs to be thought of as something that’s integral to your storytelling, not just something that’s there because you can do it. That’s when it can become a distraction. When you’re on set, you need a reference monitor that is able to show and convey all the different tonal and color elements that you’re working with to create your look, from HDR to wider color gamut, whatever that may be, so that you feel comfortable that you’ve made the correct creative decision.

With virtual production techniques, you can incorporate some of that into your live-action shooting on set with that kind of compositing, just like James Cameron started with Avatar. If you want to do that with HDR, you can. The sky is the limit in terms of what you can do with today’s technology.

So these things are there, but you need to be able to pull them all together into your production workflow to make sure that you can comfortably integrate in the appropriate way at the appropriate time. And that it conforms to what the creative vision for the final result needs to be and then, remarkable things can happen. The aesthetic poetry of the image can visually drive the narrative and you can say things with these images without having to be expositional in your dialogue. You can make it more of an experientially immersive involvement with the story. I think that’s something that we’re headed toward, that’s going to make the narrative storytelling very interesting and much more dynamic.

Certainly, and especially with the advancements in consumer technology: better panels, high dynamic range developments, and Dolby Vision and Atmos audio coming into the home. It’s really an amazing time to be involved in the industry; it’s so fun and challenging.

It’s a very interesting time, and a learning curve needs to happen. That’s what’s driven me from the very beginning, and it’s why I think our ASC Motion Imaging Technology Council has been so successful in its 16 years of continuous operation, influencing the development of some of these technologies in very meaningful ways. But always with the intent that these new imaging technologies are there to better serve the creative intent of the filmmaker. The technology serves the art. It’s not about the technology per se; it’s about the technology as the enabling component of the art. It enables the art to happen, and it expands its scope and possibility to broader canvases, with wider color gamuts, in ways that have never been experienced or possible before.


Barry Goch is a Finishing Artist at The Foundation and a Post Production Instructor at UCLA Extension. You can follow him on Twitter at @gochya.

Sundance Videos: Watch our editor interviews

postPerspective traveled to Sundance for the first time this year, and it was great. In addition to attending some parties, brunches and panels, we had the opportunity to interview a number of editors who were in Park City to help promote their various projects. (Watch here.)

Billy McMillin

We caught up with the editors of the comedy docu-series Documentary Now!, Micah Gardner and Jordan Kim. We spoke to Courtney Ware about cutting the film Light From Light, as well as Billy McMillin, editor on the documentary Mike Wallace Is Here. We also chatted with Phyllis Housen, the editor on director Chinonye Chukwu’s Clemency, and Kent Kincannon, who cut Hannah Pearl Utt’s comedy Before You Know It. Finally, we sat down with Bryan Mason, who had the dual roles of cinematographer and editor on Animals.

We hope you enjoy watching these interviews as much as we enjoyed shooting them.

Don’t forget, click here to view!

Oh, and a big shout out to Twain Richardson from Jamaica’s Frame of Reference, who edited and color graded the videos. Thanks Twain!

More Than Just Words: Lucky Post helps bring Jeep’s viral piece to life


Jeep’s More Than Just Words commercial, out of agency The Richards Group, premiered online just prior to this year’s Super Bowl as part of its Big Game Blitz, which saw numerous projects launched in the lead-up to the game.

Quickly earning millions of views, the piece features a version of our national anthem by One Republic, as well as images of the band. The two-minute spot is made up of images of small, everyday moments that add up to something big and evoke a feeling of America.

There is a father and his infant son, people gathered in front of a barn, a football thrown through a hanging tire swing. We see bits of cities and suburbs, football, stock images of Marilyn Monroe and soldiers training for battle — and every once in a while, an image of a Jeep is in view.

The spot ends as it began, with images of One Republic in the studio before the screen goes black and text appears reading: More Than Just Words. Then the Jeep logo appears.

The production company was Zoom USA, with partner Mark Toia directing. Lucky Post in Dallas contributed editorial, color, sound design and finishing to the piece.

Editor Sai Selvarajan used Adobe Premiere. Neil Anderson provided the color grade in Blackmagic Resolve, while Scottie Richardson performed the sound design and mix using Avid Pro Tools. Tim Nagle handled online finishing and effects in Autodesk Flame.

“The concept is genius in its simplicity: a tribute to faith in our country’s patchwork, with our anthem’s words reinforced and represented in image,” says Lucky Post’s Selvarajan. “Behind the scenes, everyone provided collective energy and creativity to bring it to life. It was the product of many, just like the message of the film, and I was so excited to see the groundswell of positive reaction.”


Industry vets open editorial, post studio Made-SF

Made-SF, a creative studio offering editorial and other services, has been launched by executive producer Jon Ettinger, editor/director Doug Walker and editors Brian Lagerhausen and Connor McDonald, all formerly of Beast Editorial. Along with creative editorial (Adobe Premiere), the company will provide motion graphic design (After Effects, Mocha), color correction and editorial finishing (likely Flame and Resolve). Eventually, it plans to add concept development, directing and production to its mix.

“Clients today are looking for creative partners who can help them across the entire production chain,” says Ettinger. “They need to tell stories and they have limited budgets available to tell them. We know how to do both, and we are gathering the resources to do so under one roof.”

Made is currently set up in interim quarters while completing construction of permanent studio space. The latter will be housed in a century-old structure in San Francisco’s North Beach neighborhood and will feature five editorial suites, two motion graphics suites, and two post production finishing suites with room for further expansion.

The four Made partners bring deep experience in traditional advertising and branded content, working both with agencies and directly with clients. Ettinger and Walker have worked together for more than 20 years and originally teamed up to launch FilmCore, San Francisco. Both joined Beast Editorial in 2012. Similarly, Lagerhausen and McDonald have been editing in the Bay Area for more than two decades. Collectively, their credits include work for agencies in San Francisco and nationwide. They’ve also helped to create content directly for Google, Facebook, LinkedIn, Salesforce and other corporate clients.

Made is indicative of a trend in which companies engaged in content development are adopting fluid business models to address a diversifying media landscape, and in which individual talent is no longer confined to a single job title. Walker, for example, has recently served as director on several projects, including a series of short films for Kelly Services, conceived by agency Erich & Kallman and produced by Caruso Co.

“People used to go to great pains to make a distinction about what they do,” Ettinger observes. “You were a director or an editor or a colorist. Today, those lines have blurred. We are taking advantage of that flattening out to offer clients a better way to create content.”

Main Image Caption: (L-R) Doug Walker, Brian Lagerhausen, Jon Ettinger and Connor McDonald.