
Category Archives: Color Grading

VFX and color for new BT spot via The Mill

UK telco BT wanted to create a television spot that would showcase the WiFi capabilities of its broadband hub and underline its promise of “whole home coverage.” Sonny director Fredrik Bond visualized a fun and fast-paced spot for agency AMV BBDO, and The Mill London was brought onboard to help with VFX and color. It is called Complete WiFi.

In the piece, the hero comes home to find it full of soldiers, angels, dancers, fairies, a giant and a horse — characters from the myriad games and movies the family is watching simultaneously. Obviously, the look depends upon multiple layers of compositing, which have to be carefully scaled to be convincing.

They also need to be very carefully color matched, with similar lighting applied, so all the layers sit together. In a traditional workflow, this would have meant a lot of loops between VFX and grading to get the best from each layer, and a certain amount of compromise as the colorist imposed changes on virtual elements to make the final grade.

To avoid this, and to speed progress, The Mill recently started using BLG for Flame, a FilmLight plugin that allows Baselight grades to be rendered identically within Flame — and with no back and forth to the color suite to render out new versions of shots. It means the VFX supervisor is continually seeing the latest grade, and the colorist can access the latest Flame elements to match them in.

“Of course it was frustrating to grade a sequence and then drop the VFX on top,” explains VFX supervisor Ben Turner. “To get the results our collaborators expect, we were constantly pushing material to and fro. We could end up with more than a hundred publishes on a single job.”

With the BLG for Flame plugin, the VFX artist sees the latest Baselight grade automatically applied, either from FilmLight’s BLG format files or directly from a Baselight scene, even while the scene is still being graded — although Turner says he prefers to be warned when updates are coming.

This works because all systems have access to the raw footage. Baselight grades non-destructively, by building up layers of metadata that are imposed in realtime. The metadata includes all the grading information, multiple windows and layers, effects and relights, textures and more – the whole process. This information can be imposed on the raw footage by any BLG-equipped device (there are Baselight Editions software plugins for Avid and Nuke, too) for realtime rendering and review.
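
As a conceptual illustration of that grade-as-metadata idea, here is a minimal Python sketch. It is not FilmLight's BLG format or API; the operations, parameter names and values are hypothetical stand-ins meant only to show how an ordered list of grading metadata can be re-rendered against untouched raw pixels by any tool that understands it.

```python
# Conceptual sketch only: NOT FilmLight's BLG format or API.
# The "grade" is pure metadata (an ordered list of operations), so the
# raw footage is never modified; any tool that reads the metadata can
# re-render the identical look on demand.
import numpy as np

grade_metadata = [
    ("exposure", {"stops": 0.5}),          # hypothetical operations/params
    ("saturation", {"amount": 1.1}),
    ("lift", {"rgb": (0.01, 0.0, -0.01)}),
]

def apply_grade(raw_rgb: np.ndarray, grade: list) -> np.ndarray:
    """Render graded pixels from raw footage without touching the source."""
    img = raw_rgb.astype(np.float32)
    for op, params in grade:
        if op == "exposure":
            img = img * 2.0 ** params["stops"]
        elif op == "saturation":
            luma = img.mean(axis=-1, keepdims=True)
            img = luma + params["amount"] * (img - luma)
        elif op == "lift":
            img = img + np.array(params["rgb"], dtype=np.float32)
    return np.clip(img, 0.0, 1.0)

frame = np.random.rand(1080, 1920, 3).astype(np.float32)  # stand-in for a raw frame
graded = apply_grade(frame, grade_metadata)
```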

That is important because it also allows remote viewing. For this BT spot, director Bond was back in Los Angeles by the time post began. He sat in a calibrated room at The Mill in LA and could see the graded images at every stage. He could react quickly to the first animation tests.

“I can render a comp and immediately show it to a client with the latest grade from The Mill’s colorist, Dave Ludlam,” says Turner. “When the client really wants to push a certain aspect of the image, we can ensure through both comp and grade that this is done sympathetically, maintaining the integrity of the image.”

(L-R) VFX supervisor Ben Turner and colorist Dave Ludlam.

Turner admits that it means more to-ing and fro-ing, but that is a positive benefit. “If I need to talk to Dave then I can pop in and solve a specific challenge in minutes. By creating the CGI to work with the background, I know that Dave will never have to push anything too hard in the final grade.”

Ludlam agrees that this is a complete change, but extremely beneficial. “With this new process, I am setting looks but I am not committing to them,” he says. “Working together I get a lot more creative input while still achieving a much slicker workflow. I can build the grade and only lock it down when everyone is happy.

“It is a massive speed-up, but more importantly it has made our output far superior. It gives everyone more control and — with every job under huge time pressure — it means we can respond quickly.”

The spot was offlined by Patric Ryan from Marshall Street. Audio post was via 750mph with sound designers Sam Ashwell and Mike Bovill.

Behind the Title: Nice Shoes animator Yandong Dino Qiu

This artist/designer has taken to sketching people on the subway to keep his skills fresh and mind relaxed.

NAME: Yandong Dino Qiu

COMPANY: New York’s Nice Shoes

CAN YOU DESCRIBE YOUR COMPANY?
Nice Shoes is a full-service creative studio. We offer design, animation, VFX, editing, color grading and VR/AR, working with agencies, brands and filmmakers to help realize their creative vision.

WHAT’S YOUR JOB TITLE?
Designer/Animator

WHAT DOES THAT ENTAIL?
Helping our clients to explore different looks in the pre-production stage, while aiding them in getting as close as possible to the final look of the spot. There’s a lot of exploration and trial and error as we try to deliver beautiful still frames that inform the look of the moving piece.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
Not so much for the title, but for myself, design and animation can be quite broad. People may assume you’re only 2D, but it also involves a lot of other skill sets such as 3D lighting and rendering. It’s pretty close to a generalist role that requires you to know nearly every software as well as to turn things around very quickly.

WHAT TOOLS DO YOU USE?
Photoshop, After Effects, Illustrator, InDesign — the full Adobe Creative Suite — and Maxon Cinema 4D.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Pitch and exploration. At that stage, all possibilities are open. The job is alive… like a baby. You’re seeing it form and helping to make new life. Before this, you have no idea what it’s going to look like. After this phase, everyone has an idea. It’s very challenging, exciting and rewarding.

WHAT’S YOUR LEAST FAVORITE?
Revisions. Especially toward the end of a project. Everything is set up. One little change will affect everything else.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
2:15pm. It’s right after lunch. You know you have the whole afternoon. The sun is bright. The mood is light. It’s not too late for anything.

Sketching on the subway.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I would be a Manga artist.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
La Mer. Frontline. Friskies. I’ve also been drawing during my commute every day, sketching the people I see on the subway. I’m trying to post every week on Instagram. I think it’s important for artists to keep to a routine. I started this at the beginning of 2019, and there have been about 50 drawings already. Artists need to keep their pen sharp all the time. By doing these sketches, I’m not only benefiting my drawing skills, but I’m improving my observation of shapes and compositions, which is extremely valuable for work. Being able to break down shapes and components is a key principle of design, and honing that skill helps me in responding to client briefs.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
TED-Ed What Is Time? We had a lot of freedom in figuring out how to animate Einstein’s theories in a fun and engaging way. I worked with our creative director Harry Dorrington to establish the look and then with our CG team to ensure that the feel we established in the style frames was implemented throughout the piece.

TED-Ed What Is Time?

The film was extremely well received. There was a lot of excitement at Nice Shoes when it premiered, and TED-Ed’s audience seemed to respond really warmly as well. It’s rare to see so much positivity in the YouTube comments.

NAME SOME TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My Wacom tablet for drawing and my iPad for reading.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I take time and draw for myself. I love that drawing and creating is such a huge part of my job, but it can get stressful and tiring only creating for others. I’m proud of that work, but when I can draw something that makes me personally happy, any stress or exhaustion from the work day just melts away.


FilmLight offers additions to Baselight toolkit

FilmLight will be at NAB showing updates to its Baselight toolkit, including T-CAM v2. This is FilmLight’s new and improved color appearance model, which allows the user to render an image for all formats and device types with confidence in the color.

It combines with the Truelight Scene Looks and ARRI Look Library, now implemented within the Baselight software. “T-CAM color handling with the updated Looks toolset produces a cleaner response compared to creative, camera-specific LUTs or film emulations,” says Andrea Chlebak, senior colorist at Deluxe’s Encore in Hollywood. “I know I can push the images for theatrical release in the creative grade and not worry about how that look will translate across the many deliverables.”

FilmLight has also added what it calls “a new approach to color grading” with the addition of Texture Blend tools, which allow the colorist to apply any color grading operation dependent on image detail. This gives the colorist fine control over the interaction of color and texture.

Other workflow improvements aimed at speeding the process include enhanced cache management; a new client view that displays a live web-based representation of a scene showing current frame and metadata; and multi-directory conform for a faster and more straightforward conform process.

The latest version of Baselight software also includes per-pixel alpha channels, eliminating the need for additional layer mattes when compositing VFX elements. Tight integration with VFX suppliers, including Foundry Nuke and Autodesk, means that new versions of sequences can be automatically detected, with the colorist able to switch quickly between versions within Baselight.


Shooting, posting New Republic’s indie film Sister Aimee

After a successful premiere at the Sundance Film Festival, New Republic Studios’ Sister Aimee screened at this month’s SXSW. The movie tells the story of an infamous American evangelist of the 1920s, Sister Aimee Semple McPherson, who gets caught up in her lover’s dreams of Mexico and finds herself on a road trip toward the border.

Sister Aimee shot at the newly renovated New Republic Studios near Austin, Texas, over two and a half weeks. “Their crew used our 2,400-square-foot Little Bear soundstage, our 3,000-square-foot Lone Wolf soundstage, our bullpen office space and numerous exterior locations in our backlot,” reports New Republic Studios president Mindy Raymond, adding that the Sister Aimee production also had access to two screening rooms with 5.1 surround sound, HDMI hookups to 4K monitors and theater-style leather chairs to watch dailies. The film also hit the road, shooting in the New Mexico desert.

L-R: Directors Samantha Buck, Marie Schlingmann at SXSW. Credit: Harrison Funk

Co-written and co-directed by Samantha Buck and Marie Schlingmann, the movie takes some creative license with the story of Aimee. “We don’t look for factual truth in Aimee’s journey,” they explain. “Instead we look for a more timeless truth that says something about female ambition, the female quest for immortality and, most of all, the struggle for women to control their own narratives. It becomes a story about storytelling itself.”

The film, shot by cinematographer Carlos Valdes-Lora in 3.2K ProRes 4444 XQ on an ARRI Alexa Mini, was posted at Dallas- and Austin-based Charlieuniformtango.

We reached out to the DP and the post team to find out more.

Carlos, why did you choose the package of the Alexa and Cooke Mini S4 Primes?
Carlos Valdes-Lora: In early conversations with the directors, we all agreed that we didn’t want Sister Aimee to feel like a traditional period movie. We didn’t want to use softening filters or vintage lenses. We aimed instead for clear images, deep focus and a rich color palette that remains grounded in the real world. We felt that this would lend the story a greater sense of immediacy and draw the viewer closer to the characters. Following that same thinking, we worked very extensively with the 25mm and 32mm, especially in closeups and medium closeups, emphasizing accessibility.

The Cooke Mini S4s are a beautiful and affordable set (relative to our other options). We like the way they give deep dimensionality and warmth to faces, and how they create a slightly lower contrast image compared to the other modern lenses we looked at. They quickly became the right choice for us, striking the right balance between quality, size and value.

The Cookes paired with the Alexa Mini gave us a lightweight camera system with a very contained footprint, and we needed to stay fast and lean due to our compressed shooting schedule and often tight shooting quarters. The Chapman Cobra dolly was a big help in that regard as well.

What was the workflow to post like?
Charlieuniformtango producers Bettina Barrow, Katherine Harper, David Hartstein: Post took place primarily between Charlieuniformtango’s Dallas and Austin offices. Post strategizing started months before the shoot, and active post truly began when production began in July 2018.

Tango’s Evan Linton handled dailies brought in from the shoot, working alongside editor Katie Ennis out of Tango’s Austin studio, to begin assembling a rough cut as shooting continued. Ennis continued to cut at the studio through August with directors Schlingmann and Buck.

Editorial then moved back to the directors’ home state of New York to finish the cut for Sundance. (Editor Ennis, who four-walled out of Tango Austin for the first part of post, went to New York with the directors, working out of a rented space.)

VFX and audio work started early at Tango, with continuously updated timelines coming from editorial. The team worked to have certain locked shots finished for the Sundance submission, while saving much of the cleanup and other CG-heavy shots for the final picture lock.

Tango audio engineer Nick Patronella also tackled dialogue edit, sound design and mix for the submission out of the Dallas studio.

Can you talk about the VFX?
Barrow, Harper, Hartstein: The cut was locked in late November, and the heavy lifting really began. With delivery looming, Tango’s Flame artists Allen Robbins, Joey Waldrip, David Hannah, David Laird, Artie Peña and Zack Smith divided effects shots, which ranged from environmental cleanup, period-specific cleanup and beauty work such as de-aging to crowd simulation, CG sign creation and more.

(L-R) Tango’s Artie Peña, Connor Adams, Allen Robbins in one of the studio’s Flame suites.

3D artist Connor Adams used Houdini, Mixamo and Maya to create CG elements and crowds, with final comps being done in Nuke and sent to Flame for final color. Over 120 VFX shots were handled in total, and Flame was the go-to for effects. Color and much of the effects happened simultaneously. It was a nice workflow, as the project didn’t have major VFX needs that would have impacted color.

What about the color grade?
Barrow, Harper, Hartstein: Directors Buck and Schlingmann and DP Valdes-Lora worked with Tango colorist Allen Robbins to craft the final look of the film — with the color grade also done in Flame. The trio had prepped shooting for a Kodachrome-style look, especially for the exteriors, but really overall. They found important reference in selections of Robert Capa photographs.

Buck, Schlingmann and Valdes-Lora responded mostly to Kodachrome’s treatment of blues, browns, tans, greens and reds (while staying true to skin tone), but also to its gamma values, not being afraid of deep shadows and contrast wherever appropriate. Valdes-Lora wanted to avoid lighting/exposing to a custom LUT on set that would reflect this kind of Kodachrome look, in case they wanted to change course during the process. With the help of Tango, however, they discovered that dialing back the Capa look grounded the film a little more and made the characters “feel” more accessible. The roots of the inspiration remained in the image, but a little more naturalism, a little more softness, served the story better.

Because of that, they monitored on set with Alexa 709, which Valdes-Lora felt he could expose for while still leaving enough room. Production designer Jonathan Rudak (another regular collaborator with the directors) was on the same page during prep in terms of reflecting this Capa color style, and the practical team did what they could to make sure the set elements complemented this approach.

What about the audio post?
Barrow, Harper, Hartstein: With the effects and color almost complete, the team headed to Skywalker Ranch for a week of final dialogue edit, mix, sound design and Foley, led by Skywalker’s Danielle Dupre, Kim Foscato and E. Larry Oatfield. The team was also able to simultaneously approve color sections in Skywalker’s Stag Theater, allowing for an ultra-efficient schedule. With final mix in hand, the film was mastered just after Christmas so that DCP production could begin.

Since a portion of the film was musical, how complex was the audio mix?
Skywalker sound mixer Dupre: The musical number was definitely one of the most challenging but rewarding scenes to design and mix. It was such a strong creative idea that played so deeply into the main character. The challenge was in striking a balance between tying it into the realism of the film while also leaning into the grandiosity of the musical to really sell the idea.

It was really fun to play with a combination of production dialogue and studio recordings to see how we could make it work. It was also really rewarding to create a soundscape that starts off minimally and simply and transitions to Broadway scale almost undetectably — one of the many exciting parts to working with creative and talented filmmakers.

What was the biggest challenge in post?
Barrow, Harper, Hartstein: Finishing a film in five to six weeks during the holidays was no easy feat. Luckily, we were able to have our directors hands-on for all final color, VFX and mix. Collaborating in the same room is always the best when you have no time to spare. We had a schedule where each day was accounted for — and we stuck to it almost down to the hour.

 


Goldcrest adds 4K theater and colorist Marcy Robinson

Goldcrest Post in New York City has expanded its picture finishing services, adding veteran colorist Marcy Robinson and unveiling a new, state-of-the-art 4K theater that joins an existing theater and other digital intermediate rooms. The moves are part of a broader strategy to offer film and television productions packaged post services encompassing editorial, picture finishing and sound.

Robinson brings experience working in features, television, documentaries, commercials and music videos. She has recently been working as a freelance colorist, collaborating with directors Noah Baumbach and Ang Lee. Her background also includes 10 years at the creative boutique Box Services, best known for its work in fashion advertising.

Robinson, who was recruited to Goldcrest by Nat Jencks, the facility’s senior colorist, says she was attracted by the opportunity to work on a diversity of high-quality projects. Robinson’s first projects for Goldcrest include the Netflix documentary The Grass is Greener and an advertising campaign for Reebok.

Robinson started out in film photography and operated a custom color photographic print lab for 13 years. She became a digital colorist after joining Box Services in 2008. As a freelance colorist, her credits include the features Billy Lynn’s Long Halftime Walk, De Palma and Frances Ha; the HBO documentary Suited; commercials for Steve Madden, Dior and Prada; and music videos for Keith Urban and Madonna.

Goldcrest’s new 4K theater is set up for the dual purposes of feature film and HDR television mastering. Its technical features include a Blackmagic DaVinci Resolve Linux Advanced color correction and finishing system, a Barco 4K projector, a Screen Research projection screen and Dolby-calibrated 7.1 surround sound.


Posting director Darren Lynn Bousman’s horror film, St. Agatha

Atlanta’s Moonshine Post helped create a total post production pipeline — from dailies to finishing — for the film St. Agatha, directed by Darren Lynn Bousman (Saw II, Saw III, Saw IV, Repo! The Genetic Opera).

The project, from producers Seth and Sara Michaels, was co-edited by Moonshine’s Gerhardt Slawitschka and Patrick Perry and colored by Moonshine’s John Peterson.

St. Agatha is a horror film that shot in the town of Madison, Georgia. “The house we needed for the convent was perfect, as the area was one of the few places that had not burned down during the Civil War,” explains Seth Michaels. “It was our first time shooting in Atlanta, and the number one reason was because of the tax incentive. But we also knew Georgia had an infrastructure that could handle our production.”

What the producers didn’t know during production was that Moonshine Post could handle all aspects of post; the studio was initially brought in only for dailies. With the opportunity to do a producer’s cut, they returned to Moonshine Post.

Time and budget dictated everything, and Moonshine Post was able to offer two editors working in tandem to edit a final cut. “Why not cut in collaboration?” suggested Drew Sawyer, founder of Moonshine Post and executive producer. “It will cut the time in half, and you can explore different ideas faster.”

“We quite literally split the movie in half,” reports Perry, who, along with Slawitschka, cut on Adobe Premiere. “It’s a 90-minute film, and there was a clear break. It’s a little unusual, I will admit, but almost always when we are working on something, we don’t have a lot of time, so splitting it in half works.”

Patrick Perry

Gerhardt Slawitschka

“Since it was a producer’s cut, when it came to us it was in Premiere, and it didn’t make sense to switch over to Avid,” adds Slawitschka. “Patrick and I can use both interchangeably, but prefer Premiere; it offers a lot of flexibility.”

“The editors, Patrick and Gerhardt, were great,” says Sara Michaels. “They watched every single second of footage we had, so when we recut the movie, they knew exactly what we had and how to use it.”

“We have the same sensibilities,” explains Gerhardt. “On long-form projects we take a feature in tandem, maybe split it in half or in reels. Or, on a TV series, each of us take a few episodes, compare notes, and arrive at a ‘group mind,’ which is our language of how a project is working. On St. Agatha, Patrick and I took a bit of a risk and generated a four-page document of proposed thoughts and changes. Some very macro, some very micro.”

Colorist John Peterson, a partner at Moonshine Post, worked closely with the director on final color using Blackmagic’s Resolve. “From day one, the first looks we got from camera raw were beautiful,” he says. Typically, projects shot in Atlanta ship back to a post house in a bigger city, “and maybe you see it and maybe you don’t. This one became a local win; we processed dailies, and it came back to us for a chance to finish it here.”

Peterson liked working directly with the director on this film. “I enjoyed having him in session because he’s an artist. He knew what he was looking for. On the flashbacks, we played with a variety of looks to define which one we liked. We added a certain amount of film grain, and stylistically for some scenes we used heavy vignetting and heavy keys with isolation windows. Darren is a director, but he also knows the terminology, which gave me the opportunity to take his words and put them on the screen for him. At the end of the week, we had a successful film.”

John Peterson

The recent expansion of Moonshine Post, which included a partnership with the audio company Bare Knuckles Creative and the visual effects company Crafty Apes, “was necessary, so we could take on the kind of movies and series we wanted to work with,” explains Sawyer. “But we were very careful about what we took and how we expanded.”

They recently secured two AMC series, along with projects from Netflix. “We are not trying to do all the post in town, but we want to foster and grow the post production scene here so that we can continue to win people’s trust and solidify the Atlanta market,” he says.

Uncork’d Entertainment’s St. Agatha was in theaters and became available on-demand starting February 8. Look for it on iTunes, Amazon, Google Play, Vudu, Fandango Now, Xbox, Dish Network and local cable providers.


Review: Tangent Wave 2: Color Correction Surface

By Brady Betzel

Have you ever become frustrated while color correcting footage after a long edit due to having to learn a whole new set of shortcuts and keystrokes?

Whether you’re in Adobe Premiere, Avid Media Composer or Blackmagic Resolve, there are hundreds of shortcuts you can learn to become a highly efficient colorist. If you want to become the most efficient colorist you can be, you need an external hardware color panel (clearly we are talking to those who provide color as part of their job, but not as their entire job). You may have seen professional color correction panels like the Blackmagic DaVinci panel or the FilmLight Blackboard 2 panel for Baselight. Those are amazing, and they take a long time of repetitive use to really master (think Malcolm Gladwell’s 10,000-hour rule). Not to mention they can cost $30,000 or more… yikes! So if you can’t quite justify $30,000 for a dedicated color correction panel, don’t fret. You still have options.

One of those options is the Tangent Wave, which is at the bottom end of the price range. Before I dig in, I need to note that it only works with Avid if you also use the FilmLight Baselight for Media Composer plugin. So Avid users, keep that in mind.

Tangent has one of the most popular sub-$3,500 sets of panels used constantly by editing pros: Tangent Elements. I love the Tangent Elements panel, but at just under $3,500 they aren’t cheap, and I can understand how a lot of people could be put off — plus, it can take up your entire desktop real estate with four panels. Blackmagic sells its Mini panel for just under $3,000, but it only works with Resolve. So if you bounce around between apps, that one isn’t for you.

Tangent released the first-generation Wave panel around 2010, and it took another eight years to act on the realization that people want color correction panels but don’t want to spend a lot of money. That’s when they released the Tangent Wave 2. The original Tangent Wave was a great color correction panel, but in my opinion it was ergonomically inefficient. It was awkward — but at around $1,500 it was one of the only options that was semi-affordable.

In 2016, Tangent released the Tangent Ripple, which has a limited toolset, including three trackballs with dials, reset buttons and shift/alt buttons, costing around $350. You can read my review here. That’s a great price point but it is really limiting. If you are doing very basic color correction, like hue corrections and contrast moves, this is great. But if you want to dive into Power Windows, Hue Qualifiers or maybe even cycling through LUTs you need more. This is where the Tangent Wave 2 comes into play.

Tangent Wave 2
The Tangent Wave 2 works with the Tangent Mapper software, an app that helps customize the key and knob mapping if the application you are using lets you customize the keys. It just so happens that Premiere is customizable but Resolve is not (no matter what panel you are using, not just Tangent).

The Wave 2 is much more comfortable than the original Wave and has enough buttons to get 60% of the shortcuts in these apps. If you are in Premiere you can re-map keys and get where you want much faster than Resolve. However, Resolve’s mapping is set by Blackmagic and has almost everything you need. What it doesn’t have mapped is quickly accessible by keyboard or mouse.

If you’ve ever used the Element panels, you will remember their high-grade components (which probably added to the price tag), including the trackballs and dials. Everything feels very professional on the Elements, very close to the high-end Precision panels or DaVinci panels. The Wave 2’s components are on the lower end. They aren’t bad components, just cheaper. The trackballs are a little looser in their sockets; in fact, don’t turn the panel over or your balls will fall out (or do it to someone else if you want to play a joke, just ask for the serial number on the bottom of the panel). The accuracy of the trackballs doesn’t feel as tight as the Elements’, but it is usable. The knobs and buttons feel much closer to the level of the Element panels. The overall plastic casing is much lighter and feels a lot cheaper.

However, for around $900 (at the time of my writing this review), the Tangent Wave 2 is arguably the best deal for a color correction panel there is. Between the extremely efficient button layout and beautiful ice-white OLED display, you will be hard pressed to find a better product for the money. It is also around 15 inches wide, 11 inches deep and 2 inches tall, which allows you to keep your keyboards and mice on your desk, unlike the Elements, which can take up an entire desktop on their own.

Before you plug in your Wave 2 you should download the latest Tangent Hub and Mapper. Once you open the Mapper app you will understand the button and knob layout and how to customize the keys (unless you are using Resolve). In Premiere, I immediately started pressing buttons and turning knobs and found out that once inside of the Lumetri tabs the up and down arrows on the panel worked in the reverse of how my brain wanted them to work. I jumped into the Mapper app, reassigned the up and down arrows to the way I wanted to cycle through the Lumetri panels and without restarting I was up and running. It was awesome not to have to restart anything.

As you go, you will find that each NLE or color app has its own issues, and it might take a few tries to get your panel set up the way you want it. I really liked how a few recent LUTs I had installed in the Premiere LUT directory showed up on the panel’s OLED when cycling through LUTs. It was really helpful, and I didn’t have to use my mouse to click the drop-down LUT menu. When you go into the Creative Looks, you can cycle through those straight from the Wave 2, which is very helpful. Other than that, you can control almost every single thing in the Lumetri interface directly from the panel, including going into full screen to review your color.

If you use Resolve 15, you will really like the Tangent Wave 2. I did notice that the panel worked much smoother and was way more responsive inside of Resolve than inside of Premiere. There could be a few reasons for that, but I work in and out of these apps almost daily and it definitely felt a little delayed in Premiere Pro.

Once you get into the nitty gritty of Resolve, you will be a little hamstrung when accessing items like the Hue vs. Hue curves. You can’t pinpoint hues on the curve window and adjust them straight from the Wave 2. That is where you will want to look at the Element panels. Another missing control is Offset — there are only three trackballs, so you cannot access the fourth hue wheel, aka Offset. However, you can access the Offset through the knobs, and I actually found controlling the Offset through knobs oddly satisfying and more accurate than the trackballs. It’s a different way of thinking, and I think I might like it.

Because Resolve’s GUI doesn’t mirror where I was on the Wave 2 panel, I wasn’t always sure where I was. On the Resolve GUI I might have been in the Curves tab, but on the Wave 2 HUD I may have been on the Power Windows tab. If Tangent could sync the Wave 2 and the Resolve GUI so that they match, I think the Wave 2 would be a lot easier to use and less confusing. I wouldn’t even call it an update; it’s a legitimate missing feature.

Summing Up
In the end, you will not find a traditional color correction panel setup that works with multiple applications and satisfies all of the requirements of a professional colorist for around $900.

I love the Tangent Element panels, but at a fraction of their price the Tangent Wave 2 is a great solution without spending what could be used as a down payment on a car.

Check out the Tangent Wave 2 on Tangent’s website.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.


Colorist Christopher M. Ray talks workflow for Alexa 65-shot Alpha

By Randi Altman

Christopher M. Ray is a veteran colorist with a varied resume that includes many television and feature projects, including Tomorrowland, Warcraft, The Great Wall, The Crossing, Orange Is the New Black, Quantico, Code Black and Alpha. These projects have taken Ray all over the world, including remote places throughout North America, Europe, Asia and Africa.

We recently spoke with Ray, who is on staff at Burbank’s Picture Shop, to learn more about his workflow on the feature film Alpha, which focuses on a young man trying to survive alone in the wilderness after he’s left for dead during his first hunt with his Cro-Magnon tribe.

Ray was dailies colorist on the project, working with supervising DI colorist Maxine Gervais. Gervais of Technicolor won an HPA Award for her work on Alpha in the Outstanding Color Grading — Feature Film category.

Let’s find out more….

Chris Ray and Maxine Gervais at the HPA Awards.

How early did you get involved in Alpha?
I was approached about working on Alpha right before the start of principal photography. From the beginning, I knew it was going to be a groundbreaking workflow. I was told that we would be working with the ARRI Alexa 65 camera, mainly working in an on-set color grading trailer, and that we would be using FilmLight’s Daylight software.

Once I was on board, our main focus was to design a comprehensive workflow that could accommodate on-set grading and the Daylight software while adapting to the ever-changing challenges that the industry brings. Being involved from the start was actually a huge perk for me. It gave us the time we needed to design and really fine-tune the extensive workflow.

Can you talk about working with the final colorist Maxine Gervais and how everyone communicated?
It was a pleasure working with Maxine. She’s really dialed in to the demands of our industry. She was able to fly to Vancouver for a few days while we were shooting the hair/makeup tests, which gave us a chance to establish in-person communication. We were able to sit down and discuss creative approaches to the feature right away, which I appreciated, as I’m the type of person who likes to dive right in.

At the film’s conception, we set in motion a plan to incorporate a Baselight Linked Grade (BLG) color workflow from FilmLight. This would allow my color grades in Daylight to transition smoothly into Maxine’s Baselight software. We knew from the get-go that there would be several complicated “day for night” scenes that Maxine and I would want to bring to fruition right away. Using the BLG workflow, I was able to send her single ARRIRAW frames that gave that “day for night” look we were searching for. She was then able to send them back to me via a BLG file. Even in remote locations, it was easy for me to access the BLG grade files via the Internet.

[Maxine Gervais weighs in on working with Ray: “Christopher was great to work with. As the workflow on the feature was created from scratch, he implemented great ideas. He was very keen on the whole project and was able to adapt to the ever-changing challenges of the show. It is always important to have on-set color dialed in correctly, as it can be problematic if it is not accurately established in production.”]

How did you work with the DP? What direction were you given?
Being on set, it was very easy for DP Martin Gschlacht to come over to the trailer and view the current grade I was working on. Like Maxine, Martin already had a very clear vision for the project, which made it easy to work with him. Oftentimes, he would call me over on set and explain his intent for the scene. We would brainstorm ways I could assist him in making his vision come to life. Audiences rarely see raw camera files, or realize how much color can influence the story being told.

It also helps that Martin is a master of aesthetic. The content being captured was extremely striking; he has this natural intuition about what look is needed for each environment that he shoots. We shot in lush rain forests in British Columbia and arid badlands in Alberta, which each inspired very different aesthetics.

Whenever I had a bit of down time, I would walk over to set and just watch them shoot, like a fly on the wall quietly observing and seeing how the story was unfolding. As a colorist, it’s so special to be able to observe the locations on set. Seeing the natural desaturated hues of dead grass in the badlands or the vivid lush greens in the rain forest with your own eyes is an amazing opportunity many of us don’t get.

You were on set throughout? Is that common for you?
We were on set throughout the entire project as a lot of our filming locations were in remote areas of British Columbia and Alberta, Canada. One of our most demanding shooting locations included the Dinosaur Provincial Park in Brooks, Alberta. The park is a UNESCO World Heritage site that no one had been allowed to film at prior to this project. I needed to have easy access to the site in order to easily communicate with the film’s executive team and production crew. They were able to screen footage in their trailer and we had this seamless back-and-forth workflow. This also allowed them to view high-quality files in a comfortable and controlled environment. Also, the ability to flag any potential issues and address them immediately on set was incredibly valuable with a film of such size and complexity.

Alpha was actually the first time I worked in an on-set grading trailer. In the past I usually worked out of the production office. I have heard of other films working with an on-set trailer, but I don’t think I would say that it is overly common. Sometimes, I wish I could be stationed on set more often.

The film was shot mostly with the Alexa 65, but included footage from other formats. Can you talk about that workflow?
The film was mostly shot on the Alexa 65, but there were also several other formats it was shot on. For most of the shoot there was a second unit that was shooting with Alexa XT and Red Weapon cameras, with a splinter unit shooting B-roll footage on Canon 1D, 5D and Sony A7S. In addition to these, there were units in Iceland and South Africa shooting VFX plates on a Red Dragon.

By the end of the shoot, there were several different camera formats and over 10 different resolutions. We used the 6.5K Alexa 65 resolution as the master resolution and mapped all the others into it.

The Alexa 65 camera cards were backed up to 8TB “sled” transfer drives using a Codex Vault S system. The 8TB transfer drives were then sent to the trailer where I had two Codex Vault XL systems — one was used for ingesting all of the footage into my SAN and the second was used to prepare footage for LTO archival. All of the other unit footage was sent to the trailer via shuttle drives or Internet transfer.

After the footage was successfully ingested to the SAN with checksum verification, it was ready to be colored, processed and then archived. We had eight LTO6 decks running 24/7, as the main focus was to archive the enormous amounts of high-res camera footage we were receiving. The Alexa 65 alone was about 2.8TB per hour for each camera.
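
To give a rough sense of the data-wrangling scale described here, below is a small back-of-the-envelope sketch in Python. The 2.8TB-per-hour Alexa 65 figure comes from the interview; the LTO-6 capacity and write-speed numbers are approximate assumptions, and the use of SHA-256 for the checksum is simply an example, not necessarily what the production used.

```python
# Back-of-the-envelope math for the ingest/archive workflow described above.
# The Alexa 65 data rate (2.8 TB/hour per camera) is from the interview;
# the LTO-6 figures below are assumed, approximate values.
import hashlib

ALEXA65_TB_PER_HOUR = 2.8
LTO6_CAPACITY_TB = 2.5      # assumed native capacity per tape
LTO6_WRITE_MBPS = 160       # assumed native write speed, MB/s

def archive_load(cameras: int, shoot_hours: float, decks: int) -> None:
    footage_tb = cameras * shoot_hours * ALEXA65_TB_PER_HOUR
    tapes = footage_tb / LTO6_CAPACITY_TB
    write_hours = footage_tb * 1e6 / (LTO6_WRITE_MBPS * 3600)  # TB -> MB -> hours
    print(f"{footage_tb:.1f} TB of footage, about {tapes:.0f} LTO-6 tapes, "
          f"{write_hours / decks:.1f} hours of writing spread across {decks} decks")

def sha256_of(path: str, chunk: int = 8 * 1024 * 1024) -> str:
    """Example checksum for verifying an ingested copy against the camera card."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

# Two Alexa 65 bodies shooting 10 hours, archived across eight decks:
archive_load(cameras=2, shoot_hours=10, decks=8)
```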

Had you worked with Alexa 65 footage previously?
Many times. A few years ago, I was in China for seven months working on The Great Wall, which was one of the first films to shoot with the Alexa 65. I had a month of in-depth pre-production with the camera, testing, shooting and honing the camera’s technology. Working very closely with Arri and Codex technicians during this time, I was able to design the most efficient workflow possible. Even as the shoot progressed, I continued to communicate closely with both companies. As new challenges arose, we developed and implemented solutions that kept production running smoothly.

The workflow we designed for The Great Wall was very close to the workflow we ended up using on Alpha, so it was a great advantage that I had previous experience working in-depth with the camera.

What were some of the challenges you faced on this film?
To be honest, I love a challenge. As colorists, we are thrown into tricky situations every day. I am thankful for these challenges; they improve my craft and enable me to become more efficient at problem solving. One of the largest challenges I faced on this particular project was working with so many different units, given the number of units shooting, the sheer size of the footage and the dozens of format types needed.

We had to be accessible around the clock, most of us working 24 hours a day. Needless to say, I made great friends with the transportation driving team and the generator operators. I think they would agree that my grading trailer was one of their largest challenges on the film since I constantly needed to be on set and my work was being imported/exported in such high resolutions.

In the end, as I was watching this absolutely gorgeous film in the theater it made sense. Working those crazy hours was absolutely worth it — I am thankful to have worked with such a cohesive team and the experience is one I will never forget.


Autodesk cloud-enabled tools now work with BeBop post platform

Autodesk has enabled use of its software in the cloud — including 3DS Max, Arnold, Flame and Maya — and BeBop Technology will deploy the tools on its cloud-based post platform. The BeBop platform enables processing-heavy post projects, such as visual effects and editing, to run in the cloud on powerful and highly secure virtualized desktops. Creatives can process, render, manage and deliver media files from anywhere on BeBop using any computer and an Internet connection as modest as 20Mbps.

The ongoing deployment of Autodesk software on the BeBop platform mirrors the ways BeBop and Adobe work closely together to optimize the experience of Adobe Creative Cloud subscribers. Adobe applications have been available natively on BeBop since April 2018.

Autodesk software users will now also gain access to BeBop Rocket Uploader, which enables ingestion of large media files at incredibly high speeds for a predictable monthly fee with no volume limits. Additionally, BeBop Over the Shoulder (OTS) enables secure and affordable remote collaboration, review and approval sessions in real-time. BeBop runs on all of the major public clouds, including Amazon Web Services (AWS), Google Cloud Platform (GCP), and Microsoft Azure.

Lowepost offering Scratch training for DITs, post pros

Oslo, Norway-based Lowepost, which offers an online learning platform for post production, has launched an Assimilate Scratch Training Channel targeting DITs and post pros. The training includes an extensive series of tutorials that guide a post pro or DIT through the features of an entire Scratch workflow. The Scratch product line covers everything from dailies and conform to color grading, visual effects, compositing, finishing, VR and live streaming.

“We’re offering in-depth training of Scratch via comprehensive tutorials developed by Lowepost and Assimilate,” says Stig Olsen, manager of Lowepost. “Our primary goal is to make Scratch training easily accessible to all users and post artists for building their skills in high-end tools that will advance their expertise and careers. It’s also ideal for DaVinci Resolve colorists who want to add another excellent conform, finishing and VR tool to their tool kit.”

Lowepost is offering three months of free access to the Scratch training. The first tutorial, Scratch Essential Training, is also available now. A free 30-day trial offer of Scratch is available via their website.

Lowepost’s Scratch Training Channel is available for an annual fee of $59 (US).

Phil Azenzer returns to Encore as senior colorist

Industry veteran and senior colorist Phil Azenzer, one of Encore’s original employees, has returned to the company, bringing with him a credit list that includes TV and features. He was most recently with The Foundation.

When he first started at Encore he was a color assistant, learning the craft and building his client base. Over his post production career, Azenzer has collaborated with many notable directors including David Lynch, Steven Spielberg and David Nutter, as well as high-profile DPs such as Robert McLachlan and John Bartley.

His credits include The X-Files, Six Feet Under, Entourage, Big Love, Bates Motel, Bloodline and most recently, seasons four and five of Black-ish and seasons one and two of Grown-ish.

“Coming back to Encore is really a full circle journey for me, and it feels like coming home,” shared Azenzer. “I learned my craft and established my career here. I’m excited to be back at Encore, not just because of my personal history here, but because it’s great to be at an established facility with the visibility and reputation that Encore has. I’m looking forward to collaborating with lots of familiar faces.”

Azenzer is adept at helping directors and cinematographers create visual stories. With the flexibility to elevate to a variety of desired looks, he brings a veteran’s knowledge and skillset to projects requiring anything from subtle film noir palettes to hyper-saturated stylized looks. Upon departing Encore in 2001, Azenzer spent time at Technicolor and Post Group/io Film before returning to Encore from 2009-2011. Following his second stint at Encore, he continued work as a senior colorist at Modern Videofilm, NBC Universal and Sony.

While his main tool is Resolve, he has also worked with Baselight and Lustre.

Color plays big role in the indie thriller Rust Creek

In the edge-of-your-seat thriller Rust Creek, confident college student Sawyer (Hermione Corfield) loses her way while driving through very rural Appalachia and quickly finds herself in a life-or-death struggle with some very dangerous men. The modestly-budgeted feature from Lunacy Productions — a company that encourages female filmmakers in top roles — packs a lot of power with virtually no pyrotechnics using well-thought-out filmmaking techniques, including a carefully planned and executed approach to the use of color throughout the film.

Director Jen McGowan and DP Michelle Lawler

Director Jen McGowan, cinematographer Michelle Lawler and colorist Jill Bogdanowicz of Company 3 collaborated to help express Sawyer’s character arc through the use of color. For McGowan, successful filmmaking requires thorough prep. “That’s where we work out, ‘What are we trying to say and how do we illustrate that visually?’” she explains. “Film is such a visual medium,” she adds, “but it’s very different from something like painting because of the element of time. Change over time is how we communicate story, emotion and theme as filmmakers.”

McGowan and Lawler developed the idea that Sawyer is lost, confused and overwhelmed as her dire situation becomes clear. Lawler shot most of Rust Creek handholding an ARRI Alexa Mini (with Cooke S4s) following Sawyer as she makes her way through the late autumn forest. “We wanted her to become part of the environment,” Lawler says. “We shot in winter and everything is dead, so there was a lot of brown and orange everywhere with zero color separation.”

Production designer Candi Guterres pushed that look further, rather than fighting it, with choices about costumes and some of the interiors.

“They had given a great deal of thought to how color affects the story,” recalls colorist Bogdanowicz, who sat with both women during the grading sessions (using Blackmagic’s DaVinci Resolve) at Company 3 in Santa Monica. “I loved the way color was so much a part of the process, even subtly, of the story arc. We did a lot in the color sessions to develop this concept where Sawyer almost blends into the environment at first and then, as the plot develops and she finds inner strength, we used tonality and color to help make her stand out more in the frame.”

Lawler explains that the majority of the film was shot on private property deep in the Kentucky woods, without the use of any artificial light. “I prefer natural light where possible,” she says. “I’d add some contrast to faces with some negative fill and maybe use little reflectors to grab a rake of sunlight on a rock, but that was it. We had to hike to the locations and we couldn’t carry big lights and generators anyway. And I think any light I might have run off batteries would have felt fake. We only had sun about three days of the 22-day shoot, so generally I made use of the big ‘silk’ in the sky and we positioned actors in ways that made the best use of the natural light.”

In fact, the weather was beyond bad, it was punishing. “It would go from rain to snow to tornado conditions,” McGowan recalls. “It dropped to seven degrees and the camera batteries stopped working.”

“The weather issues can’t be overstated,” Lawler adds, describing conditions on the property they used for much of the exterior location. “Our base camp was in a giant field. The ground would be frozen in the morning and by afternoon there would be four feet of mud. We dug trenches to keep craft services from flooding.”

The budget obviously didn’t provide for waiting around for the elements to change, David Lean-style. “Michelle and I were always mindful when shooting that we would need to be flexible when we got to the color grading in order to tie the look together,” McGowan explains. “I hate the term ‘fix it in post.’ It wasn’t about fixing something, it was about using post to execute what was intended.”

Jill Bogdanowicz

“We were able to work with my color grading toolset to fine tune everything shot by shot,” says Bogdanowicz. “It was lovely working with the two of them. They were very collaborative but were very clear on what they wanted.”

Bogdanowicz also adapted a film emulation LUT, based on the characteristics of a Fujifilm print stock, and added a subtle hint of digital grain via a Boris FX Sapphire plug-in to help give the imagery a unifying look and filmic feel. At the very start of the process, the colorist recalls, “I showed Jen and Michelle a number of ‘recipes’ for looks and they fell in love with this one. It’s somewhat subtle and elegant, and it made ‘electric’ colors not feel so electric, but it has a film-style curve with strong contrast in the mids and shadows you can still see into.”
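
For readers curious what those two ingredients look like in practice, below is a generic Python sketch of a film-style tone curve plus a hint of grain. It is not the Fujifilm emulation or the Sapphire plug-in used on the film; the curve points and grain strength are invented purely for illustration.

```python
# Generic sketch of a "film look": an S-shaped tone curve (standing in for
# a real print-emulation LUT) plus subtle monochrome grain. Values are made up.
import numpy as np

# S-curve control points: stronger contrast in the mids, shadows that roll
# off rather than crushing to black.
curve_in = np.array([0.00, 0.10, 0.35, 0.65, 0.90, 1.00])
curve_out = np.array([0.02, 0.06, 0.28, 0.72, 0.95, 0.99])

def film_look(rgb: np.ndarray, grain_strength: float = 0.01) -> np.ndarray:
    """Apply the tone curve per channel, then add a hint of grain."""
    toned = np.interp(rgb, curve_in, curve_out)
    grain = np.random.normal(0.0, grain_strength, rgb.shape[:2] + (1,))
    return np.clip(toned + grain, 0.0, 1.0)

frame = np.random.rand(1080, 1920, 3).astype(np.float32)  # stand-in footage
graded = film_look(frame)
```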

McGowan says she was quite pleased with the work that came out of the color theater. “Color is not one of the things audiences usually pick up on, but a lot of people do when they see Rust Creek. It’s not highly stylized, and it certainly isn’t a distracting element, but I’ve found a lot of people have picked up on what we were doing with color and I think it definitely helped make the story that much stronger.”

Rust Creek is currently streaming on Amazon Prime and Google.

SciTech Medallion Recipient: A conversation with Curtis Clark, ASC

By Barry Goch

The Academy of Motion Picture Arts and Sciences has awarded Curtis Clark, ASC, the John A. Bonner Medallion “in appreciation for outstanding service and dedication in upholding the high standards of the Academy.” The presentation took place in early February, and just prior to the event, I spoke to Clark and asked him to reflect on the transition from film to digital cinema and his contributions to the industry.

Clark’s career as a cinematographer includes features, TV and commercials. He is also the chair of the ASC Motion Imaging Technology Council, which developed the ASC CDL (Color Decision List).
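
For context, the ASC CDL is a small, published transform: a per-channel slope, offset and power, followed by a saturation adjustment using Rec. 709 luma weights. The Python sketch below renders that formula; the example parameter values are arbitrary, and the clamping here is simplified for display-referred images.

```python
# Minimal rendering of the ASC CDL transform: per-channel slope/offset/power,
# then a global saturation step using Rec. 709 luma weights. Clamping to [0, 1]
# is simplified here for display-referred values.
import numpy as np

def asc_cdl(rgb, slope=(1.0, 1.0, 1.0), offset=(0.0, 0.0, 0.0),
            power=(1.0, 1.0, 1.0), saturation=1.0):
    rgb = np.asarray(rgb, dtype=np.float32)
    out = np.clip(rgb * slope + offset, 0.0, 1.0) ** power
    luma = out @ np.array([0.2126, 0.7152, 0.0722])
    out = luma[..., None] + saturation * (out - luma[..., None])
    return np.clip(out, 0.0, 1.0)

# Example: a slightly warm, gently contrasty, more saturated grade.
graded = asc_cdl(np.random.rand(4, 4, 3),
                 slope=(1.05, 1.0, 0.95), power=(0.95, 0.95, 0.95),
                 saturation=1.1)
```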

Can you reflect on the changes you’ve seen over your career and how you see things moving ahead in the future?
Once upon a time, life was an awful lot simpler. I look back nostalgically on when it was all film-based, and the cinematographer’s role included following up on the look of dailies and following through with any photographic testing that helped home in on the desired look. It had its photochemical limitations; its analog image structure was not as malleable or tonally expansive as the digital canvas we have now.

Do you agree that Kodak’s Cineon helped us to this digital revolution — the hybrid film/digital imaging system where you would shoot on film, scan it and then digitally manipulate it before going back out to film via a film recorder?
That’s where the term digital intermediate came into being, and it was an eye-opener. I think at the time not everyone fully understood the ramifications of the impact it was making. Kodak created something very potent and led the way in terms of methodologies, or how to arrive at the integration of digital into what was then called a hybrid imaging system — combining digital and film.

The DCI (Digital Cinema Initiatives) was created to establish digital projection standards. Without a standard, we’d potentially be creating chaos in terms of how to move forward. For the studios, distributors and exhibitors, it would be a nightmare. Can you talk about that?
In 2002, I had been asked to form a technology committee at the ASC to explore these issues: how the new emerging digital technologies were impacting the creative art form of cinematography and of filmmaking, and also to help influence the development of these technologies so they best serve the creative intent of the filmmaker.

DCI proposed that for digital projection to be considered ready for primetime, its image quality needed to be at least as good as, if not better than, a print from the original negative. I thought this was a great commitment that the studios were making. For them to say digital projection was going to be judged against a film print projection from the original camera negative of the exact same content was a fantastic decision. Here was a major promise of a solution that would give digital cinema image projection an advantage since most people saw release prints from a dupe negative.

Digital cinema had just reached the threshold of being able to do 2K digital cinema projection. At that time, 4K digital projection was emerging, but it was a bit premature in terms of settling on that as a standard. So you had digital cinema projection and the emergence of a sophisticated digital intermediate process that could create the image quality you wanted from the original negative, but projected on a digital projection.

In 2004, the Michael Mann film Collateral was shot with the Grass Valley Viper FilmStream, the Sony F900 and Sony F950 cameras — the latest generation of digital motion picture cameras, basically video cameras that were becoming increasingly sophisticated, with better dynamic range and tonal contrast, using 24fps and other multiple frame rates, but 24p was the key.
These cameras were used in the most innovative and interesting manner, because Mann combined film with digital, using the digital for the low-light level night scenes and then using film for the higher-light level day exterior scenes and day interior scenes where there was no problem with exposure.

Because of the challenge of shooting the night scenes, they wanted to shoot at such low light levels that film would potentially be a bit degraded in terms of grain and fog levels. If you had to overrate the negative, you needed to underexpose and overdevelop it, which was not desirable, whereas the digital cameras thrived in lower light levels. Also, you could shoot at a stop that gave you better depth of field. At the time, it was a very bold decision. But looking back on it historically, I think it was the inflection point that brought the digital motion picture camera into the limelight as a possible alternative to shooting on film.

That’s when they decided to do the Camera Assessment Series tests, which evaluated all the different digital cinema cameras available at the time?
Yeah, with the idea being that we’d never compare two digital cameras together, we’d always compare the digital camera against a film reference. We did that first Camera Assessment Series, which was the first step in the direction of validating the digital motion picture camera as viable for shooting motion pictures compared with shooting on film. And we got part way there. A couple of the cameras were very impressive: the Sony F35, the Panavision Genesis, the Arri D21 and the Grass Valley Viper were pretty reasonable, but this was all still mainly within a 2K (1920×1080) realm. We had not yet broached that 4K area.

A couple of years later, we decided to do this again. It was called the Image Control Assessment Series, or ICAS, and it was shot at Warner Bros. We shot scenes in a café — a daylight interior and then a nighttime exterior. Both scenes had a dramatically large range of contrast and different colors in the image. It was the big milestone. The new Arri Alexa was used, along with the Sony F65 and the then-latest versions of the Red cameras.

So we had 4K projection and 4K cameras, and we introduced the use of ACES (Academy Color Encoding System) color management. We were really at the point where all the key components we needed were beginning to come together. This was the first instance where these digital workflow components were all used in a single significant test project, using film as our common benchmark reference — how did these cameras compare with film? That was the key thing. In other words, could we consider them ready for prime time? The answer was yes. We did that project in conjunction with the PGA and a company called Revelations Entertainment, which is Morgan Freeman’s company. Lori McCreary, his partner, was one of the producers who worked with us on this.

So filmmakers started using digital motion picture cameras instead of film. With digital cinema having replaced film print as a distribution medium, these new-generation digital cameras started to replace film as an image capture medium. Then the question was whether we would have an end-to-end digital system that could become a viable alternative to shooting on film.

L to R: Josh Pines, Steve MacMillan, Curtis Clark and Dhanendra Patel.

Part of the reason you are getting this acknowledgement from the Academy is your dedication to the highest quality of image and your respect for the artistry, from capture through delivery. Can you talk about your role in look management from on-set through delivery?
I think we all need to be on the same page; it’s one production team whose objective is maintaining the original creative intent of the filmmakers. That includes the director and cinematographer, working with an editor and a production designer. Making a film is a collective team effort, but the overall vision is typically established by the director in collaboration with the cinematographer and a production designer. The cinematographer is tasked with capturing that with lighting, camera composition, movement, lens choices — all those elements that are part of the process of creative filmmaking. Once you start shooting with these extremely sophisticated cameras, like the Sony F65 or Venice, the Panavision Millennium DXL, an Arri or the latest versions of the Red camera, all of which can reproduce high dynamic range, wide color gamut and high resolution, all that raw image data is inherently there and the creative canvas has certainly been expanded.

So if you’re using these creative tools to tell your story, to advance your narrative, then you’re doing it with imagery defined by the potential of what these technologies are able to do. In the modern era, people aren’t seeing dailies at the same time or viewing them together under controlled circumstances. The viewing process has become fragmented. When everyone had to come together to view projected dailies, there was a certain camaraderie and a constructive give-and-take that made the filmmaking process more effective. If something wasn’t what it should be, everyone could see exactly what it was and make a correction if needed.

But now we have a more dispersed production team at every stage of the production process, from the initial image capture through to dailies, editorial, visual effects and final color grading. We have so many different people in disparate locations working on the production who don’t seem to be as unified, sometimes, as we were when it was all film-based analog shooting. At the same time, it’s far easier and simpler to integrate visual effects into your workflow. As Cineon showed when it first emerged, you could do digital effects as opposed to optical effects, and that was a big deal.

So coming back to the current situation, particularly now with the most advanced forms of imaging, which include high dynamic range and color gamuts wider than even P3 (out to Rec. 2020), it’s essential to have a color management system like ACES that actually has enough color gamut to contain any color space you capture and want to be able to manipulate.

Can you talk about the challenges you overcame, and how that fits into the history of cinema as it relates to the Academy recognition you received?
As a cinematographer working on feature films or commercials, I kept thinking: if I’m fortunate enough to be able to manage the dailies and certainly the final color grading, there are these tools called lift, gain and gamma, which are common to all the different color correctors. But they’re all implemented differently. They’re not cross-platform-compatible, so the numbers from lift gain gamma — which is the primary RGB grading — on one color corrector will not translate automatically to another. So I thought we should have a cross-platform version of that, because it’s usually seen as the first step in grading.

The ASC CDL is about as basic as you can get, and it was designed as a cross-platform implementation, so that everybody who installs and applies it in a compatible color grading system, whether on a DaVinci, Baselight, Lustre or whatever you were using, gets the same, transferable results.
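To make that portability concrete, here is a minimal sketch of the published ASC CDL math: per-channel slope, offset and power, followed by a single saturation adjustment using Rec. 709 luma weights. The numbers are placeholders for illustration only, not values from any production discussed in this interview.

```python
# Minimal sketch of the ASC CDL per-channel transform:
#   out = clamp(in * slope + offset) ** power, then one saturation value
# applied against Rec. 709 luma. Slope/offset/power/sat below are placeholders.

def apply_cdl(rgb, slope, offset, power, sat):
    # Per-channel slope, offset and power (clamping to 0..1 before the power).
    graded = []
    for v, s, o, p in zip(rgb, slope, offset, power):
        v = v * s + o
        v = min(max(v, 0.0), 1.0)
        graded.append(v ** p)

    # Saturation: blend each channel toward its Rec. 709 luma.
    luma = 0.2126 * graded[0] + 0.7152 * graded[1] + 0.0722 * graded[2]
    return [luma + sat * (c - luma) for c in graded]

# The same ten numbers produce the same result on any CDL-compliant system.
print(apply_cdl([0.18, 0.18, 0.18],
                slope=[1.05, 1.00, 0.95],
                offset=[0.01, 0.00, -0.01],
                power=[1.10, 1.00, 1.00],
                sat=0.9))
```

Because the formula itself is fixed, those ten numbers travel unchanged between any CDL-compliant dailies tool, VFX package or color corrector.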

You could transport those numbers from an on-set setup using a dailies creation tool, like ColorFront for example. You could then use the ASC CDL to establish your dailies look during the shoot, not while you’re actually rolling but working with the DIT to establish a chosen look that could then be applied to dailies and used for VFX.

Then when you make your way into the final color grading session with the final cut — or whenever you start doing master color grading going back to the original camera source — you would have these initial grading corrections as starting-point references. This gives you the possibility of continuing the color grading process using all the sophistication of a full color corrector, whether it’s power windows or secondary color correction: whatever you felt you needed to finalize the look.

I was advocating this in the ASC Technology Committee, as it was called then; it has since been renamed the Motion Imaging Technology Council (MITC). We needed a solution like this, and a group of us got together and decided that we would do it. There were plenty of people who were skeptical: “Why would you do something like that when we already have lift gain gamma? Why would any of the manufacturers of the different color grading systems integrate this into their system? Would it not impinge upon their competitive advantage? If they had a system that people were used to using, and their own lift gain gamma worked perfectly well for them, why would they want to use the ASC CDL?”

We live in a much more fragmented post world, and I saw that becoming even more so with the advances of digital. The ASC CDL would be a “look unifier” that would establish initial look parameters. You would be able to have control over the look at every stage of the way.

I’m assuming that the cinematographer would work with the director and editor, and they would assess certain changes that probably should be made. We’re now looking at cut sequences, and what we had thought would be most appropriate when we were shooting is now in the context of an edit, so there may need to be some changes and adjustments.

Were you involved in ACES? Was it a similar impetus for ACES coming about? Or was it just spawned because visual effects movies became so big and important with the advent of digital filmmaking?
It was a bit of both, including productions without VFX. So I would say that initially it was driven by the fact that there really should be a standardized color management system. Let me give you an example of what I’m talking about. When we were all photochemical and basically shooting with Kodak stock, we were working with film-based Kodak color science.

It’s a color science that everybody knew and understood; even if they didn’t understand it from an engineering or photochemical point of view, they understood its effects. It’s what helped enable the look and the images that we wanted to create.

That was a color management system that was built into film. That color science system could have been adapted into the digital world, but Kodak resisted that because of the threat to negatives. If you apply that film color science to digital cameras, then you’re making digital cameras look more like film and that could pose a threat to the sale of color film negative.

So that’s really where the birth of ACES came about — to create a universal, unified color management system that would be appropriate anywhere you shot and with the widest possible color gamut. And it supports any camera or display technology because it will always have a more expanded (future-proofing) capability within which digital camera and display technologies can work effectively, efficiently, accurately, reliably and predictably.

Very early on, my ASC Technology Committee (now called Motion Imaging Technology Council) got involved with ACES development and became very excited about it. It was the missing ingredient needed to be able to make the end-to-end digital workflow the success that we thought that it could become. Because we no longer could rely on film-based color science, we had to either replicate that or emulate it with a color management system that could accommodate everything we wanted to do creatively. So ACES became that color management system.

So, in addition to becoming the first cross-platform primary color grading tool, the ASC CDL became the first official ACES look modification transform. Because ACES is not a color grading tool (it’s a color management system), you have to pair color grading tools with the color management. So you have the color management with ACES, you have the color grading with the ASC CDL, and the combination of the two is the look management system, because it takes both to make that work. And it’s not that the ASC CDL is the only tool you use for color grading, but it has the portable, cross-platform ability to control the color grading from dailies through visual effects up to the final color grade, when you’re working with a sophisticated color corrector.
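As a rough illustration of how those two pieces fit together (ACES handling the color management, the ASC CDL carrying the look), the sketch below uses the OpenColorIO Python bindings, which many ACES pipelines are built on. It assumes the PyOpenColorIO 2.x bindings and an ACES OCIO config on disk; the config path, color-space names and CDL values are assumptions for illustration, and the names vary between ACES/OCIO releases.

```python
# Illustrative sketch only: assumes PyOpenColorIO 2.x and an ACES OCIO config.
# The config path and color-space names are assumptions, not fixed values.
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromFile("/path/to/aces/config.ocio")  # hypothetical path

# Color management (ACES): how pixels get from one ACES space into a working space.
# These space names are examples from typical ACES configs, not guaranteed.
to_working = config.getProcessor("ACES - ACES2065-1", "ACES - ACEScct")
cpu_in = to_working.getDefaultCPUProcessor()

# The "look" (ASC CDL): ten portable numbers, applied in the working space.
cdl = OCIO.CDLTransform()
cdl.setSlope([1.05, 1.00, 0.95])    # placeholder values, not from any real grade
cdl.setOffset([0.01, 0.00, -0.01])
cdl.setPower([1.10, 1.00, 1.00])
cdl.setSat(0.9)
cpu_look = config.getProcessor(cdl).getDefaultCPUProcessor()

pixel = cpu_in.applyRGB([0.18, 0.18, 0.18])  # manage color first...
print(cpu_look.applyRGB(pixel))              # ...then apply the portable look
```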

What do you see for the future of cinematography and the merging of the worlds of post and on-set work, and what do you see as the challenges for future integration between maintaining the creative intent and the metadata?
We’re very involved in metadata at the moment. Metadata is a crucial part of making all this work, as you well know. In fact, we worked with the Academy on the common 3D LUT format, so there is a common LUT format that again has cross-platform consistency and predictability. Its functionality and scope of use would be better understood if everyone were using it. It’s a work in progress. Metadata is critical.

I think as we expand the canvas and the palette of the possibility of image making, you have to understand what these technologies are capable of doing, so that you can incorporate them into your vision. So if you’re saying my creative vision includes doing certain things, then you would have to understand the potential of what they can do to support that vision. A very good example in the current climate is HDR.

That’s very controversial in a lot of ways, because the set manufacturers really would love to have everything just jump off the screen to make it vibrant and exciting. However, from a storytelling point of view, it may not be appropriate to push HDR imagery where it distracts from the story.
Well, it depends on how it’s done and how you are able to use that extended dynamic range when you have your bright highlights. And you can use foreground background relationships with bigger depth of field for tremendous effect. They have a visceral presence, because they have a dimensionality when, for example, you see the bright images outside of a window.

When you have an extended dynamic range of scene tones that could add dimensional depth to the image, you can choreograph and stage the blocking for your narrative storytelling with the kind of images that take advantage of those possibilities.

So HDR needs to be thought of as something that’s integral to your storytelling, not just something that’s there because you can do it. That’s when it can become a distraction. When you’re on set, you need a reference monitor that is able to show and convey all the different tonal and color elements that you’re working with to create your look, from HDR to wider color gamut, whatever that may be, so that you feel comfortable that you’ve made the correct creative decision.

With virtual production techniques, you can incorporate some of that into your live-action shooting on set with that kind of compositing, just like James Cameron started with Avatar. If you want to do that with HDR, you can. The sky is the limit in terms of what you can do with today’s technology.

So these things are there, but you need to be able to pull them all together into your production workflow to make sure that you can comfortably integrate in the appropriate way at the appropriate time. And that it conforms to what the creative vision for the final result needs to be and then, remarkable things can happen. The aesthetic poetry of the image can visually drive the narrative and you can say things with these images without having to be expositional in your dialogue. You can make it more of an experientially immersive involvement with the story. I think that’s something that we’re headed toward, that’s going to make the narrative storytelling very interesting and much more dynamic.

Certainly, and certainly with the advancements of consumer technology and better panels and the high dynamic range developments, and Dolby Vision coming into the home and Atmos audio coming into the home. It’s really an amazing time to be involved in the industry; it’s so fun and challenging.

It’s a very interesting time, and a learning curve needs to happen. That’s what’s driven me from the very beginning, and it’s why I think our ASC Motion Imaging Technology Council has been so successful in its 16 years of continuous operation, influencing the development of some of these technologies in very meaningful ways, but always with the intent that these new imaging technologies are there to better serve the creative intent of the filmmaker. The technology serves the art. It’s not about the technology per se, it’s about the technology as the enabling component of the art. It enables the art to happen, and it expands its scope and possibility to broader canvases, with wider color gamuts, in ways that have never been experienced or possible before.


Barry Goch is a Finishing Artist at The Foundation and a Post Production Instructor at UCLA Extension. You can follow him on Twitter at @gochya.

Sundance Videos: Watch our editor interviews

postPerspective traveled to Sundance for the first time this year, and it was great. In addition to attending some parties, brunches and panels, we had the opportunity to interview a number of editors who were in Park City to help promote their various projects. (Watch here.)

Billy McMillin

We caught up with the editors of the comedy docu-series Documentary Now!, Micah Gardner and Jordan Kim. We spoke to Courtney Ware about cutting the film Light From Light, as well as Billy McMillin, editor on the documentary Mike Wallace Is Here. We also chatted with Phyllis Housen, the editor on director Chinonye Chukwu’s Clemency, and Kent Kincannon, who cut Hannah Pearl Utt’s comedy Before You Know It. Finally, we sat down with Bryan Mason, who had the dual roles of cinematographer and editor on Animals.

We hope you enjoy watching these interviews as much as we enjoyed shooting them.

Don’t forget, click here to view!

Oh, and a big shout out to Twain Richardson from Jamaica’s Frame of Reference, who edited and color graded the videos. Thanks Twain!

More Than Just Words: Lucky Post helps bring Jeep’s viral piece to life


Jeep’s More Than Just Words commercial, out of agency The Richards Group, premiered online just prior to this year’s Super Bowl as part of the brand’s Big Game Blitz, which saw numerous projects launched leading up to the game.

Quickly earning millions of views, the piece features a version of our national anthem by One Republic, as well as images of the band. The two-minute spot is made up of images of small, everyday moments that add up to something big and evoke a feeling of America.

There is a father and his infant son, people gathered in front of a barn, a football thrown through a hanging tire swing. We see bits of cities and suburbs, football, stock images of Marilyn Monroe and soldiers training for battle — and every once in a while, an image of a Jeep is in view.

The spot ends as it began, with images of One Republic in the studio before the screen goes black and text appears reading: More Than Just Words. Then the Jeep logo appears.

The production company was Zoom USA, with partner Mark Toia directing. Lucky Post in Dallas contributed editorial, color, sound design and finishing to the piece.

Editor Sai Selvarajan used Adobe’s Premiere. Neil Anderson provided the color grade in Blackmagic Resolve, while Scottie Richardson performed the sound design and mix using Avid Pro Tools. Online finishing and effects were via Tim Nagle, who worked in Autodesk Flame.

“The concept is genius in its simplicity; a tribute to faith in our country’s patchwork with our anthem’s words reinforced and represented in image,” says Lucky Post’s Selvarajan. “Behind the scenes, everyone provided collective energy and creativity to bring it to life. It was the product of many, just like the message of the film, and I was so excited to see the groundswell of positive reaction.”

Industry vets open editorial, post studio Made-SF

Made-SF, a creative studio offering editorial and other services, has been launched by executive producer Jon Ettinger, editor/director Doug Walker and editors Brian Lagerhausen and Connor McDonald, all formerly of Beast Editorial. Along with creative editorial (Adobe Premiere), the company will provide motion graphic design (After Effects, Mocha), color correction and editorial finishing (likely Flame and Resolve). Eventually, it plans to add concept development, directing and production to its mix.

“Clients today are looking for creative partners who can help them across the entire production chain,” says Ettinger. “They need to tell stories and they have limited budgets available to tell them. We know how to do both, and we are gathering the resources to do so under one roof.”

Made is currently set up in interim quarters while completing construction of permanent studio space. The latter will be housed in a century-old structure in San Francisco’s North Beach neighborhood and will feature five editorial suites, two motion graphics suites, and two post production finishing suites with room for further expansion.

The four Made partners bring deep experience in traditional advertising and branded content, working both with agencies and directly with clients. Ettinger and Walker have worked together for more than 20 years and originally teamed up to launch FilmCore, San Francisco. Both joined Beast Editorial in 2012. Similarly, Lagerhausen and McDonald have been editing in the Bay Area for more than two decades. Collectively, their credits include work for agencies in San Francisco and nationwide. They’ve also helped to create content directly for Google, Facebook, LinkedIn, Salesforce and other corporate clients.

Made is indicative of a trend in which companies engaged in content development are adopting fluid business models to address a diversifying media landscape, and where individual talent is no longer confined to a single job title. Walker, for example, has recently served as director on several projects, including a series of short films for Kelly Services, conceived by agency Erich & Kallman and produced by Caruso Co.

“People used to go to great pains to make a distinction about what they do,” Ettinger observes. “You were a director or an editor or a colorist. Today, those lines have blurred. We are taking advantage of that flattening out to offer clients a better way to create content.”

Main Image Caption: (L-R) Doug Walker, Brian Lagerhausen, Jon Ettinger and Connor McDonald.

Quick Chat: Crew Cuts’ Nancy Jacobsen and Stephanie Norris

By Randi Altman

Crew Cuts, a full-service production and post house, has been a New York fixture since 1986. Originally established as an editorial house, it has added services over the years, as the industry evolved, to target all aspects of the workflow.

This independently-owned facility is run by executive producer/partner Nancy Jacobsen, senior editor/partner Sherri Margulies Keenan and senior editor/partner Jake Jacobsen. While commercial spots might be in their wheelhouse, their projects vary and include social media, music videos and indie films.

We decided to reach out to Nancy Jacobsen, as well as EP of finishing Stephanie Norris, to find out about trends, recent work and succeeding in an industry and city that isn’t always so welcoming.

Can you talk about what Crew Cuts provides and how you guys have evolved over the years?
Jacobsen: We pretty much do it all. We have 10 offline editors as well as artists working in VFX, 2D/3D animation, motion graphics/design, audio mix and sound design, VO record, color grading, title treatment, advanced compositing and conform. Two of our editors double as directors.

In the beginning, Crew Cuts primarily offered only editorial. As the years went by and the industry climate changed we began to cater to the needs of clients and slowly built out our entire finishing department. We started with some minimal graphics work and one staff artist in 2008.

In 2009, we expanded the team to include graphics, conform and audio mix. From there we just continued to grow and expand our department to the full finishing team we have today.

As a woman owner of a post house, what challenges have you had to overcome?
Jacobsen: When I started in this business, the industry was very different. I made less money than my male counterparts, and it took me twice as long to be promoted because I am a woman. I have since seen great change, with women leading post houses and production houses and finally getting the recognition they deserve for their hard work. Unfortunately, I had to “wait it out” and silently work harder than the men around me. This has paid off for me, and now I can help women get the credit they rightly deserve.

Do you see the industry changing and becoming less male-dominated?
Jacobsen: Yes, the industry is definitely becoming less male-dominated. In the current climate, with the birth of the #metoo movement and specifically in our industry with the birth of Diet Madison Avenue (@dietmadisonave), we are seeing a lot more women step up and take on leading roles.

Are you mostly a commercial house? What other segments of the industry do you work in?
Jacobsen: We are primarily a commercial house. However, we are not limited to just broadcast and digital commercial advertising. We have delivered specs for everything from the Godzilla screen in Times Square to :06 spots on Instagram. We have done a handful of music videos and also handle a ton of B2B videos for in-house client meetings, etc., as well as banner ads for conferences and trade shows. We’ve even worked on display ads for airports. Most recently, one of our editors finished a feature film called Public Figure that is being submitted around the film festival circuit.

What types of projects are you working on most often these days?
Jacobsen: The industry is all over the place. The current climate is very messy right now. Our projects are extremely varied. It’s hard to say what we work on most because it seems like there is no more norm. We are working on everything from sizzle pitch videos to spots for the Super Bowl.

What trends have you seen over the last year, and where do you expect to be in a year?
Jacobsen: Over the last year, we have noticed that the work comes from every angle. Our typical client is no longer just the marketing agency. It is also the production company, network, brand, etc. In a year we expect to be doing more production work. Seeing as how budgets are much smaller than they used to be and everyone wants a one-stop shop, we are hoping to stick with our gut and continue expanding our production arm.

Crew Cuts has beefed up its finishing services. Can you talk about that?
Stephanie Norris: We offer a variety of finishing services — from sound design to VO record and mix, compositing to VFX, 2D and 3D motion graphics and color grading. Our fully staffed in-house team loves the visual effects puzzle and enjoys working with clients to help interpret their vision.

Can you name some recent projects and the services you provided?
Norris: We just worked on a new campaign for New Jersey Lottery in collaboration with Yonder Content and PureRed. Brian Neaman directed and edited the spots. In addition to editorial, Crew Cuts also handled all of the finishing, including color, conform, visual effects, graphics, sound design and mix. This was one of those all-hands-on-deck projects. Keeping everything under one roof really helped us to streamline the process.

New Jersey Lottery

Working with Brian to carefully plan the shooting strategy, we filmed a series of plate shots as elements that could later be combined in post to build each scene. We added falling stacks of cash to the reindeer as he walks through the loading dock and incorporated CG inflatable decorations into a warehouse holiday lawn scene. We also dramatically altered the opening and closing exterior warehouse scenes, allowing one shot to work for multiple seasons. Keeping lighting and camera positions consistent was mission-critical, and having our VFX supervisor, Dulany Foster, on set saved us hours of work down the line.

For the New Jersey Lottery holiday spots, the Crew Cuts CG team, led by our creative director Ben McNamara, created a 3D inflatable display of lottery tickets. This was something that proved too costly and time-consuming to manufacture and shoot practically. After the initial R&D, our team created a few different CG inflatable simulations prior to the shoot, and Dulany was able to mock them up live while on set. Creating the simulations was crucial for giving the art department reference while building the set, and it also helped when shooting the plates needed to composite the scene together.

Ben and his team focused on the physics of the inflation, while also making sure the fabric simulations, textures and lighting blended seamlessly into the scene — it was important that everything felt realistic. In addition to the inflatables, our VFX team turned the opening and closing sunny, summer shots of the warehouse into a December winter wonderland thanks to heavy compositing, 3D set extension and snow simulations.

New Jersey Lottery

Any other projects you’d like to talk about?
Jacobsen: We are currently working on a project here that we are handling soup to nuts from production through finishing. It was a fun challenge to take on. The spot contains a hand model on a greenscreen showing the audience how to use a new product. The shoot itself took place here at Crew Cuts. We turned our common area into a stage for the day and were able to do so without interrupting any of the other employees and projects going on.

We are now working on editorial and finishing. The edit is coming along nicely. What really drives the piece here is the graphic icons. Our team is having a lot of fun designing these elements and implementing them into the spot. We are so proud because we budgeted wisely to make sure to accommodate all of the needs of the project so that we could handle everything and still turn a profit. It was so much fun to work in a different setting for the day and has been a very successful project so far. Clients are happy and so are we.

Main Image: (L-R) Stephanie Norris and Nancy Jacobsen

Company 3 to open Hollywood studio, adds Roma colorist Steve Scott

Company 3 has added Steve Scott as EVP/senior finishing artist. His long list of credits includes Alfonso Cuarón’s Oscar-nominated Roma and Gravity; 19 Marvel features, including films in the Avengers, Iron Man and Guardians of the Galaxy franchises; and several Academy Award-winning films, including The Jungle Book, Birdman or (The Unexpected Virtue of Ignorance) and The Revenant (the latter two took Oscars for director Alejandro Iñárritu and cinematographer Emmanuel Lubezki).

Roma

The addition of Scott comes at a time when Company 3 is completing work on a new location at 950 Lillian Way in Hollywood. The space represents the first phase of a planned, much larger footprint in that area of Los Angeles, and it will enable the company to significantly expand its capacity while providing the level of artistry and personalized service the industry expects from Company 3. It will also enable the company to serve more East Side and Valley-based clients.

“Steve is someone I’ve always wanted to work with and I am beyond thrilled that he has agreed to work with us at Company 3,” says CEO Stefan Sonnenfeld. “As we continue the process of re-imagining the entire concept of what ‘post production’ means creatively and technically, it makes perfect sense to welcome a leading innovator and brilliant artist to our team.”

Sonnenfeld and Scott will oversee every facet of this new boutique-style space to ensure it offers the same flexible experience clients have come to expect when working at Company 3. Scott, a devoted student of art and architecture, with extensive professional experience as a painter and architectural illustrator, says, “The opportunity to help design a new cutting-edge facility in my Hollywood hometown was too great to pass up.”

Scott oversees a team of additional artists to offer filmmakers the significantly increased ability to augment and refine imagery as part of the finishing process.

“The industry is experiencing a renaissance of content,” says Sonnenfeld. “The old models of feature film vs. television, long- vs. short-form are changing rapidly. Workflows and delivery methods are undergoing revolutionary changes with more content, and innovative content, coming from a whole array of new sources. It’s a very exciting and challenging time and I think these major additions to our roster and infrastructure will go a long way towards our goal of continuing Company 3’s role as a major force in the industry.”

Main Image Credit: 2018 HPA Awards Ceremony/Ryan Miller/Capture Imaging

Efilm’s Natasha Leonnet: Grading Spider-Man: Into the Spider-Verse

By Randi Altman

Sony Pictures’ Spider-Man: Into the Spider-Verse is not your typical Spider-Man film… in so many ways. The most obvious is the movie’s look, which was designed to make the viewer feel they are walking inside a comic book. This tale, which blends CGI with 2D hand-drawn animation and comic book textures, focuses on a Brooklyn teen who is bitten by a radioactive spider on the subway and soon develops special powers.

Natasha Leonnet

When he meets Peter Parker, he realizes he’s not alone in the Spider-Verse. It was co-directed by Peter Ramsey, Robert Persichetti Jr. and Rodney Rothman and produced by Phil Lord and Chris Miller, the pair behind 21 Jump Street and The Lego Movie.

Efilm senior colorist Natasha Leonnet provided the color finish for the film, which was nominated for an Oscar in the Best Animated Feature category. We reached out to find out more.

How early were you brought on the film?
I had worked on Angry Birds with visual effects supervisor Danny Dimian, which is how I was brought onto the film. It was a few months before we started color correction. Also, there was no LUT for the film. They used the ACES workflow, developed by The Academy and Efilm’s VP of technology, Joachim “JZ” Zell.

Can you talk about the kind of look they were after and what it took to achieve that look?
They wanted to achieve a comic book look. You look at the edges of characters or objects in comic books and you actually see aspects of the color printing from the early days of comic book printing — the CMYK inks wouldn’t all line up exactly, which creates a layered look, along with the comic book dots and expression lines on faces, as if someone were drawing a comic book.

For example, if someone gets hurt you put actual slashes on their face. For me it was a huge education about the comic book art form. Justin Thompson, the art director, in particular is so knowledgeable about the history of comic books. I was so inspired I just bought my first comic book. Also, with the overall look, the light is painting color everywhere the way it does in life.

You worked closely with Justin, VFX supervisor Danny Dimian and art director Dean Gordon. What was that process like?
They were incredible. It was usually a group of us working together during the color sessions — a real exercise in collaboration. They were all so open to each other’s opinions and constantly discussing every change in order to make certain that the change best served the film. There was no idea that was more important than another idea. Everyone listened to each other’s ideas.

Had you worked on an animated film previously? What are the challenges and benefits of working with animation?
I’ve been lucky enough to do all of Blue Sky Studios’ color finishes so far, except for the first Ice Age. One of the special aspects of working on animated films is that you’re often working with people who are fine-art painters. As a result, they bring in a different background and way of analyzing the images. That’s really special. They often focus on the interplay of different hues.

In the case of Spider-Man: Into the Spider-Verse, they also wanted to bring a certain naturalism to the color experience. With this particular film, they made very bold choices in their use of color finishing. They used an aspect of color correctors that is used to shift all of the hues and colors, something usually reserved for music videos, and they completely embraced it. They were basically using color finishing to augment the story and refine their hues, especially time of day and the progression of the day or night. They used it as their extra lighting step.

Can you talk about your typical process? Did that differ because of the animated content?
My process actually does not differ when I’m color finishing animated content. Continuity is always at the forefront, even in animation. I use the color corrector as a creative tool on every project.

How would you describe the look of the film?
The film embodies the vivid and magical colors that I always observed in childhood but never saw reflected on the screen. The film is very color intense. It’s as if you’re stepping inside a comic book illustrator’s mind. It’s a mind-meld with how they’re imagining things.

What system did you use for color and why?
I used Resolve on this project, as it was the system that the clients were most familiar with.

Any favorite parts of the process?
My favorite part is from start to finish. It was all magical on this film.

What was your path to being a colorist?
My parents loved going to the cinema. They didn’t believe in babysitters, so they took me to everything. They were big fans of the French new wave movement and films that offered unconventional ways of depicting the human experience. As a result, I got to see some pretty unusual films. I got to see how passionate my parents were about these films and their stories and unusual way of telling them, and it sparked something in me. I think I can give my parents full credit for my career.

I studied non-narrative experimental filmmaking in college even though ultimately my real passion was narrative film. I started as a runner in the Czech Republic, which is where I’d made my thesis film for my BA degree. From there I worked my way up and met a colorist (Biggi Klier) who really inspired me. I was hooked and lucky enough to study with her and another mentor of mine in Munich, Germany.

How do you prefer a director and DP describe a look?
Every single person I’ve worked with works differently, and that’s what makes it so fun and exciting, but also challenging. Every person communicates about color differently, and our vocabulary for color is so limited; therein lies the challenge.

Where do you find inspiration?
From both the natural world and the world of films. I live in a place that faces east, and I get up every morning to watch the sunrise and the color palette is always different. It’s beautiful and inspiring. The winter palettes in particular are gorgeous, with reds and oranges that don’t exist in summer sunrises.

VFX studio Electric Theatre Collective adds three to London team

London visual effects studio Electric Theatre Collective has added three to its production team: Elle Lockhart, Polly Durrance and Antonia Vlasto.

Lockhart brings with her extensive CG experience, joining from Touch Surgery where she ran the Johnson & Johnson account. Prior to that she worked at Analog as a VFX producer where she delivered three global campaigns for Nike. At Electric, she will serve as producer on Martini and Toyota.

Vlasto joins Electric to work on clients such as Mercedes, Tourism Ireland and Tui. She joins from 750MPH where, over a four-year period, she served as producer on Nike, Great Western Railway, VW and Amazon, to name but a few.

At Electric, Polly Durrance will serve as producer on H&M, TK Maxx and Carphone Warehouse. She joins from Unit, where she helped launch its in-house Design Collective and worked with clients such as Lush, Pepsi and Thatchers Cider. Prior to Unit, Durrance was at Big Buoy, where she produced work for Jaguar Land Rover, giffgaff and Red Bull.

Recent projects at the studio, which also has an office in Santa Monica, California, include Tourism Ireland Capture Your Heart and Honda Palindrome.

Main Image: (L-R) Elle Lockhart, Antonia Vlasto and Polly Durrance.

Color grading The Favourite

Yorgos Lanthimos’ historical comedy, The Favourite, has become an awards show darling. In addition to winning 10 British Independent Film Awards, it also dominated the BAFTA nominations with 12 nods, including Best Film, Best Director, Best Editing, and Best Cinematography for Robbie Ryan, BSC, ISC, who scored an ASC Award nom as well.

Final picture post on the black comedy was completed by Goldcrest Post in London using DaVinci Resolve Studio. The 20th Century Fox film’s DI was overseen by Goldcrest producer Jonathan Collard, with senior colorist Rob Pizzey providing the grade. He was assisted by Maria Chamberlain, while Russell White completed the online edit.

The film stars Olivia Colman (who won a Golden Globe for her role), Emma Stone and Rachel Weisz.

Lensed by Ryan, The Favourite was shot on a mixture of Kodak 500T 5219 and 200T 5207 film stocks, with Timothy Jones of Digital Film Bureau scanning the 35mm negative for the grade at Goldcrest. To capture the full dynamic range of modern film stock, the 2K ARRI scanner was set to a 2.5 density range, with scanning of the drama beginning once the edit was locked.

According to colorist Pizzey, once scanned almost everything seen on-screen exposure-wise is what came straight out of the camera. “Robbie did such an amazing job; there were only a handful of shots where I had to tweak the film grain back a little bit.

“In some respects, grading on film can be harder,” he continues. “It does take a lot more balancing because of variations in the scanning process and film stocks. Conversely, with digital capture you have a pretty good balance to begin with, if you start with the CDL values from the digital rushes process.”

Rob Pizzey

He says the way the director worked was very interesting. “Basically, we kept the images very natural and didn’t rely on too many secondaries. Instead, we focused on manipulating the palette using primary color correction to achieve an organic, naturalistic look. It sounds easy, but in truth, it is quite difficult. We started early testing on some of the dailies, a mix of interior and exterior shots, both day and night, to get an idea of where the director and DP wanted to go. We then pushed on with that into the DI.”

DP Ryan wasn’t able to attend the grade, so it was just Pizzey and the director.

“There was a lot of colorization going on in the bottom end of the picture, whether it’s in the shadows and deep blacks or playing with the highlights to create something that looked interesting,” says Pizzey. “We were ultimately still creating a look, it is just a lot more subtle, which is where the challenge lies.”

Most of the film was shot relying on available light only. “There was hardly any artificial lighting used at all during principal photography,” he reports. “The candlelit scenes at night relied solely on the candles themselves and, as you can imagine, there were a lot of candles. The blacks in those scenes are really inky.”

The night scenes were especially tough to complete, with Pizzey relying on Resolve’s primary grading toolset. “Those scenes are very rich and very warm, so we automatically backed off the warmth and tried to dial it down by adding some desaturation. However, it just didn’t look right,” he explains. “We then stripped the grade back and tried to stay as close to what had come out of the camera as we could, with only a few subtle tweaks here and there.”

Looking to embrace the contrast of the film stock, everything about the grade was all very natural and subtle. “For the first couple of weeks everything was about the primaries, and it was only toward the end of the DI that we began to use window shapes and keys on shots that we couldn’t otherwise get to work using primaries alone.

“There was one scene in particular where Yorgos and Robbie had to go back and shoot it five weeks later. Coming into the grade, there were a number of notable differences between the trees, moving from winter into spring, which meant the trees were beginning to bud.”

The Favourite is in theaters now.

Roy H. Wagner, ASC, to speak at first Blackmagic Collective event

By Randi Altman

The newly formed Blackmagic Collective, a group founded by filmmakers for filmmakers, is holding the first of its free monthly meetings on Saturday, January 12 at the Gnomon School of Visual Effects in Hollywood.

The group, headed up by executive director Brett Harrison, says they are dedicated to sharing info on the art of filmmaking as well as education. “With Blackmagic Design’s support, the group will feature ‘TED Talk’-like presentations from media experts, panels covering post and production topics and film festivals, as well as networking opportunities.”

In addition, Blackmagic Design is offering free Resolve training attached to the meetings. While Blackmagic is a sponsor, this is not a Blackmagic-run group. According to Harrison, “The Blackmagic Collective is an independent group created to support the art of filmmaking as a whole. We are also a 501(c)(3) charity, with plans to find ways to give back to the community.” Membership is free, with no application process; members can simply sign up on the site. Despite the name, Harrison insists that the group, while inspired by Blackmagic’s filmmaking tools, is focused on filmmaking as a whole. “You do not need to use BMD tools to be a member,” adds Harrison.

On creating the Collective, Harrison says, “After producing the Blackmagic Design Conference + Expo in LA early in 2018, I realized that a monthly group in Hollywood for filmmakers to learn from other professionals and share with and inspire each other would be well-received and vital, particularly for Blackmagic users in the industry. BMD allows for an end-to-end workflow that encompasses the spectrum of production and post, with endless topics for our group to focus on, though we will be speaking on a range of topics and not strictly BMD gear and software.”

At their first meeting, esteemed film and television cinematographer Roy H. Wagner, ASC, will be interviewed by Christian Sebaldt, ASC, with a focus on Roy’s new feature film Stand!. There will be a panel discussing the art and experiences of young colorists from Efilm, Apache and Company 3. Also, the Blackmagic Collective will be announcing a film festival that will start in April and end in November with a final competition. Filmmakers can submit films each month. Selected films will be streamed on the group website, with a select few shown at the monthly meetings starting in April. Members will have the opportunity to vote for the best each month, with a final competition for the top five films at the November event.

In case you were wondering, and we know you are, the current plan for the film festival is this:
“Our film festival submissions must use BMD technology to be eligible to enter the contest. That may include cameras, software or both, depending on the category,” explains Harrison.

The Collective will also be hosting job fairs at every other meeting.

“We are thrilled to be supporting the Blackmagic Collective,” says Blackmagic president Dan May. “Our company shares a passion with filmmakers by creating hardware and software that make their craft easier and more cost effective. We feel the Collective will provide the added resource of bringing a focus to the art form of filmmaking, as well as helping share new ideas and technology among creatives at all skill levels, from student to professional.”

You can sign up for the Resolve editing class or the event (or both) at the website.

Behind the Title: DigitalFilm Tree colorist Dan Judy

This color vet finds inspiration for his work in everyday sights, such as sunsets, views of the city and even music.

NAME: Colorist Dan Judy

COMPANY: DigitalFilm Tree (DFT)

CAN YOU DESCRIBE YOUR COMPANY?
DFT provides cloud post services and software that evolve file-based workflows, simplify the creative process, and dramatically reduce production cost.

AS A COLORIST, WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
How creative the process is — it’s an amazing collaborative effort between the production team and color. Our attention to detail, both broad and minute, is almost surgical. It’s micro and macro. Oh, and having the right snacks available is absolutely critical!

Dan Judy

WHAT SYSTEM DO YOU WORK ON?
Blackmagic’s Resolve.

ARE YOU SOMETIMES ASKED TO DO MORE THAN JUST COLOR ON PROJECTS?
Nearly every project will have requests that are specific and non-color related. I was once asked to dry off an actress who was perspiring too much. At that time I didn’t have the towel function on my color corrector.

We are asked to help out with beauty fixes, add lens flares, light matches, remove footprints in sand . . . you get the idea.

WHAT’S YOUR FAVORITE PART OF THE JOB?
It is the satisfaction of the finished project, knowing that I got to contribute to the end result. It’s the confidence at the end of that process and putting the piece out there for people to enjoy.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
My first love was athletics, especially football. Would I have been a player? I had my shot and, well, I’m here. I’m sure my path would have continued in that direction.

WHY DID YOU CHOOSE THIS PROFESSION?
I had no clue this position was even a thing. I got an internship at a post facility through my master’s program in Florida. They offered me a position at the end of the internship and my career began. A lot of bumps and bruises later and, well, I feel blessed to be where this path has led me.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
The 100, Last Man on Earth, the Roseanne relaunch, Falling Skies and a few years ago, The Walking Dead.

The 100

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
I would say with a wink, the next one. I know it’s a cliché, but it’s like saying which of your children do you like better? I have been extraordinarily lucky that all my shows have given me a great deal of freedom to be really creative.

WHERE DO YOU FIND INSPIRATION?
Honestly, from life. Watching amazing sunsets, experiencing great expanses of nature. I also like having uplifting music on while I work.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
I would say electricity is a big one, big smile here. Professionally? A bitchin’ hero monitor, a great calibrated scope and Resolve.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Hanging with my family! They ground me every day and keep me honest. Their love is what keeps me wanting tomorrow to happen.

Catching up with Aquaman director James Wan

By Iain Blair

Director James Wan has become one of the biggest names in Hollywood thanks to the $1.5 billion-grossing Fast & Furious 7, as well as the Saw, Conjuring and Insidious films — three of the most successful horror franchises of the last decade.

Now the Malaysian-born, Australian-raised Wan, who also writes and produces, has taken on the challenge of bringing Aquaman and Atlantis to life. The origin story of half-surface dweller, half-Atlantean Arthur Curry stars Jason Momoa in the title role. Amber Heard plays Mera, a fierce warrior and Aquaman’s ally throughout his journey.

James Wan and Iain Blair

Additional cast includes Willem Dafoe as Vulko, counsel to the Atlantean throne; Patrick Wilson as Orm, the present King of Atlantis; Dolph Lundgren as Nereus, King of the Atlantean tribe Xebel; Yahya Abdul-Mateen II as the revenge-seeking Manta; and Nicole Kidman as Arthur’s mom, Atlanna.

Wan’s team behind the scenes included such collaborators as Oscar-nominated director of photography Don Burgess (Forrest Gump), his five-time editor Kirk Morri (The Conjuring), production designer Bill Brzeski (Iron Man 3), visual effects supervisor Kelvin McIlwain (Furious 7) and composer Rupert Gregson-Williams (Wonder Woman).

I spoke with the director about making the film, dealing with all the effects, and his workflow.

Aquaman is definitely not your usual superhero. What was the appeal of doing it? 
I didn’t grow up with Aquaman, but I grew up with other comic books, and I always was well aware of him as he’s iconic. A big part of the appeal for me was he’d never really been done before — not on the big screen and not really on TV. He’s never had the spotlight before. The other big clincher was this gave me the opportunity to do a world-creation film, to build a unique world we’ve never seen before. I loved the idea of creating this big fantasy world underwater.

What sort of film did you set out to make?
Something that was really faithful and respectful to the source material, as I loved the world of the comic book once I dove in. I realized how amazing this world is and how interesting Aquaman is. He’s bi-racial, half-Atlantean, half-human, and he feels he doesn’t really fit in anywhere at the start of the film. But by the end, he realizes he’s the best of both worlds and he embraces that. I loved that. I also loved the fact it takes place in the ocean so I could bring in issues like the environment and how we treat the sea, so I felt it had a lot of very cool things going for it — quite apart from all the great visuals I could picture.

Obviously, you never got the Jim Cameron post-Titanic memo — never, ever shoot in water.
(Laughs) I know, but to do this we unfortunately had to get really wet as over 2/3rds of the film is set underwater. The crazy irony of all this is when people are underwater they don’t look wet. It’s only when you come out of the sea or pool that you’re glossy and dripping.

We did a lot of R&D early on, and decided that shooting underwater looking wet wasn’t the right look anyway, plus they’re superhuman and are able to move in water really fast, like fish, so we adopted the dry-for-wet technique. We used a lot of special rigs for the actors, along with bluescreen, and then combined all that with a ton of VFX for the hair and costumes. Hair is always a big problem underwater, as like clothing it behaves very differently, so we had to do a huge amount of work in post in those areas.

How early on did you start integrating post and all the VFX?
It’s that kind of movie where you have to start post and all the VFX almost before you start production. We did so much prep, just designing all the worlds and figuring out how they’d look, and how the actors would interact with them. We hired an army of very talented concept artists, and I worked very closely with my production designer Bill Brzeski, my DP Don Burgess and my visual effects supervisor Kelvin McIlwain. We went to work on creating the whole look and trying to figure out what we could shoot practically with the actors and stunt guys and what had to be done with VFX. And the VFX were crucial in dealing with the actors, too. If a body didn’t quite look right, they’d just replace them completely, and the only thing we’d keep was the face.

It almost sounds like making an animated film.
You’re right, as over 90% of it was VFX. I joke about it being an animated movie, but it’s not really a joke. It’s no different from, say, a Pixar movie.

Did you do a lot of previs?
A lot, with people like Third Floor, Day For Nite, Halon, Proof and others. We did a lot of storyboards too, as they are quicker if you want to change a camera angle, or whatever, on the fly. Then I’d hand them off to the previs guys and they’d build on those.

What were the main technical challenges in pulling it all together on the shoot?
We shot most of it Down Under, near Brisbane. We used all nine of Village Roadshow Studios’ soundstages, including the new Stage 9, as we had over 50 sets, including the Atlantis Throne Room and Coliseum. The hardest thing in terms of shooting it was just putting all the actors in the rigs for the dry-for-wet sequences; they’re very cumbersome and awkward, and the actors are also in these really outrageous costumes, and it can be quite painful at times for them. So you can’t have them up there too long. That was hard. Then we used a lot of newish technology, like virtual production, for scenes where the actors are, say, riding creatures underwater.

We’d have it hooked up to the cameras so you could frame a shot and actually see the whole environment and the creature the actor is supposed to be on — even though it’s just the actors and bluescreen and the creature is not there. And I could show the actors — look, you’re actually riding a giant shark — and also tell the camera operator to pan left or right. So it was invaluable in letting me adjust performance and camera setups as we shot, and all the actors got an idea of what they were doing and how the VFX would be added later in post. Designing the film was so much fun, but executing it was a pain.

The film was edited by Kirk Morri, who cut Furious 7, and worked with you on the Insidious and The Conjuring films. How did that work?
He wasn’t on set but he’d visit now and again, especially when we were shooting something crazy and it would be cool to actually see it. Then we’d send dailies and he’d start assembling, as we had so much bluescreen and VFX stuff to deal with. I’d hop in for an hour or so at the end of each day’s shoot to go over things as I’m very hands on — so much so that I can drive editors crazy, but Kirk puts up with all that.

I like to get a pretty solid cut from the start. I don’t do rough assemblies. I like to jump straight into the real cut, and that was so important on this because every shot is a VFX shot. So the sooner you can lock the shot, the better, and then the VFX teams can start their work. If you keep changing the cut, then you’ll never get your VFX shots done in time. So we’d put the scene together, then pass it to previs, so you don’t just have actors floating in a bluescreen, but they’re in Atlantis or wherever.

Where did you do the post?
We did most of it back in LA on the Warner lot.

Do you like the post process?
I absolutely love it, and it’s very important to my filmmaking style. For a start, I can never give up editing and tweaking all the VFX shots. They have to pull it away from me, and I’d say that my love of all the elements of the post process — editing, sound design, VFX, music — comes from my career in suspense movies. Getting all the pieces of post right is so crucial to the end result and success of any film. This post was creatively so much fun, but it was long and hard and exhausting.

James Wan

All the VFX must have been a huge challenge.
(Laughs) Yes, as there’s over 2,500 VFX shots and we had everyone working on it — ILM, Scanline, Base, Method, MPC, Weta, Rodeo, Digital Domain, Luma — anyone who had a computer! Every shot had some VFX, even the bar scene where Arthur’s with his dad. That was a set, but the environment outside the window was all VFX.

What was the hardest VFX sequence to do?
The answer is, the whole movie. The trench sequence was hard, but Scanline did a great job. Anything underwater was tough, and then the big final battle was super-difficult, and ILM did all that.

Did the film turn out the way you hoped?
For the most part, but like most directors, I’m never fully satisfied.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Filmic adds Log V2 to FilmicPro

Filmic has added Log V2 within FilmicPro, its mobile filmmaking tool. Log V2 for FilmicPro offers up to 2.5 stops of additional dynamic range for mobile devices, enabling the newest iPhone XR, XS and XS Max models to exceed 12 stops of total dynamic range at base ISO. Filmic says these results rival those of the Blackmagic Pocket Cinema Camera and the Panasonic Lumix GH5s.
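
Filmic hasn't published the math behind Log V2 in this announcement, but the mechanism is common to all log curves: each doubling of light (a stop) gets an equal slice of the recorded signal, so highlights that a standard display-referred encode would clip still land inside the file. A minimal sketch in Python, with illustrative mid-grey and stop values that are not Filmic's:

import numpy as np

def generic_log_encode(linear, mid_grey=0.18, stops_below=6.0, stops_above=6.0):
    # Map scene-linear light to a 0-1 "log" signal: every stop above or
    # below mid grey gets an equal share of the output range.
    linear = np.maximum(linear, 1e-6)                  # avoid log(0)
    stops_from_grey = np.log2(linear / mid_grey)
    signal = (stops_from_grey + stops_below) / (stops_below + stops_above)
    return np.clip(signal, 0.0, 1.0)

# Mid grey encodes to 0.5; a highlight 5 stops above it (0.18 * 2**5 = 5.76),
# which a straight Rec.709-style encode would clip to 1.0, stays below 1.0:
print(generic_log_encode(np.array([0.18, 5.76])))      # approx. [0.5, 0.917]

The curve itself doesn't create dynamic range; it just keeps what the sensor captures from being thrown away before it reaches an 8- or 10-bit recording.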

Additionally, to further complement the new Log V2 curve, Filmic has increased the maximum target bit rate for 4K recording to 130Mbps on the latest-generation smartphones, delivering a higher-quality recording experience for mobile filmmakers.

Filmic has also released a new professional LUT pack for use with its Cinematographer Kit, which gives filmmakers the ability to color grade data-rich content for cinematic results. The deFlat and deLog LUT packs are also offered, free of charge, on the Filmic site.

The new FilmicPro LUT Pack uses the .cube format, which ensures compatibility with Adobe Premiere, Apple Final Cut Pro X, DaVinci Resolve and other industry-standard desktop editing solutions. The Filmic deFlat and deLog LUTs are also pre-bundled with the LumaFusion and VideoLUT apps for iOS. By partnering with leading iOS editing apps like LumaFusion and VideoLUT, Filmic aims to simplify advanced color grading on mobile devices for filmmakers and editors: one click conforms their Log footage to the Rec.709 color space while retaining the additional dynamic range.
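
For readers curious what that one click amounts to under the hood, here is a rough sketch of loading and applying a 3D .cube LUT, assuming the standard Adobe/Iridas text layout. It uses nearest-neighbor lookup to stay short; shipping implementations interpolate (trilinear or tetrahedral), and the filename below is hypothetical:

import numpy as np

def load_cube(path):
    # Parse a 3D .cube LUT: a LUT_3D_SIZE line plus size**3 RGB rows,
    # with red varying fastest in the data block.
    size, rows = None, []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            if line.upper().startswith("LUT_3D_SIZE"):
                size = int(line.split()[1])
            elif line[0].isdigit() or line[0] == "-":
                rows.append([float(v) for v in line.split()[:3]])
    table = np.array(rows).reshape(size, size, size, 3)   # indexed [b][g][r]
    return size, table

def apply_lut(rgb, size, table):
    # Nearest-neighbor lookup of 0-1 RGB values into the cube.
    idx = np.clip(np.rint(rgb * (size - 1)).astype(int), 0, size - 1)
    return table[idx[..., 2], idx[..., 1], idx[..., 0]]

# size, table = load_cube("filmic_delog_rec709.cube")   # hypothetical file
# rec709 = apply_lut(log_frame, size, table)            # log_frame: float RGB, 0-1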

With the release of FilmicPro Log V2 and its new Pro LUT Pack, Filmic is offering a series of new tutorials and test shot clips. All resources for mobile filmmakers are available here.

Filmic Pro Log V2 is available immediately on supported devices as an in-app purchase with the Cinematographer Kit, priced at $14.99. The new Pro LUT Pack is available as a free download from the Filmic website. The FilmicPro app itself is available from the Apple App Store (for iOS devices) and on Google Play (for Android devices) for $14.99.

Review: Picture Instruments’ plugin and app, Color Cone 2

By Brady Betzel

There are a lot of different ways to color correct an image. Typically, colorists will start by adjusting contrast and saturation, followed by adjusting the lift, gamma and gain (a.k.a. shadows, midtones and highlights). For video, waveforms and vectorscopes are great ways of measuring color values and are about the only way to get an objective, scientific read on the colors you are manipulating.
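
As a quick illustration of what those scopes are actually plotting (assuming Rec.709 weighting; this isn't tied to any particular NLE), the waveform tracks luma per pixel while the vectorscope plots the chroma pair, whose angle is hue and whose distance from center is saturation:

import numpy as np

def rec709_scope_values(rgb):
    # Waveform plots luma (Y'); the vectorscope plots (Cb, Cr) as an
    # angle (hue) and a distance from center (saturation).
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    cb = (b - y) / 1.8556
    cr = (r - y) / 1.5748
    return y, cb, cr

y, cb, cr = rec709_scope_values(np.array([1.0, 0.0, 0.0]))   # pure red
# cr hits its maximum (0.5) while cb goes slightly negative, which is
# why saturated red lands where it does on a vectorscope graticule.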

Whether you are in Blackmagic Resolve, Avid Media Composer, Adobe Premiere Pro, Apple FCP X or any other nonlinear editor or color correction app, you usually have similar color correction tools across apps — whether you color based on curves, wheels, sliders or even interactively on screen. So when I heard about the way that Picture Instruments Color Cone 2 color corrects — via a Cone (or really a bicone) — I was immediately intrigued.

Color Cone 2 is a standalone app but also, more importantly, a plugin for Adobe After Effects, Adobe Premiere Pro and FCP X. In this review I am focusing on the Premiere Pro plugin, but keep in mind that the standalone version works on still images and allows you to export 3dl or cube LUTs — a great way for a client to see what type of result you can get quickly from just a still image.

Color Cone 2 is purely a color corrector when used as a plugin for Adobe Premiere. There are no contrast and saturation adjustments, just the ability to select a color and transform it. For instance, you can select a blue sky and adjust the hue, chrominance (saturation) and/or luminance of the resulting color inside of the Color Cone plugin.

To get started you apply the Color Cone 2 plugin to your clip — the plugin is located under Picture Instruments in the Effects tab. Then you click the little square icon in the effect editor panel to open up the Color Cone 2 interface. The interface contains the bicone image representation of the color correction, presets to set up a split-tone color map or a three-point color correct, and the radius slider to adjust the effect your correction has on surrounding color.

Once you are set on a look you can jump out of the Color Cone interface and back into the effect editor inside of Premiere. There you can keyframe all of the parameters you adjusted in the Color Cone interface. This allows for a nice and easy way to transition from no color correction to color correction.

The Cone
The Cone itself is the most interesting part of this plugin. Think of the bicone as a 3D side view of a vectorscope. In other words, if the traditional vectorscope is the top view, the bicone in Color Cone is the side view. Moving your target color from the top cone to the bottom cone adjusts it from light to dark (luminance). The intersection of the cones is where saturation (chrominance) lives: moving outward from the center increases saturation. When a color is selected using the eye dropper you will see a square, which represents the source color selection, a circle representing the target color, and an “x” with a line for reference on the middle section.

Additionally, there is a black circle on the saturation section in the middle that shows the boundaries of how far you can push your chrominance. There is a light circle that represents the radius of how surrounding colors are affected. Each video clip can have effects layered on it, and one instance of the plugin can handle five colors. If you need more than five, you can add another instance of the plugin to the same clip.
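
To make the geometry concrete, here is a small sketch that places an RGB color in a comparable bicone. It uses the standard HLS bicone rather than Picture Instruments' own (undocumented here) color model, plus a rough stand-in for the falloff the radius slider controls:

import colorsys

def bicone_coords(r, g, b):
    # Height along the axis is lightness, angle around it is hue, and the
    # distance out from the axis is chroma, which shrinks toward both tips.
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    chroma = s * (1.0 - abs(2.0 * l - 1.0))
    return h * 360.0, chroma, l              # hue angle, radius, height

def radius_weight(distance, radius):
    # 1.0 at the picked color, fading to 0.0 at the radius boundary:
    # roughly what the plugin's radius slider does to neighboring colors.
    return max(0.0, 1.0 - distance / radius) if radius > 0 else 0.0

print(bicone_coords(0.2, 0.4, 0.8))          # a blue: hue ~220 deg, mid chroma and lightness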

If you are looking to export 3dl and Cube LUTs of your work you will need to use the standalone Color Cone 2 app. The one caveat to using the standalone app is that you can only apply color to still images. Once you do that you can export the LUT to be used in any modern NLE/color correction app.

Summing Up
To be honest, working in Color Cone 2 was a little weird for me. It’s not your usual color correction workflow, so I would need to sit with the plugin for a while to get used to its setup. That being said, it has some interesting components that I wish other color correction apps would use, such as the Cone view. The bicone is a phenomenal way to visualize color correction in realtime.

In my opinion, if Picture Instruments would sell just the Cone as a color measurement tool to work in conjunction with Lumetri, they would have another solid tool. Color Cone 2 has a unique and interesting way to color correct in Premiere, acting as an advanced secondary color correction tool alongside the Lumetri color tools.

The Color Cone 2 standalone app and plugin cost $139 when purchased together, or $88 individually. In my opinion, video people should probably just stick to the plugin version. Check out Picture Instruments’ website for more info on Color Cone 2 as well as their other products. And check them out on Twitter @Pic_instruments.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Full-service creative agency Carousel opens in NYC

Carousel, a new creative agency helmed by Pete Kasko and Bernadette Quinn, has opened its doors in New York City. Billing itself as “a collaborative collective of creative talent,” Carousel is positioned to handle projects from television series to ad campaigns for brands, media companies and advertising agencies.

Clients such as PepsiCo’s Pepsi, Quaker and Lays brands; Victoria’s Secret; Interscope Records; A&E Network and The Skimm have all worked with the company.

Designed to provide full 360 capabilities, Carousel allows its brand partners to partake of all its services or pick and choose specific offerings including strategy, creative development, brand development, production, editorial, VFX/GFX, color, music and mix. Along with its client relationships, Carousel has also been the post production partner for agencies such as McGarryBowen, McCann, Publicis and Virtue.

“The industry is shifting in how the work is getting done. Everyone has to be faster and more adaptable to change without sacrificing the things that matter,” says Quinn. “Our goal is to combine brilliant, high-caliber people, seasoned in all aspects of the business, under one roof together with a shared vision of how to create better content in a more efficient way.”

Managing director Dee Tagert comments, “The name Carousel describes having a full set of capabilities from ideation to delivery so that agencies or brands can jump on at any point in their process. By having a small but complete agency team that can manage and execute everything from strategy, creative development and brand development to production and post, we can prove more effective and efficient than a traditional agency model.”

Danielle Russo, Dee Tagert, AnaLiza Alba Leen

AnaLiza Alba Leen comes on board Carousel as creative director with 15 years of global agency experience, and executive producer Danielle Russo brings 12 years of agency experience.
Tagert adds, “The industry has been drastically changing over the last few years. As clients’ hunger for content is driving everything at a much faster pace, it was completely logical to us to create a fully integrative company to be able to respond to our clients in a highly productive, successful manner.”

Carousel is currently working on several upcoming projects for clients including Victoria’s Secret, DNTL, Subway, US Army, Tazo Tea and Range Rover.

Main Image: Bernadette Quinn and Pete Kasko

Veteran colorist Walter Volpatto joins Efilm

Walter Volpatto, a colorist with 15 years under his belt, has joined LA’s Efilm. His long list of credits includes Dunkirk, Star Wars: The Last Jedi and, most recently, Amazon Studios’ series Homecoming.

As a colorist, Volpatto’s style gravitates toward an aesthetic of realism, though his projects span genres from drama and action to comedy and documentary, such as the just-released Green Book, directed by Peter Farrelly; Quentin Tarantino’s The Hateful Eight; Independence Day: Resurgence, directed by Roland Emmerich; and Bad Moms, directed by Jon Lucas and Scott Moore.

He joins Efilm from Fotokem, where he started in digital intermediate before progressively shifting toward fully digital workflows while navigating emerging technologies such as HDR. (Watch our interview with him about his work on The Last Jedi.)

Volpatto found his way into color finishing by way of visual effects, a career he initially pursued as an outlet for his passion for photography. He began working as a digital intermediate artist at Cinecitta in Rome in 2002, before relocating to Los Angeles the following year. Since then, he’s continued honing his skillset for both film and digital, while also expanding his knowledge of color science.

While known for his feature film work, Volpatto periodically works in episodic television. Based at Efilm’s Hollywood facility, he will be working in many of Deluxe’s color grading suites, including the newly opened Stage One. He will be working on Blackmagic’s DaVinci Resolve.

Technicolor welcomes colorists Trent Johnson and Andrew Francis

Technicolor in Los Angeles will be beefing up its color department in January with the addition of colorists Andrew Francis and Trent Johnson.

Francis joins Technicolor after spending the last three years building the digital intermediate department of Sixteen19 in New York, with recent credits that include Second Act, Night School, Hereditary and Girls Trip. Francis is a trained fine artist who has established a strong reputation for integrating the bleeding edge of technology in support of the craft of color.

Johnson, a Technicolor alumnus, returns after stints as a digital colorist at MTI, Deluxe and Sony Colorworks. His recent credits include horror hits Slender Man and The Possession of Hannah Grace, as well as comedies Overboard and Ted 2.

Johnson will be using FilmLight and Resolve for his work, while Francis will toggle between Resolve, BaseLight and Lustre, depending on the project.

Francis and Johnson join Technicolor LA’s roster, which includes Pankaj Bajpai, Tony Dustin, Doug Delaney, Jason Fabbro, recent HPA award-winner Maxine Gervais, Michael Hatzer, Roy Vasich, Tim Vincent, Sparkle and others.

Main Image: Trent Johnson and Andrew Francis

Post house Cinematic Media opens in Mexico City, targets film, TV

Mexico City is now home to Cinematic Media, a full-service post production finishing facility focused on television and cinema content. Located on the lot at Estudios GGM, the facility offers dailies, look development, editorial finishing, color grading and other services, and aims to capitalize on entertainment media production in Mexico and throughout Central and South America.

Scot Evans

In its first project, Cinematic Media provided finishing services for the second season of the Netflix series Ingobernable.

CEO Scot Evans brings more than 25 years of post experience and has managed large-scale post production operations in the United States, Mexico and Canada. His recent posts include executive VP at Technicolor PostWorks in New York, managing director of Technicolor in Vancouver and managing director of Moving Picture Company (MPC) in Mexico City.

“We’re excited about the future for entertainment production in Mexico,” says Evans. “Netflix opened the door and now Amazon is in Mexico. We expect film production to also grow. Through its geographic location, strong infrastructure and cinematic history, Mexico is well-positioned to become a strong producer of content for the world market.”

Cinematic Media has been built from the ground up with a workflow modeled after top-tier facilities in Hollywood and geared toward television and cinema finishing. Engineering design was supervised by John Stevens, whose four decades of post experience includes stints at Cinesite, Efilm, The Post Group, Encore Hollywood, MTI Film and, currently, the Foundation.

Resources include a DI theater with DaVinci Resolve, 4K projection and 7.1 surround sound, four color suites supporting 2K, 4K and HDR, multiple editorial finishing suites, and a Colorfront On-Set Dailies system. The facility also offers look development services to assist productions in creating end-to-end color pipelines, as well as quality control and deliverable services for streaming, broadcast and cinema. Plans to add visual effects services are in the works.

“We can handle six or seven series simultaneously,” says Evans. “There is a lot of redundancy built into our pipeline, making it incredibly efficient and virtually eliminating downtime. A lot of facilities in Hollywood would be envious of what we have here.”

Cinematic Media features high-speed connectivity via the private network Sohonet. It will be employed to share media with studios, producers and distributors around the globe securely and efficiently. It will also be used to facilitate remote collaboration with directors, cinematographers, editors, colorists and other production partners.

Evans cites as a further plus Cinematic Media’s location within Estudios GGM, which has six sound stages, production and editorial office space, grip and lighting resources and more. Producers can take projects from concept to the screen from within the confines of the site. “We can literally walk down a flight of stairs to support a project shooting on one of the stages,” he says. “Proximity is important. We expect many productions to locate their offices and editorial teams here.”

Managing director Arturo Sedano will oversee day-to-day operations. He has supervised post for thousands of hours of television and cinema content on behalf of studios and producers from around the globe, including Netflix, Telemundo, Sony Pictures, Viacom, Lionsgate, HBO, TV Azteca, Grupo Imagen and Fox.

Other key staff include senior colorist Ana Montaño, whose experience as a digital colorist spans facilities in Mexico City, Barcelona, London, Dublin and Rome, and producer and post supervisor Cyntia Navarro, previously with Lejana Films and Instituto Mexicano de Cinematografía (IMCINE). Navarro’s credits span episodic television, feature films and documentaries, and include projects for IFC Films, Canal Once, UPI, Discovery Channel, Netflix and Amazon.

Additional staff includes chief technology officer Oliver De Gante, previously with Ollin VFX, where his credits included the hit films Chappie, Her, Tron: Legacy and The Social Network, as well as the Netflix series House of Cards; technical director Gabriel Kerlegand, a workflow specialist and digital imaging technologist with 18 years of experience in cinema and television; and coordinator and senior conform editor Humberto Flores, formerly senior editor at Zenith Adventure Media.

Behind the Title: Picture Shop workflow specialist Alex Martin

NAME: Alex Martin

COMPANY: Picture Shop Post

TITLE: Workflow Specialist

CAN YOU DESCRIBE YOUR COMPANY?
While Picture Shop is two years old, our team has decades of experience. A majority of our employees here know each other through some previous career venture. We are a hand-picked team that meshes really well together.

We’re led by four individuals who live and breathe post production and have for decades: president Bill Romeo, EVP of sales and marketing Robert Glass, EVP of VFX Tom Kendall and EVP and CTO Jay Bodnar.

Our projects range from superhero shows to Netflix and Hulu’s top HDR titles (oh yeah, and zombies), and we’re constantly expanding.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT A WORKFLOW SPECIALIST DOES?
Technology is always advancing, and so are our shows and their workflows. Keeping up to speed with the new gear and new specs is a large majority of what makes up my day-to-day.

You have to be quick on your feet and one step ahead of the industry at all times in order to grasp success. The biggest challenge for me is always having to think outside the box, looking for new and improved ways to make what already works even better. We are often stumbling upon new advancements, constantly producing and testing new ideas and bringing them to fruition.

WHAT SYSTEMS DOES PICTURE SHOP HAVE FOR COLOR?
We’re fortunate enough to have three major color correctors: Digital Vision’s Nucoda, Filmlight’s Baselight and Blackmagic’s DaVinci Resolve. We also use Colorfront’s software, Express Dailies and Transkoder.

For our online systems we use Avid Media Composer, Autodesk Flame and, recently, Resolve.

ARE YOU SOMETIMES ASKED TO DO MORE THAN JUST SET UP PROJECTS/ DESIGN WORKFLOWS?
All the time. One moment, I’m figuring out why the text over picture is more transparent than it should be, and the next, I’m creating LUTs for a new show on-set. My day-to-day job is always about workflow, but my minute-to-minute lies in the fine details. The main focus is to help get the show out the door on time and ensure that our clients keep coming back for more.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Probably that no two days are the same. I have a great mentor, senior systems workflow engineer Todd Korody, who we consider the brains of the building. Working alongside him for the past two years has been the most rewarding. Most of the conversations that we have are about a show’s color pipeline, and how we can get to the final delivery stage seamlessly while keeping in mind that each new show brings a different element to the table.

Whether we’re designing the workflow on a regular HD finish for a network show or evolving the HDR processes for Netflix and Hulu, figuring out the handoff from one platform to the next (dailies to online, online to color, or VFX to color) makes each day unique.

WHAT’S YOUR LEAST FAVORITE?
My least favorite is probably my own tendency to be a perfectionist. I always want to make sure that everything goes according to plan — as most of us do. I’m then reminded of the brilliant team that I am surrounded by, and through that more collaborative effort, the “best way” to fix any issue makes itself known.

It’s amazing to know that I’m surrounded by people that care about our company to the same degree, and we all work together to ensure the best possible success.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I would be a sound engineer for live concerts. What’s better than being behind the controls, mixing for a great band?

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I kind of fell into it. I wanted to go to recording school for music. Not that I couldn’t have, but a four-year university was needed. I ended up finding film schools had classes in mixing for movies. This turned into an editing and VFX emphasis so I could take mixing classes.

One of the classes offered in the area I was studying was color correction. I loved that class, and it opened a very wide door for me to pursue a career in post. I knew by around 17 that I would end up in the entertainment business in some way. My friends and I would cut together videos on Windows Movie Maker. I always enjoyed the art and still do today.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON OR PLAN TO WORK ON?
On the technology side, we’ve been working a lot with Resolve and Baselight in terms of HDR. Also making sure we are familiar with the Dolby Vision toolsets, color management workflows and making sure our pipeline is smooth for everyone.

We have a few projects coming out which I’m excited to be a part of: The Chilling Adventures of Sabrina, Unbelievable and Huge in France, all for Netflix. There is also Future Man for Hulu. There’s a lot more HDR work on the horizon, but these are a few currently underway.

WHAT ARE YOU MOST PROUD OF WORK WISE?
Our HDR pipeline. We’ve developed some great tools and strategies along the way to handle very large camera files, ways we bring media in and out of the color correctors, and tools to help us with final delivery.

WHERE DO YOU FIND INSPIRATION? ART? PHOTOGRAPHY?
Camera tests through to final picture. Before each show starts filming, the DP usually directs a camera test. When they do the camera and lens-package comparisons, I love seeing the subtle differences. Once the show’s colorist has a chance to collaborate with the DP’s vision, the best part is seeing the final colored image through their eyes. In my opinion, this finishing touch is what brings the picture to life.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT?
My phone, my laptop and Resolve… I also have to mention my car.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
I use LinkedIn – I follow all the studios, production companies, software companies, different operators and artists; really anything that keeps me up to speed with the post production world.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I guess I always go back to what brought me to this business in the first place, and that’s music. I play the drums, and that helps me decompress and have a good time.

DP Chat: Polly Morgan, ASC, BSC

Cinematographer Polly Morgan, who became an active member of the ASC in July, had always been fascinated with films, but she got the bug for filmmaking as a teenager growing up in Great Britain, when a film crew shot at her family’s farmhouse.

“I was fixated by the camera and cranes that were being used, and my journey toward becoming a cinematographer began.”

We reached out to Morgan recently to talk about her process and about working on the FX show Legion.

What inspires you artistically? And how do you simultaneously stay on top of advancing technology that serves your vision?
I am inspired by the world around me. As a cinematographer you learn to look at life in a unique way, noticing elements that you might not have been aware of before: reflections, bouncing light, colors, atmosphere and so much more. When I have time off, I love to travel and experience different cultures and environments.

I spend my free time reading various periodicals to stay on top of the latest developments in technology. Various publications, such as the ASC’s magazine, help to not only highlight new tools but also people’s experiences with them. The filmmaking community is united by this exploration, and there are many events where we are able to get together and share our thoughts on a new piece of equipment. I also try to visit different vendors to see demos of new advances in technology.

Has any recent or new technology changed the way you work?
Live on-set grading has given me more control over the final image when I am not available for the final DI. Over the last two years, I have worked more on episodic television, and I am often unable to go and sit with the colorist to do the final grade, as I am working on another project. Live grading enables me to get specific with adjustments on the set, and I feel confident that with good communication, these adjustments will be part of the final look of the project.

How do you go about choosing the right camera and lenses to achieve the right look for a story?
I like to vary my choice of camera and lenses depending on what story I am telling.
When it comes to cameras, resolution is an important factor depending on how the project is going to be broadcast and if there are specific requirements to be met from the distributor, or if we are planning to do any unique framing that might require a crop into the sensor.

Also, ergonomics play a part. Am I doing a handheld show, or mainly one in studio mode? Or are there any specifications that make the camera unique that will be useful for that particular project? For example, I used the Panasonic VariCam when I needed an extremely sensitive sensor for night driving around downtown Los Angeles. Lenses are chosen for contrast and resolution and speed. Also, sometimes size and weight play a part, especially if we are working in tight locations or doing lots of handheld.

What are some best practices, or rules, you try to follow on each job?
Every job is different, but I always try to root my work in naturalism to keep it grounded. I feel like a relatable story can have the most impact on its viewer, so I want to make images that the audience can connect with and be drawn into emotionally. As cinematographers, we want our work to be invisible yet always support and enhance the narrative.

On set, I always ensure a calm and pleasant working environment. We work long and bizarre hours, and the work is demanding, so I always strive to make it an enjoyable and safe experience for everyone.

Explain your ideal collaboration with the director when setting the look of a project.
It is always my aim to get a clear idea of what the director is imagining when they describe a certain approach. As we are all so different, it is really about establishing a language that can be a shorthand on set and help me to deliver exactly what they want. It is invaluable to look at references together, whether that is art, movies, photography or whatever.

As well as the “look,” I feel it is important to talk about pace and rhythm and how we will choose to represent that visually. The ebb and flow of the narrative needs to be photographed, and sometimes directors want to do that in the edit, or sometimes we express it through camera movement and length of shots. Ideally, I will always aim to have a strong collaboration with a director during prep and build a solid relationship before production begins.

How do you typically work with a colorist?
This really varies from project to project, depending if I am available to sit in during the final DI. Ideally, I would work with the colorist from pre-production to establish and build the look of the show. I would take my camera tests to the post house and work on building a LUT together that would be the base look that we work off while shooting.

I like to have an open dialogue with them during the production stage so they are aware and involved in the evolution of the images.

During post, this dialogue continues as VFX work starts to come in and we start to bounce the work between the colorist and the VFX house. Then in the final grade, I would ideally be in the room with both the colorist and the director so we can implement and adjust the look we have established from the start of the show.

Tell us about FX’s Legion. How would you describe the general look of the show?
Legion is a love letter to art. It is inspired by anything from modernist pop art to old Renaissance masters. The material is very cerebral, and there are many mental planes or periods of time to express visually, so it is a very imaginative show. It is a true exploration of color and light and is a very exciting show to be a part of.

How early did you get involved in the production?
I got involved with Legion starting in Season 2. I work alongside Dana Gonzales, ASC, who established the look of the show in Season 1 with creator Noah Hawley. My work began during the production stage, when I worked with various directors both prepping and shooting their individual episodes.

Any challenging scenes that you are particularly proud of how they turned out?
Most of the scenes in Legion take a lot of thought to figure out… contextually as well as practically. In Season 2, Episode 2, a lot of the action takes place out in the desert. After a full day, we still had a night shoot to complete with very little time. Instead of taking time to try to light the whole desert, I used one big soft overhead and then lit the scene with flashlights on the characters’ guns and the headlights of the trucks. I added blue streak filters to create multiple horizontal blue flares from each on-camera source (headlights and flashlights), which provided a very striking lighting approach.

FX’s Legion, Season 2, Episode 2

With the limited hours available, we didn’t have enough time to complete all the coverage we had planned so, instead, we created one very dynamic camera move that started overhead looking down at the trucks and then swooped down as the characters ran out to approach the mysterious object in the scene. We followed the characters in the one move, ending in a wide group shot. With this one master, we only ended up needing a quick reverse POV to complete the scene. The finished product was an inventive and exciting scene that was a product of limitations.

What’s your go-to gear (camera, lens, mount/accessories you can’t live without)?
I don’t really have any go-to gear except a light meter. I vary the equipment I use depending on what story I am telling. LED lights are becoming more and more useful, especially when they are color- and intensity-controllable and battery-operated. When you need just a little more light, these lights are quick to throw in and often save the day!

Industry vets launch hybrid studio, Olio Creative

Colorist Marshall Plante, producer Natalie Westerfield and director/creative director Justin Purser founded hybrid studio Olio Creative, which has opened its doors in Venice, California.

Olio features vintage-style décor and an open floor plan, and the space is adaptable for freelancers, mobile artists and traveling talent, with two color suites and a suite set up to toggle between editorial and Flame work.

Marshall Plante is a well-known colorist who has built his career at shops such as Digital Magic, Riot, Syndicate and, most recently, at Ntropic where he headed up the color department. His commercial credits include Samsung, Audi, Olay, Nike, Honda, Budweiser, and direct-to-brand projects for Apple and Riot Games. Recently, the Nick Jr. Girls in Charge: Girl Power campaign he graded won an Emmy for Outstanding Daytime Promo Announcement Brand Image Campaign, and the Uber campaign he graded, Rolling With the Champion with Lebron James, won a bronze Cannes Lion.

Marshall’s long-time producer, Natalie Westerfield, has over 10 years of experience producing at companies including The Mill and Ntropic. As executive producer, Westerfield will provide oversight to guide all projects that come through Olio’s pipeline.

The third member of the team is director/creative director Justin Purser. As a director, Purser has worked at production companies A Band Apart and Anonymous Content. He was one of the original creators and directors behind Maker Studios (acquired by Walt Disney Corp.) that pioneered the multi-channel YouTube-centric companies of today.

The three partners will bring an element of experimentation and collaboration to the post production field. “The ability to be chameleons within the industry keeps us open to fresh ideas,” says Purser. “Our motto is, ‘Try it. If it doesn’t work, pivot.’ And if we thrive in a new way of working, we’re going to share that with everyone. We want to not only make noise for ourselves, but for others in the same business.”

Senior colorist Nicholas Hasson joins Light Iron’s LA team

Post house Light Iron has added senior colorist Nicholas Hasson to its roster. He will be based in the company’s Los Angeles studio.

Hasson colored the upcoming Tiffany Haddish feature Nobody’s Fool and Season 2 of HBO’s Room 104. Additional past credits include Boo 2! A Madea Halloween, Masterminds, All About Nina and commercial campaigns for Apple, Samsung and Google. He worked most recently at Technicolor, but his long career has included time at ILM, Company 3 and Modern VideoFilm.

“Nicholas has a wealth of experience that makes him a great fit with our team,” says Light Iron GM Peter Cioni. “His background in color, online and VFX ensures success in meeting clients’ creative objectives and enables flexibility in working across both episodic and feature projects.”

Like Light Iron’s other LA-based colorists, led by Ian Vertovec, Hasson is able to support cinematographers working in other regions through virtual DI sessions in Panavision’s network of connected facilities. (Light Iron is a Panavision company.)

Hasson joins Light Iron during a time of high-profile streaming releases including Netflix’s Maniac and Facebook’s Sorry For Your Loss, as well as feature releases garnering awards buzz, such as Can You Ever Forgive Me? and What They Had.

“This is a significant time of growth for Panavision’s post production creative services,” concludes Cioni. “We are thrilled to have Nicholas with us as we enter this next chapter of expansion.”

Digital Domain Shanghai’s Simon Astbury talks color, projects

England-born Simon Astbury’s path to color grading wasn’t a straight one. He earned a degree in music and had vague ambitions about working in A&R. “I started working in this industry briefly in the early ‘90s and pretty much hated it,” he shares.

One day, Astbury went to do sound sync and dialogue editing at a small facility in Twickenham Film Studios, where they had two MkIII Rank Cintel telecines. “It was love at first sight,” he says. “The ‘Heath Robinson’ craziness of these systems, with their very limited color tools in those days, PEC master control (operated with a tweaker) and primaries.

Simon Astbury

“There was no machine control or editing, so no stopping once you’d started. It was a great way to learn the craft, to hone an instinctive reaction to an image that still serves me well today. The green radioactive glow from the tube, the smell of film all went to make grading a much more visceral experience! The early ‘90s was a period of huge change in post. Avid was this new thing that the editors mistrusted, most of them were using Steenbecks at that time.”

Astbury’s path was officially changed and he went on to work on many films, including Shakespeare in Love, Sense and Sensibility and Notting Hill. “I also worked with a bunch of film legends including Roger Pratt, Jack Cardiff, Richard Attenborough, Alan Parker, Franco Zeffirelli… and The Spice Girls!”

Astbury has worked in a wide range of genres, from Oscar-winning films to iconic ad campaigns and pop promos. He has collaborated with people like Jack Cardiff, Roger Pratt, Tony Kaye, Paul WS Anderson and many more. Today, he is the head of color at Digital Domain in Shanghai.

Let’s find out more…

You’ve recently moved to Shanghai. Why the move, and are your clients’ requests or expectations there different than in London?
I felt it was time to leave my Soho comfort zone. I’d always intended to travel with the job, but the right opportunity never came up. Then when the offer to relocate came somewhat out of the blue, I consulted my family and we decided to go for it.

Digital Domain has an incredible body of work and a global presence. It was also an opportunity to develop and grow a grading department worldwide in a company that is primarily focused on VFX.

Managing client expectations is always very important, but in China the client really is king or queen. Making sure that the work remains good and not diluted by overthinking and over-tweaking is sometimes a very delicate negotiation.

How have you gone about building or enhancing the grading department at Digital Domain China?
So far I’ve introduced some enhanced workflows and defined training for the juniors and assistants. I’m also attempting to make remote grading available to any of our other offices around the world. Additionally, I’m promoting increased co-operation between our Shanghai and Beijing offices.

You’ve worked on all sorts of projects, from documentaries to features to commercials. Is there a genre you enjoy grading the most?
If it doesn’t sound trite, I would say that good, well-executed work is the most enjoyable to grade. I love commercials because they afford the opportunity to go into detail and occasionally push things creatively.

I love documentaries because the grade can enhance the story in so many different ways. I love dramas because the story arc and mood can be helped immensely by a good grade. I love movies because in my heart I’m a film nut and the opportunity to have your work in a cinema is an incredible buzz that will never ever get old.

What work are you most proud of?
There are a few things that stand out for me, most recently a grade for the wonderful director Nieto at Stink for Wu Fang Zhai. It was great fun to throw away the rulebook and do some crazy stuff.

There are a bunch of things that I’ve done over the years that I remember fondly, a travelogue for BBC4 called Travels With a Tangerine, which was amazingly well shot on SD DVCAM. Also some beautiful films for Volvo directed by Martin Swift, and some epic stuff for Audi directed by Paul WS Anderson. There is also the amazing multi-screen art installation “Mother’s Day” for artist Smadar Dreyfuss about dispossessed stateless children in Israel.

Working with younger directors like Stella Scott has been a great experience for me. Passing on knowledge and at the same time learning new visual languages helps to keep everything fresh. At the other end of the spectrum is The Human Centipede trilogy — it’s not often that you get to be involved in a cultural phenomenon.

Wu Fang Zhai

Can you describe a recent project and what tools were particularly beneficial?
The Wu Fang Zhai project was shot on greenscreen with matte-painted backgrounds and sometimes with complicated comps. It was really easy to assemble rough comps in my FilmLight Baselight to ensure the grade looked correct. Layer mode composite settings were particularly useful.

Baselight Editions is also a brilliant tool for VFX-heavy jobs. We have a top-secret project on at the moment and the ability to have a renderless workflow between Baselight and Nuke is invaluable.

As a colorist what are your biggest strengths?
I’ve been doing it for a long time and can come up with creative solutions for most eventualities. Sometimes the client wants you to drive the session and come up with all the ideas, sometimes they want you to do as you are told, and sometimes they want it to be a collaboration. I’m comfortable with any of these scenarios, but the client is paying for my eyes and my interpretation, so sometimes you have to be the guide, even when the client has very definite ideas.

You also have to be the arbiter of taste. So on occasion you have to be firm, particularly when bad decisions are being made. I try to separate my ego from the work and create a calm-but-creative atmosphere in the grade suite. Music is hugely important, as well as a fully stocked drink trolley!

The wonderful colorist Bob Festa has said that he asks people what they want their films to say, rather than how they want them to look, and that’s pretty much my approach. I’ve been compared to an airline pilot or cruise ship captain more than once….I’ll take that (he smiles).

You’ve been a colorist for over 20 years and witnessed the time when color correction was processed in film labs. What are your thoughts today about film versus digital?
I worked exclusively in film for about half my career and I love it. It is tactile, it smells great, it feels good in your hands and, of course, many of the most memorable images in cinema were shot on it. The soft detail, the intensity and richness of color, and the roll-off into the whites and blacks are things that digital still finds hard to replicate.

Gucci commercial

Also, the recent resurgence in Europe and the USA of film in shorts, commercials and promos is great to see. However, I find myself thinking about all those things I don’t miss about film, such as weave, cell scratches, grain, wet gate TK and that buttock-clenching moment when the lab manager tells you the reel had broken in the bath and 300 feet of neg had been destroyed. X-ray fogging! Oh my goodness, I have so many film horror stories.

Modern cameras produce amazingly clear images with great color and response to light with far less in the way of insurmountable problems, and I don’t see either as particularly better. Actually, I think decent glass and proper lighting are just as important as what camera or format you shoot on.

What are the biggest challenges you face today as a colorist?
There are a few, but I don’t think they’re specific to colorists. Content is becoming continually more disposable. It’s more important than ever that respect for the craft — not only of color grading but the whole production and post process — becomes central to every production. The proliferation of display devices is also a big subject; making sure that the grade looks good on phones, tablets, laptops and TVs is an issue that will only get more challenging.

Do you have a routine when grading?
Yes, definitely. Although color is incredibly subjective, I personally think that your process shouldn’t be. I strongly believe there’s a right and wrong way of going about a grade. Every colorist has a different process but there are definitely ways that work and ways that don’t.

The longer I do the job, the more important the psychological aspect of it becomes — how your choices in the grade affect the thoughts and emotions of the viewer… what really matters and what doesn’t. I’m always on a quest to distil the essence of a grade. A lot of the content I see now, in my opinion, is over-graded. We have such comprehensive tools now, so you don’t have to throw the kitchen sink at every shot. “Keep it simple” is a mantra I try to impress upon my juniors.

Baselight is your main tool?
I’ve been working on Baselight for just shy of a decade. My favorite thing about Baselight is what I call “redundancy of process,” by which I mean there are multiple ways of doing most grading operations — hue angle not working? Then try Dkey. Dkey no good? Then try RGB key or curves, etc, etc.
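
For readers who haven’t sat behind a grading panel, a hue-angle key simply builds a matte from how close each pixel’s hue sits to a chosen angle. Here is a generic sketch of that idea in Python; it is not Baselight’s implementation, and the Dkey and RGB keyers he mentions are different selection mechanisms entirely:

import numpy as np
import colorsys

def hue_angle_matte(rgb, center_deg, width_deg=30.0, softness_deg=15.0):
    # Matte is 1.0 for pixels whose hue falls inside the window around
    # center_deg, rolling off to 0.0 across the softness band.
    hues = np.array([colorsys.rgb_to_hsv(*px)[0] * 360.0
                     for px in rgb.reshape(-1, 3)]).reshape(rgb.shape[:-1])
    delta = np.abs((hues - center_deg + 180.0) % 360.0 - 180.0)   # shortest angular distance
    return np.clip((width_deg / 2.0 + softness_deg - delta) / softness_deg, 0.0, 1.0)

# frame = ...                                 # float RGB image, shape (H, W, 3), values 0-1
# sky_matte = hue_angle_matte(frame, 210.0)   # isolate the blues for a secondary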

What advice would you give to a junior colorist starting out today?
Be patient, there are no shortcuts, although I think it takes less time nowadays than it did due to the absence of telecines. Be a geek about your industry, cameras, lighting and lenses. Watch movies, ads and everything that’s good. Study art and artists, if only to have common points of reference. Remember that the grading part is only a portion of what makes a good colorist. You’re the host, therapist, barman and ringmaster.

You have to be someone people don’t mind spending 12 hours in a dark room with or they’ll never use you again. With difficult client requests, try to say yes and then work out how you’re going to do it; if you can’t do it, suggest an alternative rather than saying no. Social media, especially Instagram, is a brilliant medium for colorists, but be careful not to post things just for the sake of it.

Main Image Caption: Wu Fang Zhai 

Deluxe opens 4,000-square-foot color grading theater

Deluxe has opened a new color grading theater called Stage One. It is equipped with large Stewart and RealD screens and Barco projectors, as well as advanced color grading, audio and editorial systems. Located in Deluxe Audio Seward at Hollywood’s Glen Glenn Sound building, the 4,000-square-foot space features plush seating and “perfect” black levels.

“I’ve been dreaming of a space like Stage One since I started color finishing,” says senior colorist Skip Kimball, whose recent credits include Deadpool 2 at Efilm. “We’re set up to handle any format and have a fleet of projectors so I can grade on a screen that’s comparable to exhibition; it’s much easier to evaluate the picture and address any issues when you can see it on a 60-foot screen. And the size of Stage One is incredible; it can comfortably accommodate 120 people, so we can handle conform, color and VFX all in one space, with the director and cinematographer for a more streamlined process.”

Deluxe’s Stage One features a RealD Ultimate Screen with a 45’x21’7” maximum image and a Stewart Filmscreen SnoMatte 100 screen with 41’3”x22’4” maximum image, and can accommodate the latest display monitors, allowing production to view content in whatever format is needed throughout production.

The space is also equipped with two Christie Dolby Vision Eclipse laser projectors, capable of providing 108-nit brightness, as well as high frame rate projection and 4K resolution; a Barco DP4K-P reference projector for theatrical grading at 48 nits in 4K resolution; and a Barco DP4K-32B projector for RealD stereo theatrical grading at up to 48 nits in 4K resolution. Available color grading and editorial systems include Blackmagic Resolve, Autodesk Lustre and Flame, and Filmlight Baselight.


Company 3 adds television colorist Jeremy Sawyer 

Company 3 in Santa Monica has beefed up its team of colorists with Jeremy Sawyer (Hulu’s The First, Showtime’s I’m Dying Up Here). He will be working on the studio’s expanding slate of TV projects — they currently have more than 20 series in the facility, including Lost in Space (Netflix), Insecure (HBO) and Jack Ryan (Amazon).

For Sawyer, who has also worked on The Walking Dead (AMC), this move brings him back to Company 3, where he had worked as an assistant and then colorist and where he learned a great deal about his craft from CEO/founder Stefan Sonnenfeld.

He returns to the company following a tenure at Light Iron, and was at MTI before that. Prior to his initial stint at Company 3, Sawyer worked at the now-defunct Syndicate. He started his career at Finish Post in his native Boston.

“We’re very excited to welcome Jeremy back,” Sonnenfeld says. “He is an excellent artist and he has a keen understanding of the unique challenges involved in coloring episodic programming. He’s a perfect addition to our team, especially as demand for top-notch TV post continues to explode.”

Sawyer will continue his work on the third season of Netflix series Easy, for which he’s colored every episode to date.

Encore adds colorist Andrea Chlebak, ups Genevieve Fontaine to director of production

Encore has added colorist Andrea Chlebak to its roster and promoted veteran post producer Genevieve Fontaine to director of production. Chlebak brings a multidisciplinary background in feature films, docu-series and commercials across a range of aesthetics. Fontaine has been a post producer since joining the Encore team in early 2010.

Chlebak’s credits include award-winning indies Mandy and Prospect, Neill Blomkamp features Elysium and Chappie and the animated adaptation Kahlil Gibran’s “The Prophet.” Having worked primarily in the digital landscape, her experience as an artist, still photographer, film technician, editor and compositor is evident in both her work and how she’s able to streamline communication with directors and cinematographers in delivering their vision.

In her new role, Fontaine’s responsibilities shift toward ensuring organized, efficient and future-proof workflows. Fontaine began her career as a telecine and dailies producer at Riot before moving to Encore, where she managed post for up to 11 shows at a time, including Marvel’s The Defenders series for Netflix. She understands all the building blocks necessary to keep a facility running smoothly and has been instrumental in establishing Encore, a Deluxe company, as a leader in advanced formats, helping coordinate 4K, HDR and IMF-based workflows.

Main Image: (L-R) Genevieve Fontaine and Andrea Chlebak.

A Conversation: 3P Studio founder Haley Stibbard

Australia’s 3P Studio is a post house founded and led by artisan Haley Stibbard. The company’s portfolio of work includes commercials for brands such as Subway, Allianz and Isuzu Motor Company as well as iconic shows like Sesame Street. Stibbard’s path to opening her own post house was based on necessity.

After going on maternity leave to have her first child in 2013, she returned to her job at a content studio to find that her role had been made redundant. She was subsequently let go. Needing and wanting to work, she began freelancing as an editor — working seven days a week and never turning down a job. Eventually she realized that she couldn’t keep up with that type of schedule and took her fate into her own hands. She launched 3P Studio, one of Brisbane’s few women-led post facilities.

We reached out to Stibbard to ask about her love of post and her path to 3P Studio.

What made you want to get into post production? School?
I had a strong love of film, which I got from my late dad, Ray. He was a big film buff and would always come home from work when I was a kid with a shopping bag full of $2 movies from the video store and he would watch them. He particularly liked the crime stories and thrillers! So I definitely got my love of film and television from him.

We did not have any film courses at high school in the ‘90s, so the closest I could get was photography. Without a show reel it was hard to get a place at university in the college of art; a portfolio was a requirement and I didn’t have one. I remember I had to talk my way into the film program, and in the end I think they just got sick of me and let me into the course through the back door without a show reel — I can be very persistent when I want to be. I always had enjoyed editing and I was good at it, so in group tasks I was always chosen as the editor and then my love of post came from there.

What was your first job?
My very first job was quite funny, actually. I was working in both a shoe store and a supermarket at the time, and two post positions became available one day, an in-house editor for a big furniture chain and a job as a production assistant for a large VFX company at Movie World on the Gold Coast. Anyone who knows me knows that I would be the worst PA in the world. So, luckily for that company director, I didn’t get the PA job and became the in-house editor for the furniture chain.

I’m glad that I took that job, as it taught me so much — how to work under pressure, how to use an Avid, how to work with deadlines, what a key number was, how to dispatch TVCs to the stations, how to be quick and accurate, and how to take constructive feedback.

I made every mistake known to man, including one weekend when I forgot to remove the 4×3 safe bars from a TVC and my boss saw it on TV. I ended up having to drive to the office, climb the fence that was locked to get into the office and pull it off air. So I’ve learned a lot of things the hard way, but my boss was a very patient and forgiving man, and 18 years later is now a client of mine!

What job did you hold when you went out on maternity leave?
Before I left on maternity leave to have my son Dashiell, I was an editor for a small content company. I have always been a jack-of-all-trades and I took care of everything from offline to online, grading in Resolve, motion graphics in After Effects and general design. I loved my job and I loved the variety that it brought. Doing something different every day was very enjoyable.

After leaving that job, you started freelancing as an editor. What systems did you edit on at the time and what types of projects? How difficult a time was that for you? New baby, working all the time, etc.
I started freelancing when my son was just past seven months old. I had a mortgage and had just come off six months of unpaid maternity leave, so I needed to make a living and I needed to make it quickly. I also had the added pressure of looking after a young child under the age of one who still needed his mother.

So I started contacting advertising agencies and production companies that I thought may be interested in my skill set. I just took every job that I could get my hands on, as I was always worried that every job that I took could potentially be my last for a while. I was lucky that I had an incredibly well-behaved baby! I never said “no” to a job.

As my client base started to grow, my clients would always book me since they knew that I would never say “no” (they know I still don’t say no!). It got to the point where I was working seven days a week. I worked all day when my son was in childcare and all night after he would go to bed. I would take the baby monitor downstairs where I worked out of my husband’s ‘man den.’

As my freelance business grew, I was so lucky that I had the most supportive husband in the world, who was doing everything for me: the washing, the cleaning, the cooking, bath time, as well as holding down his own full-time job as an engineer. I wouldn’t have been able to do what I did for that period of time without his support and encouragement. This time really proved to be a huge stepping stone for 3P Studio.

Do you remember the moment you decided you would start your own business?
There wasn’t really a specific moment where I decided to start my own business. It was something that seemed to just naturally come together. The busier I became, the more opportunities came about, like having enough work through the door to build a space and hire staff. I have always been very strategic in regard to the people that I have brought on at 3P, and the timing in which they have come on board.

Can you walk us through that bear of a process?
At the start of 2016, I made the decision to get out of the house. My work life was starting to blend in with my home life and I needed to have that separation. I worked out of a small office for 12 months, and about six months into that it came to a point where I was able to purchase an office space that would become our studio today.

I went to work planning the fit out for the next six months. The studio was an investment in the business and I needed a place that my clients could also bring their clients for approvals, screenings and collaboration on jobs, as well as just generally enjoying the space.

The office space was an empty white shell, but the beauty of coming into a blank canvas was that I was able to create a studio that was specifically built for post production. I was lucky in that I had worked in some of the best post houses in the country as an editor, and this being a custom build I was able to take all the best bits out of all the places I had previously worked and put them into my studio without the restriction of existing walls.

I built up the walls, ripped down the ceilings and was able to design the edit suites and infrastructure all the way down to designing and laying the cable runs myself, which I knew would work for us down the line. Then we saved money and added more equipment to the studio bit by bit. It wasn’t 0 to 100 overnight; I had to work at the business development side of the company a lot, and I spent a lot of long days sitting by myself in those edit suites doing everything. Soon, word of mouth started to circulate and the business started to grow on the back of some nice jobs from my existing loyal clients.

What type of work do you do, and what gear do you call on?
3P Studio is a boutique studio specializing in full-service post production; we also shoot content when required.

Our clients range anywhere from small content videos for the web all the way up to large commercial campaigns and everything in between.

There are currently six of us working full time in the studio, and we handle everything in-house from offline editing to VFX to videography and sound design. We work primarily in the Adobe Creative Suite for offline editing in Premiere, mixed with Maxon Cinema 4D/Autodesk Maya for 3D work, Autodesk Flame and Side Effects Houdini for online compositing and VFX, Blackmagic Resolve for color grading and Pro Tools HD for sound mixing. We use EditShare EFS shared storage nodes for collaborative working and sharing of content between the mix of creative platforms we use.

This year we have invested in a Red Digital Cinema camera as well as an EditShare XStream 200 EFS scale-out single-node server so we can become that one-stop shop for our clients. We have been able to create an amazing creative space for our clients to come and work with us, be it from the bespoke design of our editorial suites or the high level of client service we offer.

How did you build 3P Studios to be different from other studios you’ve worked at?
From a personal perspective, the culture that we have been able to build in the studio is unlike anywhere else I have worked in that we genuinely work as a team and support each other. On the business side, we cater to clients of all sizes and budgets while offering uncompromising services and experience whether they be large or small. Making sure they walk away feeling that they have had great value and exemplary service for their budget means that they will end up being a customer of ours for life. This is the mantra that I have been able to grow the business on.

What is your hiring process like, and how do you protect employees who need to go out on maternity or family leave?
When I interview people to join 3P, attitude and willingness to learn is everything to me — hands down. You can be the most amazing operator on the planet, but if your attitude stinks then I’m really not interested. I’ve been incredibly lucky with the team that I have, and I have met them along the journey at exactly the right times. We have an amazing team culture and as the company grows our success is shared.

I always make it clear that it’s swings and roundabouts and that family is always number one. I am there to support my team if they need me to be, not just inside of work but outside as well and I receive the same support in return. We have flexible working hours, I have team members with young families who, at times, are able to work both in the studio and from home so that they can be there for their kids when they need to be. This flexibility works fine for us. Happy team members make for a happy, productive workplace, and I like to think that 3P is forward thinking in that respect.

Any tips for young women either breaking into the industry or in it that want to start a family but are scared it could cost them their job?
Well, for starters, we have laws in Australia that make it illegal for any woman in this country to be discriminated against for starting a family. 3P also supports the 18 weeks paid maternity leave available to women heading out to start a family. I would love to see more female workers in post production, especially in operator roles. We aren’t just going to be the coffee and tea girls, we are directors, VFX artists, sound designers, editors and cinematographers — the future is female!

Any tips for anyone starting a new business?
Work hard, be nice to people and stay humble because you’re only as good as your last job.

Main Image: Haley Stibbard (second from left) with her team.

Adobe updates Creative Cloud

By Brady Betzel

You know it’s almost fall when pumpkin spice lattes are back and Adobe announces its annual updates. At this year’s IBC, Adobe had a variety of updates to its Creative Cloud line of apps. From more info on its new editing platform Project Rush to the addition of Characterizer to Character Animator — there are a lot of updates, so I’m going to focus on a select few that I think really stand out.

Project Rush

I use Adobe Premiere quite a lot these days; it’s quick and relatively easy to use and will work with pretty much every codec in the universe. In addition, the Dynamic Link between Adobe Premiere Pro and Adobe After Effects is an indispensable feature in my world.

With the 2018 fall updates, Adobe Premiere will be closer to a color tool like Blackmagic’s Resolve with the addition of new hue saturation curves in the Lumetri Color toolset. In Resolve these are some of the most important aspects of the color corrector, and I think the same will be true for Premiere. From Hue vs. Sat, which can help isolate a specific color and desaturate it, to Hue vs. Luma, which can help add or subtract brightness from specific hues and hue ranges — these new tools further Premiere’s venture into true professional color correction. These new curves will also be available inside of After Effects.
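
To make that concrete, here is a rough Python sketch of the idea behind a Hue vs. Sat curve: saturation is scaled by a function of hue, so one color range can be pulled down while the rest of the image is left alone. The hue window and falloff below are made-up example values, and this is only an illustration of the concept, not Adobe’s implementation.

```python
import colorsys

def hue_vs_sat(rgb, hue_center=0.33, width=0.08, gain=0.2):
    """Scale down saturation for pixels whose hue is near hue_center (hues run 0-1)."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    # Hue distance, accounting for the hue wheel wrapping around at 1.0
    d = min(abs(h - hue_center), 1.0 - abs(h - hue_center))
    if d < width:
        # Blend from strong desaturation at the curve's center to no change at its edge
        t = d / width
        s *= gain + (1.0 - gain) * t
    return colorsys.hsv_to_rgb(h, s, v)

print(hue_vs_sat((0.2, 0.8, 0.3)))  # a green pixel loses much of its saturation
print(hue_vs_sat((0.8, 0.2, 0.2)))  # a red pixel passes through untouched
```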

After Effects features many updates, but my favorites are the ability to access depth matte data of 3D elements and the addition of the new JavaScript engine for building expressions.

There is one update that runs across both Premiere and After Effects that seems to be a sleeper update. The improvements to motion graphics templates, if implemented correctly, could be a time and creativity saver for both artists and editors.

AI
Adobe, like many other companies, seems to be diving heavily into the “AI” pool, which is amazing, but… with great power comes great responsibility. I realize others might not feel this way, but sometimes I don’t want all the work done for me. With new features like Auto Lip Sync and Color Match, editors and creators of all kinds should not lose sight of the forest for the trees. I’m not telling people to ignore these features, just asking that they put a few minutes into discovering how the color of a shot was matched. You don’t want to be the editor who says, “Premiere did it,” and has no good solution when something goes wrong.

What Else?
I would love to see Adobe take a stab at digging up the bones of SpeedGrade and integrating that into the Premiere Pro world as a new tab. Call it Lumetri Grade, or whatever? A page with a more traditional colorist layout and clip organization would go a long way.

In the end, there are plenty of other updates to Adobe’s 2018 Creative Cloud apps, and you can read their blog to find out about other updates.

Presenting at IBC vs. NAB

By Mike Nuget

I have been lucky enough to attend NAB a few times over the years, both as an onlooker and as a presenter. In 2004, I went to NAB for the first time as an assistant online editor, mainly just tagging along with my boss. It was awesome! It was very overwhelming and, for the most part, completely over my head.  I loved seeing things demonstrated live by industry leaders. I felt I was finally a part of this crazy industry that I was new to. It was sort of a rite of passage.

Twelve years later, Avid asked me to present on the main stage. Knowing that I would be one of the demo artists that other people would sit down and watch — as I had done just 12 years earlier — was beyond anything I thought I would do back when I first started. The demo showed the Avid and FilmLight collaboration between the Media Composer and the Baselight color system. Two of my favorite systems to work on. (Watch Mike’s presentation here.)

Thanks to my friend and now former co-worker Matt Schneider, who also presented alongside me, I had developed a very good relationship with the Avid developers and some of the people who run the Avid booth at NAB. At the same time, the FilmLight team was quickly being put on my speed dial, and that relationship strengthened as well.

This past NAB, Avid once again asked me to come back and present on the main stage about Avid Symphony Color and FilmLight’s Baselight Editions plug-in for Avid, but this time I would get to represent myself and my new freelance career change — I had just left my job at Technicolor-Postworks in New York a few weeks prior. I thought that since I was now a full-time freelancer this might be the last time I would ever do this kind of thing. That was until this past July, when I got an email from the FilmLight team asking me to present at IBC in Amsterdam. I was ecstatic.

Preparing for IBC was similar enough as far as my demo went, but I was definitely more nervous than I was at NAB. I think there were two reasons. First, I was presenting in front of many different people in an international setting. Even though I am from the melting pot of NYC, it is a different and interesting feeling being surrounded by so many different nationalities all day long, and pretty much being the minority. On a personal note, I loved it. My wife and I love traveling, and to us this was an exciting chance to be around people from other cultures. On a business level, I guess I was a little afraid that my fast-talking New Yorker side would lose some people, and I didn’t want that to happen.

The second thing was that this was the first time that I was presenting strictly for FilmLight and not Avid. I have been an Avid guy for over 15 years. It’s my home, it’s my most comfortable system, and I feel like I know it inside and out. I discovered Baselight in 2012, so to be presenting in front of FilmLight people, who might have been using their systems for much longer, was a little intimidating.

When I walked into the room, they had set up a full-on production, along with spotlights, three cameras, a projector… the nerves rushed once again. The demo was standing room only. Sometimes when you are doing presentations, time seems to fly by, so I am not sure I remember every minute of the 50-minute presentation, but I do remember at one point within the first few minutes my voice actually trembled, which internally I thought was funny, because I do not tend to get nervous. So instead of fighting it, I just said out loud, “Sorry guys, I’m a little nervous here,” then took a deep breath, gathered myself and fell right into my routine.

I spent the rest of the day watching the other FilmLight demos and running around the convention again saying hello to some new vendors and goodbye to those I had already seen, as Sunday was my last day at the show.

That night I got to hang out with the entire FilmLight staff for dinner and some drinks. These guys are hilarious; what a great tight-knit family vibe they have. At one point they even started to label each other: the uncle, the crazy brother, the funny cousin. I can’t thank them enough for being so kind and welcoming. I kind of felt like a part of the family for a few days, and it was tremendously enjoyable and appreciated.

Overall, IBC felt similar enough to NAB, but with a nice international twist. I definitely got lost more since the layout is much more confusing than NAB’s. There are 14 halls!

I will say that the “relaxing areas” at IBC are much better than NAB’s! There is a sandy beach to sit on, a beautiful canal to sit by while having a Heineken (of course) and the food trucks were much, much better.

I do hope I get to come back one day!


Mike Nuget (known to most as just “Nuget”) is a NYC-based colorist and finishing editor. He recently decided to branch out on his own and become a freelancer after 13 years with Technicolor-Postworks. He has honed a skill set across multiple platforms, including FilmLight’s Baselight, Blackmagic’s Resolve, Avid and more. 

Our Virtual Production Roundtable

By Randi Altman

Evolve or die. That old adage, while very dramatic, fits well with the state of our current production workflows. While most productions are now shot digitally, the warmth of film is still in the back of pros’ minds, and camera makers and directors of photography often look for ways to retain that warmth in digital, whether it’s through lighting, vintage lenses, color grading, newer technology or all of the above.

There is also the question of setting looks on-set and how 8K and HDR are affecting the picture and workflows. And let’s not forget shooting for OTT series. There is a lot to cover!

In an effort to get a variety of perspectives, we reached out to a few cinematographers and some camera manufacturers to talk trends and technology. Enjoy!

Claudio Miranda, ASC

Claudio Miranda is a Chilean cinematographer who won an Oscar for his work on Life of Pi. He also worked on The Curious Case of Benjamin Button, the first movie nominated for a cinematography Oscar that was shot entirely on digital. Other films include Oblivion, Tomorrowland and the upcoming Top Gun: Maverick.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
Seems like everyone is shooting large format. Chris Nolan and Quentin Tarantino shot 65mm film for their last projects. New digital cameras such as the Alexa LF and Sony Venice cater to this demand. People seem to like the shallow depth of field of these larger format lenses.

How is HDR affecting the way things are being shot these days? Are productions shooting/monitoring HDR on-set?
For me, too much grain in HDR can be distracting. This must be moderated in the camera acquisition format choice and DI. Panning in a high-contrast environment can cause painful strobing. This can be helped in the DI and set design. HDR done well is more important than 8K or even 3D.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? Or are latitude and framerate more important currently?
8K can be important for VFX plates. For me, creatively it is not important, 4K is enough. The positive of 8K is just more K. The downside is that I would rather the camera companies focus on dynamic range, color latitude, sensitivity and the look and feel of the captured image instead of trying to hit a high K number. Also, there are storage and processing issues.

Can you talk about how shooting streaming content, for OTTs like Netflix/Amazon, has changed production practices, and workflows, if at all?
I have not shot for a streaming service. I do think we need to pay attention to all deliverables and make adjustments accordingly. In the DI, I am there for the standard cinema pass, HDR pass, IMAX pass, home video pass and other formats that arise.

Is the availability of all those camera resolutions a help or a hindrance?
I choose the camera that will fit the job. It is my job in prep to test and pick the camera that best serves the movie.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
On set, I am able to view HDR or 709. I test the pipeline and make sure the LUT is correct and make modifications if needed. I do not play with many LUTs on set, I normally just have one. I treat the camera like a film stock. I know I will be there in the DI to finalize the look. On set is not the place for futzing with LUTs on the camera. My plate is full enough as it is.

If not already covered, how has production changed in the last two years?
I am not sure production has changed, but there are many new tools to use to help make work more efficient and economical. I feel that I have always had to be mindful of the budget, no matter how large the show is. I am always looking for new solutions.

Daryn Okada, ASC
Daryn Okada is known for his work on films such as Mean Girls, Anna Karenina and Just Like Heaven. He has also worked on many TV series, such as Scandal, Grey’s Anatomy and Castle. He served as president of the ASC from 2006 to 2009.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses? 

Modern digital cinema cameras, with the proper workflows and techniques, can achieve a level of quality that lets a story’s visual identity evolve in ways that parallel explorations shooting on film. Larger image sensors, state-of-the-art lenses and mining historic optics enable cinematographers to use their experience and knowledge of the past to paint rich visual experiences for today’s audience.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?
HDR is a creative and technical medium just as shooting and projecting 65mm film would be. It’s up to the director and the cinematographer to decide how to orchestrate the use of HDR for their particular story.

Can you address 8K? What are the positives, and the negatives? Do we just have too many Ks these days? Or are latitude and framerate more important currently?
8K is working its way into production like 65mm and 35mm VistaVision did, by providing more technical resolution for use in VFX or special-venue exhibition. The enormous amount of data, and the cost of handling it, must be justified by its financial return and by whether it benefits a particular story. Latitude and color depth are paramount to creating a motion picture’s palette and texture. Trying to use a format just because it’s technically possible may be distracting to an audience’s acceptance of a story or creative concept.

Can you talk about how shooting streaming content, for OTTs like Netflix/Amazon, has changed production practices, and workflows, if at all?

I think the delivery specifications of OTT have generally raised the bar, making 4K and wide color gamut the norm. For cinematographers who have spent years photographing features, we are accustomed to creating images with detail for a big screen and a wide color palette. It’s a natural creative process to shoot for 4K and HDR in that respect.

Is the availability of all those camera resolutions a help or a hindrance?
Having the best imaging available is always welcomed. Even if a camera is not technically exploited, the creation of subtle images is richer and made possible by the smoother transitions and blending of color, contrast and detail that come from originating at higher resolutions and with a wider color range.

Can you talk about color management from the sensor/film to the screen? How do you ensure correct color management from the set into dailies and post, the DI and final delivery?
As cinematographers, we are still involved in workflows for dailies and post production to ensure that everyone’s creative efforts on the final production are maintained for the immediate viewer and preserved for audiences in the future.

How has production changed over the last two years?
There are more opportunities to produce content with creative, high-quality cinematography thanks to advancements in cameras and cost-effective computing speed, combined with the demands of high-quality displays and projection.

Vanja Černjul, ASC
This New York-based DP recently worked on the huge hit Crazy Rich Asians. In addition to feature film work, Černjul has shot TV shows (The Deuce’s season 1 finale and two seasons of Marco Polo), as well as commercials for Panasonic and others.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
One interesting trend I noticed is the comeback of image texture. In the past, cinematographers used to expose film stock differently according to the grain texture they desired. Different exposure zones within the same frame had different grain character, which produced additional depth of the image. We lost that once we switched to digital. Crude simulations of film grain, such as overall filters, couldn’t produce the dimensionality we had with film.

Today, I am noticing new ways of bringing the texture back as a means of creative expression. The first one comes in the form of new, sophisticated post production tools designed to replicate the three-dimensional texturing that occurs naturally when shooting film, such as the realtime texturing tool LiveGrain. Monitoring the image on the set with a LiveGrain texture applied can impact lighting, filtration or lens choices. There are also new ways to manipulate texture in-camera. With the rise of super-sensitive, dual-native ISO sensors we can now shoot at very low-light levels and incorporate so-called photon shot noise into the image. Shot noise has organic character, very much like film grain.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?

The creative potential of HDR technology is far greater than that of added resolution. Unfortunately, it is hard for cinematographers to take full advantage of HDR because it is still far from being the standard way the audience sees our images. We can’t have two completely different looks for a single project, and we have to make sure the images are working on SDR screens. In addition, it is still impractical to monitor in HDR on the set, which makes it difficult to adjust lighting and lens choices to expanded dynamic range. Once HDR screens become a standard, we will be able to really start creatively exploring this new territory.

Crazy Rich Asians

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? Or are latitude and framerate more important currently?
Additional resolution adds more available choices regarding relationship of optical systems and aspect ratios. I am now able to choose lenses for their artifacts and character regardless of the desired aspect ratio. I can decide to shoot one part of the film in spherical and the other part in anamorphic and crop the image to the project’s predetermined aspect ratio without fear of throwing away too much information. I love that freedom.

Can you talk about how shooting streaming content, for OTTs like Netflix/Amazon, has changed production practices and workflows, if at all?
For me, the only practical difference between shooting high-quality content for cable or streaming is the fact that Netflix demands that their projects be captured in true 4K RAW. I like the commitment to higher technical standards, even though this may be an unwelcome restriction for some projects.

Is the availability of all those camera resolutions a help or a hindrance?
I like choices. As large format lenses become more available, shooting across formats and resolutions will become easier and simpler.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
The key for correct color management from the set to final color grading is in preproduction. It is important to take the time to do proper tests and establish the communication between DIT, the colorist and all other people involved as early as possible. This ensures that original ideas aren’t lost in the process.

Adjusting and fine-tuning the LUT to the lenses, lighting gels and set design and then testing it with the colorist is very important. Once I have a bulletproof LUT, I light and expose all the material for it specifically. If this part of the process is done correctly, the time in final color grading can be spent on creative work rather than on fixing inconsistencies.
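
For readers less familiar with what that LUT actually is under the hood, a 3D LUT is just a sampled grid that maps camera RGB to output RGB, with interpolation between the grid points. Below is a minimal sketch assuming a 17-point cube and a toy warm-shift grade; both are placeholders for illustration, not anything from a real show.

```python
import numpy as np

def apply_lut3d(rgb, lut):
    """rgb: (..., 3) floats in [0, 1]; lut: (N, N, N, 3) grid of output colors."""
    n = lut.shape[0]
    pos = np.clip(rgb, 0.0, 1.0) * (n - 1)
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    f = (pos - lo)[..., None]          # fractional position within the cell, per channel
    out = 0.0
    for corner in range(8):            # trilinear blend of the 8 surrounding grid points
        pick = [(corner >> c) & 1 for c in range(3)]
        idx = [np.where(pick[c], hi[..., c], lo[..., c]) for c in range(3)]
        w = np.prod([f[..., c, :] if pick[c] else 1.0 - f[..., c, :] for c in range(3)], axis=0)
        out = out + w * lut[idx[0], idx[1], idx[2]]
    return out

# Identity 17-point cube, then a toy "grade": a slight warm shift
n = 17
grid = np.linspace(0.0, 1.0, n)
r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
lut = np.stack([np.clip(r * 1.05, 0, 1), g, b * 0.95], axis=-1)

print(apply_lut3d(np.array([0.5, 0.5, 0.5]), lut))   # mid-gray nudged warm
```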

I am very grateful for ACES workflow, which offers long-overdue standardization. It is definitely a move in the right direction.

How has production changed over the last two years?
With all the amazing post tools that are becoming more available and affordable, I am seeing negative trends of further cutting of preproduction time, and lack of creative discipline on the set. I sincerely hope this is just a temporary confusion due to recalibration of the process.

Kate Reid, DP
Kate Reid is a UK-based DP working in TV and film. Her recent work includes the TV series Hanna (Amazon) and Marcella 2 (Netflix), and additional photography on the final season of Game of Thrones for HBO. She is currently working on Press for the BBC.

Can you talk about some camera trends you’ve been seeing? Such as Large Format? The use of old/vintage lenses?
Large format cameras are being used increasingly on drama productions to satisfy the requirement for additional resolution from certain distribution platforms. And, of course, the choice to use large format cameras in drama brings with it another aesthetic that DPs now have as a tool: choosing whether increased depth-of-field fall-off, clarity in the image, etc., enhances the particular story they wish to portray on screen.

Like many other DPs, I have always enjoyed using older lenses to help make the digital image softer, more organic and less predictable, but the larger format cameras now mean that much of this older glass, designed for 35mm-size sensors, may not cover the increased sensor size, so newer lenses designed for the larger format cameras may become popular by necessity, alongside older large format glass that is enjoying a renaissance.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?
I have yet to shoot a show that requires HDR delivery. It hasn’t yet become the default in drama production in the UK.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? Or are latitude and frame rate more important currently?
I don’t inherently find an ultra sharp image attractive. Through older glass and diffusion filters on the lens, I am usually looking to soften and break down my image, so I personally am not all about the extra Ks. How the camera’s sensor reproduces color and handles highlights and shadows is of more interest to me, and I believe has more impact on the picture.

Of primary importance is how practical a camera is to work with — size and how comfortable the camera is to handle would supersede excessive resolution — as the first requirement of any camera has got to be whether it allows you to achieve the shots you have in mind, because a story isn’t told through its resolution.

Can you talk about how shooting streaming content for OTTs, like Netflix/Amazon, has changed production practices, and workflows, if at all?
The major change is the requirement by Netflix for true 4K resolution, determining which cameras cinematographers are allowed to shoot on. For many cinematographers the Arri Alexa was their digital camera of choice, which was excluded by this rule, and therefore we have had to look to other cameras for such productions. Learning a new camera, its sensor, how it handles highlights, produces color, etc., and ensuring the workflow through to the post facility is something that requires time and testing, which has certainly added to a DP’s workload.

From a creative perspective, however, I found shooting for OTTs (I shot two episodes of the TV series Hanna made by Working Title TV and NBC Universal for Amazon) has been more liberating than making a series for broadcast television as there is a different idea and expectation around what the audience wants to watch and enjoy in terms of storytelling. This allowed for a more creative way of filming.

Is the availability of all those camera resolutions a help or a hindrance?
Where work is seen now can vary from a mobile phone screen to a digital billboard in Times Square, so it is good for DPs to have a choice of cameras and their respective resolutions so we can use the best tool for each job. It only becomes a hindrance if you let the technology lead your creative process rather than assist it.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
Ideally, I will have had the time and opportunity to shoot tests during prep and then spend half a day with the show’s colorist to create a basic LUT I can work with on set. In practice, I have always found that I tweak this LUT during the first days of production with the DIT, and this is what serves me throughout the rest of the show.

I usually work with just one LUT that will be some version of a modified Rec. 709 (unless the look of the show drastically requires something else). It should then be straightforward in that the DIT can attach a LUT to the dailies, and this is the same LUT applied by editorial, so that exactly what you see on set is what is being viewed in the edit.

However, where this fails is that the dailies uploaded to FTP sites — for viewing by the execs, producers and other people who have access to the work — are usually very compressed and low resolution, so they bear little resemblance to how the work looked on set or looks in the edit. This is really unsatisfying, as for months key members of production are not seeing an accurate reflection of the picture. Of course, when you get into the grade this can be restored, but it’s dangerous if those viewing the dailies in this way have grown accustomed to something that is a pale comparison of what was shot on set.

How has production changed over the last two years?
There is less differentiation between film and television in how productions are being made and, critically, where they are being seen by audiences, especially with online platforms now making award-winning feature films. The high production values we’ve seen in Netflix and Amazon’s biggest shows have pushed UK television dramas to up their game, which does put pressure on productions, shooting schedules and HODs, as the budgets to help achieve this aren’t there yet.

So, from a ground-level perspective, for DPs working in drama this looks like more pressure to produce work of the highest standard in less time. However, it’s also a more exciting place to be working, as the ideas about how you film something for television versus cinema no longer need apply. The perceived ideas of what an audience is interested in, or expects, are being blown out of the water by the success of new original online content, which flies in the face of more traditional storytelling. Broadcasters are noticing this and, hopefully, this will lead to more exciting and cinematic mainstream television in the future.

Blackmagic’s Bob Caniglia
In addition to its post and broadcast tools, Blackmagic offers many different cameras, including the Pocket Cinema Camera, Pocket Cinema Camera 4K, Micro Studio Camera 4K, Micro Cinema Camera, Studio Camera, Studio Camera 4K, Ursa Mini Pro, Ursa Mini 4.6K and Ursa Broadcast.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
Lens freedom is on everyone’s mind right now… having the freedom to shoot in any style. This is bringing about things like seeing projects shot on 50-year-old glass because the DP liked the feel of a commercial back in the ‘60s.

We actually just had a customer test out actual lenses that were used on The Godfather, The Shining and Casablanca, and it was amazing to see the mixing of those with a new digital cinema camera. And so many people are asking for a camera to work with anamorphic lenses. The trend is really that people expect their camera to be able to handle whatever look they want.

For large format use, I would say that both Hollywood and indie filmmakers are using them more often. Or, at least, they’re trying to get the general large format look by using anamorphic lenses to get a shallow depth of field.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?
Right now, HDR is definitely more of a concern for DPs in Hollywood, but also with indie filmmakers and streaming service content creators. Netflix and Hulu have some amazing HDR shows right now. And there is plenty of choice when it comes to the different HDR formats and shooting and monitoring on set. All of that is happening everyday, while 8K still needs the industry to catch up with the various production tools.

As for impacting shooting, HDR is about more immersive colors, and a DP needs to plan for it. It gives viewers a whole new level of image detail in what they shoot. They have to be much more aware of every surface or lighting impact so that the viewer doesn’t get distracted. Attention to detail gets even higher in HDR, and DPs and colorists will need to keep a close eye on every shot, including when an image in a sideview mirror’s reflection is just a little too sharp and needs a tweak.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? Or are latitude and framerate more important currently?
You can never have enough Ks! Seriously. It is not just about getting a beautiful 8K TV, it is about giving the production and post pros on a project as much data as possible. More data means more room to be creative, and is great for things like keying.

Latitude and framerate are important as well, and I don’t think any one is more important than another. For the viewers, the beauty will be in large displays; you’re already seeing 8K displays in Times Square, and though you may not need 8K on your phone, 8K on the side of a building or highway will be very impactful.

I do think one of the ways 8K is changing production practices is that people are going to be much more storage conscious. Camera manufacturers will need to continue to improve workflows as the images get larger in an effort to maximize storage efficiencies.

Can you talk about how shooting streaming content, for OTTs like Netflix/Amazon, has changed production practices, and workflows, if at all?
For streaming content providers, shoots have definitely been impacted and are forcing productions to plan for shooting in a wider number of formats. Luckily, companies like Netflix have been very good about specifying up front the cameras they approve and which formats are needed.

Is the availability of all those camera resolutions a help or a hindrance?
While it can be a bit overwhelming, it does give creatives some options, especially if they have a smaller delivery size than the acquisition format. For instance, if you’re shooting in 4K but delivering in HD, you can do dynamic zooms from the 4K image that look like an optical zoom, or you can get a tight shot and wide shot from the same camera. That’s a real help on a limited budget of time and/or money.
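
The arithmetic behind that kind of punch-in is simple. A quick sketch, assuming UHD-sized 4K acquisition and HD delivery (illustrative numbers, not a rule):

```python
acq_w, acq_h = 3840, 2160   # UHD acquisition
del_w, del_h = 1920, 1080   # HD delivery
max_zoom = min(acq_w / del_w, acq_h / del_h)
print(max_zoom)             # 2.0, so up to a 2x reframe or digital zoom with no upscaling
```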

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
Have the production and post people plan together from the start and create the look everyone should be working toward right up front.

Set the LUTs you want before a single shot is done and manage the workflow from camera to final post. Also, choose post software that can bring color correction on-set, near-set and off-set. That lets you collaborate remotely. Definitely choose a camera that works directly with any post software, and avoid transcoding.

How has production changed in the last two years?
Beyond the rise of HDR, one of the other big changes is that more productions are thinking live and streaming more than ever before. CNN’s Anderson Cooper now does a daily Facebook Live show. AMC has the live Talking Dead-type formats for many of their shows. That trend is going to keep happening, so cinematographers and camera people need to be thinking about being able to jump from scripted to live shooting.

Red Digital Cinema’s Graeme Nattress
Red Digital Cinema manufactures professional digital cameras and accessories. Red’s DSMC2 camera offers three sensor options — Gemini 5K S35, Helium 8K S35 and Monstro 8K VV.

Can you talk about some camera trends you’ve been seeing?
Industry camera trends continue to push image quality in all directions. Sensors are getting bigger, with higher resolutions and more dynamic range. Filmmakers continue to innovate, making new and amazing images all the time, which drives our fascination for advancing technology in service to the creative.

How is HDR affecting the way things are being shot these days?
One of the benefits of a primary workflow based on RAW recording is that HDR is not an added extra, but a core part of the system. Filmmakers do consider HDR important, but there’s some concern that HDR doesn’t always look appealing, and that it’s not always an image quality improvement. Cinematography has always been about light and shade and how they are controlled to shape the image’s emotional or storytelling intent. HDR can be a very important tool in that it greatly expands the display canvas to work on, but a larger canvas doesn’t mean a better picture. The increased display contrast of HDR can make details more visible, and it can also make motion judder more apparent. Thus, more isn’t always better; it’s about how you use what you have.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? What’s more important, resolution or dynamic range?
Without resolution, we don’t have an image. Resolution is always going to be an important image parameter. What we must keep in mind is that camera resolution is based on input resolution to the system, and that can — and often will — be different to the output resolution on the display. Traditionally, in video the input and output resolutions were one and the same, but when film was used — which had a much higher resolution than a TV could display — we were taking a high-resolution input and downsampling it to the display, the TV screen.

As with any sampled system, in a digital cinema camera there are some properties we seek to protect and others to diminish. We want a high level of detail, but we don’t want sharpening artifacts and we don’t want aliasing. The only way to achieve that is through a high-resolution sensor, properly filtered (optical low-pass), that can see a large amount of real, un-enhanced detail. So yes, 8K can give you lots of fine detail should you want it, but the imaging benefits extend beyond fine detail to downsampling to 4K or 2K: 8K makes for an incredibly robust image, noise is reduced, and what noise remains takes on more of a texture, which is much more aesthetically pleasing.
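
A rough numerical illustration of the noise point, using a synthetic flat-gray tile as a stand-in for real sensor data: averaging each 2x2 block, as a simple 8K-to-4K style downsample would, halves the standard deviation of uncorrelated noise.

```python
import numpy as np

rng = np.random.default_rng(0)
# Flat mid-gray tile with additive noise, standing in for a crop of a noisy capture
tile = 0.5 + rng.normal(0.0, 0.02, size=(1024, 1024))

# 2x2 box average, i.e. each output pixel is the mean of four input pixels
down = tile.reshape(512, 2, 512, 2).mean(axis=(1, 3))

print(round(tile.std(), 4))   # ~0.02
print(round(down.std(), 4))   # ~0.01, since averaging four samples halves the noise std dev
```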

One challenge of 8K is an increase in the amount of sensor data to be recorded, but that can be addressed through quality compression systems like RedCode.

Addressing dynamic range is very important because dynamic range and resolution work together to produce the image. It’s easy to think that high resolutions have a negative impact upon dynamic range, but improved pixel design means you can have dynamic range and resolution.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
Color management is vitally important and so much more than just keeping color control from on-set through to delivery. Now with the move to HDR and an increasing amount of mobile viewing, we have a wide variety of displays, all with their own characteristics and color gamuts. Color management allows content creators to display their work at maximum quality without compromise. Red cameras help in multiple ways. On camera, one can monitor in both SDR and HDR simultaneously with the new IPP2 image processing pipeline’s output independence, which also allows you to color via CDL and creative 3D LUT in such a way as to have those decisions represented correctly on different monitor types.

In post and grading, the benefits of output independence continue, but now it’s critical that scene colors, which can so easily go out of gamut, are dealt with tastefully. Through the metadata support in the RedCode format, all the creative decisions taken on set follow through to dailies and post, but never get in the way of producing the correct image output, be it for VFX, editorial or grading.
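
Part of why those on-set color decisions travel so cleanly as metadata is that an ASC CDL is only ten numbers: slope, offset and power per channel, plus a single saturation value. Here is a minimal sketch of applying one; the example values are invented for illustration and are not from any Red or DXL2 pipeline.

```python
import numpy as np

def apply_cdl(rgb, slope, offset, power, sat):
    """Per-channel slope/offset/power, then one saturation step using Rec. 709 luma weights."""
    rgb = np.asarray(rgb, dtype=float)
    out = np.clip(rgb * np.asarray(slope) + np.asarray(offset), 0.0, None) ** np.asarray(power)
    luma = (out * np.array([0.2126, 0.7152, 0.0722])).sum(axis=-1, keepdims=True)
    return luma + sat * (out - luma)

graded = apply_cdl([0.18, 0.18, 0.18],                 # mid-gray input
                   slope=[1.10, 1.00, 0.95],
                   offset=[0.01, 0.00, 0.00],
                   power=[1.00, 1.00, 1.05],
                   sat=0.9)
print(graded)   # nudged warmer and slightly desaturated
```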

Panavision’s Michael Cioni 
Panavision designs and manufactures high-precision camera systems, including both film and digital cameras, as well as lenses and accessories for the motion picture and television industries.

Can you talk about some camera trends you’ve been seeing?
With the evolution of digital capture, one of the most interesting things I’ve noticed in the market are new trends emerging from the optics side of cinematography. At a glance, it can appear as if there is a desire for older or vintage lenses based on the increasing resolution of large format digital cameras. While resolution is certainly a factor, I’ve noticed the larger contributor to vintage glass is driven by the quality of sensors, not the resolution itself. As sensors increase in resolution, they simultaneously show improvements in clarity, low-light capability, color science and signal-to-noise ratio.

The compounding effect of all these elements is improving images far beyond what was possible with analog film technology, which explains why the same lens behaves differently on film, S35 digital capture and large-format digital capture. As these looks continue to become popular, Panavision is responding through our investments in both the restoration of classic lenses and the design of new lenses with classic characteristics and textures that are optimized for large format photography on super sensors.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look?
Creating images is not always about what component is better, but rather how they elevate images by working in concert. HDR images are a tool that increases creative control alongside high resolution and 16-bit color. These components work really well together because a compelling image can make use of more dynamic range, more color and more clarity. Its importance is only amplified by the amalgamation of high-fidelity characteristics working together to increase overall image flexibility.

Today, the studios are still settling into an HDR world because only a few groups, led by OTT, are able to distribute in HDR to wide audiences. On-set tools capable of HDR, 4K and 16-bit color are still in their infancy and currently cost-prohibitive. 4K/HDR on the set is going to become a standard practice by 2021. 4K wireless transmitters are the first step — they are going to start coming online in 2019. Smaller OLED displays capable of 750 nits+ will follow in 2020, creating an excellent way to monitor higher quality images right on set. In 2021, editorial will start to explore HDR and 4K during the offline process. By 2024, all productions will be HDR from set to editorial to post to mobile devices. Early adopters that work out the details today will find themselves ahead of the competition and having more control as these trends evolve. I recommend cinematographers embrace the fundamentals of HDR, because understanding the tools and trends will help prevent images from appearing artificial or overdone.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? What’s more important, resolution or dynamic range?
One of the reasons we partnered with Red is because the Monstro 8K VV sensor makes no sacrifice in dynamic range while still maintaining ultra high smoothness at 16 bits. The beauty of technology like this is that we can finally start to have the best from all worlds — dynamic range, resolution, bit depth, magnification, speed and workflow — without having to make quality sacrifices. When cinematographers have all these elements together, they can create images previously never seen before, and 8K is as much part of that story as any other element.

One important way to view 8K is not solely as a thermometer for high-resolution sharpness. A sensor with 35 million pixels is necessary in order to increase the image size, similar to trends in professional photography. 8K large format creates a larger, more magnified image with a wider field of view and less distortion, like the difference in images captured by 70mm film. The biggest positive I’ve noticed is that DXL2’s 8K large-format Red Monstro sensor is so good in terms of quality that it isn’t impacting images themselves. Lower quality sensors can add a “fingerprint” to the image, which can distort the original intention or texture of a particular lens.

With sensors like Monstro capable of such high precision, the lenses behave exactly as the lens maker intended. The same Panavision lenses on a lower grade sensor, or even 35mm film, are exhibiting characteristics that we weren’t able to see before. This is literally breathing new life into lenses that previously didn’t perform the same way until Monstro and large format.

Is the availability of so many camera formats a help or a hindrance?
You don’t have to look far to identify individuals who are easily fatigued by having too many choices. Some of these individuals cope with choices by finding ways to regulate them, and they feel fewer choices means more stability and perhaps more control (creative and economic). As an entrepreneur, I find the opposite to be true: I believe regulating our world, especially with regards to the arts and sciences, is a recipe for protecting the status quo. I fully admit there are situations in which people are fatigued by too many complex choices.

I find that the failure is not of the technology itself; rather, it’s the fault of the manufacturers who have not provided the options in easy-to-consume ways. Having options is exactly what creatives need in order to explore something new and improved. But it’s also up to manufacturers to deliver the message in ways everyone can understand. We’re still learning how to do that, and with each generation the process changes a bit. And while I am not always certain which are the best ways to help people understand all the options, I am certain that the pursuit of new art will motivate us to go out of our comfort zones and try something previously thought not possible.

Have you encountered any examples of productions that have shot streaming content (i.e. for Netflix/Amazon) and had to change production practices and workflows for this format/deliverable?
Netflix and Amazon are exceptional examples of calculated risk takers. While most headlines discuss their investment in the quantity of content, I find the most interesting investment they make is in relationships. Netflix and Amazon are heavily invested in standards groups, committees, outreach, panels and constant communication. The model of the past and present (incumbent studios) are content creators with technology divisions. The model of the future (Netflix, Amazon, Hulu, Apple, Google and YouTube) are all the technology companies with the ability to create content. And technology companies approach problems from a completely different angle by not only embracing the technology, they help invent it. In this new technological age, those who lead and those who follow will likely be determined by the tools and techniques used to deliver. What I call “The Netflix Effect” is the impact Netflix has on traditional groups and how they have all had to strategically pivot based on Netflix’s impact.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
The DXL2 has an advanced color workflow. In collaboration with LiveGrade by Pomfort, DXL2 can capture looks wirelessly from DITs in the form of CDLs and LUTs, which are not only saved into the metadata of the camera, but also baked into in-camera proxy files in the form of Apple ProRes or Avid DNx. These files now contain visual references of the exact looks viewed on monitors and can be delivered directly to post houses, or even editors. This improves creative control because it eliminates the guesswork in the application of external color decisions and streamlines it back to the camera, where the core database is kept with all the other camera information. This metadata can be traced throughout the post pipeline, which also streamlines the process for all entities that come in contact with camera footage.

How has production changed over the last two years?
Sheesh. A lot!

ARRI‘s Stephan Ukas-Bradley
The ARRI Group manufactures and distributes motion picture cameras, digital intermediate systems and lighting equipment. Their camera offerings include the Alexa LF, Alexa Mini, Alexa 65, Alexa SXT W and the Amira.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
Large format opens some new creative possibilities, using a shallow depth of field to guide the audience’s view and provide a wonderful bokeh. It also conveys a perspective truer to the human eye, resulting in a seemingly increased dimensional depth. The additional resolution combined with our specially designed large format Signature Primes result in beautiful and emotional images.

Old and vintage lenses can enhance a story. For instance, Gabriel Beristain, ASC, used Bausch & Lomb Super Baltar lenses on the Starz show Magic City, and Bradford Young used detuned DNA lenses in conjunction with the Alexa 65 on Solo: A Star Wars Story. Certain characteristics like flares, reflections, distortions and focus fall-off are very difficult to recreate organically in post, so vintage lenses provide an easy way to create a unique look for a specific story and a way for the director of photography to maintain creative control.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?
Currently, things are not done much differently on set when shooting HDR versus SDR. While it would be very helpful to monitor in both modes on set, HDR reference monitors are still very expensive and very few productions have the luxury to do that. One has to be aware of certain challenges when shooting for an HDR finish. High-contrast edges can result in a more pronounced stutter/strobing effect when panning the camera, and windows that are blown out in SDR might retain detail in the HDR pass, so all of a sudden a ladder or grip stand is visible.

In my opinion, HDR is more important than higher resolution. HDR is resolution-independent in regard to viewing devices like phones and tablets, gives the viewer a perceived increase in sharpness, and is more immersive than increased resolution. Also, let’s not forget that we are working in the motion picture industry and that we are either capturing moving objects or moving the camera, and with that introducing motion blur. Higher resolution only makes sense to me in combination with higher frame rates, and that in turn will start a discussion about aesthetics, as it may look hyper-real compared to traditional 24fps capture. Resolution is one aspect of overall image quality, but in my opinion extended dynamic range, signal/noise performance, sensitivity, color separation and color reproduction are more important.

Can you talk about how shooting streaming content for OTTs, like Netflix/Amazon, has changed production practices and workflows, if at all?
Shooting streaming content has really not changed production practices or workflows. At ARRI, we offer very flexible and efficient workflows and we are very transparent documenting our ARRIRAW file formats in SMPTE RDD 30 (format) and 31 (processing) and working with many industry partners to provide native file support in their products.

Is the availability of all those camera resolutions a help or a hindrance?
I would look at all those different camera types and resolutions as different film stocks and recommend to creatives to shoot their own test and select the camera systems based on what suits their project best.

We offer the ARRI Look Library for Amira, Alexa Mini and Alexa SXT (SUP 3.0), which is a collection of 87 looks, each of them available in three different intensities provided in Rec. 709 color space. Those looks can either be recorded or used only for monitoring. These looks travel with the picture, embedded in the metadata of the ARRIRAW file, QuickTime Atom or HD/SDI stream in the form of the actual LUT and ASC CDL. One can also create a look dynamically on set, feeding the look back to the camera and having the ASC CDL values embedded in the same way.

More commonly, one would record in either ARRIRAW or ProRes LogC, while applying a standard Rec. 709 look for monitoring. The “C” in LogC stands for Cineon, a film-like response very much like that of a scanned film image. Colorists and post pros are very familiar with film, and color grading LogC images is easy and quick.
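
To give a feel for what a Cineon-style log response does in general, here is a toy curve with arbitrary placeholder constants (deliberately not ARRI’s published LogC parameters): a wide range of linear scene values is compressed into a modest code range so highlight detail survives for grading.

```python
import math

def toy_lin_to_log(x, a=5.0, b=0.05, c=0.25, d=0.40):
    # Arbitrary placeholder constants, NOT ARRI's published LogC parameters
    return c * math.log10(a * x + b) + d

for linear in (0.009, 0.18, 0.90, 4.00):   # deep shadow, mid-gray, highlight, hot highlight
    print(f"linear {linear:>5} -> log {toy_lin_to_log(linear):.3f}")
# Roughly nine stops of scene range land between code values of about 0.14 and 0.73
```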

How has production changed over the last two years?
I don’t have the feeling that production has changed a lot in the past two years, but with the growing demand from OTTs and increased production volume, it is even more important to have a reliable and proven system with flexible workflow options.

Main Image: DP Kate Reid.

Behind the Title: Carbon senior colorist Julien Biard

NAME: Julien Biard

COMPANY: Carbon in Chicago

CAN YOU DESCRIBE YOUR COMPANY?
Carbon is a full-service creative studio specializing in design, color, visual effects and motion graphics, with offices in Chicago, Los Angeles and New York.

WHAT’S YOUR JOB TITLE?
Senior Colorist

WHAT DOES THAT ENTAIL?
I’m responsible for grading the work to get the most out of the material. Color has a lot of potential to assist the storytelling in conveying the emotion of a film. I also oversee the running of the Chicago color department.

National Trust

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
Most of the time people are surprised this job actually exists, or they think I’m a hair colorist. After many years this still makes me smile every time!

WHAT’S YOUR FAVORITE PART OF THE JOB?
There are many aspects of the job I enjoy. The main part of the job is the creative side; giving my input and taste to a piece makes the job personally and emotionally involving. I get a lot of satisfaction from this process, working with the team and using color to set the mood and tone of the spot or film.

Finally, by far the best part of the job is to educate and train the next generation of colorists. Having been part of the same process at the beginning of my career, I feel very proud to be able to pass on my knowledge, what I have learned from peers and worked out for myself, and to help as many youngsters to get into color grading as possible.

WHAT’S YOUR LEAST FAVORITE?
I miss 35mm…

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
I’m a morning type of guy, so getting on my bike nice and early, taking photographs or getting straight to work. Mornings are always productive for me.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I’d be an art buyer! Realistically, I’d probably be a mountain guide back home in the French Alps where I grew up.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
In all honesty, this was very unexpected as I originally trained to become a professional football player until quite an advanced age — which I’m now glad wasn’t meant to be my path. It was only when I moved to London after graduating that I fell into the post world where I started as a tea boy. I met the colorist there, and within the first day I knew this would be something I’d enjoy doing and could be good at. I trained hard and worked alongside some of the best colorists in the industry, learning from them while finding my own tune and it worked out pretty well.

Ted Baker

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
National Trust
Run the Jewels
Royal Blood
Rapha
Ted Baker

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
There are many projects I’m proud of, and picking only one is probably not possible. I think what I’m most proud of is the relationship I have built with some of the industry’s most creative talents — people like Crowns and Owls, David Wilson, Thomas Bryant, Andrew Telling and Ninian Doff, to name a few. Also, being able to bring my contribution to the edifice of this stimulating world is what I’m proudest of.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My sound system, my camera, a corkscrew and my bike, of course!

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Mainly Instagram; it’s all about the visuals.

DO YOU LISTEN TO MUSIC WHILE YOU WORK? CARE TO SHARE YOUR FAVORITE MUSIC TO WORK TO?
Is there such a thing as grading without music?! I need my music when I work. It helps me get in the zone and also helps me with timings. An album is around the hour mark, so I know where I am.

Taste wise? Oh dear, the list could be long. If the beat is good and there are instruments, I’m in. I do struggle with pop music a lot. But I’m open to anything else.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I ride my bike, anywhere I can. I climb. I enjoy photography very much too. Since I’m in a dark room most of the time at work, I spend as much of my spare time outside as possible.

Roundtable Post tackles HFR, UHD and HDR image processing

If you’re involved in post production, especially episodic TV, documentaries and feature films, then it’s highly probable that High Frame Rate (HFR), Ultra High Definition (UHD) and High Dynamic Range (HDR) have come your way.

“On any single project, the combination of HFR, UHD and HDR image-processing can be a pretty demanding, cutting-edge technical challenge, but it’s even more exacting when particular specs and tight turnarounds are involved,” says Jack Jones, digital colorist and CTO of full-service boutique facility Roundtable Post Production.

Among the central London facility’s credits are online virals for brands including Kellogg’s, Lurpak, Rolex and Ford, music films for Above & Beyond and John Mellencamp, plus broadcast TV series and feature documentaries for ITV, BBC, Sky, Netflix, Amazon, Discovery, BFI, Channel 4, Showtime and film festivals worldwide. These include Sean McAllister’s A Northern Soul, Germaine Bloody Greer (BBC) and White Right: Meeting The Enemy (ITV Exposure/Netflix).

“Yes, you can render-out HFR/UHD/HDR deliverables from a variety of editing and grading systems, but there are not many that can handle the simultaneous combination of these formats, never mind the detailed delivery stipulations and crunching deadlines that often accompany such projects,” says Jones.

Rewinding to the start of 2017, Jones says, “Looking forward to the future landscape of post, the proliferation of formats, resolutions, frame rates and color spaces involved in modern screened entertainment seemed an inevitability for our business. We realized that we were going to need to tackle the impending scenario head-on. Having assessed the alternatives, we took the plunge and gambled on Colorfront Transkoder.”

Transkoder is a standalone, automated system for fast digital file conversion. Roundtable Post initially used Colorfront Transkoder to create encrypted DCP masters and worldwide deliverables for a variety of long-form projects, such as Nick Broomfield’s Whitney: Can I Be Me, Noah Media Group’s Bobby Robson: More Than a Manager, Peter Medak’s upcoming feature The Ghost of Peter Sellers, and the Colombian feature documentary To End A War, directed by Marc Silver.

“We discovered from these experiences that, along with incredible quality in terms of image science, color transforms and codecs, Transkoder is fast,” says Jones. “For example, the deliverables for To End A War involved 10 different language versions, plus subtitles. It would have taken several days to complete these straight out of an Avid, but rendering in Transkoder took just four hours.”

More recently, Roundtable Post was faced with the task of delivering country-specific graphics packages, designed and created by production agency Noah Media Group, for use by FIFA rights holders and broadcasters during the 2018 World Cup.

The project involved delivering a mix of HFR, UHD, HDR and HD SDR formats, resulting in 240 bespoke animations, and the production of a mammoth 1,422 different deliverables. These included: 59.94p UHD HDR, 50p UHD HDR, 59.94p HD SDR, 50p HD SDR, 59.94i HD SDR and 50i HD SDR with a variety of clock, timecode, pre-roll, soundtrack, burn-in and metadata requirements as part of the overall specification. Furthermore, the job encompassed the final QC of all deliverables, and it had to be completed within a five-day work week.

“For a facility of our size, this was a significant job in terms of its scale and deadline,” says Jones. “Traditionally, projects like these would involve throwing a lot of people and time at them, and there’s always the chance of human error creeping in. Thankfully, we already had positive experiences with Transkoder, and were eager to see how we could harness its power.”

Using technical data from FIFA, Jones built an XML file containing timelines with all of the relevant timecode, clock, image metadata, WAV audio and file-naming information for the required deliverables. He also liaised with Colorfront’s R&D team, and was quickly provided with an initial set of Python script templates that would help to automate the various requirements of the job in Transkoder.
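
The article doesn’t show Jones’s actual schema or Colorfront’s scripting interface, so the following is only a hypothetical sketch of the data-driven idea: describe each deliverable (frame rate, resolution, dynamic range, timecode, file name) as data, then generate an XML manifest per animation that an automated transcode job could consume. Every element and field name here is invented for illustration.

```python
# Hypothetical illustration only: element names, fields and values are assumptions,
# not Colorfront's or FIFA's real specification. The point is that 1,422 deliverables
# become tractable when each one is generated from a table of specs rather than by hand.
import xml.etree.ElementTree as ET

DELIVERABLE_SPECS = [
    # (label,          fps,      resolution,  dynamic range)
    ("UHD_HDR_5994p", "59.94p", "3840x2160", "HDR"),
    ("UHD_HDR_50p",   "50p",    "3840x2160", "HDR"),
    ("HD_SDR_5994p",  "59.94p", "1920x1080", "SDR"),
    ("HD_SDR_50p",    "50p",    "1920x1080", "SDR"),
    ("HD_SDR_5994i",  "59.94i", "1920x1080", "SDR"),
    ("HD_SDR_50i",    "50i",    "1920x1080", "SDR"),
]

def build_manifest(animation_id: str, tc_start: str = "09:59:30:00") -> ET.ElementTree:
    """Build one XML manifest describing every deliverable for a single animation."""
    root = ET.Element("deliverables", attrib={"animation": animation_id})
    for label, fps, res, rng in DELIVERABLE_SPECS:
        timeline = ET.SubElement(root, "timeline", attrib={
            "name": f"{animation_id}_{label}",
            "fps": fps,
            "resolution": res,
            "dynamicRange": rng,
        })
        ET.SubElement(timeline, "timecodeStart").text = tc_start
        ET.SubElement(timeline, "preroll").text = "clock_and_tone"
        ET.SubElement(timeline, "filename").text = f"{animation_id}_{label}.mxf"
    return ET.ElementTree(root)

if __name__ == "__main__":
    build_manifest("WC2018_ANIM_001").write("WC2018_ANIM_001.xml",
                                            encoding="utf-8", xml_declaration=True)
```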

Roundtable Post was able to complete the FIFA 2018 World Cup job, including the client-attend QC of the 1,422 different UHD HDR and HD SDR assets, in under three days.

Sony Pictures Post adds three theater-style studios

Sony Pictures Post Production Services has added three theater-style studios inside the Stage 6 facility on the Sony Pictures Studios lot in Culver City. All studios feature mid-size theater environments and include digital projectors and projection screens.

Theater 1 is set up for sound design and mixing with two Avid S6 consoles and immersive Dolby Atmos capabilities, while Theater 3 is geared toward sound design with a single S6. Theater 2 is designed for remote visual effects and color grading review, allowing filmmakers to monitor ongoing post work at other sites without leaving the lot. Additionally, centralized reception and client services facilities have been established to better serve studio sound clients.

Mix Stage 6 and Mix Stage 7 within the sound facility have been upgraded, each featuring two S6 mixing consoles, six Pro Tools digital audio workstations, Christie digital cinema projectors, 24 X 13 projection screens and a variety of support gear. The stages will be used to mix features and high-end television projects. The new resources add capacity and versatility to the studio’s sound operations.

Sony Pictures Post Production Services now has 11 traditional mix stages, the largest being the Cary Grant Theater, which seats 344. It also has mix stages dedicated to IMAX and home entertainment formats. The department features four sound design suites, 60 sound editorial rooms, three ADR recording studios and three Foley stages. Its Barbra Streisand Scoring Stage is among the largest in the world and can accommodate a full orchestra and choir.

Digging into the dailies workflow for HBO’s Sharp Objects

By Randi Altman

If you have been watching HBO’s new series Sharp Objects, you might have some theories about who is murdering teenage girls in a small Missouri town, but at this point they are only theories.

Sharp Objects revolves around Amy Adams’ character, Camille, a journalist living in St. Louis, who returns to her dysfunctional hometown armed with a deadline from her editor, a drinking problem and some really horrific childhood memories.

Drew Dale

The show is shot in Atlanta and Los Angeles, with dailies out of Santa Monica’s Local Hero and post out of its sister company, Montreal’s Real by Fake. Real by Fake did all the post on the HBO series Big Little Lies.

Local Hero’s VP of workflows, Drew Dale, managed the dailies workflow on Sharp Objects, coming up against the challenges of building a duplicate dailies set up in Atlanta as well as dealing with HBO’s strict delivery requirements — not just for transcoding, but for labeling files and more. Local Hero co-owner Steve Bannerman calls it “the most detailed and specific dailies workflow we’ve ever designed.”

To help cope with such a high level of complexity, Dale turned to Assimilate’s Scratch as the technical heart of his workflow. Since Scratch is a very open system, it was able to integrate seamlessly with all the software and hardware tools that were needed to meet the requirements.

Local Hero’s DI workflow is something that Dale and the studio have been developing for about five or six years and adjusting for each show or film they work on. We recently reached out to Dale to talk about that workflow and their process on Sharp Objects, which was created by Marti Noxon and directed by Jean-Marc Vallée.

Can you describe your workflow with the footage?
Basically, the DIT hands a shuttle RAID (we use either OWC or Areca RAIDs) to a PA, and they’ll take it to our operator. Our operators tend to start as soon as wrap hits, or as soon as lunch breaks, depending on whether you’re doing one or two breaks a day.

We’ll ingest into Scratch and apply the show LUT. The LUT is typically designed by our lead colorist and is based on a node stack in Blackmagic Resolve that we can use on the back end as the first pass of the DI process. Once the LUT is loaded, we’ll do our grades using the CDL protocol, though we didn’t do the grade on Sharp Objects. Then we’ll go through, sync all the audio, QC the footage and make our LTO back-ups.
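
For context, the CDL mentioned here is just ten numbers per shot: slope, offset and power per channel plus an overall saturation. Below is a minimal sketch of the standard ASC CDL math, assuming Rec. 709 luma weights for the saturation step; it is illustrative, not the code running inside Scratch or Resolve.

```python
# Minimal sketch of the ASC CDL transfer: per-channel slope/offset/power, then a
# global saturation adjustment around Rec. 709 luma. Values are assumed to be
# normalized 0..1; real tools also define where in the pipeline the CDL applies.
def apply_cdl(rgb, slope=(1, 1, 1), offset=(0, 0, 0), power=(1, 1, 1), sat=1.0):
    clamp = lambda v: max(0.0, min(1.0, v))
    # Per-channel slope, offset and power
    graded = [clamp(c * s + o) ** p for c, s, o, p in zip(rgb, slope, offset, power)]
    # Saturation around Rec. 709 luma
    luma = 0.2126 * graded[0] + 0.7152 * graded[1] + 0.0722 * graded[2]
    return tuple(clamp(luma + sat * (c - luma)) for c in graded)

# Example: brighten slightly, lift the blacks a touch and pull back saturation
print(apply_cdl((0.18, 0.18, 0.18), slope=(1.1, 1.1, 1.1), offset=(0.02, 0.02, 0.02), sat=0.9))
```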

What are you looking for in the QC?
Things like crew in the shot, hot pixels, corrupt footage, lens flares, just weird stuff that’s going to cost money on the backend. Since we’re working in conjunction with production a lot of the time, we can catch those things reasonably early, a lot earlier than if you were waiting until editorial. We flag those and say, “This scene that you shot yesterday is out of focus. You should probably re-shoot.” This allows them to adjust more quickly to that sort of thing.

After the QC we do a metadata pass, where we take the embedded information from the WAV files provided by the sound mixer, as well as custom metadata entered by our operator and apply that throughout the footage. Then we’ll render out editorial media — typically Avid but sometimes Premiere or Final Cut — which will then get transferred to the editors either via online connection or shipped shuttle drives. Or, if we’re right next to them, we’ll just push it to their system from our computer using a fiber or Ethernet intranet.

We’ll also create web dailies. Web dailies are typically H.264s, and those will either get loaded onto an iPad for the director, uploaded to Pix or Frame.io for web review, or both.

You didn’t grade the dailies on Sharp Objects?
No, they wanted a specific LUT applied; one that was used on the first season of Big Little Lies, and is being used on the second season as well. So they have a more generic look applied, but they do have very specific needs for metadata, which is really important. For example, a lot of the things they require are the input of shoot date and shoot day information, so you can track things.

We also ingest track information from WAV files, so when the editor is cutting the footage you can see the individual audio channel names in the edit, which makes cutting audio a lot easier. It also helps sync things up on the backend with the audio mix. As per HBO’s requests, a lot of extra information in the footage goes to the editor.

The show started in LA and then moved to Atlanta, so you had to build your workflow for a second time? Can you talk about that?
The tricky part of working on location is making sure the Internet is set up properly and getting a mobile version of our rig to wherever it needs to go. Then it’s dealing with the hassle of being on location. I came up in the production world in the camera department, so it reminds me of being back on set and being in the middle of nowhere with a lot less infrastructure than you’re used to when sitting at a post house in Los Angeles. Most of the challenge of being on location is finding creative ways to implement the same workflow in the face of these hurdles.

Let’s get back to working with HBO’s specific specs. Can you talk about different tools you had to call on to make sure it was all labeled and structured correctly?
A typical scene identifier for us is something like “35B-01,” where “35” signifies the scene, “B” signifies the shot and “01” signifies the take.

The way that HBO structured things on Sharp Objects was more by setup, so it was a much more fluid way of shooting. It would be like “Episode 1, setup 32, take one, two, three, four, five.” But each of those takes individually was more like a setup and less like a take itself. A lot of the takes were 20 minutes long, 15 minutes long, where they would come in, reset the actors, reset the shot, that kind of thing.

In addition to that, there was a specific naming convention and a lot of specific metadata required by the editors. For example, the aforementioned WAV track names. There are a lot of ways to process dailies, but most software doesn’t provide the same kind of flexibility with metadata as Scratch.
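
To make the naming and metadata point concrete, here is a small hypothetical sketch; neither Local Hero’s nor HBO’s real conventions are published, so the pattern and field names below are assumptions. The idea is simply that the dailies system has to parse the identifier and carry the extra metadata (shoot day, WAV track names and so on) along with every clip.

```python
# Hypothetical only: the regex and metadata fields are illustrative assumptions,
# not HBO's or Local Hero's actual convention.
import re

SCENE_SHOT_TAKE = re.compile(r"^(?P<scene>\d+)(?P<shot>[A-Z]+)-(?P<take>\d+)$")  # e.g. "35B-01"

def parse_clip(identifier: str, **extra_metadata):
    """Split a scene/shot/take identifier and attach whatever metadata the show requires."""
    match = SCENE_SHOT_TAKE.match(identifier)
    if not match:
        raise ValueError(f"Unrecognized clip identifier: {identifier}")
    clip = {"scene": match["scene"], "shot": match["shot"], "take": int(match["take"])}
    clip.update(extra_metadata)  # e.g. shoot_date, shoot_day, wav_tracks
    return clip

print(parse_clip("35B-01", shoot_date="2017-06-12", shoot_day=14,
                 wav_tracks=["boom", "lav_1", "lav_2"]))
```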

For this show it was these sorts of things, as well as very specific LTO naming conventions and structure, which took a lot of effort on our part to get used to. Typically, with a smaller production or smaller movie, the LTO backups they require are basically just to make sure that the footage is placed somewhere other than our hard drives, so we can store it for a long period of time. But with HBO, very specific manifests are required with naming conventions on each tape as well as episode numbers, scene and take info, which is designed to make it easier for un-archiving footage later for restoration, or for use in later seasons of a show. Without that metadata, it becomes a much more labor-intensive job to track down specific shots and scenes.

HBO also requires us to use multiple LTO brands, so that if one brand suddenly ceases to support the medium or a company goes under, they can still un-archive the footage 30 years from now. I think a lot of companies are starting to move toward future-proofing their footage in case you need to go back and remaster it.

Does that make your job harder? Easier?
It makes it harder in some ways, and easier in others. Harder because there is a lot of material being generated. I think the total count for the show was something like 120TB of footage, which is not an excessive amount for a show this big, but it’s definitely a lot of data to manage over the course of a show.

Could you name some of the tools that you used?
As I mentioned, the heartbeat of all our dailies workflows is Scratch. I really love Scratch for three reasons. First, I can use it to do fully color graded, fully animated dailies with power windows, ramping curves — everything. Second, it handles metadata very well. This was crucial for Sharp Objects. And finally, it’s pretty affordable.

Beyond Scratch, the software that we tend to use most for copying footage is Silverstack. We use that for transferring files to and from the RAID to make sure everything’s verified. We use Scratch for processing the footage; that’s sort of the big nexus of everything. We use YoYottaID for LTO creation; that’s what HBO suggests we use to handle their specific LTO requirements. One of the things I love is the ability to export ALEs directly out of Scratch and into YoYottaID. This saves us time and errors. We use Aspera for transferring files back and forth between HBO and ourselves. We use Pix for web dailies distribution. Pix access was specifically provided to us by HBO.

Hardware wise, we’re mostly working on either Mac Pros or Silverdraft Demon PCs for dailies. We used to use mostly Mac Pros, but we find that they aren’t quite robust enough for larger projects, though they can be useful for mid-range or smaller jobs.

We typically use Flanders monitors for our on-set grading, but we’ve also used Sony and JVC monitors, depending on the budget level and what’s available on hand. We tend to use G-Speed Shuttle XLs for the main on-set RAIDs, and we like to use OWC Thunderbays or Areca Thunderbolt RAIDs for our transfer drives.

What haven’t I asked that is important?
For me it’s important to have tools, operators and infrastructure that are reliable so we can generate trust with our clients. Trust is the biggest thing for me, and the reason we vetted all the software… we know what works. We know it does what we need it to do to be flexible for everybody’s needs. It’s really about just showing the clients that we’ve got their back.

Colorist Asa Shoul joins Warner Bros. De Lane Lea

Warner Bros. De Lane Lea (WBDLL) has added colorist Asa Shoul to its new picture services department, which will launch this September. Shoul’s recent credits include Mission Impossible: Fallout, Baby Driver and Amazon’s Tin Star, and he recently received a BAFTA craft award for his work on the multi-award-winning Netflix series The Crown.

Shoul will be joined by his assistant Katie McCulloch, senior post producer Louise Stewart and online editor Gareth Parry, as well as additional industry-leading creative, technical and operations staff, yet to be announced.

The expansion will include the launch of two new 4K HDR grading theaters, in addition to online suites, mastering, content handling services and dark fibre connectivity for both Dolby UK and Leavesden Studios. A full-service production dailies offering based at Warner Bros. Studios Leavesden will also be launched to help service the needs of productions.

Recent features at WBDLL include Wonder Woman; Three Billboards Outside Ebbing, Missouri; Fantastic Beasts and Early Man. It also has an impressive roster of high-end television clients including Netflix, Amazon, Starz, BBC and ITV.

Last year the company announced it will cement its future in Soho by moving to a new purpose-built post production facility, currently under construction in the centre of the district. WBDLL will be the anchor tenant within the Ilona Rose building, which is slated to open in 2021.

Colorist Bob Festa on Yellowstone’s modern Western look

Paramount Network’s Yellowstone, from creator, writer and director Taylor Sheridan (Sicario, Hell or High Water), is a 10-episode modern-day Western starring Kevin Costner as the patriarch of the Duttons, owners of the largest ranch in the contiguous United States.

The Dutton family is in constant conflict with owners of the property surrounding their land, including developers, an Indian reservation and a national park. The series follows Costner’s character and his dysfunctional children as they navigate their bumpy road.

Cinematographer Ben Richardson and Efilm senior colorist Mitch Paulson already had a color lock on the pilot for Yellowstone, but brought on Encore senior colorist Bob Festa to work on the episodes. “As a Deluxe sister company, it was only natural to employ Encore Hollywood’s television resources,” explains Festa. “I was keen to collaborate with both Ben and Mitch. Mitch then served as supervising colorist.”

Let’s find out more from the veteran colorist.

How did you work with the director and DP?
Honestly, my first discussions with Ben were quite involved and fully articulated. For instance, while Ben’s work on Beasts of the Southern Wild and Wind River spans wildly different-looking projects, shot on different formats, the fundamentals that he shared with me were fully in place in both of those projects, as well as in Yellowstone.

There is always a great deal of talking that goes on beforehand, but nothing replaces collaboration in the studio. I guess I auditioned for the job by spending a full day with Ben and Mitch at Encore. Talk is a cheap abstraction, and there is nothing like the feeling you get when you dim the lights, sit in the chair and communicate with pictures.

The only way I can describe it is it’s like improvising with another musician when you have never played together before. There’s this buildup of ideas and concepts that happens over a few shots, grades get thrown out or refined, layers are added, apprehension gives way to creativity, and a theme takes shape. If you do this over 50 shots, you develop a language that is unique to a given project and a “look” is born.

What was your workflow for this project? What did you use tool-wise on Yellowstone?
ARRIRAW and Resolve were the foundation, but the major lifting came from using a Log Offset workflow, better known as the printer lights workflow. Although printer lights have their roots in a photochemical laboratory setting, the approach has tremendous real-world digital applications. Many feel this relationship to printer lights is very elementary, but the results can be scaled up very quickly to build an amazingly natural and beautiful grade.

The Resolve advanced panel can be remapped to use an additional fourth trackball as a fuel-injected printer light tool that is not only very fast and intuitive, but also exceptionally high quality. The quality angle comes from the fact that Log Offset grading works in a fashion that keeps all of the color channels moving together during a grade. All curves work in complete synchronicity, resulting in a very natural transition between the toe and the knee, and the shoulder and head of the grade.

This is all enhanced using pivot and contrast controls to establish the transfer characteristic of a scene. There is always a place for cross process, bleach bypass and other twisted aggressive grades, but this show demanded honest emotion and beauty from the outset. The Log Offset workflow delivered that.
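
For anyone unfamiliar with the printer-lights idea, the underlying math is simple. In a log encoding, adding the same value to every channel behaves like an exposure change on the negative, while unequal offsets shift the color balance, which is exactly what printer points did photochemically; pivot and contrast are then a scale around a chosen grey point in log space. The sketch below shows only that general technique, not Resolve’s internal implementation, and the 0.435 pivot is just a commonly used mid-grey anchor.

```python
# Sketch of the general Log Offset / printer-lights math, not Resolve's code.
def log_offset(log_rgb, offset):
    """Equal offsets shift exposure; unequal offsets shift color balance (printer points)."""
    return tuple(c + o for c, o in zip(log_rgb, offset))

def contrast_pivot(log_rgb, contrast=1.0, pivot=0.435):
    """Stretch or compress log values around a pivot (0.435 is a common mid-grey anchor)."""
    return tuple((c - pivot) * contrast + pivot for c in log_rgb)

# Warm a log-encoded mid-tone slightly, then add a touch of contrast
shot = (0.41, 0.43, 0.45)
graded = contrast_pivot(log_offset(shot, (0.02, 0.0, -0.02)), contrast=1.1)
print(graded)
```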

What inspired the look of Yellowstone? Are there any specific film looks it is modeled after?
As a contemporary western, you can draw many correlations to cinematic looks from the past, from Sergio Leone to Deadwood, but the reality is the look is decidedly modern western.

In the classic film world, the look is very akin to a release print, or in the DI world it emulates a show print (generationally closer to the original negative). The look demands that the curves and saturation are very high quality. Ben has refined an ARRI LUT that really enhances the skies and flesh tones to create a very printy film laboratory look. We also use LiveGrain, for the most part with a 35mm 5219 emulation for night shots and a 5207 look for day exteriors, to create texture. That is the Yellowstone recipe.

How did you approach the sweeping landscape shots?
Broad, cinematic and we let the corners bleed. Vignettes were never used on the wide vistas. The elements are simple: you have Kevin Costner on a horse in Montana. The best thing I can think of is to follow the medical credo of “do no harm.”

What was the most challenging aspect of coloring Yellowstone?
Really just the time constraints. Coordinating with the DP, the VFX teams and the post crew on a weekly basis for color review sessions is hard for everyone. The show is finished week by week, generally delivering just days before air. VFX shots are dropped in daily. Throw in the 150 promos, teasers and trailers, and scheduling that is a full-time job.

Other than color, did you perform any VFX shots?
Every VFX vendor supplied external mattes with their composites. We always color composite plates using a foreground and a background grade to serve the story. This is where Resolve’s external matte node structure can be a lifesaver.

What is your favorite scene or scenes?
I have to go with the opening of episode one, the pilot. That opening shot sets the tone for the entire series. The first time I saw it, my jaw dropped from both a cinematography and a story standpoint. If you have seen the show, you know what I’m talking about.

Review: Blackmagic’s Resolve 15

By David Cox

DaVinci Resolve 15 from Blackmagic Design has now been released. The big news is that Blackmagic’s compositing software Fusion has been incorporated into Resolve, joining the editing and audio mixing capabilities added to color grading in recent years. However, to focus just on this would hide a wide array of updates to Resolve, large and small, across the entire platform. I’ve picked out some of my favorite updates in each area.

For Colorists
Each time Blackmagic adds a new discipline to Resolve, colorists fear that the color features take a back seat. After all, Resolve was a color grading system long before anything else. But I’m happy to say there’s nothing to fear in Version 15, as there are several very nice color tweaks and new features to keep everyone happy.

I particularly like the new “stills store” functionality, which allows the colorist to find and apply a grade from any shot in any timeline in any project. Rather than just having access to manually saved grades in the gallery area, thumbnails of any graded shot can be viewed and copied, no matter which timeline or project they are in, even those not explicitly saved as stills. This is great for multi-version work, which is every project these days.

Grades saved as stills (and LUTS) can also be previewed on the current shot using the “Live Preview” feature. Hovering the mouse cursor over a still and scrubbing left and right will show the current shot with the selected grade temporarily applied. It makes quick work of finding the most appropriate look from an existing library.

Another new feature I like is called “Shared Nodes.” A color grading node can be set as “shared,” which creates a common grading node that can be inserted into multiple shots. Changing one instance, changes all instances of that shared node. This approach is more flexible and visible than using Groups, as the node can be seen in each node layout and can sit at any point in the process flow.

As well as the addition of multiple play-heads, a popular feature in other grading systems, there is a plethora of minor improvements. For example, you can now drag the qualifier graphics to adjust settings, as opposed to just the numeric values below them. There are new features to finesse the mattes generated from the keying functions, as well as improvements to the denoise and face refinement features. Nodes can be selected with a single click instead of a double click. In fact, there are 34 color improvements or new features listed in the release notes.

For Editors
As with color, there are a wide range of minor tweaks all aimed at improving feel and ergonomics, particularly around dynamic trim modes, numeric timecode entry and the like. I really like one of the major new features, which is the ability to open multiple timelines on the screen at the same time. This is perfect for grabbing shots, sequences and settings from other timelines.

As someone who works a lot with VFX projects, I also like the new “Replace Edit” function, which is aimed at those of us that start our timelines with early drafts of VFX and then update them as improved versions come along. The new function allows updated shots to be dragged over their predecessors, replacing them but inheriting all modifications made, such as the color grade.

An additional feature to the existing markers and notes functions is called “Drawn Annotations.” An editor can point out issues in a shot with lines and arrows, then detail them with notes and highlight them with timeline markers. This is great as a “note to self” to fix later, or in collaborative workflows where notes can be left for other editors, colorists or compositors.

Previous versions of Resolve had very basic text titling. Thanks to the incorporation of Fusion, the edit page of Resolve now has a feature called Text+, a significant upgrade on the incumbent offering. It allows more detailed text control, animation, gradient fills, dotted outlines, circular typing and so on. Within Fusion there is a modifier called “Follower,” which enables letter-by-letter animation, allowing Text+ to compete with After Effects for type animation. On my beta test version of Resolve 15, this wasn’t available in the Edit page, which could be down to the beta status or an intent to keep the Text+ controls in the Edit page more streamlined.

For Audio
I’m not an audio guy, so my usefulness in reviewing these parts is distinctly limited. There are 25 listed improvements or new features, according to the release notes. One is the incorporation of Fairlight’s Automated Dialog Replacement processes, which creates a workflow for the replacement of unsalvageable originally recorded dialog.

There are also 13 new built-in audio effects plugins, such as Chorus, Echo and Flanger, as well as de-esser and de-hummer clean-up tools.

Another useful addition both for audio mixers and editors is the ability to import entire audio effects libraries, which can then be searched and star-rated from within the Edit and Fairlight pages.

Now With Added Fusion
So to the headline act — the incorporation of Fusion into Resolve. Fusion is a highly regarded node-based 2D and 3D compositing software package. I reviewed Version 9 in postPerspective last year [https://postperspective.com/review-blackmagics-fusion-9/]. Bringing it into Resolve links it directly to editing, color grading and audio mixing to create arguably the most agile post production suite available.

Combining Resolve and Fusion will create some interesting challenges for Blackmagic, who say that the integration of the two will be ongoing for some time. Their challenge isn’t just linking two software packages, each with their own long heritage, but in making a coherent system that makes sense to all users.

The issue is this: editors and colorists need to work at a fast pace, and want the minimum number of controls clearly presented. A compositor needs infinite flexibility and wants a button and value for every function, with a graph and ideally the ability to drive it with a mathematical expression or script. Creating an interface that suits both is near impossible. Dumbing down a compositing environment limits its ability, whereas complicating an editing or color environment destroys its flow.

Fusion occupies its own “page” within Resolve, alongside pages for “Color,” “Fairlight” (audio) and “Edit.” This is a good solution insofar as each interface can be tuned for its dedicated purpose. The round trip into Fusion also works very well: a user can seamlessly move from Edit to Fusion to Color and back again, without delays, rendering or importing. If a user is familiar with Resolve and Fusion, it works very well indeed. If the user is not accustomed to high-end node-based compositing, then the Fusion page can be daunting.

I think the challenge going forward will be how to make the creative possibilities of Fusion more accessible to colorists and editors without compromising the flexibility a compositor needs. Certainly, there are areas in Fusion that can be made more obvious. As with many mature software packages, Fusion has the occasional hidden right-click or alt-click function that is hard for new users to discover. But beyond that, the answer is probably to let a subset of Fusion’s ability creep into the Edit and Color pages, where more common tasks can be accommodated with simplified control sets and interfaces. This is actually already the case with Text+, a Fusion “effect” that is directly accessible within the Edit section.

Another possible area to help is Fusion Macros. This is an inbuilt feature within Fusion that allows a designer to create an effect and then condense it down to a single node, including just the specific controls needed for that combined effect. Currently, Macros that integrate the Text+ effect can be loaded directly in the Edit page’s “Title Templates” section.

I would encourage Blackmagic to open this up further to allow any sort of Macro to be added for video transitions, graphics generators and the like. This could encourage a vibrant exchange of user-created effects, which would arm editors and colorists with a vast array of immediate and community sourced creative options.

Overall, the incorporation of Fusion is a definite success in my view, whether used to empower multi-skilled post creatives or to provide a common environment for specialized creatives to collaborate. The volume of updates, and the speed at which the Resolve software developers address the issues exposed during public beta trials, remain nothing short of impressive.


David Cox is a VFX compositor and colorist with 20-plus years of experience. He started his career with MPC and The Mill before forming his own London-based post facility. Cox recently created interactive projects with full body motion sensors and 4D/AR experiences.

Quick Chat: Freefolk colorist Paul Harrison

By Randi Altman

Freefolk, which opened in New York City in October 2017, was founded in London in 2003 by Flame artist Jason Watts and VFX artist Justine White. Originally called Finish, they rebranded to Freefolk with the opening of their NYC operation. Freefolk is an independent post house that offers high-end visual effects, color grading and CG for commercials, film and TV.

We reached out to global head of color grading Paul Harrison to find out his path to color and the way he likes to work.

What are your favorite types of jobs to work on and why?
I like to work on a mix of projects and not be pigeonholed as a particular type of colorist. Commercials are my main work, but I also work on music videos and the odd feature or longform piece. Each form has its own creative challenges, and I enjoy all disciplines.

What is your tool of choice, and why?
I use the FilmLight Baselight color system because it’s extremely versatile and will cope with any file format one cares to mention. On so many levels it allows a colorist to get on with the job at hand and not be bogged down by the kit’s limitations. The toolset is extensive, and it doesn’t put boundaries in the way of creativity the way other systems I’ve used do.

Are you often asked to do more than just color?
These days, because of the power of the systems we use, the lines are blurring between color and VFX. On most jobs I do things that used to be the realm of the VFX room. Things like softening skin tones, putting in skies or restoring elements of the image that need to be treated differently from the rest of the image.

Traditionally, this was done in the VFX room; now we do it as part of the grade. When there are more difficult or time-consuming fixes required, the VFX artists will do that work.

How did you become a colorist? What led you down this path?
I started as a runner at The Mill in London. I had always had a keen interest in photography, art and film, so this was the natural place for me to go. I was captivated by the mystery of the telecine suites; they looked hideously complex to operate. It was a mix of mechanical machinery, computers, film and various mixers and oscilloscopes, and it spoke to the technical, “how does this work?” side of my brain, and the creative, photography/art side too.

Making all the various bits of equipment that comprised a suite work together and talk to each other was a feat in itself.

Do you have a background in photography or fine art?
I’ve been a keen photographer for years, both on land and underwater. I’ve not done it professionally; it’s just grown through the influence of my work and interests.

In addition to your photography, where do you find inspiration? Museums? Films? A long walk?
I find inspiration from lots of different places — from hiking up mountains to diving in the oceans observing and photographing the creatures that live there. Or going for a walk in all weathers, and at all times of the year.

Art and photography are passions of mine, and seeing the world through the eyes of a talented photographer or artist, absorbing those influences, makes me constantly reassess my own work and what I’m doing in the color room. Colorists sometimes talk about learning to “see.” I think we take notice of things that others pass by. We notice what the “light” is doing and how it changes our environment.

If you had three things to share with a client before a project begins, what would that be?
Before a project begins? That’s a tough question. All I could share would be my vision for the look of the film and any references I have to illustrate my ideas. Maybe we’d also talk about any new or interesting cameras or lenses I’ve seen lately.

How do you prefer getting direction? Photos? Examples from films/TV?
Photos are always good at getting the message across. They describe a scene in a way words can’t. I’m a visual person, so that’s the preferred way for me. A conversation also imparts a feeling for the film, though obviously that is more open to interpretation.

Do you often work directly with the DP?
DPs seem to be a rarer sight these days. It’s great when one has a good relationship with a DP and there’s that mutual trust in each other.

Is there a part of your job that people might not realize you do? Something extra and special that is sort of below the line?
Yes. Fixing things that no one knows are broken, whether it’s sorting out dodgy exposures/camera faults or fixing technical problems with the material. Colorists and their assistants make the job run smoothly and quietly in the background, outside of the color room.

What project are you most proud of?
Certain jobs stand out to me for different reasons. I still love the look of 35mm, and those jobs will always be favorites. But I guess it’s the jobs that I’ve had complete creative freedom on, like the Stella, Levi’s and Guinness commercials, or some of the music videos like Miike Snow. To be honest, I don’t really have a top project.

Can you name some projects that you’ve worked on recently?
Since moving over to NYC recently, I’ve worked on some projects that I knew of before, and some I had no idea existed. Like a Swiffer — I had no idea what that was before working in NYC. But I’ve also graded projects for Cadillac, Bud Light, New York Yankees, Lays, State Farm and Macy’s, to name a few.