
Category Archives: Color Grading

This Is Us: Talking with showrunner Dan Fogelman

By Iain Blair

In a time when issues of diversity and social change are at the forefront of society’s collective conversation, the Emmy Award-winning series This Is Us has proved to be very timely. Created by Dan Fogelman, produced by 20th Century Fox Television and airing on NBC and Hulu, the show chronicles the Pearson family across the decades: from Jack (Milo Ventimiglia) and Rebecca (Mandy Moore) as young parents in the 1980s to their kids Kevin, Kate and Randall searching for love and fulfillment in the present day.

Dan Fogelman

Fogelman’s TV credits include The Neighbors, Pitch and Like Family. He’s written film screenplays for Pixar’s Cars and Disney’s Bolt and Tangled. His live-action film credits include the screenplays for Last Vegas; Crazy, Stupid, Love; and the semi-autobiographical The Guilt Trip. He also directed and wrote the features Danny Collins and Life Itself.

I recently talked with Fogelman about making the show, the challenges and why he loves post. Post supervisor Nick Pavonetti also joined the conversation.

You finished Season 4 just before the COVID crisis. Have you started Season 5?
Yes, and we have a pretty unusual process. We’ve had early pickups for the show, which allows us to jump right into the next season at the end of the last one in terms of storytelling. So we’ve already mapped out a lot of it and written quite a lot, and we’re way ahead, which helps us with both production and post.

Where do you shoot, and what cameras do you shoot on?
We shoot on stages at Paramount using ARRI Alexa cameras. It’s a two-camera setup — A and B — and our shooting style is pretty voyeuristic. This was established right back in the pilot. We like to put you right inside the room with the family. It’s not that super-hand-held, shaky, on-the-ground action look.

We try to really get inside with the characters and cross-shoot where possible, as it allows for the natural moments to play out with multiple angles, as opposed to trying to manufacture them for a second position. We have an amazing DP, Yasu Tanida, who works with the directors to find the frames that allow us to use this setup. But for specialized episodes — like the Vietnam battle sequence, the concert or the episode that was set entirely in a waiting room — we’ll use three or four cameras, but that’s very rare.

Do you like being a showrunner?
I love it. It’s the best job ever, but it’s difficult, challenging and relentless in terms of the schedule. When I’m exhausted, I often fantasize about jobs that allow you to clock off at 5pm. That’s not this gig. But I started down this path because I wanted to have the final word on the page and the final edit of this thing you love.

You have a giant crew and giant cast. Are those the big challenges of running this show?
Yeah, it’s a huge army of super-talented people. The big challenge is storytelling, because on this show it’s really complicated: you’re not just telling one linear story a week but often five or six, all in just 42 minutes. And we have seasons that are interconnected across multiple time periods — up to six. So keeping track of all that while staying focused on one character, one storyline, one time period is the real challenge.

Where do you post?
We do the editing on the lot at Paramount and have three editors and their Avid bays, which are conveniently close to our writers’ room and Nick Pavonetti’s post team. We do our mixing at Technicolor on the lot and the color timing at Technicolor at Sunset Gower; our sound editorial is done at Smart Post Sound with supervising sound editor Randy Thomas.

Do you like post?
Honestly, it’s my favorite part of the whole process. I’m a writer by trade, and post is all about rewriting. I spend very little time on set because when I go, I find very little I can add, as everyone knows what they’re doing. I spend a lot of time writing the scripts and working with writers on theirs, and then with the editors, as you’re essentially writing in the edit bay sometimes. I have a hard time letting anyone else take control in the edit bay.

Besides dealing with all the characters, storylines and time periods, what are the big editing challenges?
Timing and pacing. After a first cut, a typical episode tends to come in about 10 minutes longer than NBC’s very strict run time of 42 minutes and 30 seconds, which is what we have to hit. So we have to reshape the story and maybe cut down my overly long monologues — but what remains still has to feel like part of a whole.

This show has a great score and great sound design. Talk about the importance of sound and music, and working with supervising sound editor Randy Thomas.
That’s another part of post I love — playing with the score by composer Siddhartha Khosla, which is such a vital part of the show’s emotion and power. Even without picture, it stirs real emotion. Then I drive Nick crazy talking about the mix since we have a lot of music — a lot of needle drops, a lot of score — but all the dialogue is crucial too, so finding that balance in the mix takes a lot of time and effort to make it all sing together. Randy is so good at all that.

What about all the VFX? What’s involved?
Nick Pavonetti: It’s quite complicated. We’re this little family drama, but there’s a huge amount of VFX work that’s quite delicate and subtle — aging and de-aging characters. We have an in-house VFX coordinator, Jim Owens, and an in-house artist, Josh Bryson, who’ve really helped us get the VFX to the high level we want. That team will probably grow next year. So they’re right with us in the edit room and going through cuts in progress. We use a bunch of VFX companies — Ingenuity, Technicolor, CBS Digital, Big Little Panda, Inviseffects and Parker Mountain.

Nick, what are the big challenges in post for you?
It’s a big show and just getting all the pieces together on time in post is very demanding. As Dan said, we’re always trying to cut stuff down and we may be doing reshoots at the last minute and then having to drop that in. It’s not like a Netflix show where it’s all done six months in advance. We’ve mixed Saturday and Sunday for Tuesday air. That’s a very tight schedule.

Dan, where do you do the DI, and how closely do you work with colorist Tom Forletta?
It’s not really in my wheelhouse, so I trust Nick, Tom and our DP and their judgment on all that. But if we’re doing an episode set in Vietnam, for instance, where we’re doing a lot of really heavy VFX, I want to make sure it all looks real in the final color, so I’ll be more involved.

There’s a lot of talk about the lack of diversity in the entertainment business, but you recruited diverse behind-the-scenes talent, including black directors like George Tillman Jr. and Regina King, and black female writers like Kay Oyegun and Jas Waters. Why did that matter to you?
Well, this show is meant to be about the collective human experience in this country, so you’d like the people working on it to reflect that — and you’d like it to be like that on any show, and I feel we all still have a ways to go.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

MisterWives’ Rock Bottom music video: light, dark and neon

American indie pop band MisterWives’ Rock Bottom video was made to promote the band’s first single off its upcoming album. In the video, the band’s lead singer, Mandy Lee, is seen walking on the sands and hills of a beach before walking through a mirror to find the rest of her band on a dance floor. The video sets neon colors and varied textures against the dark, gray backgrounds of the opening, as Lee moves from dark times to breaking through the mirror and shining with her band on a swirling dance floor.

To capture the look the band wanted, production was done in different locations at different times of day. This included shooting on a remote beach and in the California desert, into which director and colorist Jade Ehlers and his small crew had to hand-carry all of their camera gear and lighting, including a 100-pound mirror. Ehlers color graded the piece on Resolve and edited on Adobe Premiere.

“We wanted to go for a darker tone, with the neon colors in the darkness that showed that light can shine through even the dark times. The song is about showing it is more about the journey to get to the end of the tunnel than just sitting in the dark times, and the video had to capture that perfectly,” Ehlers says.

The video was shot with a Blackmagic Pocket Cinema Camera 6K, which was chosen because of its small size, high dynamic range and ability to shoot in low light, all essential requirements that allowed Ehlers to shoot at the locations that were best for the song.

“Honestly, because of how different all our scenes were, I knew we needed a camera that had great low light that would allow us to be sparing with light since this shoot had a lot of hiking involved,” he says. “The beach location was quite crazy, and we hiked all of the gear in, so having a small camera bag to carry everything in was great.”

Throughout the video, Ehlers had to adjust for different textures and unexpected lighting problems — including lighting the lead singer’s bright-green puffy dress against a gray background in the desert. Another challenge came from shooting the dance floor scenes, wherein the black floor was not putting out as much light as expected. To compensate and get the shots, Ehlers used the camera’s 13 stops of dynamic range and dual native ISO up to 25,600 along with the Blackmagic Raw codec for high-quality, lifelike color images and skin tones.

“Because of the bit range of the camera’s sensor, I was able to qualify the dress to make it pop a bit more, which was amazing and saved me a lot of extra work. And the dance floor scenes were great but were also harder than we imagined, so we had to push the camera higher on the ISO to get the exposure we needed,” concludes Ehlers.


Getting creative to shoot Eckrich sausage spot during COVID

By Randi Altman

During this pandemic, many brands have found creative ways to keep producing commercials. Some with recurring characters, like Progressive, have had actors shoot from their homes while taking part in a Zoom-like call. The creatives behind sausage brand Eckrich went in a different direction — San Francisco-based production and post company Kaboom used some staff and their families to create the live-action spot, Eckrich Zoom Family Dinner.

While agency Trofia came up with the concept of the families doing the Zoom dinner, Kaboom figured out how to design a workable approach to the idea, developing it to suit the restrictions while still giving it a warm and robust feeling. Kaboom’s Joe Stevens and Doug Werby directed and acted as DPs along with Chris Saul. Werby also served as editor. Lauren Schwartz was EP for Kaboom.

The spot features a handful of family members ordering food, including Eckrich sausages, online. When the deliveries come, the viewer watches these families prepare a variety of sausage-based meals on the grill, on the stove and in the oven. When complete, they all turn on their electronics and share their meals with their extended family while social distancing.

In terms of capture, Werby shot on a Sony Venice in San Francisco, Saul (who is a freelance DP) on an ARRI Alexa Mini in Los Angeles, and Stevens on a Red Dragon in Portland.

Doug Werby (center) and family

Werby cut on Adobe Premiere, and Swells Studio’s Sean Wells graded remotely in Blackmagic DaVinci Resolve.

We reached out to Werby to get some details from the unconventional shoot.

This was a great solution for creating new content during the shutdown. How did it start?
It was the beginning of COVID, and we were quarantined at home, so we jumped at the chance to do creative work. Our home has been used as a location for several still and film shoots, so we were used to the process, although wearing three hats was a little challenging.

Generally, on set I only direct, but this day I was shooting, directing and acting (if you can even call it that). My plan was to keep the day as simple as possible and only capture what I knew I would need in the edit.

How did you make it work?
We used this great company called PowerRemote to troubleshoot and deliver an interactive video shoot to New England, where our head of production, Steven Sills, was located. After a half-hour Zoom tutorial a few days before the shoot, a sanitized Sony Venice was delivered to our house with a webcam and an iPad so I could be in constant contact with my remote DP/technical director, Petr Stepanek, who was monitoring the camera, the signal and me. In the end, we captured everything we needed at a beautiful time of day.

How did your family get involved and were they prepared for their “15 minutes”?
I live with professional photographer Sylvie Gil. She is an avid cook who has an Instagram cooking presence called Voila bon_appetit, so asking her to be in the spot was a no-brainer.

As for Ulysse, Sylvie’s son, he’s done other video work with me before, and he had just graduated college during the first weeks of COVID, so he was game.

To be honest, it was one of the better days during shelter in place.


DP James Whitaker on Amazon’s Troop Zero

The Amazon Studios film Troop Zero follows a bright and quirky young girl named Christmas and her eccentric friends on their quest to become a Birdie Scout troop and travel to Jamboree to take part in a science competition. Christmas’ mother nurtured her into believing that meteors and shooting stars were messages from the heavens above, so when NASA announces the Golden Record program at Jamboree, she knows she needs to infiltrate the high-and-mighty Birdie Scout youth group in order to enter the talent show and get the chance to win and to have her voice heard throughout the stars.

James Whitaker

This comedy-drama, which stars Viola Davis, Allison Janney, Jim Gaffigan and Mckenna Grace, is helmed by the female directing team Bert and Bertie from a screenplay written by the Oscar-winning Beasts of the Southern Wild co-writer Lucy Alibar. It was inspired by Alibar’s 2010 play Christmas and Jubilee Behold the Meteor Shower.

The small-budget Troop Zero was captured over 28 days across multiple locations around New Orleans in settings made to look and feel like the sweltering summer experience common in rural Georgia during the mid-‘70s.

DP James Whitaker, ASC (The Cooler, Captain America: Civil War, Thank You for Smoking, Patriot), knowing they had a limited budget and time, meticulously scouted the locations ahead of time, blocking scenes and planning lens choices to best address the style and action the directors wanted to convey during the shoot. Working closely with a camera team of veterans — first AC Bryan DeLorenzo, key grip Charles Lenz and gaffer Allen Parks — they were able to light the way and set the mood for the production. Troop Zero’s main camera was an ARRI Alexa SXT.

“Using a lookup table that had been gifted to me by Sean Coleman at CO3 as a starting point, I worked closely with the digital imaging technician, Adrian Jebef, to shape this into our show LUT,” explains Whitaker. “Adrian then applied the LUT across a Sony 24-inch calibrated monitor and then routed this signal to the director’s monitors, including to the video village and the video assist.

“The signal was sent to the entire set so that the established look was presented to everyone — from hair and makeup to costume and wardrobe — to make sure there were no questions on what the picture would look like,” he continues. “With limited time and multiple locations, Adrian would adjust the looks from scene to scene with CDLs or Printer Light adjustments, and these looks were given to dailies colorist Alex Garcia from Light Iron, working near set on location on Resolve. Alex would balance these looks across the multiple cameras and keep things consistent. These looks were then delivered to editorial and posted to PIX for review.”
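For readers who want the nuts and bolts: an ASC CDL is just four per-channel numbers plus saturation, and the transform itself is tiny. Here is a minimal Python sketch of the standard CDL math — illustrative only, with made-up values, not the show’s actual pipeline.

```python
# Minimal sketch of the ASC CDL transform (slope, offset, power, saturation).
# Values are hypothetical; this is illustrative, not the show's pipeline.
import numpy as np

def apply_cdl(rgb, slope, offset, power, saturation=1.0):
    """rgb: float array shaped (..., 3), values nominally in 0-1."""
    rgb = np.asarray(rgb, dtype=np.float64)
    out = rgb * np.asarray(slope) + np.asarray(offset)   # per-channel slope/offset
    out = np.clip(out, 0.0, 1.0) ** np.asarray(power)    # per-channel power
    luma = out @ np.array([0.2126, 0.7152, 0.0722])      # Rec. 709 luma weights
    out = luma[..., None] + saturation * (out - luma[..., None])
    return np.clip(out, 0.0, 1.0)

# Example: a gentle warm-up -- raise red slope, lower blue, tiny lift
print(apply_cdl(np.array([[0.18, 0.18, 0.18]]),
                slope=[1.05, 1.00, 0.95], offset=[0.01, 0.01, 0.01],
                power=[1.0, 1.0, 1.0]))
```

Printer-light adjustments are similar in spirit: fixed-size exposure steps applied per channel upstream of the LUT.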

Whitaker enjoyed working with Bert and Bertie — sometimes Bert would be directing the talent while Bertie would be able to discuss the camera moves for the next setup, and the next day they might switch roles. “The Berts were really into the idea of formal framing, but they also wanted to mix it up,” he explains.

“We looked at a bunch of different films as references but didn’t really find what we liked, so we created a visual language of our own. I used the Vantage MiniHawk lenses. They have an anamorphic look and come with all the good things I wanted — they are fast, and they are light. They have two apertures that allow you to have anamorphic-like distortion in the bokeh, but they are actually spherical lenses. This allowed me to use a short focal length lens for a wide shot and have the actors run into closeup. The close focus is basically the front element of the lens, which is amazing.”

There’s a particularly great food fight scene between the members of the titular Troop Zero and the rival group of Birdie Scouts, wherein the use of slow motion perfectly captures how a group of precocious misfits would envision the experience. It’s like an epic battle in the World War of Girl Scouts, with flour raining down around everyone as someone runs by wielding a soaked eggbeater, spraying everyone in range with rapid-fire batter bullets, while another scout takes a bowl of rainbow sprinkles to the face. The slow-motion intensity was captured at high frame rate with the ARRI Alexa SXT camera system using the Codex SXR capture media. Using a combination of dolly and hand-held shots that move the viewer through the action, the motion feels smooth and the images are in focus throughout.

“When I first sat down with the Berts and Corinne Bogdanowicz at Light Iron to grade Troop Zero, we had so much range in the image. This is why ARRI cameras are my first choice,” he says. “You have this large 3.4K filmic image in raw that we could push wherever we wanted. We started warming it up, making it less saturated and windowing various parts of the skies and faces. After a bit of this, we sat back and said, ‘This doesn’t feel like it is servicing the story we wanted to tell.’ Sometimes you need to simply go back to basics.

“We started from the beginning using the same LUT that we had on set, and then Corinne did a basic Printer Light grade (in Resolve) to start, and it looked pretty much like what we had viewed on the monitors during the shoot. We skewed a bit from the original CDL values, but the overall feel of the look was very close in the end.”

“Working with a Codex raw workflow is an easy sell for me. The earlier concerns from a producer about the cost of the capture drives and the time it takes a DIT to back up the data have seemingly gone away. Codex is so fast and robust that I never get a pushback in shooting raw on a production. The last two TV shows I shot — Season 2 of Patriot and Perpetual Grace, LTD — were both captured on Codex in ARRIRAW. I just bought an ARRI Alexa Mini LF with the new compact drives, and I am looking forward to using this when we get back to work.”


How CBS’ All Rise went remote for season finale

By Daniel Restuccio

When the coronavirus forced just about everything to shut down back in mid-March, many broadcast television series had no choice but to make their last-shot episodes their season finales. Others got creative.

Producer Dantonio Alvarez

While NBC’s The Blacklist opted for a CG/live-action hybrid to end its season, CBS’ courtroom drama, All Rise, chose to address the shutdown head-on with a show that was shot remotely. When CBS/Warner Bros. shut down production on All Rise, EPs Michael M. Robin and Len Goldstein — along with EP/co-showrunners Greg Spottiswood and Dee Harris-Lawrence — began brainstorming the idea of creating an episode that reflected the current pandemic crisis applied to the justice system.

Co-producer Dantonio Alvarez was deep into remote post on the already-shot Episodes 19 and 20 when Robin called him. Robin and consultant Gil Garcetti had looked into how the court system was handling the pandemic and decided to pitch an idea to Warner Bros.: a remote episode of All Rise done via a Zoom-like setup. Alvarez was relieved; it meant a lot of the crew — 50 of the usual 90-person team — could keep working.

In a week’s time, Spottiswood and co-executive producer Greg Nelson wrote the 64-page script that focused on the complications around a virtual bench trial and the virus-jammed court system.

The Logistics
Producer Ronnie Chong reached out to Jargon Entertainment’s Lucas Solomon to see how he could help. Jargon, which provides on-set playback and computer graphics, had been working with network solutions company Straight Up Technologies (SUT) on other projects. Solomon brought SUT into the mix. “We figured out a way to do everything online and to get it to a point where Mike Robin could be at home directing everybody,” he explains.

Straight Up Technologies offers a secure and proprietary broadband network with a broadcast-quality ISP backbone that can accommodate up to 200 simultaneous video feeds at 1920×1080 at 30fps and do 4K (3840×2160 or 4096×2160). For All Rise to record at 1920×1080, each actor needed a network upload speed of 5Mb/s for no lag or packet loss. If the producers had decided to go 4K, it would have needed to be triple that.
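Those figures imply simple aggregate math. A back-of-envelope sketch (the per-feed rates come from SUT’s numbers above; the totals are my extrapolation):

```python
# Back-of-envelope bandwidth arithmetic from the figures above.
# Per-feed rates are from the article; aggregates are simple extrapolation.
PER_FEED_HD_MBPS = 5                      # 1080p30 upload per actor
PER_FEED_4K_MBPS = 3 * PER_FEED_HD_MBPS   # "triple that" for 4K

for feeds in (22, 200):                   # 22 cast members; 200-feed ceiling
    hd = feeds * PER_FEED_HD_MBPS
    uhd = feeds * PER_FEED_4K_MBPS
    print(f"{feeds:3d} feeds -> {hd:5d} Mb/s in HD, {uhd:5d} Mb/s in 4K")
```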

Prep started the week of April 10, with Solomon, Alvarez, DP David Harp, Robin and the SUT IT team doing Zoom or WebEx scouts of the actors’ homes for suitable locations. They also evaluated each home’s bandwidth, making a list of what computers and mobile devices everyone had.

“You’re only as good as the connection out of your house and the traffic around your house,” explains SUT’s John Grindley. They used what was in the actors’ houses and enhanced the connection to their network when necessary. This included upgrading the basic download/upload data plan, going from 4G to 5G, putting in signal boosters, adding hard lines to computers and installing “cradle points” — high-end Wi-Fi hotspots — if needed.


Cinematographer Harp set out to find which areas of the cast’s houses helped tell the story. He asked things like: What was the architecture? What kind of lights did they have in the room? Were they on dimmers? Where were the windows, and what were the window treatments like? The answers to those questions determined Harp’s lighting package. He sent small battery-powered ring lights to the cast along with tripods for their iPhones, but mostly they worked with what they had. “We decided that we’re not going to get cameras out to anybody,” explains Alvarez. “We were going to use people’s phones and their home computers for capture.”

As a result, all 22 cast members became camera operators, grips and essentially one-person guerrilla film crews. Their gear consisted of MacBook Pros, MacBook Airs, iPhones and Cisco DX70s. Harp controlled exposure on the computers by moving lights around and positioning the actors.

Solomon set up his video assist system, QTake, at his shop in Valencia, with 400Mb/s download and 20Mb/s upload bandwidth to record all the feeds. “We set up two other recording locations — one in Hollywood and one in Chatsworth — as redundancy.”

Production Begins
On Friday, April 17, day one of the six-day shoot, a five-person engineering crew at the COVID-safe SUT offices in San Francisco, Seattle and El Segundo fired up the network, checked the call sheet and connected to the crew.

Actors Jessica Camacho (Emily Lopez) and Lindsay Mendez (Sara Castillo) logged into join.sutvideo.com on a MacBook Pro and an iPhone, respectively. Their signal strength was good, so they shot their scene.

According to Straight Up Technologies CTO Reinier Nissen, the engineers set up virtual spaces, or “talent rooms,” for each actor and a “main stage” room where “talent rooms” were nested and scenes were played out. Every actor’s camera and mic feeds were married and recorded as individual signals. The “main stage” could be configured into a split-screen “Zoom-like” grid with inputs from any of the actors’ feeds. Some of the virtual spaces were control rooms, like a video village, where crew and IT could see all the actors, give technical and creative direction, monitor the signals, manage network traffic and control whose video and audio were on or muted.

The Cisco DX70s natively output 1920×1080 at 30fps. The MacBook Pro and Air 1280×720 camera feeds were upscaled in the sutvideo.com system to 1920×1080 at 30fps. The iPhones, though 4K-capable, were set to 1920×1080 at 30fps. Solomon recorded both the split-screen main stage and the individual actor talent-room streams to his QTake system in QuickTime ProRes at 1920×1080, recalibrated the frame rate to 23.976 and added timecode.
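The article doesn’t spell out how the 30fps recordings were brought to 23.976, so the sketch below only works out the two conversion factors implied above — the 720p-to-1080p upscale and the duration change if the material were straight-conformed (the same frames reinterpreted at the slower rate; that conform assumption is mine):

```python
# Conversion factors implied by the paragraph above. The straight-conform
# assumption (reinterpreting 30fps frames at 23.976fps) is mine.
from fractions import Fraction

upscale = Fraction(1920, 1280)                      # MacBook 720p feed -> 1080p
conform = Fraction(30, 1) / Fraction(24000, 1001)   # 30fps -> 23.976fps

print(f"720p -> 1080p scale: {float(upscale):.2f}x")                # 1.50x
print(f"Straight-conform duration factor: {float(conform):.4f}x")   # ~1.2513x
print(f"A 60s take would run {60 * float(conform):.1f}s conformed") # ~75.1s
```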

DP David Harp

Each take was slated just like a normal shoot. From his LA home, director Robin could see everyone in the scene on the main stage and decide how to arrange them in the grid, set their eyelines and even pop into the grid during rehearsal and between takes to give notes.

Staging the scene, you would think that the actor should look straight at the camera so you could see their eyes. However, they noticed that there was “less of a connection when looking at the lens,” says Harp. “When they’re looking around the screen, you can feel a connection because they’re looking at each other.”

In addition to the virtual first unit footage, Harp shot eight days of second unit footage of Los Angeles streets during COVID. With four suction cups, he attached his Sony A7 to the roof of his son’s car and drove around for four or five hours a day shooting essentially a stock library of Los Angeles during a pandemic.

Post Production
Alvarez used the remote post infrastructure he set up for Episodes 19 and 20 for the new show. All of the editors, assistant editors, visual effects artists and audio team were working from home on their own systems or ones provided by Warner Bros. Since there was no Avid Unity shared storage, they did old-school shuttling of drives from location to location.

“We had three teams tackling this thing because our schedule was ridiculously short,” says Alvarez. “Every single day, feeding everybody material, we were able to get everyone cutting. We’d send live feeds or links to producers to get their eyes on editorial approvals on scenes in real time. We just moved.”

MTI Film EP Barbara Marshall reports that all the footage was ingested into the post house’s Signiant server system. From those masters, they made DNxHD 36 dailies using the MTI Cortex v5 software and sent them to the editors and assistant editors.

The edit team included Craig Bench, Leah Breuer and Chetin Chabuk, who worked with three assistants: Bradford Obie, Diana Santana and Douglas Staffield. They edited from home on six Avid Media Composers. They worked 13-hour days for 14 days in a row, says Bench.

Everyone on the editorial team got the same pool of dailies and started editing Saturday morning, April 18. Once they reviewed the footage, the team decided to rebuild the split-screen grids from scratch to get the pace of the show right. They wanted to retain, as much as possible, both the cadence of the dialog and the syncopated cutting style that Spottiswood and Bench had set in the pilot.

Rebuilding the grids, explains Bench, “gave us the freedom to treat everyone’s coverage separately. Even though the grid appears to be one take, it’s really not. We were creating our own world.” Rough cuts were sent every night to Robin.

During the first couple of production days, all three teams would jump on cutting the dailies as well as working through the previous day’s notes. As the show came together, Bench worked on the teaser and Act 1, Chabuk did Acts 2 and 3, and Breuer did Act 4 and the party scene at the end.

“There was a lot of experimenting,” explains Bench. “In the grid, should the actors be side by side or one on top of the other? There was also a lot of back and forth about grid background colors and textures.”

The assistants had their bins full setting up grid templates. This would allow them to drop an iso shot on a track so it would go to that spot on the grid and keep it consistent. They also built all the sound effects of the frames animating on and off.

Editorial gave MTI online editor Andrew Miller a “soft lock” of the episode early on April 30. Miller got the Avid project file that was “a big stack of split screens” and a reference video from Bench.

MTI colorist Greg Strait

Miller worked over the weekend with post supervisor Cat Crimins putting the episode together remotely. They replaced all the proxies with the high-res masters in the timeline and made necessary last-minute adjustments.

MTI colorist Greg Strait got a baked, uncompressed 10-bit MXF mixdown of the Avid timeline from Miller. Strait, who graded virtually the entire season of All Rise in Digital Vision’s Nucoda, had a good idea where the look was going. “I tried to keep it as familiar as possible to the other 20 episodes,” he says. “Sharpening some things, adding contrast and putting a lot of power windows around things had the best result.”

After laying in the audio stems, post was wrapped Sunday night at 11pm. Alvarez did a quality-control review of the episode. On Monday, May 4, they output XDCAM as the network deliverable.

Despite the tight time crunch, things went pretty smoothly, which MTI Film’s Marshall attributes to the trust and longtime relationship MTI has with Robin and the show. “That’s the cool thing about Mike. He definitely likes to push the envelope,” she says.

All Rise has been renewed for Season 2, and the team promises the innovations will continue.


Collaborating on color for HBO’s I Know This Much Is True

In HBO’s emotionally devastating limited series I Know This Much Is True, Mark Ruffalo portrays identical twin brothers Dominick and Thomas Birdsey — a pair of men who might look alike and share the same difficult childhood experiences but who are actually quite different. Thomas has schizophrenia that causes him to frequently act out in frightening and erratic ways, usually leaving Dominick to pick up the pieces.

Sam Daley

The series, directed by Derek Cianfrance (Blue Valentine, The Place Beyond the Pines) and shot by cinematographer Jody Lee Lipes (A Beautiful Day in the Neighborhood, Manchester by the Sea), presents this fraught and touching relationship so seamlessly that it’s very easy to forget both characters are the result of a single actor’s performances, knitted together by VFX supervisor Eric Pascarelli and his team of compositing and effects experts.

Colorist Sam Daley of Company 3, who has worked with Lipes regularly for over a decade, has the utmost respect for Lipes’ somewhat old-school approach. “I came up through telecine and film laboratories,” he says, “and I strive to always respect what’s on the ‘neg’ — digital neg or celluloid. I’m not interested in ‘breaking apart’ an image to make it something it isn’t. My job is to help the director and cinematographer tell the story. I want everything I do to be in harmony with that.”

Lipes and Cianfrance definitely wanted to shoot the period piece (portions take place in various eras from 1913 to 1992) on film, both for the actual texture and feel it can bring to imagery and for its characteristic look that evokes the past. Lipes shot tests in several formats, taking them through to a final grade, and the filmmakers decided that 2-perf 35mm (with a 2:1 extraction from the full 2.66:1 image area) presented the perfect compromise between the too-clean look of 4-perf 35mm and the rougher feel of 16mm. Shooting 2-perf also let them roll takes lasting 22.5 minutes without cutting (the director’s preferred way of building performance).
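The long-take arithmetic is easy to verify. A quick sketch, assuming a standard 1,000 ft magazine (the magazine size is my assumption; the article gives only the 22.5-minute figure):

```python
# Why 2-perf buys such long takes: 35mm film has 64 perforations per foot,
# so halving the perfs per frame doubles the frames per foot.
PERFS_PER_FOOT = 64
FPS = 24
MAG_FEET = 1000   # assumed standard 1,000 ft magazine

for perfs_per_frame, label in ((4, "4-perf"), (2, "2-perf")):
    frames = MAG_FEET * PERFS_PER_FOOT / perfs_per_frame
    print(f"{label}: {frames / FPS / 60:.1f} minutes per load")
# 4-perf: ~11.1 min; 2-perf: ~22.2 min -- in line with the cited 22.5 minutes.
```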

Lipes shot with ARRICAM LT cameras, Optimo 24-290 zooms, Cooke S4s and Canon K35s. He rated Eastman Kodak’s fastest stock, 5219 (500T), “under-exposed” by one stop, increasing the apparent grain, letting shadow information fall to almost nothing and letting some highlights blow out — all in the service of enhancing some of the attributes that make film feel like film.

Daley, who graded in Blackmagic DaVinci Resolve 16, leaned into these same qualities. Lipes, he reports, “likes what we call a ‘low-contrast’ or ‘faded film’ look even when he shoots digitally. We did something similar for A Beautiful Day in the Neighborhood. When he shoots for this kind of look, he exposes in a way that compresses the values in shadows, whether shooting on film or a digital sensor, so that when we lift everything in post to brighten it, the process results in something of a photographic print quality, especially in the blacks.”

Daley also created a film print emulation LUT for Lipes, which the two tweaked before principal photography commenced so that Lipes and the director were in sync on the intended look from the start.

Jody Lee Lipes

“I always want Sam to be there for the whole process,” Lipes notes. “If you call a colorist up the day before you’re supposed to start grading and say, ‘Here’s the footage. Take a look,’ you’re not going to get the same quality. The whole time I’m working, I count on Sam’s feedback. During testing and production, watching dailies, the whole time — and his feedback affects how we shoot.

“Sam has a holistic approach to storytelling and that goes beyond the technical job. It’s about storytelling, and he’s always a creative force in the process. He also catches the tiniest details. He once was a QC guy, and he’ll catch the smallest little thing that I might not see until the 50th time watching it.”

The VFX shots that include both Dominick and Thomas were shot with Ruffalo as Dominick first; then, after all of that character’s scenes were complete, the actor returned about 30 pounds heavier to shoot Thomas’ scenes. Lipes shot these on the same 5219 in 3-perf format, exposing the same way and using the exact same negative area as he did for the 2-perf portions. The picture information captured outside that area of the negative was there only as a safety measure in case Pascarelli’s team needed a bit more image beyond the frame to create a seamless composite.

“They removed the grain before they did the work and then put it back on top of the finished shots,” Lipes explains. “They did an excellent job of making sure the effects shots matched everything else, and then Sam was very meticulous about making sure that anything that didn’t sit exactly right in the scene was melded in with his color grading tools.”

While colorists sometimes are expected to make images that pop and look snappy, Lipes says of Daley, “He’s the one who sometimes scales me back. I might propose something extreme and he’ll say, ‘Yes, we could do that but…’ and then show me something that doesn’t visually distract from the story, which is ultimately what we both care most about. Sam is subtle and quiet in his work, and I like that. Look at what he’s done with Christian Springer or Andrij Parekh or Ed Lachman. The work is understated but it’s very good and highly respected, and that’s exactly where I want my work to be.”

While I Know This Much Is True has concluded its first run on HBO, episodes will continue to be available to stream on HBO Go and HBO Max.


Pixelworks hires post vets to lead TrueCut motion grading initiatives

Pixelworks, which provides video and display processing solutions, has beefed up its team with the hire of industry veterans Sarah Priestnall and Bruce Harris. The duo will lead TrueCut motion grading technology initiatives in North America, specifically targeting Hollywood.

Introduced in 2019, Pixelworks TrueCut motion grading provides filmmakers with the ability to deliver at a cinematically tuned high frame rate while filming at any frame rate. This allows for a broader set of motion appearances. The platform takes advantage of current cinema and home-entertainment displays while also ensuring a consistent motion appearance across different devices and screens that is faithful to the original artistic intent.

Priestnall joins as the director of product marketing, heading up the Burbank office and advancing the TrueCut commercial rollout. Harris joins as creative engineer and artist, responsible for usability design, customer training and both technical and artistic support. Priestnall and Harris both report directly to executive vice president of technology Richard Miller, who will continue to lead strategic development of the TrueCut platform.

“We’re excited to bring Sarah and Bruce on board as we ramp up our efforts in Hollywood and continue the success of our TrueCut technology around the globe,” Miller says. “We are working with studios, streaming services, post production facilities and creatives to ensure that the original intent of artists and filmmakers is preserved on screens of all sizes, from the cinema to the home and beyond.”

Priestnall has been evangelizing new technologies for movie and television production and post throughout her career. She was deeply involved in the development of the digital intermediate process at Cinesite, working closely with Roger Deakins, ASC, BSC, and the Coen Brothers on O Brother, Where Art Thou? She also led Codex’s worldwide marketing efforts. Priestnall is a board member of the Colorist Society International and an associate member of the American Society of Cinematographers. She wrote the chapter on digital post production in the latest ASC manual.

Harris began his career working on movie sets as a propmaker, including on major motion pictures such as Pulp Fiction and A River Runs Through It. He then transitioned into visual effects, becoming a longstanding member of the Visual Effects Society. Working all over the world, he has used his artistic talents as a compositor on productions such as The Aviator and Guardians of the Galaxy.


Tales From the Loop DP talks large-format and natural light

By Adrian Pennington

“Not everything in life makes sense,” a woman tells a little girl in the first episode of Amazon’s series Tales From the Loop. Sage advice from any adult to a child, but in this case the pair are both versions of the same character caught in a time-traveling paradox.

Jeff Cronenweth

“This is an adventure with a lot of sci-fi nuances, but the story itself is about humanity,” says Jeff Cronenweth, ASC, who shot the pilot episode for director/producer Mark Romanek. “We are representing the idea that life is little different from the norm. There are time changes that our characters are unaware of, and we wanted the audience’s attention to detail. We didn’t want the visuals to be a distraction.”

Inspired by the retro-futurist paintings of Swedish artist Simon Stålenhag, Tales from the Loop gravitates around characters in a rural North American community and the emotional connection some of them feel toward artifacts from a clandestine government facility that litter the landscape.

Rather than going full Stranger Things and having a narrative that inexorably unlocks the dark mysteries of the experimental lab, writer Nathaniel Halpern (Legion) and producer Matt Reeves (director of Dawn of the Planet of the Apes and The Batman) construct Tales From the Loop as a series of individual, loosely connected short stories.

The tone and pace are different too, as Cronenweth explains. “Simon’s artwork is the foundation for the story, and it elicits a certain emotion, but some of his pieces we felt were overly strong in color or saturated in a way that would overwhelm a live-action piece. Our jumping-off points were his use of light and staging of action, which often depicts rusting, broken-down bipedal robots or buildings located in the background. What is striking is that the people in the paintings — and the characters in our show — treat these objects as a matter of fact of daily life.”

Near the beginning of Episode 1, a young girl runs through woods across snowy ground — filmed as a continuous shot and edited into two separate shots in the final piece. The child has lost her mother and spends the rest of the story trying to find her. “We can all relate to being 9 years old and finding yourself alone,” Cronenweth explains. “We begin by establishing the scale of the environment. This is flat rural Ohio in the middle of winter.”

Photography took place during early 2019 in southwest Winnipeg in Canada (standing in for Ohio) and in sub-zero temperatures. “Our dilemma was shooting in winter with short daylight hours and at night where it reaches minus 32. Child actors are in 80 percent of scenes and the time you can legally shoot with them is limited to eight hours per day, plus you need weather breaks, or your fingers will break off. The idea of shooting over 10 consecutive nights became problematic. During location scouting, I noticed that the twilight seemed longer than normal and was really very beautiful, so we made the decision to switch our night scenes to magic hour to prolong our shoot time and take advantage of this light.”

He continues, “We had a condor [cherry picker] and lights on standby in case we couldn’t make it. We rehearsed two-camera setups, and once the light was perfect, we shot. It surprised everybody how much we could accomplish in that amount of time.”

Working in low, natural light, maximizing time with child actors and establishing figures isolated in a landscape were among the factors that led to the decision to shoot large-format digital.

Cronenweth drew on his vast experience shooting Red cameras on films for David Fincher, including Gone Girl, The Social Network and The Girl With the Dragon Tattoo; he was Oscar-nominated for the latter two. His experience with Red and his preference for lenses led him to Panavision’s Millennium DXL2 with the Red Monstro 8K VV full-frame sensor, which offers a 46.31mm (diagonal) canvas and 16 bits of color.

“It was important for us to use a format with 70mm glass and a large-format camera to give scale to the drama on the small screen,” he says.

Another vital consideration was having great control over depth of field. A set of Primo 70s was used mainly for second-unit and plate work, while Panaspeeds (typically 65mm, 125mm and 200mm) allowed him to shoot at T1.4 (aided by first AC Jeff Hammerback).

“The Monstro sensor combined with shooting wide open made depth very shallow in order to make our character more isolated as she tries to find what was taken away from her,” explains Cronenweth. “We also want to be with the characters all the time, so the camera movement is considerable. In telling this story, the camera is fluid, allowing viewers to be more present with the character.”

There is very little Steadicam, but he deployed a variety of technocranes, tracks and vehicles to keep the camera moving. “The camera movement is always very deliberate and tied to the actor.”

Shooting against blinding white snow might have been an issue for older generations of digital sensors, but the Monstro “has so much latitude it can handle high-contrast situations,” says Cronenweth. “We’d shoot exteriors at the beginning or end of the day to mitigate extreme daylight brightness. The quality of light we captured at those times was soft and diffused. That, plus a combination of lens choice, filtration and some manipulation in the DI process, gave us our look.”

Cronenweth was able to draw on his experience working as a camera operator on eight pictures for fabled Swedish cinematographer Sven Nykvist, ASC, FSF (Sleepless in Seattle, What’s Eating Gilbert Grape). Other tonal references were the films of Russian filmmaker Andrei Tarkovsky (Stalker) and Polish genius Krzysztof Kieslowski (notably his 10-hour TV series Dekalog).

“I was motivated by Sven’s style of lighting on this,” he says. “We were trying to get the long shadows, to create drama photographically as much as we could to add weight to the story.”

Cronenweth’s year spent shooting Dragon Tattoo in Sweden also came into play. “The way exteriors should look and how to embrace the natural soft light all came flooding back. From Bergman, Tarkovsky and Kieslowski, we leaned into the ‘Scandinavian’ approach of tempered and methodological filmmaking.”

The color palette is suitably muted: cold blues and grays melding with warm yellows and browns. Cronenweth shaped the footage using the DXL2’s built-in color film LUT, which is tuned to the latest Red IPP2 color processing incorporated in the Monstro sensor.

Cronenweth recalls, “In talking with [Light Iron supervising colorist] Ian Vertovec about the DI for Tales From the Loop, he explained that Light Iron had manufactured that LUT from a combination of work we’d done together on The Social Network and Dragon Tattoo. That was why this particular LUT was so appealing to me in tonality and color for this show — I was already familiar with it!”

“I’ve had the good fortune of working with Jeff Cronenweth on several feature films. This would be the first project we’ve done together delivering for HDR,” reports Vertovec. “I started building the show LUT using the camera LUT for the DXL2 that I made, but I needed to rebuild it for HDR. I knew we would want to keep skin tones from going too ruddy and also keep the green grass from getting too bright and electric. When Jeff came in to grade, he asked to increase the contrast a bit and keep the blacks nice and rich.”

The pilot of Tales From the Loop is helmed by Romanek, for whom Cronenweth has worked for over two decades on music videos as well as Romanek’s first feature, One Hour Photo. The remaining episodes of Tales From the Loop were shot by Ole Bratt Birkeland; Luc Montpellier, CSC; and Craig Wrobleski, CSC, for directors So Yong Kim, Andrew Stanton and Jodie Foster, among others.

Tales From the Loop is streaming now on Amazon Prime.


Adrian Pennington is a UK-based journalist, editor and commentator in the film and TV production space. He has co-written a book on stereoscopic 3D and edited several publications.


Senior colorist Paul Ensby rejoins Technicolor London

Paul Ensby, a third-generation Technicolor employee, will be returning to the studio’s London facility this month. Following in his grandfather and father’s footsteps, he worked in Technicolor’s London film laboratory in the early ‘90s. Ensby then landed a role as a 35mm trainee feature grader before becoming a full-fledged color grader, working with directors such as Ridley Scott, Jane Campion, Ken Russell, Neil Jordan, Bill Forsyth and Stephen Frears. He comes to Technicolor from Company 3.

Recently Ensby reunited with his longtime friend director Asif Kapadia on the HBO documentary Diego Maradona. He had previously worked on Asif’s Oscar-winning documentary Amy and the BAFTA-winning documentary Senna. Ensby also recently finished grading miniseries Quiz for ITV and Left Bank Pictures, directed by Stephen Frears.

Ensby’s previous credits include The Lady in the Van with cinematographer Andrew Dunn, BSC; Allegiant with cinematographer Florian Ballhaus, ASC; Johnny English Strikes Again with cinematographer Florian Hoffmeister, BSC; and Mary Queen of Scots with cinematographer John Mathieson, BSC.

“Technicolor has always been home to me,” says Ensby. “They have been and will continue to be a standard of excellence, and I’m happy to be starting the next chapter in my career with Sherri [Potter, worldwide president of Technicolor Post Production].”

Colorist Chat: Light Iron’s Nick Hasson

Photography plays a big role in this colorist’s world; he often finds inspiration through others’ images.

Name: Nick Hasson

Company: Light Iron

Can you describe what Light Iron does?
Light Iron is a full-service post company providing end-to-end solutions — including dailies and finishing in both HDR and SDR.

The L Word

As a colorist, what would surprise people the most about what falls under that title?
Colorists are one of the select few creatives that touch every frame of a project. Working with the cinematographer and director, we help shape the tone of a project. It’s very collaborative.

Are you often asked to do more than just color on projects?
Almost every project I do has a visual effects component to it. I have a background in visual effects and online editing, so I am comfortable in those disciplines. I also tend to do a lot of sky replacements, beauty and cleanup work.

What’s your favorite part of the job?
Being creative on a daily basis. Problem solving is another fun aspect of the job. I love finding solutions and making the client smile.

What’s your least favorite?
I like being outside. The long days in a dark room can be a challenge.

Queen of the South

If you weren’t a colorist, what would you be doing instead?
Electrical engineering or network infrastructure. I’m a big geek and love to build computers and push technology.

How did you choose color grading as a profession?
I was originally heading to a career in music. After a year of touring, I decided it was not for me and got a job at a post house. I was lucky enough to work in both VFX and telecine at the time. Photography was always my first love, so color grading just felt right and fell into place for me.

What are some recent projects you’ve worked on?
I’m lucky to work in both episodic and feature film. Recent movies include Corporate Animals, Sweetheart, Boss Level, Wander Darkly and Like a Boss. On the episodic side, I have been working on The L Word, Room 104, Queen of the South, Greenleaf, Exhibit A and The Confession Tapes.

Room 104

What is the project you are most proud of?
Room 104 is a big challenge. Not many projects get to Season 4. Coming up with looks that aid the storytelling and are different every episode has been exciting and creatively challenging. We do a lot of the look design in pre-production, and I love seeing what the cinematographers come back with.

Where do you find inspiration?
I love photography! I like to seek out interesting photographers and see how they are pushing the limits of what can be done digitally. I shoot black-and-white film every week. It is a great way to study composition and lighting.

Name three pieces of technology you can’t live without.
My phone, air-conditioned car seats and Amazon.

What social media channels do you follow?
I only use Instagram, and I tend to follow hashtags rather than specific outlets. It gives my feed a broader reach and keeps things fresh.

How do you de-stress from it all?
Spending time with my family. Working on my old cars and playing guitar. I also ride mountain bikes and love to cook in a wood-fired oven.

Posting Michael Jordan’s The Last Dance — before and during lockdown

By Craig Ellenport

One thing viewers learned from watching The Last Dance — ESPN’s 10-part documentary series about Michael Jordan and the Chicago Bulls — is that Jordan might be the most competitive person on the planet. Even the slightest challenge led him to raise his game to new heights.

Photo by Andrew D. Bernstein/NBAE via Getty Images

Jordan’s competitive nature may have rubbed off on Sim NY, the post facility that worked on the docuseries. Since they were only able to post the first three of the 10 episodes at Sim before the COVID-19 shutdown, the post house had to manage a work-from-home plan in addition to dealing with an accelerated timeline that pushed up the deadline a full two months.

The Last Dance, which chronicles Jordan’s rise to superstardom and the Bulls’ six NBA title runs in the 1990s, was originally set to air on ESPN after this year’s NBA Finals ended in June. With the sports world starved for content during the pandemic, ESPN made the decision to begin the show on April 19 — airing two episodes a night on five consecutive Sunday nights.

Sim’s New York facility offers edit rooms, edit systems and finishing services. Projects that rent these rooms will then rely on Sim’s artists for color correction and sound editing, ADR and mixing. Sim was involved with The Last Dance for two years, with ESPN’s editors working on Avid Media Composer systems at Sim.

When it became known that the 1997-98 season was going to be Jordan’s last, the NBA gave a film crew unprecedented access to the team. They compiled 500 hours of 16mm film from the ‘97-’98 season, which was scanned at 2K for mastering. The Last Dance used a combination of the rescanned 16mm footage, other archival footage and interviews shot with Red and Sony cameras.

Photo by Andrew D. Bernstein/NBAE via Getty Images

“The primary challenge posed in working with different video formats is conforming the older standard definition picture to the high definition 16:9 frame,” says editor Chad Beck. “The mixing of formats required us to resize and reposition the older footage so that it fit the frame in the ideal composition.”

One of the issues with positioning the archival game footage was making sure that viewers could focus when shifting their attention between the ball and the score graphics.

“While cutting the scenes, we would carefully play through each piece of standard definition game action to find the ideal frame composition. We would find the best position to crop broadcast game graphics, recreate our own game graphics in creative ways, and occasionally create motion effects within the frame to make sure the audience was catching all the details and flow of the play,” says Beck. “We discovered that tracking the position of the backboard and keeping it as consistent as possible became important to ensuring the audience was able to quickly orient themselves with all the fast-moving game footage.”
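The geometry behind those reframing decisions is straightforward, even if the shot-by-shot choices weren’t. A sketch of the basic 4:3-into-16:9 math (illustrative numbers, not the show’s actual conform settings):

```python
# Basic 4:3-into-16:9 conform math. Illustrative only -- the show's
# reframing was done shot by shot, as described above.
HD_W, HD_H = 1920, 1080

# Option 1: scale 4:3 material to fill the full 1920 width.
scaled_h = HD_W * 3 // 4                  # 1440 lines tall after scaling
crop = scaled_h - HD_H                    # 360 lines must be cropped away
print(f"Fill width: crop {crop}/{scaled_h} lines ({crop / scaled_h:.0%})")

# Option 2: scale to fit the 1080 height, leaving pillarbox bars.
scaled_w = HD_H * 4 // 3                  # 1440 wide
print(f"Fit height: {(HD_W - scaled_w) // 2}px pillarbox per side")
```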

From a color standpoint, the trick was taking all that footage, which was shot over a span of decades, and creating a cohesive look.

Rob Sciarratta

“One of the main goals was to create a filmic, dramatic, natural look that would blend well with all the various sources,” says Sim colorist Rob Sciarratta, who worked with Blackmagic DaVinci Resolve 15. “We went with a rich, slightly warm feeling. One of the more challenging parts of the color correction was blending the archival work into the interviews and film scans. The older video footage varied in quality and resolution and often had very little black detail left after all the transcoding over the years. We would add a filmic texture and soften the blacks so it would blend into the 16mm film scans and interviews seamlessly. … We wanted everything to feel cohesive and flow so the viewer could immerse themselves in the story and characters.”
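As a rough illustration of the two moves Sciarratta describes — lifting the black floor and adding a film-like texture — here is a minimal NumPy sketch. The parameter values are hypothetical, and the real work was of course done in Resolve, not code:

```python
# Minimal sketch of "soften the blacks" plus synthetic grain. Parameters
# are hypothetical; the real grade was done in DaVinci Resolve.
import numpy as np

def soften_and_texture(img, lift=0.03, grain=0.015, seed=0):
    """img: float RGB array in 0-1; returns the treated image."""
    rng = np.random.default_rng(seed)
    img = np.asarray(img, dtype=np.float64)
    out = lift + (1.0 - lift) * img                # raise the black floor
    noise = rng.normal(0.0, grain, img.shape[:2])  # monochrome, film-like grain
    return np.clip(out + noise[..., None], 0.0, 1.0)
```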

On the sound side, senior re-recording mixer/supervising sound editor Keith Hodne used Avid Pro Tools. “The challenge was to create a seamless woven sonic landscape from 100-plus interviews and locations, 500 hours of unseen raw behind-the-scenes footage, classic hip-hop tracks, beautifully scored instrumentation and crowd effects, along with the prerecorded live broadcasts,” he says. “Director Jason Hehir and I wanted to create a cinematic blanket of a basketball game wrapped around those broadcasts. What it sounds like to be at the basketball game, feel the game, feel the crowd — the suspense. To feel the weight of the action — not just what it sounds like to watch the game on TV. We tried to capture nostalgia.”

When ESPN made the call to air the first two episodes on April 19, Sim’s crew still had the final seven episodes to finish while dealing with a work-from-home environment. Expectations were only heightened after the first two episodes of The Last Dance averaged more than 6 million viewers. Sim was now charged with finishing what would become the most watched sports documentary in ESPN’s history — and they had to do this during a pandemic.

Stacy Chaet

When the shutdown began in mid-March, Sim’s staff needed to figure out the best way to finish the project remotely.

“I feel like we started the discussions of possible work from home before we knew it was pushed up,” says Stacy Chaet, Sim’s supervising workflow producer. “That’s when our engineering team and I started testing different hardware and software and figuring out what we thought would be the best for the colorist, what’s the best for the online team, what’s the best for the audio team.”

Sim ended up using Teradici to get Sciarratta connected to a machine at the facility. “Teradici has become a widely used solution for remote, at-home work,” says Chaet. “We were easily able to acquire and install it.”

A Sony X300 monitor was hand-delivered to Sciarratta’s apartment in lower Manhattan and connected to his machine at Sim through an Evercast stream. Sim also shipped him computer monitors, a Mac mini and Resolve panels. Sciarratta’s living room became a makeshift color bay.

“It was during work on the promos that Jason and Rob started working together, and they locked in pretty quickly,” says David Feldman, Sim’s senior VP, film and television, East Coast. “Jason knows what he wants, and Rob was able to quickly show him a few color looks to give him options.

David Feldman

“So when Sim transitioned to a remote workflow, Sciarratta was already in sync with what the director, Jason Hehir, was looking for. Rob graded each of the remaining seven episodes from his apartment on his X300 unsupervised. Sim then created watermarked QTs with final color and audio. Rob reviewed each QT to make sure his grade translated perfectly when reviewed on Jason’s retina display MacBook. At that point, Sim provided the director and editorial team access for final review.”

The biggest remote challenge, according to producer Matt Maxson, was that the rest of the team couldn’t see Sciarratta’s work on the X300 monitor.

“You moved from a facility with incredible 4K grading monitors and scopes to the more casual consumer-style monitors we all worked with at home,” says Maxson. “In a way, it provided a benefit because you were watching it the way millions of people were going to experience it. The challenge was matching everyone’s experience — Jason’s, Rob’s and our editors’ — to make sure they were all seeing the same thing.”

Keith Hodne

For his part, Hodne had enough gear in his house in Bay Ridge, Brooklyn. At Sim he mixes in Pro Tools on Mac Pro computers; at home he worked with a pared-down version of that rig. It was a challenge, but he got the job done.

Hodne says he actually had more back-and-forth with Hehir on the final episode than any of the previous nine. They wanted to capture Jordan’s moments of reflection.

“This episode contains wildly loud, intense crowd and music moments, but we counterbalance those with haunting quiet,” says Hodne. “We were trying to achieve what it feels like to be a global superstar with all eyes on Jordan, all expectations on Jordan. Just moments on the clock to write history. The buildup of that final play. What does that feel and sound like? Throughout the episode, we stress that one of his main strengths is the ability to be present. Jason and I made a conscious decision to strip all sound out to create the feeling of being present and in the moment. As someone whose main job it is to add sound, sometimes there is more power in having the restraint to pull back on sound.”

ESPN Films/Netflix/Mandalay Sports Media/NBA Entertainment

Even when they were working remotely, the creatives were able to communicate in real time via phone, text or Zoom sessions. Still, as Chaet points out, “you’re not getting the body language from that newly official feedback.”

From a remote post production technology standpoint, Chaet and Feldman both say one of the biggest challenges the industry faces is sufficient and consistent Internet bandwidth. Residential ISPs often do not guarantee speeds needed for flawless functionality. “We were able to get ahead of the situation and put systems in place that made things just as smooth as they could be,” says Chaet. “Some things may have taken a bit longer due to the remote situation, but it all got done.”

One thing they didn’t have to worry about was their team’s dedication to the project.

“Whatever challenges we faced after the shutdown, we benefitted from having lived together at the facility for so long,” says Feldman. “There was this trust that, somehow, we were going to figure out a way to get it done.”


Craig Ellenport is a veteran sports writer who also covers the world of post production. 

Color grading Togo with an Autochrome-type look

Before principal photography began on the Disney+ period drama Togo, the film’s director and cinematographer, Ericson Core, asked Company 3 senior colorist Siggy Ferstl to help design a visual approach for the color grade that would give the 1920s-era drama a unique look. Based on a true story, Togo is named for the lead sled dog on Leonhard Seppala’s (Willem Dafoe) team and tells the story of their life-and-death relay through Alaska’s tundra to deliver diphtheria antitoxin to the desperate citizens of Nome.

Siggy Ferstl

Core wanted a look that was reminiscent of the early color photography process called Autochrome, as well as an approach that evoked an aged, distressed feel. Ferstl, who recently colored Lost in Space (Netflix) and The Boys (Amazon), spent months — while not working on other projects — developing new ways of building this look using Blackmagic’s Resolve 16.

Many of Ferstl’s ideas were realized using the new Fusion VFX tab in Resolve 16. It allowed him to manipulate images in ways that took his work beyond the normal realm of color grading and into the arena of visual effects.

By the time he got to work grading Togo, Ferstl had already created looks that had some of the visual qualities of Autochrome melded with a sense of age, almost as if the images were shot in that antiquated format. Togo “reflects the kind of style that I like,” explains Ferstl. “Ericson, as both director and cinematographer, was able to provide very clear input about what he wanted the movie to look like.”

In order for this process to succeed, it needed to go beyond the appearance of a color effect seemingly just placed “on top” of the images. It had to feel organic and interact with the photography, to seem embedded in the picture.

A Layered Approach
Ferstl started this large task by dividing the process into a series of layers that would work together to affect the color, of course, but also to create lens distortion, aging artifacts and all the other effects. A number of these operations would traditionally be sent to Company 3’s VFX department or to an outside vendor to be created by their artists and returned as finished elements. But that kind of workflow would have added an enormous amount of time to the post process. And, just as importantly, all these effects and color corrections needed to work interactively during grading sessions at Company 3 so Ferstl and Core could continuously see and refine the overall look. Even a slight tweak to a single layer could affect how other layers performed, so Ferstl needed complete, realtime control of every layer for every fine adjustment.

Likewise, the work of Company 3 conform artist Paul Carlin could not be done in the way conform has typically been done. It couldn’t be sent out of Resolve and into a different conform/compositing tool, republished to the company network and then returned to Ferstl’s Resolve timeline. That would have taken too long and wouldn’t have allowed for the interactivity required in grading sessions.

Carlin needed to be able to handle the small effects that are part of the conform process — split screens, wire removals, etc. — quickly, and that meant working from the same media Ferstl was accessing. Carlin worked entirely in Resolve using Fusion for any cleanup and compositing effects — a practice becoming more and more common among conform artists at Company 3. “He could do his work and return it to our shared timeline,” Ferstl says. “We both had access to all the original material.”


Most of the layers actually consisted of multiple sublayers. Here is some detail:
Texture: This group of sublayers was based on overlaid textures that Ferstl created to have a kind of “paper” feel to the images. There were sublayers based on photographs of fabrics and surfaces that all play together to form a texture over the imagery.
Border: This was an additional texture that darkened portions of the edges of the frame. It inserted a subtle vignette or age artifact that framed the image, and it isn’t consistent throughout; it continually changes. Sublayers bring to the images a bit of edge distortion resembling the diffraction that can happen with lenses, particularly lenses from the early 20th century, under various circumstances.
Lens effects: DP Core shot with modern lenses built with highly evolved coatings, but Ferstl was interested in achieving the look of the uncoated, less-refined optics of the day. This involved creating sublayers of subtle distortion and defocus effects.
Stain: Ferstl applied a somewhat sepia-colored stain to parts of the image to help with the aging effect. He added a hint of additional texture and brought some sepia to some of the very bluish exterior shots, introducing hints of warmth into the images.
Grain-like effect: “We didn’t go for something that exactly mimicked the effect of film grain,” Ferstl notes. “That just didn’t suit this film. But we wanted something that has that feel, so using Resolve’s Grain OFX, I generated a grain pattern, rendered it out and then brought it back into Resolve and experimented with running the pattern at various speeds. We decided it looked best slowed to 6fps, but then it had a steppiness to it that we didn’t like. So I went back and used the tool’s Optical Flow in the process of slowing it down. That blends the frames together, and the result provided just a hint of old-world filmmaking. It’s very subtle and more part of the overall texture.”
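For anyone curious, here is a minimal sketch of the retiming idea in that last item: hold a grain plate at 6fps inside a 24fps program and blend between neighboring grain frames so the hold doesn't step. A simple cross-blend stands in for Resolve's Optical Flow, which interpolates motion rather than dissolving.

```python
import numpy as np

def grain_at(plate: np.ndarray, frame: int,
             fps_out: int = 24, fps_grain: int = 6) -> np.ndarray:
    """plate: (N, H, W) stack of grain frames; frame: output frame index."""
    pos = frame * fps_grain / fps_out       # position in grain time
    i0 = int(pos) % len(plate)
    i1 = (i0 + 1) % len(plate)
    frac = pos % 1.0
    # Cross-blend neighboring grain frames instead of hard-holding one.
    return (1.0 - frac) * plate[i0] + frac * plate[i1]
```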

Combining Elements
“It wasn’t just a matter of stacking one layer on top of the other and applying a regular blend. I felt it needed to be more integrated and react subtly with the footage in an organic-looking way,” Ferstl recalls.

One toolset he used for this was a series of customized lens flares built with Resolve’s OFX plugins, used not for their intended purpose but as the basis of a matte. “The effect is generated based on highlight detail in the shot,” explains Ferstl. “So I created a matte shape from the lens flare effect and used that shape as the basis to integrate some of the texture layers into the shots. It’s the textures that become more or less pronounced based on the highlight details in the photography, and that lets the textures breathe more.”
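As a rough illustration of that approach, here is a sketch with a plain luma key standing in for the flare OFX Ferstl actually used; the threshold, softness and strength values are guesses.

```python
import numpy as np

def highlight_matte(img: np.ndarray, threshold: float = 0.7,
                    softness: float = 0.2) -> np.ndarray:
    """Derive a 0..1 matte from the shot's highlight detail."""
    luma = 0.2126 * img[..., 0] + 0.7152 * img[..., 1] + 0.0722 * img[..., 2]
    return np.clip((luma - threshold) / softness, 0.0, 1.0)

def breathe_texture(img: np.ndarray, texture: np.ndarray,
                    matte: np.ndarray, strength: float = 0.4) -> np.ndarray:
    # The texture shows through more where the highlights are stronger.
    w = (strength * matte)[..., None]
    return img * (1.0 - w) + texture * w
```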

Ferstl also made use of the Tilt-Shift effect in Fusion that alters the image in the way movements within a tilt/shift lens would. He could have used a standard Power Window to qualify the portion of the image to apply blur to, but that method applied the effect more evenly and gave a diffused look, which Ferstl felt wasn’t like a natural lens effect. Again, the idea was to avoid having any of these effects look like some blanket change merely sitting on top of the image.

“You can adjust a window’s softness,” he notes, “but it just didn’t look like something that was optical… it looked too digital. I was desperate to have a more optical feel, so I started playing around with the Tilt-Shift OFX and applying that just to the defocus effect.

“But that only affected the top and bottom of the frame, and I wanted more control than that,” he continues. “I wanted to draw shapes to determine where and how much the tilt/shift effect would be applied. So I added the Tilt-Shift in Fusion and fed a poly mask into it as an external matte. I had the ability to use the mask like a depth map to add dimensionality to the effect.”
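Here is a minimal sketch of that mask-as-depth-map idea: a handful of pre-blurred levels blended by the mask stands in for a real lens model (and for Fusion's Tilt-Shift node). The max_sigma and level count are arbitrary.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def masked_defocus(img: np.ndarray, mask: np.ndarray,
                   max_sigma: float = 6.0, levels: int = 4) -> np.ndarray:
    """mask in 0..1 chooses how much blur each pixel receives."""
    # Pre-blur the image at several strengths (sigma 0 = sharp original).
    blurs = [gaussian_filter(img, sigma=(max_sigma * i / (levels - 1),) * 2 + (0,))
             for i in range(levels)]
    idx = np.clip(mask * (levels - 1), 0, levels - 1)
    lo = np.floor(idx).astype(int)
    hi = np.minimum(lo + 1, levels - 1)
    frac = idx - lo
    out = np.zeros_like(img)
    for i in range(levels):
        # Each pixel blends the two blur levels nearest its mask value.
        w = np.where(lo == i, 1.0 - frac, 0.0) + np.where(hi == i, frac, 0.0)
        out += blurs[i] * w[..., None]
    return out
```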

As Ferstl moved forward with the look development, the issue that continually came up was that while he and Core were happy with the way these processes affected any static image in the show, “as soon as the camera moves,” Ferstl explains, “you’d feel like the work went from being part of the image to just a veil stuck on top.”

He once again made use of Fusion’s compositing capabilities: The delivery spec was UHD, and he graded the actual photography in that resolution. But he built all the effects layers at the much larger 7K. “With the larger layers,” he says, “if the camera moved, I was able to use Fusion to track and blend the texture with it. It didn’t have to just seem tacked on. That really made an enormous difference.”
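Some back-of-envelope math on that resolution headroom, assuming "7K" means a 7168px-wide canvas (vendor definitions vary):

```python
UHD_W, UHD_H = 3840, 2160
K7_W = 7168
K7_H = K7_W * UHD_H // UHD_W        # 4032 px at the same aspect ratio

slide_x = (K7_W - UHD_W) // 2       # ±1664 px of horizontal travel
slide_y = (K7_H - UHD_H) // 2       # ±936 px of vertical travel
print(f"texture layer can track ±{slide_x}px horizontally, ±{slide_y}px vertically")
```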

Firepower
Fortunately for Ferstl, Company 3’s infrastructure provided the enormous throughput, storage and graphics/rendering capabilities to work with all these elements (some of which were extremely GPU-intensive) playing back in concert in a color grading bay. “I had all these textured elements and external mattes all playing live off the [studio’s custom-built] SAN and being blended in Resolve. We had OpenFX plugins for border and texture and flares generated in real time with the swing/tilt effect running on every shot. That’s a lot of GPU power!”

Ferstl found this entire experience artistically rewarding, and looks forward to similar challenges. “It’s always great when a project involves exploring the tools I have to work with and being able to create new looks that push the boundaries of what my job of colorist entails.”

AMD’s new Radeon Pro VII graphics card for 8K workflows

AMD has introduced the AMD Radeon Pro VII workstation graphics card, designed for those working in broadcast and media in addition to CAE and HPC applications. According to AMD, the Radeon Pro VII offers 16GB of high-speed HBM2 (high-bandwidth memory), support for six synchronized displays and high-bandwidth PCIe 4.0 interconnect technology.

AMD says the new card considerably speeds up 8K image processing performance in Blackmagic’s DaVinci Resolve in addition to performance speed updates in Adobe’s After Effects and Photoshop and Foundry’s Nuke.

The AMD Radeon Pro VII introduces AMD Infinity Fabric Link technology to the workstation market, which speeds application data throughput by enabling high-speed GPU-to-GPU communications in multi-GPU system configurations. The new workstation graphics card provides the high performance and advanced features that enable post teams and broadcasters to visualize, review and interact with 8K content.

The AMD Radeon Pro VII graphics card is expected to be available beginning mid-June for $1,899. AMD Radeon Pro VII-equipped workstations are expected to be available in the second half of 2020 from OEM partners.

Key features include:
– 16GB of HBM2 with 1TB/s memory bandwidth and full ECC capability to handle large and complex models and datasets smoothly with low latency.
– A high-bandwidth, low-latency connection that allows memory sharing between two AMD Radeon Pro VII GPUs, enabling users to increase project workload size and scale, develop more complex designs and run larger simulations to drive scientific discovery. AMD Infinity Fabric Link delivers up to 5.25x PCIe 3.0 x16 bandwidth, with a communication speed of up to 168GB/s peer-to-peer between GPUs (see the quick check after this list).
– Users can access their physical workstation from virtually anywhere with the remote workstation IP built into the AMD Radeon Pro Software for Enterprise driver.
– PCIe 4.0 delivers double the bandwidth of PCIe 3.0 to enable smooth performance for 8K, multichannel image interaction.
– Enables precise synchronized output for display walls, digital signage and other visual displays (AMD FirePro S400 synchronization module required).
– Supports up to six synchronized display panels, full HDR and 8K screen resolution (single display) combined with ultra-fast encode and decode support for enhanced multi-stream workflows.
– Optimized and certified with pro applications for stability and reliability. A list of Radeon Pro Software-certified ISV applications is available from AMD.
– ROCm open ecosystem, an open software platform for accelerated compute, provides an easy GPU programming model with support for OpenMP, HIP and OpenCL and for ML and HPC frameworks.
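A quick sanity check of the Infinity Fabric figure above, assuming the "PCIe 3.0 x16" baseline AMD uses is the roughly 32GB/s bidirectional number (about 16GB/s in each direction):

```python
pcie3_x16_bidir_gbps = 32     # approximate bidirectional baseline (assumed)
multiple = 5.25
print(multiple * pcie3_x16_bidir_gbps)   # 168.0 GB/s peer-to-peer, as quoted
```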

AMD Radeon Pro workstation graphics cards are supported by the Radeon Pro Software for Enterprise driver, offering enterprise-grade stability, performance, security, image quality and other features, including high-resolution screen capture, recording and video streaming. The company says the latest release offers up to a 14 percent year-over-year performance improvement for current-generation AMD Radeon Pro graphics cards. The new software driver is now available for download from AMD.com.

AMD also released updates for AMD Radeon ProRender, a physically-based rendering engine built on industry standards that enables accelerated rendering on any GPU, any CPU and any OS. The updates include new plugins for Side Effects Houdini and Unreal Engine and updated plugins for Autodesk Maya and Blender.

For developers, an updated AMD Radeon ProRender SDK is now available on the redesigned GPUOpen.com site and is now easier to implement with an Apache License 2.0. AMD also released a beta SDK of the next-generation Radeon ProRender 2.0 rendering engine with enhanced CPU and GPU rendering support and open-source versions of the plugins.

Production begins again on New Zealand’s Shortland Street series

By Katie Hinsen

The current global pandemic has shut down production all over the world. Those who can have moved to working from home, and there’s speculation about how and when we’ll get back to work again.

New Zealand, a country with a significant production economy, has announced that it will soon reopen for shoots. The most popular local television show, Shortland Street, was the first to resume production after an almost six-week break. It’s produced by Auckland’s South Pacific Pictures.

Dylan Reeve

I am a native New Zealander who has worked in post there on and off over the years. Currently I live in Los Angeles, where I am an EP for dailies and DI at Nice Shoes, so taking a look at how New Zealand is rolling things out interests me. With that in mind, I reached out to Dylan Reeve, head of post production at Shortland Street, to find out how it looked the week they went back to work under Level 3 social distancing restrictions.

Shortland Street is a half-hour soap that runs five nights a week on prime-time television. It has been on air for around 28 years and has been consistently among the highest-rated shows in the nation. It’s a cultural phenomenon. While the cast and crew take a single three-week annual break from production during the Christmas holiday season, the show has never really stopped production … until the pandemic hit.

Shortland Street’s production crew is typically made up of about 100 people; the post department consists of two editors, two assistants, a composer and Reeve, who is also the online editor. Sound mixes and complex VFX are done elsewhere, but everything else for the production is done at the studio.

New Zealand responded to COVID-19 early, instituting one of the harshest lockdowns in the world. Reeve told me that they went from alert Level 1 — basic social distancing, more frequent handwashing — to Level 3 as soon as the first signs of community transmission were detected. They stayed at this level for just two days before going to Level 4: complete lockdown. New Zealanders had 48 hours to get home to their families, shop for supplies and make sure they were ready.

“On a Monday afternoon at about 1:30pm, the studio emptied out,” explains Reeve. “We were shut down, but we were still on air, and we had about five or six weeks’ worth of episodes in various stages of production and post. I then had two days to figure out and prepare for how we were going to finish all of those and make sure they got delivered so that the show could continue to be on air.”

Shortland Street’s main production building dressed as the exterior of the hospital where the show is set, with COVID workplace safety materials on the doors.

The nature of the show’s existing workflow meant that Reeve had to copy all the media to drives and send Avids and drives home with the editors. The assistant editors logged in remotely for any work they needed to do, and Reeve took what he needed home as well to finish onlining, prepping and delivering those already-shot episodes to the broadcaster. They used Frame.io for review and approval with the audio team and with the directors, producers and network.

“Once we knew we were coming back into Level 3, and the government put out more refined guidelines about what that required, we had a number of HoD meetings — figuring out how we could produce the show while maintaining the restrictions necessary.”

I asked Reeve whether he and his crew felt safe going back to work. He reminded me that New Zealand only went back down to Level 3 once there had been a period with no remaining evidence of community transmission. Infection rates in New Zealand had spent two weeks in single digits, including two days when no new cases had been reported.

Starting Up With Restrictions
My conversation with Reeve took place on May 4, right after his first few days back at work. I asked him to explain some of the conditions under which the production was working while the rest of the country was still in isolation. Level 3 in New Zealand is almost identical to the lockdown restrictions put in place in US cities like New York and Los Angeles.

“One of the key things that has changed in terms of how we’re producing the show is that we physically have way less crew in the building. We’re working slower, and everyone’s having to do a bit more, maybe, than they would normally.

Shortland Street director Ian Hughes and camera operator Connagh Heath discussing blocking with a one-metre guide.

“When crew are in a controlled workspace where we know who everyone is,” he continues, “that allows us to keep track of them properly — they’re allowed to work within a meter of one another physically (three feet). Our policy is that we want staff to stay two meters (six feet) apart from one another as much as possible. But when we’re shooting, when it’s necessary, they can be a meter from one another.”

Reeve says the virus has certainly changed the nature of what can be shot. There are no love scenes, no kissing and no hugs. “We’re shooting to compensate for that; staging people to make them seem closer than they are.

“Additionally, everything stays within the production environment. Parts of our office have been dressed; parts of our building have been dressed. We’ll do a very low-profile exterior shoot for scenes that take place outside, but we’re not leaving the lot.”

Under Level 3, everyone is still under isolation at home. This is why, explains Reeve, social distancing has to continue at work. That way any infection that comes into the team can be easily traced and contained and affect as few others as possible. Every department maintains what they call a “bubble,” and very few individuals are allowed to cross between them.

Actors are doing their own hair and makeup, and there are no kitchen or craft services available. The production is using and reusing a small number of regular extras, with crew stepping in occasionally as well. Reeve noted that Australia was also resuming production on Neighbours, with crew members acting as extras.

“Right now in our studio, our full technical complement consists of three camera operators at the moment, just one boom operator and one multi-skilled person who can be the camera assist, the lighting assist and the second boom op if necessary. I don’t know how a US production would get away with that. There’s no chance that someone who touches lights on a union production can also touch a boom.”

Post Production
Shortland Street’s post department is still working from home. Now that they are back in production, they are starting to look at more efficient ways to work remotely. While there are a lot of great tools out there for remote post workflows, Reeve notes that for them it’s not that easy, especially when hardware and support are halfway across the world, borders are closed and supply chains are disrupted.

There are collaboration tools that exist, but they haven’t been used “simply because the pace and volume of our production means it’s often hard to adapt for those kinds of products,” he says. “Every time we roll camera, we’re rolling four streams of DNxHD 185, so nearly 800Mb/s each time we roll. We record that media directly into the server to be edited within hours, so putting that in the cloud or doing anything like that was never the best workflow solution. When we wanted feedback, we just grabbed people from the building and dragged them into the edit suite when we wanted them to look at something.”
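A quick check of the data rate Reeve quotes: DNxHD 185 is nominally about 185Mb/s per stream. The per-hour storage figure below is my own back-of-envelope addition.

```python
streams = 4
mbps = 185
total_mbps = streams * mbps                 # 740 Mb/s, i.e. "nearly 800"
gb_per_hour = total_mbps / 8 * 3600 / 1000  # ~333 GB of media per rolled hour
print(total_mbps, round(gb_per_hour))
```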

Ideally, he says, they would have tested and invested in these tools six months ago. “We are in what I call a duct tape stage. We’re taking things that exist, that look useful, and we’re trying to tape them together to make a solution that works for us. Coming out of this, I’m going to have to look at the things we’ve learned and the opportunities that exist and decide whether or not there might be some ways we can change our future production. But at the moment, we’re just trying to make it through.”

Because Shortland Street has only just resumed shooting, they haven’t reached the point yet where they need to do what Reeve calls “the first collaborative director/editor thing” from start to finish. “But there are two plans that we’re working toward. The easy, we-know-it-works plan is that we do an output, we stick it on Frame.io, the director watches it, puts notes on it, sends it back to us. We know that works, and we do that sometimes with directors anyway.

“The more exciting idea is that we have the directors join us on a remote link and watch the episodes as they would if they were in the room. We’ve experimented with a few things and haven’t found a solution that makes us super-happy. It’s tricky because we don’t have an existing hardware solution in place that’s designed specifically for streaming a broadcast output signal over an internet connection. We can do a screen-share, and we’ve experimented with Zoom and AnyDesk, but in both those cases, I’ve found that sometimes the picture will break up unacceptably, or sync will drift — especially using desktop-sharing software that’s not really designed to share full-screen video.”

Reeve and crew are just about to experiment with a tool used for gaming called Parsec. It’s designed to share low-latency, in-sync, high-frame-rate video. “This would allow us to share an entire desktop at, theoretically, 60fps with half-second latency or less. Very brief tests looked good. Plan A is to get the directors to join us on Parsec and screen-share a full-screen output off Avid. They can watch it down and discuss with the editor in real time or just make their own notes and work through it interactively. If that experience isn’t great, or if the directors aren’t enjoying it, or if it’s just not working for some reason, we’ll fall back to outputting a video, uploading it to Frame.io and waiting for notes.”

What’s Next?
What are the next steps for other productions returning to work? Shortland Street is the only production that chose to resume under Level 3. The New Zealand Film Commission has said that filming will resume eventually under Level 2, which is being rolled out in several stages beginning this week. Shortland Street’s production company has several other shows, but none have plans to resume yet.

“I think it’s a lot harder for them to stay contained because they can’t shoot everything in the studio,” explains Reeve. “Our production has an added advantage because it is constantly shooting and the core cast and crew are mostly the same every day. I think these types of productions will find it easiest to come back.”

Reeve says that anyone coming into their building has to sign in and deliver a health declaration — recent travel, contact with any sick person, other work they’ve been engaged in. “I think if you can do some of that reasonable contact tracing with the people in your production, it will be easier to start again. The more contained you can keep it, the better. It’s going to be hard for productions that are on location, have high turnover or a large number of extras — anything where they can’t keep within a bubble.

“From a post point of view, I think we’re going to get a lot more comfortable working remotely,” he continues. “And there are lots of editors who already do that, especially in New Zealand. If that can become the norm, and if there are tools and workflows that are well established to support that, it could be really good for post production. It offers a lot of great opportunities for people to broaden their client base or the geographic regions in which they can work.”

Productions are going to have to make their own sort of health and safety liability decisions, according to Reeve. “All of the things we are doing are effectively responding to New Zealand government regulation, but that won’t be the case for everyone else.”

He sees some types of production finding an equilibrium. “Love Island might be the sort of reality show you can make. You can quarantine everyone going into that show for 14 days, make sure they’re all healthy, and then shoot the show because you’re basically isolated from the world. Survivor as well, things like that. But a reality show where people are running around the streets isn’t happening anymore. There’s no Amazing Race, that’s for sure.”


After a 20-year career talent-side, Katie Hinsen turned her attention to building, developing and running post facilities with a focus on talent, unique business structures and innovative use of technology. She has worked on over 90 major feature and episodic productions, founded the Blue Collar Post Collective, and currently leads the dailies & DI department at Nice Shoes.

Posting John Krasinski’s Some Good News

By Randi Altman

Need an escape from a world filled with coronavirus and murder hornets? You should try John Krasinski’s weekly YouTube show, Some Good News. It focuses on the good things that are happening during the COVID-19 crisis, giving people a reason to smile with things such as a virtual prom, Krasinski’s chat with astronauts on the ISS and bringing the original Broadway cast of Hamilton together for a Zoom singalong.

L-R: Remy, Olivier, Josh and Lila Senior

Josh Senior, owner of Leroi and Senior Post in Dumbo, New York, is providing editing and post to SGN. His involvement began when he got a call from a mutual friend of Krasinski’s, asking if he could help put something together. They sent him clips via Dropbox, and a workflow was born.

While the show is shot at Krasinski’s house in New York at different times during the week, Senior’s Fridays, Saturdays and Sundays are spent editing and posting SGN.

In addition to his post duties, Senior is an EP on the show, along with his producing partner Evan Wolf Buxbaum at their production company, Leroi. The two work in concert with Allyson Seeger and Alexa Ginsburg, who executive produce for Krasinski’s company, Sunday Night Productions. Production meetings are held on Tuesday, and then shooting begins. After footage is captured, it’s shared via Dropbox or good old iMessage.

Let’s find out more…

What does John use for the shoot?
John films on two iPhones. A good portion of the show is screen-recorded on Zoom, and then there’s the found footage user-generated content component.

What’s your process once you get the footage? And, I’m assuming, it’s probably a little challenging getting footage from different kinds of cameras?
Yes. In the alternate reality where there’s no coronavirus, we run a pretty big post house in Dumbo, Brooklyn. And none of the tools of the trade that we have there are really at play here, outside of our server, which exists as the ever-present backend for all of our remote work.

The assets are pulled down from wherever they originate. The masters are then housed behind an encrypted firewall, like we do for all of our TV shows at the post house. Our online editor is the gatekeeper. All the editors, assistant editors, producers, animators, sound folks — they all get a mirrored drive that they download, locally, and we all get to work.

Do you have a style guide?
We have a bible, which is a living document that we’ve built week over week. It has music cues, editing style, technique, structure, recurring themes, and a living archive of all the notes we’ve received and how we’ve addressed them. Any style that’s specific to segments, post processing, and any phasing or audio adjustments we make also live within that document, which we give to whoever we onboard to the show.

Evan Wolf Buxbaum

Our post producers made this really elegant workflow that’s a combination of Vimeo and Slack where we post project files and review links and share notes. There’s nothing formal about this show, and that’s really cool. I mean, at the same time, as we’re doing this, we’re rapidly finishing and delivering the second season of Ramy on Hulu. It comes out on May 29.

I bet that workflow is a bit different than SGN’s.
It’s like bouncing between two poles. That show has a hierarchy, it’s formalized, there’s a production company, there’s a network, there’s a lot of infrastructure. This show is created in a group text with a bunch of friends.

What are you using to edit and color Some Good News?
We edit in Adobe Premiere, and that helps mitigate some of the challenges of the mixed media that comes in. We typically color inside of Adobe, and we use Pro Tools for our sound mix. We online and deliver out of Resolve, which is pretty much how we work on most of our things. Some of our shows edit in Avid Media Composer, but on our own productions we almost always post in Premiere — so when we can control the full pipeline, we tend to prefer Adobe software.

Are review and approvals with John and the producers done through iMessage and Dropbox too?
Yes, and we post links on Vimeo. Thankfully we actually produce Some Good News as well as post it, so that intersection is really fluid. With Ramy it’s a bit more formalized. We do notes together and, usually internally, we get a cut that we like. Then it goes to John, and he gives us his thoughts and we retool the edit; it’s like rapid prototyping rather than a gated milestone. There are no network cuts or anything like that.

Joanna Naugle

For me, what’s super-interesting is that everyone’s ideas are merited and validated. I feel like there’s nothing that you shouldn’t say because this show has no agenda outside of making people happy, and everybody’s uniquely qualified to speak to that. With other projects, there are people who have an experience advantage, a technical advantage or some established thought leadership. Everybody knows what makes people happy. So you can make the show, I can make the show, my mom can make the show, and because of that, everything’s almost implicitly right or wrong.

Let’s talk about specific episodes, like the ones featuring the prom and Hamilton. What were some of the challenges of working with all of that footage? Maybe start with Hamilton.
That one was a really fun puzzle. My partner at Senior Post, Joanna Naugle, edited that. She drew on a lot of her experience editing music videos, performance content, comedy specials, multicam live tapings. It was a lot like a multicam live pre-taped event being put together.

We all love Hamilton, so that helps. This was a combination of performers pre-taping the entire song and a live performance. The editing technique really dissolves into the background, but it’s clear that there’s an abundance of skill that’s been brought to that. For me, that piece is a great showcase of the aesthetic of the show, which is that it should feel homemade and lo-fi, but there’s this undercurrent of a feat to the way that it’s put together.

Getting all of those people into the Zoom, getting everyone to sound right, having the ability to emphasize or de-emphasize different faces. To restructure the grid of the Zoom, if we needed to, to make sure that there’s more than one screen worth of people there and to make sure that everybody was visible and audible. It took a few days, but the whole show is made from Thursday to Sunday, so that’s a limiting factor, and it’s also this great challenge. It’s like a 48-hour film festival at a really high level.

What about the prom episode?
The prom episode was fantastic. We made the music performances the day before and preloaded them into the live player so that we could cut to them during the prom. Then we got to watch the prom. To be able to participate as an audience member in the content that you’re still creating is such a unique feeling and experience. The only agenda is happiness, and people need a prom, so there’s a service aspect of it, which feels really good.

John Krasinski setting up his shot.

Any challenges?
It’s hard to put things together that are flat, and I think one of the challenges that we found at the onset was that we weren’t getting multiple takes of anything, so we weren’t getting a lot of angles to play with. Things are coming in pretty baked from a production standpoint, so we’ve had to find unique and novel ways to be nonlinear when we want to emphasize and de-emphasize certain things. We want to present things in an expositional way, which is not that common. I couldn’t even tell you another thing that we’ve worked on that didn’t have any subjectivity to it.

Let’s talk sound. Is he just picking up audio from the iPhones or is he wearing a mic?
Nope. No mic. It’s audio from the iPhones that we just run through a few filters in Pro Tools. Nobody mics themselves. We do spend a lot of time balancing out the sound, but there’s not a lot of effects work.

Other than SGN and Ramy, what are some other shows you guys have worked on?
John Mulaney & the Sack Lunch Bunch, 2 Dope Queens, Random Acts of Flyness, Julio Torres: My Favorite Shapes by Julio Torres and others.

Anything that I haven’t asked that you think is important?
It’s really important for me to acknowledge that this is something that is enabling a New York-based production company and post house to work fully remotely. In doing this week over week, we’re really honing what we think are tangible practices that we can then turn around and evangelize out to the people that we want to work with in the future.

I don’t know when we’re going to get back to the post house, so being able to work on a show like this is providing this wonderful learning opportunity for my whole team to figure out what we can modulate from our workflow in the office to be a viable partner from home.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Posting Everest VR: Journey to the Top of the World

While preparing to climb both Mount Everest and Mount Lhotse without the use of bottled oxygen, renowned climber Ueli Steck fell to his death in late April of 2017. VR director and alpine photographer Jonathan Griffith and mountain guide Tenji Sherpa, both friends of Steck, picked up the climber’s torch, and the result was the 8K 3D documentary Everest VR: Journey to the Top of the World, produced by Facebook’s Oculus.

Over the course of three years, Griffith shot footage following Tenji and some of the world’s most accomplished climbers in some of the world’s most extreme locations. The series also includes footage that lets viewers witness what it is like to be engulfed in a Himalayan avalanche, cross a crevasse and stare deep into its depths, take a huge rock-climbing fall, camp under the stars and soak in the view from the top of the world.

For the post part of the doc, Griffith called on veteran VR post pro Matthew DeJohn for editing and color correction, VR stitching expert Keith Kolod for stitching and Brendan Hogan for sound design.

“It really was amazing how a small crew was able to get all of this done,” says Griffith. “The collaboration between myself as the cameraman and Matt and Keith was a huge part of being able to get this series done — and done at such a high quality.

“Matt and Keith would give suggestions on how to capture for VR, how camera wobbling impacted stitching, how to be aware of the nadir and zenith in each frame and to think about proximity issues. The efficient post process helped in letting us focus on what was needed, and I am incredibly happy with the end result.”

DeJohn was tasked with bringing together a huge amount of footage from a number of different high-end camera systems, including the Yi Halo and Z Cam V1 Pro.

DeJohn called on Blackmagic Resolve for this part of the project, saying that using one tool for everything helped speed up the process. “A VR project usually has different teams of multiple people for editing, grading and stitching, but with Resolve, Keith and I handled everything,” he explains.

Within Resolve, DeJohn cut the series at 2Kx2K, relinked to the 8Kx8K source and then changed the timeline resolution to 8Kx8K for final color and rendering. He used the Fairlight audio editing tab to make fine adjustments, manage different narration takes with audio layers, and handle varied source files such as mono narration, stereo music and four-channel ambisonic spatial audio.
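For readers curious how that resolution bump can be driven, here is a minimal sketch using DaVinci Resolve's Python scripting API (the setting keys follow Resolve's scripting README). Treat it as an illustration of the workflow, not the production's actual tooling; relinking to the 8K media still happens in the Resolve UI.

```python
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()

# Cut at 2Kx2K for speed, then switch the project timeline resolution
# to 8Kx8K for final color and rendering.
project.SetSetting("timelineResolutionWidth", "8192")
project.SetSetting("timelineResolutionHeight", "8192")
```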

In terms of color grading, DeJohn says, “I colored the project from the very first edit so when it came to finalize the color it was just a process of touching things up.”

Fusion Studio was used for stereoscopic alignment fixes, motion graphics, rig removal, nadir patches, stabilization, stereo correction of the initial stitch, re-orienting 360 imagery, viewing the 360 scenes in a VR headset and controlling focal areas. More intense stitching work was done by Kolod using Fusion Studio.

Footage of such an extreme environment, as well as the closeness of the climbers to the cameras, provided unique challenges for Kolod, who had to rebuild portions of images from individual cameras. He also had to manually ramp down the stereo at the images’ north and south poles to ensure easy viewing, fix stereo misalignment and distance issues between the foreground and background, and calm excessive movement in images.

“A regular fix I had to make was adjusting incorrect vertical alignments, which create huge problems for viewing. Even if a camera is a little bit off, the viewer can tell,” says Kolod. “The project used a lot of locked-off tripod cameras, and you would think that the images coming from them would be completely steady. But a little bit of wind or slight movement in what is usually a calm frame makes a scene unwatchable in VR. So I used Fusion for stabilization on a lot of shots.”
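As a simplified illustration of the vertical-alignment fix Kolod describes, here is a sketch that finds and removes one global vertical offset between the eyes. Real stitching tools solve this per seam and per region rather than globally, and the search range here is arbitrary.

```python
import numpy as np

def best_vertical_offset(left: np.ndarray, right: np.ndarray,
                         search: int = 10) -> int:
    """Find the dy in [-search, search] that best registers right to left."""
    errors = [np.mean((left - np.roll(right, dy, axis=0)) ** 2)
              for dy in range(-search, search + 1)]
    return int(np.argmin(errors)) - search

def align_right_eye(right: np.ndarray, dy: int) -> np.ndarray:
    return np.roll(right, dy, axis=0)   # apply the measured correction
```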

“High-quality VR work should always be done with manual stitching with an artist making sure there are no rough areas. The reason why this series looks so amazing is that there was an artist involved in every part of the process — shooting, editing, grading and stitching,” concludes Kolod.

Light Illusion intros color management system for beta testing

Color management specialist Light Illusion has launched a new color management product called ColourSpace CMS for post facilities, studios and broadcasters. It is available for pre-order, which enables participation in the open beta program to provide feedback and input for the final release.

Running on a PC, ColourSpace CMS software connects to a wide variety of display calibration probes in order to accurately assess the color rendition of any display. It then creates color management data that can be used to perfect color accuracy by uploading it into a display that has color management features or, alternatively, applied as a correction to the signal feeding the display using associated hardware.

Applications include display calibration, color management and color workflows within the professional film, post production and broadcast industries and for display manufacturers and home cinema enthusiasts. ColourSpace CMS data is compatible with most color-related post platforms such as Resolve, Flame and Baselight.

ColourSpace CMS was designed to improve how color accuracy is measured and reported. Light Illusion CEO Steve Shaw says, “The most visually impressive aspect of ColourSpace CMS is the way it communicates color accuracy to the user. Full volumetric accuracy of any given display can be quickly and easily assessed using the new three-dimensional, fully interactive, resizable and color-coded graphs. Complex color data can be clearly analyzed with 3D CIE and normalized RGB Color Space graphs, including Error Tangent lines and color coded measure points.

“Color coding within the display graphs helps to quickly identify the accuracy of any given measured point. For example, green signifies that a measured color value is below 1dE, while orange shows a color as being between 1dE and 2.3dE, and red indicates a value above 2.3dE. Additional Tangent Lines are visual plots of any given point’s error, showing the recorded location of the measured color, compared to where the color should actually be located for any given color space.”
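Here is a minimal sketch of that color-coded reporting. Plain dE76 (Euclidean distance in CIELAB) is used for simplicity; ColourSpace's actual metric may differ, and only the 1/2.3 bands are taken from the quote above.

```python
def delta_e76(lab1, lab2):
    """Euclidean distance between two CIELAB triples."""
    return sum((a - b) ** 2 for a, b in zip(lab1, lab2)) ** 0.5

def classify(de):
    if de < 1.0:
        return "green"     # below 1dE: effectively accurate
    if de <= 2.3:
        return "orange"    # between 1dE and 2.3dE
    return "red"           # above 2.3dE: clearly off

print(classify(delta_e76((50, 10, 10), (50, 11, 9))))   # ~1.41 -> "orange"
```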

Display manufacturer Flanders Scientific has been involved in testing and feedback during the development of ColourSpace CMS. The open beta program will be available to anyone who has ordered ColourSpace CMS in advance.

Colorist Chat: I Am Not Okay With This’ Toby Tomkins

Colorist Toby Tomkins, co-founder of London color grading and finishing boutique Cheat, collaborated once again with The End of the F***ing World director Jonathan Entwistle on another Charles Forsman graphic novel, I Am Not Okay With This. The now-streaming Netflix show is produced by Entwistle alongside Stranger Things EPs Shawn Levy and Dan Cohen, as well as Josh Barry. The director also once again called on DP Justin Brown.

Toby Tomkins

Adapted from Forsman’s graphic novel of the same name, the series follows a teenage girl named Sydney as she navigates school, crushes, her sexuality and sudden-onset superpowers. You know, the typical teenage experience.

Here, Tomkins talks about collaborating with the director and DP as well as his workflow.

How early did you get involved on I Am Not Okay With This?
Jon Entwistle had reached out to DP Justin Brown about his interest in adapting this graphic novel after working on The End of the F***ing World. When the series then got commissioned and Justin was on board, he and Jon convinced production company 21 Laps that they could do the grade in London with Cheat. There were some discussions about grading in LA, but we managed to convince them that it could be a quick and easy process back here, and that’s how I got involved.

I was on board quite early on in the production, getting involved with camera tests and reviewing all the material with Justin. We worked together to evaluate the material, and after Justin chose the camera and lenses, we built a color pipeline that informed how the material was shot and how the show would be captured and pass through the color pipeline. From then, we started building off the work we did on The End of the F***ing World. (Check out our coverage of The End of the F***ing World, which includes an interview with Tomkins.)

What kind of look did Jon and Justin want, and how did they express that look to you? Film or show references? Pictures?
There were quite a few visual references, which I already knew from previously working with Jon and Justin. They both gravitate toward a timeless American cinema look — something photochemical but also natural. I knew it would be similar to The End of the F***ing World, but we were obviously using different locations and a slightly different light, so there was a little bit of playing around at the beginning.

We’re all fans of American cinema, especially the look of old film stock. We wanted the look of the show to feel a little bit rough around the edges — like when things were shot on film and you had limited control over making changes. Films weren’t corrected to a perfect level, and we wanted to keep those imperfections for this show, making it feel authentic and not overly polished. Although it was produced by the same people who did Stranger Things, we wanted to stray from that style slightly, making it feel a bit different.

We were really aiming for a timeless American look, with a vintage aesthetic that played into a world that was slightly out of place and not really part of reality. During the grade, Justin liked to put a line through it, keeping it all very much in the same space, with perhaps a little pop on the reds and key “American” colors.

Personally, I wanted to evoke the style of teen films from the late 20th century — slightly independent-looking and minimally processed. Films like 10 Things I Hate About You and She’s All That certainly influenced me.

You have all worked together in the past. How did that help on this show? Was there a kind of shorthand?
We learned a lot doing The End of the F***ing World together and Justin and I definitely developed a shorthand. It’s like having a head start because we are all on the same page from the get-go. Especially as I was grading remotely with Justin and Jon just trusted us to know exactly what he wanted.

Tomkins works on Resolve

At the end of the first day, we shared our work with Jon in LA and he’d watch and add his notes. There were only three notes of feedback from him, which is always nice! They were notes on richness in some scenes and a question on matching between two shots. As we’d already tested the cameras and had conversations about it before, we were always on the same page with feedback and I never disagreed with a single note. And Jon only had to watch the work through once, which meant he was always looking at it with clean, fresh eyes.

What was the show shot on, and what did you use for color grading?
It was shot on ARRI Alexa, and I used DaVinci Resolve Studio.

Any particular challenges on this one for you?
It was actually quite smooth for me! Because Justin and I have worked together for so long, and because we did the initial testing around cameras and LUTs, we were very prepared. Justin had a couple of challenges due to unpredictable weather in Pittsburgh, but he likes to do as much as possible in-camera. So once it got to me, we were already aligned and prepared.

How did you find your way to being a colorist?
I started off in the art department on big studio features but wanted to learn more about filmmaking in general, so I went to film school in Bournemouth, back when it was called the Arts University College Bournemouth. I quickly realized my passion was post and gleaned what I could from an exceptional VFX tutor there called Jon Turner. I started specializing in editing and then VFX.

I loved the wizardry and limitless availability of VFX but missed the more direct relationship with storytelling, so when I found out about color grading — which seemed like the perfect balance of both — I fell in love. Once I started grading, I didn’t stop. I even bribed the cleaners to get access to the university grading suite at night.

My first paid gig was for N-Dubz, and after I graduated and they became famous, they kept me on. And that gave me the opportunity to work on bigger music videos with other artists. I set up a suite at home (way before anyone else was really doing this) and convinced clients to come 30 minutes out of London to my parents’ house in a little village called Kings Langley.

I then got asked to set up a color department for a sound studio called Tate Post, where I completed lots of commercials, a few feature films — notably Ill Manors — and some shorts. These included one for Jon called Human Beings, which is where our relationship began! After that, I went it alone again and eventually set up Cheat. The rest is history.

What, in addition to the color, do you provide on projects? Small VFX, etc.?
For I Am Not Okay With This, we did some minor work, online and delivery in house at Cheat. I just do color, however. I think it’s best to leave each department to do its own work and trust the knowledge and experience of experts in the field. We worked with LA-based VFX company Crafty Apes for the show; they were really fantastic.

Where do you get inspiration? Photographs, museums?
Mostly from films — both old and new — and definitely photography and the work of other colorists.

Finally, any advice you’d give your younger self about working as a colorist?
Keep at it! Experience is everything.

Autodesk’s Flame 2021 adds Dolby Vision, expands AI and ML offerings

Autodesk has released Flame 2021 with new features aimed at innovating and accelerating creative workflows for VFX, color grading, look development and editorial finishing. Flame 2021 increases workflow flexibility for artists, expands AI capabilities with new machine learning-powered human face segmentation and simplifies finishing for streaming services with new functionality for Dolby Vision HDR authoring and display. In response to user requests, the release also adds a new GPU-accelerated Physical Defocus effect and finishing enhancements that make it easier to adjust looks across many shots, share updates with clients and work faster.

Useful for compositing, color grading and cosmetic beauty work, the AI-based face segmentation tool automates all tracking and identifies and isolates facial features — including nose, eyes, mouth, laugh lines and cheekbones — for further manipulation. Face matching algorithms are also capable of specialized tasks, including specific mole or scar isolation, through custom layout workflows. Built-in machine learning analysis algorithms isolate and modify common objects in moving footage, dramatically accelerating VFX and compositing workflows.

To meet increasing demand for HDR content mastering driven by OTT streaming services, Flame 2021 introduces a new Dolby Vision HDR authoring and display workflow. This enables Flame to import, author, display and export Dolby Vision HDR shot-by-shot animatable metadata, streamlining creation and delivery of high dynamic range imagery required by leading OTT streaming services. The update also expands collaboration with Autodesk Lustre and other Dolby-certified color grading tools through enabling XML metadata import/export.

Other new features in the Flame 2021 family include:
● Save and recall color grading and VFX: Quickly save and recall color grading and VFX work in the new Explorer, a dedicated “grade bin” and reference comparison area to support artist workflows.
● Viewing area: A new video preview mode shares artist viewports — including storyboard, manager and schematic — to SDI or HDMI preview monitors. In broadcast mode, Gmasks can now be observed in the view area during editing along with any other tools that get directly manipulated.
● Gmask premade shapes: New Gmask premade shapes with softness are available for colorists, compositors, and finishing VFX artists in the image and action nodes.

Flame, Flare and Flame Assist 2021 are available at no additional cost to Flame Family 2020 subscribers.

Colorist Vicki Matich joins UK’s Molinare

UK-based post company Molinare has beefed up its grading team with the addition of colorist Vicki Matich. Over the course of her career, Matich has worked with such directors as Danny Boyle, Paul W.S. Anderson and Sir David Attenborough.

Her resume includes grading the Emmy Award-winning documentary Leaving Neverland: Michael Jackson as well as Jade: The Reality Star Who Changed Britain, The Last Survivors, 8 Days To The Moon and Back, Undercover: Inside China’s Digital Gulag, War in the Blood and Stath Lets Flats.

Matich started her career as a colorist at ABC in Sydney before joining The Finishing School in Leeds, UK, where she predominantly worked on dramas. From there she worked her way up the colorist department at Envy, becoming head of grading, a position she held for over six years. She went freelance last year and has since been working at Molinare as a freelance colorist.

Matich will be working on FilmLight Baselight in Molinare’s theatrical and broadcast grading theaters.


Quick Chat: Director/DP Ruben Latre creates Candleosophy spot at home

Unable to travel the globe to shoot his spots and documentary projects, New York’s Hostage Films director/DP Ruben Latre is still working. With social distancing rules in effect, he is filming, designing and editing from his home. His latest spot, for meditation candle company Candleosophy, showcases some of his in-house capabilities to create new content without cast or crew.

The spot, created for digital and social networks, features macro shots of organic imagery, layered with subtle text, stylized design treatment and a peaceful music track.

Latre on set before social distancing.

We reached out to Latre to find out more about the spot and his workflow:

Was this an existing project that you had set up and then found an alternative way to make?
Yes, we had planned to shoot a spot for Candleosophy showing a candle meditation, with a cast and in a nature setting. Once it became clear that involving anyone would risk the health of cast and crew, production made the decision to shut down the live-action shoot. At that point, we regrouped on how to convey the meditative moment without a full studio shoot, cast or location, keeping only the tabletop portion and never leaving the house. It was always supposed to be more focused on serenity and less product-oriented, so I tried to have that play out in this smaller way.

How did you describe to the client what it was going to look like?
I’ve worked with the client before on a really unique, wonderful project, The Pioneer, so I think part of it was a level of trust. It was a bit of luck to have a client who believed in me.

What did you shoot on, light with, edit and color on?
I shot on a Red camera that I have at home, and even though I have lighting, the spot felt like it was calling for something more raw. So it was lit with natural light, shaped to suit each frame. I edited in Adobe Premiere and color corrected in Blackmagic DaVinci Resolve. Sound design was done in Adobe Audition.

Latre’s at-home setup

Did the client provide some of the footage?
They did not provide footage, only candles. All the rest was done in my house.

What were some lessons you learned from the project?
Since the sets were very small, about two feet across, I tried to make them feel wider and worked a lot with magnification.

For the whole project, I used diopters and extension tubes to create a shallow depth of field, which took a bit of testing and playing around to get the look I was after.

What were some of the best and worst parts of working this way?
While I was shooting the spot, I was easily able to change directions. I think when you are able to work alone and see something that you, as the director, are happy with, it feels easier to stand by the result. However, it’s a lot more work in every aspect of making a frame, and there are technical limitations in terms of what you are able to execute.

Behind the Title: Unit9 director Matthew Puccini

This young director has already helmed two short films, Dirty and Lavender, that got into Sundance. And he still finds time to edit when his schedule allows. 

Name: Director Matthew Puccini

Can you describe Unit9?
Unit9, which has headquarters in London and Los Angeles, is a global production company that represents a team of producers and film directors, creative and art directors, designers, architects, product designers, software engineers and gaming experts. I’m based in Brooklyn.

Puccini on set of Dirty

What would surprise people the most about what falls under the title of director?
These days, there’s a certain element of self-promotion that’s required to be a young director. We have to figure out how to brand ourselves in a way that people might not have had to do 10 to 15 years ago when the Internet wasn’t as prevalent in how people discovered new artists. I constantly have to be tip-toeing back and forth between the creative side of the work and the more strategic side — getting the work seen and amplified as much as possible.

What’s your favorite part of the job?
My favorite part of directing is the collaborative aspect of it. I love that it offers this unique ability to dip into so many other disciplines and to involve so many other incredible, wildly different people.

What’s your least favorite?
The salesperson aspect of it can be frustrating. In a perfect world it would be nice to just make things and not have to worry about the back end of finding an audience. But at the same time, sometimes being forced to articulate your vision in a way that’s palatable to a financier or a production company can be helpful in figuring out what the core of the idea is. It’s a necessary evil.

Why did you choose this profession? How early on did you know this would be your path?
I fell in love with directing in high school. We had an amazing theater program at my school. I started off mainly acting, and then there was one show where I ended up being the assistant director instead of acting. That experience was so wonderful and fulfilling and I realized that I preferred being on that side of things. That happened parallel to getting my first video camera, which I enjoyed as a hobby but began to take more seriously during my junior and senior years of high school.

What was it about directing that attracted you?
I fell in love with working with actors to craft performances. The whole process requires so much collaboration and trust and vulnerability. Over time, I’ve also grown to appreciate filmmaking as a means of filling in gaps in representation. I get to highlight human experiences that I feel like I haven’t seen properly portrayed before. It’s wish fulfillment, in a sense; you get to make the work that you wish you were seeing as an audience member.

Puccini on set of Lavender

How do you pick the people you work with on a particular project?
I began making work while I was in school in New York, so there’s a wonderful community of people that I met in college and with whom I still work. I also continue to meet new collaborators at film festivals, or will occasionally just reach out to someone after having seen a film of theirs that I responded to. I continue to be amazed by how willing people are to make time for something if they believe in it, even if it seems like it’s far beneath their pay grade.

How do you work with your DP?
It always just starts with me sending them the script and having a meeting to talk about the story. I might have some preconceived ideas going into that meeting about how I’m thinking of shooting it — what my visual references were while writing the script — but I try to stay open to what they imagined when they were reading it. From there, it’s a very organic process of us pulling references and gradually building a look book together of colors, lighting styles, compositions and textures.

It could be as specific as a frame that we want to copy exactly or as general as a feeling that an image evokes, but the idea is that we’re figuring out what our shared vocabulary is going to be before we get to set. My number one need is knowing that the person is just as passionate about the story as I am and is able to tailor their shooting style to what’s right for that particular project.

Do you get involved with the post at all?
Definitely. I’m very involved with every stage of post, working closely with the department heads who are running the show on a more granular level. I love the post process and enjoy being involved as much as possible.

I also work as a video editor myself, which has given me so much awareness and respect for the importance of a good edit and a good editor. I think sometimes it’s easy to waste time and resources on shooting coverage you’re never going to use. So as a director, it’s important even before starting a project for me to think ahead and visualize what the film really needs so that I can be as efficient and decisive as possible on set.

Dirty

Can you talk about Dirty? What was it like getting it ready for Sundance?
We found out that Dirty got into Sundance last November. Obviously, it’s the call of anyone’s dreams and such a wonderful feeling and boost of validation. We had finished the film back in April, so it had been a long time of waiting.

From November to the festival, it was a rush to get the film ready. We got it recolored and remixed, trying to make it as good as possible before it premiered there. It was a bit of a whirlwind. The festival itself was a really special experience. It was incredibly powerful to have a film that, in my mind, pushes the boundaries of what we’re seeing on screen, and to get to share it with a lot of people. There’s a gay sex scene in the middle of the film, and to have that celebrated and accepted by an important part of the film community was really special.

Can you describe the film?
Dirty is a tender coming-of-age film. It follows two queer teenagers over an afternoon as they navigate intimacy for the first time.

What about Lavender? Do you have a distributor for that?
The film was acquired by Searchlight Pictures out of Sundance last year. They released the film on their Vimeo and YouTube channels last spring. They put the film in theaters for a week in NYC and LA in front of a feature film they were showing, which actually qualified it for the Oscars last year.

Can you describe that film?
The film is about a young gay man who is growing increasingly entangled in the marriage of an older couple. It is the portrait of an unconventional relationship as it blossoms and ultimately unravels.

What is the project that you are most proud of?
To me, Dirty and Lavender are both equally important. I don’t have an answer. I’m grateful for both films for different reasons, and they are both part of one period of my life — exploring these ideas of intimacy and loneliness and queer people seeking connection. In some ways they’re almost two attempts to answer the same question.

Name three pieces of technology you can’t live without.
My laptop for all of the writing and editing I do. I try to watch a lot of movies, so I enjoy my TV. And even though I’m trying to wean myself off my phone as much as possible, I still rely on that throughout the day. Obvious answers I know, but it’s true!

What do you do to de-stress from it all?
I find that watching movies and seeing a lot of theater are often the best ways to get inspired and excited about making new work. I’m trying to meditate more. Starting the day with something like that and building out some introspection into my routine has been really helpful. And therapy, of course. Gotta have therapy.

Colorist Chat: FotoKem’s Phil Beckner talks My Spy film, more

Phil Beckner comes to the world of color grading through his digital intermediate editing background, which includes his work on Star Wars: The Last Jedi. Over his nearly 10-year tenure at FotoKem, he has worked as the additional colorist on many studio titles, including The Nun, Aquaman and Shazam!

Beckner’s early titles include work as 3D colorist on 2016’s The Great Wall and as additional colorist on Kong: Skull Island. He moved up to lead colorist on The Director and the Jedi (a full-length BTS documentary for home release). His work can be seen on the upcoming releases My Spy, Lovebirds and Holler.

My Spy

Burbank’s FotoKem is a full-service finishing house specializing in many aspects of post and production support. Services include 2D/3D digital intermediate color grading, 4K/UHD SDR and HDR file-based mastering, digital on-set dailies, and media management and distribution.

Let’s dig a little deeper…

Can you describe the general look for My Spy and how you worked with DP Larry Blanford and director Peter Segal to achieve what they wanted?
My Spy is a great mash-up of “family buddy comedy” and “action movie,” so one of our main goals was to have the film flow seamlessly between those two worlds. We wanted to enhance the drama of the action sequences by really embracing the world that Larry created on set — paying extra attention to the shadows and contrast, for instance — all while making sure not to step on any of the lighter moments in the dialogue.

In contrast, there are interactions between Dave Bautista and the young co-star, Chloe Coleman, that feel very natural and charming. During those scenes we took the opportunity to make it feel like a more traditional comedy — poppy, warm and inviting. The goal being to blend the two looks so that they work in conjunction with each other throughout the film.

I was in contact with Larry before we started the DI and he sent still images along that he liked to get the process started. Once we got rolling and could get everyone in the theater together, it was a very collaborative effort between Pete, Larry and myself. A fun one to work on, for sure!

What format was My Spy shot on? How did that influence your approach to the film?
My Spy was primarily shot with an ARRI Alexa SXT and ARRI Alexa Mini, with some Phantom and GoPro mixed in. Having worked on many ARRI shows over the years, I was familiar with the color science and I knew right away that there would be no compromise in image quality, especially with DP Larry Blanford at the helm. Knowing all of this up front was great because we could immediately dive in and start creating the world that you see on screen.

MY SPY

What color system did you use? Why did it serve the needs of this project?
This show was finished on Blackmagic’s DaVinci Resolve, which has been improving rapidly over the past few years and was perfect for this show. The conform, color correction and even a handful of VFX were done in Resolve in the DI theater, which was great because there is virtually no delay in showing the filmmakers the end result.

Now on to more general questions:

Throughout your career, you have worked alongside a variety of Hollywood’s top colorists. What’s the most valuable lesson you’ve learned?
One of the most valuable lessons I’ve learned is that there is no “right” answer! In many aspects of filmmaking, you know immediately if you’ve succeeded or missed the mark, which is not really the case with color. Color grading is an ongoing collaboration between the colorist and the filmmakers with the end goal being to bring out the best aspects in every image. The communication and rapport you have with the filmmakers in the theater is just as important as knowing which knob to turn.

As a colorist, what would surprise people the most about what falls under that title?
One thing that may surprise people is how technical the job can be. As a colorist in 2020, it’s important to have a real understanding of the entire image pipeline. The colorist is often looked to for their technical expertise throughout the entire process — many times before shooting even begins — so it’s vital to have a solid technical understanding to help usher the image from set, through VFX, ultimately to the big screen.

What’s your favorite part of the job?
I really enjoy being a part of the team that ultimately creates the visual aesthetic you see on screen. Being able to give the filmmakers exactly the image that they know they captured on set or to show them something new that they hadn’t considered before is very satisfying.

What’s your least favorite?
The dark! I try and go for a lap around the block every so often to get some sunshine.

Can you name some recent projects you have worked on?
I finished a comedy for Paramount/MRC called The Lovebirds, starring Kumail Nanjiani and Issa Rae and directed by Michael Showalter. I also wrapped up Holler, a very cool film shot on 16mm, written and directed by Nicole Riegel.

Where do you find inspiration?
Obviously, I find inspiration in the work of my peers. There are a lot of great-looking movies and TV shows that come out every year, and I love seeing what other artists are doing. But maybe, most importantly, I try to be present in the moment. There are a lot of things vying for your attention on a daily basis but when you slow down and start looking with a critical eye at every different environment you find yourself in, there is a lot of information to gain. Whether I’m on the beach at sunset or in a school auditorium, I’m paying attention to how the light is reacting or what skin tones look like and putting that in my mental toolbox for use in the future.

Emma DP Christopher Blauvelt talks looks and workflow

Focus Features’ Emma, the latest screen adaptation of the Jane Austen novel, was directed by Autumn de Wilde and shot by cinematographer Christopher Blauvelt. For de Wilde, a photographer and music video director, Emma is her feature film directorial debut. In addition to her still work on CD covers for The White Stripes, Fiona Apple and Beck, she has directed music videos for indie bands, including Spoon, The Decemberists and Rilo Kiley.

DP Christopher Blauvelt (right)

Blauvelt and de Wilde met in 2000 when she directed Elliott Smith’s Son of Sam music video. He then shot some 16mm footage that was projected behind Elliott on his tour, and the collaborators became friends. When it came time to start on her directorial debut, de Wilde reached out, knowing that he could help her bring her vision to the screen.

Emma was shot with the ARRI Alexa LF using ARRI Signature Prime lenses. Blauvelt had done some tests with it a year or so ago for a film he shot with Gus Van Sant, called Don’t Worry, He Won’t Get Far on Foot, so he was familiar with the camera. “We looked at many different cameras to find our aesthetic,” Blauvelt explains. “I remember making a note about the softness and way it treated skin. It was also something I would think about for scope and scale, for which Emma provided the environments in the form of castles and the English countryside. Of course, we didn’t just test cameras — Autumn had given me many references to use as our guide in finding the unique look of Emma, so we tested many different lenses, cameras, filters, lights and lookup tables.”

Principal photography began in March 2019. Blauvelt hadn’t worked on a feature film in the UK before but was fortunate enough to team up with Mission, a UK-based DIT and digital services company, which assisted in setting up a workflow and color pipeline that ensured the director’s and DP’s vision was communicated to everyone. Mission’s Jacob Robinson was the DIT.

DITs have become a more and more important part of the camera crew and often build close working relationships with DPs. Designing the look of a production is a collaborative process that often begins in preproduction. “I really enjoy my working relationships with DITs; they are the people I rely on to inform me on the rules we’ve put in place on any particular shoot,” says Blauvelt. “Usually during prep, we will do an enormous amount of testing and come up with the recipe that we decide on for the shoot.”

The final DI colorist is often part of the mix too. For Emma, Goldcrest’s Rob Pizzey was the DI colorist, so he was also involved in creating the color pipeline and LUTs with Blauvelt and DIT Robinson. As Blauvelt explains, “It’s also really great having the chance to create custom LUTs with our final grade colorist. We work hard to have a formula that works while on location and all the way to the final grade.”

There are several different ways for a DP to work with LUTs. In Blauvelt’s case, during testing the team created several different base LUTs, including day/exterior, day/interior, night/exterior, night/interior, day/exterior in clouds and sun, and other variations they might encounter during the shoot. “These LUTs are all adjustable as well,” he continues, “and will be manipulated live on set to achieve the desired look. We also have images to serve as our spirit throughout the shoot, reminding us of the original intent.”
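
Live trims like these commonly travel as ASC CDL values, whose math is published in the ASC's spec. Here is a minimal numpy sketch of a CDL applied ahead of a base LUT; the example grade values are invented.

```python
import numpy as np

def apply_cdl(rgb, slope, offset, power, sat=1.0):
    """ASC CDL: out = clamp(in * slope + offset) ** power, then saturation.

    rgb                -- float image, shape (..., 3), values in [0, 1]
    slope/offset/power -- per-channel triplets, e.g. (1.0, 1.0, 1.0)
    sat                -- single saturation scalar
    """
    out = np.clip(rgb * np.asarray(slope) + np.asarray(offset), 0.0, 1.0)
    out = out ** np.asarray(power)
    # Saturation pivots around Rec.709 luma, per the ASC CDL spec.
    luma = out @ np.array([0.2126, 0.7152, 0.0722])
    return np.clip(luma[..., None] + sat * (out - luma[..., None]), 0.0, 1.0)

# An invented warmer, slightly lifted day/exterior trim ahead of the base LUT.
graded = apply_cdl(np.random.rand(4, 4, 3),
                   slope=(1.05, 1.0, 0.95),
                   offset=(0.0, 0.0, 0.01),
                   power=(1.1, 1.1, 1.1))
```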

The digital lab process on Emma was straightforward. Every day, DIT Robinson would send the capture drives from set along with a Blackmagic DaVinci Resolve project with CDLs applied. Mission’s Niall Todd and Neil Gray were tasked with creating synced H264s and DNxHD 115 files for Avid. The data was then backed up to dual LTOs and a G-Tech 10TB G-Drive.
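
Mission's actual pipeline isn't public, but a rough sketch of one leg of such a dailies pass using the open-source ffmpeg tool might look like this; the clip name is hypothetical, and per-show details like burn-ins and LUT handling are omitted.

```python
import subprocess

SRC = "A001C003_200114_R1XK.mov"  # hypothetical camera clip

# Avid editorial deliverable: DNxHD 115 expects a 1080 raster at a matching
# frame rate (e.g. 1920x1080 at 23.976), so the source is assumed to conform.
subprocess.run([
    "ffmpeg", "-i", SRC,
    "-c:v", "dnxhd", "-b:v", "115M", "-pix_fmt", "yuv422p",
    "-c:a", "pcm_s16le",
    "dailies_avid.mxf",
], check=True)

# Review deliverable: a small synced H.264.
subprocess.run([
    "ffmpeg", "-i", SRC,
    "-c:v", "libx264", "-crf", "20", "-pix_fmt", "yuv420p",
    "-c:a", "aac",
    "dailies_review.mp4",
], check=True)
```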

Mission’s on-set to near-set process made it simple for Blauvelt’s vision to be conveyed to everyone. “Mark Purvis has created a collective and creative environment with Mission that I had not experienced before.”

The digital intermediate process was straightforward, with Goldcrest’s Pizzey using DaVinci Resolve to complete the final grading. Emma, which was edited by Nick Emerson, is now streaming.

Sebastian Robertson, Mark Johnson on making Playing For Change’s The Weight

By Randi Altman

If you have any sort of social media presence, it’s likely that you have seen Playing For Change’s The Weight video featuring The Band’s Robbie Robertson, Ringo Starr, Lukas Nelson and musicians from all over the world. It’s amazing, and if you haven’t seen it, please click here now. Right now. Then come back and read how it was made.

L-R: Mark Johnson, Robbie Robertson, Sebastian Robertson, Raan Williams and Robin Moxey

The Weight was produced by Mark Johnson and Sebastian Robertson, Robbie’s son. It was a celebration of the 50th anniversary of The Band’s first studio album, Music From Big Pink, where the song “The Weight” first appeared. Raan Williams and Robin Moxey were also producers on the project.

Playing For Change (PFC) was co-founded by Johnson and Whitney Kroenke in 2002 with the goal to share the music of street musicians worldwide. And it seems the seed of the idea involved the younger Robertson and Johnson. “Mark Johnson is an old friend of mine,” explains Robertson. “I was sitting around in his apartment when he initially conceived the idea of Playing For Change. At first, it was a vehicle that brought street musicians into the spotlight, then it became world musicians, and then it evolved into a big musical celebration.”

Johnson explains further: “Playing For Change was born out of the idea that no matter how many things in life divide us, they will never be as strong as the power of music to bring us all together. We record and film songs around the world to reconnect all of us to our shared humanity and to show the world through the lens of music and art.” Pretty profound words considering current events.

Mermans Mosengo – Kinshasa Congo

Each went on with their busy lives, Robertson as a musician and composer and Johnson traveling the world capturing all types of music. They reconnected a couple of years ago, and the timing was ideal, recalls Robertson: “I wanted to do something to commemorate the 50th anniversary of The Band’s Music From Big Pink — this beautiful album and this beautiful song that my dad wrote — so I brought it to Mark. I wanted to team up with some friends, and we all came together to do something really special for him. That was the driving force behind the production of this video.”

To date, Playing For Change has created over 50 “Songs Around the World” videos — including The Grateful Dead’s Ripple and Jimi Hendrix’s All Along the Watchtower — and recorded and filmed over 1,000 musicians across more than 60 countries.

The Weight is beautifully shot and edited, featuring amazingly talented musicians, interesting locales and one of my favorite songs to sing along to. I reached out to Robertson and Johnson to talk through the production, post and audio post.

This was a big undertaking. All those musicians and locales… how did you choose the musicians that were going to take part in it?
Robertson: First, some friends and I went into the studio to record the very basic tracks of the song — the bass, drums, guitar, a piano and a scratch vocal. The first instrument that was added was my dad on rhythm and lead guitar. He heard this very kind of rough demo version of what we had done and played along with it. Then, slowly along the way, we started to replace all those rough instruments with other musicians around them. That’s basically how the process worked.

Larkin Poe – Venice, California

Was there an audition process, or people you knew, like Lukas Nelson and Marcus King? Or did Playing For Change suggest them?
Robertson: Playing For Change was responsible for the world musicians, and I brought in artists like Lukas, my dad, Ringo and Larkin Poe. They have this incredible syndicate of world musicians, so there was no auditioning; we knew they were going to be amazing. We brought what we had, they added this flavor, and then the song started to take on a new identity because of all these incredible cultures being added to it. And it just so happened that Lukas was in Los Angeles because he had been recording up at Shangri-La in Malibu. My friend Eric (Lynn) runs that studio, so we got in touch. Then we filmed Lukas.

Is Shangri-La where you initially went to record the very basic parts of the song?
Robertson: It is. The funny and kind of amazing coincidence is that Shangri-La was The Band’s clubhouse in the ’70s. Since then, producer Rick Rubin has taken over. That’s where the band recorded the studio songs of The Last Waltz (film). That’s where they recorded their album, Northern Lights – Southern Cross. Now, here we are 50 years later, recording The Weight.

Mark, how did you choose the locations for the musicians? They were all so colorful and visually stunning.
Johnson: We generally try to work with each musician to find an outdoor location that inspires them and a place that can give the audience a window into their world. Not every location is always so planned out, so we do a lot of improvising to find a suitable location to record and film music live outside.

Shooting Marcus King in Greenville, South Carolina

What did you shoot on? Did you have one DP/crew or use some from all over the world? Were you on set?
Johnson: Most of the PFC videos are recorded and filmed by one crew (Guigo Foggiatto and Joe Miller), including myself, an additional audio person and two camera operators. We work with a local guide to help us find both musicians and locations. We filmed The Weight around the world on 4K with Sony A7 cameras — one side angle, one zoom and a Ronin for more motion.

How did you capture the performances from an audio aspect, and who did the audio post?
Johnson: We record all the musicians around the world live and outside using the same mobile recording studio we’ve used since the beginning of our “Song Around the World” videos over 10 years ago. The only thing that has changed is the way we power everything. In the beginning it was golf cart batteries and then car batteries with big heavy equipment, but fortunately it evolved into lightweight battery packs.

We primarily use Grace mic preamps and Schoeps microphones, and our recording mantra comes from a good friend and musician named Keb’ Mo’. He once told us, “Sound is a feeling first, so if it feels good it will always sound good…” This inspires us to help the musicians to feel comfortable and aware that they are performing along with other musicians from around the world to create something bigger than themselves.

One interesting thing that often comes from this project that differs from life in the studio is that the musicians playing on our songs around the world tend to listen more and play less. They know they are only a part of the performance and so they try to find the best way to fit in and support the song without any ego. This reality makes the editing and mixing process much easier to handle in post.

Lukas Nelson – Austin, Texas

The Weight was recorded by the Playing For Change crew and mixed by Greg Morgenstein, Robin Moxey, Sebastian and me.

What about the editing? All that footage and lining up the song must have been very challenging. I’m assuming cutting your previous videos has given you a lot of experience with this.
Johnson: That is a great question, and it’s one of the most challenging and rewarding parts of the process. Editing can get really complicated because we have three cameras per shoot/musician and sometimes many takes of each performance. And sometimes we comp the audio — for example, the first section came from Take 1, the second from Take 6, etc. — and we need to match the video to correspond to each different audio take/performance. We always rough-mix the music first in Avid Pro Tools and then find the corresponding video takes in Adobe Premiere. Whenever we return from a trip, we add the new layer to the Pro Tools session, update the video edit and build the song as we go.

The Weight was a really big audio session in Pro Tools, with so many tracks and options to choose from as to who would play each fill or riff and who would sing each verse, and the video session was also huge, with about 20 performances around the world combined with all the takes that go along with them. One of the best parts of the process for me is soloing the various instruments from around the world and hearing how amazingly they all fit together.

You edited this yourself? And who did the color grade?
Johnson: The video was colored by Jon Walls and Yasuhiro Takeuchi on Blackmagic DaVinci Resolve and edited by me, along with everyone’s help, using Premiere. The entire song and video took over a year to make, so we had time throughout the process to work together on the rough mixes and rough edits from each location and build it brick by brick as we went along the journey.

Sherieta Lewis and Roselyn Williams – Trenchtown, Jamaica

When your dad is on the bench playing and wearing headphones — and the other artists as well — what are they listening to? Is it the initial music that you recorded in the studio, or the track as it evolved, with the different instruments added?
Robertson: Yeah. My dad would listen to what we recorded, except in his case we muted the guitar, so he was now playing the guitar part. Then, as elements from my dad and Ringo are added, those [scratch] elements were removed from what we would call the demo. So then as it’s traveling around the world, people are hearing more and more of what the actual production is going to be. It was not long before all those scratch tracks were gone and people were listening to Ringo and my dad. Then we just started filling in with the singers and so on and so forth.

I’m assuming that each artist played the song from start to finish in the video, or at least for the video, and then the editor went in and cut different lines together?
Robertson: Yes and no. For example, we asked Lukas to do a very specific part as far as singing. He would sing his verse, and then he would sing a couple choruses and play guitar over his section. It varied like that. Sometimes when necessary, if somebody is playing percussion throughout the whole song, then they would listen to it from start to finish. But if somebody was just being asked to sing a specific section, they would just sing that section.

Rajeev Shrestha – Nepal

How was your dad’s reaction to all of it? From recording his own bit to watching it and listening to the final?
Robertson: He obviously came on board very early because we needed to get his guitar, and we wanted to get him filmed at the beginning of the process. He was kind of like, “I don’t know what the hell you guys are doing, but it seems cool.” And then by the time the end result came, he was like, “Oh my God.” Also, the response that his friends and colleagues had to it… I think they had a similar response to yours, which is, A, how the hell did you do this? And, B, this is one of the most beautiful things I’ve ever seen.

It really is amazing. One of my favorite parts of the video is the very end, when your dad’s done playing, looks up and has that huge smile on his face.
Robertson: Yeah. It’s a pulling-at-the-heart-strings moment for me, because that was really a perfect picture of the feeling that I had when it all came together.

You’re a musician as well. What are you up to these days?
Robertson: I have a label under the Universal Production Music umbrella, called Sonic Beat Records. The focus of the label is on contemporary, up-to-the-minute super-slick productions. My collaboration with Universal has been a great one so far; we just started in the fall of 2019, so it’s really new. But I’m finding my way in that family, and they’ve welcomed me with open arms.

Another really fun collaboration was working with my dad on the score for Martin Scorsese’s The Irishman. That was a wonderful experience for me. I’m happy with how the music that we did turned out. Over the course of my life, my dad and I haven’t collaborated that much. We’ve just been father and son, and good friends, but as of late, we’ve started to put our forces together, and that has been a lot of fun.

L-R: Mark Johnson and Ahmed Al Harmi – Bahrain

Any other scores on the horizon?
Robertson: Yeah. I just did another score for a documentary film called Let There Be Drums!, which is a look into the mindset of rock and roll drummers. My friend, Justin Kreutzmann, directed it. He’s the son of Bill Kreutzmann, the drummer of the Grateful Dead. He gave me some original drum tracks of his dad’s and Mickey Hart’s, so I would have all these rhythmic elements to play with, and I got to compose a score on top of Mickey Hart and Bill Kreutzmann’s percussive and drumming works. That was a thrill of a lifetime.

Any final thoughts? And what’s next for you, Mark?
Johnson: One of the many amazing things that came out of making this video was our partnership with Sheik Abdulla bin Hamad bin Isa Al Khalifa from Bahrain, who works with us to help end the stereotype of terrorism through music by including musicians from the Middle East in our videos. In The Weight, you can watch an oud master in Bahrain cut to a sitar master in Nepal, followed by Robbie Robertson and Ringo Starr, and they all work so well together.

One of the best things about Playing For Change is that it never ends. There are always more songs to make, more musicians to record and more people to inspire through the power of music. One heart and one song at a time…


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years.

Finishing artist Tim Nagle discusses work on indie film Miss Juneteenth

Lucky Post Flame artist Tim Nagle has a long list of projects under his belt, including collaborations with David Lowery — providing Flame work on the short film Pioneer as well as finishing and VFX work on Lowery’s feature A Ghost Story. He is equally at home working on spots, such as campaigns for AT&T, Hershey’s, The Home Depot, Jeep, McDonald’s and Ram.

Nagle began his formal career on the audio side of the business, working as an engineer for Solid State Logic, where he collaborated with clients including Fox, Warner Bros., Skywalker, EA Games and ABC.

Tim Nagle

We reached out to Nagle about his and Lucky Post’s work on the feature film Miss Juneteenth, which premiered at Sundance and was recently honored by SXSW 2020 as the winner of the Louis Black Lone Star award.

Miss Juneteenth was directed (and written) by Channing Godfrey Peoples — her first feature-length film. It focuses on a woman from the South — a bona fide beauty queen once crowned Miss Juneteenth, a title commemorating the day slavery was abolished in Texas. The film follows her journey as she tries to hold onto her elegance while striving to survive. She looks for ways to thrive despite her own shortcomings as she marches, step by step, toward self-realization.

How did the film come to you?
We have an ongoing relationship with Sailor Bear, the film’s producing team of David Lowery, Toby Halbrooks and James Johnston. We’ve collaborated with them on multiple projects, including The Old Man & The Gun, directed by Lowery.

What were you tasked to do?
We were asked to provide dailies transcoding, additional editorial, VFX, color and finishing and ultimately delivery to distribution.

How often did you talk to director Channing Godfrey Peoples?
Channing was in the studio, working side by side with our creatives, including colorist Neil Anderson and me, to get the project completed for the Sundance deadline. It was a massive team effort, and we felt privileged to help Channing with her debut feature.

Without spoilers, what most inspires you about the film?
There’s so much to appreciate in the film — it’s a love letter to Texas, for one. It’s directed by a woman, has a single mother at its center and is a celebration of black culture. The LA Times called it one of the best films to come out of Sundance 2020.

Once you knew the film was premiering at Sundance, what was left to complete and in what amount of time?
This was by far the tightest turnaround we have ever experienced. Everything came down to the wire, sound being the last element. It’s one of the advantages of having a variety of talent and services under one roof — the creative collaboration was immediate, intense and really made possible by our shorthand and proximity.

How important do you think it is for post houses to be diversified in terms of the work they do?
I think diversification is important not only for business purposes but also to keep the artists creatively inspired. Lucky Post’s ongoing commitment to support independent film, both financially and creatively, is an integrated part of our business along with brand-supported work and advertising. Increasingly, as you see greater crossover of these worlds, it just seems like a natural evolution for the business to have fewer silos.

What does it mean to you as a company to have work at Sundance? What kinds of impact do you see — business, morale and otherwise?
Having a project that we put our hands on accepted into Sundance was such an honor. It is unclear what the direct business impact might be, but the immediate value shows up in morale. The excitement and inspiration we all get from projects like this just naturally makes how we do business better.

What software and hardware did you use?
On this project we started with Assimilate Scratch for dailies creation. Editorial was done in Adobe Premiere. Color was Blackmagic DaVinci Resolve, and finishing was done in Autodesk Flame.

What is a piece of advice that you’d give to filmmakers when considering the post phase of their films?
We love being involved as early as possible — certainly not to get in anyone’s way, but to be in the background supporting the director’s creative vision. I’d say get with a post company that can assist in setting looks and establishing a workflow. With a little bit of foresight, this will create the efficiency you need to deliver with the utmost quality on what always ends up being a tight deadline.

Workstations and Color Grading

By Karen Moltenbrey

A workstation is a major investment for any studio. Today, selecting the right type of machine for the job can be a difficult process. There are many brands and flavors on the market, and some facilities even opt to build their own. Colorists have several tools available to them when it comes to color grading, ranging from software-based systems (which typically use a multiprocessor workstation with a high-end GPU) to those that are hardware-based.

Here, we examine the color workflow of two different facilities: Technicolor Vancouver and NBCUniversal StudioPost in Los Angeles.

[Editor’s note: These interviews were conducted before the coronavirus work limits were put in place.]

Anne Boyle

Technicolor Vancouver
Technicolor is a stalwart in the post industry, with its creative family — including VFX studios MPC, The Mill, Mr. X and Mikros — and wide breadth of post production services offered in many locations around the world. Although Technicolor Vancouver has been established for some time now, it was only within the past two years that the decision was made to offer finishing services again there, with an eye toward becoming more of a boutique operation, albeit one offering top-level effects.

With this in mind, Anne Boyle joined as a senior colorist, and immediately Technicolor Vancouver began a co-production with Technicolor Los Angeles. The plan was for the work to be done in Vancouver, with review and supervision handled in LA. “So we hit the ground running and built out new rooms and bought a lot of new equipment,” says Boyle. “This included investing in FilmLight Baselight, and we quickly built a little boutique post finishing house here.”

This shared-location work setup enabled Technicolor to take advantage of the lucrative tax credits offered in Vancouver. The supervising colorist in LA reviews sessions with the client, after which she and Boyle discuss them, and then Boyle picks up the scene and performs the work based on those conversations or notes in the timeline. A similar process occurs for the Dolby SDR deliverables. “There isn’t much guesswork. It is very seamless,” she says.

“I’ve always used Baselight,” says Boyle, “and was hoping to go that route when I got here, and then this shared project happened, and it was on a Baselight [in LA]. Happily for me, the supervising colorist, Maxine Gervais, insisted that we mirror the exact setup that they had.”

Gervais was using a Baselight X system, so that is what was installed in Vancouver. “It’s multi-GPU (six Nvidia Titan XPs) with a huge amount of storage,” she says. “So we put in the same thing and mimicked the infrastructure in LA. They also put in a Baselight Assist station and plan to upgrade it in the coming months to make it color-capable as well.”

The Baselight X turnkey system ships with bespoke storage and processing hardware, although Technicolor Vancouver loaded it with additional storage. For the grading panels, the company went with the top-of-the-line Blackboard. The Vancouver facility also purchased the same displays as LA — Sony BVM-X300s.

Messiah

The mirrored setup was necessary for the shared work on Netflix’s Messiah, an HDR project that dropped January 1. “We had to deliver 10 episodes all at once in 4K, [along with] both the HDR PQ masters and the Dolby SDR deliverable, which were done here as well,” explains Boyle. “So we needed the capability to store all of that and all of those renders. It was quite a VFX-heavy show, too.”

Using Pulse, Technicolor’s internal cloud-based system, the data set is shared between the LA and Vancouver sites. Technicolor staff can pull down the data, and VFX vendors can pull their own VFX shots too. “We had exact mirrors of the data. We were not sending project files back and forth, but rather, we shared them,” Boyle explains. “So anyone could jump on the project, whether in Vancouver or LA, and immediately open the project, and everything would appear instantaneously.”

When it comes to the hardware itself, speed and power are big factors. As Boyle points out, the group handles large files, and slowdowns, render issues and playback hiccups are unacceptable.

Messiah

The color system proved its mettle on Messiah, which required a lot of skin retouching and other beauty work. “The system is dedicated and designed only for colorists,” says Boyle. “And the tools are color-focused.”

Indeed, Boyle has witnessed drastic changes in color workstations over the past several years. File sizes have increased thanks to 8K Red footage and camera raw formats, which have driven the need for more powerful machines and more powerful GPUs, particularly with increasingly complex HDR workflows, wherein floating-point precision is necessary for good color. “More work nowadays needs to be performed on the GPU,” she adds. “You just can’t have enough power behind you.”
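
Boyle's point about floating point is concrete: HDR pipelines typically encode signal with the PQ curve (SMPTE ST 2084), which maps absolute luminance nonlinearly, so quantizing early throws away precision a grade may later need. A sketch of the standard PQ encode, using the constants from the spec:

```python
import numpy as np

def pq_encode(nits: np.ndarray) -> np.ndarray:
    """SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance -> [0, 1] signal."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = np.clip(nits, 0.0, 10000.0) / 10000.0   # normalize against 10,000-nit peak
    y_m1 = y ** m1
    return ((c1 + c2 * y_m1) / (1.0 + c3 * y_m1)) ** m2

# 100-nit SDR white lands around 0.51 on the PQ scale; 1,000 nits near 0.75.
print(pq_encode(np.array([100.0, 1000.0])))
```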

NBCUniversal StudioPost
NBCUniversal StudioPost knows a thing or two about post production. Not only does the facility provide a range of post, sound and finishing services, but it also offers cutting-edge equipment rentals and custom editorial rooms used by internal and third-party clients.

Danny Bernardino

Specifically, NBCUniversal offers end-to-end picture services that include dailies, editorial, VFX, color correction, duplication and encoding/decoding, data management, QC, sound, sound editorial, sound supervision, mixing and streaming.

Each area has a plethora of workstations and systems needed to perform its given tasks. For the colorists, the facility offers two choices, both on Linux OS: a Blackmagic DaVinci Resolve 16.1.2 (fully loaded with a creative suite of plugins and add-ons) running on an HP Z840 machine, and Autodesk Lustre 2019 running on an HP Z820.

“We look for a top-of-the-line color corrector that has a robust creative tool set as well as one that is technically stable, which is why we prefer Linux-based systems,” says Danny Bernardino, digital colorist at NBCUniversal StudioPost. Furthermore, the facility prefers a color corrector that adapts to new file formats and workflows by frequently updating its versions. Another concern is that the system works in concert with all of the ever-changing display demands, such as UHD, 4K, HDR and Dolby Vision.

Color bay

According to Bernardino, the color systems at NBCUniversal are outfitted with the proper CPU/GPU and SAN storage connectivity to ensure efficient image processing, thereby allowing the color talent to work without interruption. The color suites also are outfitted with production-level video monitors that represent true color. Each has high-quality scopes (waveform, vector and audio) that handle all formats.

When it comes time to select machines for the colorists there, it is a collective process, says senior VP Thomas Thurau. First, the company ascertains the delivery requirements, and then the color talent, engineering and operations staff work together to configure the proper tool sets for the customers’ content. How often the equipment is replaced is contingent on whether new image and display technology has been introduced.

Thurau defines a solid colorist workstation as a robust platform that is Linux-based and has enough card slots or expansion chassis capabilities to handle four or more GPU cards, Fibre Channel cards and more. “All of our systems are in constant demand, from compute to storage, thus we look for systems and hardware that are robust through to delivery,” he notes.

Mr. Robot

NBCUniversal StudioPost is always humming with various work. Some of the more recent projects there include Jaws, which was remastered in UHD/HDR, Casino (UHD/HDR), the How to Train Your Dragon series (UHD/HDR) and an array of Alfred Hitchcock’s more famous films. The company also services broadcast episodic (NBCU and others) and OTT/streaming customers, offering a full suite of services (Avid, picture and sound). This includes Law & Order: SVU, Chicago Med, Will & Grace, Four Weddings and a Funeral and Mr. Robot, among others.

“We take incredible pride in all aspects of our color services here at NBCUniversal StudioPost, and we are especially pleased with our HDR grades,” says Thurau.

For those who prefer to do their own work, NBCUniversal has over 185 editorial rooms, ranging from small to large suites, set up with Avid Media Composer.


Karen Moltenbrey is a veteran writer, covering visual effects and post production.

Colorist Chat: Framestore LA senior colorist Beau Leon

Veteran colorist Beau Leon recently worked with director Spike Jonze on a Beastie Boys documentary and a spot for cannabis retailer MedMen.

What’s your title and company?
I’m senior colorist at LA’s Framestore.

Spike Jonze’s MedMen

What kind of services does Framestore offer?
Framestore is a multi-Oscar-winning creative studio founded over 30 years ago, and the services offered have evolved considerably over the decades. We work across film, television, advertising, music videos, cinematic data visualization, VR, AR, XR, theme park rides… the list is endless and continues to change as new platforms emerge.

As a colorist, what would surprise people the most about what falls under that title?
Whatever the creative direction or the equipment used to shoot something, whether for film or TV, people might not factor in how much color and tone can dictate the impact a story has on its audience. As a colorist, my role often involves acting as a mediator of sorts between various creative stakeholders to ensure everyone is on the same page about what we’re trying to convey, as intent can translate differently through color.

Are you sometimes asked to do more than just color on projects?
Earlier in my career, the process was more collaborative with DPs and directors who would bring color in at the beginning of a project. Now, particularly when it comes to commercials with tighter deadlines and turnarounds, many of those conversations happen during pre-production without grading factored in until later in the pipeline.

Rihanna’s Needed Me

Building strong relationships and working on multiple projects with DPs or directors always allows for more trust and creative control on my end. Some of the best examples I’ve seen of this are on music video projects, like Rihanna’s Needed Me, which I graded here at Framestore for a DP I’d grown up in the industry with. That gave me the opportunity to push the creative boundaries.

What system do you work on?
FilmLight Baselight

You recently worked on the new Beastie Boys documentary, Beastie Boys Story. Can you talk a bit about what you did and any challenges relating to deadlines?
I’ve been privileged to work with Spike Jonze on a number of projects throughout my career, so going into Beastie Boys Story, we already had a strong dialogue. He’s a very collaborative director and respectful of everyone’s craft and expertise, which can be surprisingly rare within our industry.

Spike Jonze’s Beastie Boys Story

The unique thing about this project was that, with so much old footage being used, it needed to be mastered in HDR as well as reworked for IMAX. And with Spike being so open to different ideas, the hardest part was deciding which direction to choose. Whether you’re a hardcore Beastie Boys fan or not, the documentary will be well worth watching when it airs on Apple TV+ in April.

Any suggestions for getting the most out of a project from a color perspective?
As an audience, our eyes have evolved a great deal over the last few decades. I would argue that most of what we see on TV and film today is extremely oversaturated compared to what we’d experience in our real environment. I think it speaks to how we treat consumers and anticipate what we think they want — colorful, bright and eye-catching. When it’s appropriate, I try to challenge clients to think outside those new norms.

How do you prefer to work with the DP or director?
Whether it’s working with a DP or director, the more involved I can be early on in the conversation, the more seamless the process becomes during post production and ultimately leads to a better end result. In my experience, this type of access is more common when working on music videos.

Most one-off commercial projects see us dealing with an agency more often than the director, but one exception that comes to mind is my collaboration with Spike Jonze on the first-ever brand campaign for cannabis retailer MedMen, called The New Normal. He placed an important emphasis on grading and was very open to my recommendations and vision.

How do you like getting feedback in terms of the look?
A conversation is always the best way to receive feedback versus a written interpretation of imagery, which tends to become very personal. An example might be when a client wants to create the feeling of a warm climate in a particular scene. Some might interpret that as adding more warm color tones, when in fact, if you think about some of the hottest places you’ve ever visited, the sun shines so fiercely that it casts a bright white hue.

What’s your favorite part of the job?
That’s an easy answer — to me, it’s all about the amazing people you meet in this industry and the creative collaboration that happens as a result. So many of my colleagues over the years have become great friends.

Any least favorites?
There isn’t much that I don’t love about my job, but I have witnessed a change over the years in the way that our industry has begun to undervalue relationships, which I think is a shame.

If you didn’t have this job, what would you be doing instead?
I would be an art teacher. It combines my passion for color and visual inspiration with a forum for sharing knowledge and fostering creativity.

How early did you know this would be your path?
In my early 20s, I started working on dailies (think The Dukes of Hazzard, The Karate Kid, Fantasy Island) at a place in The Valley that had a telecine machine that transferred at a frame rate faster than anywhere else in LA at the time. It was there that I started coloring (without technically realizing that was the job I was doing, or that it was even a profession).

Soon after, I received a call from a company called 525 asking me to join them. They worked on all of the top music videos during the prime “I Want My MTV” era, and after working on music videos as a side hustle at night, I knew that’s where I wanted to be. When I first walked into the building, I was struck by how much more advanced their technology was and immediately felt out of my depth. Luckily, someone saw something in me before I recognized it within myself. I worked on everything from R.E.M.’s “Losing My Religion” to TLC’s “Waterfalls” and The Smashing Pumpkins’ “Tonight, Tonight.” I found such joy in collaborating with some of the most creative and spirited directors in the business, many of whom were inspiring artists, designers and photographers in their spare time.

Where do you find inspiration?
I’m lucky to live in a city like LA with such a rich artistic scene, so I make a point to attend as many gallery openings and exhibitions as I can. Some of my favorite spaces are the Annenberg Space for Photography, the Hammer Museum and Hauser & Wirth. On the weekends I also stop by Arcana bookstore in Culver City, where they source rare books on art and design.

Name three pieces of technology you can’t live without.
I think I would be completely fine if I had to survive without technology.

This industry comes with tight deadlines. How do you de-stress from it all?
After a long day, cooking helps me decompress and express my creativity through a different outlet. I never miss a trip to my local farmer’s market, which also helps to keep me inspired. And when I’m not looking at other people’s art, I’m painting my own abstract pieces at my home studio.

Assimilate intros live grading, video monitoring and dailies tools

Assimilate has launched Live Looks and Live Assist, production tools that give pros speed and specialized features for on-set live grading, look creation, advanced video monitoring and recording.

Live Looks provides an easy-to-set-up environment for video monitoring and live grading that supports any resolution, from standard HD up to 8K workflows. Featuring professional grading and FX/greenscreen tools, it is straightforward to operate and offers a seamless connection into dailies and post workflows. With Live Looks being available on both macOS and Windows, users are, for the first time, free to use the platform and hardware of their choice. You can see their intro video here.

“I interact regularly with DITs to get their direct input about tools that will help them be more efficient and productive on set, and Live Looks and Live Assist are a result of that,” says Mazze Aderhold, product marketing manager at Assimilate. “We’ve bundled unique and essential features with the speed they need, enabling them to contribute to time savings and lower costs in the filmmaking workflow.”

Users can run the software on a range of setups, from a laptop to a full-blown on-set DIT rig. Live Looks provides LUT-box control over Flanders, Teradek and TVLogic devices. It also supports video I/O from AJA, Bluefish444 and Blackmagic for image and full-camera metadata capture. There is also direct reference recording to Apple ProRes on macOS and Windows.

Live Looks goes beyond LUT-box control. Users can process the live camera feed via video I/O, making it possible to do advanced grading, compare looks, manage all metadata, annotate camera input and generate production reports. Its fully color-managed environment ensures the created looks will come out the same in dailies and post. Live Looks provides a seamless path into dailies and post with look-matching in Scratch and CDL-EDL transfer to DaVinci Resolve.
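
CDL-over-EDL interchange conventionally rides in comment lines after each event (*ASC_SOP and *ASC_SAT); here is a minimal Python parser sketch under that assumption. The exact files Live Looks writes may differ, and the event below is invented.

```python
import re

EDL_EVENT = """\
001  A001C003 V     C        00:00:10:00 00:00:14:00 01:00:00:00 01:00:04:00
*ASC_SOP (1.0500 1.0000 0.9500)(0.0000 0.0000 0.0100)(1.1000 1.1000 1.1000)
*ASC_SAT 0.9000
"""

SOP = re.compile(r"\*ASC_SOP\s*\(([^)]*)\)\s*\(([^)]*)\)\s*\(([^)]*)\)")
SAT = re.compile(r"\*ASC_SAT\s+([\d.]+)")

def parse_cdl(edl_text: str) -> dict:
    """Pull slope/offset/power/saturation out of an EDL's ASC comment lines."""
    sop = SOP.search(edl_text)
    sat = SAT.search(edl_text)
    slope, offset, power = (tuple(float(v) for v in g.split()) for g in sop.groups())
    return {"slope": slope, "offset": offset, "power": power,
            "sat": float(sat.group(1)) if sat else 1.0}

print(parse_cdl(EDL_EVENT))
```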

With Live Looks, Assimilate takes its high-end grading tool set beyond Lift, Gamma, Gain and CDL by adding Powerful Curves and an easy-to-use Color Remapper. On-set previews can encompass not just color but realtime texture effects, like Grain, Highlight Glow, Diffusion and Vignette — all GPU-accelerated.
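
For reference, lift/gamma/gain has no single canonical formula; the sketch below uses one common formulation purely to illustrate what the controls do, not Assimilate's implementation.

```python
import numpy as np

def lift_gamma_gain(rgb, lift=0.0, gamma=1.0, gain=1.0):
    """One common lift/gamma/gain formulation (variants abound).

    Lift raises the blacks, gain scales the whites,
    and gamma bends the midtones between them.
    """
    out = rgb * (gain - lift) + lift            # lift pins black, gain pins white
    return np.clip(out, 0.0, 1.0) ** (1.0 / gamma)

# Lifted blacks and slightly brighter mids, as a quick on-set preview trim.
print(lift_gamma_gain(np.array([0.0, 0.5, 1.0]), lift=0.05, gamma=1.1))
```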

Advanced chroma keying lets users replace greenscreen backgrounds with two clicks, so crews can check camera angles, greenscreen tracking/anchor-point locations and lighting right on set. As with all Assimilate software, users can load and play back any camera format, including raw formats such as Red raw and Apple ProRes raw.
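
Assimilate's keyer is far more sophisticated, but the core of any greenscreen matte is green dominance. A toy Python sketch, with no spill suppression or edge treatment:

```python
import numpy as np

def green_key(fg: np.ndarray, bg: np.ndarray, strength: float = 4.0) -> np.ndarray:
    """Composite fg over bg using a simple green-dominance matte.

    fg, bg   -- float RGB images, shape (H, W, 3), values in [0, 1]
    strength -- how aggressively green excess maps to transparency
    """
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    # How much greener is this pixel than its other channels?
    green_excess = g - np.maximum(r, b)
    alpha = np.clip(1.0 - strength * green_excess, 0.0, 1.0)[..., None]
    return fg * alpha + bg * (1.0 - alpha)
```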

Live Assist has all of the features of Live Looks but also handles basic video-assist tasks, and like Live Looks, it is available on both macOS and Windows. It provides multicam recording and instant playback of all recorded channels and seamlessly combines live grading with video-assist tasks in an easy-to-use UI. Live Assist automatically records camera inputs to file based on the Rec flag inside the SDI signal, including all live camera metadata. It also extends the range of supported “edit-ready” capture formats: Apple ProRes (MOV), H.264 (MP4) and Avid DNxHD/HR (MXF). Operators can then choose whether they want to record the clean signal or record with the grade baked in.

Both Live Looks and Live Assist are available now. Live Looks starts at $89 per month, and Live Assist starts at $325 per month. Both products and free trials are available on the Assimilate site.

Colorist Chat: Keith Shaw on Showtime’s Homeland and the process

By Randi Altman

The long wait for the final season of Showtime’s Homeland seemed to last an eternity, but thankfully the series is now airing, and we here at postPerspective are pretty jazzed about it. Our favorite spies, Carrie and Saul, are back at it, with this season being set in Afghanistan.

Keith Shaw

Year after year, the writing, production and post values on Homeland have been outstanding. One of those post folks is colorist Keith Shaw from FotoKem’s Keep Me Posted, which focuses on finishing services for television.

Shaw’s credits are impressive. In addition to Homeland, his work can be seen on Ray Donovan, Shameless, Animal Kingdom and many others. We reached out to Shaw to find out more about working on Homeland from the first episode to the last. Shaw shares his workflow and what inspires him.

You’ve been on Homeland since the beginning. Can you describe the look of the show and how you’ve worked with DPs David Klein, ASC, and Giorgio Scali, ASC, as well as producer Katie O’Hara?
Working on Homeland from Episode 1 has been a truly amazing experience. Katie, Dave, Giorgio and I are an extremely collaborative group.

One consistent factor of all eight seasons has been the need for the show to look “real.” We don’t have any drastic or aggressively stylized looks, so the goal is to subtly manipulate the color and mood yet make it distinct enough to help support the storyline.

When you first started on the show, how would you describe the look?
The first two seasons were shot by Nelson Cragg, ASC. For those early episodes, the show was a bit grittier and more desaturated. It had a darker, heavier feel to it. There was not as much detail in the dark areas of the image, and the light fell off more quickly on the edges.

Although the locations and looks have changed over the years, what’s been the common thread?
As I mentioned earlier, the show has a realism to it. It’s not super-stylized and affected.

Do the DPs come to the color suite? What kind of notes do you typically get from them?
They do when they are able (which is not often). They are generally on the other side of the world. As far as notes, it depends on the episode. When I’m lucky, I get none. Generally, there are not a lot of notes. That’s the advantage of collaborating on a show from the beginning. You and the DP can “mold” the look of the show together.

You’ve worked on many episodics at Keep Me Posted. Prior to that you were working on features at Warner Bros. Can you talk about how that process differs for you?
In remastering and restoration of feature films, the production stage is complete. It’s not happening simultaneously, and that means the timeline and deadlines aren’t as stressful.

Digital intermediates on original productions, on the other hand, are similar to television because multiple things are happening all at once. There is an overlap between production and post. During color, the cut can be changing, and new effects could be added or updated, but with much tighter deadlines. DI was a great stepping stone for me to move from feature films to television.

Now let’s talk about some more general aspects of the job…

As a colorist, what would surprise people the most about what falls under that title?
First of all, most people don’t have a clear understanding of what a colorist is or does. Even after 25 years and multiple explanations, my father-in-law still tells everyone I’m an editor.

Being a colorist means you wear many hats — confidante, mediator, therapist, VFX supervisor, scheduler and data manager — in addition to that color thing. For me, it boils down to three main attributes. One, you need to be artistic/creative. Two, you need to be technical. Finally, you need to mediate the decision-making processes. Sometimes that can be the hardest part of all, when there are competing viewpoints and visions between all the parties involved.

What system do you work on?
Digital Vision’s Nucoda.

Are you sometimes asked to do more than just color on projects?
Today’s color correctors are incredibly powerful and versatile. In addition to color, I can do light VFX, beauty work, editing or technical fixes when necessary. The clients appreciate the value of saving time and money by taking care of last-minute issues in the color suite.

What’s your favorite part of the job?
Building relationships with clients, earning their trust and helping them bring their vision to the screen. I love that special moment when you and the DP are completely in sync — you’re reaching for the knobs before they even ask for a change, and you are finishing each other’s sentences.

What’s your least favorite?
Deadlines. However, they are actually helpful in my case because otherwise I would tweak and re-tweak the smallest details endlessly.

Can you name some recent projects you have worked on?
Ray Donovan, Shameless, Animal Kingdom, Single Parents and Bless This Mess are my current shows.

Any suggestions for getting the most out of a project from a color perspective?
Become a part of the process as early as possible. Establishing looks, LUTs and good communication with the cinematographer are essential.

How do you prefer the DP or director to describe the look they want?
Each client has a different source of inspiration and way of conveying their vision. I’ve worked from fabric and paint samples, YouTube videos, photographs, magazine ads, movie or television show references, previous work (theirs and/or mine) and so on.

What is the project that you are most proud of?
I can’t pick just one, so I’ll pick two. From my feature mastering work, The Shawshank Redemption. From television, Homeland.

Where do you find inspiration?
Definitely in photography. My father was a professional photographer and we had our own darkroom. As a kid, I spent countless hours after school and on weekends learning how to plan, take and create great photographs. It is still a favorite hobby of mine to this day.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Director Vincent Lin discusses colorful Seagram’s Escapes spot

By Randi Altman

Valiant Pictures, a New York-based production house, recently produced a commercial spot featuring The Bachelor/Bachelorette host Chris Harrison promoting Seagram’s Escapes and its line of alcohol-based fruit drinks. A new addition to the product line is Tropical Rosé, which was co-developed by Harrison and contains natural passion fruit, dragon fruit and rosé flavors.

Valiant’s Vincent Lin directed the piece, which features Harrison in a tropical-looking room — brightened with sunny pinks and yellows thanks to NYC’s Nice Shoes — describing the rosé and signing off with the Seagram’s Escapes brand slogan, “Keep it colorful!”

Here, director Lin and his DP, Alexander Chinnici, talk about the project’s conception, shoot and post.

How early did you get involved? Did Valiant act as the creative agency on this spot?
Valiant has a long-standing history with the Seagram’s Escapes brand team, and we were fortunate enough to have the opportunity to brainstorm a few ideas with them early on for their launch of Seagram’s Escapes Tropical Rosé with Chris Harrison. The creative concept was developed by Valiant’s in-house creative agency, headed by creative directors Nicole Zizila and Steven Zizila, and me. Seagram’s was very instrumental in the creative for the project, and we collaborated to make sure it felt fresh and new — like an elevated evolution of their “Keep It Colorful” campaign rather than a replacement.

Clearly, it’s meant to have a tropical vibe. Was it shot greenscreen?
We had considered shooting this greenscreen, which would have opened up some interesting options but also posed some challenges. What was important for this campaign creatively was to seamlessly take Chris Harrison into the magical world of Seagram’s Escapes Tropical Rosé. We chose a practical approach so it didn’t feel too “out of this world,” and the live action still felt real and relatable. We had considered putting Chris in a tropical location — either in greenscreen or on location — but we really wanted to play to Chris’ personality and strengths and have him lead us to this world, rather than throw him into it. Plus, they didn’t sign off on letting us film in the Maldives. I tried (smiles).

L-R: Vincent Lin and Alex Chinnici

What was the spot shot on?
We worked with the very talented DP Alex Chinnici, who recommended shooting on the ARRI Alexa for many reasons. I’ll let Alex answer this one.

Alex Chinnici: Some DPs would likely answer with something sexier, like, “I love the look!” But that ignores a lot of the technical realities available to us these days. A lot of these cameras are wonderful. I can manipulate the look, so I choose a camera based on other reasons. Without an on-set live, color-capable DIT, I had to rely on the default LUT seen on set and through post. The Alexa’s default LUT is my preference among the digital cameras. For lighting and everyone on the set, we start in a wonderful place right off the bat. Post houses also know it so well, along with colorists and VFX. Knowing our limitations and expecting not to be entirely involved, I prefer giving these departments the best image/file possible.

Inherently, the color, highlight retention and skin tone are wonderful right off the bat without having to bend over backward for anyone. With the Alexa, you end up being much closer to the end rather than having to jump through hoops to get there like you would with some other cameras. Lastly, the reliability is key. With the little time that we had, and a celebrity talent, I would never put a production through the risk of some new tech. Being in a studio, we had full control but still, I’d rather start in a place of success and only make it better from there.

What about the lenses?
Chinnici: I chose the Zeiss Master Primes for similar reasons. While sharp, they are not overbearing. With some mild filtration and very soft and controlled lighting, I can adjust that in other ways. Plus, I know that post will beautify anything that needs it; giving them a clean, sharp image (especially considering the seltzer can) is key.

I shot at a deeper stop to ensure that the lenses are even cleaner and sharper, although the Master Primes do hold up very well wide open. I also wanted the Seagram’s can to be in focus as much as possible and for us to be able to see the set behind Chris Harrison, as opposed to a very shallow depth of field. I also wanted to ensure little to no flares, solid contrast, sharpness across the field and no surprises.

Thanks Alex. Back to you Vincent. How did you work with Alex to get the right look?
There was a lot of back and forth between Alex and me, and we pulled references to discuss. Ultimately, we knew the two most important things were to highlight Chris Harrison and the product. We also knew we wanted the spot to feel like a progression from the brand’s previous work. We decided the best way to do this was to introduce some dimensionality by giving the set depth with lighting, while keeping a clean, polished and sophisticated aesthetic. We also introduced a bit of camera movement to further pull the audience in and composed the shots so that all the focus would be on Chris Harrison to bring us into that vibrant CG world.

How did you work with Nice Shoes colorist Chris Ryan to make sure the look stayed on point? 
Nice Shoes is always one of our preferred partners, and Chris Ryan was perfect for the job. Our creatives, Nicole and Steven, had worked with him a number of times. As with all jobs, there are certain challenges and limitations, and we knew we had to work fast. Chris is not only detail oriented, creative and a wizard with color correction, but also able to work efficiently.

He worked on a FilmLight Baselight system off the Alexa raw files. The color grading really brought out the saturation to further reinforce the brand’s slogan, “Keep It Colorful,” but also to manage the highlights and whites so it felt inviting and bright throughout, but not at all sterile.

What about the VFX? Can you talk about how that was accomplished? 
Much like the camera work, we wanted to continue giving dimensionality to the spot by having depth in each of our CG shots. Not only depth in space but also in movement and choreography. We wanted the CG world to feel full of life and vibrant in order to highlight key elements of the beverage — the flavors, dragonfruit and passionfruit — and give it a sense of motion that draws you in while making you believe there’s a world outside of it. We wanted the hero to shine in the center and the animation to play out as if a kaleidoscope or tornado was pulling you in closer and closer.

We sought the help of creative production studio Taylor James to build the CG elements. We chose to work with a core of 3ds Max artists who could do a range of tasks using Autodesk 3ds Max and Chaos Group’s V-Ray (we also use Maya and Arnold). We used Foundry Nuke to composite all of the shots and integrate the CGI into the footage. The 3D asset creation, animation and lighting were constructed and rendered in Autodesk Maya, with compositing done in Adobe After Effects.

One of the biggest challenges was making sure the live action felt connected to the CG world, but with each still having its own personality. There is a modern and clean feel to these spots that we wanted to uphold while still making it feel fun and playful with colors and movement. There were definitely a few earlier versions that we went a bit crazy with and had to scale down a bit.

Does a lot of your work feature live action and visual effects combined?
I think of VFX like any film technique: It’s simply a tool for directors and creatives to use. The most essential thing is to understand the brand, if it’s a commercial, and to understand the story you are trying to tell. I’ve been fortunate to do a number of spots that involve live-action and VFX now, but truth be told, VFX almost always sneaks its way in these days.

Even if I do a practical effect, there are limitless possibilities in post production and VFX. Anything from simple cleanup to enhancing, compositing, set building and extending — it’s all possible. It’d be foolish not to consider it as a viable tool. Now, that’s not to say you should rely solely on VFX to fix problems, but if there’s a way it can improve your work, definitely use it. For this particular project, obviously, the CG was crucial to let us really be immersed in a magical world at the level of realism and proximity we desired.

Anything challenging about this spot that you’d like to share?
Chris Harrison was terrible to work with and refused to wear a shirt for some reason … I’m just kidding! Chris was one of the most professional, humblest and kindest celebrity talents that I’ve had the pleasure to work with. This wasn’t a simple endorsement for him; he actually did work closely with Seagram’s Escapes over several months to create and flavor-test the Tropical Rosé beverage.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Blackmagic releases Resolve 16.2, beefs up audio post tools

Blackmagic has updated DaVinci Resolve — its color, editing, VFX and audio post tool — to version 16.2. The new version features major Fairlight updates for audio post as well as many improvements for color correction, editing and more.

This version has major updates for editing in the Fairlight audio timeline with a mouse and keyboard. The new edit selection mode unlocks functionality previously available only via the audio editor on the full Fairlight console, so editing is much faster than before. The mode also puts fades, cuts and even clip moves just a mouse click away. New scalable waveforms let users zoom in without adjusting the volume, and bouncing lets customers render a clip with custom sound effects directly from the Fairlight timeline.

Adding multiple clips is also easier, as users can now add them to the timeline vertically, not just horizontally, making it simpler to add multiple tracks of audio at once. Multichannel tracks can now be converted into linked groups directly in the timeline so users no longer have to change clips manually and reimport. There’s added support for frame boundary editing, which improves file export compatibility for film and broadcast deliveries. Frame boundary editing now adds precision so users can easily trim to frame boundaries without having to zoom all the way in the timeline. The new version supports modifier keys so that clips can be duplicated directly in the timeline using the keyboard and mouse. Users can also copy clips across multiple timelines with ease.
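For the technically curious, trimming to a frame boundary is simple arithmetic: an audio edit point is quantized to the sample count of one picture frame. The tiny Python sketch below assumes 48kHz audio against 24fps picture — illustrative settings only, since Fairlight’s internals are not public.

# At 48 kHz and 24 fps, one picture frame is exactly 2,000 audio samples,
# so an edit sits on a frame boundary when it is a multiple of that count.
def snap_to_frame(sample_pos, fs=48000, fps=24):
    samples_per_frame = fs // fps  # 2000 at these (assumed) settings
    return round(sample_pos / samples_per_frame) * samples_per_frame

print(snap_to_frame(100_973))  # -> 100000, the nearest frame boundary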

Resolve 16.2 also includes support for the Blackmagic Fairlight Sound Library with new support for metadata-based searches, so customers don’t need to know the filename to find a sound effect. Search results also display both the file name and description, so finding the perfect sound effect is faster and easier than before.

MPEG-H 3D immersive surround sound audio bussing and monitoring workflows are now supported. Additionally, improved pan and balance behavior includes the ability to constrain panning.

Fairlight audio editing also has index improvements. The edit index is now available in the Fairlight page and works as it does in the other pages, displaying a list of all media used; users simply click on a clip to navigate directly to its location in the timeline. The track index now supports drag selections for mute, solo, record enable and lock, as well as visibility controls, so editors can quickly swipe through a stack of tracks without having to click on each one individually. Audio tracks can also be rearranged by clicking and dragging a single track or a group of tracks in the track index.

This new release also includes improvements in AAF import and export. AAF support has been refined so that AAF sequences can be imported directly to the timeline in use. Additionally, if the project features a different time scale, the AAF data can also be imported with an offset value to match. AAF files that contain multiple channels will also be recognized as linked groups automatically. The AAF export has been updated and now supports industry-standard broadcast wave files. Audio cross-fades and fade handles are now added to the AAF files exported from Fairlight and will be recognized in other applications.

For traditional Fairlight users, this update makes major improvements in importing legacy Fairlight projects, including much faster opening of projects with over 1,000 media files.

Audio mixing is also improved. A new EQ curve preset for clip EQ in the inspector allows removal of troublesome frequencies. New FairlightFX filters include a meter plug-in that adds a floating meter for any track or bus, so users can keep an eye on levels even if the monitoring panel or mixer is closed. There’s also a new LFE filter designed to smoothly roll off the higher frequencies when mixing low-frequency effects in surround.
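To make that LFE rolloff concrete, here is a hedged sketch of the kind of curve such a filter applies — a Butterworth low-pass at a typical 120Hz crossover. The order and cutoff are assumptions for illustration, not Fairlight’s actual filter design.

import numpy as np
from scipy.signal import butter, sosfilt

fs = 48000  # assumed session sample rate in Hz
sos = butter(4, 120, btype="lowpass", fs=fs, output="sos")

# One second of test audio: 60 Hz rumble plus 2 kHz content.
t = np.arange(fs) / fs
mix = np.sin(2 * np.pi * 60 * t) + 0.5 * np.sin(2 * np.pi * 2000 * t)
lfe = sosfilt(sos, mix)  # the rumble passes; the 2 kHz content rolls off smoothly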

Working with immersive sound workflows using the Fairlight audio editor has been updated and now includes dedicated controls for panning up and down. Additionally, clip EQ can now be altered in the inspector on the editor panel. Copy and paste functions have been updated, and now all attributes — including EQ, automation and clip gain — are copied. Sound engineers can set up their preferred workflow, including creating and applying their own presets for clip EQ. Plug-in parameters can also be customized or added so that users have fast access to their preferred tool set.

Clip levels can now be changed relatively, allowing users to adjust the overall gain while respecting existing adjustments. Clip levels can also be reset to unity, easily removing any level adjustments that might have previously been made. Fades can also be deleted directly from the Fairlight Editor, making it faster to do than before. Sound engineers can also now save their preferred track view so that they get the view they want without having to create it each time. More functions previously only available via the keyboard are now accessible using the panel, including layered editing. This also means that automation curves can now be selected via the keyboard or audio panel.

Continuing with the extensive improvements to Fairlight audio, there have also been major updates to the audio editor transport control. Track navigation is now improved and even works when nothing is selected. Users can navigate directly to the timecode entry window above the timeline from the audio editor panel, and there is added support for high-frame-rate timecodes. Timecode entry now supports values relative to the current CTI location, so the playhead can move along the timeline relative to its current position rather than to an absolute timecode.

Support has also been added so the colon key can be used in place of the user typing 00. Master spill on console faders now lets users spill out all the tracks to a bus fader for quick adjustments in the mix. There’s also more precision with rotary controls on the panel and when using a mouse with a modifier key. Users can also change the layout and select either icon or text-only labels on the Fairlight editor. Legacy Fairlight users can now use the traditional — and perhaps more familiar — Fairlight layout. Moving around the timeline is even quicker with added support for “media left” and “media right” selection keys to jump the playhead forward and back.

This update also improves editing in Resolve. Loading and switching timelines on the edit page is now faster, with improved performance when working with a large number of audio tracks. Compound clips can now be made from in and out points so that editors can be more selective about which media they want to see directly in the edit page. There is also support for previewing timeline audio when performing live overwrites of video-only edits. Now when trimming, the duration will reflect the clip duration as users actively trim, so they can set a specific clip length. There is also support for a change-transition-duration dialog.

The media pool now includes metadata support for audio files with up to 24 embedded channels. Users can also duplicate clips and timelines into the same bin using copy and paste commands. There is now support for running the primary DaVinci Resolve screen as a window when dual-screen mode is enabled. Smart filters now let users sort media based on metadata fields, including keywords and people tags, so users can find the clips they need faster.

Video Coverage: HPA Tech Retreat’s making of The Lost Lederhosen

By Randi Altman

At the HPA Tech Retreat in Rancho Mirage, California, the Supersession was a little different this year. Under the leadership of Joachim (JZ) Zell — who you might know from his day job as VP of technology at EFILM — the Supersession focused on the making of the short film, The Lost Lederhosen, in “near realtime,” in the desert. And postPerspective was there, camera in hand, to interview a few of the folks involved.

Watch our video coverage here.

While production for the film began a month before the Retreat — with Steve Shaw, ASC, directing and DP Roy H. Wagner Jr., ASC, lending his cinematography talents — some scenes were shot the morning of the session with data transfer taking place during lunch and post production in the afternoon. Peter Moss, ASC, and Sam Nicholson, ASC, also provided their time and expertise. After an active day of production, cloud-based post and extreme collaboration, the Supersession ended with the first-ever screening of The Lost Lederhosen, the story of Helga and her friend Hans making their way to Los Angeles, Zell and the HBA (Hollywood Beer Alliance). Check out HPA’s trailer here.

From acquisition to post (and with the use of multiple camera formats, frame rates and lenses), the film’s crew was made up of volunteers, including creatives and technologists from companies such as AWS, Colorfront, Frame.io, Avid, Blackmagic, Red, Panavision, Zeiss, FilmLight, SGO, Stargate, Unreal Engine, Sohonet and many more. One of the film’s goals was to use the cloud as much as possible in order to test out that particular workflow. While there were some minor hiccups along the way, the film got made — at the HPA Tech Retreat — and these industry pros got smarter about working in the cloud, something that will be increasingly employed going forward.

While we were only able to chat with a handful of those pros involved, like any movie, the list of credits and thank-yous is too extensive to mention here — dozens of individuals and companies donated their services and time to make this possible.

Watch our video coverage here.

(A big thank you and shout out to Twain Richardson for editing our videos.)

Main Image Caption: AWS’ Jack Wenzinger and EFILM’s Joachim Zell

Senior colorist Tony D’Amore joins Picture Shop

Burbank’s Picture Shop has beefed up its staff with senior colorist Tony D’Amore, who will also serve as a director of creative workflow. In that role, he will oversee a team focusing on color prep and workflow efficiency.

Originally from rural Illinois, D’Amore made the leap to the West Coast to pursue an education, studying film and television at UCLA. He started his career in color in the early ‘90s, gaining valuable experience in the world of post. He has been working closely with color and post workflow since.

While D’Amore has experience working on Autodesk Lustre and FilmLight Baselight, he primarily grades in Blackmagic DaVinci Resolve. D’Amore has contributed color to several Emmy Award-winning shows nominated in the category of “Outstanding Cinematography.”

D’Amore has developed new and efficient workflows for Dolby Vision HDR and HDR10, coloring hundreds of hours of episodic programming for networks including CBS, ABC and Fox, as well as cable and streaming platforms such as HBO, Starz, Netflix, Hulu and Amazon.

D’Amore’s most notable project to date is having colored a Marvel series simultaneously for IMAX and ABC delivery. His list of color credits includes Barry (HBO), Looking for Alaska (Hulu), Legion (FX), Carnival Row (Amazon), Power (Starz), Fargo (FX), Elementary (CBS), Hanna (Amazon) and a variety of Marvel series, including Jessica Jones, Daredevil, The Defenders, Luke Cage and Iron Fist — all available on streaming platforms.

Amazon’s The Expanse Season 4 gets HDR finish

The fourth season of the sci-fi series The Expanse — its first streaming via Amazon Prime Video — was finished in HDR for the first time. Deluxe Toronto handled end-to-end post services, including online editorial, sound remixing and color grading. The series was shot on ARRI Alexa Minis.

In preparation for production, cinematographer Jeremy Benning, CSC, shot anamorphic test footage at a quarry that would serve as the filming stand-in for the season’s new alien planet, Ilus. Deluxe Toronto senior colorist Joanne Rourke then worked with Benning, VFX supervisor Bret Culp, showrunner Naren Shankar and series regular Breck Eisner to develop looks that would convey the location’s uninviting and forlorn nature, keeping the overall look desaturated and removing color from the vegetation. Further distinguishing Ilus from other environments, production chose to display scenes on or above Ilus in a 2.39 aspect ratio, while those featuring Earth and Mars remained in a 16:9 format.

“Moving into HDR for Season 4 of our show was something Naren and I have wanted to do for a couple of years,” says Benning. “We did test HDR grading a couple seasons ago with Joanne at Deluxe, but it was not mandated by the broadcaster at the time, so we didn’t move forward. But Naren and I were very excited by those tests and hoped that one day we would go HDR. With Amazon as our new home [after airing on Syfy], HDR was part of their delivery spec, so those tests we had done previously had prepared us for how to think in HDR.

“Watching Season 4 come to life with such new depth, range and the dimension that HDR provides was like seeing our world with new eyes,” continues Benning. “It became even more immersive. I am very much looking forward to doing Season 5, which we are shooting now, in HDR with Joanne.”

Rourke, who has worked on every season of The Expanse, explains, “Jeremy likes to set scene looks on set so everyone becomes married to the look throughout editorial. He is fastidious about sending stills each week, and the intended directive of each scene is clear long before it reaches my suite. This was our first foray into HDR with this show, which was exciting, as it is well suited for the format. Getting that extra bit of detail in the highlights made such a huge visual impact overall. It allowed us to see the comm units, monitors, and plumes on spaceships as intended by the VFX department and accentuate the hologram games.”

After making adjustments and ensuring initial footage was even, Rourke then refined the image by lifting faces and story points and incorporating VFX. This was done with input provided by producer Lewin Webb; Benning; cinematographer Ray Dumas, CSC; Culp or VFX supervisor Robert Crowther.

To manage the show’s high volume of VFX shots, Rourke relied on Deluxe Toronto senior online editor Motassem Younes and assistant editor James Yazbeck to keep everything in meticulous order. (For that they used the Grass Valley Rio online editing and finishing system.) The pair’s work was also essential to Deluxe Toronto re-recording mixers Steve Foster and Kirk Lynds, who have both worked on The Expanse since Season 2. Once ready, scenes were sent in HDR via Streambox to Shankar for review at Alcon Entertainment in Los Angeles.

“Much of the science behind The Expanse is quite accurate thanks to Naren, and that attention to detail makes the show a lot of fun to work on and more engaging for fans,” notes Foster. “Ilus is a bit like the wild west, so the technology of its settlers is partially reflected in communication transmissions. Their comms have a dirty quality, whereas the ship comms are cleaner-sounding and more closely emulate NASA transmissions.”

Adds Lynds, “One of my big challenges for this season was figuring out how to make Ilus seem habitable and sonically interesting without familiar sounds like rustling trees or bird and insect noises. There are also a lot of amazing VFX moments, and we wanted to make sure the sound, visuals and score always came together in a way that was balanced and hit the right emotions story-wise.”

Foster and Lynds worked side by side on the season’s 5.1 surround mix, with Foster focusing on dialogue and music and Lynds on sound effects and design elements. When each had completed his respective passes using Avid ProTools workstations, they came together for the final mix, spending time on fine strokes, ensuring the dialogue was clear, and making adjustments as VFX shots were dropped in. Final mix playbacks were streamed to Deluxe’s Hollywood facility, where Naren could hear adjustments completed in real time.

In addition to color finishing Season 4 in HDR, Rourke also remastered the three previous seasons of The Expanse in HDR, using her work on Season 4 as a guide and finishing with Blackmagic DaVinci Resolve 15. Throughout the process, she was mindful to pull out additional detail in highlights without altering the original grade.

“I felt a great responsibility to be faithful to the show for the creators and its fans,” concludes Rourke. “I was excited to revisit the episodes and could appreciate the wonderful performances and visuals all over again.”

DP Chat: Watchmen cinematographer Greg Middleton

By Randi Altman

HBO’s Watchmen takes us to new dimensions in this recent interpretation of the popular graphic novel. In this iteration, we spend a lot of our time in Tulsa, Oklahoma, getting to know Regina King’s policewoman Angela Abar, her unconventional family and a shadowy organization steeped in racism called the Seventh Kavalry. We also get a look back — beautiful in black and white — at Abar’s tragic family back story. It was created and written for TV by Lost veteran Damon Lindelof.

Greg Middleton

Greg Middleton, ASC, CSC, who also worked on Game of Thrones and The Killing, was the series cinematographer. We reached out to him to find out about his process, workflow and where he gets inspiration.

When were you brought on to Watchmen, and what type of looks did the showrunner want from the show?
I joined Watchmen after the pilot, starting with Episode 2. A lot of my early prep was devoted to discussions with the showrunner and producing directors on how to develop the look from the pilot going forward. This included some pilot reshoots due to changes in casting, and the designing and building of new sets, like the police precinct.

Nicole Kassell (director of Episodes 1, 2 and 8) and series production designer Kristian Milstead and I spent a lot of time breaking down the possibilities of how we could define the various worlds through color and style.

How was the look described to you? What references were you given?
We based the evolution of the look of the show on the scripts, the needs of the structure within the various worlds and on the graphic novel, which we commonly referred to as “the Old Testament.”

As you mention, it’s based on a graphic novel. Did the look give a nod to that? If so, how? Was that part of the discussion?
We attempted to break down the elements of the graphic novel that might translate well and those that would not. It’s an interesting bit of detective work because a lot of the visual cues in the comic are actually a commentary on the style of comics at the time it was published in 1985.

Those cues, if taken literally, would not necessarily work for us, as their context would not be clear. Things like color were very referential to other comics of the time. For example, they used only secondary colors instead of the primaries that were the norm. The graphic novel is also a film noir in many ways, so we got some of our ideas based on that.

What did translate well were compositional elements — tricks of transition like match cuts and the details of story in props, costumes and sets within each frame. We used some split diopters and swing shift lenses to give us some deep focus effects for large foreground objects. In the graphic novel, of course, everything is in focus, so those type of compositions are common!

This must have been fun because of the variety of looks the series has — the black-and-white flashbacks, the stylized version of Tulsa, the look of the mansion in Wales (Europa), Vietnam in modern day. Can you talk about each of the different looks?
Yes, there were so many looks! When we began prep on the series with the second episode, we were simultaneously beginning to film in Wales for the “blond man” scenes. We knew that storyline would have its own particular feel because of the location and its very separateness from the rest of the world.

A more classic traditional proscenium-like framing and style seemed very appropriate. Part of that intent was designed to both confuse and to make very apparent to the audience that we were definitely in another world. Cinematographer Chris Seager, BSC, was filming those scenes as I was still doing tests for the other looks and the rest of our show in Atlanta.

We discussed lenses, camera format, etc. The three major looks we had to design that we knew would go together initially were our “Watchmen” world, the “American hero story” show within the show, and the various flashbacks to 1921 Tulsa and World War I. I was determined to make sure that the main world of the show did not feel overly processed and colorized photographically. We shot many tests and developed a LUT that was mostly film-like. The other important aspects to creating a look are, of course, art direction and production design, and I had a great partner in Kristian Milstead, the production designer who joined the show after the pilot.

This was a new series. Do you enjoy developing the look of a show versus coming on board after the look was established?
I enjoy helping to figure out how to tell the story. For series, helping develop the look photographically in the visual strategy is a big part of that. Even if some of those are established, you still do similar decision-making for shooting individual scenes. However, I much prefer being engaged from the beginning.

So even when you weren’t in Wales, you were providing direction?
As I mentioned earlier, Chris Seager and I spoke and emailed regarding lenses and those choices. It was still early for us in Atlanta, but there were preliminary decisions to be made on how the “blond man” (our code name for Jeremy Irons) world would be compared to our Watchmen world. What I did was consult with my director, Nicole Kassell, on her storyboards for her sequences in Wales.

Were there any scenes or looks that stood out as more challenging than others? Can you describe?
Episode 106 was a huge challenge. We have a lot of long takes involving complex camera moves and dimmer cues as the camera circled or traveled between rooms. Also, we developed the black-and-white look to feel like older black-and-white film.
One scene in June’s apartment involved using the camera on a small Scorpio 10-foot crane and a mini libre head to accomplish a slow move around the room. Then we had to push between the two actors toward the wall as an in-camera cue of a projected image of the black-and-white movie Trust in the Law revealed itself with a manual iris.

This kind of shot ends up being a dance with at least six people, not including the cast. The entire “nostalgia” part of the episode was done this way. And none of it would’ve been possible without an incredible cast able to hit these incredibly long takes and choreograph themselves with the camera. Jovan Adepo and Danielle Deadwyler were incredible throughout the episode.

I assume you did camera tests. Why did you choose the ARRI Alexa? Why was it right for this? What about lenses, etc.?
I have been working with the Alexa for many years now, so I was aware of what I could do with the camera. I tested a couple of others, but in the end the Alexa Mini was the right choice for us. I also needed a camera that was small so I could go on and off of a gimbal or fit into small places.

How did you work with the colorist? Who was that on this show? Were you in the suite with them?
Todd Bochner was our final colorist at Sim in LA. I shot several camera tests and worked with him in the suite to help develop viewing LUTs for the various worlds of the show. We did the same procedure for the black and white. In the end, we mimicked techniques similar to black-and-white film — like red filters — except for us it meant adjusting the color channels accordingly.
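As a rough illustration of what adjusting the channels means, a red-filter-style monochrome conversion can be written as a weighted channel mix that favors the red record — darkening blue skies the way a red glass filter would on black-and-white stock. The weights below are illustrative, not the show’s actual values.

import numpy as np

def bw_red_filter(rgb, weights=(0.7, 0.25, 0.05)):
    # Normalize so midtones hold their overall level.
    w = np.asarray(weights) / np.sum(weights)
    return np.clip(rgb @ w, 0.0, 1.0)

sky = np.array([[0.3, 0.5, 0.9]])  # a blue-sky pixel
print(bw_red_filter(sky))           # renders darker than a plain Rec.709 luma mix would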

Do you know what they used on the color?
Yes, it was Blackmagic DaVinci Resolve 16.

How did you get interested in cinematography?
I was always making films as a kid, and then in school and university. In film school, at some point, splitting apart the various jobs, I seemed to have some aptitude for cinematography, so after school I decided to try making it my focus. I came to it more out of a love of storytelling and filmmaking and less about photography.

Greg Middleton

What inspires you? Other films?
Films that move me emotionally.

What’s next for you?
A short break! I’ve been very fortunate to have been working a lot lately. A film I shot just before Watchmen called American Woman, directed by Semi Chellas, should be coming out this year.

And what haven’t I asked that’s important?
I think the question all filmmakers should ask themselves is, “Why am I telling this story, and what is unique about the way in which I’m telling it?”


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Colorist Chat: Nice Shoes’ Maria Carretero on Super Bowl ads, more

This New York-based colorist, who worked on four Super Bowl spots this year, talks workflow, inspiration and more.

Name: Maria Carretero

Company: Nice Shoes

What kind of services does Nice Shoes offer?
Nice Shoes is a creative studio with color, editorial, animation, VFX, AR and VR services. It’s a full-service studio with offices in NYC, Chicago, Boston, Minneapolis and Toronto, as well as remote locations throughout North America.

Michelob Ultra’s Jimmy Works It Out

As a colorist, what would surprise people the most about what falls under that title?
I think people are surprised when they discover that there is a visual language in every single visual story that connects your emotions through all the imagery that we’ve collected in our brains. This work gives us the ability to nudge the audience emotionally over the course of a piece. Color grading is rooted in a very artistic base — core, emotional aspects that have been studied in art and color theory that make you explore cinematography in such an interesting way.

What system do you work on?
We use FilmLight Baselight as our primary system, but the team is also versed in Blackmagic Resolve.

Are you sometimes asked to do more than just color on projects?
Sometimes. If you have a solid relationship with the DP or the director, they end up consulting you about palettes, optics and references, so you become an active part of the creativity in the film, which is very cool. I love when I can get involved in projects from the beginning.

What’s your favorite part of the job?
My favorite moment is when you land on the final look and you see that the whole film is making visual sense and you feel that the story, the look and the client are all aligned — that’s magic!

Any least favorites?
No, I love coloring. Sometimes the situation becomes difficult because there are technical issues or disagreements, but it’s part of the work to push through those moments and make things work.

If you didn’t have this job, what would you be doing instead?
I would probably be a visual artist… always struggling to keep the lights on. I’m kidding! I have so much respect for visual artists, I think they should be treated better by our society because without art there is no progress.

How early did you know this would be your path?
I was a visual artist for seven years. I was part of Nives Fernandez’s roster, and all I wanted at that time was to try to tell my stories as an artist. I was freelancing in VFX to make money to survive, and from there to color was a very easy switch. When I landed at Deluxe Spain 16 years ago and started to explore color, I quickly fell in love.

It’s why I like to say that color chose me.

Avocados From Mexico: Shopping Network

You recently worked on a number of Super Bowl spots. Can you talk a bit about your work on them, and any challenges relating to deadlines?
This year I worked on four Super Bowl spots: Michelob Ultra PureGold: 6 for 6 Pack, Michelob Ultra: Jimmy Works It Out, Walmart: United Towns and Avocados From Mexico: Shopping Network.

Working on these kinds of projects is definitely a really interesting experience. The deadlines are tight and the pressure is enormous, but at the same time, the amount of talent and creativity involved is gigantic, so if you survive (laughs), you always come out a better professional. As a colorist, I love to be challenged. I love dealing with difficult situations where all your resources and energy are put to the test.

Any suggestions for getting the most out of a project from a color perspective?
Thousands! Technical understanding, artistic involvement, there are so many… But definitely trying to create something new, special, different; embracing the challenges and pushing beyond the boundaries are the keys to delivering good work.

How do you prefer to work with the DP or director?
I like working with both. Debating with any kind of artist is the best. It’s really great to be surrounded by someone that uses a common “language.” As I mentioned earlier, I love when there’s the opportunity to get the conversation going at the beginning of a project so that there’s more opportunity for collaboration, debate and creativity.

How do you like getting feedback in terms of the look? Photos, films, etc.?
Every single bit of information is useful. I love when they verbalize what they’re going for using stories, feelings — when you can really feel they’re expressing personality with the film.

Where do you find inspiration? Art? Photography?
I find inspiration in living! There are so many things that surround us that can be a source of inspiration. Art, landscapes, the light that you remember from your childhood, a painting, watching someone that grabs your attention on a train. New York is teeming with more than enough life and creativity to keep any artist going.

Name three pieces of technology you can’t live without.
The Tracker, Spotify and FaceTime.

This industry comes with tight deadlines. How do you de-stress from it all?
I have a sense of humor and lots of red wine (smiles).

MPI restores The Wizard of Oz in 4K HDR

By Barry Goch

The classic Victor Fleming-directed film The Wizard of Oz, which was released by MGM in 1939 and won two of its six Academy Award nominations, has been beautifully restored by Burbank’s Warner Bros. Motion Picture Imaging (MPI).

Bob Bailey

To share its workflow on the film, MPI invited a group of journalists to learn about the 4K UHD HDR restoration of this classic film. The tour guide for our high-tech restoration journey was MPI’s VP of operations and sales Bob Bailey, who walked us through the entire restoration process — from the original camera negative to final color.

The Wizard of Oz, which starred Judy Garland, was shot on a Technicolor three-strip camera system. According to Bailey, it ran three black and white negatives simultaneously. “That is why it is known as three-strip Technicolor. The magazine on top of the camera was triple the width of a normal black and white camera because it contained each roll of negative to capture your red, green and blue records,” explained Bailey.

“When shooting in Technicolor, you weren’t just getting the camera. You would rent a package that included the camera, a camera crew with three assistants, the film, the processing and a Technicolor color consultant.”

George Feltenstein, SVP of theatrical catalog marketing for Warner Bros. Home Entertainment, spoke about why the film was chosen for restoration. “The Wizard of Oz is among the crown jewels that we hold,” he said. “We wanted to embrace the new 4K HDR technology, but nobody’s ever released a film that old using this technology. HDR, or high dynamic range, has a color range that is wider than anything that’s come before it. There are colors [in The Wizard of Oz] that were never reproducible before, so what better a film to represent that color?”

Feltenstein went on to explain that this is the oldest film to get released in the 4K format. He hopes that this is just the beginning and that many of the films in Warner Bros.’ classic library will also be released on 4K HDR and worked on at MPI under Bailey’s direction.

The Process
MPI rescanned each of the three-strip Technicolor nitrate film negatives at 8K 16-bit on the Lasergraphics Director 10K scanner, composited them together and then applied a new color grade. “We have just under 15 petabytes of storage here,” said Bailey. “That’s working storage, because we’re working on 8K movies since [some places in the world] are now broadcasting 8K.”
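In principle, the recombination step works like the toy Python sketch below: invert each black-and-white negative to a positive, then stack the three aligned records into the red, green and blue channels. (The real pipeline also involves registration and density-to-light conversion; this shows only the core idea.)

import numpy as np

def composite_three_strip(red_neg, green_neg, blue_neg):
    # Each strip is a B&W negative in [0, 1]; 1.0 - x inverts it to a positive.
    records = [1.0 - strip for strip in (red_neg, green_neg, blue_neg)]
    return np.stack(records, axis=-1)  # three (H, W) records -> one (H, W, 3) frame

h, w = 4, 4
frame = composite_three_strip(np.random.rand(h, w),
                              np.random.rand(h, w),
                              np.random.rand(h, w))
print(frame.shape)  # (4, 4, 3)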

Steven Anastasi

Our first stop was to look at the Lasergraphics Director. We then moved on to MPI’s climate-controlled vault, where we were introduced to Steven Anastasi, VP of technical operations at Warner Bros. Anastasi explained that the original negative vault has climate-controlled conditions with 25% humidity at 35 degrees Fahrenheit, which is the combination required for keeping these precious assets safe for future generations. He said there are 2 million assets in the building, including picture and sound.

It was amazing to see film reels for 2001: A Space Odyssey sitting on a shelf right in front of me. In addition to the feature reels, MPI also stores millions of negatives captured throughout the years by Warner productions. “We also have a very large library,” reported Anastasi. “So the original negatives from the set, a lot of unit photography, head shots in some cases and so forth. There are 10 million of these.”

Finally, we were led into the color bay to view the film. Janet Wilson, senior digital colorist at MPI, has overseen every remaster of The Wizard of Oz for the past 20 years. Wilson used a FilmLight Baselight X system for the color grade. The grading suite housed multiple screens: a Dolby Pulsar for the Dolby Vision pass, a Sony X300 and a Panasonic EZ1000 OLED 4K HDR.

“We have every 4K monitor manufactured, and we run the film through all of them,” said Bailey. “We painstakingly go through the process from a post perspective to make sure that our consumers get the best quality product that’s available out in the marketplace.”

“We want the consumer experience on all monitors to be something that’s taken into account,” added Feltenstein. “So we’ve changed our workflow by having a consumer or prosumer monitor in these color correction suites so the colorist has an idea of what people are going to see at home, and that’s helped us make a better product.”

Our first view of the feature was a side-by-side comparison of the black and white scanned negative and the sepia color corrected footage. The first part of the film, which takes place in Kansas, was shot in black and white, and then a sepia look was applied to it. The reveal scene, when Dorothy passes through the door going into Oz, was originally shot in color. For this new release, the team generated a matte so Wilson could add this sepia area to the inside of the house as Dorothy transitioned into Oz.

“So this is an example of some of the stuff that we could do in this version of the restoration,” explained Wilson. “With this version, you can see that the part of the image where she’s supposed to be in the monochrome house is not actually black and white. It was really a color image. So the trick was always to get the interior of the house to look sepia and the exterior to look like all of the colors that it’s supposed to. Our visual effects team here at MPI — Mike Moser and Richie Hiltzik — was able to draw a matte for me so that I could color inside of the house independently of the exterior and make them look right, which was always a really tricky thing to do.”
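The matte work Wilson describes boils down to a weighted blend: where the matte is on, use the sepia grade; where it is off, keep the full-color grade. Here is a miniature sketch with made-up values.

import numpy as np

def matte_blend(sepia, color, matte):
    m = matte[..., None]  # (H, W) matte -> (H, W, 1) so it broadcasts over RGB
    return m * sepia + (1.0 - m) * color

h, w = 2, 2
sepia = np.full((h, w, 3), [0.44, 0.33, 0.20])  # flat sepia patch
color = np.full((h, w, 3), [0.20, 0.55, 0.25])  # saturated Oz green
matte = np.array([[1.0, 1.0], [0.0, 0.0]])       # top row: house; bottom: Oz
print(matte_blend(sepia, color, matte))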

Wilson referred back to the Technicolor three-strip, explaining that because you’ve got three different pieces of film — the different records — they’re receiving the light in different ways. “So sometimes one will be a little brighter than the other. One will be a little darker than the other, which means that the Technicolor is not a consistent color. It goes a little red, and then it goes a little green, and then it goes a little blue, and then it goes a little red again. So if you stop on any given frame, it’s going to look a little different than the frames around it, which is one of the tricky parts of color correcting Technicolor. When that’s being projected by a film projector, it’s less noticeable than when you’re looking at it on a video monitor, so it takes a lot of little individual corrections to smooth those kinds of things out.”

Wilson reported seeing new things with the 8K scan and 4K display. “The amount of detail that went into this film really shows up.” She said that one of the most remarkable things about the restoration was the amazing detail visible on the characters. For the first time in many generations, maybe ever, you can actually see the detail of the freckles on Dorothy’s face.

In terms of leveraging the expanded dynamic range of HDR, I asked Wilson whether she tried to map the HDR into a sweet spot, so that it’s spectacular yet not overpowering.

“I ended up isolating the very brightest parts of the picture,” she replied. “In this case, it’s mostly the sparkles on their shoes and curving those off so I could run those in, because this movie is not supposed to have modern-day animation levels of brightness. It’s supposed to be much more contained. I wanted to take advantage of brightness and the ability to show the contrast we get from this format, because you can really see the darker parts of the picture. You can really see detail within the Wicked Witch’s dress. I don’t want it to look like it’s not the same film. I want it to replicate that experience of the way this film should look if it was projected on a good print on a good projector.”
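Her description of curving off the brightest values corresponds to what colorists call a highlight shoulder. The sketch below shows one common way to build such a rolloff — values below the knee pass through untouched, while values above compress smoothly toward a chosen peak. The knee and peak here are examples, not the grade’s actual settings.

import numpy as np

def rolloff_highlights(x, knee=0.8, peak=1.0):
    # Exponential shoulder: continuous at the knee, approaches `peak` asymptotically.
    x = np.asarray(x, dtype=float)
    over = np.maximum(x - knee, 0.0)
    span = peak - knee
    return np.where(x <= knee, x, knee + span * (1.0 - np.exp(-over / span)))

print(rolloff_highlights([0.5, 0.85, 1.2, 2.0]))  # highlights ease in below 1.0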

Dorothy’s ruby slippers also presented a challenge to Wilson. “They are so red and so bright. They’re so light-reflective, but there were times when they were just a little too distracting. So I had to isolate the slippers and bring them down a little bit so that they weren’t the first and only thing you saw in the image.”

If you are wondering whether audio was part of this most recent restoration, the answer is no, but it had been remastered for a previous version. “As early as 1929, MGM began recording its film music using multiple microphones. Those microphone angles allowed the mixer to get the most balanced monophonic mix, and they were preserved,” explained Feltenstein. “Twenty years ago, we created a 5.1 surround mix that was organically made from the original elements that were created in 1939. It is full-frequency, lossless audio, and a beautiful restoration job was made to create that track so you can improve upon what I consider to be close to perfection without anything that would be disingenuous to the production.”

In all, it was an amazing experience to go behind the scenes and see how the wizards of MPI created a new version of this masterpiece for today and preserved it for future generations.

This restored version of The Wizard of Oz is a must-see visual extravaganza, and there is no better way to see it than in UHD, HDR, Dolby Vision or HDR10+. What I saw in person took my breath away, and I hope every movie fan out there can have the opportunity to see this classic film in its never-before-seen glory.

The 4K version of The Wizard of Oz is currently available via an Ultra HD Blu-ray Combo Pack and digital.


Barry Goch is a finishing artist at LA’s The Foundation as well as a UCLA Extension Instructor, Post Production. You can follow him on Twitter at @Gochya

Sony adds 4K HDR reference monitors to Trimaster range

Sony is offering a new set of high-grade 4K HDR monitors as part of its Trimaster range. The PVM-X2400 (24-inch) and the PVM-X1800 (18.4-inch) professional 4K HDR monitors were demo’d at the BSC Expo 2020 in London. They will be available in the US starting in July.

The monitors provide ultra-high definition with a resolution of 3840×2160 pixels and an all-white luminance of 1,000 cd/m2. For optimum film production, their wide color gamut matches that of the BVM-HX310 Trimaster HX master monitor. This means both monitors feature accurate color reproduction and greyscale, which helps filmmakers make critical imaging decisions and deploy faithful color matching throughout the workflow.

The monitors, which are small and portable, are designed to expand footprints in 4K HDR production, including applications such as on-set monitoring, nonlinear video editing, studio wall monitoring and rack-mount monitoring in OB trucks or machine rooms.

The monitors also feature the new Black Detail High/Mid/Low function, which helps maintain accurate color reproduction by reducing the brightness of the backlight to reproduce the correct colors and gradations in low-luminance areas. Another new function, Dynamic Contrast Drive, changes backlight luminance to adapt to each scene or frame when moving images between the PVM-X2400/X1800 and an existing Sony OLED monitor. This functionality allows filmmakers to check the highlight and low-light balance of content in both bright and dark scenes.

Other features include:
• Dynamic contrast ratio of 1,000,000:1 via Dynamic Contrast Drive, a new backlight driving system that dynamically changes the backlight luminance to adapt to each frame of a scene.
• 4K/HD waveform and vectorscope displays with HDR scales.
• Quad View display and User 3D LUT functionality.
• 12G/6G/3G/HD-SDI with auto configuration.

Marriage Story director Noah Baumbach

By Iain Blair

Writer/director Noah Baumbach first made a name for himself with The Squid and the Whale, his 2005 semi-autobiographical, bittersweet story about his childhood and his parents’ divorce. It launched his career, scoring him an Oscar nomination for Best Original Screenplay.

Noah Baumbach

His latest film, Marriage Story, is also about the disintegration of a marriage — and the ugly mechanics of divorce. Detailed and emotionally complex, the film stars Scarlett Johansson and Adam Driver as the doomed couple.

In all, Marriage Story scooped up six Oscar nominations — Best Picture, Best Actress, Best Actor, Best Supporting Actress, Best Original Screenplay and Best Original Score. Laura Dern walked away with a statue for her supporting role.

The film co-stars Dern, Alan Alda and Ray Liotta. The behind-the-scenes team includes director of photography Robbie Ryan, editor Jennifer Lame and composer Randy Newman.

Just a few days before the Oscars, Baumbach — whose credits also include The Meyerowitz Stories, Frances Ha and Margot at the Wedding — talked to me about making the film and his workflow.

What sort of film did you set out to make?
It’s obviously about a marriage and divorce, but I never really think about a project in specific terms, like a genre or a tone. In the past, I may have started a project thinking it was a comedy but then it morphs into something else. With this, I just tried to tell the story as I initially conceived it, and then as I discovered it along the way. While I didn’t think about tone in any general sense, I became aware as I worked on it that it had all these different tones and genre elements. It had this flexibility, and I just stayed open to all those and followed them.

I heard that you were discussing this with Adam Driver and Scarlett Johansson as you wrote the script. Is that true?
Yes, but it wasn’t daily. I’d reached out to both of them before I began writing it, and luckily they were both enthusiastic and wanted to do it, so I had them as an inspiration and guide as I wrote. Periodically, we’d get together and discuss it and I’d show them some pages to keep them in the loop. They were very generous with conversations about their own lives, their characters. My hope was that when I gave them the finished script it would feel both new and familiar.

What did they bring to the roles?
They were so prepared and helped push for the truth in every scene. Their involvement from the very start did influence how I wrote their roles. Nicole has that long monologue and I don’t know if I’d have written it without Scarlett’s input and knowing it was her. Adam singing “Being Alive” came out of some conversations with him. They’re very specific elements that come from knowing them as people.

You reunited with Irish DP Robbie Ryan, who shot The Meyerowitz Stories. Talk about how you collaborated on the look and why you shot on film?
I grew up with film and feel it’s just the right medium for me. We shot The Meyerowitz Stories on Super 16, and we shot this on 35mm, and we had to deal with all these office spaces and white rooms, so we knew there’d be all these variations on white. So there was a lot of discussion about shades and the palette, along with the production and costume designers, and also how we were going to shoot these confined spaces, because it was what the story required.

You shot on location in New York and LA. How tough was the shoot?
It was challenging, but mainly because of the sheer length of many of the scenes. There’s a lot of choreography in them, and some are quite emotional, so everyone had to really be up for the day, every day. There was no taking it easy one day. Every day felt important for the movie.

Where did you do the post?
All in New York. I have an office in the Village where I cut my last two films, and we edited there again. We mixed on the Warner stage, where I’ve mixed most of my movies. We recorded the music and orchestra in LA.

Do you like the post process?
I really love it. It’s the most fun and the most civilized part of the whole process. You go to work and work on the film all day, have dinner and go home. Writing is always a big challenge, as you’re making it up as you go along, and it can be quite agonizing. Shooting can be fun, but it’s also very stressful trying to get everything you need. I love working with the actors and crew, but you need a high level of energy and endurance to get through it. So then post is where you can finally relax, and while problems and challenges always arise, you can take time to solve them. I love editing, the whole rhythm of it, the logic of it.


Talk about editing with Jennifer Lame. How did that work?
We work so well together, and our process really starts in the script stage. I’ll give her an early draft to get her feedback and, basically, we start editing the script. We’ll go through it and take out anything we know we’re not going to use. Then during the shoot she’ll sometimes come to the set, and we’ll also talk twice a day. We’ll discuss the day’s work before I start, and then at lunch we’ll go over the previous day’s dailies. So by the time we sit down to edit, we’re really in sync about the whole movie. I don’t work off an assembly, so she’ll put together stuff for herself to let me know a scene is working the way we designed it. If there’s a problem, she’ll let me know what we need.

What were the big editing challenges?
Besides the general challenges of getting a scene right, I think for some of the longer ones it was all about finding the right rhythm and pacing. And it was particularly true of this film that the pace of something early on could really affect something later. Then you have to fix the earlier bit first, and sometimes it’s the scene right before. For instance, the scene where Charlie and Nicole have a big argument that turns into a very emotional fight is really informed by the courtroom scene right before it. So we couldn’t get it right until we’d got the courtroom scene right.

A lot of directors do test screenings. Do you?
No, I have people I show it to and get feedback, but I’ve never felt the need for testing.

VFX play a role. What was involved?
The Artery did them. For instance, when Adam cuts his arm we used VFX in addition to the practical effects, and then there’s always cleanup.

Talk about the importance of sound to you as a filmmaker, as it often gets overlooked in this kind of film.
I’m glad you said that because that’s so true, and this doesn’t have obvious sound effects. But the sound design is quite intricate, and Chris Scarabosio (working out of Skywalker Sound), who did Star Wars, did the sound design and mix; he was terrific.

A lot of it was taking the real-world environments in New York and LA and building on that, and maybe taking some sounds out and playing around with all the elements. We spent a lot of time on it, as both the sound and image should be unnoticed in this. If you start thinking, “That’s a cool shot or sound effect,” it takes you out of the movie. Both have to be emotionally correct at all times.

Where did you do the DI and how important is it to you?
We did it at New York’s Harbor Post with colorist Marcy Robinson, who’s done several of my films. It’s very important, but we didn’t do anything too extreme, as there’s not a lot of leeway for changing the look that much. I’m very happy with the look and the way it all turned out.

Congratulations on all the Oscar noms. How important is that for a film like this?
It’s a great honor. We’re all still the kids who grew up watching movies and the Oscars, so it’s a very cool thing. I’m thrilled.

What’s next?
I don’t know. I just started writing, but nothing specific yet.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Visible Studios produces, posts Dance Monkey music video

If you haven’t heard the song “Dance Monkey” by Tones and I, you soon will. Australia’s Visible Studios provided production and post on the video for the song, which hit number one in more than 30 countries, went seven times platinum and remained at the top of the charts in Australia for 22 weeks. The video has been viewed on YouTube more than half a billion times.

Visible Studios, a full production and post company, is run by producer Tim Whiting and director and editor Nick Kozakis. The company features a team of directors, scriptwriters, designers, motion graphic artists and editors working on films, TV commercials and music videos.

For Dance Monkey, Visible Studios worked directly with Tones and I to develop the idea for the video. The video, which was shot on Red cameras at the beginning of the song’s meteoric rise, was completed in less than a week and on a small budget.

“The Dance Monkey music video was made on an extremely quick turnaround,” says Whiting. “[Tones] was blowing up at the time, and they needed the music video out fast. The video was shot in one day, edited in two, with an extra day and a half for color and VFX.” Visible Studios called on Blackmagic DaVinci Resolve Studio for the edit, VFX and color.

Dance Monkey features the singer dressed as Old Tones, an elderly man whisked away by his friends to a golf course to dance and party. On the day of production, the sun was nowhere to be found, and each shot was done against a gray and dismal background. To fix this, the team brought in a sky image as a matte and used Resolve’s match move tool, keyer, lens blur and power windows to turn gray footage into brilliant sunshine.

“In post we decided to replace the overcast skies with a cloudy blue sky. We ended up doing this all in Resolve’s color page and keyed the grass and plants to make them more lush, and we were there,” says Whiting.
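For those curious about the mechanics, a sky replacement of this kind boils down to a matte-and-blend operation driven by luminance. The following Python sketch is a conceptual illustration only, not Resolve’s implementation; the function name, thresholds and Rec. 709 weights are assumptions chosen for the example.

import numpy as np

def luma_key_sky(fg, sky, lo=0.80, hi=0.95):
    # fg and sky are float RGB arrays in [0, 1] with matching shapes.
    # Rec. 709 luma of the foreground drives the key.
    luma = 0.2126 * fg[..., 0] + 0.7152 * fg[..., 1] + 0.0722 * fg[..., 2]
    # Soft matte: 0 below lo, 1 above hi, linear ramp in between.
    matte = np.clip((luma - lo) / (hi - lo), 0.0, 1.0)[..., None]
    # Composite the sky plate wherever the matte is white.
    return matte * sky + (1.0 - matte) * fg

The grass-and-plant key Whiting describes above applies the same principle, just with a color qualifier instead of a luminance threshold.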

Editor/directors Kozakis and Liam Kelly used Resolve for the entire editing process. “Being able to edit 6K raw footage smoothly on a 4K timeline, at a good quality debayer, means that we don’t have to mess around with proxies and that the footage gets out of the way of the editing process. The recent update for decompression and debayer on Nvidia cards has made this performance even better,” Kozakis says.


Review: Neat Video 5 noise reduction plugin

By Brady Betzel

One of the best (and most underrated) tricks in an online editor’s tool kit is to have good image restoration techniques. Removing digital video imperfections — from flicker to digital video noise — is not easy, and not easy to do well. That is, unless you have good noise reduction software like Neat Video.

While Neat Video might not be that well-known, once you see how simply (or intricately) Neat Video 5 works inside of apps like Blackmagic’s DaVinci Resolve, it will be hard to forget the company’s name.

(While the software was recently updated to 5.1.5 — with expanded GPU support as well as support for new versions of Resolve, Adobe and Nuke — nothing really changes for this review. You can check out a detailed list of the updates here.)

Neat Video 5 is a noise reduction plugin. In a Windows OS environment, Neat Video is compatible with apps like Adobe After Effects, Adobe Premiere Pro, DaVinci Resolve, Avid Media Composer, Vegas, Magix, Edius, Virtual Dub, and the OFX-compatible apps Nuke, Fusion, Scratch, HitFilm, Mamba, Natron, Flame, Baselight and DustBuster. In a macOS environment, Neat Video 5 is compatible with After Effects, Premiere, Final Cut Pro X, Motion 5, OFX, Resolve and Media Composer. In Linux, the software is compatible with OFX-compatible apps and Resolve.

Neat Video 5 comes in three flavors: Demo, Home and Pro. The Demo version works in up to 1280×720 resolution with a watermark. Home is literally made for the home user: It will process video up to 1920×1080 resolutions, it will use up to one GPU, and it is for non-commercial use. The cost is just $74.90 for most apps (Resolve is $89.90). The Pro version has no resolution restrictions, will work on two or more GPUs simultaneously, and can be used commercially. The Pro version starts at $129.90 per app ($159.90 for Resolve). Because Neat Video 5 for OFX works with so many apps, it only comes in Pro ($249.90) and Studio ($349.90) versions. The Studio version adds the ability for a floating license. You can see all of the pricing details here.

If there is one line you should take away from this review, it is this: Neat Video 5 is by far the easiest and best noise reduction software I have used in any application to date. And while this review is focusing on the Resolve version of Neat Video 5, all other apps work in much the same way. You can find Neat Video’s software-specific Quick Start Guides to help. Once you install and register your Neat Video 5 license, removing digital video noise is as easy as applying Neat Video 5 to a node in the color tab, clicking on “Prepare Noise Profile,” clicking on “Auto Profile,” and clicking “Apply.” Then, unless you want to fine-tune your noise reduction, you are done. Obviously, I have somewhat simplified how Neat Video 5 works, but essentially it can be done in as little as three steps per clip, and the results are typically amazing. If they aren’t amazing, you can jump back into Neat Video 5 and manually adjust specifics until the noise reduction looks correct. But I will say that in about 90% of cases, the Auto Profiling will do all of the noise reduction work necessary.

For tinkerers, or for those who need to go far beyond an Auto Profile, you can manually adjust your settings. But taking a step back, Neat Video needs an area of your image that has a uniform color and noise profile to process how it removes noise. The automatic profiling will do its best to find an area, but it doesn’t always work. What you need to keep in mind when building a good noise profile inside of Neat Video is that the area being processed needs to be as uniform as possible (think dark night sky or a wall painted in one color) — meaning no prominent features, a high noise level (something in the high four area is better), the largest possible sample area and no warnings from Neat Video.
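To make the idea of a noise profile concrete, here is a toy Python sketch. It only illustrates the principle — measure the noise statistics in a featureless region, then denoise with that estimate. Neat Video’s actual spatio-temporal filtering is far more sophisticated; the scikit-image wavelet denoiser below is just a stand-in.

import numpy as np
from skimage.restoration import denoise_wavelet

rng = np.random.default_rng(0)
clean = np.full((256, 256), 0.5)                    # flat, featureless frame
noisy = clean + rng.normal(0.0, 0.05, clean.shape)  # add synthetic noise

# "Profile" a uniform patch: in a flat area, pixel spread is mostly noise.
sigma_est = float(noisy[16:80, 16:80].std())        # ~0.05 here

# Denoise using the profiled level (soft wavelet thresholding).
out = denoise_wavelet(noisy, sigma=sigma_est, method='VisuShrink', mode='soft')

This is also why the profile area needs to be uniform: any real detail inside the patch inflates the noise estimate and leads to over-filtering.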

So, if your automatic profile doesn’t do the job, you can find an area of your image that meets the above requirements and then build a profile. From there you can use one of the Neat Video 5 features, like “Profile Check.” Profile Check will highlight details that aren’t being affected by Neat Video, giving you a clear representation of what noise is being reduced and whether you need to adjust your profile to better reduce video noise.

At this point you might be wondering where you tweak advanced settings. When you load Neat Video, you will be in Beginner mode. To get into Advanced mode, go to the Tools menu, where you will find a lot of advanced functions that can help you fine-tune your noise profile. And if you still can’t get a good noise reduction profile, you can try the “Generic Profile,” which can help you build a profile even if your video doesn’t have a large enough area of uniform noise. There are also presets — such as light flicker, moiré flicker, repeat-frame issues, dust and scratch filters (including scan lines), jitter of details, an artifact removal filter and more — that can solve specific problems.

Neat Video 5 is faster than previous generations. As in previous versions, there is even a tool inside of Neat Video’s preferences that will run your CPU and GPU through a benchmark to determine whether you should run on CPU only, GPU only, or a combination of both. In Neat Video 5, if you have trouble with a clip, you can use up to four “Variants” of noise reduction in the new playback window to see how each profile works with your clip.

In terms of playback and rendering, noise reduction is never fast. However, inside of Neat Video the new playback window will typically play back your footage to preview the noise reduction before you jump back into Resolve. Inside of Resolve, even in just 1080p, my sequence would crawl to just a few frames of playback per second. It is one of the most processor- and GPU-intensive tasks you will run on your computer.

In my testing, I applied Neat Video 5 to the first node in my color correction tree, followed by a basic color correction, in a one-minute timeline. I took those same clips and compared my Neat Video results to Resolve’s Temporal and Spatial noise reduction tools. In terms of visual results, Neat Video 5 was superior. If that’s not the case for you, jump into YCbCr viewer mode inside of Neat Video 5, isolate each channel and tweak only the channels that need it, so you don’t affect the overall noise reduction unnecessarily. Not only did Neat Video 5 handle normal noise in the shadows well, but on clips with very tight lines it was able to keep a lot of the detail while removing the noise. Resolve’s noise reduction tools had a harder time removing noise while keeping detail: Temporal noise reduction really didn’t do much, and while Spatial noise reduction did work, it heavily blurred and distorted the image — essentially not acceptable.

To get a good example of how Neat Video 5 slams a computer system, I exported 1080p MP4s. Resolve’s built-in Temporal noise reduction took 1:03, while the Spatial noise reduction took 1:05. The Neat Video 5 render of the same one-minute timeline took 3:51 — almost four times as long. I was curious how much longer a 4K render would take. Using 4K (UHD) media, I applied a simple color correction and, on a prior serial node, Neat Video 5. I exported a 4K (UHD) MP4, which took 52 seconds without Neat Video 5 applied and 16:27 with it applied — roughly 19 times the render time! So while Neat Video 5 is an amazing tool, there is a trade-off in high render times.
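Those overhead figures are easy to sanity-check from the timings quoted above; a few lines of Python do it:

# Render-time overhead from the timings in the paragraph above.
def secs(t):
    m, s = t.split(":")
    return int(m) * 60 + int(s)

print(secs("3:51") / secs("1:03"))   # 1080p with Neat Video 5: ~3.7x
print(secs("16:27") / secs("0:52"))  # UHD with Neat Video 5: ~19x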

To find additional training on more advanced noise reduction techniques in Neat Video, check out the video tutorials. I find myself watching these just because of how much you can learn about noise reduction in general. They aren’t as exciting as watching Game of Thrones or The Handmaid’s Tale, but they will push your knowledge in noise reduction to the next level.

Summing Up
I’ve used Neat Video for a while, so when I was approached to review Version 5, I immediately said yes. Noise reduction is a post skill that not many possess.

If you are an online editor or colorist looking to separate yourself from the pack, learn all the noise reduction techniques you can and definitely check out Neat Video 5. Not only can Neat Video 5 work automatically, but you can fine-tune your noise reduction as much as you want.

And when demoing your color correction services, think about using Neat Video 5 to remove camera noise, flickering and chroma issues; color correcting your footage; and, finally, adding some grain back into your shot. Not only will your footage look better, but you’ll have a technical workflow that will definitely impress clients. Just don’t forget to account for the extra render time.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producer’s Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Editor David Cea joins Chicago’s Optimus  

Chicago-based production and post house Optimus has added editor David Cea to its roster. With 15 years of experience in New York and Chicago, Cea brings a varied portfolio of commercial editing experience to Optimus.

Cea has cut spots for brands such as Bank of America, Chevrolet, Exxon, Jeep, Hallmark, McDonald’s, Microsoft and Target. He has partnered with many agencies, including BBDO, Commonwealth, DDB, Digitas, Hill Holliday, Leo Burnett, Mother and Saatchi & Saatchi.

“I grew up watching movies with my dad and knew I wanted to be a part of that magical process in some way,” explains Cea. “The combination of Goodfellas and Monty Python gave me all the fuel I needed to start my film journey. It wasn’t until I took an editing class in college that I discovered the part of filmmaking I wanted to pursue. The editor is the one who gets to shape the final product and bring out the true soul of the footage.”

After studying film at Long Island’s Hofstra University, Cea met Optimus editor and partner Angelo Valencia while working as his assistant at Whitehouse New York in 2005. Cea then moved on to hone his craft further at Cosmo Street in New York. Chicago became home for him in 2013 as he spent three years at Whitehouse. After heading back east for a couple of years, he returned to Chicago to put down roots.

While Avid Media Composer is Cea’s go-to choice for editing, he is also proficient in Adobe Premiere.

Colorist Chat: Light Iron supervising colorist Ian Vertovec

“As colorists, we are not just responsible for enhancing each individual shot based on the vision of the filmmakers, but also for helping to visually construct an emotional arc over time.”

NAME: Ian Vertovec

TITLE: Supervising Colorist

COMPANY: Light Iron

CAN YOU DESCRIBE YOUR ROLE IN THE COMPANY?
Light Iron is a Hollywood-based collaborator for motion picture finishing, with a studio in New York City as well. I’m a supervising colorist there.

GLOW

AS A COLORIST, WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
As colorists, we are not just responsible for enhancing each individual shot based on the vision of the filmmakers, but also for helping to visually construct an emotional arc over time. For example, a warm scene feels warmer coming out of a cool scene as opposed to another warm scene. We have the ability and responsibility to nudge the audience emotionally over the course of the film. Using color in this way makes color grading a bit like a cross between photography and editing.

ARE YOU SOMETIMES ASKED TO DO MORE THAN JUST COLOR ON PROJECTS?
Once in a while, I’ll be asked to change the color of an object, like change a red dress to blue or a white car to black. While we do have remarkable tools at our disposal, this isn’t quite the correct way to think about what we can do. Instead of being able to change the color of objects, it’s more like we can change the color of the light shining on objects. So instead of being able to turn a red dress to blue, I can change the light on the dress (and only the dress) to be blue. So while the dress will appear blue, it will not look exactly how a naturally blue dress would look under white light.

WHAT’S YOUR FAVORITE PART OF THE JOB?
There is a moment with new directors, after watching the first finished scene, when they realize they have made a gorgeous-looking movie. It’s their first real movie, which they never fully saw until that moment — on the big screen, crystal clear and polished — and it finally looks how they envisioned it. They are genuinely proud of what they’ve done, as well as appreciative of what you brought out in their work. It’s an authentic filmmaking moment.

WHAT’S YOUR LEAST FAVORITE?
Working on multiple jobs at a time and long days can be very, very draining. It’s important to take regular breaks to rest your eyes.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Something with photography, VFX or design, maybe.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I was doing image manipulation in high school and college before I even knew what color grading was.

Just Mercy

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Just Mercy, Murder Mystery, GLOW, What We Do in the Shadows and Too Old to Die Young.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
Sometimes your perspective and a filmmaker’s perspective for a color grade can be quite divergent. There can be a temptation to take the easy way and either defer or overrule. I find tremendous value in actually working out those differences and seeing where and why you are having a difference of opinion.

It can be a little scary, as nobody wants to be perceived as confrontational, but if you can civilly explain where and why you see a different approach, the result will almost always be better than what either of you thought possible in the first place. It also allows you to work more closely and understand each other’s creative instincts more accurately. Those are the moments I am most proud of — when we worked through an awkward discord and built something better.

WHERE DO YOU FIND INSPIRATION?
I have a fairly extensive library of Pinterest boards — mostly paintings — but it’s real life and being in the moment that I find more interesting. The color of a green leaf at night under a sodium vapor light, or how sunlight gets twisted by a plastic water bottle — that is what I find so cool. Why ruin that with an Insta post?

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
FilmLight Baselight’s Base Grade, FilmLight Baselight’s Texture Equalizer and my Red Hydrogen.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Instagram mostly.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
After working all day on a film, I often don’t feel like watching another movie when I get home because I’ll just be thinking about the color. I usually unwind with a video game, book or podcast. The great thing about a book or a video game is that it demands 100% of your attention. You can’t simultaneously browse social media or the news or be thinking about work. You have to be 100% in the moment, and it really resets your brain.

Quick Chat: Director Sorrel Brae on Rocket Mortgage campaign

By Randi Altman

Production company Native Content and director Sorrel Brae have collaborated once again with Rocket Mortgage’s in-house creative team on two new spots in the ongoing “More Than a House” campaign. Brae and Native had worked together on the campaign’s first four offerings.

The most recent spots are More Than a Tradition and More Than a Bear. More Than a Tradition shows a ‘50s family sitting down to dinner and having a fun time at home. Then the audience sees the same family in modern times, hammering home how traditions become traditions.

More Than a Bear combines fantasy and reality as it shows a human-sized teddy bear on an operating table. Then viewers see a worried boy looking on as his mother repairs his stuffed animal. Each spot opens with the notes of Bob Dylan’s “The Man In Me,” which is featured in all the “More Than a House” spots.

More Than a Bear was challenging, according to Brae, because it contains some darker material than the other spots — viewers aren’t sure at first if the bear will make it. Brae worked closely with DP Jeff Kim on the lighting and color palette to find a way to keep the tone lighthearted. By embracing primary colors, the two were able to channel a moodier tone and bring viewers inside a scared child’s imagination while still maintaining some playfulness.

We reached out to director Brae to find out more.

Sorrel Brae

What did you shoot these two spots on, and why?
I felt that in order for the comedy to land and the idea to shine, the visual separation between fantasy and reality had to be immediate, even shocking. Shooting on an Alexa Mini, we used different lenses for the two looks: Hawk V-Lite Vintage ’74 anamorphic for epic and cinematic fantasy, and spherical Zeiss and Cooke S4 primes for reality. The notable exception was in the hospital for the teddy bear spot, where our references were the great Spielberg and Zemeckis films from the ‘80s, which are primarily spherical and have a warmer, friendlier feeling.

How did you work with the DP and the colorist on the look? And how would you describe the look of each spot, and the looks within each spot? 
I was fortunate to bring on longtime collaborators DP Jeffrey Kim and colorist Mike Howell for both spots. Over the years, Jeff and I have developed a shorthand for working together. It all starts with defining our intention and deciding how to give the audience the feelings we want them to have.

In Tradition, for example, that feeling is a warm nostalgia for a bygone era that was probably a fantasy then, just as it is now. We looked to period print advertisements, photographs, color schemes, fonts — everything that spoke to that period. Crucial to pulling off both looks in one day was Heidi Adams’ production design. I wanted the architecture of the house to match when cutting between time periods. Her team had to put a contemporary skin on a 1950s interior for us to shoot “reality” and then quickly reset the entire house back to the 1950s to shoot “fantasy.”

The intention for More Than a Bear was trickier. From the beginning I worried a cinematic treatment of a traumatic hospital scene wouldn’t match the tone of the campaign. My solution with Jeff was to lean into the look of ‘80s fantasy films like E.T. and Back to the Future with primary colors, gelled lights, a continuously moving camera and tons of atmosphere.

Mike at Color Collective even added a retro Ektachrome film emulation for the hospital and a discontinued Kodak 5287 emulation for the bedroom to complete the look. But the most fun was the custom bear that costume designer Bex Crofton-Atkins created for the scene. My only regret is that the spot isn’t 60 seconds because there’s so much great bear footage that we couldn’t fit into the cut.

What was this edited on? Did you work with the same team on both campaigns?
The first four spots of this campaign were cut by Jai Shukla out of Nomad Edit. Jai did great work establishing the rhythm between fantasy and reality and figuring out how to weave in Bob Dylan’s memorable track for the strongest impact. I’m pretty sure Jai cuts on Avid, which I like to tease him about.

These most recent two spots (Tradition and Teddy Bear) were cut by Zach DuFresne out of Hudson Edit, who did an excellent job navigating scripts with slightly different challenges. Teddy Bear has more character story than any of the others, and Tradition relies heavily on making the right match between time periods. Zach cuts on Premiere, which I’ve also migrated to (from FCP 7) for personal use.

Were any scenes more challenging than the others?
What could be difficult about kids, complex set design, elaborate wardrobe changes and detailed camera moves on a compressed schedule? In truth, it was all equally challenging and rewarding.

Ironically, the shots that gave us the most difficulty probably look the simplest. In Tradition there’s a Steadicam move that introduces us to the contemporary world, has match cuts on either end and travels through most of the set and across most of the cast. Because everyone’s movements had to align perfectly with a non-repeatable camera move, that one took longer than expected.

And on Teddy Bear, the simple shot looking up from the patient’s POV as the doctor/mom looms overhead was surprisingly difficult. Because we were on an extremely wide lens (12mm or similar), our actress had to nail her marks down to the millimeter, otherwise it looked weird. We probably shot that one setup 20 times.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Review: FilmConvert Nitrate for film stock emulation

By Brady Betzel

If you’ve been around any sort of color grading forums or conferences, you’ve definitely heard some version of this: Film is so much better than digital. While I don’t completely disagree with the sentiment, let’s be real. We are in a digital age, and the efficiency and cost associated with digital recording is, in most cases, far superior to film.

Personally, I love the way film looks; it has an essence that is very difficult to duplicate — from the highlight roll-offs to the organic grain — but it is very costly. That essence is why film is so hard to imitate digitally, and why so many companies try and often fail.

Sony A7iii footage

One company that has had grassroots success with digital film stock emulation is FilmConvert. The original plugin, known as FilmConvert Pro, works with Adobe’s Premiere and After Effects, Avid Media Composer and as an OFX plugin for apps like Blackmagic’s DaVinci Resolve.

Recently, FilmConvert expanded its lineup with the introduction of Nitrate, a film emulation plugin that can take Log-based video and transform it into fully color-corrected media with natural grain similar to that of commonly loved film stocks. Currently, Nitrate works with Premiere and After Effects, with an OFX version for Resolve. A plugin for FCPX is coming in March.

The original FilmConvert Pro plugin works great, but it adjusts your image through an sRGB pipeline. That means FilmConvert Pro applies any color effects after your “base” grade is locked in, while living in an sRGB world. And while you can download camera-specific “packs” that apply the film emulation — custom-made for your sensor and color space — you are still locked into an sRGB pipeline with little wiggle room. That sometimes means blown-out highlights and muddied shadows, with little ability to recover any detail.

Sony A7iii footage

I imagine FilmConvert Pro was introduced at a time when a lot of users shot with cameras like the Canon 5D or other sRGB cameras that weren’t shooting in a Log color space. Think of using a LUT and trying to adjust the highlights and shadows after the LUT; typically, you will have a hard time getting any detail back, losing dynamic range even if your footage was shot Log. But if you color before a LUT (think Log footage), you can typically recover a lot of information as long as your shot was recorded properly. That blown-out sky might be recoverable if it was shot in a Log color space. This is what FilmConvert is solving with its latest offering, Nitrate.

How It Works
FilmConvert’s Nitrate works in a Cineon-Log processing pipeline for its emulation, as well as a full Log image processing pipeline. This means your highlights and shadows are not heavily compressed into an sRGB color space, which lets you fine-tune them without losing as much detail. Simply put, the plugin works more naturally with your footage.
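A toy example shows why the order of operations matters. The log2 curve below is just a stand-in for a real camera or Cineon log curve, and the sample values are invented for illustration; this is not FilmConvert’s actual math.

import numpy as np

# Scene-linear highlight samples; values above 1.0 carry real detail.
linear = np.array([0.5, 1.0, 2.0, 4.0])

# Display-referred path: clamp to [0, 1] first, then pull exposure
# down a stop. The clamped highlights stay identical, so detail is lost.
display = np.clip(linear, 0.0, 1.0)
print(display * 0.5)        # [0.25 0.5  0.5  0.5 ]

# Log path: encode, grade in log, then decode. Separation survives.
log = np.log2(linear)       # stand-in for a camera/Cineon log curve
print(2.0 ** (log - 1.0))   # [0.25 0.5  1.   2.  ]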

In additional updates, FilmConvert has overhauled its GUI to be more natural and fluid. The Color Wheels have been redesigned, a new color tint slider has been added to quickly remove any green or magenta color cast, a new Color Curve control has been added, and there is now a Grain Response curve.

Grain Response

The Grain Response curve takes adding grain to your footage up a notch. Not only can you select between 8mm and 35mm grain sizes (with many more in between) but you can adjust the application of that grain from shadows to highlights. If you want your highlights to have more grain response, just point the Grain Response curve higher up. In the same window you can adjust the grain size, softness, strength and saturation via sliders.
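Conceptually, the Grain Response curve is a luminance-dependent weight on the grain signal. The Python sketch below illustrates that idea with white noise and np.interp; FilmConvert’s actual grain comes from film scans, so treat the function and control points here as invented for the example.

import numpy as np

rng = np.random.default_rng(1)

def apply_grain(luma, curve_x, curve_y, strength=0.04):
    # Evaluate the user-drawn response curve at each pixel's luminance.
    response = np.interp(luma, curve_x, curve_y)
    grain = rng.normal(0.0, 1.0, luma.shape)  # white noise stands in for grain
    return np.clip(luma + strength * response * grain, 0.0, 1.0)

ramp = np.tile(np.linspace(0.0, 1.0, 512), (512, 1))  # shadows to highlights
out = apply_grain(ramp, curve_x=[0.0, 0.5, 1.0], curve_y=[0.2, 0.4, 1.0])

With those control points, grain strength rises toward the highlights, matching the article’s example of pointing the curve higher at the top end.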

Of the 19 film emulation options to choose from, there are many unique and great-looking presets — from “KD 5207 Vis3” to “Plrd 600” — spanning multiple brands and film stocks. For instance, “KD 5207 Vis3” is based on Kodak’s Vision3 250D 5207 stock, which Kodak’s website describes in more detail:

“Vision3 250D Film offers outstanding performance in the extremes of exposure — including increased highlight latitude, so you can move faster on the set and pull more detail out of the highlights in post. You’ll also see reduced grain in shadows, so you can push the boundaries of underexposure and still get outstanding results.”

One of my favorite emulations in Nitrate — “Fj Velvia 100” or Fujichrome Velvia 100 — is described on FilmConvert’s website:

“FJ Velvia 100 is based on the Fujichrome Velvia 100 photographic film stock. Velvia is a daylight-balanced color reversal film that provides brighter ultra-high-saturation color reproduction. The Velvia is especially suited to scenery and nature photography as well as other subjects that require precisely modulated vibrant color reproduction.”

Accurate Grain

FilmConvert’s website offers a full list of the 19 film stocks, as well as examples and detailed descriptions of each film stock.

Working With FilmConvert Nitrate
I used Nitrate strictly in Premiere Pro because the OFX version (specifically for Resolve) wasn’t available at the time of this review.

Nitrate works pretty well inside of Premiere, and surprisingly plays back fluidly — this is probably thanks to its GPU acceleration. Even with Sony a7 III UHD footage, Premiere was able to keep up with Lumetri Color layered underneath the FilmConvert Nitrate plugin. To be transparent I tested Nitrate on a laptop with an Intel i7 CPU and an Nvidia RTX 2080 GPU, so that definitely helps.

At first, I struggled to see where I would fit FilmConvert’s Nitrate plugin into my normal workflow so I could color correct my own footage and add a grade later. However, when I started cycling through the different film emulations, I quickly saw that they were adding a lot of life to the images and videos. Whether it was the grain that comes from the updated 6K grain scans in Nitrate or the ability to identify which camera and color profile you used when filming via the downloadable camera packs, FilmConvert’s Nitrate takes well-colored footage and elevates it to finished film levels.

It’s pretty remarkable; I came in thinking FilmConvert was essentially a preset LUT plugin and wasn’t ready for it to be great. To my surprise, it was great and it will add the extra edge of professional feeling to your footage quickly and easily.

Test 1
In my first test, I threw some clips I had shot on a Sony a7 III camera in UHD (at SLog3 — SGamut3) into a timeline, applied the FilmConvert Nitrate plugin and realized I needed to download the Sony camera packs. This pack was about 1GB, but others — like the Canon 5D Mark II — came in at just over 300MB. Not the end of the world, but if you have multiple cameras, you are going to need to download quite a few packs, and the download size will add up.

Canon 5D

I tried using just the Nitrate plugin to do color correction and film emulation from start to finish, but I found the tools a little cumbersome and not really my style. I am not the biggest fan of Lumetri color correction tools, but I used those to get a base grade and apply Nitrate over that grade. I tend to like to keep looks to their own layer, so coloring under Nitrate was a little more natural to me.

A quick way to cycle through a bunch of looks is to apply Nitrate to an adjustment layer and hit the up or down arrows. As I was flicking through the different looks, I noticed that FilmConvert does a great job processing the film emulations for the specified camera. All of the emulations looked good with or without a color balance done ahead of time.

It’s like adding a LUT and then a grade all in one spot. I was impressed by how quickly this worked and how good they all looked. When I was done, I rendered my one-minute sequence out of Adobe Media Encoder, which took 45 seconds to encode a ProRes HQ file and 57 seconds for an H.264 at 10Mb/s. For reference, the uncolored version of this sequence took 1:17 for the ProRes HQ and :56 for the H.264 at 10Mb/s. Interestingly, the Nvidia RTX 2080 GPU definitely kicked in more when the FilmConvert Nitrate effect was added. That’s a definite plus.

Test 2
I also shot some clips using the Blackmagic Pocket Cinema Camera (BMPCC) and the Canon 5D Mark II. With the BMPCC, I recorded CinemaDNG files in the film color space, essentially Log. With the 5D, the videos were recorded as QuickTime movie (.mov) files (unless you shoot with the Magic Lantern hack, which allows you to record in a raw format). I brought in the BMPCC CinemaDNG files via the Media Browser, imported the 5D MOVs, and applied the FilmConvert Nitrate plugin to the clips. Keep in mind you will need to download and install those camera packs if you haven’t already.

Pocket Cinema Camera

For the BMPCC clips I identified the camera and model as appropriate and chose “Film” under profile. It seemed to turn my CinemaDNG files a bit too orange, which could have been my white balance settings and/or the CinemaDNG processing done by Premiere. I could swing the orange hue out by using the temperature control, but it seemed odd to have to knock it down to -40 or -50 for each clip. Maybe it was a fluke, but with some experimentation I got it right.

With the Canon 5D Mark II footage, I chose the corresponding manufacturer and model as well as the “Standard” profile. This worked as it should. But I also noticed some other options like Prolost, Marvel, VisionTech, Technicolor, Flaat and Vision Color — these are essentially color profiles people have made for the 5D Mark II. You can find them with a quick Google search.

Summing Up
In the end, FilmConvert’s Nitrate will elevate your footage. The grain looks smooth and natural, the colors in the film emulations add a modern take on nostalgic color corrections (without looking too cheesy), and most cameras are supported via downloads. If you don’t have a large budget for a color grading session, you should be throwing $139 at FilmConvert for its Nitrate plugin.

Nitrate in Premiere

When testing Nitrate on a few different cameras, I noticed that it even made color matching between cameras a little bit more consistent. Even if you have a budget for color grading, I would still suggest buying Nitrate; it can be a great starting block to send to your colorist for inspiration.

Check out FilmConvert’s website and definitely follow the company on Instagram, where it is very active and shows a lot of before-and-afters from users — another great source of inspiration.

Main Image: Two-year-old Oliver Betzel shot with a Canon 5D with KD P400 Ptra emulsion applied.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producer’s Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Talking with Franki Ashiruka of Nairobi’s Africa Post Office

By Randi Altman

After two decades of editing award-winning film and television projects for media companies throughout Kenya and around the world, Franki Ashiruka opened Africa Post Office, a standalone post house in Nairobi, Kenya. The studio provides color grading, animation, visual effects, motion graphics, compositing and more. In addition, it maintains a database of the Kenyan post production community, allowing it to ramp up with the right artists when the need arises.

Here she talks about the company, its workflow and being a pioneer in Nairobi’s production industry.

When did you open Africa Post Office, and what was your background prior to starting this studio?
Africa Post Office (APO) opened its doors in February 2017. Prior to starting APO, I was a freelance editor with plenty of experience working with well-established media houses such as Channel 4 (UK), Fox International Channels (UK), 3D Global Leadership (Nigeria), PBS (USA), Touchdown (New Zealand), Greenstone Pictures (New Zealand) and Shadow Films (South Africa).

In terms of Kenya-based projects, I’ve worked with a number of production houses including Quite Bright Films, Fat Rain Films, Film Crew in Africa, Mojo Productions, Multichoice, Zuku, Content House and Ginger Ink Films.

I imagine female-run, independent studios in Africa are rare?
On the contrary, Kenya has reached a point where more and more women are emerging as leaders of their own companies. I actually think there are more women-led film production companies than male-led ones. The real challenge was that before APO, there was nothing quite like it in Nairobi. Historically, video production here was very vertical — if you shot something, you’d also need to manage post within whatever production house you were working in. There were no standalone post houses until us. That said, even though it was hugely daunting, with my experience I never thought twice about starting APO. It is what I have always wanted to do, and if being the first company of our kind didn’t intimidate me, being female was never going to be a hindrance.

L-R: Franki Ashiruka, Kevin Kyalo, Carole Kinyua and Evans Wenani

What is the production and post industry like in Nairobi? 
When APO first opened, the workload was commercial-heavy, but in the last two years that has steadily declined. We’re seeing this gap filled by documentary films, corporate work and television series. Feature films are also slowly gaining traction and becoming the focus of many up-and-coming filmmakers.

What services do you provide, and what types of projects do you work on?
APO has a proven track record of successful delivery on hundreds of film and video projects for a diverse range of clients and collaborators, including major corporate entities, NGOs, advertising and PR agencies, and television stations. We also have plenty of experience mastering according to international delivery standards. We’re proud to house a complete end-to-end post ecosystem of offline and online editing suites.

Most importantly, we maintain a very thorough database of the post production community in Kenya.
This is of great benefit to our clients, who come to us for a range of services including color grading, animation, visual effects, motion graphics and compositing. We are always excited to collaborate with the right people and get additional perspectives on the job at hand. One of our most notable collaborators is Ikweta Arts (Avatar, Black Panther, Game of Thrones, Hacksaw Ridge), owned and run by Yvonne Muinde. They specialize in VFX services with a focus on quality matte painting/digital environments, art direction, concept and post visual development art. We also collaborate with Keyframe (L’Oréal, BMW and Mitsubishi Malaysia) for motion graphics and animation.

Can you name some recent projects and the work you provided?
We are incredibly fortunate to be able to select projects that align with our beliefs and passions.

Our work on the short film Poacher (directed by Tom Whitworth) won us three global Best Editing Awards from the Short to the Point Online Film Festival (Romania, 2018), Feel the Reel International Film Festival (Glasgow, 2018) and Five Continents International Film Festival (Venezuela, 2019).

Other notable work includes three feature documentaries for the Big Story segment on China Global Television Network, directed by Juan Reina (director of the Netflix Original film Diving Into the Unknown); Lion’s Den (Quite Bright Films), an adaptation of ABC’s Shark Tank; and The Great Kenyan Bake Off (Showstopper Media), adapted from the BBC series The Great British Bake Off. We also worked on Disconnect, a feature film produced by Kenya’s Tosh Gitonga (Nairobi Half Life), a director who is passionate about taking Africa’s budding film industry to the next level. We have also worked on a host of television commercials for clients across East Africa, including Kenya, Rwanda, South Sudan and Uganda.

What APO is most proud of though, is our clients’ ambitions and determination to contribute toward the growth of the African film industry. This truly resonates with APO’s mantra.

You recently added a MAM and some other gear. Can you talk about the need to upgrade?
Bringing on the EditShare EFS 200 nodes has significantly improved the collaborative possibilities of APO. We reached a point where we were quickly growing, and the old approach just wasn’t going to cut it.

Prior to centralizing our content, projects lived on individual hard disks. This meant that if I was editing and needed my assistant to find me a scene or a clip, or I needed VFX on something, I would have to export individual clips to different workstations. This created workflow redundancies and increased potential for versioning issues, which is something we couldn’t afford to be weighed down with.

The remote capabilities of the EditShare system were very appealing as well. Our color grading collaborator, Nic Apostoli of Comfort and Fame, is based in Cape Town, South Africa. From there, he can access the footage on the server and grade it while the client reviews with us in Nairobi. Flow media asset management also helps in this regard. We’re able to effectively organize and index clips, graphics, versions, etc. into clearly marked folders so there is no confusion about what media should be used. Collaboration among the team members is now seamless regardless of their physical location or tools used, which include the Adobe Creative Suite, Foundry Nuke, Autodesk Maya and Maxon Cinema 4D.

Any advice for others looking to break out on their own and start a post house?
Know what you want to do, and just do it! Thanks Nike …


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Directing bookend sequences for Portals, a horror anthology film

By Hasraf “HaZ” Dulull

Portals is a genre-bending feature film anthology focusing on a series of worldwide blackouts — after which millions of mysterious objects appear everywhere across the planet. While many flee from the sentient objects, some people are drawn toward and into them with horrifying consequences.

Portals

The film was in the final stages of post when writer/director Liam O’Donnell (Beyond Skyline and the upcoming Skylines film) called to see if I would like to get involved and direct some bookend sequences to add more scope and setup, which the producers felt was very much needed. I loved the premise and the world of the anthology, so I said yes. I pitched an idea for an ending that quickly evolved into an extra segment at the end of the film, which I directed. That’s why there are officially four directors on the show, with me getting executive producer and “end-segment created by” credits.

Two of the other sequences are around 20 to 25 minutes each, and O’Donnell’s sequence was around 35 minutes; the film is 85 minutes long. Eduardo Sanchez and Gregg Hale (The Blair Witch Project) co-directed their segments. So the anthology feature film is really three long segments plus my bookend sequences. The only connections among all the stories are the objects that appear, the event itself and the actual “portal”; everything else was unique to each segment’s story. In terms of production, the only consistencies throughout the anthology were the camera language — that slight hand-held feel — and, of course, the music/sound.

I had to watch the latest cut of the entire anthology film to get my head into that world, but I was given freedom to bring my own style to my sequences. That is exactly the point of an anthology — for each director to bring his or her own sensibilities to the individual segments. Besides Liam, the main producers I worked closely with on this project were Alyssa Devine and Griffin Devine from Pigrat Productions. They are fans of my first feature film, The Beyond, so they really encouraged the grounded tone I had demonstrated in that film.

The portal in Portals.

I’ve been a huge advocate of Blackmagic cameras and technology for a long time. Additionally, I knew I had a lot to shoot in a very short time (two days!), so I needed a camera that was light and flexible yet able to shoot 4K. I brought on cinematographer Colin Emerson, who shoots in a very loose way but always makes his stuff look cinematic. We watched the cut of the film and noticed the consistent loose nature of the cinematography on all the segments. Colin uses the Fig Rig a lot, and I love the way that rig works; the BMD Pocket Cinema 4K fits nicely on it along with the DSLR lenses he likes to use. The other reason was to be able to use Blackmagic’s new BRaw format too.

We also shot the segment with a skeleton crew, which comprised me as director/producer; VFX supervisor/1st AD John Sellings, who also did some focus pulling; sound recordist James De Taranto; DP/camera op Colin Emerson; FX makeup artists Kate Griffiths and Jay James; and our two actors, Georgina Blackledge and Dare Emmanuel. I worked with both of them on my feature film The Beyond.

The Post
One thing I wanted to make sure of was that the post team at The Institution in LA could take my Resolve files and literally work from them for the picture post. One of the things I did during prep (before we even cast) was shoot some tests to show what I had in mind in terms of look and feel. We also tested the BRaw and color workflow between my setup in London and the LA team; Colin and I did this during the location recce. This proved extremely useful in ensuring we set our camera to the exact specs the post house wanted. So we shot at 23.98fps in 4K (4096×1716), cropped to 2.39:1, in Blackmagic Design’s log color space.
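As a quick sanity check on that spec, the 1716-line height follows from cropping the 4096-pixel width to roughly 2.39:1; the rounding up from an exact 1714 is presumably for even, codec-friendly dimensions (my assumption):

width = 4096
print(width / 2.39)   # ~1713.8 lines for an exact 2.39:1 crop
print(width / 1716)   # ~2.387:1, the delivered "scope" ratio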

HaZ’s segments were captured with the Blackmagic Pocket Cinema Camera.

During the test, I did some quick color passes to show the producers in LA the tone and mood I was going for and to make sure everyone was on board before I shot. The look was very post-apocalyptic, as it’s set after the main events have happened. I wanted the locations to contrast with each other: one interior, and one exterior with greens.

Colin is used to shooting most of his stuff on the Panasonic GH, but he had the Pocket Cinema Camera and was looking for the right project to use it on. He found he could use all of his usual lenses because the Pocket Cinema Camera has the same mount. Lenses used were the Sigma 18-35mm f/1.8 with a Metabones Speed Booster, the Olympus 12mm f/2 and the Lumix 35-100mm f/2.8.

Colin used the onboard monitor screen on the Pocket Cinema Camera, while I used a tethered external monitor — the Ikan DH5e — for directing. We used a 1TB Samsung external SSD securely attached to the rig cage along with a 64GB CFast card. The resolution we shot in was determined by the tests we did. We set up the rushes for post after each of the two days of the shoot, so during the day we would swap out drives and back things up. At the end of the day, we would bring in all the picture and sound rushes and use the amazing autosync feature in Blackmagic DaVinci Resolve to set it all up. This way, when I headed back home I could start editing right away inside Resolve.

Resolve

I have to admit, we were hesitant at first because I was shooting and capturing Log in QuickTime ProRes 4:4:4:4, and I had always avoided DNG raw because of the huge file sizes and data transfer. But the team at Blackmagic was supportive right up to the end of the shoot, and after testing BRaw I was impressed. We had so much control, as all that information is accessible within Resolve. I was able to set the temp look during editing, and the colorist worked from there. Skin tones were of utmost importance; because of the intimate nature of the drama, I wanted a natural look, and I am really happy with the way they came out in the end.

The LA post team couldn’t believe how cinematic the footage looked when we told them we had shot it on the Pocket Cinema Camera, since the other segments were shot on cameras like Red. We delivered to the same 4K spec as the other segments in the film.

HaZ on set, second from right.

I used the AMD Radeon RX Vega 56 version of the Blackmagic eGPU because I wanted to edit on my MacBook Pro (late 2017) and needed the power to run 4K in real time. I was impressed with how much power it provided; it was like having a new MacBook Pro without having to buy one. The eGPU also had all the connectivity I needed (two Thunderbolt and four USB 3), which the MacBook Pro on its own lacks.

The beauty of keeping everything native was that there wasn’t much work to do when porting, as it’s just plug and play: Resolve detects the eGPU, which you can then set as the default. The BRaw format makes it manageable to preview and play back in realtime, and since it’s native, Resolve doesn’t need to do any transcoding in the background. I have always been a huge fan of the tracking in Resolve, and I was able to do eye effects very easily without them being budgeted or done as VFX shots. I was able to get the VFX render assets from the visual effects artist (Justin Martinez) in LA and do quick slap comps during editing. I love that I can set looks and store them as memories, which I can then recall very quickly and apply to a bunch of shots. This let me show a slick-looking preview rough cut of the film.
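Stored looks can also be applied in bulk through the scripting API rather than shot by shot. The sketch below is an illustration under assumptions, not what I ran on Portals: it supposes the look was exported as a .drx grade file (the path is hypothetical) and uses Timeline.ApplyGradeFromDRX, a real API call, to push it across every clip on a track.

```python
# Sketch: apply a saved look (.drx grade) to all clips on video track 1.
# The .drx path is hypothetical; ApplyGradeFromDRX is part of Resolve's API.
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()
timeline = project.GetCurrentTimeline()

# Every item on video track 1 of the current timeline.
clips = timeline.GetItemListInTrack("video", 1)

# gradeMode 0 applies the grade without keyframes.
timeline.ApplyGradeFromDRX("/Volumes/LOOKS/portals_temp_look.drx", 0, clips)
```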

Portals

I sent a hard drive containing all the organized rushes to the team in LA while I was doing the final tweaks to the edit. Once the edit was signed off, or if any last-minute notes came in, I would address them and email the team my Resolve file. It was super simple, and the colorist (Oliver Ojeil) and post team (Chad Van Horn and Danny Barone) in LA appreciated the workflow because there really wasn’t any conforming for them to do apart from a one-click relink of the media location; they would just take my Resolve file and start working away with it.
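That relink step can even be scripted on the receiving end. Here is a hedged sketch, assuming the LA drive mounts at a different path (the mount point below is hypothetical) and that all clips live in the root bin; RelinkClips is a genuine media pool call in Resolve’s API.

```python
# Sketch: after opening the emailed project, point every clip in the root
# bin at the local copy of the rushes. The mount path is hypothetical.
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()
media_pool = project.GetMediaPool()

# Clips in subfolders would need a recursive walk of GetSubFolderList().
clips = media_pool.GetRootFolder().GetClipList()
media_pool.RelinkClips(clips, "/Volumes/PORTALS_RUSHES")
```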

We used practical effects to keep the horror as real and grounded as possible and used VFX to augment it further. We were fortunate to get special effects makeup artist Kate Griffiths, who, given the tight schedule, was able to create a terrifying effect, which I won’t give away. You need to watch the film to see it! We had to shoot the makeup-FX-heavy shots at the end of the day, which meant being smart about how we scheduled the shoot given the hours-long makeup process. Kate was also on hand to provide effects like the liquid coming out of the eyes, sweat and so on — every detail of which the camera picked up for us so we could bring it out in the grade.

The Skype-style shots at the start of the film (phone and computer monitor shots) had their VFX screen elements placed on a separate layer so the post team in LA could grade them separately and control the filters applied to them. For some of the wide shots showing our characters entering and leaving the portal, we keyframed movement of the 4K shot along with motion blur to give the effect of in-camera movement. I also used the camera shake effect within Resolve, which comes with many options for creating bespoke movement on static frames.

Portals is now available on iTunes and other VOD platforms.


HaZ Dulull is known for his sci-fi feature films The Beyond and 2036 Origin Unknown, as well as in television for his pilot and episodes of Disney’s Fast Layne. He is currently busy with projects at various stages of development and production at his production company, hazfilm.com.

Review: HP’s ZBook G6 mobile workstation

By Brady Betzel

In a year that’s seen AMD reveal an affordable 64-core processor with its Threadripper 3, it appears as though we are picking up steam toward next-level computing.

Apple finally released its much-anticipated Mac Pro (which comes with a hefty price tag for the 1.5TB upgrade), and custom-build workstation companies — like Boxx and Puget Systems — can customize good-looking systems to fit any need you can imagine. Additionally, over the past few months, I have seen mobile workstations leveling the playing field with their desktop counterparts.

HP is well-known in the M&E community for its powerhouse workstations. Since I started my career, I have worked on either a Mac Pro or an HP, and both have their strong points. However, for workstation users who must be able to travel with their systems, there have always been technical abilities you had to give up in exchange for a smaller footprint. That is, until now.

The newly released HP ZBook 15 G6 has become the rising tide that will float all the boats in the mobile workstation market. I know I’ve said it before, but the classification of “workstation” is technically much more than a term companies throw around. Systems with workstation-level classification (at least from HP) are meant to be powered on and run at high levels 24 hours a day, seven days a week, 365 days a year.

They are built with high-quality, enterprise-level components, such as ECC (error correcting code) memory. ECC memory detects and corrects errors on the fly, preventing things like blue screens of death and other system freezes. ECC memory comes at a cost, which is why these workstations are priced a little higher than a standard computer system. In addition, the warranties are a little more inclusive — the HP ZBook 15 G6 comes with a standard three-year on-site service warranty.

Beyond the “workstation” classification, the ZBook 15 G6 is amazingly powerful, brutally strong and incredibly colorful and bright. But what really matters is under the hood. I was sent a ZBook 15 G6 that retails for $4,096 and contains the following specs:
– Intel Xeon E-2286M (eight cores/16 threads — 2.4GHz base/5GHz Turbo)
– Nvidia Quadro RTX 3000 (6GB VRAM)
– 15.6-inch UHD HP DreamColor display, anti-glare, WLED-backlit, 600 nits, 100% DCI-P3
– 64GB DDR4 2667MHz
– 1TB PCIe Gen 3 x4 NVMe SSD TLC
– FHD webcam 1080p plus IR camera
– HP collaboration keyboard with dual point stick
– Fingerprint sensor
– Smart Card reader
– Intel Wi-Fi 6 AX 200, 802.11ac 2×2 +BT 4.2 combo adapter (vPro)
– HP long-life battery four-cell 90 Wh
– Three-year limited warranty

The ZBook 15 G6 is a high-end mobile workstation with a price that reflects it. However, as I said earlier, true workstations are built to withstand constant use and, in this case, abuse. The ZBook 15 G6 has been designed to pass up to 21 extensive MIL-STD 810G tests, which is essentially worst-case-scenario testing: drop testing from around four feet, sand and dust testing, radiation testing (the sun beating down on the laptop for an extended period) and much more.

The exterior of the G6 is made of aluminum and built to withstand abuse. The latest G6 is a little bulky/boxy, in my opinion, but I can see how it would hold up to some bumps and bruises, all while working at blazingly fast speeds, so bulk isn’t a huge issue for me. Because of that bulk, this isn’t the lightest laptop either: it weighs in at 5.79 pounds for the lowest-end configuration and measures 1 inch by 14.8 inches by 10.4 inches.

On the bottom of the workstation is an easy-to-access panel for performing repairs and upgrades yourself. I really like the bottom compartment. I opened it and noticed I could throw in an additional NVMe drive and an SSD if needed. You can also access memory here. I love this because not only can you perform easy repairs yourself, but you can perform upgrades or part replacements without voiding your warranty on the original equipment. I’m glad to see that HP kept this in mind.

The keyboard is smaller than a full-size version but has a number keypad, which I love using when typing in timecodes. It is such a time-saver for me. (I credit entering in repair order numbers when I fixed computers at Best Buy as a teenager.) On the top of the keyboard are some handy shortcuts if you do web conferences or calls on your computer, including answering and ending calls. The Bang & Olufsen speakers are some of the best laptop speakers I’ve heard. While they aren’t quite monitor-quality, they do have some nice sound on the low end that I was able to fine-tune in the Bang & Olufsen audio control app.

Software Tests
All right, enough of the technical specs. Let’s get to what people really want to know — how the HP ZBook 15 G6 performs in apps like Blackmagic’s DaVinci Resolve and Adobe Premiere Pro. I used sample Red and Blackmagic Raw footage that I use a lot in testing. You can grab the Red footage here and the BRaw footage here. (Keep in mind you will need to download the BRaw software to edit with BRaw inside of Adobe products, which you can find here.)

Performance monitor while exporting in Resolve with VFX.

For testing in Resolve and Premiere, I strung out one minute each of 4K, 6K and 8K Red media in one sequence and the 4608×2592 4K and 6K BRaw media in another. In the middle of my testing, Resolve received a major Red API upgrade that allows better realtime playback of Red raw files if you have an Nvidia CUDA-based GPU.

First up was Resolve 16.1.1, then Resolve 16.1.2. Both sequences are set to UHD (3840×2160) resolution. For each codec, one sequence contains just color correction, while another contains effects and color correction. The Premiere sequence with color and effects contains basic Lumetri color correction, noise reduction (50) and a Gaussian blur with a setting of 0.4. In Resolve, the only difference in the color-and-effects sequence is that the noise reduction is spatial and set to Enhanced, Medium and 25/25.

In Resolve, the 4K Red media would play in realtime, while the 6K (RedCode 3:1) would drop to about 14fps to 15fps and the 8K (RedCode 7:1) would play at 10fps at full resolution with just color correction. With effects, the 4K media would play at 20fps, the 6K at 3fps and the 8K at 10fps. The Blackmagic Raw video would play in realtime with just color correction and at around 3fps to 4fps with effects.

This is where I should mention just how loud the fans in the ZBook 15 G6 can get. When running exports and benchmarks, the fans are noticeable and a little distracting. Obviously, these were high-end, processor- and GPU-intensive tests, but the noise still stood out. However, the bottom of the mobile workstation did not get terribly hot, unlike the MacBook Pros I’ve tested before. So my lap was not on fire.

In my export testing, I used the same sequences as before, exporting from Adobe Premiere Pro 2020 via Adobe Media Encoder. I exported UHD files in different containers and codecs: H.264 (MOV), H.265 (MOV), ProRes HQ, DPX, DCP and MXF OP1a (XDCAM). The MXF OP1a export was 1920x1080p.
Here are my results:

Red (4K, 6K, 8K)
– Color only: H.264 – 5:27; H.265 – 4:45; ProRes HQ – 4:29; DPX – 3:37; DCP – 10:38; MXF OP1a – 2:31
– Color, noise reduction (50), Gaussian blur 0.4: H.264 – 4:56; H.265 – 4:56; ProRes HQ – 4:36; DPX – 4:02; DCP – 8:20; MXF OP1a – 2:41

Blackmagic Raw
– Color only: H.264 – 2:05; H.265 – 2:19; ProRes HQ – 2:04; DPX – 3:33; DCP – 4:05; MXF OP1a – 1:38
– Color, noise reduction (50), Gaussian blur 0.4: H.264 – 1:59; H.265 – 2:22; ProRes HQ – 2:07; DPX – 3:49; DCP – 3:45; MXF OP1a – 1:51

What is surprising is that when adding effects like noise reduction and a Gaussian blur in Premiere, the export times stayed similar. While using the ZBook 15 G6, I noticed my export times improved when I upgraded driver versions, so I re-did my tests with the latest Nvidia drivers to make sure I was consistent. The drivers also solved an issue in which Resolve wasn’t reading BRaw properly, so remember to always research drivers.

The Nvidia Quadro RTX 3000 really pulled its weight when editing and exporting in both Premiere and Resolve. In fact, in previous versions of Premiere, I noticed that the GPU was not really being used as well as it should have been. With the Premiere Pro 2020 upgrade it seems like Adobe really upped its GPU usage game — at some points I saw 100% GPU usage.

In Resolve, I performed similar tests, but instead of ProRes HQ I exported a DNxHR QuickTime file, and instead of a DCP, an IMF package. For the most part, these are stock exports from Resolve’s Deliver page, except that I forced video levels and set debayer and resizing to highest quality. Here are my results from Resolve versions 16.1.1 and 16.1.2 (16.1.2 results are in parentheses):

Red (4K, 6K, 8K)
– Color only: H.264 – 2:17 (2:31); H.265 – 2:23 (2:37); DNxHR – 2:59 (3:06); IMF – 6:37 (6:40); DPX – 2:48 (2:45); MXF OP1a – 2:45 (2:33)
– Color, noise reduction (Spatial, Enhanced, Medium, 25/25), Gaussian blur 0.4: H.264 – 5:00 (5:15); H.265 – 5:18 (5:21); DNxHR – 5:25 (5:02); IMF – 5:28 (5:11); DPX – 5:23 (5:02); MXF OP1a – 5:20 (4:54)

Blackmagic Raw
– Color only: H.264 – 0:26 (0:25); H.265 – 0:31 (0:30); DNxHR – 0:50 (0:50); IMF – 3:51 (3:36); DPX – 0:46 (0:46); MXF OP1a – 0:23 (0:22)
– Color, noise reduction (Spatial, Enhanced, Medium, 25/25), Gaussian blur 0.4: H.264 – 7:51 (7:53); H.265 – 7:45 (8:01); DNxHR – 7:53 (8:00); IMF – 8:13 (7:56); DPX – 7:54 (8:18); MXF OP1a – 7:58 (7:57)

Interesting to note: exporting Red footage with color correction only was significantly faster from Resolve, but for Red footage with effects applied, export times were similar between Resolve and Premiere. With the CUDA Red SDK update in Resolve 16.1.2, I thought I would see a large improvement, but I didn’t: I saw an approximate 10% increase in playback but no improvement in export times.
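For anyone repeating this kind of test, the Resolve side can be scripted so that each run is queued and timed identically. Below is a minimal sketch under stated assumptions: the format and codec strings, output path and render settings keys are illustrative rather than the exact settings I used, though the scripting calls themselves are part of Resolve’s API.

```python
# Sketch: queue one export from Resolve's Deliver page and time it.
# Format/codec strings and the output path are illustrative assumptions.
import time
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()

# Pick a container/codec and basic render settings for this pass.
project.SetCurrentRenderFormatAndCodec("mov", "H264")
project.SetRenderSettings({
    "TargetDir": "/Volumes/EXPORTS",   # hypothetical output folder
    "CustomName": "red_color_only",
})

job_id = project.AddRenderJob()
start = time.time()
project.StartRendering(job_id)

# Poll until the job finishes, then report wall-clock export time.
while project.IsRenderingInProgress():
    time.sleep(1)
print("Export took %d seconds" % (time.time() - start))
```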

Puget

Puget Systems has some great benchmarking tools, so I reached out to Matt Bach, Puget Systems’ senior labs technician, about my findings. He suggested that the mobile Xeon could still be the bottleneck for Resolve; in his testing he saw a larger increase in speed with AMD Threadripper 3- and Intel i9-based systems. Then again, I am going deep on realtime playback of 8K Red raw media on a mobile workstation — what a time we are in. Blackmagic Raw footage was insanely fast when exporting out of Resolve, while export times for Blackmagic Raw footage with effects were higher than I expected. There was consistent use of both the GPU and CPU in Resolve, much like in the new version of Premiere 2020, which is a trend that’s nice to see.

In addition to Premiere and Resolve testing, I ran some common benchmarks that provide a good 30,000-foot view of the HP ZBook 15 G6 when comparing it to other systems. I decided to use the Puget Systems benchmarking tools. Unfortunately, at the time of this review, the tools were only working properly with Premiere and After Effects 2019, so I ran the After Effects benchmark using the 2019 version. The ZBook 15 G6 received an overall score of 802, render score of 79, preview score of 75.2 and tracking score of 86.4. These are solid numbers that beat out some desktop systems I have tested.

Corona

To test some 3D applications, I ran Cinebench R20, which gave a CPU score of 3243, a CPU (single-core) score of 470 and an M/P ratio of 6.90x. I recently began running the Gooseberry benchmark scene in Blender to get a better sense of 3D rendering performance; it took 29:56 to render. Using the Corona benchmark, the system took 2:33 to render 16 passes at 3,216,368 rays/s. In Octane Bench, the ZBook 15 G6 received a score of 139.79. In the V-Ray benchmark it scored 9,833 ksamples for CPU and 228 mpaths in GPU testing. I’m not going to lie; I don’t know a lot about what all these benchmarks are trying to tell me, but they might help you decide whether this is the mobile workstation for your work.

Cinebench

One benchmark that showed an interesting difference between driver updates for the Nvidia Quadro RTX 3000 was Neat Bench from Neat Video — the noise reduction plugin for video. It measures whether your system should use the CPU, the GPU or a combination thereof to run Neat Video. Initially, the best combination result was to use the CPU only (seven cores) at 11.5fps.

After updating to the latest Nvidia drivers, the best combination result was to use the CPU (seven cores) and GPU (Quadro RTX 3000) together at 24.2fps. That’s a pretty incredible jump from just a driver update. Moral of the story: always make sure you have the correct drivers!

Summing Up
Overall, the HP ZBook 15 G6 is a powerful mobile workstation that will work well across the board. From 3D to color correction apps, the Xeon processor in combination with the Quadro RTX 3000 will get you running 4K video without a problem. With the HP DreamColor anti-glare display offering up to 600 nits of brightness and covering 100% of the DCI-P3 color space, coupled with the HDR option, you can rely on the attached display for color accuracy if you don’t have your output monitor attached. And with features like two USB Type-C ports (Thunderbolt 3 plus DisplayPort 1.4 plus USB 3.1 Gen 2), you can connect external monitors for a larger view of your work.

The HP Fast Charge feature will get you out of a dead-battery fiasco with the ability to go from a 0% to 50% charge in 45 minutes. All of this for around $4,000 seems a pretty low price to pay, especially because it includes a three-year on-site warranty and because the device is certified through HP’s independent software vendor (ISV) verifications to work seamlessly with many of the apps pros use.

If you are looking for a mobile workstation upgrade, are moving from desktop to mobile or want an alternative to a MacBook Pro, you should price a system out online.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

The Mill opens boutique studio in Berlin

Technicolor’s The Mill has officially launched in Berlin. The new boutique studio is located in the heart of the city, in the creative hub of Mitte, near many of Germany’s agencies, production companies and brands.

The Mill has been working with German clients for years. Recent projects include the Mercedes’ Bertha Benz spot with director Sebastian Strasser; Netto’s The Easter Surprise, directed in-house by The Mill; and BMW The 8 with director Daniel Wolfe. The new studio will bring The Mill’s full range of creative services from color to experiential and interactive, as well as visual effects and design.

The Mill Berlin crew

Creative director Greg Spencer will lead the creative team. He is a multi-award-winning creative, having won several VES, Cannes Lions and British Arrow awards. His recent projects include Carlsberg’s The Lake, PlayStation’s This Could Be You and Eve Cuddly Toy. Spencer also played a role in some of Mill Film’s major titles: he was the 2D supervisor for Les Misérables and also worked on the Lord of the Rings trilogy. His resume also includes campaigns for brands such as Nike and Samsung.

Executive producer Justin Stiebel moves from The Mill London, where he has been since early 2014, to manage client relationships and new business. Since joining the company, Stiebel has produced spots such as Audi’s Next Level and Mini’s “The Faith of a Few” campaign. He has also collaborated with directors such as Sebastian Strasser, Markus Walter and Daniel Wolfe while working on brands like Mercedes, Audi and BMW.

Sean Costelloe is managing director of The Mill London and The Mill Berlin.

Main Image Caption: (L-R) Justin Stiebel and Greg Spencer