

Behind the Title: Unit9 director Matthew Puccini

This young director has already helmed two short films, Dirty and Lavender, that got into Sundance. And he still finds time to edit when his schedule allows. 

Name: Director Matthew Puccini

Can you describe Unit9?
Unit9, which has headquarters in London and Los Angeles, is a global production company that represents a team of producers and film directors, creative and art directors, designers, architects, product designers, software engineers and gaming experts. I’m based in Brooklyn.

Puccini on set of Dirty

What would surprise people the most about what falls under the title of director?
These days, there’s a certain element of self-promotion that’s required to be a young director. We have to figure out how to brand ourselves in a way that people might not have had to do 10 to 15 years ago when the Internet wasn’t as prevalent in how people discovered new artists. I constantly have to be tip-toeing back and forth between the creative side of the work and the more strategic side — getting the work seen and amplified as much as possible.

What’s your favorite part of the job?
My favorite part of directing is the collaborative aspect of it. I love that it offers this unique ability to dip into so many other disciplines and to involve so many other incredible, wildly different people.

What’s your least favorite?
The salesperson aspect of it can be frustrating. In a perfect world it would be nice to just make things and not have to worry about the back end of finding an audience. But at the same time, sometimes being forced to articulate your vision in a way that’s palatable to a financier or a production company can be helpful in figuring out what the core of the idea is. It’s a necessary evil.

Why did you choose this profession? How early on did you know this would be your path?
I fell in love with directing in high school. We had an amazing theater program at my school. I started off mainly acting, and then there was one show where I ended up being the assistant director instead of acting. That experience was so wonderful and fulfilling and I realized that I preferred being on that side of things. That happened parallel to getting my first video camera, which I enjoyed as a hobby but began to take more seriously during my junior and senior years of high school.

What was it about directing that attracted you?
I fell in love with working with actors to craft performances. The whole process requires so much collaboration and trust and vulnerability. Over time, I’ve also grown to appreciate filmmaking as a means of filling in gaps in representation. I get to highlight human experiences that I feel like I haven’t seen properly portrayed before. It’s wish fulfillment, in a sense; you get to make the work that you wish you were seeing as an audience member.

Puccini on set of Lavender

How do you pick the people you work with on a particular project?
I began making work while I was in school in New York, so there’s a wonderful community of people that I met in college and who I still work with often. I also continue to meet new collaborators at film festivals, or will occasionally just reach out to someone after having seen a film of theirs that I responded to. I continue to be amazed by how willing people are to make time for something if they believe in it, even if it seems like it’s far beneath their pay grade.

How do you work with your DP?
It always just starts with me sending them the script and having a meeting to talk about the story. I might have some preconceived ideas going into that meeting about how I’m thinking of shooting it — what my visual references were while writing the script — but I try to stay open to what they imagined when they were reading it. From there, it’s a very organic process of us pulling references and gradually building a look book together of colors, lighting styles, compositions and textures.

It could be as specific as a frame that we want to copy exactly or as general as a feeling that an image evokes, but the idea is that we’re figuring out what our shared vocabulary is going to be before we get to set. My number one criterion is knowing that the person is just as passionate about the story as I am and is able to tailor their shooting style to what’s right for that particular project.

Do you get involved with the post at all?
Definitely. I’m very involved with every stage of post, working closely with the department heads who are running the show on a more granular level. I love the post process and enjoy being involved as much as possible.

I also work as a video editor myself, which has given me so much awareness and respect for the importance of a good edit and a good editor. I think sometimes it’s easy to waste time and resources on shooting coverage you’re never going to use. So as a director, it’s important even before starting a project for me to think ahead and visualize what the film really needs so that I can be as efficient and decisive as possible on set.

Dirty

Can you talk about Dirty? What was it like getting it ready for Sundance?
We found out that Dirty got into Sundance last November — obviously, it’s the call of anyone’s dreams and such a wonderful feeling and boost of validation. We had finished the film back in April, so it had been a long time of waiting.

From November to the festival it was a rush to get the film ready — we got it re-colored and re-mixed, trying to make it as good as possible before it premiered. It was a bit of a whirlwind. The festival itself was a really special experience. It was incredibly powerful to have a film that, in my mind, pushes the boundaries of what we’re seeing on screen, and to get to share it with a lot of people. There’s a gay sex scene in the middle of the film, and to have that celebrated and accepted by an important part of the film community was really special.

Can you describe the film?
Dirty is a tender coming-of-age film. It follows two queer teenagers over an afternoon as they navigate intimacy for the first time.

What about Lavender? Do you have a distributor for that?
The film was acquired by Searchlight Pictures out of Sundance last year — they released the film on their Vimeo and YouTube channels last spring. They put the film in theaters for a week in NYC and LA in front of a feature film they were showing, which actually qualified it for the Oscars last year.

Can you describe that film?
The film is about a young gay man who is growing increasingly entangled in the marriage of an older couple. It is the portrait of an unconventional relationship as it blossoms and ultimately unravels.

What is the project that you are most proud of?
To me, Dirty and Lavender are both equally important; I don’t have an answer. I’m grateful for both films for different reasons, and they are both part of one period of my life — exploring these ideas of intimacy and loneliness and queer people seeking connection. In some ways they’re almost two attempts to answer the same question.

Name three pieces of technology you can’t live without.
My laptop for all of the writing and editing I do. I try to watch a lot of movies, so I enjoy my TV. And even though I’m trying to wean myself off my phone as much as possible, I still rely on that throughout the day. Obvious answers I know, but it’s true!

What do you do to de-stress from it all?
I find that watching movies and seeing a lot of theater are often the best ways to get inspired and excited about making new work. I’m trying to meditate more. Starting the day with something like that and building out some introspection into my routine has been really helpful. And therapy, of course. Gotta have therapy.

Emma DP Christopher Blauvelt talks looks and workflow

Focus Features’ Emma, the latest adaptation of the Jane Austen novel, was directed by Autumn de Wilde and shot by cinematographer Christopher Blauvelt. For de Wilde, a photographer and music video director, Emma is her feature film directorial debut. In addition to her still work on CD covers for The White Stripes, Fiona Apple and Beck, she has directed music videos for indie bands, including Spoon, The Decemberists and Rilo Kiley.

DP Christopher Blauvelt (right)

Blauvelt and de Wilde met in 2000 when she directed Elliott Smith’s Son of Sam music video. He then shot some 16mm footage that was projected behind Elliott on his tour, and the collaborators became friends. When it came time to start on her directorial debut, de Wilde reached out, knowing that he could help her bring her vision to the screen.

Emma was shot with the ARRI Alexa LF using ARRI Signature Prime lenses. Blauvelt had done some tests with it a year or so ago for a film he shot with Gus Van Sant, called Don’t Worry, He Won’t Get Far on Foot, so he was familiar with the camera. “We looked at many different cameras to find our aesthetic,” Blauvelt explains. “I remember making a note about the softness and way it treated skin. It was also something I would think about for scope and scale, for which Emma provided the environments in the form of castles and the English countryside. Of course, we didn’t just test cameras — Autumn had given me many references to use as our guide in finding the unique look of Emma, so we tested many different lenses, cameras, filters, lights and lookup tables.”

Principal photography began in March 2019. Blauvelt hadn’t worked on a feature film in the UK before but was fortunate enough to team up with Mission, a UK-based DIT and digital services company, who assisted in setting up a workflow and a color pipeline that ensured that the director and DP’s vision was communicated to everyone. Mission’s Jacob Robinson was the DIT.

DITs have become a more and more important part of the camera crew and often build close working relationships with DPs. Designing the look of a production is a collaborative process that often begins in preproduction. “I really enjoy my working relationships with DITs; they are the people I rely on to inform me on the rules we’ve put in place on any particular shoot,” says Blauvelt. “Usually during prep, we will do an enormous amount of testing and come up with the recipe that we decide on for the shoot.”

The final DI colorist is often part of the mix too. For Emma, Goldcrest’s Rob Pizzey was the DI colorist, so he was also involved in creating the color pipeline and LUTs with Blauvelt and DIT Robinson. As Blauvelt explains, “It’s also really great having the chance to create custom LUTs with our final grade colorist. We work hard to have a formula that works while on location and all the way to the final grade.”

There are several different ways for a DP to work with LUTs. In Blauvelt’s case, during testing the team created several different base LUTs, including day/exterior, day/interior, night/exterior, night/interior, day/exterior in clouds and sun and other variations they might encounter during the shoot. “These LUTs are all adjustable as well,” he continues, “and will be manipulated live on set to achieve the desired look. We also have images to serve as our spirit throughout the shoot to remind us of the original intent as well.”
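The live trims Blauvelt describes are commonly expressed as ASC CDL values layered under the base LUT. As a rough illustration of the math involved — the function and the numbers below are hypothetical, not taken from the Emma pipeline — a CDL is just a per-channel slope/offset/power adjustment followed by a saturation stage:

```python
# Minimal sketch of the ASC CDL transform a DIT might trim live on set.
# Slope, offset, power and saturation values here are illustrative only.

def apply_cdl(rgb, slope=(1.0, 1.0, 1.0), offset=(0.0, 0.0, 0.0),
              power=(1.0, 1.0, 1.0), saturation=1.0):
    """Apply the ASC CDL equation per channel -- out = clamp(in * slope
    + offset) ** power -- then a Rec.709-weighted saturation adjustment."""
    graded = []
    for v, s, o, p in zip(rgb, slope, offset, power):
        v = min(max(v * s + o, 0.0), 1.0)  # clamp before the power function
        graded.append(v ** p)
    # Rec.709 luma weights for the saturation stage
    luma = 0.2126 * graded[0] + 0.7152 * graded[1] + 0.0722 * graded[2]
    return tuple(luma + saturation * (c - luma) for c in graded)
```

In practice the DIT’s grading software applies this per clip and exports the values as CDL sidecar files, which is how an on-set look can travel with the footage all the way to the final grade.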

The digital lab process on Emma was straightforward. Every day, DIT Robinson would send the capture drives from set along with a Blackmagic DaVinci Resolve project with CDLs applied. Mission’s Niall Todd and Neil Gray were tasked with creating synced H264s and DNxHD 115 files for Avid. The data was then backed up to dual LTOs and a G-Tech 10TB G-Drive.
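As a sketch of what that transcode step involves — assuming a stock ffmpeg-based pipeline rather than Mission’s actual tooling, with hypothetical file and LUT names — the two deliverables map to command lines like these:

```python
# Sketch of a dailies transcode: camera originals in, Avid-friendly
# DNxHD 115 and an H.264 review file out. These are plausible ffmpeg
# recipes, not Mission's actual pipeline; paths are hypothetical.

def dnxhd_cmd(src, dst):
    """Build an ffmpeg command for DNxHD 115 (1080p, 8-bit 4:2:2) in an
    MXF wrapper, the flavor Avid Media Composer expects for editorial."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "dnxhd", "-b:v", "115M",   # DNxHD at the 115 Mb/s rate
        "-vf", "scale=1920:1080",          # DNxHD 115 is a 1080-line format
        "-pix_fmt", "yuv422p",             # 8-bit 4:2:2
        "-c:a", "pcm_s16le",               # uncompressed audio for the edit
        dst,
    ]

def h264_cmd(src, dst, lut=None):
    """Build an ffmpeg command for an H.264 review file, optionally baking
    in the set look as a 3D LUT so dailies reflect the on-set grade."""
    cmd = ["ffmpeg", "-i", src]
    if lut:
        cmd += ["-vf", "lut3d=" + lut]     # apply the look LUT, if any
    cmd += ["-c:v", "libx264", "-crf", "18", "-c:a", "aac", dst]
    return cmd
```

Each command would be run with `subprocess.run(...)`, clip by clip, before the media is backed up to the LTO tapes and the G-Drive.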

Mission’s on-set to near-set process made it simple for Blauvelt’s vision to be conveyed to everyone. “Mark Purvis has created a collective and creative environment with Mission that I had not experienced before.”

The digital intermediate process was straightforward, with Goldcrest’s Pizzey using DaVinci Resolve to complete the final grading. Emma, which was edited by Nick Emerson, is now streaming.


Sebastian Robertson, Mark Johnson on making Playing For Change’s The Weight

By Randi Altman

If you have any sort of social media presence, it’s likely that you have seen Playing For Change’s The Weight video featuring The Band’s Robbie Robertson, Ringo Starr, Lukas Nelson and musicians from all over the world. It’s amazing, and if you haven’t seen it, please go watch it now. Right now. Then come back and read how it was made.

L-R: Mark Johnson, Robbie Robertson, Sebastian Robertson, Raan Williams and Robin Moxey

The Weight was produced by Mark Johnson and Sebastian Robertson, Robbie’s son. It was a celebration of the 50th anniversary of The Band’s first studio album, Music From Big Pink, where the song “The Weight” first appeared. Raan Williams and Robin Moxey were also producers on the project.

Playing For Change (PFC) was co-founded by Johnson and Whitney Kroenke in 2002 with the goal to share the music of street musicians worldwide. And it seems the seed of the idea involved the younger Robertson and Johnson. “Mark Johnson is an old friend of mine,” explains Robertson. “I was sitting around in his apartment when he initially conceived the idea of Playing For Change. At first, it was a vehicle that brought street musicians into the spotlight, then it became world musicians, and then it evolved into a big musical celebration.”

Johnson explains further: “Playing For Change was born out of the idea that no matter how many things in life divide us, they will never be as strong as the power of music to bring us all together. We record and film songs around the world to reconnect all of us to our shared humanity and to show the world through the lens of music and art.” Pretty profound words considering current events.

Mermans Mosengo – Kinshasa Congo

Each went on with their busy lives, Robertson as a musician and composer, and Johnson traveling the world capturing all types of music. They reconnected a couple of years ago, and the timing was ideal. “I wanted to do something to commemorate the 50th anniversary of The Band’s Music From Big Pink — this beautiful album and this beautiful song that my dad wrote — so I brought it to Mark,” says Robertson. “I wanted to team up with some friends, and we all came together to do something really special for him. That was the driving force behind the production of this video.”

To date, Playing For Change has created over 50 “Songs Around the World” videos — including The Grateful Dead’s Ripple and Jimi Hendrix’s All Along the Watchtower — and recorded and filmed over 1,000 musicians across more than 60 countries.

The Weight is beautifully shot and edited, featuring amazingly talented musicians, interesting locales and one of my favorite songs to sing along to. I reached out to Robertson and Johnson to talk through the production, post and audio post.

This was a big undertaking. All those musicians and locales… how did you choose the musicians that were going to take part in it?
Robertson: First, some friends and I went into the studio to record the very basic tracks of the song — the bass, drums, guitar, a piano and a scratch vocal. The first instrument that was added was my dad on rhythm and lead guitar. He heard this very kind of rough demo version of what we had done and played along with it. Then, slowly along the way, we started to replace all those rough instruments with other musicians around them. That’s basically how the process worked.

Larkin Poe – Venice, California

Was there an audition process, or people you knew, like Lukas Nelson and Marcus King? Or did Playing For Change suggest them?
Robertson: Playing For Change was responsible for the world musicians, and I brought in artists like Lukas, my dad, Ringo and Larkin Poe. PFC has this incredible syndicate of world musicians, so there was no auditioning; we knew they were going to be amazing. We brought what we had, they added their flavor, and then the song started to take on a new identity because of all these incredible cultures added to it. And it just so happened that Lukas was in Los Angeles because he had been recording up at Shangri-La in Malibu. My friend Eric (Lynn) runs that studio, so we got in touch. Then we filmed Lukas.

Is Shangri-La where you initially went to record the very basic parts of the song?
Robertson: It is. The funny and kind of amazing coincidence is that Shangri-La was The Band’s clubhouse in the ’70s. Since then, producer Rick Rubin has taken over. That’s where the band recorded the studio songs of The Last Waltz (film). That’s where they recorded their album, Northern Lights – Southern Cross. Now, here we are 50 years later, recording The Weight.

Mark, how did you choose the locations for the musicians? They were all so colorful and visually stunning.
Johnson: We generally try to work with each musician to find an outdoor location that inspires them and a place that can give the audience a window into their world. Not every location is always so planned out, so we do a lot of improvising to find a suitable location to record and film music live outside.

Shooting Marcus King in Greenville, South Carolina

What did you shoot on? Did you have one DP/crew or use some from all over the world? Were you on set?
Johnson: Most of the PFC videos are recorded and filmed by one crew (Guigo Foggiatto and Joe Miller), including myself, an additional audio person and two camera operators. We work with a local guide to help us find both musicians and locations. We filmed The Weight around the world in 4K with Sony A7 cameras — one side angle, one zoom and a Ronin for more motion.

How did you capture the performances from an audio aspect, and who did the audio post?
Johnson: We record all the musicians around the world live and outside using the same mobile recording studio we’ve used since the beginning of our “Song Around the World” videos over 10 years ago. The only thing that has changed is the way we power everything. In the beginning it was golf cart batteries and then car batteries with big heavy equipment, but fortunately it evolved into lightweight battery packs.

We primarily use Grace mic preamps and Schoeps microphones, and our recording mantra comes from a good friend and musician named Keb’ Mo’. He once told us, “Sound is a feeling first, so if it feels good it will always sound good…” This inspires us to help the musicians to feel comfortable and aware that they are performing along with other musicians from around the world to create something bigger than themselves.

One interesting thing that often comes from this project that differs from life in the studio is that the musicians playing on our songs around the world tend to listen more and play less. They know they are only a part of the performance and so they try to find the best way to fit in and support the song without any ego. This reality makes the editing and mixing process much easier to handle in post.

Lukas Nelson – Austin, Texas

The Weight was recorded by the Playing For Change crew and mixed by Greg Morgenstein, Robin Moxey, Sebastian and me.

What about the editing? All that footage and lining up the song must have been very challenging. I’m assuming cutting your previous videos has given you a lot of experience with this.
Johnson: That is a great question; the editing is one of the most challenging and rewarding parts of the process. It can get really complicated because we have three cameras per shoot/musician and sometimes many takes of each performance. And sometimes we comp the audio — for example, the first section came from Take 1, the second from Take 6, etc. — and we need to match the video to correspond to each different audio take/performance. We always rough-mix the music first in Avid Pro Tools and then find the corresponding video takes in Adobe Premiere. Whenever we return from a trip, we add the new layer to the Pro Tools session and then to the video edit, building the song as we go.

The Weight was a really big audio session in Pro Tools, with so many tracks and options to choose from as to who would play what fill or riff and who would sing each verse, and the video session was also huge, with about 20 performances around the world combined with all the takes that go along with them. One of the best parts of the process for me is soloing all the various instruments from around the world and hearing how amazingly they all fit together.

You edited this yourself? And who did the color grade?
Johnson: The video was colored by Jon Walls and Yasuhiro Takeuchi on Blackmagic DaVinci Resolve and edited by me, along with everyone’s help, using Premiere. The entire song and video took over a year to make, so we had time throughout the process to work together on the rough mixes and rough edits from each location and build it brick by brick as we went along the journey.

Sherieta Lewis and Roselyn Williams – Trenchtown, Jamaica

When your dad is on the bench playing and wearing headphones — and the other artists as well — what are they listening to? Are they listening to the initial sort of music that you recorded in studio, or was it as it evolved, adding the different instruments and stuff? Is that what he was listening to and playing along to?
Robertson: Yeah. My dad would listen to what we recorded, except in his case we muted the guitar, so he was now playing the guitar part. Then, as elements from my dad and Ringo are added, those [scratch] elements were removed from what we would call the demo. So then as it’s traveling around the world, people are hearing more and more of what the actual production is going to be. It was not long before all those scratch tracks were gone and people were listening to Ringo and my dad. Then we just started filling in with the singers and so on and so forth.

I’m assuming that each artist played the song from start to finish in the video, or at least for the video, and then the editor went in and cut different lines together?
Robertson: Yes and no. For example, we asked Lukas to do a very specific part as far as singing. He would sing his verse, and then he would sing a couple choruses and play guitar over his section. It varied like that. Sometimes when necessary, if somebody is playing percussion throughout the whole song, then they would listen to it from start to finish. But if somebody was just being asked to sing a specific section, they would just sing that section.

Rajeev Shrestha – Nepal

How was your dad’s reaction to all of it? From recording his own bit to watching it and listening to the final?
Robertson: He obviously came on board very early because we needed to get his guitar, and we wanted to get him filmed at the beginning of the process. He was kind of like, “I don’t know what the hell you guys are doing, but it seems cool.” And then by the time the end result came, he was like, “Oh my God.” Also, the response that his friends and colleagues had to it… I think they had a similar response to what you had, which is, A, how the hell did you do this? And, B, this is one of the most beautiful things I’ve ever seen.

It really is amazing. One of my favorite parts of the video is the very end, when your dad’s done playing, looks up and has that huge smile on his face.
Robertson: Yeah. It’s a pulling-at-the-heart-strings moment for me, because that was really a perfect picture of the feeling that I had when it all came together.

You’re a musician as well. What are you up to these days?
Robertson: I have a label under the Universal Production Music umbrella, called Sonic Beat Records. The focus of the label is on contemporary, up-to-the-minute super-slick productions. My collaboration with Universal has been a great one so far; we just started in the fall of 2019, so it’s really new. But I’m finding my way in that family, and they’ve welcomed me with open arms.

Another really fun collaboration was working with my dad on the score for Martin Scorsese’s The Irishman. That was a wonderful experience for me. I’m happy with how the music that we did turned out. Over the course of my life, my dad and I haven’t collaborated that much. We’ve just been father and son, and good friends, but as of late, we’ve started to put our forces together, and that has been a lot of fun.

L-R: Mark Johnson and Ahmed Al Harmi – Bahrain

Any other scores on the horizon?
Robertson: Yeah. I just did another score for a documentary film called Let There Be Drums!, which is a look into the mindset of rock and roll drummers. My friend, Justin Kreutzmann, directed it. He’s the son of Bill Kreutzmann, the drummer of the Grateful Dead. He gave me some original drum tracks of his dad’s and Mickey Hart’s, so I would have all these rhythmic elements to play with, and I got to compose a score on top of Mickey Hart and Bill Kreutzmann’s percussive and drumming works. That was a thrill of a lifetime.

Any final thoughts? And what’s next for you, Mark?
Johnson: One of the many amazing things that came out of making this video was our partnership with Sheikh Abdulla bin Hamad bin Isa Al Khalifa of Bahrain, who works with us to help end the stereotype of terrorism through music by including musicians from the Middle East in our videos. In The Weight, you watch an oud master in Bahrain cut to a sitar master in Nepal, followed by Robbie Robertson and Ringo Starr, and they all work so well together.

One of the best things about Playing For Change is that it never ends. There are always more songs to make, more musicians to record and more people to inspire through the power of music. One heart and one song at a time…


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years.


Finishing artist Tim Nagle discusses work on indie film Miss Juneteenth

Lucky Post Flame artist Tim Nagle has a long list of projects under his belt, including collaborations with David Lowery — providing Flame work on the short film Pioneer as well as finishing and VFX work on Lowery’s motion picture A Ghost Story. He is equally at home working on spots, such as campaigns for AT&T, Hershey’s, The Home Depot, Jeep, McDonald’s and Ram.

Nagle began his formal career on the audio side of the business, working as an engineer for Solid State Logic, where he collaborated with clients including Fox, Warner Bros., Skywalker, EA Games and ABC.

Tim Nagle

We reached out to Nagle about his and Lucky Post’s work on the feature film Miss Juneteenth, which premiered at Sundance and was recently honored by SXSW 2020 as the winner of the Louis Black Lone Star award.

Miss Juneteenth was directed (and written) by Channing Godfrey Peoples — her first feature-length film. It focuses on a woman from the south — a bona fide beauty queen once crowned Miss Juneteenth, a title commemorating the day slavery was abolished in Texas. The film follows her journey as she tries to hold onto her elegance while striving to survive. She looks for ways to thrive despite her own shortcomings as she marches, step by step, toward self-realization.

How did the film come to you?
We have an ongoing relationship with Sailor Bear, the film’s producing team of David Lowery, Toby Halbrooks and James Johnston. We’ve collaborated with them on multiple projects, including The Old Man & The Gun, directed by Lowery.

What were you tasked to do?
We were asked to provide dailies transcoding, additional editorial, VFX, color and finishing and ultimately delivery to distribution.

How often did you talk to director Channing Godfrey Peoples?
Channing was in the studio, working side by side with our creatives, including colorist Neil Anderson and me, to get the project completed for the Sundance deadline. It was a massive team effort, and we felt privileged to help Channing with her debut feature.

Without spoilers, what most inspires you about the film?
There’s so much to appreciate in the film — it’s a love letter to Texas, for one. It’s directed by a woman, has a single mother at its center and is a celebration of black culture. The LA Times called it one of the best films to come out of Sundance 2020.

Once you knew the film was premiering at Sundance, what was left to complete and in what amount of time?
This was by far the tightest turnaround we have ever experienced. Everything came down to the wire, sound being the last element. It’s one of the advantages of having a variety of talent and services under one roof — the creative collaboration was immediate, intense and really made possible by our shorthand and proximity.

How important do you think it is for post houses to be diversified in terms of the work they do?
I think diversification is important not only for business purposes but also to keep the artists creatively inspired. Lucky Post’s ongoing commitment to support independent film, both financially and creatively, is an integrated part of our business along with brand-supported work and advertising. Increasingly, as you see greater crossover of these worlds, it just seems like a natural evolution for the business to have fewer silos.

What does it mean to you as a company to have work at Sundance? What kinds of impact do you see — business, morale and otherwise?
Having a project that we put our hands on accepted into Sundance was such an honor. It is unclear what the immediate and direct business impacts might be, but for morale, this is often where the immediate value is clear. The excitement and inspiration we all get from projects like this just naturally makes how we do business better.

What software and hardware did you use?
On this project we started with Assimilate Scratch for dailies creation. Editorial was done in Adobe Premiere. Color was Blackmagic DaVinci Resolve, and finishing was done in Autodesk Flame.

What is a piece of advice that you’d give to filmmakers when considering the post phase of their films?
We love being involved as early as possible — certainly not to get in anyone’s way, but to be in the background supporting the director’s creative vision. I’d say get with a post company that can assist in setting looks and establishing a workflow. With a little bit of foresight, this will create the efficiency you need to deliver with the utmost quality in what always ends up being a tight deadline.


Workstations and Color Grading

By Karen Moltenbrey

A workstation is a major investment for any studio. Today, selecting the right type of machine for the job can be a difficult process. There are many brands and flavors on the market, and some facilities even opt to build their own. Colorists have several tools available to them when it comes to color grading, ranging from software-based systems (which typically use a multiprocessor workstation with a high-end GPU) to those that are hardware-based.

Here, we examine the color workflow of two different facilities: Technicolor Vancouver and NBCUniversal StudioPost in Los Angeles.

[Editor’s note: These interviews were conducted before the coronavirus work limits were put in place.]

Anne Boyle

Technicolor Vancouver
Technicolor is a stalwart in the post industry, with its creative family — including VFX studios MPC, The Mill, Mr. X and Mikros — and wide breadth of post production services offered in many locations around the world. Although Technicolor Vancouver has been established for some time now, it was only within the past two years that the decision was made to offer finishing services again there, with an eye toward becoming more of a boutique operation, albeit one offering top-level effects.

With this in mind, Anne Boyle joined as a senior colorist, and immediately Technicolor Vancouver began a co-production with Technicolor Los Angeles. The plan was for the work to be done in Vancouver, with review and supervision handled in LA. “So we hit the ground running and built out new rooms and bought a lot of new equipment,” says Boyle. “This included investing in FilmLight Baselight, and we quickly built a little boutique post finishing house here.”

This shared-location work setup enabled Technicolor to take advantage of the lucrative tax credits offered in Vancouver. The supervising colorist in LA reviews sessions with the client, after which she and Boyle discuss them, and then Boyle picks up the scene and performs the work based on those conversations or notes in the timeline. A similar process occurs for the Dolby SDR deliverables. “There isn’t much guesswork. It is very seamless,” she says.

“I’ve always used Baselight,” says Boyle, “and was hoping to go that route when I got here, and then this shared project happened, and it was on a Baselight [in LA]. Happily for me, the supervising colorist, Maxine Gervais, insisted that we mirror the exact setup that they had.”

Gervais was using a Baselight X system, so that is what was installed in Vancouver. “It’s multi-GPU (six Nvidia Titan XPs) with a huge amount of storage,” she says. “So we put in the same thing and mimicked the infrastructure in LA. They also put in a Baselight Assist station and plan to upgrade it in the coming months to make it color-capable as well.”

The Baselight X turnkey system ships with bespoke storage and processing hardware, although Technicolor Vancouver loaded it with additional storage. For the grading panels, the company went with the top-of-the-line Blackboard. The Vancouver facility also purchased the same displays as LA — Sony BVM-X300s.

Messiah

The mirrored setup was necessary for the shared work on Netflix’s Messiah, an HDR project that dropped January 1. “We had to deliver 10 episodes all at once in 4K, [along with] both the HDR PQ masters and the Dolby SDR deliverable, which were done here as well,” explains Boyle. “So we needed the capability to store all of that and all of those renders. It was quite a VFX-heavy show, too.”

Using Pulse, Technicolor’s internal cloud-based system, the data set is shared between the LA and Vancouver sites. Technicolor staff can pull down the data, and VFX vendors can pull their own VFX shots too. “We had exact mirrors of the data. We were not sending project files back and forth, but rather, we shared them,” Boyle explains. “So anyone could jump on the project, whether in Vancouver or LA, and immediately open the project, and everything would appear instantaneously.”

When it comes to the hardware itself, speed and power are big factors. As Boyle points out, the group handles large files, and slowdowns, render issues and playback hiccups are unacceptable.

Messiah

The color system proved its mettle on Messiah, which required a lot of skin retouching and other beauty work. “The system is dedicated and designed only for colorists,” says Boyle. “And the tools are color-focused.”

Indeed, Boyle has witnessed drastic changes in color workstations over the past several years. File sizes have increased thanks to Red 8K and other raw formats, which have driven the need for more powerful machines and more powerful GPUs, particularly with the increasingly complex HDR workflows, wherein floating-point precision is necessary for good color. “More work nowadays needs to be performed on the GPU,” she adds. “You just can’t have enough power behind you.”
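Boyle’s point about floating point and HDR is concrete in the math: HDR PQ masters like those for Messiah use the SMPTE ST 2084 (PQ) transfer function, which maps absolute luminance up to 10,000 nits into a 0-to-1 signal. Here is a minimal sketch using the standard’s published constants — a generic illustration, not Technicolor’s pipeline:

```python
# SMPTE ST 2084 (PQ) constants, as published in the standard
m1 = 2610 / 16384          # ~0.1593
m2 = 2523 / 4096 * 128     # ~78.8438
c1 = 3424 / 4096           # ~0.8359
c2 = 2413 / 4096 * 32      # ~18.8516
c3 = 2392 / 4096 * 32      # ~18.6875

def pq_encode(nits: float) -> float:
    """Encode absolute luminance (cd/m^2, up to 10,000) to a PQ signal in [0, 1]."""
    y = max(nits, 0.0) / 10000.0
    return ((c1 + c2 * y**m1) / (1 + c3 * y**m1)) ** m2
```

For example, 100-nit diffuse white (classic SDR reference) lands at roughly half of the PQ signal range, which is why HDR grades need so much more headroom — and precision — above it.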

NBCUniversal StudioPost
NBCUniversal StudioPost knows a thing or two about post production. Not only does the facility provide a range of post, sound and finishing services, but it also offers cutting-edge equipment rentals and custom editorial rooms used by internal and third-party clients.

Danny Bernardino

Specifically, NBCUniversal offers end-to-end picture services that include dailies, editorial, VFX, color correction, duplication and encoding/decoding, data management, QC, sound, sound editorial, sound supervision, mixing and streaming.

Each area has a plethora of workstations and systems needed to perform its given tasks. For the colorists, the facility offers two choices, both on Linux OS: a Blackmagic DaVinci Resolve 16.1.2 (fully loaded with a creative suite of plugins and add-ons) running on an HP Z840 machine, and Autodesk Lustre 2019 running on an HP Z820.

“We look for a top-of-the-line color corrector that has a robust creative tool set as well as one that is technically stable, which is why we prefer Linux-based systems,” says Danny Bernardino, digital colorist at NBCUniversal StudioPost. Furthermore, the facility prefers a color corrector that adapts to new file formats and workflows by frequently updating its versions. Another concern is that the system works in concert with all of the ever-changing display demands, such as UHD, 4K, HDR and Dolby Vision.

Color bay

According to Bernardino, the color systems at NBCUniversal are outfitted with the proper CPU/GPU and SAN storage connectivity to ensure efficient image processing, thereby allowing the color talent to work without interruption. The color suites also are outfitted with production-level video monitors that represent true color. Each has high-quality scopes (waveform, vector and audio) that handle all formats.

When it comes time to select machines for the colorists there, it is a collective process, says senior VP Thomas Thurau. First, the company ascertains the delivery requirements, and then the color talent, engineering and operations staff work together to configure the proper tool sets for the customers’ content. How often the equipment is replaced is contingent on whether new image and display technology has been introduced.

Thurau defines a solid colorist workstation as a robust platform that is Linux-based and has enough card slots or expansion chassis capabilities to handle four or more GPU cards, Fibre Channel cards and more. “All of our systems are in constant demand, from compute to storage, thus we look for systems and hardware that are robust through to delivery,” he notes.

Mr. Robot

NBCUniversal StudioPost is always humming with various work. Some of the more recent projects there include Jaws, which was remastered in UHD/HDR, Casino (UHD/HDR), the How to Train Your Dragon series (UHD/HDR) and an array of Alfred Hitchcock’s more famous films. The company also services broadcast episodic (NBCU and others) and OTT/streaming customers, offering a full suite of services (Avid, picture and sound). This includes Law & Order: SVU, Chicago Med, Will & Grace, Four Weddings and a Funeral and Mr. Robot, among others.

“We take incredible pride in all aspects of our color services here at NBCUniversal StudioPost, and we are especially pleased with our HDR grades,” says Thurau.

For those who prefer to do their own work, NBCUniversal has over 185 editorial rooms, ranging from small to large suites, set up with Avid Media Composer.


Karen Moltenbrey is a veteran writer, covering visual effects and post production.


Colorist Chat: Framestore LA senior colorist Beau Leon

Veteran colorist Beau Leon recently worked with director Spike Jonze on a Beastie Boys documentary and a spot for cannabis retailer MedMen.

What’s your title and company?
I’m senior colorist at LA’s Framestore.

Spike Jonze’s MedMen

What kind of services does Framestore offer?
Framestore is a multi-Oscar-winning creative studio founded over 30 years ago, and the services offered have evolved considerably over the decades. We work across film, television, advertising, music videos, cinematic data visualization, VR, AR, XR, theme park rides… the list is endless and continues to change as new platforms emerge.

As a colorist, what would surprise people the most about what falls under that title?
Regardless of the creative direction or the equipment used to shoot something, whether for film or TV, people might not factor in how much color or tone can dictate the impact a story has on its audience. As a colorist, my role often involves acting as a mediator of sorts between various creative stakeholders to ensure everyone is on the same page about what we’re trying to convey, as it can translate differently through color.

Are you sometimes asked to do more than just color on projects?
Earlier in my career, the process was more collaborative with DPs and directors who would bring color in at the beginning of a project. Now, particularly when it comes to commercials with tighter deadlines and turnarounds, many of those conversations happen during pre-production without grading factored in until later in the pipeline.

Rihanna’s Needed Me

Building strong relationships and working on multiple projects with DPs or directors always allows for more trust and creative control on my end. Some of the best examples I’ve seen of this are on music video projects, like Rihanna’s Needed Me, which I graded here at Framestore for a DP I’d grown up in the industry with. That gave me the opportunity to push the creative boundaries.

What system do you work on?
FilmLight Baselight

You recently worked on the new Beastie Boys documentary, Beastie Boys Story. Can you talk a bit about what you did and any challenges relating to deadlines?
I’ve been privileged to work with Spike Jonze on a number of projects throughout my career, so going into Beastie Boys Story, we already had a strong dialogue. He’s a very collaborative director and respectful of everyone’s craft and expertise, which can be surprisingly rare within our industry.

Spike Jonze’s Beastie Boys Story

The unique thing about this project was that, with so much old footage being used, it needed to be mastered in HDR as well as reworked for IMAX. And with Spike being so open to different ideas, the hardest part was deciding which direction to choose. Whether you’re a hardcore Beastie Boys fan or not, the documentary is well worth watching when it airs on Apple TV+ in April.

Any suggestions for getting the most out of a project from a color perspective?
As an audience, our eyes have evolved a great deal over the last few decades. I would argue that most of what we see on TV and film today is extremely oversaturated compared to what we’d experience in our real environment. I think it speaks to how we treat consumers and anticipate what we think they want — colorful, bright and eye-catching. When it’s appropriate, I try to challenge clients to think outside those new norms.

How do you prefer to work with the DP or director?
Whether it’s working with a DP or director, the more involved I can be early on in the conversation, the more seamless the process becomes during post production, which ultimately leads to a better end result. In my experience, this type of access is more common when working on music videos.

Most one-off commercial projects see us dealing with an agency more often than the director, but an exception that comes to mind is my collaboration with Spike Jonze on The New Normal, the first-ever brand campaign for cannabis retailer MedMen. He placed an important emphasis on grading and was very open to my recommendations and vision.

How do you like getting feedback in terms of the look?
A conversation is always the best way to receive feedback versus a written interpretation of imagery, which tends to become very personal. An example might be when a client wants to create the feeling of a warm climate in a particular scene. Some might interpret that as adding more warm color tones, when in fact, if you think about some of the hottest places you’ve ever visited, the sun shines so fiercely that it casts a bright white hue.

What’s your favorite part of the job?
That’s an easy answer — to me, it’s all about the amazing people you meet in this industry and the creative collaboration that happens as a result. So many of my colleagues over the years have become great friends.

Any least favorites?
There isn’t much that I don’t love about my job, but I have witnessed a change over the years in the way that our industry has begun to undervalue relationships, which I think is a shame.

If you didn’t have this job, what would you be doing instead?
I would be an art teacher. It combines my passion for color and visual inspiration with a forum for sharing knowledge and fostering creativity.

How early did you know this would be your path?
In my early 20s, I started working on dailies (think The Dukes of Hazzard, The Karate Kid, Fantasy Island) at a place in The Valley that had a telecine machine that transferred at a frame rate faster than anywhere else in LA at the time. It was there that I started coloring (without technically realizing that was the job I was doing, or that it was even a profession).

Soon after, I received a call from a company called 525 asking me to join them. They worked on all of the top music videos during the prime “I Want My MTV” era, and after working on music videos as a side hustle at night, I knew that’s where I wanted to be. When I first walked into the building, I was struck by how much more advanced their technology was and immediately felt out of my depth. Luckily, someone saw something in me before I recognized it within myself. I worked on everything from R.E.M.’s “Losing My Religion” to TLC’s “Waterfalls” and The Smashing Pumpkins’ “Tonight, Tonight.” I found such joy in collaborating with some of the most creative and spirited directors in the business, many of whom were inspiring artists, designers and photographers in their spare time.

Where do you find inspiration?
I’m lucky to live in a city like LA with such a rich artistic scene, so I make a point to attend as many gallery openings and exhibitions as I can. Some of my favorite spaces are the Annenberg Space for Photography, the Hammer Museum and Hauser & Wirth. On the weekends I also stop by Arcana bookstore in Culver City, where they source rare books on art and design.

Name three pieces of technology you can’t live without.
I think I would be completely fine if I had to survive without technology.

This industry comes with tight deadlines. How do you de-stress from it all?
After a long day, cooking helps me decompress and express my creativity through a different outlet. I never miss a trip to my local farmer’s market, which also helps to keep me inspired. And when I’m not looking at other people’s art, I’m painting my own abstract pieces at my home studio.


Assimilate intros live grading, video monitoring and dailies tools

Assimilate has launched Live Looks and Live Assist, production tools that give pros speed and specialized features for on-set live grading, look creation, advanced video monitoring and recording.

Live Looks provides an easy-to-set-up environment for video monitoring and live grading that supports any resolution, from standard HD up to 8K workflows. Featuring professional grading and FX/greenscreen tools, it is straightforward to operate and offers a seamless connection into dailies and post workflows. With Live Looks available on both macOS and Windows, users are, for the first time, free to use the platform and hardware of their choice.

“I interact regularly with DITs to get their direct input about tools that will help them be more efficient and productive on set, and Live Looks and Live Assist are a result of that,” says Mazze Aderhold, product marketing manager at Assimilate. “We’ve bundled unique and essential features with the needed speed to boost their capabilities, enabling them to contribute to time savings and lower costs in the filmmaking workflow.”

Users can run the software in a variety of setups — from a laptop to a full-blown on-set DIT rig. Live Looks provides LUT-box control over Flanders, Teradek and TVLogic devices. It also supports video I/O from AJA, Bluefish444 and Blackmagic for image and full-camera metadata capture. There is also now direct reference recording to Apple ProRes on macOS and Windows.

Live Looks goes beyond LUT-box control. Users can process the live camera feed via video I/O, making it possible to do advanced grading, compare looks, manage all metadata, annotate camera input and generate production reports. Its fully color-managed environment ensures the created looks will come out the same in dailies and post. Live Looks provides a seamless path into dailies and post with look-matching in Scratch and CDL-EDL transfer to DaVinci Resolve.

With Live Looks, Assimilate takes its high-end grading tool set beyond Lift, Gamma, Gain and CDL by adding Powerful Curves and an easy-to-use Color Remapper. On-set previews can encompass not just color but realtime texture effects, like Grain, Highlight Glow, Diffusion and Vignette — all GPU-accelerated.
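The CDL grades these on-set tools exchange follow the published ASC CDL formula, which is simple to sketch. This is a generic illustration of the spec’s math, not Assimilate’s implementation; the Rec. 709 luma weights are the ones the ASC spec uses for its saturation step:

```python
def apply_cdl(rgb, slope, offset, power, saturation=1.0):
    """Apply an ASC CDL grade: out = clamp(in*slope + offset)^power per channel,
    followed by the saturation operation."""
    graded = []
    for v, s, o, p in zip(rgb, slope, offset, power):
        x = max(v * s + o, 0.0)   # clamp negatives before the power function
        graded.append(x ** p)
    # Rec. 709 luma weights, per the ASC CDL saturation definition
    luma = 0.2126 * graded[0] + 0.7152 * graded[1] + 0.0722 * graded[2]
    return [luma + saturation * (g - luma) for g in graded]
```

With slope 1, offset 0, power 1 and saturation 1 the grade is an identity, which is also a handy sanity check when passing CDL values between on-set and post systems.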

Advanced chroma keying lets users replace greenscreen backgrounds with two clicks, making it easier to check camera angles, greenscreen tracking/anchor-point locations and lighting while still on set. As with all Assimilate software, users can load and play back any camera format, including raw formats such as Red raw and Apple ProRes RAW.

Live Assist has all of the features of Live Looks but also handles basic video-assist tasks, and like Live Looks, it is available on both macOS and Windows. It provides multicam recording and instant playback of all recorded channels and seamlessly combines live grading with video-assist tasks in an easy-to-use UI. Live Assist automatically records camera inputs to file based on the Rec flag inside the SDI signal, including all live camera metadata. It also extends the range of supported “edit-ready” capture formats: Apple ProRes (MOV), H.264 (MP4) and Avid DNxHD/HR (MXF). Operators can then choose whether to record the clean signal or record with the grade baked in.

Both Live Looks and Live Assist are available now. Live Looks starts at $89 per month, and Live Assist starts at $325 per month. Both products and free trials are available on the Assimilate site.


Colorist Chat: Keith Shaw on Showtime’s Homeland and the process

By Randi Altman

The long wait for the final season of Showtime’s Homeland seemed to last an eternity, but thankfully the series is now airing, and we here at postPerspective are pretty jazzed about it. Our favorite spies, Carrie and Saul, are back at it, with this season being set in Afghanistan.

Keith Shaw

Year after year, the writing, production and post values on Homeland have been outstanding. One of those post folks is colorist Keith Shaw from FotoKem’s Keep Me Posted, which focuses on finishing services for television.

Shaw’s credits are impressive. In addition to Homeland, his work can be seen on Ray Donovan, Shameless, Animal Kingdom and many others. We reached out to Shaw to find out more about working on Homeland from the first episode to the last. Shaw shares his workflow and what inspires him.

You’ve been on Homeland since the beginning. Can you describe the look of the show and how you’ve worked with DPs David Klein, ASC, and Giorgio Scali, ASC, as well as producer Katie O’Hara?
Working on Homeland from Episode 1 has been a truly amazing experience. Katie, Dave, Giorgio and I are an extremely collaborative group.

One consistent factor of all eight seasons has been the need for the show to look “real.” We don’t have any drastic or aggressively stylized looks, so the goal is to subtly manipulate the color and mood yet make it distinct enough to help support the storyline.

When you first started on the show, how would you describe the look?
The first two seasons were shot by Nelson Cragg, ASC. For those early episodes, the show was a bit grittier and more desaturated. It had a darker, heavier feel to it. There was not as much detail in the dark areas of the image, and the light fell off more quickly on the edges.

Although the locations and looks have changed over the years, what’s been the common thread?
As I mentioned earlier, the show has a realism to it. It’s not super-stylized and affected.

Do the DPs come to the color suite? What kind of notes do you typically get from them?
They do when they are able (which is not often). They are generally on the other side of the world. As far as notes, it depends on the episode. When I’m lucky, I get none. Generally, there are not a lot of notes. That’s the advantage of collaborating on a show from the beginning. You and the DP can “mold” the look of the show together.

You’ve worked on many episodics at Keep Me Posted. Prior to that you were working on features at Warner Bros. Can you talk about how that process differs for you?
In remastering and restoration of feature films, the production stage is complete. It’s not happening simultaneously, and that means the timeline and deadlines aren’t as stressful.

Digital intermediates on original productions, on the other hand, are similar to television because multiple things are happening all at once. There is an overlap between production and post. During color, the cut can be changing, and new effects could be added or updated, but with much tighter deadlines. DI was a great stepping stone for me to move from feature films to television.

Now let’s talk about some more general aspects of the job…

As a colorist, what would surprise people the most about what falls under that title?
First of all, most people don’t have a clear understanding of what a colorist is or does. Even after 25 years and multiple explanations, my father-in-law still tells everyone I’m an editor.

Being a colorist means you wear many hats — confidante, mediator, therapist, VFX supervisor, scheduler and data manager — in addition to that color thing. For me, it boils down to three main attributes. One, you need to be artistic/creative. Two, you need to be technical. Finally, you need to mediate the decision-making processes. Sometimes that can be the hardest part of all, when there are competing viewpoints and visions between all the parties involved.

What system do you work on?
Digital Vision’s Nucoda.

Are you sometimes asked to do more than just color on projects?
Today’s color correctors are incredibly powerful and versatile. In addition to color, I can do light VFX, beauty work, editing or technical fixes when necessary. The clients appreciate the value of saving time and money by taking care of last-minute issues in the color suite.

What’s your favorite part of the job?
Building relationships with clients, earning their trust and helping them bring their vision to the screen. I love that special moment when you and the DP are completely in sync — you’re reaching for the knobs before they even ask for a change, and you are finishing each other’s sentences.

What’s your least favorite?
Deadlines. However, they are actually helpful in my case because otherwise I would tweak and re-tweak the smallest details endlessly.

Can you name some recent projects you have worked on?
Ray Donovan, Shameless, Animal Kingdom, Single Parents and Bless This Mess are my current shows.

Any suggestions for getting the most out of a project from a color perspective?
Become a part of the process as early as possible. Establishing looks, LUTs and good communication with the cinematographer are essential.
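Under the hood, the looks and LUTs Shaw mentions are just lookup tables with interpolation. Here is a minimal 1D sketch — real grading LUTs are usually 3D cubes, and the toy curve below is invented purely for illustration:

```python
def apply_lut_1d(value, lut):
    """Map a normalized [0, 1] code value through a 1D LUT using linear interpolation."""
    value = min(max(value, 0.0), 1.0)      # clamp input to the LUT's domain
    pos = value * (len(lut) - 1)           # fractional position in the table
    lo = int(pos)
    hi = min(lo + 1, len(lut) - 1)
    frac = pos - lo
    return lut[lo] * (1 - frac) + lut[hi] * frac

# A hypothetical 5-point "lifted shadows" curve for demonstration
lut = [0.0, 0.3, 0.55, 0.8, 1.0]
```

Establishing such a LUT with the cinematographer before the shoot means everyone — on set, in dailies and in the grade — is looking at the same interpretation of the camera files.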

How do you prefer the DP or director to describe the look they want?
Each client has a different source of inspiration and way of conveying their vision. I’ve worked from fabric and paint samples, YouTube videos, photographs, magazine ads, movie or television show references, previous work (theirs and/or mine) and so on.

What is the project that you are most proud of?
I can’t pick just one, so I’ll pick two. From my feature mastering work, The Shawshank Redemption. From television, Homeland.

Where do you find inspiration?
Definitely in photography. My father was a professional photographer and we had our own darkroom. As a kid, I spent countless hours after school and on weekends learning how to plan, take and create great photographs. It is still a favorite hobby of mine to this day.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 


Director Vincent Lin discusses colorful Seagram’s Escapes spot

By Randi Altman

Valiant Pictures, a New York-based production house, recently produced a commercial spot featuring The Bachelor/Bachelorette host Chris Harrison promoting Seagram’s Escapes and its line of alcohol-based fruit drinks. A new addition to the product line is Tropical Rosé, which was co-developed by Harrison and contains natural passion fruit, dragon fruit and rosé flavors.

Valiant’s Vincent Lin directed the piece, which features Harrison in a tropical-looking room — brightened with sunny pinks and yellows thanks to NYC’s Nice Shoes — describing the rosé and signing off with the Seagram’s Escapes brand slogan, “Keep it colorful!”

Here, director Lin — along with his DP, Alexander Chinnici — talks about the project’s conception, shoot and post.

How early did you get involved? Did Valiant act as the creative agency on this spot?
Valiant has a long-standing history with the Seagram’s Escapes brand team, and we were fortunate enough to have the opportunity to brainstorm a few ideas with them early on for their launch of Seagram’s Escapes Tropical Rosé with Chris Harrison. The creative concept was developed by Valiant’s in-house creative agency, headed by creative directors Nicole Zizila and Steven Zizila, and me. Seagram’s was very instrumental in the creative for the project, and we collaborated to make sure it felt fresh and new — like an elevated evolution of their “Keep It Colorful” campaign rather than a replacement.

Clearly, it’s meant to have a tropical vibe. Was it shot greenscreen?
We had considered doing this greenscreen, which would open up some interesting options but would also pose some challenges. What was important for this campaign creatively was to seamlessly take Chris Harrison to the magical world of Seagram’s Escapes Tropical Rosé. A practical approach was chosen so it didn’t feel too “out of this world,” and the live action still felt real and relatable. We had considered putting Chris in a tropical location — either in greenscreen or on location — but we really wanted to play to Chris’ personality and strengths and have him lead us to this world, rather than throw him into it. Plus, they didn’t sign off on letting us film in the Maldives. I tried (smiles).

L-R: Vincent Lin and Alex Chinnici

What was the spot shot on?
I worked with the very talented DP Alex Chinnici, who recommended shooting on the ARRI Alexa for many reasons. I’ll let Alex answer this one.

Alex Chinnici: Some DPs would likely answer with something sexier like, “I love the look!” But that ignores a lot of the technical realities available to us these days. A lot of these cameras are wonderful. I can manipulate the look, so I choose a camera based on other reasons. Without an on-set live, color-capable DIT, I had to rely on the default LUT seen on set and through post. The Alexa’s default LUT is my preference among the digital cameras. For lighting and everyone on the set, we start in a wonderful place right off the bat. Post houses also know it so well, along with colorists and VFX. Knowing our limitations and expecting not to be entirely involved, I prefer giving these departments the best image/file possible.

Inherently, the color, highlight retention and skin tone are wonderful right off the bat without having to bend over backward for anyone. With the Alexa, you end up being much closer to the end rather than having to jump through hoops to get there like you would with some other cameras. Lastly, the reliability is key. With the little time that we had, and a celebrity talent, I would never put a production through the risk of some new tech. Being in a studio, we had full control but still, I’d rather start in a place of success and only make it better from there.

What about the lenses?
Chinnici: I chose the Zeiss Master Primes for similar reasons. While sharp, they are not overbearing. With some mild filtration and very soft and controlled lighting, I can adjust that in other ways. Plus, I know that post will beautify anything that needs it; giving them a clean, sharp image (especially considering the seltzer can) is key.

I shot at a deeper stop to ensure that the lenses are even cleaner and sharper, although the Master Primes do hold up very well wide open. I also wanted the Seagram’s can to be in focus as much as possible and for us to be able to see the set behind Chris Harrison, as opposed to a very shallow depth of field. I also wanted to ensure little to no flares, solid contrast, sharpness across the field and no surprises.

Thanks Alex. Back to you Vincent. How did you work with Alex to get the right look?
There was a lot of back and forth between Alex and me, and we pulled references to discuss. Ultimately, we knew the two most important things were to highlight Chris Harrison and the product. We also knew we wanted the spot to feel like a progression from the brand’s previous work. We decided the best way to do this was to introduce some dimensionality by giving the set depth with lighting, while keeping a clean, polished and sophisticated aesthetic. We also introduced a bit of camera movement to further pull the audience in and to compose the shots in a way that all the focus would be on Chris Harrison to bring us into that vibrant CG world.

How did you work with Nice Shoes colorist Chris Ryan to make sure the look stayed on point? 
Nice Shoes is always one of our preferred partners, and Chris Ryan was perfect for the job. Our creatives, Nicole and Steven, had worked with him a number of times. As with all jobs, there are certain challenges and limitations, and we knew we had to work fast. Chris is not only detail oriented, creative and a wizard with color correction, but also able to work efficiently.

He worked on a FilmLight Baselight system off the Alexa raw files. The color grading really brought out the saturation to further reinforce the brand’s slogan, “Keep It Colorful,” but also to manage the highlights and whites so it felt inviting and bright throughout, but not at all sterile.

What about the VFX? Can you talk about how that was accomplished? 
Much like the camera work, we wanted to continue giving dimensionality to the spot by having depth in each of our CG shots. Not only depth in space but also in movement and choreography. We wanted the CG world to feel full of life and vibrant in order to highlight key elements of the beverage — the flavors, dragon fruit and passion fruit — and give it a sense of motion that draws you in while making you believe there’s a world outside of it. We wanted the hero to shine in the center and the animation to play out as if a kaleidoscope or tornado was pulling you in closer and closer.

We sought the help of creative production studio Taylor James to build the CG elements. We chose to work with a core of 3ds Max artists who could do a range of tasks using Autodesk 3ds Max and Chaos Group’s V-Ray (we also use Maya and Arnold). We used Foundry Nuke to composite all of the shots and integrate the CGI into the footage. The 3D asset creation, animation and lighting were constructed and rendered in Autodesk Maya, with compositing done in Adobe After Effects.

One of the biggest challenges was making sure the live action felt connected to the CG world, but with each still having its own personality. There is a modern and clean feel to these spots that we wanted to uphold while still making it feel fun and playful with colors and movement. There were definitely a few earlier versions that we went a bit crazy with and had to scale down a bit.

Does a lot of your work feature live action and visual effects combined?
I think of VFX like any film technique: It’s simply a tool for directors and creatives to use. The most essential thing is to understand the brand, if it’s a commercial, and to understand the story you are trying to tell. I’ve been fortunate to do a number of spots that involve live action and VFX now, but truth be told, VFX almost always sneaks its way in these days.

Even if I do a practical effect, there are limitless possibilities in post production and VFX. Anything from simple cleanup to enhancing, compositing, set building and extending — it’s all possible. It’d be foolish not to consider it as a viable tool. Now, that’s not to say you should rely solely on VFX to fix problems, but if there’s a way it can improve your work, definitely use it. For this particular project, obviously, the CG was crucial to let us really be immersed in a magical world at the level of realism and proximity we desired.

Anything challenging about this spot that you’d like to share?
Chris Harrison was terrible to work with and refused to wear a shirt for some reason … I’m just kidding! Chris was one of the most professional, humble and kind celebrity talents that I’ve had the pleasure to work with. This wasn’t a simple endorsement for him; he actually worked closely with Seagram’s Escapes over several months to create and flavor-test the Tropical Rosé beverage.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Blackmagic releases Resolve 16.2, beefs up audio post tools

Blackmagic has updated its color, edit, VFX and audio post tool to Resolve 16.2. This new version features major Fairlight updates for audio post as well as many improvements for color correction, editing and more.

The update brings major changes to editing in the Fairlight audio timeline when using a mouse and keyboard. The new edit selection mode unlocks functionality previously available only via the audio editor on the full Fairlight console, so editing is much faster than before. It also puts adding fades and cuts, and even moving clips, only a mouse click away. New scalable waveforms let users zoom in without adjusting the volume. Bouncing lets customers render a clip with custom sound effects directly from the Fairlight timeline.

Adding multiple clips is also easier, as users can now add them to the timeline vertically, not just horizontally, making it simpler to add multiple tracks of audio at once. Multichannel tracks can now be converted into linked groups directly in the timeline, so users no longer have to change clips manually and reimport. There’s also added support for frame boundary editing, which improves file export compatibility for film and broadcast deliveries and lets users trim precisely to frame boundaries without having to zoom all the way in on the timeline. The new version supports modifier keys so that clips can be duplicated directly in the timeline using the keyboard and mouse, and users can copy clips across multiple timelines with ease.

Resolve 16.2 also includes support for the Blackmagic Fairlight Sound Library with new support for metadata based searches, so customers don’t need to know the filename to find a sound effect. Search results also display both the file name and description, so finding the perfect sound effect is faster and easier than before.

MPEG-H 3D immersive surround sound audio bussing and monitoring workflows are now supported. Additionally, improved pan and balance behavior includes the ability to constrain panning.

Fairlight audio editing also has index improvements. The edit index is now available in the Fairlight page and works as it does in the other pages, displaying a list of all media used; users simply click on a clip to navigate directly to its location in the timeline. The track index now supports drag selections for mute, solo, record enable and lock as well as visibility controls so editors can quickly swipe through a stack of tracks without having to click on each one individually. Audio tracks can also be rearranged by clicking and dragging a single track or a group of tracks in the track index.

This new release also includes improvements in AAF import and export. AAF support has been refined so that AAF sequences can be imported directly to the timeline in use. Additionally, if the project features a different time scale, the AAF data can also be imported with an offset value to match. AAF files that contain multiple channels will also be recognized as linked groups automatically. The AAF export has been updated and now supports industry-standard broadcast wave files. Audio cross-fades and fade handles are now added to the AAF files exported from Fairlight and will be recognized in other applications.

For traditional Fairlight users, this update makes major improvements to importing legacy Fairlight projects, including much faster opening of projects with over 1,000 media files.

Audio mixing is also improved. A new EQ curve preset for clip EQ in the inspector allows removal of troublesome frequencies. New FairlightFX filters include a meter plug-in that adds a floating meter for any track or bus, so users can keep an eye on levels even if the monitoring panel or mixer is closed. There’s also a new LFE filter designed to smoothly roll off the higher frequencies when mixing low-frequency effects in surround.

Working with immersive sound workflows using the Fairlight audio editor has been updated and now includes dedicated controls for panning up and down. Additionally, clip EQ can now be altered in the inspector on the editor panel. Copy and paste functions have been updated, and now all attributes — including EQ, automation and clip gain — are copied. Sound engineers can set up their preferred workflow, including creating and applying their own presets for clip EQ. Plug-in parameters can also be customized or added so that users have fast access to their preferred tool set.

Clip levels can now be changed relatively, allowing users to adjust the overall gain while respecting existing adjustments. Clip levels can also be reset to unity, easily removing any level adjustments that might have previously been made. Fades can also be deleted directly from the Fairlight Editor, making it faster to do than before. Sound engineers can also now save their preferred track view so that they get the view they want without having to create it each time. More functions previously only available via the keyboard are now accessible using the panel, including layered editing. This also means that automation curves can now be selected via the keyboard or audio panel.

Continuing the extensive improvements to Fairlight audio, there have also been major updates to the audio editor transport control. Track navigation is now improved and even works when nothing is selected. Users can navigate directly to the timecode entry window above the timeline from the audio editor panel, and there is added support for high-frame-rate timecodes. Timecode entry now supports values relative to the current CTI location, so the playhead can move along the timeline relative to its position rather than to a set timecode.

Support has also been added so the colon key can be used in place of the user typing 00. Master spill on console faders now lets users spill out all the tracks to a bus fader for quick adjustments in the mix. There’s also more precision with rotary controls on the panel and when using a mouse with a modifier key. Users can also change the layout and select either icon or text-only labels on the Fairlight editor. Legacy Fairlight users can now use the traditional — and perhaps more familiar — Fairlight layout. Moving around the timeline is even quicker with added support for “media left” and “media right” selection keys to jump the playhead forward and back.

This update also improves editing in Resolve. Loading and switching timelines on the edit page is now faster, with improved performance when working with a large number of audio tracks. Compound clips can now be made from in and out points so that editors can be more selective about which media they want to see directly in the edit page. There is also support for previewing timeline audio when performing live overwrites of video-only edits. When trimming, the duration now reflects the clip duration as users actively trim, so they can set a specific clip length. There is also a dialog for changing transition durations.

The media pool now includes metadata support for audio files with up to 24 embedded channels. Users can also duplicate clips and timelines into the same bin using copy and paste commands. The primary DaVinci Resolve screen can now run as a window when dual-screen mode is enabled. Smart filters now let users sort media based on metadata fields, including keywords and people tags, so users can find the clips they need faster.

Video Coverage: HPA Tech Retreat’s making of The Lost Lederhosen

By Randi Altman

At the HPA Tech Retreat in Rancho Mirage, California, the Supersession was a little different this year. Under the leadership of Joachim (JZ) Zell — who you might know from his day job as VP of technology at EFILM — the Supersession focused on the making of the short film, The Lost Lederhosen, in “near realtime,” in the desert. And postPerspective was there, camera in hand, to interview a few of the folks involved.

Watch our video coverage here.

While production for the film began a month before the Retreat — with Steve Shaw, ASC, directing and DP Roy H. Wagner Jr., ASC, lending his cinematography talents — some scenes were shot the morning of the session with data transfer taking place during lunch and post production in the afternoon. Peter Moss, ASC, and Sam Nicholson, ASC, also provided their time and expertise. After an active day of production, cloud-based post and extreme collaboration, the Supersession ended with the first-ever screening of The Lost Lederhosen, the story of Helga and her friend Hans making their way to Los Angeles, Zell and the HBA (Hollywood Beer Alliance). Check out HPA’s trailer here.

From acquisition to post (and with the use of multiple camera formats, frame rates and lenses), the film’s crew was made up of volunteers, including creatives and technologists from companies such as AWS, Colorfront, Frame.io, Avid, Blackmagic, Red, Panavision, Zeiss, FilmLight, SGO, Stargate, Unreal Engine, Sohonet and many more. One of the film’s goals was to use the cloud as much as possible in order to test out that particular workflow. While there were some minor hiccups along the way, the film got made — at the HPA Tech Retreat — and these industry pros got smarter about working in the cloud, something that will be increasingly employed going forward.

While we were only able to chat with a handful of those pros, like any movie, the list of credits and thank-yous is too extensive to include here — dozens of individuals and companies donated their services and time to make this possible.

Watch our video coverage here.

(A big thank you and shout out to Twain Richardson for editing our videos.)

Main Image Caption: AWS’ Jack Wenzinger and EFILM’s Joachim Zell

Senior colorist Tony D’Amore joins Picture Shop

Burbank’s Picture Shop has beefed up its staff with senior colorist Tony D’Amore, who will also serve as a director of creative workflow. In that role, he will oversee a team focusing on color prep and workflow efficiency.

Originally from rural Illinois, D’Amore made the leap to the West Coast to pursue an education, studying film and television at UCLA. He started his career in color in the early ‘90s, gaining valuable experience in the world of post. He has been working closely with color and post workflow since.

While D’Amore has experience working on Autodesk Lustre and FilmLight Baselight, he primarily grades in Blackmagic DaVinci Resolve. D’Amore has contributed color to several Emmy Award-winning shows nominated in the category of “Outstanding Cinematography.”

D’Amore has developed new and efficient workflows for Dolby Vision HDR and HDR10, coloring hundreds of hours of episodic programming for networks including CBS, ABC and Fox, as well as cable and streaming platforms such as HBO, Starz, Netflix, Hulu and Amazon.

D’Amore’s most notable project to date is having colored a Marvel series simultaneously for IMAX and ABC delivery. His color credits include Barry (HBO), Looking for Alaska (Hulu), Legion (FX), Carnival Row (Amazon), Power (Starz), Fargo (FX), Elementary (CBS), Hanna (Amazon) and a variety of Marvel series, including Jessica Jones, Daredevil, The Defenders, Luke Cage and Iron Fist, all of which are available on streaming platforms.

Amazon’s The Expanse Season 4 gets HDR finish

The fourth season of the sci-fi series The Expanse, streaming via Amazon Prime Video, is the first to be finished in HDR. Deluxe Toronto handled end-to-end post services, including online editorial, sound remixing and color grading. The series was shot on ARRI Alexa Minis.

In preparation for production, cinematographer Jeremy Benning, CSC, shot anamorphic test footage at a quarry that would serve as the filming stand-in for the season’s new alien planet, Ilus. Deluxe Toronto senior colorist Joanne Rourke then worked with Benning, VFX supervisor Bret Culp, showrunner Naren Shankar and series regular Breck Eisner to develop looks that would convey the location’s uninviting and forlorn nature, keeping the overall look desaturated and removing color from the vegetation. Further distinguishing Ilus from other environments, production chose to display scenes on or above Ilus in a 2.39 aspect ratio, while those featuring Earth and Mars remained in a 16:9 format.

“Moving into HDR for Season 4 of our show was something Naren and I have wanted to do for a couple of years,” says Benning. “We did test HDR grading a couple seasons ago with Joanne at Deluxe, but it was not mandated by the broadcaster at the time, so we didn’t move forward. But Naren and I were very excited by those tests and hoped that one day we would go HDR. With Amazon as our new home [after airing on Syfy], HDR was part of their delivery spec, so those tests we had done previously had prepared us for how to think in HDR.

“Watching Season 4 come to life with such new depth, range and the dimension that HDR provides was like seeing our world with new eyes,” continues Benning. “It became even more immersive. I am very much looking forward to doing Season 5, which we are shooting now, in HDR with Joanne.”

Rourke, who has worked on every season of The Expanse, explains, “Jeremy likes to set scene looks on set so everyone becomes married to the look throughout editorial. He is fastidious about sending stills each week, and the intended directive of each scene is clear long before it reaches my suite. This was our first foray into HDR with this show, which was exciting, as it is well suited for the format. Getting that extra bit of detail in the highlights made such a huge visual impact overall. It allowed us to see the comm units, monitors, and plumes on spaceships as intended by the VFX department and accentuate the hologram games.”

After making adjustments and ensuring initial footage was even, Rourke then refined the image by lifting faces and story points and incorporating VFX. This was done with input provided by producer Lewin Webb; Benning; cinematographer Ray Dumas, CSC; Culp or VFX supervisor Robert Crowther.

To manage the show’s high volume of VFX shots, Rourke relied on Deluxe Toronto senior online editor Motassem Younes and assistant editor James Yazbeck to keep everything in meticulous order. (For that they used the Grass Valley Rio online editing and finishing system.) The pair’s work was also essential to Deluxe Toronto re-recording mixers Steve Foster and Kirk Lynds, who have both worked on The Expanse since Season 2. Once ready, scenes were sent in HDR via Streambox to Shankar for review at Alcon Entertainment in Los Angeles.

“Much of the science behind The Expanse is quite accurate thanks to Naren, and that attention to detail makes the show a lot of fun to work on and more engaging for fans,” notes Foster. “Ilus is a bit like the wild west, so the technology of its settlers is partially reflected in communication transmissions. Their comms have a dirty quality, whereas the ship comms are cleaner-sounding and more closely emulate NASA transmissions.”

Adds Lynds, “One of my big challenges for this season was figuring out how to make Ilus seem habitable and sonically interesting without familiar sounds like rustling trees or bird and insect noises. There are also a lot of amazing VFX moments, and we wanted to make sure the sound, visuals and score always came together in a way that was balanced and hit the right emotions story-wise.”

Foster and Lynds worked side by side on the season’s 5.1 surround mix, with Foster focusing on dialogue and music and Lynds on sound effects and design elements. When each had completed his respective passes using Avid Pro Tools workstations, they came together for the final mix, spending time on fine strokes, ensuring the dialogue was clear, and making adjustments as VFX shots were dropped in. Final mix playbacks were streamed to Deluxe’s Hollywood facility, where Shankar could hear adjustments completed in real time.

In addition to color finishing Season 4 in HDR, Rourke also remastered the three previous seasons of The Expanse in HDR, using her work on Season 4 as a guide and finishing with Blackmagic DaVinci Resolve 15. Throughout the process, she was mindful to pull out additional detail in highlights without altering the original grade.

“I felt a great responsibility to be faithful to the show for the creators and its fans,” concludes Rourke. “I was excited to revisit the episodes and could appreciate the wonderful performances and visuals all over again.”

DP Chat: Watchmen cinematographer Greg Middleton

By Randi Altman

HBO’s Watchmen takes us to new dimensions in this recent interpretation of the popular graphic novel. In this iteration, we spend a lot of our time in Tulsa, Oklahoma, getting to know Regina King’s policewoman Angela Abar, her unconventional family and a shadowy organization steeped in racism called the Seventh Kavalry. We also get a look back — beautiful in black and white — at Abar’s tragic family back story. It was created and written for TV by Lost veteran Damon Lindelof.

Greg Middleton

Greg Middleton, ASC, CSC, who also worked on Game of Thrones and The Killing, was the series cinematographer. We reached out to him to find out about his process, workflow and where he gets inspiration.

When were you brought on to Watchmen, and what type of looks did the showrunner want from the show?
I joined Watchmen after the pilot for Episode 2. A lot of my early prep was devoted to discussions with the showrunner and producing directors on how to develop the look from the pilot going forward. This included some pilot reshoots due to changes in casting and the designing and building of new sets, like the police precinct.

Nicole Kassell (director of Episodes 1, 2 and 8) and series production designer Kristian Milstead and I spent a lot of time breaking down the possibilities of how we could define the various worlds through color and style.

How was the look described to you? What references were you given?
We based the evolution of the look of the show on the scripts, the needs of the structure within the various worlds and on the graphic novel, which we commonly referred to as “the Old Testament.”

As you mention, it’s based on a graphic novel. Did the look give a nod to that? If so, how? Was that part of the discussion?
We attempted to break down the elements of the graphic novel that might translate well and those that would not. It’s an interesting bit of detective work because a lot of the visual cues in the comic are actually a commentary on the style of comics at the time it was published in 1985.

Those cues, if taken literally, would not necessarily work for us, as their context would not be clear. Things like color were very referential to other comics of the time. For example, they used only secondary colors instead of primaries, as was the norm. The graphic novel is also a film noir in many ways, so we got some of our ideas from that.

What did translate well were compositional elements — tricks of transition like match cuts and the details of story in props, costumes and sets within each frame. We used some split diopters and swing shift lenses to give us some deep focus effects for large foreground objects. In the graphic novel, of course, everything is in focus, so those types of compositions are common!

This must have been fun because of the variety of looks the series has — the black-and-white flashbacks, the stylized version of Tulsa, the look of the mansion in Wales (Europa), Vietnam in modern day. Can you talk about each of the different looks?
Yes, there were so many looks! When we began prep on the series with the second episode, we were also simultaneously beginning to film the scenes in Wales for the “blond man” scenes. We knew that that storyline would have its own particular feel because of the location and its very separateness from the rest of the world.

A more classic traditional proscenium-like framing and style seemed very appropriate. Part of that intent was designed to both confuse and to make very apparent to the audience that we were definitely in another world. Cinematographer Chris Seager, BSC, was filming those scenes as I was still doing tests for the other looks and the rest of our show in Atlanta.

We discussed lenses, camera format, etc. The three major looks we had to design that we knew would go together initially were our “Watchmen” world, the “American hero story” show within the show, and the various flashbacks to 1921 Tulsa and World War I. I was determined to make sure that the main world of the show did not feel overly processed and colorized photographically. We shot many tests and developed a LUT that was mostly film-like. The other important aspects to creating a look are, of course, art direction and production design, and I had a great partner in Kristian Milstead, the production designer who joined the show after the pilot.

This was a new series. Do you enjoy developing the look of a show versus coming on board after the look was established?
I enjoy helping to figure out how to tell the story. For series, developing the photographic look and visual strategy is a big part of that. Even if some of those elements are established, you still make similar decisions when shooting individual scenes. However, I much prefer being engaged from the beginning.

So even when you weren’t in Wales, you were providing direction?
As I mentioned earlier, Chris Seager and I spoke and emailed regarding lenses and those choices. It was still early for us in Atlanta, but there were preliminary decisions to be made on how the “blond man” (our code name for Jeremy Irons) world would be compared to our Watchmen world. What I did was consult with my director, Nicole Kassell, on her storyboards for her sequences in Wales.

Were there any scenes or looks that stood out as more challenging than others? Can you describe?
Episode 106 was a huge challenge. We have a lot of long takes involving complex camera moves and dimmer cues as a camera would circle or travel between rooms. Also, we developed the black-and-white look to feel like older black-and-white film.

One scene in June’s apartment involved using the camera on a small Scorpio 10-foot crane and a mini Libre head to accomplish a slow move around the room. Then we had to push between her two actors toward the wall as an in-camera cue of a projected image of the black-and-white movie Trust in the Law reveals itself with a manual iris.

This kind of shot ends up being a dance with at least six people, not including the cast. The entire “nostalgia” part of the episode was done this way. And none of this would’ve been possible without an incredible cast being able to hit these incredibly long takes and choreograph themselves with the camera. Jovan Adepo and Danielle Deadwyler were incredible throughout the episode.

I assume you did camera tests. Why did you choose the ARRI Alexa? Why was it right for this? What about lenses, etc.?
I have been working with the Alexa for many years now, so I was aware of what I could do with the camera. I tested a couple of others, but in the end the Alexa Mini was the right choice for us. I also needed a camera that was small so I could go on and off of a gimbal or fit into small places.

How did you work with the colorist? Who was that on this show? Were you in the suite with them?
Todd Bochner was our final colorist at Sim in LA. I shot several camera tests and worked with him in the suite to help develop viewing LUTs for the various worlds of the show. We did the same procedure for the black and white. In the end, we mimicked some techniques similar to black-and-white film (like red filters), except for us, it was adjusting the channels accordingly.
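
That channel-adjustment idea can be sketched as a weighted channel mix down to monochrome, with the red channel boosted to mimic shooting black-and-white film through a red filter (red objects render light, blue skies render dark). This is a minimal illustration using NumPy; the function name and weights are invented for the example and are not the show's actual settings.

```python
import numpy as np

def channel_mix_mono(rgb, weights=(0.8, 0.15, 0.05)):
    """Weighted channel mix to monochrome.

    The heavy red weight emulates a red lens filter on black-and-white
    stock. Weights are illustrative and should sum to roughly 1.0.
    """
    w = np.asarray(weights, dtype=np.float64)
    mono = rgb.astype(np.float64) @ w  # per-pixel dot product over channels
    return np.clip(mono, 0.0, 255.0).astype(np.uint8)

# A pure-red and a pure-blue pixel: the "red filter" brightens one, darkens the other
px = np.array([[[255, 0, 0], [0, 0, 255]]], dtype=np.uint8)
print(channel_mix_mono(px))  # red pixel -> 204, blue pixel -> 12
```

In a grading system this same operation is exposed as an RGB mixer or channel-blend control; the point is that the "filter" is just a reweighting of the three color records before collapsing to one channel.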

Do you know what they used on the color?
Yes, it was Blackmagic DaVinci Resolve 16.

How did you get interested in cinematography?
I was always making films as a kid, then in school and then in university. In film school, as the various jobs split apart, I seemed to have some aptitude for cinematography, so after school I decided to try making it my focus. I came to it more out of a love of storytelling and filmmaking and less out of a love of photography.

Greg Middleton

What inspires you? Other films?
Films that move me emotionally.

What’s next for you?
A short break! I’ve been very fortunate to have been working a lot lately. A film I shot just before Watchmen called American Woman, directed by Semi Chellas, should be coming out this year.

And what haven’t I asked that’s important?
I think the question all filmmakers should ask themselves is, “Why am I telling this story, and what is unique about the way in which I’m telling it?”


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Colorist Chat: Nice Shoes’ Maria Carretero on Super Bowl ads, more

This New York-based colorist, who worked on four Super Bowl spots this year, talks workflow, inspiration and more.

Name: Maria Carretero

Company: Nice Shoes

What kind of services does Nice Shoes offer?
Nice Shoes is a creative studio with color, editorial, animation, VFX, AR and VR services. It’s a full-service studio with offices in NYC, Chicago, Boston, Minneapolis and Toronto, as well as remote locations throughout North America.

Michelob Ultra’s Jimmy Works It Out

As a colorist, what would surprise people the most about what falls under that title?
I think people are surprised when they discover that there is a visual language in every single visual story that connects your emotions through all the imagery that we’ve collected in our brains. This work gives us the ability to nudge the audience emotionally over the course of a piece. Color grading is rooted in a very artistic base — core, emotional aspects that have been studied in art and color theory that make you explore cinematography in such an interesting way.

What system do you work on?
We use FilmLight Baselight as our primary system, but the team is also versed in Blackmagic Resolve.

Are you sometimes asked to do more than just color on projects?
Sometimes. If you have a solid relationship with the DP or the director, they end up consulting you about palettes, optics and references, so you become an active part of the creativity in the film, which is very cool. I love when I can get involved in projects from the beginning.

What’s your favorite part of the job?
My favorite moment is when you land on the final look and you see that the whole film is making visual sense and you feel that the story, the look and the client are all aligned — that’s magic!

Any least favorites?
No, I love coloring. Sometimes the situation becomes difficult because there are technical issues or disagreements, but it’s part of the work to push through those moments and make things work.

If you didn’t have this job, what would you be doing instead?
I would probably be a visual artist… always struggling to keep the lights on. I’m kidding! I have so much respect for visual artists, I think they should be treated better by our society because without art there is no progress.

How early did you know this would be your path?
I was a visual artist for seven years. I was part of Nives Fernandez’s roster, and all that I wanted at that time was to try to tell my stories as an artist. I was freelancing in VFX to get some money that helped me survive, and I landed on the VFX side, and from there to color was a very easy switch. When I landed in Deluxe Spain 16 years ago and started to explore color, I quickly fell in love.

It’s why I like to say that color chose me.

Avocados From Mexico: Shopping Network

You recently worked on a number of Super Bowl spots. Can you talk a bit about your work on them, and any challenges relating to deadlines?
This year I worked on four Super Bowl spots: Michelob Ultra PureGold: 6 for 6 Pack, Michelob Ultra: Jimmy Works It Out, Walmart: United Towns and Avocados From Mexico: Shopping Network.

Working on these kinds of projects is definitely a really interesting experience. The deadlines are tight and the pressure is enormous, but at the same time, the amount of talent and creativity involved is gigantic, so if you survive (laughs) you will always be a better professional. As a colorist I love to be challenged. I love dealing with difficult situations where all your resources and energy are put to the test.

Any suggestions for getting the most out of a project from a color perspective?
Thousands! Technical understanding, artistic involvement, there are so many… But definitely trying to create something new, special, different; embracing the challenges and pushing beyond the boundaries are the keys to delivering good work.

How do you prefer to work with the DP or director?
I like working with both. Debating with any kind of artist is the best. It’s really great to be surrounded by someone that uses a common “language.” As I mentioned earlier, I love when there’s the opportunity to get the conversation going at the beginning of a project so that there’s more opportunity for collaboration, debate and creativity.

How do you like getting feedback in terms of the look? Photos, films, etc.?
Every single bit of information is useful. I love when they verbalize what they’re going for using stories, feelings — when you can really feel they’re expressing personality with the film.

Where do you find inspiration? Art? Photography?
I find inspiration in living! There are so many things that surround us that can be a source of inspiration. Art, landscapes, the light that you remember from your childhood, a painting, watching someone that grabs your attention on a train. New York is teeming with more than enough life and creativity to keep any artist going.

Name three pieces of technology you can’t live without.
The Tracker, Spotify and FaceTime.

This industry comes with tight deadlines. How do you de-stress from it all?
I have a sense of humor and lots of red wine (smiles).

MPI restores The Wizard of Oz in 4K HDR

By Barry Goch

The classic Victor Fleming-directed film The Wizard of Oz, which was released by MGM in 1939 and won two of its six Academy Award nominations, has been beautifully restored by Burbank’s Warner Bros. Motion Picture Imaging (MPI).

Bob Bailey

To share its workflow on the film, MPI invited a group of journalists to learn about the 4K UHD HDR restoration of this classic film. The tour guide for our high-tech restoration journey was MPI’s VP of operations and sales Bob Bailey, who walked us through the entire restoration process — from the original camera negative to final color.

The Wizard of Oz, which starred Judy Garland, was shot on a Technicolor three-strip camera system. According to Bailey, it ran three black and white negatives simultaneously. “That is why it is known as three-strip Technicolor. The magazine on top of the camera was triple the width of a normal black and white camera because it contained each roll of negative to capture your red, green and blue records,” explained Bailey.

“When shooting in Technicolor, you weren’t just getting the camera. You would rent a package that included the camera, a camera crew with three assistants, the film, the processing and a Technicolor color consultant.”

George Feltenstein, SVP of theatrical catalog marketing for Warner Bros. Home Entertainment, spoke about why the film was chosen for restoration. “The Wizard of Oz is among the crown jewels that we hold,” he said. “We wanted to embrace the new 4K HDR technology, but nobody’s ever released a film that old using this technology. HDR, or high dynamic range, has a color range that is wider than anything that’s come before it. There are colors [in The Wizard of Oz] that were never reproducible before, so what better a film to represent that color?”

Feltenstein went on to explain that this is the oldest film to get released in the 4K format. He hopes that this is just the beginning and that many of the films in Warner Bros.’ classic library will also be released on 4K HDR and worked on at MPI under Bailey’s direction.

The Process
MPI scanned each of the three-strip Technicolor nitrate film negatives at 8K 16-bit, composited them together and then applied a new color grade. The film was rescanned with the Lasergraphics Director 10K scanner. “We have just under 15 petabytes of storage here,” said Bailey. “That’s working storage, because we’re working on 8K movies since [some places in the world] are now broadcasting 8K.”
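The recombination step that follows scanning, stacking the three black-and-white records back into one color frame, can be sketched in a few lines. This is a toy illustration only, not MPI's pipeline; the array names and frame size are hypothetical stand-ins for the scanned records.

```python
import numpy as np

# Toy sketch (not MPI's actual pipeline): each Technicolor "record" is a
# black-and-white scan of one strip. Recombining the red, green and blue
# records along a channel axis yields a single color frame.
h, w = 1080, 1920  # hypothetical frame size; the real scans are 8K 16-bit
red_record = np.random.rand(h, w)
green_record = np.random.rand(h, w)
blue_record = np.random.rand(h, w)

# Stack the three grayscale records into an RGB image.
rgb = np.stack([red_record, green_record, blue_record], axis=-1)
assert rgb.shape == (h, w, 3)
```

As Wilson notes later in the piece, the three records receive light differently from frame to frame, so recombination alone is not enough; the restoration also has to smooth out those density differences shot by shot.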

Steven Anastasi

Our first stop was to look at the Lasergraphics Director. We then moved on to MPI’s climate-controlled vault, where we were introduced to Steven Anastasi, VP of technical operations at Warner Bros. Anastasi explained that the original negative vault has climate-controlled conditions with 25% humidity at 35 degrees Fahrenheit, which is the combination required for keeping these precious assets safe for future generations. He said there are 2 million assets in the building, including picture and sound.

It was amazing to see film reels for 2001: A Space Odyssey sitting on a shelf right in front of me. In addition to the feature reels, MPI also stores millions of negatives captured throughout the years by Warner productions. “We also have a very large library,” reported Anastasi. “So the original negatives from the set, a lot of unit photography, head shots in some cases and so forth. There are 10 million of these.”

Finally, we were led into the color bay to view the film. Janet Wilson, senior digital colorist at MPI, has overseen every remaster of The Wizard of Oz for the past 20 years. Wilson used a FilmLight Baselight X system for the color grade. The grading suite housed multiple screens: a Dolby Pulsar for the Dolby Vision pass, a Sony X300 and a Panasonic EZ1000 OLED 4K HDR.

“We have every 4K monitor manufactured, and we run the film through all of them,” said Bailey. “We painstakingly go through the process from a post perspective to make sure that our consumers get the best quality product that’s available out in the marketplace.”

“We want the consumer experience on all monitors to be something that’s taken into account,” added Feltenstein. “So we’ve changed our workflow by having a consumer or prosumer monitor in these color correction suites so the colorist has an idea of what people are going to see at home, and that’s helped us make a better product.”

Our first view of the feature was a side-by-side comparison of the black and white scanned negative and the sepia color corrected footage. The first part of the film, which takes place in Kansas, was shot in black and white, and then a sepia look was applied to it. The reveal scene, when Dorothy passes through the door going into Oz, was originally shot in color. For this new release, the team generated a matte so Wilson could add this sepia area to the inside of the house as Dorothy transitioned into Oz.

“So this is an example of some of the stuff that we could do in this version of the restoration,” explained Wilson. “With this version, you can see that the part of the image where she’s supposed to be in the monochrome house is not actually black and white. It was really a color image. So the trick was always to get the interior of the house to look sepia and the exterior to look like all of the colors that it’s supposed to. Our visual effects team here at MPI — Mike Moser and Richie Hiltzik — was able to draw a matte for me so that I could color inside of the house independently of the exterior and make them look right, which was always a really tricky thing to do.”

Wilson referred back to the Technicolor three-strip, explaining that because you’ve got three different pieces of film — the different records — they’re receiving the light in different ways. “So sometimes one will be a little brighter than the other. One will be a little darker than the other, which means that the Technicolor is not a consistent color. It goes a little red, and then it goes a little green, and then it goes a little blue, and then it goes a little red again. So if you stop on any given frame, it’s going to look a little different than the frames around it, which is one of the tricky parts of color correcting Technicolor. When that’s being projected by a film projector, it’s less noticeable than when you’re looking at it on a video monitor, so it takes a lot of little individual corrections to smooth those kinds of things out.”

Wilson reported seeing new things with the 8K scan and 4K display. “The amount of detail that went into this film really shows up.” She said that one of the most remarkable things about the restoration was the amazing detail visible on the characters. For the first time in many generations, maybe ever, you can actually see the detail of the freckles on Dorothy’s face.

In terms of leveraging the expanded dynamic range of HDR, I asked Wilson whether she tried to map the HDR to a sweet spot, so that it’s spectacular yet not overpowering.

“I ended up isolating the very brightest parts of the picture,” she replied. “In this case, it’s mostly the sparkles on their shoes and curving those off so I could run those in, because this movie is not supposed to have modern-day animation levels of brightness. It’s supposed to be much more contained. I wanted to take advantage of brightness and the ability to show the contrast we get from this format, because you can really see the darker parts of the picture. You can really see detail within the Wicked Witch’s dress. I don’t want it to look like it’s not the same film. I want it to replicate that experience of the way this film should look if it was projected on a good print on a good projector.”

Dorothy’s ruby slippers also presented a challenge to Wilson. “They are so red and so bright. They’re so light-reflective, but there were times when they were just a little too distracting. So I had to isolate this level at the same track with slippers and bring them down a little bit so that it wasn’t the first and only thing you saw in the image.”

If you are wondering if audio was part of this most recent restoration, the answer is no, but it had been remastered for a previous version. “As early as 1929, MGM began recording its film music using multiple microphones. Those microphone angles allowed the mixer to get the most balanced monophonic mix, and they were preserved,” explained Feltenstein. “Twenty years ago, we created a 5.1 surround mix that was organically made from the original elements that were created in 1939. It is full-frequency, lossless audio, and a beautiful restoration job was made to create that track so you can improve upon what I consider to be close to perfection without anything that would be disingenuous to the production.”

In all, it was an amazing experience to go behind the scenes and see how the wizards of MPI created a new version of this masterpiece for today and preserved it for future generations.

This restored version of The Wizard of Oz is a must-see visual extravaganza, and there is no better way to see it than in UHD, HDR, Dolby Vision or HDR10+. What I saw in person took my breath away, and I hope every movie fan out there can have the opportunity to see this classic film in its never-before-seen glory.

The 4K version of The Wizard of Oz is currently available via an Ultra HD Blu-ray Combo Pack and digital.


Barry Goch is a finishing artist at LA’s The Foundation as well as a UCLA Extension Instructor, Post Production. You can follow him on Twitter at @Gochya

Sony adds 4K HDR reference monitors to Trimaster range

Sony is offering a new set of high-grade 4K HDR monitors as part of its Trimaster range. The PVM-X2400 (24-inch) and the PVM-X1800 (18.4-inch) professional 4K HDR monitors were demo’d at the BSC Expo 2020 in London. They will be available in the US starting in July.

The monitors provide ultra-high-definition resolution of 3840×2160 pixels and an all-white peak luminance of 1,000 cd/m2. For film production, their wide color gamut matches that of the BVM-HX310 Trimaster HX master monitor. This means both monitors feature accurate color reproduction and greyscale, which helps filmmakers make critical imaging decisions and maintain faithful color matching throughout the workflow.

The monitors, which are small and portable, are designed for a wide range of 4K HDR production applications, including on-set monitoring, nonlinear video editing, studio wall monitoring and rack-mount monitoring in OB trucks or machine rooms.

The monitors also feature new Black Detail High/Mid/Low, which helps maintain accurate color reproduction by reducing the brightness of the backlight to reproduce the correct colors and gradations in low-luminance areas. Another new function, Dynamic Contrast Drive, changes backlight luminance to adapt to each scene or frame when transferring images from PVM-X2400/X1800 to an existing Sony OLED monitor. This functionality allows filmmakers to check the highlight and low-light balance of the contents with both bright and dark scenes.

Other features include:
• Dynamic contrast ratio of 1,000,000:1 by Dynamic Contrast Drive, a new backlight driving system that dynamically changes the backlight luminance to adapt for each frame of a scene.
• 4K/HD waveform and vectorscope displays with HDR scales.
• Quad View display and User 3D LUT functionality.
• 12G/6G/3G/HD-SDI with auto configuration.
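For context, the quoted 1,000,000:1 dynamic contrast ratio can be translated into photographic stops, since each stop represents a doubling of luminance. A quick arithmetic check:

```python
import math

# Each photographic stop doubles luminance, so a contrast ratio of R:1
# spans log2(R) stops. For the quoted 1,000,000:1 dynamic contrast ratio:
ratio = 1_000_000
stops = math.log2(ratio)
print(round(stops, 1))  # → 19.9
```

So the dynamically driven backlight spans roughly 20 stops, though because the backlight adapts per frame, that range is spread across scenes rather than available within a single frame.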

Marriage Story director Noah Baumbach

By Iain Blair

Writer/director Noah Baumbach first made a name for himself with The Squid and the Whale, his 2005 semi-autobiographical, bittersweet story about his childhood and his parents’ divorce. It launched his career, scoring him an Oscar nomination for Best Original Screenplay.

Noah Baumbach

His latest film, Marriage Story, is also about the disintegration of a marriage — and the ugly mechanics of divorce. Detailed and emotionally complex, the film stars Scarlett Johansson and Adam Driver as the doomed couple.

In all, Marriage Story scooped up six Oscar nominations — Best Picture, Best Actress, Best Actor, Best Supporting Actress, Best Original Screenplay and Best Original Score. Laura Dern walked away with a statue for her supporting role.

The film co-stars Dern, Alan Alda and Ray Liotta. The behind-the-scenes team includes director of photography Robbie Ryan, editor Jennifer Lame and composer Randy Newman.

Just a few days before the Oscars, Baumbach — whose credits also include The Meyerowitz Stories, Frances Ha and Margot at the Wedding — talked to me about making the film and his workflow.

What sort of film did you set out to make?
It’s obviously about a marriage and divorce, but I never really think about a project in specific terms, like a genre or a tone. In the past, I may have started a project thinking it was a comedy but then it morphs into something else. With this, I just tried to tell the story as I initially conceived it, and then as I discovered it along the way. While I didn’t think about tone in any general sense, I became aware as I worked on it that it had all these different tones and genre elements. It had this flexibility, and I just stayed open to all those and followed them.

I heard that you were discussing this with Adam Driver and Scarlett Johansson as you wrote the script. Is that true?
Yes, but it wasn’t daily. I’d reached out to both of them before I began writing it, and luckily they were both enthusiastic and wanted to do it, so I had them as an inspiration and guide as I wrote. Periodically, we’d get together and discuss it and I’d show them some pages to keep them in the loop. They were very generous with conversations about their own lives, their characters. My hope was that when I gave them the finished script it would feel both new and familiar.

What did they bring to the roles?
They were so prepared and helped push for the truth in every scene. Their involvement from the very start did influence how I wrote their roles. Nicole has that long monologue and I don’t know if I’d have written it without Scarlett’s input and knowing it was her. Adam singing “Being Alive” came out of some conversations with him. They’re very specific elements that come from knowing them as people.

You reunited with Irish DP Robbie Ryan, who shot The Meyerowitz Stories. Talk about how you collaborated on the look and why you shot on film?
I grew up with film and feel it’s just the right medium for me. We shot The Meyerowitz Stories on Super 16, and we shot this on 35mm, and we had to deal with all these office spaces and white rooms, so we knew there’d be all these variations on white. So there was a lot of discussion about shades and the palette, along with the production and costume designers, and also how we were going to shoot these confined spaces, because it was what the story required.

You shot on location in New York and LA. How tough was the shoot?
It was challenging, but mainly because of the sheer length of many of the scenes. There’s a lot of choreography in them, and some are quite emotional, so everyone had to really be up for the day, every day. There was no taking it easy one day. Every day felt important for the movie.

Where did you do the post?
All in New York. I have an office in the Village where I cut my last two films, and we edited there again. We mixed on the Warner stage, where I’ve mixed most of my movies. We recorded the music and orchestra in LA.

Do you like the post process?
I really love it. It’s the most fun and the most civilized part of the whole process. You go to work and work on the film all day, have dinner and go home. Writing is always a big challenge, as you’re making it up as you go along, and it can be quite agonizing. Shooting can be fun, but it’s also very stressful trying to get everything you need. I love working with the actors and crew, but you need a high level of energy and endurance to get through it. So then post is where you can finally relax, and while problems and challenges always arise, you can take time to solve them. I love editing, the whole rhythm of it, the logic of it.


Talk about editing with Jennifer Lame. How did that work?
We work so well together, and our process really starts in the script stage. I’ll give her an early draft to get her feedback and, basically, we start editing the script. We’ll go through it and take out anything we know we’re not going to use. Then during the shoot she’ll sometimes come to the set, and we’ll also talk twice a day. We’ll discuss the day’s work before I start, and then at lunch we’ll go over the previous day’s dailies. So by the time we sit down to edit, we’re really in sync about the whole movie. I don’t work off an assembly, so she’ll put together stuff for herself to let me know a scene is working the way we designed it. If there’s a problem, she’ll let me know what we need.

What were the big editing challenges?
Besides the general challenges of getting a scene right, I think for some of the longer ones it was all about finding the right rhythm and pacing. And it was particularly true of this film that the pace of something early on could really affect something later. Then you have to fix the earlier bit first, and sometimes it’s the scene right before. For instance, the scene where Charlie and Nicole have a big argument that turns into a very emotional fight is really informed by the courtroom scene right before it. So we couldn’t get it right until we’d got the courtroom scene right.

A lot of directors do test screenings. Do you?
No, I have people I show it to and get feedback, but I’ve never felt the need for testing.

VFX play a role. What was involved?
The Artery did them. For instance, when Adam cuts his arm we used VFX in addition to the practical effects, and then there’s always cleanup.

Talk about the importance of sound to you as a filmmaker, as it often gets overlooked in this kind of film.
I’m glad you said that because that’s so true, and this doesn’t have obvious sound effects. But the sound design is quite intricate, and Chris Scarabosio (working out of Skywalker Sound), who did Star Wars, did the sound design and mix; he was terrific.

A lot of it was taking the real-world environments in New York and LA and building on that, and maybe taking some sounds out and playing around with all the elements. We spent a lot of time on it, as both the sound and image should be unnoticed in this. If you start thinking, “That’s a cool shot or sound effect,” it takes you out of the movie. Both have to be emotionally correct at all times.

Where did you do the DI and how important is it to you?
We did it at New York’s Harbor Post with colorist Marcy Robinson, who’s done several of my films. It’s very important, but we didn’t do anything too extreme, as there’s not a lot of leeway for changing the look that much. I’m very happy with the look and the way it all turned out.

Congratulations on all the Oscar noms. How important is that for a film like this?
It’s a great honor. We’re all still the kids who grew up watching movies and the Oscars, so it’s a very cool thing. I’m thrilled.

What’s next?
I don’t know. I just started writing, but nothing specific yet.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Visible Studios produces, posts Dance Monkey music video

If you haven’t heard the song “Dance Monkey” by Tones and I, you soon will. Australia’s Visible Studios provided production and post on the video for the song, which has hit number one in more than 30 countries, gone seven-times platinum and remained at the top of the Australian charts for 22 weeks. The video has been viewed on YouTube more than half a billion times.

Visible Studios, a full production and post company, is run by producer Tim Whiting and director and editor Nick Kozakis. The company features a team of directors, scriptwriters, designers, motion graphic artists and editors working on films, TV commercials and music videos.

For Dance Monkey, Visible Studios worked directly with Tones and I to develop the idea for the video. The video, which was shot on Red cameras at the beginning of the song’s meteoric rise, was completed in less than a week and on a small budget.

“The Dance Monkey music video was made on an extremely quick turnaround,” says Whiting. “[Tones] was blowing up at the time, and they needed the music video out fast. The video was shot in one day, edited in two, with an extra day and a half for color and VFX.” Visible Studios called on Blackmagic DaVinci Resolve Studio for edit, VFX and color.

Dance Monkey features the singer dressed as Old Tones, an elderly man whisked away by his friends to a golf course to dance and party. On the day of production, the sun was nowhere to be found, and each shot was done against a gray and dismal background. To fix this, the team brought in a sky image as a matte and used Resolve’s match move tool, keyer, lens blur and power windows to turn gray footage to brilliant sunshine.

“In post we decided to replace the overcast skies with a cloudy blue sky. We ended up doing this all in Resolve’s color page and keyed the grass and plants to make them more lush, and we were there,” says Whiting.
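The gray-to-sunshine fix Whiting describes amounts to keying the flat overcast sky and compositing a blue-sky plate through the resulting matte. The idea can be sketched outside Resolve with a simple luma-keyed matte; this is a toy numpy sketch with hypothetical values, not the actual tools or footage.

```python
import numpy as np

# Toy sketch of the sky-replacement idea (not Resolve's actual keyer):
# key the bright, nearly colorless "overcast" pixels, then composite a
# blue-sky plate through the resulting matte.
frame = np.full((4, 4, 3), 0.7)   # flat gray overcast frame (hypothetical)
sky_plate = np.zeros((4, 4, 3))
sky_plate[..., 2] = 0.9           # blue-sky stand-in

luma = frame.mean(axis=-1)
saturation = frame.max(axis=-1) - frame.min(axis=-1)
# Matte is 1.0 where the frame is bright and desaturated (i.e. gray sky).
matte = ((luma > 0.6) & (saturation < 0.1)).astype(float)[..., None]

# Standard over composite: sky shows through where the matte is 1.0.
composite = frame * (1 - matte) + sky_plate * matte
```

In the video itself the matte also had to follow camera movement, which is where the tracking tools Whiting mentions came in.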

Editor/directors Kozakis and Liam Kelly used Resolve for the entire editing process. “Being able to edit 6K raw footage smoothly on a 4K timeline, at a good quality debayer, means that we don’t have to mess around with proxies and that the footage gets out of the way of the editing process. The recent update for decompression and debayer on Nvidia cards has made this performance even better,” Kozakis says.

 

Review: Neat Video 5 noise reduction plugin

By Brady Betzel

One of the best (and most underrated) tricks in an online editor’s tool kit is to have good image restoration techniques. Removing digital video imperfections — from flicker to digital video noise — is not easy, and not easy to do well. That is, unless you have good noise reduction software like Neat Video.

While Neat Video might not be that well-known, once you see how simply (or intricately) Neat Video 5 works inside of apps like Blackmagic’s DaVinci Resolve, it will be hard to forget the company’s name.

(While the software was recently updated to 5.1.5 — with expanded GPU support as well as support for new versions of Resolve, Adobe and Nuke — nothing really changes for this review. You can check out a detailed list of the updates here.)

Neat Video 5 is a noise reduction plugin. In a Windows OS environment, Neat Video is compatible with apps like Adobe After Effects, Adobe Premiere Pro, DaVinci Resolve, Avid Media Composer, Vegas, Magix, Edius, Virtual Dub, and the OFX-compatible apps Nuke, Fusion, Scratch, HitFilm, Mamba, Natron, Flame, Baselight and DustBuster. In a macOS environment, Neat Video 5 is compatible with After Effects, Premiere, Final Cut Pro X, Motion 5, OFX, Resolve and Media Composer. In Linux, the software is compatible with OFX-compatible apps and Resolve.

Neat Video 5 comes in three flavors: Demo, Home and Pro. The Demo version works in up to 1280×720 resolution with a watermark. Home is literally made for the home user: It will process video up to 1920×1080 resolutions, it will use up to one GPU, and it is for non-commercial use. The cost is just $74.90 for most apps (Resolve is $89.90). The Pro version has no resolution restrictions, will work on two or more GPUs simultaneously, and can be used commercially. The Pro version starts at $129.90 per app ($159.90 for Resolve). Because Neat Video 5 for OFX works with so many apps, it only comes in Pro ($249.90) and Studio ($349.90) versions. The Studio version adds the ability for a floating license. You can see all of the pricing details here.

If there is one line you should take away from this review, it is this: Neat Video 5 is by far the easiest and best noise reduction software I have used in any application to date. And while this review is focusing on the Resolve version of Neat Video 5, all other apps work in much the same way. You can find Neat Video’s software-specific Quick Start Guides to help. Once you install and register your Neat Video 5 license, removing digital video noise is as easy as applying Neat Video 5 to a node in the color tab, clicking on “Prepare Noise Profile,” clicking on “Auto Profile,” and clicking “Apply.” Then, unless you want to fine-tune your noise reduction, you are done. Obviously, I have somewhat simplified how Neat Video 5 works, but essentially it can be done in as little as three steps per clip, and the results are typically amazing. If they aren’t amazing, you can jump back into Neat Video 5 and manually adjust specifics until the noise reduction looks correct. But I will say that in about 90% of cases, the Auto Profiling will do all of the noise reduction work necessary.

For tinkerers, or for those who need to go far beyond an Auto Profile, you can manually adjust your settings. But taking a step back, Neat Video needs an area of your image that has a uniform color and noise profile to process how it removes noise. The automatic profiling will do its best to find an area, but it doesn’t always work. What you need to keep in mind when building a good noise profile inside of Neat Video is that the area being processed needs to be as uniform as possible (think dark night sky or a wall painted in one color) — meaning no prominent features, a high noise level (something in the high four area is better), the largest possible sample area and no warnings from Neat Video.

So, if your automatic profile doesn’t do the job, you can find an area of your image that meets the above requirements and then build a profile. From there you can use one of the Neat Video 5 features, like “Profile Check.” Profile Check will highlight details that aren’t being affected by Neat Video, giving you a clear representation of what noise is being reduced and whether you need to adjust your profile to better reduce video noise.

At this point you might be wondering where you tweak advanced settings. When you load Neat Video, you will be in Beginner mode. To get into Advanced mode, go to the Tools menu, where you will see a lot of advanced functions that can help you fine-tune your noise profile. And if you still can’t get a good noise reduction profile, you can try out the “Generic Profile,” which can help you build a profile even if your video doesn’t have a large enough area of uniform noise. There are also presets — such as light flicker, moiré flicker, repeat frame issues, dust and scratch filters (including scan lines), jitter of details, artifact removal filter and more — that can solve certain problems.

Neat Video 5 is faster than previous generations. As in previous versions, there is even a tool inside of Neat Video’s preferences that will run your CPU and GPU through a benchmark to determine whether you should run on CPU only, GPU only or a combination of both. In Neat Video 5, if you have trouble with a clip, you can use up to four “Variants” of noise reduction in the new playback window to see how each profile works with your clip.

In terms of playback and rendering, noise reduction is never fast. However, inside of Neat Video the new playback window will typically play back your footage to preview the noise reduction before you jump back into Resolve. Inside of Resolve, even in just 1080p, my sequence would crawl to just a few frames of playback per second. It is one of the most processor- and GPU-intensive tasks you will run on your computer.

In my testing I applied Neat Video 5 to the first node in my color correction tree, followed by a basic color correction in a one-minute timeline. I took those same clips and compared my Neat Video results to Resolve’s Temporal and Spatial noise reduction tools. In terms of visual results, Neat Video 5 was superior. If that’s not the case for you, then jump into YCbCr viewer mode inside of Neat Video 5, isolate each channel and tweak each channel individually so you won’t affect your overall noise reduction if it isn’t necessary. Not only did Neat Video 5 handle normal noise in the shadows well but on clips with very tight lines, it was able to keep a lot of the details while removing the noise. Resolve’s noise reduction tools had a harder time removing noise but keeping detail. Temporal noise reduction really didn’t do much, and while Spatial noise reduction did work it would heavily blur and distort the image — essentially not acceptable.

To get a good example of how Neat Video 5 slams a computer system, I exported 1080p MP4s. Resolve’s built-in Temporal noise reduction took 1:03, while the Spatial noise reduction took 1:05. The Neat Video 5 render of the same one-minute timeline took 3:51 — almost four times as long. I was curious how much longer a 4K render would take. Using 4K (UHD) media, I applied a simple color correction, with Neat Video 5 on a prior serial node. Exporting a 4K (UHD) MP4 took 52 seconds without Neat Video 5 applied and 16:27 with it — nearly 19 times the render time! So while Neat Video 5 is an amazing tool, there is a trade-off in high render times.
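As a sanity check on those trade-offs, the reported times work out as follows (simple arithmetic on the figures above):

```python
# Render times reported in the review, converted to seconds.
temporal_1080 = 1 * 60 + 3    # 1:03, Resolve temporal NR
spatial_1080 = 1 * 60 + 5     # 1:05, Resolve spatial NR
neat_1080 = 3 * 60 + 51       # 3:51, Neat Video 5
plain_4k = 52                 # 0:52, 4K export without noise reduction
neat_4k = 16 * 60 + 27        # 16:27, 4K export with Neat Video 5

print(round(neat_1080 / temporal_1080, 1))  # → 3.7 (almost four times)
print(round(neat_4k / plain_4k, 1))         # → 19.0
```

From this one test, budgeting roughly a 4x render penalty at 1080p and closer to 19x at UHD is a reasonable rule of thumb.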

To find additional training on more advanced noise reduction techniques in Neat Video, check out the video tutorials. I find myself watching these just because of how much you can learn about noise reduction in general. They aren’t as exciting as watching Game of Thrones or The Handmaid’s Tale, but they will push your knowledge in noise reduction to the next level.

Summing Up
I’ve used Neat Video for a while, so when I was approached to review Version 5 I immediately said yes. Noise reduction is a post skill that not many possess.

If you are an online editor or colorist looking to separate yourself from the pack, learn all the noise reduction techniques you can and definitely check out Neat Video 5. Not only can Neat Video 5 work automatically, but you can fine-tune your noise reduction as much as you want.

And when demoing your color correction services, think about using Neat Video 5 to remove camera noise, flickering and chroma issues; color correcting your footage; and, finally, adding some grain back into your shot. Not only will your footage look better, but you’ll have a technical workflow that will definitely impress clients. Just don’t forget to account for the extra render time.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Editor David Cea joins Chicago’s Optimus  

Chicago-based production and post house Optimus has added editor David Cea to its roster. With 15 years of experience in New York and Chicago, Cea brings a varied portfolio of commercial editing experience to Optimus.

Cea has cut spots for brands such as Bank of America, Chevrolet, Exxon, Jeep, Hallmark, McDonald’s, Microsoft and Target. He has partnered with many agencies, including BBDO, Commonwealth, DDB, Digitas, Hill Holliday, Leo Burnett, Mother and Saatchi & Saatchi.

“I grew up watching movies with my dad and knew I wanted to be a part of that magical process in some way,” explains Cea. “The combination of Goodfellas and Monty Python gave me all the fuel I needed to start my film journey. It wasn’t until I took an editing class in college that I discovered the part of filmmaking I wanted to pursue. The editor is the one who gets to shape the final product and bring out the true soul of the footage.”

After studying film at Long Island’s Hofstra University, Cea met Optimus editor and partner Angelo Valencia while working as his assistant at Whitehouse New York in 2005. Cea then moved on to hone his craft further at Cosmo Street in New York. Chicago became home for him in 2013 as he spent three years at Whitehouse. After heading back east for a couple of years, he returned to Chicago to put down roots.

While Avid Media Composer is Cea’s go-to choice for editing, he is also proficient in Adobe Premiere.

Colorist Chat: Light Iron supervising colorist Ian Vertovec

“As colorists, we are not just responsible for enhancing each individual shot based on the vision of the filmmakers, but also for helping to visually construct an emotional arc over time.”

NAME: Ian Vertovec

TITLE: Supervising Colorist

COMPANY: Light Iron

CAN YOU DESCRIBE YOUR ROLE IN THE COMPANY?
A Hollywood-based collaborator for motion picture finishing, with a studio in New York City as well.

GLOW

AS A COLORIST, WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
As colorists, we are not just responsible for enhancing each individual shot based on the vision of the filmmakers, but also for helping to visually construct an emotional arc over time. For example, a warm scene feels warmer coming out of a cool scene as opposed to another warm scene. We have the ability and responsibility to nudge the audience emotionally over the course of the film. Using color in this way makes color grading a bit like a cross between photography and editing.

ARE YOU SOMETIMES ASKED TO DO MORE THAN JUST COLOR ON PROJECTS?
Once in a while, I’ll be asked to change the color of an object, like change a red dress to blue or a white car to black. While we do have remarkable tools at our disposal, this isn’t quite the correct way to think about what we can do. Instead of being able to change the color of objects, it’s more like we can change the color of the light shining on objects. So instead of being able to turn a red dress to blue, I can change the light on the dress (and only the dress) to be blue. So while the dress will appear blue, it will not look exactly how a naturally blue dress would look under white light.

WHAT’S YOUR FAVORITE PART OF THE JOB?
There is a moment with new directors, after watching the first finished scene, when they realize they have made a gorgeous-looking movie. It’s their first real movie, which they never fully saw until that moment — on the big screen, crystal clear and polished — and it finally looks how they envisioned it. They are genuinely proud of what they’ve done, as well as appreciative of what you brought out in their work. It’s an authentic filmmaking moment.

WHAT’S YOUR LEAST FAVORITE?
Working on multiple jobs at a time and long days can be very, very draining. It’s important to take regular breaks to rest your eyes.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Something with photography, VFX or design, maybe.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I was doing image manipulation in high school and college before I even knew what color grading was.

Just Mercy

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Just Mercy, Murder Mystery, GLOW, What We Do in the Shadows and Too Old to Die Young.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
Sometimes your perspective and a filmmaker’s perspective for a color grade can be quite divergent. There can be a temptation to take the easy way and either defer or overrule. I find tremendous value in actually working out those differences and seeing where and why you are having a difference of opinion.

It can be a little scary, as nobody wants to be perceived as confrontational, but if you can civilly explain where and why you see a different approach, the result will almost always be better than what either of you thought possible in the first place. It also allows you to work more closely and understand each other’s creative instincts more accurately. Those are the moments I am most proud of — when we worked through an awkward discord and built something better.

WHERE DO YOU FIND INSPIRATION?
I have a fairly extensive library of Pinterest boards — mostly paintings — but it’s real life and being in the moment that I find more interesting. The color of a green leaf at night under a sodium vapor light, or how sunlight gets twisted by a plastic water bottle — that is what I find so cool. Why ruin that with an Insta post?

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
FilmLight Baselight’s Base Grade, FilmLight Baselight’s Texture Equalizer and my Red Hydrogen.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Instagram mostly.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
After working all day on a film, I often don’t feel like watching another movie when I get home because I’ll just be thinking about the color. I usually unwind with a video game, book or podcast. The great thing about a book or a video game is that it demands 100% of your attention. You can’t be simultaneously browsing social media or the news or thinking about work. You have to be 100% in the moment, and it really resets your brain.

Quick Chat: Director Sorrel Brae on Rocket Mortgage campaign

By Randi Altman

Production company Native Content and director Sorrel Brae have collaborated once again with Rocket Mortgage’s in-house creative team on two new spots in the ongoing “More Than a House” campaign. Brae and Native had worked together on the campaign’s first four offerings.

The most recent spots are More Than a Tradition and More Than a Bear. More Than a Tradition shows a ‘50s family sitting down to dinner and having a fun time at home. Then the audience sees the same family in modern times, hammering home how traditions become traditions.

More Than a Bear combines fantasy and reality as it shows a human-sized teddy bear on an operating table. Then viewers see a worried boy looking on as his mother repairs his stuffed animal. Each spot opens with the notes of Bob Dylan’s “The Man In Me,” which is featured in all the “More Than a House” spots.

More Than a Bear was challenging, according to Brae, because there was some darker material in this piece compared to the others — viewers aren’t sure at first if the bear will make it. Brae worked closely with DP Jeff Kim on the lighting and color palette to find a way to keep the tone lighthearted. By embracing primary colors, the two were able to channel a moodier tone and bring viewers inside a scared child’s imagination while still maintaining some playfulness.

We reached out to director Brae to find out more.

Sorrel Brae

What did you shoot these two spots on, and why?
I felt that in order for the comedy to land and the idea to shine, the visual separation between fantasy and reality had to be immediate, even shocking. Shooting on an Alexa Mini, we used different lenses for the two looks: Hawk V-Lite Vintage ’74 anamorphic for epic and cinematic fantasy, and spherical Zeiss and Cooke S4 primes for reality. The notable exception was in the hospital for the teddy bear spot, where our references were the great Spielberg and Zemeckis films from the ‘80s, which are primarily spherical and have a warmer, friendlier feeling.

How did you work with the DP and the colorist on the look? And how would you describe the look of each spot, and the looks within each spot? 
I was fortunate to bring on longtime collaborators DP Jeffrey Kim and colorist Mike Howell for both spots. Over the years, Jeff and I have developed a shorthand for working together. It all starts with defining our intention and deciding how to give the audience the feelings we want them to have.

In Tradition, for example, that feeling is a warm nostalgia for a bygone era that was probably a fantasy then, just as it is now. We looked to period print advertisements, photographs, color schemes, fonts — everything that spoke to that period. Crucial to pulling off both looks in one day was Heidi Adams’ production design. I wanted the architecture of the house to match when cutting between time periods. Her team had to put a contemporary skin on a 1950s interior for us to shoot “reality” and then quickly reset the entire house back to 1950s to shoot “fantasy.”

The intention for More Than a Bear was trickier. From the beginning I worried a cinematic treatment of a traumatic hospital scene wouldn’t match the tone of the campaign. My solution with Jeff was to lean into the look of ‘80s fantasy films like E.T. and Back to the Future with primary colors, gelled lights, a continuously moving camera and tons of atmosphere.

Mike at Color Collective even added a retro Ektachrome film emulation for the hospital and a discontinued Kodak 5287 emulation for the bedroom to complete the look. But the most fun was the custom bear that costume designer Bex Crofton-Atkins created for the scene. My only regret is that the spot isn’t 60 seconds because there’s so much great bear footage that we couldn’t fit into the cut.

What was this edited on? Did you work with the same team on both campaigns?
The first four spots of this campaign were cut by Jai Shukla out of Nomad Edit. Jai did great work establishing the rhythm between fantasy and reality and figuring out how to weave in Bob Dylan’s memorable track for the strongest impact. I’m pretty sure Jai cuts on Avid, which I like to tease him about.

These most recent two spots (Tradition and Teddy Bear) were cut by Zach DuFresne out of Hudson Edit, who did an excellent job navigating scripts with slightly different challenges. Teddy Bear has more character story than any of the others, and Tradition relies heavily on making the right match between time periods. Zach cuts on Premiere, which I’ve also migrated to (from FCP 7) for personal use.

Were any scenes more challenging than the others?
What could be difficult about kids, complex set design, elaborate wardrobe changes and detailed camera moves on a compressed schedule? In truth, it was all equally challenging and rewarding.

Ironically, the shots that gave us the most difficulty probably look the simplest. In Tradition there’s a SteadiCam move that introduces us into the contemporary world, has match cuts on either end and travels through most of the set and across most of the cast. Because everyone’s movements had to perfectly align with a non-repeatable camera, that one took longer than expected.

And on Teddy Bear, the simple shot looking up from the patient’s POV as the doctor/mom looms overhead was surprisingly difficult. Because we were on an extremely wide lens (12mm or similar), our actress had to nail her marks down to the millimeter, otherwise it looked weird. We probably shot that one setup 20 times.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Review: FilmConvert Nitrate for film stock emulation

By Brady Betzel

If you’ve been around any sort of color grading forums or conferences, you’ve definitely heard some version of this: Film is so much better than digital. While I don’t completely disagree with the sentiment, let’s be real. We are in a digital age, and the efficiency and cost associated with digital recording is, in most cases, far superior to film.

Personally, I love the way film looks; it has an essence that is very difficult to duplicate — from the highlight roll-offs to the organic grain — but shooting on film is very costly. That look is also hard to imitate digitally, which is why so many companies try and often fail.

Sony A7iii footage

One company that has had grassroots success with digital film stock emulation is FilmConvert. The original plugin, known as FilmConvert Pro, works with Adobe’s Premiere and After Effects, Avid Media Composer and as an OFX plugin for apps like Blackmagic’s DaVinci Resolve.

Recently, FilmConvert expanded its lineup with the introduction of Nitrate, a film emulation plugin that can take Log-based video and transform it into full color corrected media with a natural grain similar to that of commonly loved film stocks. Currently, Nitrate works with Premiere and After Effects, with an OFX version for Resolve. A plugin for FCPX is coming in March.

The original FilmConvert Pro plugin works great, but it adjusts your image through an sRGB pipeline. That means FilmConvert Pro applies any color effects after your “base” grade is locked in, while living in an sRGB world. While you can download camera-specific “packs” that apply the film emulation — custom-made for your sensor and color space — you are still locked into an sRGB pipeline, with little wiggle room. This can mean blown-out highlights and muddy shadows, with little ability to recover any detail.

Sony A7iii footage

I imagine FilmConvert Pro was introduced at a time when a lot of users shot with cameras like the Canon 5D or other sRGB cameras that weren’t shooting in a Log color space. Think of using a LUT and trying to adjust the highlights and shadows after the LUT; typically, you will have a hard time getting any detail back, losing dynamic range even if your footage was shot Log. But if you color before a LUT (think Log footage), you can typically recover a lot of information as long as your shot was recorded properly. That blown-out sky might be recoverable if shot in a Log color space. This is what FilmConvert is solving with its latest offering, Nitrate.
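The ordering argument above can be sketched numerically. This is a toy model — not FilmConvert’s or any camera maker’s actual math — using a made-up log curve and a naive sRGB-style encode, just to show why pulling exposure down after a clipping display transform cannot recover a highlight, while doing it in Log space can:

```python
import math

def to_srgb(linear):
    """Naive sRGB-style display encode: linear values above 1.0 simply clip."""
    return min(max(linear, 0.0), 1.0) ** (1 / 2.2)

def log_encode(linear):
    """Toy log curve standing in for an SLog3/BRaw-style encode."""
    return math.log2(linear + 1.0) / 4.0  # squeezes a wide range into 0..1

def log_decode(encoded):
    return 2.0 ** (encoded * 4.0) - 1.0

bright_sky = 3.0   # a highlight well above display white
stop_down  = 0.25  # pull exposure down two stops in the grade

# Grading AFTER the sRGB transform: the sky already clipped to 1.0,
# so scaling it just gives a featureless darker value — detail is gone.
after = to_srgb(bright_sky) * stop_down

# Grading BEFORE the display transform: the sky's real value survives the
# log encode/decode round trip, so pulling exposure first recovers it.
before = to_srgb(log_decode(log_encode(bright_sky)) * stop_down)

print(after, before)  # 'before' lands near display white with real detail
```

Same two operations, opposite order — which is the whole point of Nitrate doing its emulation in a Log pipeline rather than after an sRGB conversion.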

How It Works
FilmConvert’s Nitrate works in a Cineon-Log processing pipeline for its emulation, as well as a full Log image processing pipeline. This means your highlights and shadows are not being heavily compressed into an sRGB color space, which allows you to fine-tune your shadows and highlights without losing as much detail. Simply, it means that the plugin will work more naturally with your footage.

In additional updates, FilmConvert has overhauled its GUI to be more natural and fluid. The Color Wheels have been redesigned, a new color tint slider has been added to quickly remove any green or magenta color cast, a new Color Curve control has been added, and there is now a Grain Response curve.

Grain Response

The Grain Response curve takes adding grain to your footage up a notch. Not only can you select between 8mm and 35mm grain sizes (with many more in between) but you can adjust the application of that grain from shadows to highlights. If you want your highlights to have more grain response, just point the Grain Response curve higher up. In the same window you can adjust the grain size, softness, strength and saturation via sliders.
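The idea of a grain-response curve can be illustrated with a short sketch. This is assumed behavior, not FilmConvert’s implementation: grain strength is looked up from a user-shaped curve keyed on pixel luminance, so raising the right-hand side of the curve (as in “pointing it higher up”) puts more grain in the highlights:

```python
import random

# Hypothetical control points for the response curve: (luminance, strength).
# Raising the points near luminance 1.0 biases grain toward the highlights.
curve = [(0.0, 0.2), (0.5, 0.5), (1.0, 1.0)]

def response(luma):
    """Linearly interpolate grain strength from the curve's control points."""
    for (x0, y0), (x1, y1) in zip(curve, curve[1:]):
        if x0 <= luma <= x1:
            t = (luma - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    return curve[-1][1]

def add_grain(pixels, strength=0.05, seed=7):
    """Add zero-mean noise to each pixel, scaled by its response value."""
    rng = random.Random(seed)
    return [min(max(p + rng.uniform(-1, 1) * strength * response(p), 0.0), 1.0)
            for p in pixels]

print(add_grain([0.1, 0.5, 0.9]))  # grain strength scales with luminance
```

A real plugin would also shape grain size and softness (e.g., by filtering the noise), but the curve lookup is the part the Grain Response control exposes.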

There are many unique and great-looking presets among the 19 film emulation options, from the “KD 5207 Vis3” to the “Plrd 600,” covering multiple brands and film stocks. For instance, the stock behind “KD 5207 Vis3” — Kodak Vision3 250D (5207) — is described on Kodak’s website in more detail:

“Vision3 250D Film offers outstanding performance in the extremes of exposure — including increased highlight latitude, so you can move faster on the set and pull more detail out of the highlights in post. You’ll also see reduced grain in shadows, so you can push the boundaries of underexposure and still get outstanding results.”

One of my favorite emulations in Nitrate — “Fj Velvia 100” or Fujichrome Velvia 100 — is described on FilmConvert’s website:

“FJ Velvia 100 is based on the Fujichrome Velvia 100 photographic film stock. Velvia is a daylight-balanced color reversal film that provides brighter ultra-high-saturation color reproduction. The Velvia is especially suited to scenery and nature photography as well as other subjects that require precisely modulated vibrant color reproduction.”

Accurate Grain

FilmConvert’s website offers a full list of the 19 film stocks, as well as examples and detailed descriptions of each film stock.

Working With FilmConvert Nitrate
I used Nitrate strictly in Premiere Pro because the OFX version (specifically for Resolve) wasn’t available at the time of this review.

Nitrate works pretty well inside of Premiere, and surprisingly plays back fluidly — this is probably thanks to its GPU acceleration. Even with Sony a7 III UHD footage, Premiere was able to keep up with Lumetri Color layered underneath the FilmConvert Nitrate plugin. To be transparent I tested Nitrate on a laptop with an Intel i7 CPU and an Nvidia RTX 2080 GPU, so that definitely helps.

At first, I struggled to see where I would fit FilmConvert’s Nitrate plugin into my normal workflow so I could color correct my own footage and add a grade later. However, when I started cycling through the different film emulations, I quickly saw that they were adding a lot of life to the images and videos. Whether it was the grain that comes from the updated 6K grain scans in Nitrate or the ability to identify which camera and color profile you used when filming via the downloadable camera packs, FilmConvert’s Nitrate takes well-colored footage and elevates it to finished film levels.

It’s pretty remarkable; I came in thinking FilmConvert was essentially a preset LUT plugin and wasn’t ready for it to be great. To my surprise, it was great and it will add the extra edge of professional feeling to your footage quickly and easily.

Test 1
In my first test, I threw some clips I had shot on a Sony a7 III camera in UHD (at SLog3 — SGamut3) into a timeline, applied the FilmConvert Nitrate plugin and realized I needed to download the Sony camera packs. This pack was about 1GB, but others — like the Canon 5D Mark II — came in at just over 300MB. Not the end of the world, but if you have multiple cameras, you are going to need to download quite a few packs, and the download sizes will add up.

Canon 5D

I tried using just the Nitrate plugin to do color correction and film emulation from start to finish, but I found the tools a little cumbersome and not really my style. I am not the biggest fan of Lumetri color correction tools, but I used them to get a base grade and applied Nitrate over that grade. I tend to keep looks on their own layer, so coloring under Nitrate felt more natural to me.

A quick way to cycle through a bunch of looks is to apply Nitrate to an adjustment layer and hit the up or down arrows. As I was flicking through the different looks, I noticed that FilmConvert does a great job processing the film emulations for the specified camera. All of the emulations looked good with or without a color balance done ahead of time.

It’s like adding a LUT and then a grade all in one spot. I was impressed by how quickly this worked and how good they all looked. When I was done, I rendered my one-minute sequence out of Adobe Media Encoder, which took 45 seconds to encode a ProResHQ and 57 seconds for an H.264 at 10Mb/s. For reference, the uncolored version of this sequence took 1:17 for the ProResHQ and :56 for the H.264 at 10Mb/s. Interesting, because the Nvidia RTX 2080 GPU definitely kicked in more when the FilmConvert Nitrate effect was added. That’s a definite plus.

Test 2
I also shot some clips using the Blackmagic Pocket Cinema Camera (BMPCC) and the Canon 5D Mark II. With the BMPCC, I recorded CinemaDNG files in the film color space, essentially Log. With the 5D, the videos were recorded as H.264 QuickTime movie files (unless you shoot with the Magic Lantern hack, which allows you to record in raw). I brought in the BMPCC CinemaDNG files via the Media Browser, imported the 5D MOVs and applied the FilmConvert Nitrate plugin to the clips. Keep in mind you will need to download and install those camera packs if you haven’t already.

Pocket Cinema Camera

For the BMPCC clips I identified the camera and model as appropriate and chose “Film” under profile. It seemed to turn my CinemaDNG files a bit too orange, which could have been my white balance settings and/or the CinemaDNG processing done by Premiere. I could swing the orange hue out by using the temperature control, but it seemed odd to have to knock it down to -40 or -50 for each clip. Maybe it was a fluke, but with some experimentation I got it right.

With the Canon 5D Mark II footage, I chose the corresponding manufacturer and model as well as the “Standard” profile. This worked as it should. But I also noticed some other options like Prolost, Marvel, VisionTech, Technicolor, Flaat and Vision Color — these are essentially color profiles people have made for the 5D Mark II. You can find them with a quick Google search.

Summing Up
In the end, FilmConvert’s Nitrate will elevate your footage. The grain looks smooth and natural, the colors in the film emulations add a modern take on nostalgic color corrections (without looking too cheesy), and most cameras are supported via downloads. If you don’t have a large budget for a color grading session, you should be throwing $139 at FilmConvert for its Nitrate plugin.

Nitrate in Premiere

When testing Nitrate on a few different cameras, I noticed that it even made color matching between cameras a little bit more consistent. Even if you have a budget for color grading, I would still suggest buying Nitrate; it can be a great starting block to send to your colorist for inspiration.

Check out FilmConvert’s website and definitely follow them on Instagram, where they are very active and show a lot of before-and-afters from their users — another great source of inspiration.

Main Image: Two-year-old Oliver Betzel shot with a Canon 5D with KD P400 Ptra emulsion applied.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producer’s Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Talking with Franki Ashiruka of Nairobi’s Africa Post Office

By Randi Altman

After two decades of editing award-winning film and television projects for media companies throughout Kenya and around the world, Franki Ashiruka opened Africa Post Office, a standalone post house in Nairobi, Kenya. The studio provides color grading, animation, visual effects, motion graphics, compositing and more. In addition, it maintains a database of the Kenyan post production community that allows it to ramp up with the right artists when the need arises.

Here she talks about the company, its workflow and being a pioneer in Nairobi’s production industry.

When did you open Africa Post Office, and what was your background prior to starting this studio?
Africa Post Office (APO) opened its doors in February 2017. Prior to starting APO, I was a freelance editor with plenty of experience working with well-established media houses such as Channel 4 (UK), Fox International Channels (UK), 3D Global Leadership (Nigeria), PBS (USA), Touchdown (New Zealand), Greenstone Pictures (New Zealand) and Shadow Films (South Africa).

In terms of Kenya-based projects, I’ve worked with a number of production houses including Quite Bright Films, Fat Rain Films, Film Crew in Africa, Mojo Productions, Multichoice, Zuku, Content House and Ginger Ink Films.

I imagine female-run, independent studios in Africa are rare?
On the contrary, Kenya has reached a point where more and more women are emerging as leaders of their own companies. I actually think there are more women-led film production companies than male-led. The real challenge was that before APO, there was nothing quite like it in Nairobi. Historically, video production here was very vertical — if you shot something, you’d need to also manage post within whatever production house you were working in. There were no standalone post houses until us. That said, with my experience, even though hugely daunting, I never thought twice about starting APO. It is what I have always wanted to do, and if being the first company of our kind didn’t intimidate me, being female was never going to be a hindrance.

L-R: Franki Ashiruka, Kevin Kyalo, Carole Kinyua and Evans Wenani

What is the production and post industry like in Nairobi? 
When APO first opened, the workload was commercial-heavy, but in the last two years that has steadily declined. We’re seeing this gap filled by documentary films, corporate work and television series. Feature films are also slowly gaining traction and becoming the focus of many up-and-coming filmmakers.

What services do you provide, and what types of projects do you work on?
APO has a proven track record of successful delivery on hundreds of film and video projects for a diverse range of clients and collaborators, including major corporate entities, NGOs, advertising and PR agencies, and television stations. We also have plenty of experience mastering according to international delivery standards. We’re proud to house a complete end-to-end post ecosystem of offline and online editing suites.

Most importantly, we maintain a very thorough database of the post production community in Kenya.
This is of great benefit to our clients who come to us for a range of services including color grading, animation, visual effects, motion graphics and compositing. We are always excited to collaborate with the right people and get additional perspectives on the job at hand. One of our most notable collaborators is Ikweta Arts (Avatar, Black Panther, Game of Thrones, Hacksaw Ridge), owned and run by Yvonne Muinde. They specialize in providing VFX services with a focus in quality matte painting/digital environments, art direction, concept and post visual development art. We also collaborate with Keyframe (L’Oréal, BMW and Mitsubishi Malaysia) for motion graphics and animations.

Can you name some recent projects and the work you provided?
We are incredibly fortunate to be able to select projects that align with our beliefs and passions.

Our work on the short film Poacher (directed by Tom Whitworth) won us three global Best Editing Awards from the Short to the Point Online Film Festival (Romania, 2018), Feel the Reel International Film Festival (Glasgow, 2018) and Five Continents International Film Festival (Venezuela, 2019).

Other notable work includes three feature documentaries for the Big Story segment on China Global Television Network, directed by Juan Reina (director of the Netflix Original film Diving Into the Unknown); Lion’s Den (Quite Bright Films), an adaptation of ABC’s Shark Tank; and The Great Kenyan Bake Off (Showstopper Media), adapted from the BBC series The Great British Bake Off. We also worked on Disconnect, a feature film produced by Kenya’s Tosh Gitonga (Nairobi Half Life), a director who is passionate about taking Africa’s budding film industry to the next level. We have also worked on a host of television commercials for clients across East Africa, including Kenya, Rwanda, South Sudan and Uganda.

What APO is most proud of though, is our clients’ ambitions and determination to contribute toward the growth of the African film industry. This truly resonates with APO’s mantra.

You recently added a MAM and some other gear. Can you talk about the need to upgrade?
Bringing on the EditShare EFS 200 nodes has significantly improved the collaborative possibilities of APO. We reached a point where we were quickly growing, and the old approach just wasn’t going to cut it.

Prior to centralizing our content, projects lived on individual hard disks. This meant that if I was editing and needed my assistant to find me a scene or a clip, or I needed VFX on something, I would have to export individual clips to different workstations. This created workflow redundancies and increased potential for versioning issues, which is something we couldn’t afford to be weighed down with.

The remote capabilities of the EditShare system were very appealing as well. Our color grading collaborator, Nic Apostoli of Comfort and Fame, is based in Cape Town, South Africa. From there, he can access the footage on the server and grade it while the client reviews with us in Nairobi. Flow media asset management also helps in this regard. We’re able to effectively organize and index clips, graphics, versions, etc. into clearly marked folders so there is no confusion about what media should be used. Collaboration among the team members is now seamless regardless of their physical location or tools used, which include the Adobe Creative Suite, Foundry Nuke, Autodesk Maya and Maxon Cinema 4D.

Any advice for others looking to break out on their own and start a post house?
Know what you want to do, and just do it! Thanks Nike …


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Directing bookend sequences for Portals, a horror anthology film

By Hasraf “HaZ” Dulull

Portals is a genre-bending feature film anthology focusing on a series of worldwide blackouts — after which millions of mysterious objects appear everywhere across the planet. While many flee from the sentient objects, some people are drawn toward and into them with horrifying consequences.

Portals

The film was in the final stages of post when writer/director Liam O’Donnell (Beyond Skyline and the upcoming Skylines film) called to see if I would like to get involved and direct some bookend sequences to add more scope and setup, which the producers felt was very much needed. I loved the premise and the world of the anthology, so I said yes. I pitched an idea for an ending that quickly evolved into an extra segment at the end of the film, which I directed. That’s why there are officially four directors on the show, with me getting executive producer and “end-segment created by” credits.

Two of the other sequences are around 20 to 25 minutes each, and O’Donnell’s sequence was around 35 minutes. The film is 85 minutes long. Eduardo Sanchez and Gregg Hale (The Blair Witch Project) co-directed their segments. So the anthology feature film is really three long segments with my bookend sequences. The only connections among all the stories are the objects that appear, the event itself and the actual “portal”; everything else was unique to each segment’s story. In terms of production, the only consistencies throughout the anthology were the camera language — that slight hand-held feel — and, of course, the music/sound.

I had to watch the latest cut of the entire anthology film to get my head into that world, but I was given freedom to bring my own style to my sequences. That is exactly the point of an anthology — for each director to bring his or her own sensibilities to the individual segments. Besides Liam, the main producers I worked closely with on this project were Alyssa Devine and Griffin Devine from Pigrat Productions. They are fans of my first feature film, The Beyond, so they really encouraged the grounded tone I had demonstrated in that film.

The portal in Portals.

I’ve been a huge advocate of Blackmagic cameras and technology for a long time. Additionally, I knew I had a lot to shoot in a very short time span (two days!), so I needed a camera that was light and flexible yet able to shoot 4K. I brought on cinematographer Colin Emerson, who shoots in a very loose way but always makes his stuff look cinematic. We watched the cut of the film and noticed the consistent loose nature of the cinematography on all the segments. Colin uses the Fig Rig a lot, and I love the way that rig works; the BMD Pocket Cinema Camera 4K fits nicely on it along with the DSLR lenses he likes to use. The other reason was to be able to use Blackmagic’s new BRaw format.

We also shot the segment using a skeleton crew, which comprised myself as director/producer; VFX supervisor/1st AD John Sellings, who also did some focus pulling; James De Taranto (sound recording); DP/camera op Colin Emerson; FX makeup artists Kate Griffith and Jay James; and our two actors, Georgina Blackledge and Dare Emmanuel. I worked with both of them on my feature film The Beyond.

The Post
One thing I wanted to make sure of was that the post team at The Institution in LA would be able to take my Resolve files and literally work from them for the picture post. One of the things I did during prep of the project (before we even cast) was to shoot some tests to show what I had in mind in terms of look and feel. We also tested the BRaw and color workflow between my setup in London and the LA team. Colin and I did this during the location recce. This proved to be extremely useful to ensure we set our camera to the exact specs the post house wanted. So we shot at 23.98fps, 4K (4096×1716) 2:39 cropped, in Blackmagic Design log color space.

HaZ’s segments were captured with the Blackmagic Pocket Cinema Camera.

During the test, I did some quick color tests to show the producers in LA the tone and mood I was going for and to make sure everyone was on board before I shot it. The look was very post apocalyptic, as it’s set after the main events have happened. I wanted the locations to be a contrast with each other, one interior and one exterior with greens.

Colin is used to shooting most of his stuff on the Panasonic GH, but he had the Pocket Cinema Camera and was looking for the right project to use it on. He found he could use all of his usual lenses because the Pocket Cinema Camera has the same mount. Lenses used were the Sigma 18-35mm f/1.8 with a Metabones Speedbooster, the Olympus 12mm f/2 and the Lumix 35-100mm f/2.8.

Colin used the onboard monitor screen on the Pocket Cinema Camera, while I used a tethered external monitor — the Ikan DH5e — for directing. We used a 1TB Samsung external SSD securely attached to the rig cage along with a 64GB CFast card. The resolution we shot in was determined by the tests we did. We set up the rushes for post after each of the two days of the shoot, so during the day we would swap out drives and back things up. At the end of the day, we would bring in all the picture and sound rushes and use the amazing autosync feature in Blackmagic DaVinci Resolve to set it all up. This way, when I headed back home I could start editing right away inside Resolve.

Resolve

I have to admit, we were hesitant at first because I was shooting and capturing log in QuickTime ProRes 4444, and I had always avoided DNG raw because of the huge file sizes and data transfer. But the team at Blackmagic has always been so supportive, right up to the end of the shoot, and after testing BRaw I was impressed. We had so much control, as all that information is accessible within Resolve. I was able to set the temp look during editing, and the colorist worked from there. Skin tones were of utmost importance; because of the intimate nature of the drama, I wanted a natural look to the skin tones. I am really happy with the way they came out in the end.

The team in LA couldn’t believe how cinematic the footage was when we told them it was shot on the Pocket Cinema Camera, since the other segments were shot on cameras like Red. We delivered to the same 4K deliverables spec as the other segments in the film.

HaZ on set, second from right.

I used the AMD Radeon RX Vega 56 version of the Blackmagic eGPU. I chose it because I wanted to edit on my MacBook Pro (late 2017) and needed the power to run 4K in realtime. I was so impressed with how much power it provided; it was like having a new MacBook Pro without having to buy one. The eGPU also had all the connectivity (two Thunderbolt and four USB 3) I needed, which is a limitation of the MacBook Pro.

The beauty of keeping everything native was that there wasn’t much work to do when porting; it’s just plug and play. Resolve detects the eGPU, which you can then set as the default. The BRaw format makes it all so manageable to preview and play back in real time. Also, since it’s native, Resolve doesn’t need to do any transcoding in the background. I have always been a huge fan of the tracking in Resolve, and I was able to do eye effects very easily without them being budgeted or done as VFX shots. I was able to get the VFX render assets from the visual effects artist (Justin Martinez) in LA and do quick slap comps during editing. I love that I can set looks and store them as memories, which I can then recall very quickly to apply to a bunch of shots. This allowed me to have a slick-looking preview rough cut of the film.

Portals

I sent a hard drive containing all the organized rushes to the team in LA while I was doing the final tweaks to the edit. Once the edit was signed off, I would address any last-minute notes and email them my Resolve file. It was super simple, and the colorist (Oliver Ojeil) and post team (Chad Van Horn and Danny Barone) in LA appreciated the workflow because there really wasn’t any conforming for them to do apart from a one-click relink of the media location; they would just take my Resolve file and start working away with it.
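For readers curious what a "one-click relink" amounts to under the hood: conceptually it is just a root-path remap, rewriting each clip's stored media path from the editor's drive onto the post house's storage. A minimal Python sketch, with hypothetical paths:

```python
# Hedged sketch of relinking: remap every media path that lives under
# the old storage root onto the new one, leaving other paths untouched.
# The volume names and file names here are hypothetical.

def relink(clip_paths, old_root, new_root):
    """Remap media paths that start with old_root onto new_root."""
    return [new_root + p[len(old_root):] if p.startswith(old_root) else p
            for p in clip_paths]

clips = ["/Volumes/HaZ_SSD/rushes/A001_C001.braw",
         "/Volumes/HaZ_SSD/rushes/A001_C002.braw"]
print(relink(clips, "/Volumes/HaZ_SSD", "/mnt/la_san"))
# → ['/mnt/la_san/rushes/A001_C001.braw', '/mnt/la_san/rushes/A001_C002.braw']
```

Because the Resolve project file stores everything else (the cut, grades, memories), the path remap is the only conform work left.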

We used practical effects to keep the horror as real and grounded as possible, and used VFX to augment things further. We were fortunate to get special effects makeup artist Kate Griffiths. Even given the tight schedule, she was able to create a terrifying effect, which I won’t give away; you need to watch the film to see it! We had to shoot those makeup-FX-heavy shots at the end of the day, which meant being smart about how we scheduled the shoot given the hours-long makeup process. Kate was also on hand to provide effects like the liquid coming out of the eyes, sweat, etc., every detail of which the camera picked up for us so we could bring it out in the grade.

The Skype-style shots at the start of the film (phone and computer monitor shots) had their VFX screen elements placed as a separate layer so the post team in LA could grade them separately and control the filters applied on them. For some of the wide shots showing our characters entering and leaving the portal, we keyframed some movement of the 4K shot along with motion blur to give the effect of in-camera movement. I also used the camera shake within Resolve, which comes with so many options to create bespoke movement on static frames.

Portals is now available on iTunes and other VOD platforms.


HaZ Dulull is known for his sci-fi feature films The Beyond and 2036 Origin Unknown, as well as his television work directing the pilot and episodes of Disney’s Fast Layne. He is currently busy on projects at various stages of development and production at his production company, hazfilm.com.

Review: HP’s ZBook G6 mobile workstation

By Brady Betzel

In a year that’s seen AMD reveal an affordable 64-core processor with its Threadripper 3, it appears as though we are picking up steam toward next-level computing.

Apple finally released its much-anticipated Mac Pro (which comes with a hefty price tag for the 1.5TB upgrade), and custom-build workstation companies — like Boxx and Puget Systems — can customize good-looking systems to fit any need you can imagine. Additionally, over the past few months, I have seen mobile workstations leveling the playing field with their desktop counterparts.

HP is well-known in the M&E community for its powerhouse workstations. Since I started my career, I have worked on either a Mac Pro or an HP, and both have their strong points. However, for workstation users who must be able to travel with their systems, there have always been technical capabilities you had to give up in exchange for a smaller footprint. That is, until now.

The newly released HP ZBook 15 G6 has become the rising tide that will float all the boats in the mobile workstation market. I know I’ve said it before, but the classification of “workstation” is much more than just a term companies throw around. Systems with workstation-level classification (at least from HP) are meant to be powered on and run at high levels 24 hours a day, seven days a week, 365 days a year.

They are built with high-quality, enterprise-level components, such as ECC (error correcting code) memory. ECC memory will self-correct errors that it sees, preventing things like blue screens of death and other screen freezes. ECC memory comes at a cost, and that is why these workstations are priced a little higher than a standard computer system. In addition, the warranties are a little more inclusive — the HP ZBook 15 G6 comes with a standard three-year/on-site service warranty.
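To make the self-correction idea concrete, here is a toy Hamming(7,4) code in Python: three parity bits protect four data bits, and the parity-check "syndrome" directly names the position of a flipped bit so it can be flipped back. Real ECC DIMMs use a SECDED code over 64-bit words; this is only a sketch of the principle, not the hardware implementation.

```python
# Toy Hamming(7,4) demo of the single-bit correction behind ECC memory.
# Codeword layout (1-based positions): p1 p2 d1 p3 d2 d3 d4.

def encode(d):                      # d: list of 4 data bits
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def correct(c):                     # c: 7-bit codeword, possibly corrupted
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]  # checks positions 1, 3, 5, 7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]  # checks positions 2, 3, 6, 7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]  # checks positions 4, 5, 6, 7
    syndrome = s1 + 2 * s2 + 4 * s3 # = 1-based position of the bad bit
    if syndrome:
        c[syndrome - 1] ^= 1        # flip the bad bit back
    return [c[2], c[4], c[5], c[6]] # recover the 4 data bits

word = [1, 0, 1, 1]
cw = encode(word)
cw[4] ^= 1                          # simulate a single-bit memory error
print(correct(cw))  # → [1, 0, 1, 1]
```

The error is repaired transparently, which is exactly why ECC systems avoid the crashes and freezes a silent bit flip can cause.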

Beyond the “workstation” classification, the ZBook 15 G6 is amazingly powerful, brutally strong and incredibly colorful and bright. But what really matters is under the hood. I was sent the HP ZBook 15 G6 that retails for $4,096 and contains the following specs:
– Intel Xeon E-2286M (eight cores/16 threads — 2.4GHz base/5GHz Turbo)
– Nvidia Quadro RTX 3000 (6GB VRAM)
– 15.6-inch UHD HP DreamColor display, anti-glare, WLED backlit, 600 nits, 100% DCI-P3
– 64GB DDR4 2667MHz
– 1TB PCIe Gen 3 x4 NVMe SSD TLC
– FHD webcam 1080p plus IR camera
– HP collaboration keyboard with dual point stick
– Fingerprint sensor
– Smart Card reader
– Intel Wi-Fi 6 AX 200, 802.11ac 2×2 +BT 4.2 combo adapter (vPro)
– HP long-life battery four-cell 90 Wh
– Three-year limited warranty

The ZBook 15 G6 is a high-end mobile workstation with a price that reflects it. However, as I said earlier, true workstations are built to withstand constant use and, in this case, abuse. The ZBook 15 G6 has been designed to pass up to 21 extensive MIL-STD 810G tests, which is essentially worst-case-scenario testing: drop testing from around four feet, sand and dust testing, radiation testing (the sun beating down on the laptop for an extended period) and much more.

The exterior of the G6 is made of aluminum and built to withstand abuse. The latest G6 is a little bulky/boxy, in my opinion, but I can see why it would hold up to some bumps and bruises, all while working at blazingly fast speeds, so bulk isn’t a huge issue for me. Because of that bulk, you can imagine that this isn’t the lightest laptop either. It weighs in at 5.79 pounds for the lowest end and measures 1 inch by 14.8 inches by 10.4 inches.

On the bottom of the workstation is an easy-to-access panel for performing repairs and upgrades yourself. I really like the bottom compartment. I opened it and noticed I could throw in an additional NVMe drive and an SSD if needed. You can also access memory here. I love this because not only can you perform easy repairs yourself, but you can perform upgrades or part replacements without voiding your warranty on the original equipment. I’m glad to see that HP kept this in mind.

The keyboard is smaller than a full-size version but has a number keypad, which I love using when typing in timecodes. It is such a time-saver for me. (I credit entering in repair order numbers when I fixed computers at Best Buy as a teenager.) On the top of the keyboard are some handy shortcuts if you do web conferences or calls on your computer, including answering and ending calls. The Bang & Olufsen speakers are some of the best laptop speakers I’ve heard. While they aren’t quite monitor-quality, they do have some nice sound on the low end that I was able to fine-tune in the Bang & Olufsen audio control app.

Software Tests
All right, enough of the technical specs. Let’s get on to what people really want to know: how the HP ZBook 15 G6 performs in apps like Blackmagic’s DaVinci Resolve and Adobe Premiere Pro. I used sample Red and Blackmagic Raw footage that I use a lot in testing. You can grab the Red footage here and the BRaw footage here. (Keep in mind you will need to download the BRaw software, which you can find here, to edit with BRaw inside of Adobe products.)

Performance monitor while exporting in Resolve with VFX.

For testing in Resolve and Premiere, I strung out one minute each of 4K, 6K and 8K Red media in one sequence and the 4608×2592 4K and 6K BRaw media in another. In the middle of my testing, Resolve received a major Red API upgrade that allows for better realtime playback of Red raw files if you have an Nvidia CUDA-based GPU.

First up is Resolve 16.1.1 and then Resolve 16.1.2. Both sequences are set to UHD (3840×2160) resolution. One sequence of each codec contains just color correction, while another of each contains effects and color correction. The Premiere sequence with color and effects contains basic Lumetri color correction, noise reduction (50) and a Gaussian blur with a setting of 0.4. In Resolve, the only difference in the color-and-effects sequence is that the noise reduction is spatial and set to Enhanced, Medium and 25/25.

In Resolve, the 4K Red media would play in real time, while the 6K (RedCode 3:1) would drop to about 14fps to 15fps and the 8K (RedCode 7:1) would play at 10fps at full resolution with just color correction. With effects, the 4K media would play at 20fps, the 6K at 3fps and the 8K at 10fps. The Blackmagic Raw video would play in real time with just color correction and at around 3fps to 4fps with effects.

This is where I talk about just how loud the fans in the ZBook 15 G6 can get. When running exports and benchmarks, the fans are noticeable and a little distracting. Obviously, we are running some high-end testing with processor- and GPU-intensive tests but still, the fans were noticeable. However, the bottom of the mobile workstation was not terribly hot, unlike the MacBook Pros I’ve tested before. So my lap was not on fire.

In my export testing, I used those same sequences as before, exporting from Adobe Premiere Pro 2020. I exported UHD files using Adobe Media Encoder in different containers and codecs: H.264 (MOV), H.265 (MOV), ProRes HQ, DPX, DCP and MXF OP1a (XDCAM). The MXF OP1a export was 1920×1080p.
Here are my results:

Red (4K, 6K, 8K)
– Color Only: H.264 – 5:27; H.265 – 4:45; ProRes HQ – 4:29; DPX – 3:37; DCP – 10:38; MXF OP1a – 2:31

– Color, Noise Reduction (50), Gaussian Blur 0.4: H.264 – 4:56; H.265 – 4:56; ProRes HQ – 4:36; DPX – 4:02; DCP – 8:20; MXF OP1a – 2:41

Blackmagic Raw
– Color Only: H.264 – 2:05; H.265 – 2:19; ProRes HQ – 2:04; DPX – 3:33; DCP – 4:05; MXF OP1a – 1:38

– Color, Noise Reduction (50), Gaussian Blur 0.4: H.264 – 1:59; H.265 – 2:22; ProRes HQ – 2:07; DPX – 3:49; DCP – 3:45; MXF OP1a – 1:51
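To compare the figures above, it helps to convert the M:SS timings to seconds; the short sketch below does this for the Red color-only versus with-effects results quoted in the text.

```python
# Parse the review's "M:SS" export timings into seconds and report the
# per-codec delta when effects are added. Figures are the Red results
# quoted above (Premiere Pro 2020 exports).

def secs(t):
    m, s = t.split(":")
    return int(m) * 60 + int(s)

color_only = {"H.264": "5:27", "H.265": "4:45", "ProRes HQ": "4:29"}
with_fx    = {"H.264": "4:56", "H.265": "4:56", "ProRes HQ": "4:36"}

for codec in color_only:
    delta = secs(with_fx[codec]) - secs(color_only[codec])
    print(f"{codec}: {delta:+d}s with effects")
# → H.264: -31s with effects
# → H.265: +11s with effects
# → ProRes HQ: +7s with effects
```

The deltas bear out the observation below: adding noise reduction and a blur barely moved the export times, and H.264 was actually faster.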

What is surprising is that when adding effects like noise reduction and a Gaussian blur in Premiere, the export times stayed similar. While using the ZBook 15 G6, I noticed my export times improved when I upgraded driver versions, so I re-did my tests with the latest Nvidia drivers to make sure I was consistent. The drivers also solved an issue in which Resolve wasn’t reading BRaw properly, so remember to always research drivers.

The Nvidia Quadro RTX 3000 really pulled its weight when editing and exporting in both Premiere and Resolve. In fact, in previous versions of Premiere, I noticed that the GPU was not really being used as well as it should have been. With the Premiere Pro 2020 upgrade it seems like Adobe really upped its GPU usage game — at some points I saw 100% GPU usage.

In Resolve, I performed similar tests, but instead of ProRes HQ I exported a DNxHR QuickTime file, and instead of a DCP I exported an IMF package. For the most part, these are stock exports from the Deliver page of Resolve, except that I forced Video Levels, used Force Debayer and set Resizing to Highest Quality. Here are my results from Resolve versions 16.1.1 and 16.1.2 (16.1.2 results are in parentheses).

– Red (4K, 6K, 8K) Color Only: H.264 – 2:17 (2:31); H.265 – 2:23 (2:37); DNxHR – 2:59 (3:06); IMF – 6:37 (6:40); DPX – 2:48 (2:45); MXF OP1a – 2:45 (2:33)

– Red Color, Noise Reduction (Spatial, Enhanced, Medium, 25/25), Gaussian Blur 0.4: H.264 – 5:00 (5:15); H.265 – 5:18 (5:21); DNxHR – 5:25 (5:02); IMF – 5:28 (5:11); DPX – 5:23 (5:02); MXF OP1a – 5:20 (4:54)

– Blackmagic Raw Color Only: H.264 – 0:26 (0:25); H.265 – 0:31 (0:30); DNxHR – 0:50 (0:50); IMF – 3:51 (3:36); DPX – 0:46 (0:46); MXF OP1a – 0:23 (0:22)

– Blackmagic Raw Color, Noise Reduction (Spatial, Enhanced, Medium, 25/25), Gaussian Blur 0.4: H.264 – 7:51 (7:53); H.265 – 7:45 (8:01); DNxHR – 7:53 (8:00); IMF – 8:13 (7:56); DPX – 7:54 (8:18); MXF OP1a – 7:58 (7:57)

Interesting to note: Exporting Red footage with color correction only was significantly faster from Resolve, but for Red footage with effects applied, export times were similar between Resolve and Premiere. With the CUDA Red SDK update to Resolve in 16.1.2, I thought I would see a large improvement, but I didn’t. I saw an approximate 10% increase in playback but no improvement in export times.

Puget

Puget Systems has some great benchmarking tools, so I reached out to Matt Bach, Puget Systems’ senior labs technician, about my findings. He suggested that the mobile Xeon could possibly still be the bottleneck for Resolve. In his testing he saw a larger increase in speed with AMD Threadripper 3 and Intel i9-based systems. Regardless, I am kind of going deep on realtime playback of 8K Red Raw media on a mobile workstation — what a time we are in. Nonetheless, Blackmagic Raw footage was insanely fast when exporting out of Resolve, while export time for the Blackmagic Raw footage with effects was higher than I expected. There was a consistent use of the GPU and CPU in Resolve much like in the new version of Premiere 2020, which is a trend that’s nice to see.

In addition to Premiere and Resolve testing, I ran some common benchmarks that provide a good 30,000-foot view of the HP ZBook 15 G6 when comparing it to other systems. I decided to use the Puget Systems benchmarking tools. Unfortunately, at the time of this review, the tools were only working properly with Premiere and After Effects 2019, so I ran the After Effects benchmark using the 2019 version. The ZBook 15 G6 received an overall score of 802, render score of 79, preview score of 75.2 and tracking score of 86.4. These are solid numbers that beat out some desktop systems I have tested.

Corona

To test some 3D applications, I ran the Cinebench R20, which gave a CPU score of 3243, CPU (single core) score of 470 and an M/P ratio of 6.90x. I recently began running the Gooseberry benchmark scene in Blender to get a better sense of 3D rendering performance, and it took 29:56 to export. Using the Corona benchmark, it took 2:33 to render 16 passes, 3,216,368 rays/s. Using Octane Bench the ZBook 15 G6 received a score of 139.79. In the Vray benchmark for CPU, it received 9833 Ksamples, and in the Vray GPU testing, 228 mpaths. I’m not going to lie; I really don’t know a lot about what these benchmarks are trying to tell me, but they might help you decide whether this is the mobile workstation for your work.

Cinebench

One benchmark I thought was interesting between driver updates for the Nvidia Quadro RTX 3000 was the Neat Bench from Neat Video — the noise reduction plugin for video. It measures whether your system should use the CPU, GPU or a combination thereof to run Neat Video. Initially, the best combination result was to use the CPU only (seven cores) at 11.5fps.

After updating to the latest Nvidia drivers, the best combination result was to use the CPU (seven cores) and GPU (Quadro RTX 3000) at 24.2fps. That is a pretty incredible jump just from a driver update. Moral of the story: Always make sure you have the correct drivers!

Summing Up
Overall, the HP ZBook 15 G6 is a powerful mobile workstation that will work well across the board. From 3D to color correction apps, the Xeon processor in combination with the Quadro RTX 3000 will get you running 4K video without a problem. With the HP DreamColor anti-glare display using up to 600 nits of brightness and covering 100% of the DCI-P3 color space, coupled with the HDR option, you can rely on the built-in display for color accuracy if you don’t have your output monitor attached. And with features like two USB Type-C ports (Thunderbolt 3 plus DP 1.4 plus USB 3.1 Gen 2), you can connect external monitors for a larger view of your work.

The HP Fast Charge will get you out of a dead battery fiasco with the ability to go from 0% to 50% charge in 45 minutes. All of this for around $4,000 seems to be a pretty low price to pay, especially because it includes a three-year on-site warranty and because the device is certified to work seamlessly with many apps that pros use with HP’s independent software vendor verifications.

If you are looking for a mobile workstation upgrade, are moving from desktop to mobile or want an alternative to a MacBook Pro, you should price a system out online.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

The Mill opens boutique studio in Berlin

Technicolor’s The Mill has officially launched in Berlin. This new boutique studio is located in the heart of Berlin, situated in the creative hub of Mitte, near many of Germany’s agencies, production companies and brands.

The Mill has been working with German clients for years. Recent projects include the Mercedes’ Bertha Benz spot with director Sebastian Strasser; Netto’s The Easter Surprise, directed in-house by The Mill; and BMW The 8 with director Daniel Wolfe. The new studio will bring The Mill’s full range of creative services from color to experiential and interactive, as well as visual effects and design.

The Mill Berlin crew

Creative director Greg Spencer will lead the creative team. He is a multi-award-winning creative, having won several VES, Cannes Lions and British Arrow awards. His recent projects include Carlsberg’s The Lake, PlayStation’s This Could Be You and Eve Cuddly Toy. Spencer also played a role in some of Mill Film’s major titles: he was the 2D supervisor for Les Misérables and also worked on the Lord of the Rings trilogy. His resume also includes campaigns for brands such as Nike and Samsung.

Executive producer Justin Stiebel moves from The Mill London, where he has been since early 2014, to manage client relationships and new business. Since joining the company, Stiebel has produced spots such as Audi’s Next Level and Mini’s “The Faith of a Few” campaign. He has also collaborated with directors such as Sebastian Strasser, Markus Walter and Daniel Wolfe while working on brands like Mercedes, Audi and BMW.

Sean Costelloe is managing director of The Mill London and The Mill Berlin.

Main Image Caption: (L-R) Justin Stiebel and Greg Spencer

Colorfront’s Express Dailies 2020 for Mac Pro, new rental model

Coinciding with Apple’s launch of the latest Mac Pro workstation, Colorfront announced a new, annual rental model for Colorfront Express Dailies.

Launching in Q1 2020, Colorfront’s subscription service allows users to rent Express Dailies 2020 for an annual fee of $5,000, including maintenance support, updates and upgrades. Additionally, the availability of Apple’s brand-new Pro Display XDR, designed for use with the new Mac Pro, makes on-set HDR monitoring, enabled by Colorfront systems, more cost effective.

Express Dailies 2020 supports 6K HDR/SDR workflow along with the very latest camera and editorial formats, including Apple ProRes and Apple ProRes RAW, ARRI MXF-wrapped ProRes, ARRI Alexa LF and Alexa Mini LF ARRIRAW, Sony Venice 5.0, Blackmagic RAW 1.5, and Codex HDE (High Density Encoding).

Express Dailies 2020 is optimized for 6K HDR/SDR dailies processing on the new Mac Pro running MacOS Catalina, leveraging the performance of the Mac Pro’s Intel Xeon 28 core CPU processor and multi-GPU rendering.

“With the launch of the new Mac Pro and Apple Pro Display XDR, we identified a new opportunity to empower top-end DITs and dailies facilities to adopt HDR workflows on a wide range of high-end TV and motion picture productions,” says Aron Jaszberenyi, managing director of Colorfront. “When combined with the new Mac Pro and Pro Display XDR, the Express Dailies 2020 subscription model gives new and cost-effective options for filmmakers wanting to take full advantage of 6K HDR/SDR workflows and HDR on-set.”

 

Company 3 ups Jill Bogdanowicz to co-creative head, feature post  

Company 3 senior colorist Jill Bogdanowicz will now share the title of creative head, feature post with senior colorist Stephen Nakamura. In this new role, she will collaborate with Nakamura to foster communication among artists, operations and management in designing and implementing workflows that meet the ever-changing needs of feature post clients.

“Company 3 has been and will always be guided by artists,” says senior colorist/president Stefan Sonnenfeld. “As we continue to grow, we have been formalizing our intra-company communication to ensure that our artists communicate among themselves and with the company as a whole. I’m excited that Jill will be joining Stephen as a representative of our feature colorists. Her years of excellent work and her deep understanding of color science make her a perfect choice for this position.”

Among the kinds of issues Bogdanowicz and Nakamura will address: mentorship within the company; artist recruitment and training; and adapting to emerging workflows and client expectations.

Says Bogdanowicz, “As the company continues to expand, both in size and workload, I think it’s more important than ever to have Stephen and me in a position to provide guidance to help the features department grow efficiently while also maintaining the level of quality our clients expect. I intend to listen closely to clients and the other artists to make sure that their ideas and concerns are heard.”

Bogdanowicz has been a leading feature film colorist since the early 2000s. Recent work includes Joker, Spider-Man: Far From Home and Dr. Sleep, to name a few.

Storage for Color and Post

By Karen Moltenbrey

At nearly every phase of the content creation process, storage is at the center. Here we look at two post facilities whose projects continually push boundaries in terms of data, but through it all, their storage solution remains fast and reliable. One, Light Iron, juggles an average of 20 to 40 data-intensive projects at a time and must have a robust storage solution to handle its ever-growing work. Another, Final Frame, recently took on a project whose storage requirements were literally out of this world.

Amazon’s The Marvelous Mrs. Maisel

Light Iron
Light Iron provides a wide range of services, from dailies to post on feature films, indies and episodic shows, to color/conform/beauty work on commercials and short-form projects. The facility’s clients include Netflix, Amazon Studios, Apple TV+, ABC Studios, HBO, Fox, FX, Paramount and many more. Light Iron has been committed to evolving digital filmmaking techniques over the past 10 years and understands the importance of data availability throughout the pipeline. Having a storage solution that is reliable, fast and scalable is paramount to successfully servicing data-centric projects with an ever-growing footprint.

More than 100 full-time employees located at Light Iron’s Los Angeles and New York locations regularly access the company’s shared storage solutions. Both facilities are equipped for dailies and finishing, giving clients an option between its offices based on proximity. In New York, where space is at a premium, the company also offers offline editorial suites.

The central storage solution used at both locations is a Quantum StorNext file system along with a combination of network-attached and direct-attached storage. On the archive end, both sites use LTO-7 tapes for backing up before moving the data off the spinning disc storage.

As Lance Hayes, senior post production systems engineer, explains, the facility divides its storage into three tiers. “We structured our storage environment in a three-tiered model, with redundancy, flexibility and security in mind. We have our fast disks (tier one), which are fast volumes used primarily for playbacks in the rooms. Then there are deliverable volumes (tier two), where the focus is on the density of the storage. These are usually the destination for rendered files. And then, our nearline network-attached storage (tier three) is more for the deep storage, a holding pool before output to tape,” he explains.

Light Iron has been using Quantum as its de facto standard for the past several years. Founded in 2009, Light Iron has been on an aggressive growth trajectory and has evolved its storage strategy in response to client needs and technological advancement. Before installing its StorNext system, it managed with JBOD (“just a bunch of discs”) direct-attached storage on a very limited number of systems to service its staff of then-30-some employees, says Keenan Mock, senior media archivist at Light Iron. Light Iron, though, grew quickly, “and we realized we needed to invest in a full infrastructure,” he adds.

Lance Hayes

At Light Iron, work often starts with dailies, so the workflow teams interact with production to determine the cameras being used, the codecs being shot, the number of shoot days, the expected shooting ratio and so forth. Based on that information, the group determines which generation of LTO stock makes the most sense for the project (LTO-6 or LTO-7, with LTO-8 soon to be an option at the facility). “The industry standard, and our recommendation as well, is to create two LTO tapes per shoot day,” says Mock. Then, those tapes are geographically separated for safety.
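The tape-stock arithmetic behind that prep conversation can be sketched as below. The native capacities (roughly 2.5TB for LTO-6, 6TB for LTO-7) are standard published figures, but the shoot numbers in the example are hypothetical, not from Light Iron.

```python
# Back-of-the-envelope LTO stock estimate for a shoot, assuming the
# two-copies-per-shoot-day standard mentioned in the text. Native
# (uncompressed) capacities: LTO-6 ≈ 2.5TB, LTO-7 ≈ 6TB.

import math

LTO_NATIVE_TB = {"LTO-6": 2.5, "LTO-7": 6.0}

def tapes_needed(shoot_days, tb_per_day, generation, copies=2):
    """Tapes to order: per-day tape count times days times copy sets."""
    per_day = math.ceil(tb_per_day / LTO_NATIVE_TB[generation])
    return shoot_days * per_day * copies

# Hypothetical 20-day shoot generating 4TB of rushes per day on LTO-7:
print(tapes_needed(20, 4.0, "LTO-7"))  # → 40
```

The per-day data estimate is where the camera, codec and shooting-ratio questions from production feed in.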

In terms of working materials, the group generally restores only what is needed for each individual show from LTO tape, as opposed to keeping the entire show on spinning disc. “This allows us to use those really fast discs in a cost-effective way,” Hayes says.

Following the editorial process, Light Iron restores only the needed shots plus handles from tape directly to the StorNext SAN, so online editors can have immediate access. The material stays on the system while the conform and DI occur, followed by the creation of final deliverables, which are sent to the tier two and tier three spinning disk storage. If the project needs to be archived to tape, Mock’s department takes care of that; if it needs to be uploaded, that usually happens from the spinning discs.

Light Iron’s FilmLight Baselight systems have local storage, which is used mainly as cache volumes to ensure sustained playback in the color suite. In addition, Blackmagic Resolve color correctors play back content directly from the SAN using tier two storage.

Keenan Mock

Light Iron continually analyzes its storage infrastructure and reviews its options in terms of the latest technologies. Currently, the company considers its existing storage solution to be highly functional, though it is reviewing options for the latest versions of flash solutions from Quantum in 2020.

Based on the facility’s storage workflow, there’s minimal danger of maxing out the storage space anytime soon.

While Light Iron is religious about creating a duplicate set of tapes for backup, “it’s a very rare occurrence [for the duplicate to be needed],” notes Mock, “but it can happen, and in that circumstance, Light Iron is prepared.”

As for the shared storage, the datasets used in post, compared to other industries, are very large, “and without shared storage and a clustered file system, we wouldn’t be able to do the jobs we are currently doing,” Hayes notes.

Final Frame
With offices in New York City and London, Final Frame is a full-featured post facility offering a range of services, including DI of every flavor, 8mm to 70mm film scanning and restoration, offline editing, VFX, sound editing (theatrical and home Dolby Atmos) and mastering. Its work spans feature films, documentaries and television. The facility’s recent work on the documentary film Apollo 11, though, tested its infrastructure like no other, including the amount of storage space it required.

Will Cox

“A long time ago, we decided that for the backbone of all our storage needs, we were going to rely on fiber. We have a total of 55 edit rooms, five projection theaters and five audio mixing rooms, and we have fiber connectivity between all of those,” says Will Cox, CEO/supervising colorist. So, for the past 20 years, ever since 1Gb fiber became available, Final Frame has relied on this setup, though every five years or so, the shop has upgraded to the next level of fiber and is currently using 16Gb fiber.

“Storage requirements have increased because image data has increased and audio data has increased with Atmos. So, we’ve needed more storage and faster storage,” Cox says.

While the core of the system is fiber, the facility uses a variety of storage arrays, the bulk of which are 16Gb 4000 Series SAN offerings from Infortrend, totaling approximately 2PB of space. In addition, the studio uses 8Gb Promise Technology VTrak arrays, totaling about 1PB, as well as some 8Gb JetStor offerings. For SAN management, Final Frame uses Tiger Technology’s Tiger Store.

Foremost in Cox’s mind when looking for a storage solution is interoperability, since Final Frame uses Linux, Mac and Windows platforms; reliability and fault tolerance are important as well. “We run RAID-6 and RAID-60 for pretty much everything,” he adds. “We also focus on how good the remote management is. We’ve brought online so much storage, we need the storage vendors to provide good interfaces so that our engineers and IT people can manage and get realtime feedback about the performance of the arrays and any faults that are creeping in, whether it’s due to failed drives or drives that are performing less than we had anticipated.”
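The fault tolerance Cox mentions rests on parity: with XOR parity across a stripe, any one lost drive can be rebuilt from the survivors (the RAID-5 scheme), and RAID-6 adds a second, independently computed parity so the array survives two simultaneous failures. A toy Python model of a single stripe, with made-up drive contents:

```python
# Minimal sketch of XOR stripe parity, the mechanism underlying the
# RAID levels mentioned in the text. Each byte string stands in for one
# drive's chunk of a stripe; the data values are arbitrary examples.

def xor_blocks(blocks):
    """XOR equal-length byte blocks together, byte by byte."""
    out = bytearray(len(blocks[0]))
    for b in blocks:
        for i, byte in enumerate(b):
            out[i] ^= byte
    return bytes(out)

data = [b"AAAA", b"BBBB", b"CCCC"]      # three data drives, one stripe
parity = xor_blocks(data)               # the parity drive

# Drive 1 fails: rebuild its chunk from the survivors plus parity.
rebuilt = xor_blocks([data[0], data[2], parity])
print(rebuilt == data[1])  # → True
```

RAID-6's second parity is computed differently (a Reed-Solomon-style code rather than a second XOR), which is what lets it distinguish and repair two missing drives.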

Final Frame has also brought on a good deal more SSD storage. “We manage projects a bit differently now than we used to, where we have more tiered storage,” Cox adds. “We still do a lot of spinning discs, but SSD is moving in, and that is changing our workflows somewhat in that we don’t have to render as many files and as many versions when we have really fast storage. As a result, there’s some cost-savings on personnel at the workflow level when you have extremely fast storage.”

When working with clients who are doing offline editing, Final Frame will build an isolated SAN for them, and when it comes time to finish the project, whether it’s a picture or audio, the studio will connect its online and mixing rooms to that SAN. This setup is beneficial to security, Cox contends, as it accelerates the workflow since there’s no copying of data. However, aside from that work, everyone generally has parallel access to the storage infrastructure and can access it at any time.

More recently, in addition to other projects, Final Frame began working on Apollo 11, a documentary directed by Todd Douglas Miller. Miller wanted to build the film entirely from archival elements, rescanning all of the original negatives and other surviving materials from the Apollo 11 moon landing, including NASA audio and 16mm and 35mm footage captured during that extraordinary feat. “He asked if we could make a movie just with the archival elements of what existed,” says Cox.

While ramping up and determining a plan of attack — Final Frame was going to scan the data at 4K resolution — NASA and NARA (National Archives and Records Administration) discovered a lost cache of archives containing 65mm and 70mm film.

“At that point, we decided that existing scanning technology wasn’t sufficient, and we’d need a film scanner to scan all this footage at 16K,” Cox adds, noting the company had to design and build an entirely new 16K film scanner and then build a pipeline that could handle all that data. “If you can imagine how tough 4K is to deal with, then think about 16K, with its insanely high data rates. And 8K is four times larger than 4K, and 16K is four times larger than 8K, so you’re talking about orders-of-magnitude increases in data.”
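Cox’s arithmetic is easy to check. Here is a quick sketch of per-frame sizes and 24fps data rates, assuming three color channels at 16 bits per channel (the bit depth and exact pixel dimensions are illustrative assumptions, not Final Frame’s actual scan settings):

```python
# Illustrative scan resolutions; each step doubles both dimensions,
# so pixel count (and data) quadruples.
RESOLUTIONS = {"4K": (4096, 2160), "8K": (8192, 4320), "16K": (16384, 8640)}

def frame_bytes(width: int, height: int, channels: int = 3,
                bytes_per_channel: int = 2) -> int:
    """Uncompressed frame size in bytes."""
    return width * height * channels * bytes_per_channel

for name, (w, h) in RESOLUTIONS.items():
    mb = frame_bytes(w, h) / 1e6
    gb_per_sec = frame_bytes(w, h) * 24 / 1e9  # sustained rate at 24 fps
    print(f"{name}: {mb:.0f} MB/frame, {gb_per_sec:.1f} GB/s at 24 fps")
```

Under these assumptions a 16K frame carries 16 times the data of a 4K frame, which is the orders-of-magnitude jump Cox describes.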

Adding to the complexity, the facility had no idea how much footage it would be using. Final Frame ultimately weighed its storage infrastructure against the cost of taking it to the next level for 16K scanning and determined that the amount of data was simply too much to move and too much to store. “As it was, we filled up a little over a petabyte of storage just scanning the 8K material. We were looking at 4PB, quadrupling the amount of storage infrastructure needed. Then we would have had to run backups of everything, which would have increased it by another 4PB.”

Considering these factors, Final Frame changed its game plan and decided to scan at 8K. “So instead of 2PB to 2.5PB, we would have been looking at 8PB to 10PB of storage if we continued with our earlier plan, and that was really beyond what the production could tolerate,” says Cox.

Even scanning at 8K, the group had to keep all the data in a central repository. “We were scanning in, doing what were essentially dailies, restoration and editorial, all from the same core set of media. Then, as editorial was still going on, we were beginning to conform and finish the film so we could make the Sundance deadline,” recalls Cox.

In terms of scans, copies and so forth, Final Frame stored about 2.5PB of data for that project. But in terms of data created and then destroyed, the total was between 12PB and 15PB. To handle this load, the facility needed storage that was fast, highly redundant and large. This led the company to bring on an additional 1PB of Fibre Channel SAN storage, dedicated solely to the Apollo 11 project, on top of the 1.5PB already in place. “We almost had to double the amount of storage infrastructure in the whole facility just to run this one project,” Cox points out. The additional storage was added in half-petabyte array increments, all connected to the SAN at 16Gb fiber.

While storage is important to any project, it was especially true for the Apollo 11 project due to the aggressive deadlines and excessively large storage needs. “Apollo 11 was a unique project. We were producing imagery that was being returned to the National Archives to be part of the historic record. Because of the significance of what we were scanning, we had to be very attentive to the longevity and accuracy of the media,” says Cox. “So, how it was being stored and where it was being stored were important factors on this project, more so than maybe any other project we’ve ever done.”


Karen Moltenbrey is a veteran writer, covering visual effects and post production.

Storage Roundtable

By Randi Altman

Every year in our special Storage Edition, we poll those who use storage and those who make storage. This year is no different. The users we’ve assembled for our latest offering weigh in on how they purchase gear and how they employ storage and cloud-based solutions. Storage makers talk about what’s to come from them, how AI and ML are affecting their tools, NVMe growth and more.

Enjoy…

Periscope Post & Audio, GM, Ben Benedetti

Periscope Post & Audio is a full-service post company with facilities in Hollywood and Chicago’s Cinespace. Both facilities provide a range of sound and picture finishing services for TV, film, spots, video games and other media.

Ben Benedetti

What types of storage are you using for your workflows?
For our video department, we have a large, high-speed Quantum media array supporting three color bays, two online edit suites, a dailies operation, two VFX suites and a data I/O department. The 15 systems in the video department are connected via 16Gb fiber.

For our sound department, we are using an Avid Nexis System via 6e Ethernet supporting three Atmos mix stages, two sound design suites, an ADR room and numerous sound-edit bays. All the CPUs in the facility are securely located in two isolated machine rooms (one for video on our second floor and one for audio on the first). All CPUs in the facility are tied via an IHSE KVM system, giving us incredible flexibility to move and deliver assets however our creatives and clients need them. We aren’t interested in being the biggest. We just want to provide the best and most reliable services possible.

Cloud versus on-prem – what are the pros and cons?
We are blessed with a robust pipe into our facility in Hollywood and are actively discussing potential cloud-based storage solutions with our engineering staff. We are already using some cloud-based solutions for our building’s security and CCTV systems, as well as for the management of our firewall. But the concept of placing client intellectual property in the cloud sparks some interesting conversations. We always need immediate access to the raw footage and sound recordings of our client productions, so I sincerely doubt we will ever completely rely on a cloud-based solution for the storage of our clients’ original footage. We have many redundancy systems in place to avoid slowdowns in production workflows. This is so critical. Any potential interruption in connectivity that is beyond our control gives me great pause.

How often are you adding or upgrading your storage?
Obviously, we need to be as proactive as we can so that we are never caught unready to take on projects of any size. It involves continually ensuring that our archive system is optimized correctly and requires our data management team to constantly analyze available space and resources.

How do you feel about the use of ML/AI for managing assets?
Any AI or ML automated process that helps us monitor our facility is vital. Technology advancements over the past decade have allowed us to achieve amazing efficiencies. As a result, we can give the creative executives and storytellers we service the time they need to realize their visions.

What role might the different tiers of cloud storage play in the lifecycle of an asset?
As we have facilities in both Chicago and Hollywood, our ability to take advantage of Google cloud-based services for administration has been a real godsend. It’s not glamorous, but it’s extremely important to keeping our facilities running at peak performance.

The level of coordination we have achieved in that regard has been tremendous. Those low-tiered storage systems provide simple and direct solutions to our administrative and accounting needs, but when it comes to the high-performance requirements of our facility’s color bays and audio rooms, we still rely on the high-speed on-premises storage solutions.

For simple archiving purposes, a cloud-based solution might work very well, but for active work currently in production … we are just not ready to make that leap … yet. Of course, given Moore’s Law and the exponential advancement of technology, our position could change rapidly. The important thing is to remain open and willing to embrace change as long as it makes practical sense and never puts your client’s property at risk.

Panasas, Storage Systems Engineer, RW Hawkins

RW Hawkins

Panasas offers a scalable high-performance storage solution. Its PanFS parallel file system, delivered on the ActiveStor appliance, accelerates data access for VFX feature production, Linux-based image processing, VR/AR and game development, and multi-petabyte sized active media archives.

What kind of storage are you offering, and will that be changing in the coming year?
We just announced that we are now shipping the next generation of the PanFS parallel file system on the ActiveStor Ultra turnkey appliance, which is already in early deployment with five customers.

This new system offers unlimited performance scaling in 4GB/s building blocks. It uses multi-tier intelligent data placement to maximize storage performance by placing metadata on low-latency NVMe SSDs, small files on high IOPS SSDs and large files on high-bandwidth HDDs. The system’s balanced-node architecture optimizes networking, CPU, memory and storage capacity to prevent hot spots and bottlenecks, ensuring high performance regardless of workload. This new architecture will allow us to adapt PanFS to the ever-changing variety of workloads our customers will face over the next several years.
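The placement policy described above can be sketched in a few lines (the 64KB small-file threshold and the tier names are illustrative assumptions, not PanFS internals):

```python
# Toy sketch of multi-tier intelligent data placement: metadata goes to
# low-latency NVMe, small files to high-IOPS SSD, large files to
# high-bandwidth HDD.
SMALL_FILE_LIMIT = 64 * 1024  # bytes; assumed threshold for illustration

def place(kind: str, size_bytes: int) -> str:
    """Return the storage tier for an object, by type and size."""
    if kind == "metadata":
        return "nvme"
    return "ssd" if size_bytes <= SMALL_FILE_LIMIT else "hdd"

print(place("metadata", 512))         # directory entries, inodes, etc.
print(place("file", 4 * 1024))        # a small texture or script
print(place("file", 2 * 1024**3))     # a large image sequence file
```

The point of automating the decision per object is that users see one namespace while each access pattern lands on the media best suited to it.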

Are certain storage tiers more suitable for different asset types, workflows, etc.?
Absolutely. However, too many tiers can lead to frustration around complexity, loss of productivity and poor reliability. We take a hybrid approach, whereby each server has multiple types of storage media internal to one server. Using intelligent data placement, we put data on the most appropriate tier automatically. Using this approach, we can often replace a performance tier and a tier two active archive with one cost-effective appliance. Our standard file-based client makes it easy to gateway to an archive tier such as tape or an object store like S3.

What do you see are the big technology trends that can help storage for M&E? ML? AI?
AI/ML is so widespread, it seems to be all encompassing. Media tools will benefit greatly because many of the mundane production tasks will be optimized, allowing for more creative freedom. From a storage perspective, machine learning is really pushing performance in new directions; low latency and metadata performance are becoming more important. Large amounts of unstructured data with rich metadata are the norm, and today’s file systems need to adapt to meet these requirements.

How has NVMe advanced over the past year?
Everyone is taking notice of NVMe; it is easier than ever to build a fast array and connect it to a server. However, there is much more to making a performant storage appliance than just throwing hardware at the problem. My customers are telling me they are excited about this new technology but frustrated by the lack of scalability, the immaturity of the software and the general lack of stability. The proven way to scale is to build a file system on top of these fast boxes and connect them into one large namespace. We will continue to augment our architecture with these new technologies, all the while keeping an eye on maintaining our stability and ease of management.

Do you see NAS overtaking SAN for larger work groups? How about cloud taking on some of what NAS used to do?
Today’s modern NAS can take on all the tasks that historically could only be done with SAN. The main thing holding back traditional NAS has been the client access protocol. With network-attached parallel clients, like Panasas’ DirectFlow, customers get advanced client caching, full POSIX semantics and massive parallelism over standard ethernet.

Regarding cloud, my customers tell me they want all the benefits of cloud (data center consolidation, inexpensive power and cooling, ease of scaling) without the vendor lock-in and metered data access of the “big three” cloud providers. A scalable parallel file system forms the core of a private cloud model that yields the benefits without the drawbacks. File-based access to the namespace will continue to be required for most non-web-based applications.

Goldcrest Post, New York, Technical Director, Ahmed Barbary

Goldcrest Post is an independent post facility, providing solutions for features, episodic TV, docs, and other projects. The company provides editorial offices, on-set dailies, picture finishing, sound editorial, ADR and mixing, and related services.

Ahmed Barbary

What types of storage are you using for your workflows?
Storage performance in the post stage is tremendously demanding. We are using multiple SAN systems in office locations that provide centralized storage and easy access to disk arrays, servers, and other dedicated playout applications to meet storage needs throughout all stages of the workflow.

While backup refers to duplicating the content for peace of mind, short-term retention, and recovery, archival signifies transferring the content from the primary storage location to long-term storage to be preserved for weeks, months, and even years to come. Archival storage needs to offer scalability, flexible and sustainable pricing, as well as accessibility for individual users and asset management solutions for future projects.

LTO has been a popular choice for archival storage for decades because it offers affordable, high-capacity solutions whose low-write/high-read profile is optimal for cold storage workflows. But the increased need for instant access to archived content, coupled with the slow rollout of LTO-8, has made tape a less favorable option.

Cloud versus on-prem – what are the pros and cons?
The fact is each option has its positives and negatives, and understanding that and determining how both cloud and on-premises software fit into your organization are vital. So, it’s best to be prepared and create a point-by-point comparison of both choices.

When looking at the pros and cons of cloud vs. on-premises solutions, everything starts with an understanding of how these two models differ. With a cloud deployment, the vendor hosts your information and offers access through a web portal. This enables more mobility and flexibility of use for cloud-based software options. When looking at an on-prem solution, you are committing to local ownership of your data, hardware, and software. Everything is run on machines in your facility with no third-party access.

How often are you adding or upgrading your storage?
We keep track of new technologies and continuously upgrade our systems, but when it comes to storage, it’s a huge expense. When deploying a new system, we do our best to future-proof and ensure that it can be expanded.

How do you feel about the use of ML/AI for managing assets?
For most M&E enterprises, the biggest potential of AI lies in automatic content recognition, which can drive several path-breaking business benefits. For instance, most content owners have thousands of video assets.

Cataloging, managing, processing, and re-purposing this content typically requires extensive manual effort. Advancements in AI and ML algorithms have now made it possible to drastically cut down the time taken to perform many of these tasks. But there is still a lot of work to be done — especially as ML algorithms need to be trained, using the right kind of data and solutions, to achieve accurate results.

What role might the different tiers of cloud storage play in the lifecycle of an asset?
Data sets have unique lifecycles. Early in the lifecycle, people access some data often, but the need for access drops drastically as the data ages. Some data stays idle in the cloud and is rarely accessed once stored. Some data expires days or months after creation, while other data sets are actively read and modified throughout their lifetimes.
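That lifecycle maps naturally onto age-based tiering rules. A minimal sketch (the thresholds and tier names are illustrative assumptions, not any particular cloud provider’s):

```python
# Age-based tiering for the lifecycle described above: hot early on,
# cooler as access drops, archived when idle, deleted when expired.
def tier_for_age(days_since_last_access: int, expired: bool = False) -> str:
    if expired:
        return "delete"   # some data expires days or months after creation
    if days_since_last_access <= 30:
        return "hot"      # accessed often early in the lifecycle
    if days_since_last_access <= 180:
        return "cool"     # access drops drastically as data ages
    return "archive"      # rarely touched once stored
```

Cloud providers typically let you declare rules like these once and apply them automatically, so assets drift to cheaper tiers without manual intervention.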

Rohde & Schwarz, Product Manager, Storage Solutions, Dirk Thometzek

Rohde & Schwarz offers broadcast and media solutions to help companies grow in media production, management and delivery in the IP and wireless age.

Dirk Thometzek

What kind of storage are you offering, and will that be changing in the coming year?
The industry is constantly changing, so we monitor market developments and key demands closely. We will be adding new features to the R&S SpycerNode in the next few months that will enable our customers to get their creative work done without focusing on complex technologies. The R&S SpycerNode will be extended with JBODs, which will allow seamless integration with our erasure coding technology, guaranteeing complete resilience and performance.

Are certain storage tiers more suitable for different asset types, workflows, etc.?
Each workflow is different; consequently, almost no two systems are alike. The real artistry is to tailor storage systems according to real requirements without over-provisioning hardware or over-stressing budgets. Using different tiers can be very helpful in building effective systems, but they might introduce additional difficulties to the workflows if the system isn’t properly designed.

Rohde & Schwarz has developed R&S SpycerNode in a way that its performance is linear and predictable. Different tiers are aggregated under a single namespace, and our tools allow seamless workflows while complexity remains transparent to the users.

What do you see are the big technology trends that can help storage for M&E? ML? AI?
Machine learning and artificial intelligence can be helpful to automate certain tasks, but they will not replace human intervention in the short term. It might not be helpful to enrich media with too much data because doing so could result in imprecise queries that return far too much content.

However, clearly defined changes in sequences or reoccurring objects — such as bugs and logos — can be used as a trigger to initiate certain automated workflows. Certainly, we will see many interesting advances in the future.

How has NVMe advanced over the past year?
NVMe has very interesting aspects. Data rates and reduced latencies are admittedly quite impressive and are garnering a lot of interest. Unfortunately, we do see a trend inside our industry to be blinded by pure performance figures and exaggerated promises without considering hardware quality, life expectancy or proper implementation. Additionally, if well-designed and proven solutions exist that are efficient enough, then it doesn’t make sense to embrace a technology just because it is available.

R&S is dedicated to bringing high-end devices to the M&E market. We think that reliability and performance build the foundation for user-friendly products. Next year, we will update the market on how NVMe can be used in the most efficient way within our products.

Do you see NAS overtaking SAN for larger work groups? How about cloud taking on some of what NAS used to do?
We definitely see a trend away from classic Fibre Channel to Ethernet infrastructures for various reasons. For many years, NAS systems have been replacing central storage systems based on SAN technology for a lot of workflows. Unfortunately, standard NAS technologies will not support all necessary workflows and applications in our industry. Public and private cloud storage systems play an important role in overall concepts, but they can’t fulfill all necessary media production requirements or ease up workflows by default. Plus, when it comes to subscription models, [sometimes there could be unexpected fees]. In fact, we do see quite a few customers returning to their previous services, including on-premises storage systems such as archives.

When it comes to the very high data rates necessary for high-end media productions, NAS will relatively quickly reach its technical limits. Only block-level access can deliver the reliable performance necessary for uncompressed productions at high frame rates.

That does not necessarily mean Fibre Channel is the only solution. The R&S SpycerNode, for example, features a unified 100Gb/s Ethernet backbone, wherein clients and the redundant storage nodes are attached to the same network. This allows the clients to access the storage over industry-leading NAS technology or native block level while enabling true flexibility using state-of-the-art technology.

MTI Film, CEO, Larry Chernoff

Hollywood’s MTI Film is a full-service post facility, providing dailies, editorial, visual effects, color correction, and assembly for film, television, and commercials.

Larry Chernoff

What types of storage are you using for your workflows?
MTI uses a mix of spinning disks and SSDs. Our volumes range from 700TB to 1,000TB and are assigned to projects depending on the volume of expected camera files. The SSD volumes are substantially smaller and are used to play back ultra-large-resolution files where several users are accessing the same file.

Cloud versus on-prem — what are the pros and cons?
MTI only uses on-prem storage at the moment due to the real-time, full-resolution nature of our playback requirements. There is certainly a place for cloud-based storage but, as a finishing house, it does not apply to most of our workflows.

How often are you adding or upgrading your storage?
We are constantly adding storage to our facility. Each year, for the last five, we’ve added or replaced storage annually. We now have approximately 8+ PB, with plans for more in the future.

How do you feel about the use of ML/AI for managing assets?
Sounds like fun!

What role might the different tiers of cloud storage play in the lifecycle of an asset?
For a post house like MTI, cloud storage is useful only for “deep storage” since our bandwidth needs are very high. The amount of Internet connectivity we would require to replicate the workflows we currently run on on-prem storage would be prohibitively expensive for a facility such as MTI. Speed and ease of access are critical to fulfilling our customers’ demanding schedules.

OWC, Founder/CEO, Larry O’Connor

Larry O’Connor

OWC offers storage, connectivity, software, and expansion solutions designed to enhance, accelerate, and extend the capabilities of Mac- and PC-based technology. Its products range from the home desktop to the enterprise rack to the audio recording studio to the motion picture set and beyond.

What kind of storage are you offering, and will that be changing in the coming year?
OWC will be expanding our Jupiter line of NAS storage products in 2020 with an all-new external flash-based array. We will also be launching the OWC ThunderBay Flex 8, a three-in-one Thunderbolt 3 storage, docking, and PCIe expansion solution for digital imaging, VFX, video production, and video editing.

Are certain storage tiers more suitable for different asset types, workflows etc?
Yes. SSD and NVMe are better for on-set storage and editing. Once you are finished and looking to archive, HDDs are a better solution for long-term storage.

What do you see are the big technology trends that can help storage for M&E? ML? AI?
We see U.2 SSDs as a trend that can help storage in this space, as are solutions that allow external docking of U.2 drives across different workflow needs.

How has NVMe advanced over the past year?
We have seen NVMe technology become higher in capacity, higher in performance, and substantially lower in power draw. Yet even with all the improving performance, costs are lower today than they were 12 months ago.

Do you see NAS overtaking SAN for larger work groups? How about cloud taking on some of what NAS used to do?
I see both still having their place — I can’t speak to whether one will overtake the other. SANs provide other services that typically go hand in hand with M&E needs.

As for cloud, I can see some more cloud coming in, but for M&E on-site needs, it just doesn’t compete anywhere near with what the data rate demand is for editing, etc. Everything independently has its place.

EditShare, VP of Product Management, Sunil Mudholkar

EditShare offers a range of media management solutions, from ingest to archive with a focus on media and entertainment.

Sunil Mudholkar

What kind of storage are you offering and will that be changing in the coming year?
EditShare currently offers RAID and SSD, along with our nearline SATA HDD-based storage. We are on track to deliver NVMe- and cloud-based solutions in the first half of 2020. The latest major upgrade of our file system and management console, EFS2020, enables us to migrate to emerging technologies, including cloud deployment and NVMe hardware.

EFS can manage and use multiple storage pools, enabling clients to use the most cost-effective tiered storage for their production, all while keeping that single namespace.

Are certain storage tiers more suitable for different asset types, workflows etc?
Absolutely. It’s clearly financially advantageous to have varying performance tiers of storage that are in line with the workflows the business requires. This also extends to the cloud, where we are seeing public cloud-based solutions augment or replace both high-performance and long-term storage needs. Tiered storage enables clients to be at their most cost-effective by including parking storage and cloud storage for DR, while keeping SSD and NVMe storage ready and primed for their high-end production.

What do you see are the big technology trends that can help storage for M&E? ML? AI?
AI and ML have somewhat of an advantage for storage when it comes to things like algorithms that are designed to automatically move content between storage tiers to optimize costs. This has been commonplace in the distribution side of the ecosystem for a long time with CDNs. ML and AI have a great ability to impact the Opex side of asset management and metadata by helping to automate very manual, repetitive data entry tasks through audio and image recognition, as an example.

AI can also assist by removing mundane human-centric repetitive tasks, such as logging incoming content. AI can assist with the growing issue of unstructured and unmanaged storage pools, enabling the automatic scanning and indexing of every piece of content located on a storage pool.

How has NVMe advanced over the past year?
Like any other storage medium, when it’s first introduced there are limited use cases that make sense financially, and only a certain few can afford to deploy it. As the technology scales, changes in form factor, and becomes more competitively priced in line with other storage options, it can become more mainstream. This is what we are starting to see with NVMe.

Do you see NAS overtaking SAN for larger work groups? How about cloud taking on some of what NAS used to do?
Yes, NAS has overtaken SAN. It’s easier technology to deal with — this is fairly well acknowledged. It’s also easier to find people/talent with experience in NAS. Cloud will start to replace more NAS workflows in 2020, as we are already seeing today. For example, our ACL media spaces project options within our management console were designed for SAN clients migrating to NAS. They liked the granular detail that SAN offered, but wanted to migrate to NAS. EditShare’s ACL enables them to work like a SAN but in a NAS environment.

Zoic Studios, CTO, Saker Klippsten

Zoic Studios is an Emmy-winning VFX company based in Culver City, California, with sister offices in Vancouver and NYC. It creates computer-generated special effects for commercials, films, television and video games.

Saker Klippsten

What types of projects are you working on?
We work on a range of projects for series, film, commercial and interactive games (VR/AR). Most of the live-action projects are mixed with CG/VFX and some full-CG animated shots. In addition, there is typically some form of particle or fluid effects simulation going on, such as clouds, water, fire, destruction or other surreal effects.

What types of storage are you using for those workflows?
Cryogen – Off-the-shelf tape/disk/chip. Access time: more than one day. Mostly tape-based and completely offline; requires human intervention to load tapes or restore from drives.
Freezing – Tape robot library. Access time: less than half a day. Tape-based, in the robot; no human intervention required.
Cold – Spinning disk. Access time: slow (online). Disaster recovery and long-term archiving.
Warm – Spinning disk. Access time: medium (online). Data that still needs to be accessed promptly and transferred quickly (asset depot).
Hot – Chip-based. Access time: fast (online). SSD generic active production storage.
Blazing – Chip-based. Access time: uber fast (online). NVMe dedicated storage for 4K and 8K playback, databases and specific simulation workflows.
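Encoded as a lookup table, a policy like this might pick the cheapest tier that still meets a required access time. A sketch (the tier labels match the list above; the numeric access times are rough stand-ins for the descriptions, not Zoic’s real figures):

```python
# (typical access time in hours, tier name, media) — fastest first.
TIERS = [
    (0.0001, "Blazing",  "NVMe"),
    (0.001,  "Hot",      "SSD"),
    (0.1,    "Warm",     "spinning disk"),
    (1.0,    "Cold",     "spinning disk"),
    (12.0,   "Freezing", "tape robot library"),
    (36.0,   "Cryogen",  "offline tape/disk/chip"),
]

def cheapest_tier(tolerated_delay_hours: float) -> str:
    """Pick the slowest (cheapest) tier whose access time fits the deadline."""
    for access_time, tier, _media in reversed(TIERS):
        if access_time <= tolerated_delay_hours:
            return tier
    return "Blazing"  # nothing slower fits; fall back to the fastest tier
```

Iterating from slowest to fastest means data that can wait half a day lands on tape, while only truly latency-critical media earns NVMe.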

Cloud versus on-prem – what are the pros and cons?
The great debate! I tend not to look at it as pros vs. cons but as a question of where you are as a company. Many factors are involved, there is no one size that fits all, as many are led to believe, and neither cloud nor on-prem alone can solve all your workflow and business challenges.

Cinemax’s Warrior (Credit: HBO/David Bloomer)

There are workflows that are greatly suited for the cloud and others that are potentially cost-prohibitive for a number of reasons, such as the size of the data set being generated. Dynamics cache simulations are a good example; they can quickly generate tens, or sometimes hundreds, of TBs. If the workflow requires you to transfer this data on premises for review, it could take a very long time. Other workflows, such as 3D CG-generated data, can take better advantage of the cloud. They typically have small source-file payloads that need to be uploaded and then only require final frames to be downloaded, which is much more manageable. Depending on the size of your company and the level of technical people on hand, the cloud can also be a problem.

What triggers buying more storage in your shop?
Storage tends to be one of the largest and most significant purchases at many companies. End users do not have a clear concept of what happens at the other end of the wire from their workstation.

All they know is that there is never enough storage and it’s never fast enough. Not investing in the right storage can not only be detrimental to the delivery and production of a show, but also to the mental focus and health of the end users. If artists are constantly having to stop and clean up/delete, it takes them out of their creative rhythm and slows down task completion.

If the storage is not performing properly and is slow, this will not only have an impact on delivery, but the end users might also worry that they are being perceived as slow. So what goes into buying more storage? What type of impact will buying more storage have on the various workflows and pipelines? Remember, if you are a mature company, you are buying 2TB of storage for every 1TB required so that, for DR purposes, you have a complete up-to-the-hour backup.

Do you see ML/AI as important to your content strategy?
We have been using various layers of ML and heuristics sprinkled throughout our content workflows and pipelines. As an example, we look at the storage platforms we use to understand what’s on our storage, how and when it’s being used, what it’s being used for and how it’s being accessed. We look at the content to see what it contains and its characteristics. What are the overall costs to create that content? What insights can we learn from it for similarly created content? How can we reuse assets to be more efficient?

Dell Technologies, CTO, Media & Entertainment, Thomas Burns

Thomas Burns

Dell offers technologies across workstations, displays, servers, storage, networking and VMware, and partnerships with key media software vendors to provide media professionals the tools to deliver powerful stories, faster.

What kind of storage are you offering, and will that be changing in the coming year?
Dell Technologies offers a complete range of storage solutions from Isilon all-flash and disk-based scale-out NAS to our object storage, ECS, which is available as an appliance or a software-defined solution on commodity hardware. We have also developed and open-sourced Pravega, a new storage type for streaming data (e.g. IoT and other edge workloads), and continue to innovate in file, object and streaming solutions with software-defined and flexible consumption models.

Are certain storage tiers more suitable for different asset types, workflows etc?
Intelligent tiering is crucial to building a post and VFX pipeline. Today’s global pipelines must include software that distinguishes between hot data on the fastest tier and cold or versioned data on less performant tiers, especially in globally distributed workflows. Bringing applications to the media rather than unnecessarily moving media into a processing silo is the key to an efficient production.
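The hot/cold distinction Burns describes can be pictured as a simple policy that maps an asset's activity to a tier. A minimal sketch, with hypothetical tier names and thresholds:

```python
from datetime import datetime, timedelta

def pick_tier(last_access: datetime, now: datetime,
              hot_days: int = 7, warm_days: int = 90) -> str:
    """Map an asset's last-access time to a storage tier.
    Thresholds and tier names are illustrative only."""
    age = now - last_access
    if age <= timedelta(days=hot_days):
        return "all-flash"       # hot data on the fastest tier
    if age <= timedelta(days=warm_days):
        return "scale-out-nas"   # recent but idle material
    return "object-archive"      # cold or versioned data

now = datetime(2019, 12, 1)
print(pick_tier(datetime(2019, 11, 29), now))  # -> all-flash
print(pick_tier(datetime(2019, 6, 1), now))    # -> object-archive
```

Real tiering software weighs more signals than age (project status, file type, access frequency), but the shape of the decision is the same.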

What do you see are the big technology trends that can help storage for M&E? ML? AI?
New developments in storage class memory (SCM) — including the use of carbon nanotubes to create a nonvolatile, standalone memory product with speeds rivaling DRAM without needing battery backup — have the potential to speed up media workflows and eliminate AI/ML bottlenecks. New protocols such as NVMe allow much deeper I/O queues, overcoming today’s bus bandwidth limits.

GPUDirect enables direct paths between GPUs and network storage, bypassing the CPU for lower latency access to GPU compute — desirable for both M&E and AI/ML applications. Ethernet mesh, a.k.a. Leaf/Spine topologies, allow storage networks to scale more flexibly than ever before.

How has NVMe advanced over the past year?
Advances in I/O virtualization make NVMe useful in hyper-converged infrastructure, by allowing different virtual machines (VMs) to share a single PCIe hardware interface. Taking advantage of multi-stream writes, along with vGPUs and vNICs, allows talent to operate more flexibly as creative workstations start to become virtualized.

Do you see NAS overtaking SAN for larger work groups? How about cloud taking on some of what NAS used to do?
IP networks scale much better than any other protocol, so NAS allows on-premises workloads to be managed more efficiently than SAN. Object stores (the basic storage type for cloud services) support elastic workloads extremely well and will continue to be an integral part of public, hybrid and private cloud media workflows.

ATTO, Manager, Products Group, Peter Donnelly

ATTO network and storage connectivity products are purpose-made to support all phases of media production, from ingest to final archiving. ATTO offers an ecosystem of high-performance connectivity adapters, network interface cards and proprietary software.

Peter Donnelly

What kind of storage are you offering, and will that be changing in the coming year?
ATTO designs and manufactures storage connectivity products, and although we don’t manufacture storage, we are a critical part of the storage ecosystem. We regularly work with our customers to find the best solutions to their storage workflow and performance challenges.

ATTO designs products that use a wide variety of storage protocols. SAS, SATA, Fibre Channel, Ethernet and Thunderbolt are all part of our core technology portfolio. We’re starting to see more interest in NVMe solutions. While NVMe has already seen some solid growth as an “inside-the-box” storage solution, scalability, cost and limited management capabilities continue to limit its adoption as an external storage solution.

Data protection is still an important criterion in every data center. We are seeing a shift from traditional hardware RAID and parity RAID to software RAID and parity code implementations. Disk capacity has grown so quickly that it can take days to rebuild a RAID group with hardware controllers. Instead, we see our customers taking advantage of rapidly dropping storage prices and using faster, reliable software RAID implementations with basic HBA hardware.

How has NVMe advanced over the past year?
For inside-the-box storage needs, we have absolutely seen adoption skyrocket. It’s hard to beat the price-to-performance ratio of NVMe drives for system boot, application caching and similar use cases.

ATTO is working independently and with our ecosystem partners to bring those same benefits to shared, networked storage systems. Protocols such as NVMe-oF and FC-NVMe are enabling technologies that are starting to mature, and we see these getting further attention in the coming year.

Do you see NAS overtaking SAN for larger work groups? How about cloud taking on some of what NAS used to do?
We see customers looking for ways to more effectively share storage resources. Acquisition and ongoing support costs, as well as the ability to leverage existing technical skills, seem to be important factors pulling people toward Ethernet-based solutions.
However, there is no free lunch, and these same customers aren’t able to compromise on performance and latency concerns, which are important reasons why they used SANs in the first place. So there’s a lot of uncertainty in the market today. Since we design and market products in both the NAS and SAN spaces, we spend a lot of time talking with our customers about their priorities so that we can help them pick the solutions that best fit their needs.

Masstech, CTO, Mike Palmer

Masstech creates intelligent storage and asset lifecycle management solutions for the media and entertainment industry, focusing on broadcast and video content storage management with IT technologies.

Mike Palmer

What kind of storage are you offering, and will that be changing in the coming year?
Masstech products are used to manage a combination of any or all of these kinds of storage, from on-prem disk and tape to cloud. Masstech allows content to move without friction across and through all of these technologies, most often using automated workflows and unified interfaces that hide the complexity otherwise required to directly manage content across so many different types of storage.

Are certain storage tiers more suitable for different asset types, workflows, etc.?
One of the benefits of having such a wide range of storage technologies to choose from is that we have the flexibility to match application requirements with the optimum performance characteristics of different storage technologies in each step of the lifecycle. Users now expect that content will automatically move to storage with the optimal combination of speed and price as it progresses through workflow.

In the past, HSM was designed to handle this task for on-prem storage. The challenge is much wider now with the addition of a plethora of storage technologies and services. Rather than moving between just two or three tiers of on-prem storage, content now often needs to flow through a hybrid environment of on-prem and cloud storage, often involving multiple cloud services, each with three or four sub-tiers. Making that happen in a seamless way, both to users and to integrated MAMs and PAMs, is what we do.

What do you see are the big technology trends that can help storage for M&E?
Cloud storage pricing continues to drop, along with advances in storage density in both spinning disk and solid state. All of these trends are interrelated and have the general effect of lowering costs for the end user. For those with specific business requirements that drive on-prem storage, the availability of higher-density tape and optical disks is enabling petabytes of very efficient cold storage in less space than a single rack.

How has NVMe advanced over the past year?
In addition to the obvious application of making media available more quickly, the greatest value of NVMe within M&E may be found in enabling faster search of both structured and unstructured metadata associated with media. Yes, we need faster access to media, but in many cases we must first find the media before it can be accessed. NVMe can make that search experience, particularly for large libraries, federated data sets and media lakes, lightning quick.

Do you see NAS overtaking SAN for larger workgroups? How about cloud taking on some of what NAS used to do?
Just as AWS, Azure and Wasabi, among other large players, have replaced many instances of on-prem NAS, so have Box, Dropbox, Google Drive and iCloud replaced many (but not all) of the USB drives gathering dust in the bottom of desk drawers. As NAS is built on top of faster and faster performing technologies, it is also beginning to put additional pressure on SAN – particularly for users who are sensitive to price and the amount of administration required.

Backblaze, Director of Product Marketing, M&E, Skip Levens

Backblaze offers easy-to-use cloud backup, archive and storage services. With over 12 years of experience and more than 800 petabytes of customer data under management, Backblaze offers cloud storage to anyone looking to create, distribute and preserve their content forever.

What kind of storage are you offering and will that be changing in the coming year?
At Backblaze, we offer a single class, or tier, of storage where everything’s active and immediately available wherever you need it, and it’s protected better than it would be on spinning disk or RAID systems.

Skip Levens

Are certain storage tiers more suitable for different asset types, workflows, etc?
Absolutely. For example, animators need different storage than a team of editors all editing a 4K project at the same time. And keeping your entire content library on your shared storage could get expensive indeed.

We’ve found that users can give up all that unneeded complexity and cost that gets in the way of creating content in two steps:
– Step one is getting off of the “shared storage expansion treadmill” and buying just enough on-site shared storage that fits your team. If you’re delivering a TV show every week and need a SAN, make it just large enough for your work in process and no larger.

– Step two is to get all of your content into active cloud storage. This not only frees up space on your shared storage, but makes all of your content highly protected and highly available at the same time. Since most of your team probably uses a MAM to find and discover content, the storage that assets actually live on is completely transparent.

Now life gets very simple for creative support teams managing that workflow: your shared storage stays fast and lean, and you can stop paying for storage that doesn’t fit that model. This could include getting rid of LTO, big JBODs or anything with a limited warranty and a maintenance contract.

What do you see are the big technology trends that can help storage for M&E?
For shooters and on-set data wranglers, the new class of ultra-fast flash drives dramatically speeds up collecting massive files with extremely high resolution. Of course, raw content isn’t safe until it’s ingested, so even after moving shots to two sets of external drives or a RAID cart, we’re seeing cloud archive on ingest. Uploading files from a remote location, before you get all the way back to the editing suite, unlocks a lot of speed and collaboration advantages — the content is protected faster, and your ingest tools can start making proxy versions that everyone can start working on, such as grading, commenting, even rough cuts.

We’re also seeing cloud-delivered workflow applications. Buying and maintaining a server and storage in your shop to run an application now seems old-fashioned, especially when that entire experience can be delivered from the cloud, on demand.

Iconik, for example, is a complete, personalized deployment of a project collaboration, asset review and management tool – but it lives entirely in the cloud. When you log in, your app springs to life instantly in the cloud, so you only pay for the application when you actually use it. Users just want to get their creative work done and can’t tell it isn’t a traditional asset manager.

How has NVMe advanced over the past year?
NVMe means flash storage can completely ditch legacy storage controllers like the ones on traditional SATA hard drives. When you can fit 2TB of storage on a stick that’s only 22 millimeters by 80 millimeters — not much larger than a stick of gum — and it’s 20 times faster than an external spinning hard drive while drawing only about 3.5 watts, that’s a game changer for data wrangling and camera cart offload right now.

And that’s on PCIe 3. The PCI Express standard is evolving faster and faster too. PCIe 4 motherboards are starting to come online now, PCIe 5 was finalized in May, and PCIe 6 is already in development. When every generation doubles the available bandwidth that can feed that NVMe storage, the future is very, very bright for NVMe.
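The per-generation doubling compounds quickly. As a rough sketch (treating PCIe 3.0 as roughly 1 GB/s per lane after encoding overhead; the function and baseline figure are illustrative approximations, not spec-exact values):

```python
def pcie_lane_bandwidth_gbps(gen: int, base_gen: int = 3, base_gbps: float = 1.0) -> float:
    """Approximate usable bandwidth per PCIe lane in GB/s,
    modeling each generation as a doubling of the previous one."""
    return base_gbps * 2 ** (gen - base_gen)

# A typical NVMe drive uses four lanes (x4)
for gen in (3, 4, 5, 6):
    x4 = 4 * pcie_lane_bandwidth_gbps(gen)
    print(f"PCIe {gen}.0 x4: ~{x4:.0f} GB/s")
```

By this rough model, an x4 NVMe drive goes from about 4 GB/s on PCIe 3 to about 32 GB/s on PCIe 6, which is the doubling curve Levens is pointing at.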

Do you see NAS overtaking SAN for larger workgroups? How about cloud taking on some of what NAS used to do?
For users who work in widely distributed teams, the cloud is absolutely eating NAS. When the solution driving your team’s projects and collaboration is the dashboard and focus of the team — and active cloud storage seamlessly supports all of the content underneath — it no longer needs to be on a NAS.

But for large teams that do fast-paced editing and creation, the answer to “what is the best shared storage for our team” is still usually a SAN, or tightly-coupled, high-performance NAS.

Either way, by moving content and project archives to the cloud, you can keep SAN and NAS costs in check and have a more productive workflow, and more opportunities to use all that content for new projects.

Quick Chat: The Rebel Fleet’s Michael Urban talks on-set workflows

When shooting major motion pictures and episodic television with multiple crews in multiple locations, production teams need a workflow that gives them fast access and complete control of the footage across the entire production, from the first day of the shoot to the last day of post. This is Wellington, New Zealand-based The Rebel Fleet’s reason for being.

What exactly do they do? Well we reached out to managing director Michael Urban to find out.

Can you talk more about what you do and what types of workflows you supply?
The Rebel Fleet supplies complete workflow solutions, from on-set Qtake video assist and DIT to dailies, QC, archive and delivery to post. By managing the entire workflow, we can provide consistency and certainty around the color pipeline, monitor calibration, crew expertise and communication, and production can rely on one team to take care of that part of the workflow.

We have worked closely with Moxion many times and use its Immediates workflow, which enables automated uploads direct from video assist into its secure dailies platform. Anyone with access to the project can view rushes and metadata from set moments after the video is shot. This also enables different shooting units to automatically and securely share media. Two units shooting in different countries can see what each other has shot, including all camera and scene/take metadata. This is then available and catalogued directly into the video assist system. We have a lot of experience working alongside camera and VFX on-set as well as delivering to post, making sure we are delivering exactly what’s needed in the right formats.

You recently worked on a film that was shot in New Zealand and China, and you sent crews to China. Can you talk about that workflow a bit and name the film?
I can’t name the film yet, but I can tell you that it’s in the adventure genre and is coming out in the second half of 2020. The main pieces of software are Colorfront On-Set Dailies for processing all the media and Yoyotta for downloading and verifying media. We also use Avid for some edit prep before handing over to editorial.

How did you work with the DP and director? Can you talk about those relationships on this particular film?
On this shoot the DP and director had rushes screenings each night to go over the main unit and second unit rushes and make sure the dailies grade was exactly what they wanted. This was the last finesse before handing over dailies to editorial, so it had to be right. As rushes were being signed off, we would send them off to the background render engine, which would create four different outputs in multiple resolutions and framing. This meant that moments after the last camera mag was signed off, the media was ready for Avid prep and delivery. Our data team worked hard to automate as many processes as possible so there would be no long nights sorting reports and sheets. That work happened as we went throughout the day instead of leaving a multitude of tasks for the end of the day.

How do your workflows vary from project to project?
Every shoot is approached with a clean slate, and we work with the producers, DP and post to make sure we create a workflow that suits the logistical, budgetary and technical needs of that shoot. We have a tool kit that we rely on and use it to select the correct components required. We are always looking for ways to innovate and provide more value for the bottom line.

You mentioned using Colorfront tools, what does that offer you? And what about storage? Seems like working on location means you need a solid way to back up.
Colorfront On-Set Dailies takes care of QC, grade, sound sync and metadata. All of our shared storage is built around Quantum Xcellis, plus the Quantum QXS hybrid storage systems for online and nearline. We create the right SAN for the job depending on the amount of storage and clients required for that shoot.

Can you name projects you’ve worked on in the past as well as some recent work?
Warner Bros.’ The Meg, DreamWorks’ Ghost in the Shell, Sonar’s The Shannara Chronicles, STX Entertainment’s Adrift, Netflix’s The New Legends of Monkey and The Letter for the King and Blumhouse’s Fantasy Island.

Deluxe NY adds color to Mister Rogers biopic

A Beautiful Day in the Neighborhood stars Tom Hanks as children’s television icon Fred Rogers in a story about kindness triumphing over cynicism. Inspired by the article “Can You Say…Hero?” by journalist Tom Junod, the film is directed by Marielle Heller. The cinematographer Jody Lee Lipes worked on the color finishing with Deluxe New York’s Sam Daley.

Together, Heller and Lipes worked to replicate a late-1990s film aesthetic through in-camera techniques. After testing various film and digital camera options, production opted to shoot the majority of the footage with ARRI Alexa cameras in Super 16 mode. To more accurately represent the look of Mister Rogers’ Neighborhood, Lipes’ team scoured the globe for working versions of the same Ikegami video cameras that were used to tape the show. In a similar quest for authenticity, Daley brushed up on the look of Mister Rogers’ Neighborhood by watching old episodes and even visiting a Pittsburgh museum that housed the show’s original set. He also researched film styles typical of the time period to help inform the overall look of the feature.

“Incorporating Ikegami video footage into the pipeline was the most challenging aspect of the color on this film, and we did considerable testing to make sure that the quality of the video recordings would hold up in a theatrical environment,” Daley explained. “Jody and I have been working together for more than 10 years; we’re aesthetically in-sync and we both like to take what some might consider risks visually, and this film is no different.”

Throughout the color finishing process, Daley helped unify and polish the final footage, which included PAL and NTSC video in addition to the Alexa-acquired digital material. He paid careful attention to integrating the different video standards and frame rates while also shaping two distinct looks to reflect the narrative. In contrast to the optimistic Rogers and his colorful world, Daley gave the scenes around the pessimistic Junod, named “Lloyd Vogel” in the film and played by Matthew Rhys, a cool, moody feel.

2019 HPA Award winners announced

The industry came together on November 21 in Los Angeles to celebrate its own at the 14th annual HPA Awards. Awards were given to individuals and teams working in 12 creative craft categories, recognizing outstanding contributions to color grading, sound, editing and visual effects for commercials, television and feature film.

Rob Legato receiving Lifetime Achievement Award from presenter Mike Kanfer. (Photo by Ryan Miller/Capture Imaging)

As was previously announced, renowned visual effects supervisor and creative Robert Legato, ASC, was honored with this year’s HPA Lifetime Achievement Award; Peter Jackson’s They Shall Not Grow Old was presented with the HPA Judges Award for Creativity and Innovation; acclaimed journalist Peter Caranicas was the recipient of the very first HPA Legacy Award; and special awards were presented for Engineering Excellence.

The winners of the 2019 HPA Awards are:

Outstanding Color Grading – Theatrical Feature

WINNER: “Spider-Man: Into the Spider-Verse”
Natasha Leonnet // Efilm

“First Man”
Natasha Leonnet // Efilm

“Roma”
Steven J. Scott // Technicolor

Natasha Leonnet (Photo by Ryan Miller/Capture Imaging)

“Green Book”
Walter Volpatto // FotoKem

“The Nutcracker and the Four Realms”
Tom Poole // Company 3

“Us”
Michael Hatzer // Technicolor

 

Outstanding Color Grading – Episodic or Non-theatrical Feature

WINNER: “Game of Thrones – Winterfell”
Joe Finley // Sim, Los Angeles

 “The Handmaid’s Tale – Liars”
Bill Ferwerda // Deluxe Toronto

“The Marvelous Mrs. Maisel – Vote for Kennedy, Vote for Kennedy”
Steven Bodner // Light Iron

“I Am the Night – Pilot”
Stefan Sonnenfeld // Company 3

“Gotham – Legend of the Dark Knight: The Trial of Jim Gordon”
Paul Westerbeck // Picture Shop

“The Man in The High Castle – Jahr Null”
Roy Vasich // Technicolor

 

Outstanding Color Grading – Commercial  

WINNER: Hennessy X.O. – “The Seven Worlds”
Stephen Nakamura // Company 3

Zara – “Woman Campaign Spring Summer 2019”
Tim Masick // Company 3

Tiffany & Co. – “Believe in Dreams: A Tiffany Holiday”
James Tillett // Moving Picture Company

Palms Casino – “Unstatus Quo”
Ricky Gausis // Moving Picture Company

Audi – “Cashew”
Tom Poole // Company 3

 

Outstanding Editing – Theatrical Feature

Once Upon a Time… in Hollywood

WINNER: “Once Upon a Time… in Hollywood”
Fred Raskin, ACE

“Green Book”
Patrick J. Don Vito, ACE

“Rolling Thunder Revue: A Bob Dylan Story by Martin Scorsese”
David Tedeschi, Damian Rodriguez

“The Other Side of the Wind”
Orson Welles, Bob Murawski, ACE

“A Star Is Born”
Jay Cassidy, ACE

 

Outstanding Editing – Episodic or Non-theatrical Feature (30 Minutes and Under)

VEEP

WINNER: “Veep – Pledge”
Roger Nygard, ACE

“Russian Doll – The Way Out”
Todd Downing

“Homecoming – Redwood”
Rosanne Tan, ACE

“Withorwithout”
Jake Shaver, Shannon Albrink // Therapy Studios

“Russian Doll – Ariadne”
Laura Weinberg

 

Outstanding Editing – Episodic or Non-theatrical Feature (Over 30 Minutes)

WINNER: “Stranger Things – Chapter Eight: The Battle of Starcourt”
Dean Zimmerman, ACE, Katheryn Naranjo

“Chernobyl – Vichnaya Pamyat”
Simon Smith, Jinx Godfrey // Sister Pictures

“Game of Thrones – The Iron Throne”
Katie Weiland, ACE

“Game of Thrones – The Long Night”
Tim Porter, ACE

“The Bodyguard – Episode One”
Steve Singleton

 

Outstanding Sound – Theatrical Feature

WINNER: “Godzilla: King of the Monsters”
Tim LeBlanc, Tom Ozanich, MPSE // Warner Bros.
Erik Aadahl, MPSE, Nancy Nugent, MPSE, Jason W. Jennings // E Squared

“Shazam!”
Michael Keller, Kevin O’Connell // Warner Bros.
Bill R. Dean, MPSE, Erick Ocampo, Kelly Oxford, MPSE // Technicolor

“Smallfoot”
Michael Babcock, David E. Fluhr, CAS, Jeff Sawyer, Chris Diebold, Harrison Meyle // Warner Bros.

“Roma”
Skip Lievsay, Sergio Diaz, Craig Henighan, Carlos Honc, Ruy Garcia, MPSE, Caleb Townsend

“Aquaman”
Tim LeBlanc // Warner Bros.
Peter Brown, Joe Dzuban, Stephen P. Robinson, MPSE, Eliot Connors, MPSE // Formosa Group

 

Outstanding Sound – Episodic or Non-theatrical Feature

WINNER: “The Haunting of Hill House – Two Storms”
Trevor Gates, MPSE, Jason Dotts, Jonathan Wales, Paul Knox, Walter Spencer // Formosa Group

“Chernobyl – 1:23:45”
Stefan Henrix, Stuart Hilliker, Joe Beal, Michael Maroussas, Harry Barnes // Boom Post

“Deadwood: The Movie”
John W. Cook II, Bill Freesh, Mandell Winter, MPSE, Daniel Colman, MPSE, Ben Cook, MPSE, Micha Liberman // NBC Universal

“Game of Thrones – The Bells”
Tim Kimmel, MPSE, Onnalee Blank, CAS, Mathew Waters, CAS, Paula Fairfield, David Klotz

“Homecoming – Protocol”
John W. Cook II, Bill Freesh, Kevin Buchholz, Jeff A. Pitts, Ben Zales, Polly McKinnon // NBC Universal

 

Outstanding Sound – Commercial 

WINNER: John Lewis & Partners – “Bohemian Rhapsody”
Mark Hills, Anthony Moore // Factory

Audi – “Life”
Doobie White // Therapy Studios

Leonard Cheshire Disability – “Together Unstoppable”
Mark Hills // Factory

New York Times – “The Truth Is Worth It: Fearlessness”
Aaron Reynolds // Wave Studios NY

John Lewis & Partners – “The Boy and the Piano”
Anthony Moore // Factory

 

Outstanding Visual Effects – Theatrical Feature

WINNER: “The Lion King”
Robert Legato
Andrew R. Jones
Adam Valdez, Elliot Newman, Audrey Ferrara // MPC Film
Tom Peitzman // T&C Productions

“Avengers: Endgame”
Matt Aitken, Marvyn Young, Sidney Kombo-Kintombo, Sean Walker, David Conley // Weta Digital

“Spider-Man: Far From Home”
Alexis Wajsbrot, Sylvain Degrotte, Nathan McConnel, Stephen Kennedy, Jonathan Opgenhaffen // Framestore

“Alita: Battle Angel”
Eric Saindon, Michael Cozens, Dejan Momcilovic, Mark Haenga, Kevin Sherwood // Weta Digital

“Pokémon Detective Pikachu”
Jonathan Fawkner, Carlos Monzon, Gavin Mckenzie, Fabio Zangla, Dale Newton // Framestore

 

Outstanding Visual Effects – Episodic (Under 13 Episodes) or Non-theatrical Feature

Game of Thrones

WINNER: “Game of Thrones – The Bells”
Steve Kullback, Joe Bauer, Ted Rae
Mohsen Mousavi // Scanline
Thomas Schelesny // Image Engine

“Game of Thrones – The Long Night”
Martin Hill, Nicky Muir, Mike Perry, Mark Richardson, Darren Christie // Weta Digital

“The Umbrella Academy – The White Violin”
Everett Burrell, Misato Shinohara, Chris White, Jeff Campbell, Sebastien Bergeron

“The Man in the High Castle – Jahr Null”
Lawson Deming, Cory Jamieson, Casi Blume, Nick Chamberlain, William Parker, Saber Jlassi, Chris Parks // Barnstorm VFX

“Chernobyl – 1:23:45”
Lindsay McFarlane
Max Dennison, Clare Cheetham, Steven Godfrey, Luke Letkey // DNEG

 

Outstanding Visual Effects – Episodic (Over 13 Episodes)

Team from The Orville – Outstanding VFX, Episodic, Over 13 Episodes (Photo by Ryan Miller/Capture Imaging)

WINNER: “The Orville – Identity: Part II”
Tommy Tran, Kevin Lingenfelser, Joseph Vincent Pike // FuseFX
Brandon Fayette, Brooke Noska // Twentieth Century FOX TV

“Hawaii Five-O – Ke iho mai nei ko luna”
Thomas Connors, Anthony Davis, Chad Schott, Gary Lopez, Adam Avitabile // Picture Shop

“9-1-1 – 7.1”
Jon Massey, Tony Pirzadeh, Brigitte Bourque, Gavin Whelan, Kwon Choi // FuseFX

“Star Trek: Discovery – Such Sweet Sorrow Part 2”
Jason Zimmerman, Ante Dekovic, Aleksandra Kochoska, Charles Collyer, Alexander Wood // CBS Television Studios

“The Flash – King Shark vs. Gorilla Grodd”
Armen V. Kevorkian, Joshua Spivack, Andranik Taranyan, Shirak Agresta, Jason Shulman // Encore VFX

The 2019 HPA Engineering Excellence Awards were presented to:

Adobe – Content-Aware Fill for Video in Adobe After Effects

Epic Games — Unreal Engine 4

Pixelworks — TrueCut Motion

Portrait Displays and LG Electronics — CalMan LUT based Auto-Calibration Integration with LG OLED TVs

Honorable Mentions were awarded to Ambidio for Ambidio Looking Glass; Grass Valley for Creative Grading; and Netflix for Photon.

IDC goes bicoastal, adds Hollywood post facility 


New York’s International Digital Centre (IDC) has opened a new 6,800-square-foot digital post facility in Hollywood, with Rosanna Marino serving as COO. IDC LA will focus on serving the entertainment, content creation, distribution and streaming industries.

Rosanna Marino

Marino will manage sales, marketing, engineering and the day-to-day operations for the Hollywood location, while IDC founder/CEO Marcy Gilbert will lead the company’s overall activities and the New York headquarters.

IDC will provide finishing, color grading and editorial in Dolby Vision 4K HDR and UHD, as well as global QC. IDC LA features 11 bays and a DI theater, which includes Dolby 7.1 Atmos audio mixing, dubbing and audio description. It also provides subtitle and closed-caption timed-text creation and localization, ABS scripting and translations in over 40 languages.

To complete the end-to-end chain, they provide IMF and DCP creation, supplemental and all media fulfillment processing, including audio and timed text conforms for distribution. IDC is an existing Netflix Partner Program member — NP3 in New York and NPFP for the Americas and Canada.

IDC LA occupies the top two floors and rooftop deck of a vintage 1930s brick building on Santa Monica Boulevard.

Abu Dhabi’s twofour54 is now Dolby Vision certified

Abu Dhabi’s twofour54 has become Dolby Vision certified in an effort to meet the demand for color grading and mastering Dolby Vision HDR content. twofour54 is the first certified Dolby Vision facility in the UAE, providing work in both Arabic and English.

“The way we consume content has been transformed by connectivity and digitalization, with consumers able to choose not only what they watch but where, when and how,” says Katrina Anderson, director of commercial services at twofour54. “This means it is essential that content creators have access to technology such as Dolby Vision in order to ensure their content reaches as wide an audience as possible around the world.”

With Netflix, Amazon Prime and others now competing with existing broadcasters, there is a big demand around the world for high-quality production facilities. According to twofour54, Netflix’s expenditure on content creation soared from $4.6 billion in 2015 to $12 billion last year, while other platforms — such as Amazon Prime, Apple TV and YouTube — are also seeking to create more unique content. Consequently, the global demand for production facilities such as those offered by twofour54 is outstripping supply.

“We have seen increased interest in Dolby Vision for home entertainment due to the growing popularity of digital streaming services in the Middle East, and we are now able to support studios and content creators with leading-edge tools deployed at twofour54’s world-class post facility,” explains Pankaj Kedia, managing director of emerging markets for Dolby Laboratories. “Dolby Vision is the preferred HDR mastering workflow for leading studios and a growing number of content creators, and this latest offering demonstrates twofour54’s commitment to making Abu Dhabi a preferred location for film and TV production.”

Why is this important? For color grading of movies and episodic content, Dolby has created a workflow that generates shot-by-shot dynamic metadata that allows filmmakers to see how their content will look on consumer devices. The colorist can then add “trims” to adjust how the mapping looks and to deliver a better-looking SDR version for content providers serving early Ultra HD (UHD) televisions that are capable only of SDR reproduction.

The colorists at twofour54 use both Blackmagic DaVinci Resolve and FilmLight Baselight systems.

Main Image: Engineer Noura Al Ali

Harbor crafts color and sound for The Lighthouse

By Jennifer Walden

Director Robert Eggers’ The Lighthouse tells the tale of two lighthouse keepers, Thomas Wake (Willem Dafoe) and Ephraim Winslow (Robert Pattinson), who lose their minds while isolated on a small rocky island, battered by storms, plagued by seagulls and haunted by supernatural forces/delusion-inducing conditions. It’s an A24 film that hit theaters in late October.

Much like his first feature-length film The Witch (winner of the 2015 Sundance Film Festival Directing Award for a dramatic film and the 2017 Independent Spirit Award for Best First Feature), The Lighthouse is a tense and haunting slow descent into madness.

But “unlike most films where the crazy ramps up, reaching a fever pitch and then subsiding or resolving, in The Lighthouse the crazy ramps up to a fever pitch and then stays there for the next hour,” explains Emmy-winning supervising sound editor/re-recording mixer Damian Volpe. “It’s like you’re stuck with them, they’re stuck with each other and we’re all stuck on this rock in the middle of the ocean with no escape.”

Volpe, who’s worked with director Eggers on two short films — The Tell-Tale Heart and Brothers — thought he had a good idea of just how intense the film and post sound process would be going into The Lighthouse, but it ended up exceeding his expectations. “It was definitely the most difficult job I’ve done in over two decades of working in post sound for sure. It was really intense and amazing,” he says.

Eggers chose Harbor’s New York City location for both sound and final color. This was colorist Joe Gawler’s first time working with Eggers, but it couldn’t have been a more fitting film. The Lighthouse was shot on 35mm black & white (Double-X 5222) film with a 1.19:1 aspect ratio, and as it happens, Gawler is well versed in the world of black & white. He has remastered a tremendous number of classic titles for The Criterion Collection, such as Breathless, Seven Samurai and several Fellini films, including 8½. “To take that experience from my Criterion title work and apply that to giving authenticity to a contemporary film that feels really old, I think it was really helpful,” Gawler says.

Joe Gawler

The advantage of shooting on film versus shooting digitally is that film negatives can be rescanned as technology advances, making it possible to take a film from the ‘60s and remaster it into 4K resolution. “When you shoot something digitally, you’re stuck in the state-of-the-moment technology. If you were shooting digitally 10 years ago and want to create a new deliverable of your film and reimagine it with today’s display technologies, you are compromised in some ways. You’re having to up-res that material. But if you take a 35mm film negative shot 100 years ago, the resolution is still inside that negative. You can rescan it with a new scanner and it’s going to look amazing,” explains Gawler.

While most of The Lighthouse was shot on black & white film (with Baltar lenses designed in the 1930s for that extra dose of authenticity), there were a few stock footage shots of the ocean with big storm waves and some digitally rendered elements, such as the smoke, that had to be color corrected and processed to match the rich, grainy quality of the film. “Those stock footage shots we had to beat up to make them feel more aged. We added a whole bunch of grain into those and the digital elements so they felt seamless with the rest of the film,” says Gawler.
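Grain matching of this kind is usually approximated in software by overlaying per-pixel noise on the cleaner source. As a rough illustration only (not the actual toolset Harbor used), here is a minimal NumPy sketch of the idea:

```python
import numpy as np

def add_grain(img, amount, seed=None):
    """Overlay Gaussian noise on a float image (values 0.0-1.0) to
    mimic film grain. 'amount' is the noise standard deviation in
    image units; larger values read as coarser, more aged footage."""
    rng = np.random.default_rng(seed)
    noise = rng.normal(0.0, amount, size=img.shape)
    return np.clip(img + noise, 0.0, 1.0)

# A clean stock-footage plate gets a heavier dose of grain so it
# blends with the surrounding 35mm film scans.
plate = np.full((8, 8), 0.5)
grainy = add_grain(plate, amount=0.05, seed=1)
```

Real film grain is luminance-dependent and spatially correlated, so this uniform Gaussian version is only a first-order stand-in for what a grading or compositing tool does.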

The digitally rendered elements were separate VFX pieces composited into the black & white film image using Blackmagic’s DaVinci Resolve. “Conforming the movie in Resolve gave us the flexibility to have multiple layers and allowed us to punch through one layer to see more or less of another layer,” says Gawler. For example, to get just that right amount of smoke, “we layered the VFX smoke element on top of the smokestack in the film and reduced the opacity of the VFX layer until we found the level that Rob and DP Jarin Blaschke were happy with.”
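The opacity control Gawler describes is, at its core, a linear mix between the filmed plate and the VFX layer. A minimal sketch of that idea (illustrative only, not Resolve’s internal implementation):

```python
import numpy as np

def blend_layer(base, layer, opacity):
    """Linearly mix a VFX layer over a base plate.

    opacity=0.0 shows only the base; opacity=1.0 shows only the layer.
    Both arrays are float images in the 0.0-1.0 range, same shape."""
    return (1.0 - opacity) * base + opacity * layer

# Example: dial rendered smoke down to 30% over the filmed smokestack
plate = np.full((4, 4), 0.2)   # stand-in for the film frame
smoke = np.full((4, 4), 0.8)   # stand-in for the rendered smoke element
out = blend_layer(plate, smoke, 0.3)
```

In a grading suite this slider is adjusted by eye until the director and DP are happy; the math underneath stays this simple.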

In terms of color, Gawler notes The Lighthouse was all about exposure and contrast. The spectrum of gray rarely goes to true white and the blacks are as inky as they can be. “Jarin didn’t want to maintain texture in the blackest areas, so we really crushed those blacks down. We took a look at the scopes and made sure we were bottoming out so that the blacks were pure black.”
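Crushing blacks so the scopes “bottom out” amounts to choosing a black point and clipping everything at or below it to zero. A simplified sketch of the operation (the actual grade was of course done in the grading system, not code):

```python
import numpy as np

def crush_blacks(img, black_point):
    """Clip everything at or below black_point to pure black (0.0)
    and rescale the remaining range so full white is unaffected.
    img is a float image in 0.0-1.0."""
    out = (img - black_point) / (1.0 - black_point)
    return np.clip(out, 0.0, 1.0)

frame = np.array([0.02, 0.05, 0.5, 1.0])
crushed = crush_blacks(frame, 0.05)
# values at or below 0.05 bottom out at pure black; 1.0 stays 1.0
```

This is the behavior a colorist would verify on a waveform monitor: the bottom of the trace sits hard on zero, with no texture retained in the deepest shadows.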

From production to post, Eggers’ goal was to create a film that felt like it could have been pulled from a 1930s film archive. “It feels authentically antique, and that goes for the performances, the production design and all the period-specific elements — the lights they used and the camera, and all the great care we took in our digital finish of the film to make it feel as photochemical as possible,” says Gawler.

The Sound
This holds true for post sound, too. So much so that Eggers and Volpe kicked around the idea of making the soundtrack mono. “When I heard the first piece of score from composer Mark Korven, the whole mono idea went out the door,” explains Volpe. “His score was so wide and so rich in terms of tonality that we never would’ve been able to make this difficult dialogue work if we had to shove it all down one speaker’s mouth.”

The dialogue was difficult on many levels. First, Volpe describes the language as “old-timey, maritime,” delivered in two different accents — Dafoe has an Irish-tinged seasoned-sailor accent and Pattinson has a Down East Maine accent. Additionally, the production location made it difficult to record the dialogue, with wind, rain and dripping water sullying the tracks. Re-recording mixer Rob Fernandez, who handled the dialogue and music, notes that when it’s raining, the lighthouse is leaking; the water is visible in the shots because they shot it that way. “So the water sound is married to the dialogue. We wanted to have control over the water, so the dialogue had to be looped. Rob wanted to save as much of the amazing on-set performances as possible, so we tried to go to ADR for specific syllables and words,” says Fernandez.

Rob Fernandez

That wasn’t easy to do, especially toward the end of the film during Dafoe’s monologue. “That was very challenging because at one point all of the water and surrounding sounds disappear. It’s just his voice,” says Fernandez. “We had to do a very slow transition into that so the audience doesn’t notice. It’s really focusing you in on what he is saying. Then you’re snapped out of it and back into reality with full surround.”

Another challenging dialogue moment was a scene in which Pattinson is leaning on Dafoe’s lap, and their mics are picking up each other’s lines. Plus, there’s water dripping. Again, Eggers wanted to use as much production as possible so Fernandez tried a combination of dialogue tools to help achieve a seamless match between production and ADR. “I used a lot of Synchro Arts’ Revoice Pro to help with pitch matching and rhythm matching. I also used every tool iZotope offers that I had at my disposal. For EQ, I like FabFilter. Then I used reverb to make the locations work together,” he says.

Volpe reveals, “Production sound mixer Alexander Rosborough did a wonderful job, but the extraneous noises required us to replace at least 60% of the dialogue. We spent several months on ADR. Luckily, we had two extremely talented and willing actors. We had an extremely talented mixer, Rob Fernandez. My dialogue editor William Sweeney was amazing too. Between the directing, the acting, the editing and the mixing they managed to get it done. I don’t think you can ever tell that so much of the dialogue has been replaced.”

The third main character in the film is the lighthouse itself, which lives and breathes with a heartbeat and lungs. The mechanism of the Fresnel lens at the top of the lighthouse has a deep, bassy gear-like heartbeat and rasping lungs that Volpe created from wrought iron bars drawn together. Then he added reverb to make the metal sound breathier. In the bowels of the lighthouse there is a steam engine that drives the gears to turn the light. Ephraim (Pattinson) is always looking up toward Thomas (Dafoe), who is in the mysterious room at the top of the lighthouse. “A lot of the scenes revolve around clockwork, which is just another rhythmic element. So Ephraim starts to hear that and also the sound of the light that composer Korven created, this singing glass sound. It goes over and over and drives him insane,” Volpe explains.

Damian Volpe

Mermaids make a brief appearance in the film. To create their vocals, Volpe and his wife did a recording session in which they made strange sea creature call-and-response sounds to each other. “I took those recordings and beat them up in Pro Tools until I got what I wanted. It was quite a challenge and I had to throw everything I had at it. This was more of a hammer-and-saw job than a fancy plug-in job,” Volpe says.

He captured other recordings too, like the sound of footsteps on the stairs inside a lighthouse on Cape Cod, marine steam engines at an industrial steam museum in northern Connecticut, and seagulls and waves at Mystic Seaport. “We recorded so much. We dug a grave. We found an 80-year-old lobster pot that we smashed about. I recorded the inside of conch shells to get drones. Eighty percent of the sound in the film is material that I and Filipe Messeder (assistant and Foley editor) recorded, or that I recorded with my wife,” says Volpe.

But one of the trickiest sounds to create was a foghorn that Eggers originally liked from a lighthouse in Wales. Volpe tracked down the keeper there but the foghorn was no longer operational. He then managed to locate a functioning steam-powered diaphone foghorn in Shetland, Scotland. He contacted the lighthouse keeper Brian Hecker and arranged for a local documentarian to capture it. “The sound of the Sumburgh Lighthouse is a major element in the film. I did a fair amount of additional work on the recordings to make them sound more like the original one Rob [Eggers] liked, because the Sumburgh foghorn had a much deeper, bassier, whale-like quality.”

The final voice in The Lighthouse’s soundtrack is composer Korven’s score. Since Volpe wanted to blur the line between sound design and score, he created sounds that would complement Korven’s. Volpe says, “Mark Korven has these really great sounds that he generated with a ball on a cymbal. It created this weird, moaning whale sound. Then I created these metal creaky whale sounds and those two things sing to each other.”

In terms of the mix, nearly all the dialogue plays from the center channel, helping it stick to the characters within the small frame of this antiquated aspect ratio. The Foley, too, comes from the center and isn’t panned. “I’ve had some people ask me (bizarrely) why I decided to do the sound in mono. There might be a psychological factor at work where you’re looking at this little black & white square and somehow the sound glues itself to that square and gives you this idea that it’s vintage or that it’s been processed or is narrower than it actually is.

“As a matter of fact, this mix is the farthest thing from mono. The sound design, effects, atmospheres and music are all very wide — more so than I would do in a regular film as I tend to be a bit conservative with panning. But on this film, we really went for it. It was certainly an experimental film, and we embraced that,” says Volpe.

The idea of having the sonic equivalent of this 1930s film style persisted. Since mono wasn’t feasible, other avenues were explored. Volpe suggested recording the production dialogue onto a NAGRA to “get some of that analog goodness, but it just turned out to be one thing too many for them in the midst of all the chaos of shooting on Cape Forchu in Nova Scotia,” says Volpe. “We did try tape emulator software, but that didn’t yield interesting results. We played around with the idea of laying it off to a 24-track or shooting in optical. But in the end, those all seemed like they’d be expensive and we’d have no control whatsoever. We might not even like what we got. We were struggling to come up with a solution.”

Then a suggestion from Harbor’s Joel Scheuneman (who’s experienced in the world of music recording/producing) saved the day. He recommended the outboard Rupert Neve Designs 542 Tape Emulator.

The Mix
The film was final mixed in 5.1 surround on a Euphonix S5 console. Each channel was sent through an RND 542 module and then into the speakers. The units’ magnetic heads added saturation, grain and a bit of distortion to the tracks. “That is how we mixed the film. We had all of these imperfections in the track that we had to account for while we were mixing,” explains Fernandez.
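Software tape emulators typically approximate this kind of saturation with a soft-clipping transfer curve; a common textbook choice is tanh. The sketch below shows that general idea only; it is not a model of how the RND 542 hardware actually behaves:

```python
import numpy as np

def soft_saturate(samples, drive):
    """Tape-style soft saturation: nearly linear at low levels,
    compressing peaks as 'drive' increases. Normalizing by tanh(drive)
    keeps a full-scale input mapped to full scale."""
    return np.tanh(drive * samples) / np.tanh(drive)

x = np.array([0.0, 0.5, 1.0])   # normalized audio sample values
y = soft_saturate(x, drive=2.0)
```

The non-rideable quality Fernandez and Volpe describe follows from this shape: the amount of added coloration depends entirely on how hard the program material hits the curve, not on any moment-to-moment automation.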

“You couldn’t really ride it or automate it in any way; you had to find the setting that seemed good and then just let it rip. That meant in some places it wasn’t hitting as hard as we’d like and in other places it was hitting harder than we wanted. But it’s all part of Rob Eggers’ style of filmmaking — leaving room for discovery in the process,” adds Volpe.

“There’s a bit of chaos factor because you don’t know what you’re going to get. Rob is great about being specific but also embracing the unknown or the unexpected,” he concludes.


Jennifer Walden is a New Jersey-based audio engineer and writer. Follow her on Twitter @audiojeney.

Color Chat: Light Iron’s Corinne Bogdanowicz

Corinne Bogdanowicz, a colorist at Light Iron, joined the post house in 2010 after working as a colorist and digital compositor for Post Logic/Prime Focus, Pacific Title and DreamWorks Animation.

Bogdanowicz, who comes from a family of colorists/color scientists (sister and father), has an impressive credit list, including the features 42, Flight, Hell or High Water, Allied and Wonder. On the episodic side, she has colored all five seasons of Amazon’s Emmy-winning series Transparent, as well as many other shows, including FX’s Baskets and Boomerang for BET. Her most recent work includes Netflix’s Dolemite Is My Name and HBO’s Mrs. Fletcher.

HBO’s Mrs. Fletcher

We reached out to find out more…

NAME: Corinne Bogdanowicz

COMPANY: Light Iron

CAN YOU DESCRIBE YOUR COMPANY?
Light Iron is a post production company owned by Panavision. We have studios in New York and Los Angeles.

AS A COLORIST, WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I think that most people would be surprised that we are the last stop for all visuals on a project. We are where all of the final VFX come together, and we also manage the different color spaces for final distribution.

ARE YOU SOMETIMES ASKED TO DO MORE THAN JUST COLOR ON PROJECTS?
Yes, I am very often doing work that crosses over into visual effects. Beauty work, paint outs and VFX integration are all commonplace in the DI suite these days.

WHAT’S YOUR FAVORITE PART OF THE JOB?
The collaboration between myself and the creatives on a project is my favorite aspect of color correction. There is always a moment when we start color where I get “the look,” and everyone is excited that their vision is coming to fruition.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Maybe farming? (laughs) I’m not sure. I love being outdoors and working with animals.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I have an art background, and when I moved to Los Angeles years ago I worked in VFX. I quickly was introduced to the world of color and found it was a great fit. I love the combination of art and technology, as well as constantly being introduced to new ideas by industry creatives.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Where’d You Go, Bernadette?, Sextuplets, Truth Be Told, Transparent, Mrs. Fletcher and Dolemite Is My Name.

Transparent

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
This is a hard question because I feel like I leave a little piece of myself in everything that I work on.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My phone, the coffee maker and FilmLight Baselight.

WHAT DO YOU DO TO DE-STRESS FROM THE PRESSURES OF THE JOB?
I have two small children at home, so I think I de-stress when I get to work (laughs)!

Bonfire adds Jason Mayo as managing director/partner

Jason Mayo has joined digital production company Bonfire in New York as managing director and partner. Industry veteran Mayo will be working with Bonfire’s new leadership lineup, which includes founder/Flame artist Brendan O’Neil, CD Aron Baxter, executive producer Dave Dimeola and partner Peter Corbett. Bonfire’s offerings include VFX, design, CG, animation, color, finishing and live action.

Mayo comes to Bonfire after several years building Postal, the digital arm of the production company Humble. Prior to that he spent 14 years at Click 3X, where he worked closely with Corbett as his partner. While there he also worked with Dimeola, who cut his teeth at Click as a young designer/compositor. Dimeola later went on to create The Brigade, where he developed the network and technology that now forms the remote, cloud-based backbone referred to as the Bonfire Platform.

Mayo says a number of factors convinced him that Bonfire was the right fit for him. “This really was what I’d been looking for,” he says. “The chance to be part of a creative and innovative operation like Bonfire in an ownership role gets me excited, as it allows me to make a real difference and genuinely effect change. And when you’re working closely with a tight group of people who are focused on a single vision, it’s much easier for that vision to be fully aligned. That’s harder to do in a larger company.”

O’Neil says that having Mayo join as partner/MD is a major move for the company. “Jason’s arrival is the missing link for us at Bonfire,” he says. “While each of us has specific areas to focus on, we needed someone who could handle the day-to-day of running the company while keeping an eye on our brand and our mission and introducing our model to new opportunities. And that’s exactly his strong suit.”

For the most part, Mayo’s familiarity with his new partners means he’s arriving with a head start. Indeed, his connection to Dimeola, who built the Bonfire Platform — the company’s proprietary remote talent network, nicknamed the “secret sauce” — continued as Mayo tapped Dimeola’s network for overflow and outsourced work while at Postal. Their relationship, he says, was founded on trust.

“Dave came from the artist side, so I knew the work I’d be getting would be top quality and done right,” Mayo explains. “I never actually questioned how it was done, but now that he’s pulled back the curtain, I’m blown away by the capabilities of the Platform and how it dramatically differentiates us.

“What separates our system is that we can go to top-level people around the world but have them working on the Bonfire Platform, which gives us total control over the process,” he continues. “They work on our cloud servers with our licenses and use our cloud rendering. The Platform lets us know everything they’re doing, so it’s much easier to track costs and make sure you’re only paying for the work you actually need. More importantly, it’s a way for us to feel connected – it’s like they’re working in a suite down the hall, except they could be anywhere in the world.”

Mayo stresses that while the cloud-based Platform is a huge advantage for Bonfire, it’s just one part of its profile. “We’re not a company riding on the backs of freelancers,” he points out. “We have great, proven talent in our core team who work directly with clients. What I’ve been telling my longtime client contacts is that Bonfire represents a huge step forward in terms of the services and level of work I can offer them.”

Corbett believes he and Mayo will continue to explore new ways of working now that he’s at Bonfire. “In the 14 years Jason and I built Click 3X, we were constantly innovating across both video and digital, integrating live action, post production, VFX and digital engagements in unique ways,” he observes. “I’m greatly looking forward to continuing on that path with him here.”

Technicolor Post opens in Wales 

Technicolor has opened a new facility in Cardiff, Wales, within Wolf Studios. This expansion of the company’s post production footprint in the UK is a result of the growing demand for more high-quality content across streaming platforms and the need to post these projects, as well as the growth of production in Wales.

The facility is connected to all of Technicolor’s locations worldwide through the Technicolor Production Network, giving creatives easy access to their projects no matter where they are shooting or posting.

The facility, an extension of Technicolor’s London operations, supports all Welsh productions and features a multi-purpose, state-of-the-art suite as well as space for VFX and front-end services including dailies. Technicolor Wales is working on Bad Wolf Production’s upcoming fantasy epic His Dark Materials, providing picture and sound services for the BBC/HBO show. Technicolor London’s recent credits include The Two Popes, The Souvenir, Chernobyl, Black Mirror, Gentleman Jack and The Spanish Princess.

Within this new Cardiff facility, Technicolor is offering 2K digital cinema projection, FilmLight Baselight color grading, realtime 4K HDR remote review, 4K OLED video monitoring, 5.1/7.1 sound, ADR recording/source connect, Avid Pro Tools sound mixing, dailies processing and Pulse cloud storage.

Bad Wolf Studios in Cardiff offers 125,000 square feet of stage space with five stages. There is flexible office space, as well as auxiliary rooms and costume and props storage.

VFX house Blacksmith now offering color grading, adds Mikey Pehanich

New York-based visual effects studio Blacksmith has added colorist Mikey Pehanich to its team. With this new addition, Blacksmith expands its capabilities to now offer color grading in addition to VFX.

Pehanich has worked on projects for high-profile brands including Amazon, Samsung, Prada, Nike, New Balance, Marriott and Carhartt. Most recently, Pehanich worked on Smirnoff’s global “Infamous Since 1864” campaign directed by Rupert Sanders, Volkswagen’s Look Down in Awe spot from Garth Davis, Fisher-Price’s “Let’s Be Kids” campaign and Miller Lite’s newly launched Followers spot, both directed by Ringan Ledwidge.

Prior to joining Blacksmith, Pehanich spent six years as colorist at The Mill in Chicago. Pehanich was the first local hire when The Mill opened its Chicago studio in 2013. Initially cutting his teeth as color assistant, he quickly worked his way up to becoming a full-fledged colorist, lending his talent to campaigns that include Michelob’s 2019 Super Bowl spot featuring Zoe Kravitz and directed by Emma Westenberg, as well as music videos, including Regina Spektor’s Black and White.

In addition to commercial work, Pehanich’s diverse portfolio encompasses several feature films, short films and music videos. His recent longform work includes Shabier Kirchner’s short film Dadli about an Antiguan boy and his community, and Andre Muir’s short film 4 Corners, which tackles Chicago’s problem with gun violence.

“New York has always been a creative hub for all industries — the energy and vibe that is forever present in the air here has always been a draw for me. When the opportunity presented itself to join the incredible team over at Blacksmith, there was no way I could pass it up,” says Pehanich, who will be working on Blackmagic’s DaVinci Resolve.

Color grading Empire State Building’s immersive exhibits

As immersive and experiential projects are being mounted in more and more settings — and as display technology allows for larger and more high-resolution screens to be integrated into these installations — colorists are being called on to grade video and film content that’s meant to be viewed in vastly different settings than in the past. No longer are they grading content that will live on a 50-inch flat-screen TV or a 9-inch tablet — they’re grading for wall-sized screens that dominate museum exhibits or public spaces.

James Tillett

A recent example is when the Manhattan office of Squint/Opera, a London-based digital design studio, tapped Moving Picture Company colorist James Tillett to grade content that has taken over floor-to-ceiling screens in the new Second Floor Experience in the iconic Empire State Building. Comprising nine interactive and immersive galleries that recreate everything from the building’s construction to its encounter with its most famous visitor and unofficial mascot, King Kong, the 10,000-square-foot space is part of the building’s multimillion-dollar renovation.

Here, Tillett discusses what went into grading such a large-scale experiential project.

How did this project come about?
Alvin Cruz, one of our creative directors here in New York, has a designer colleague who put us in contact with the Squint/Opera team. We met with them and they quickly realized they’d be able to do everything on this project except the color grade. That’s where we came in.

How did this project differ from the more traditional color grading work you usually do?
You have to work in a different color space if the final product will be shown in a theater versus, say, broadcast TV or online. The same thinking goes here, but as every experiential project is different, you have to evaluate based on the design of the space and the type of screen or projection system being used, and then make an educated guess on how the footage will respond.

What were the steps you took to tackle this kind of project?
The first thing we did when we got the footage from Squint/Opera was to bring it into the suite and view it in that environment. Then my executive producer, Ed Koenig, and I jumped on the Q train and went into the space at the Empire State Building to see how the same footage looked in the various gallery settings. This helped us to get a feel for how it will ultimately be seen. I also wanted to see how those spaces differed visually from our grading suite. That informed my process going forward.

What sections of the Experience required extra consideration?
The “Construction Area” gallery, which documents the construction of the building, has very large screens. This meant paying close attention to the visual details within each of the films. For example, zooming in close to certain parts of the image and keeping an eye on noise and grain structure.

The “Site Survey” gallery gives the visitor a sense of what it would be like on the ground as the building surveyors are taking their measurements. Visitors are able to look through various replica surveying devices and see different scenes unfolding. During the grade (I use FilmLight Baselight), we had a prototype device in the suite that Squint/Opera created with a 3D printer. This allowed us to preview the grade through the same type of special mirrored screen that’s used in the actual replica surveying devices in the exhibit. In fact, we actually ended up setting the calibration of these screens as part of the grading process and then transferred those settings over to the actual units at the ESB.

In the “King Kong” gallery, even though the video content is in black and white, it was important that the image on the screens was consistent with the model of King Kong’s hand that reaches into the physical space, which has a slightly reddish tone to it. We started off just trying to make the footage feel more like a vintage black and white film print, but realized we needed to introduce some color to make it sit better in the space. This meant experimenting with different levels of red/sepia tint to the black and white and exporting different versions, with a final decision then made on-site.
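Mechanically, a red/sepia tint over black & white footage can be thought of as promoting the mono image to RGB and mixing in a version multiplied by a tint color, with the mix amount as the “level” Tillett describes. A toy NumPy sketch of that idea (the actual versions were, per the interview, exported from Baselight, and the tint values here are made up for illustration):

```python
import numpy as np

def tint_mono(gray, tint, strength):
    """Tint a single-channel image.

    gray:     float image in 0.0-1.0, shape (H, W)
    tint:     RGB triple, e.g. a reddish sepia (hypothetical values)
    strength: 0.0 = neutral black & white, 1.0 = fully tinted"""
    rgb = np.stack([gray] * 3, axis=-1)   # neutral RGB version
    tinted = rgb * np.asarray(tint)       # multiply by the tint color
    return (1.0 - strength) * rgb + strength * tinted

gray = np.array([[0.0, 0.5, 1.0]])
sepia = tint_mono(gray, tint=(1.0, 0.85, 0.7), strength=0.5)
```

Exporting several renders at different `strength` values and picking one on-site, as described above, keeps the final decision in the actual viewing environment rather than the suite.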

Were you able to replicate what the viewing conditions would be for these films while working in the color suite? And did this influence the grade?
What’s important about grading for experiential projects like this is that, while you can’t replicate the exact conditions, you still have to give the footage a grade that supports the theme or focus of the film’s content. You also have to fully understand and appreciate where it’s going to be seen and keep that top of mind throughout the entire process.


Nice Shoes Toronto adds colorist Yulia Bulashenko

Creative studio Nice Shoes has added colorist Yulia Bulashenko to its Toronto location. She brings over seven years of experience as a freelance colorist, working worldwide on projects for top global clients such as Nike, Volkswagen, MTV, Toyota, Diesel, Uniqlo, Uber, Adidas and Zara, among numerous others.

Bulashenko’s resume includes work across commercials, music videos, fashion and feature films. Notable projects include Sia and Diplo’s LSD music video for “Audio”; “Sound and Vision,” a tribute to the late David Bowie directed by Canada, for whom she has been a colorist of choice for the past five years; and the feature films The Girl From the Song and Gold.

Toronto-based Bulashenko is available immediately and can also work remotely via Nice Shoes’ New York, Boston, Chicago and Minneapolis spaces.

Bulashenko began her career as a fashion photographer before transitioning into creating fashion films. Through handling all of the post on her own film projects, she discovered a love for color grading. After building relationships with a number of collaborators, she began taking on freelance work with clients in Spain and the UK, handling a wide range of projects throughout Europe, Mexico, Qatar and India.

Managing director Justin Pandolfino notes, “We’re excited to announce Yulia as the first of a number of new signings as we enter our fourth year in the Toronto market. Bringing her onboard is part of our ongoing efforts to unite the best talent from around the world to deliver stunning design, animation, VFX, VR/AR, editorial, color grading and finishing for our clients.”

Colorist Chat: Scott Ostrowsky on Amazon’s Sneaky Pete

By Randi Altman

Scott Ostrowsky, senior colorist at Deluxe’s Level 3 in Los Angeles, has worked on all three seasons of Amazon’s Sneaky Pete, produced by Bryan Cranston and David Shore and starring Giovanni Ribisi. Season 3 is the show’s last.

For those of you unfamiliar with the series, it follows a con man named Marius (Ribisi), who takes the place of his former cellmate Pete and endears himself to Pete’s seemingly idyllic family while continuing to con his way through life. Over time he comes to love the family, which is nowhere near as innocent as it seems.

Scott Ostrowsky

We reached out to this veteran colorist to learn more about how the look of the series developed over the seasons and how he worked with the showrunners and DPs.

You’ve been on Sneaky Pete since the start. Can you describe how the look has changed over the years?
I worked on Seasons 1 through 3. The DP for Season 1 was Rene Ohashi, and it had somewhat of a softer feel. It was shot on a Sony F55. The season mostly centered on the relationship between Bryan Cranston’s character, Giovanni Ribisi’s newly adopted fake family and his brother.

Season 2 was shot by DPs Frank DeMarco and William Rexer on a Red Dragon, and it had a more stylized, harsher look in some ways. The looks were different because the storylines and the locations had changed. So, even though we had some beautiful, resplendent looks in Season 2, we also created some harsher environments, and we did that through color correction. Going into Season 2, the storyline changed and became more defined, in the sense that we used the environments to create an atmosphere that matched the storyline and the performances.

An example of this would be the warehouse where they all came together to create the scam/ heist that they were going to pull off. Another example of this would be the beautiful environment in the casino that was filled with rich lighting and ornate colors. But there are many examples of this through the show — both DPs used shadow and light to create a very emotional mood or a very stark mood and everything in between.

Season 3 was shot by Arthur Albert and his son, Nick Albert, on a Red Gemini, and it had a beautiful, resplendent, rich look that matched the different environments as the story moved from the cooler look of New York to the warmer, more colorful look of California.

So you gave different looks based on locale? 
Yes, we did. Many times, the looks would depend on the time of day and the environment the characters were in. An example of this might be the harsh fluorescent green in the gas station bathroom where Giovanni’s character is trying to figure out a way to help his brother and avoid his captors.

How did you work with the Alberts on the most recent season?
I work at Level 3 Post, which is a Deluxe company. I did Seasons 1 and 2 at the facility on the Sony lot; Season 3 was posted at Level 3. Arthur and Nick Albert came into my color suite with the camera tests shot on the Red Gemini and also the Helium. We set up a workflow based on the Red cameras and proceeded to grade the various setups.

Once Arthur and Nick decided to use the Gemini, we set up our game plan for the season. When I received my first conform, I proceeded to grade it based on our conversations. I was very sensitive to the way they used their setups, lighting and exposures. Once I finished my first primary grade, Arthur would come in and sit with me to watch the show and make any changes. After Arthur approved the grade, the producers and showrunner would come in for their viewing and could make any additional changes at that time. (Read our interview with Arthur Albert here.)

How do you prefer to work with directors/DPs?
The first thing is to have a conversation with them about their approach and how they view color as part of the story they want to tell. I always like to get a feel for how the cinematographer will shoot the show and what LUTs, if any, they’re using so I can emulate that look as a starting point for my color grading.

It is really important to me to find out how a director envisions the image he or she would like to portray on the screen. An example of this would be facial expressions. Do we want to see everything, or do they mind if the shadow side remains dark and the light falls off?

A lot of times, it’s about how the actors emote and how they work in tandem with each other to create tension, comedy or other emotions — and what the director is looking for in these scenes.

Any tips for getting the most out of a project from a color perspective?
Communication. Communication. Communication. Having an open dialogue with the cinematographer, showrunners and directors is extremely important. If the colorist is able to get the first pass very close, you spend more time on the nuances rather than balancing or trying to find a look. That is why it is so important to understand the essence of what the director, cinematographer and showrunner are looking for.

How do you prefer the DP or director to describe their desired look?
However they’re comfortable in enlightening me to their styles or needs for the show is fine. Usually, we can discuss this when we have a camera test before principal photography starts. There’s no one way that you can work with everybody — you just adapt to how they work. And as a colorist, it’s your job to make that image sing or shine the way that they intended it to.

You used Resolve on this. Is there a particular tool that came in handy for this show?
All of the tools in Resolve are useful for a drama series. You wouldn’t buy the large crayon box and throw out the colors you didn’t like, because at some point you might need them. I use all of the tools — keys, windows, log corrections and custom curves — to create the looks that are needed.

You have been working in TV for many years. How has color grading changed during that time?
Color correction has become way more sophisticated over the years, and it is continually growing and expanding into a blend of not only color grading but also helping to create the environments needed to express the look of a show. We no longer just have simple color correctors with simple secondaries; the toolbox continues to grow with added filters, added grain and sometimes even visual effects work, which most color correctors are able to handle today.

Where do you find inspiration? Art? Photography?
I’ve always loved photography and B&W movies. There’s a certain charm or subtlety that you find in B&W, whether it’s a film noir, the harshness of film grain, or just the use of shadow and light. I’ve always enjoyed going to museums and looking at different artists and how they view the world and what inspires them.

To me, it’s trying to portray an image and have that image make a statement. In daily life, you can see multiple examples as you go through your day, and I try and keep the most interesting ones that I can remember in my lexicon of images.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Review: PixelTools V.1 PowerGrade presets for Resolve

By Brady Betzel

Color correction and color grading can be tricky (especially for those of us who don’t work as dedicated colorists), and being good at one doesn’t necessarily mean you will be good at the other. After watching hundreds of hours of tutorials, the only answer to getting better at color correction and color grading is to practice. As trite and cliché as it sounds, it’s the truth. There is also the problem of creative block. I can sometimes get around a creative block when color correcting or editing by trying out-of-the-box ideas, like adding a solid color on top of footage and changing blend modes to spark some ideas.

An easier way to get a bunch of quick looks on your footage is with LUTs (look-up tables) and preset color grades. LUTs can sometimes work to get your footage into an acceptable spot color correction-wise or, technically, into the correct color space (the old technical-vs.-creative LUTs discussion). They often need to be (or should be) tweaked to fit the footage you are using.
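As a rough illustration of what a LUT actually does under the hood, here is a toy numpy sketch — my own simplification, not how Resolve or any camera implements it. A 3D LUT is a lattice of replacement colors, and each pixel’s RGB value is remapped by trilinearly interpolating between the nearest lattice entries:

```python
import numpy as np

def apply_3d_lut(image, lut):
    """Apply a 3D LUT to an RGB image via trilinear interpolation.

    image: float array (H, W, 3), values in 0-1
    lut:   float array (N, N, N, 3), values in 0-1
    A toy illustration of what a creative LUT does; real tools
    use the same idea with more care (color space, shaping, etc.).
    """
    n = lut.shape[0]
    # Map each pixel value onto the LUT lattice coordinates.
    idx = np.clip(image, 0.0, 1.0) * (n - 1)
    lo = np.floor(idx).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    frac = idx - lo

    # Blend the 8 surrounding lattice entries.
    out = np.zeros_like(image)
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                r = hi[..., 0] if dr else lo[..., 0]
                g = hi[..., 1] if dg else lo[..., 1]
                b = hi[..., 2] if db else lo[..., 2]
                w = ((frac[..., 0] if dr else 1 - frac[..., 0])
                     * (frac[..., 1] if dg else 1 - frac[..., 1])
                     * (frac[..., 2] if db else 1 - frac[..., 2]))
                out += w[..., None] * lut[r, g, b]
    return out
```

An identity lattice (where entry (i, j, k) is just (i, j, k)/(N-1)) passes the image through unchanged; a creative LUT bends that lattice toward the desired look, which is why the same preset can land very differently on Log versus Rec 709 footage.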

Dawn

This is where PixelTools’ PowerGrade presets for Blackmagic’s DaVinci Resolve come into play. PixelTools’ presets give you that instant wow of a color grade, sharpening and even grain, but with the flexibility to tweak and adjust to your own taste.

PixelTools’ PowerGrade V.1 is a set of Blackmagic DaVinci Resolve PowerGrades (essentially pre-built color grades, sometimes containing noise reduction, glows or film grain) that retails for $79.99. Once purchased, the PowerGrade presets can be downloaded immediately. If you aren’t sure about the full commitment to purchase for $79.99, you can download eight sample PowerGrade presets to play with by signing up for PixelTools’ newsletter.

While it doesn’t typically matter what version of Resolve you are using with the PixelTools PowerGrades, you will probably want to make sure you are using Resolve Studio 15 (or higher), or you may miss out on some of the noise reduction or film grain. I’m running Resolve 16 Studio.

What are PowerGrades? In Resolve, you can save and access pre-built color correction node trees across all projects in a single database. This way, if you have an amazing orange-and-teal, bleach bypass, or maybe a desaturated look with a vignette and noise reduction that you don’t want to rebuild inside every project, you can save them in the PowerGrades folder in the color correction tab. Easy! Just go into the Color Correction tab > Gallery (in the upper left corner) > click the little split window icon > right-click and choose "Add PowerGrade Album."

Golden

Installing the PixelTools presets is pretty easy, but there are a few steps you are going to want to follow if you’ve never made a PowerGrades folder before. Luckily, there is a video just for that. Once you’ve added the presets into your database, you can access over 110 grades in both Log and Rec 709 color spaces. In addition, there is a folder of "Utilities," which offers some helpful tools like Scanlines (Mild-Intense), various Vignettes, Sky Debanding, preset Noise Reductions, two- and three-way Grain Nodes and much more. Some of the color grading presets fit on one node, but some have five or six nodes, like the "2-Strip Holiday." They will sometimes be applied as a Compound Node for organization’s sake but can be decomposed to see all the goodness inside.

The best part of PixelTools, other than the great looks, is the ability to decompose the Compound Node structure and see what’s under the hood. Not only does it make you appreciate all of the painstaking work that has already been done for you, but you can study it, tweak it and learn from it. I know a lot of companies don’t like to reveal how things are done, but with PixelTools you can break down the grades. It follows my favorite motto: "A rising tide lifts all boats."

From the understated "2-Strip Holiday" look to the crunchy "Bleach Duotone 2" with the handy "Saturation Adjust" node at the end of the tree, PixelTools is a prime example of pre-built looks that can be as easy as dragging and dropping onto a clip or as intricate as adjusting each node to your liking. One of my favorite looks is a good old bleach bypass — use two layer nodes (one desaturated and one colored), mix the layers with a composite mode set to Overlay and adjust saturation to taste. The bleach bypass setup is not a tightly guarded secret, but PixelTools gets you right to the look with Bleach Duotone 2 and also adds a nice orange-and-teal treatment on top.
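That two-node recipe can be sketched in a few lines of numpy to see why it adds contrast. This is a hedged approximation of the generic bleach-bypass idea under stated assumptions (Rec. 709 luma weights, values normalized to 0-1); it is not the exact math inside PixelTools’ Bleach Duotone nodes or Resolve’s layer mixer:

```python
import numpy as np

# Rec. 709 luma weights (an assumption; other weightings exist).
REC709_LUMA = np.array([0.2126, 0.7152, 0.0722])

def overlay(base, blend):
    """Standard Overlay composite: darkens darks, brightens brights."""
    return np.where(base < 0.5,
                    2.0 * base * blend,
                    1.0 - 2.0 * (1.0 - base) * (1.0 - blend))

def bleach_bypass(image, saturation=1.0):
    """Rough bleach-bypass: overlay a desaturated (luma) copy onto
    the original, then trim saturation. image: (H, W, 3) in 0-1."""
    luma = image @ REC709_LUMA
    mixed = overlay(image, luma[..., None])
    # Saturation trim: interpolate between the result's own luma
    # (saturation=0) and the full-color result (saturation=1).
    mixed_luma = (mixed @ REC709_LUMA)[..., None]
    return mixed_luma + saturation * (mixed - mixed_luma)
```

Mid-gray passes through unchanged, while anything darker gets crushed and anything brighter gets lifted — the crunchy, drained contrast the look is known for — and the trailing saturation term plays the role of that "Saturation Adjust" node on the end of the tree.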

2-Strip Holiday

Now I know what you are thinking — "Orange and teal! Come on, what are we, Michael Bay making Transformers 30?!" Well, the answer is, obviously, yes. But to really dial the look to taste on my test footage, I brought the Saturation node at the end of the node tree down to around 13%, and it looks fantastic! The moral of the story: always dial in your looks, especially with presets. Just a little customization can quickly take a preset look to a personalized look. Plus, you won’t be the person who just throws on a preset and walks away.

Will these looks work with my footage? If you shot in a Log-ish style like S-Log, BMD Film, Red Log Film or even GoPro Flat, you can use the Log presets and dial them to taste. If you shot footage in Rec 709 with your Canon 5D Mark II, you can just use the standard looks. And if you want to create your own base grade on Log footage, just add the PixelTools PowerGrade nodes after!

Much like my favorite drag-and-drop tools from Rampant Design, PixelTools will give you a quick jump on your color grading and, if nothing else, can maybe shake loose some of that colorist creative block that creeps in. Throw on that "Fuji 1" or "Fuji 2" look, add a serial node at the beginning and crank up the red highlights; who knows, it may give you the creative jumpstart you are looking for. Know the rules to break the rules, but also break the rules to get those creative juices flowing.

Saturate-Glow-Shadows

Summing Up
In the end, PixelTools is not just a set of PowerGrades for DaVinci Resolve; it can also be a creative jumpstart. If you think your footage is mediocre, you will be surprised at what a good color grade can do. It can save your shoot. But don’t forget about rendering when you are finished. Rendering speed will still depend on your CPU and GPU setup. Using an Asus ConceptD 7 laptop with an Nvidia RTX 2080 GPU, I exported a one-minute Blackmagic Raw sequence (containing six clips) with only color correction to 10-bit DPX files in 46 seconds; with a random PixelTools PowerGrade applied to each clip, it took 40 seconds! In this case, the Nvidia RTX 2080 really aided in the fast export, but your mileage may vary.

Check out pixeltoolspost.com and make sure to at least download their sample pack. From one of five Kodak looks and two Fuji looks to Tobacco Newspaper and Old Worn VHS 2 with a hint of chromatic aberration, you are sure to find something that fits your footage.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Colorist Joanne Rourke grades Netflix horror film In the Tall Grass

Colorists are often called on to help enhance a particular mood or item for a film, show or spot. For Netflix’s In the Tall Grass — based on a story from horror writers Stephen King and Joe Hill — director Vincenzo Natali and DP Craig Wrobleski called on Deluxe Toronto’s Joanne Rourke to finesse the film’s final look using color to give the grass, which plays such a large part in the film, personality.

In fact, most of the film takes place in a dense Kansas field. It all begins when a brother and his pregnant sister hear a boy’s cries coming from a field of tall grass and go to find him. Soon they realize they can’t escape.

Joanne Rourke

“I worked with Vincenzo more than 20 years ago when I did the video mastering for his film Cube, so it was wonderful to reconnect with him and a privilege to work with Craig. The color process on this project was highly collaborative and we experimented a lot. It was decided to keep the day exteriors natural and sunny with subtle chromatic variations between. While this approach is atypical for horror flicks, it really lends itself to a more unsettling and ominous feeling when things begin to go awry,” explains Rourke.

In the Tall Grass was principally shot using the ARRI Alexa LF camera system, which helped give the footage a more immersive feeling when the characters are trapped in the grass. The grass itself comprised a mix of practical and CG grass that Rourke adjusted the color of depending on the time of day and where the story was taking place in the field. For the night scenes, she focused on giving the footage a silvery look while keeping the overall look as dark as possible with enough details visible. She was also mindful to keep the mysterious rock dark and shadowed.

Rourke completed the film’s first color pass in HDR, then used that version to create an SDR trim pass. She found the biggest challenge of working in HDR on this film to be reining in unwanted specular highlights in night scenes. To adjust for this, she would often window specific areas of the shot, an approach that leveraged the benefits of HDR without pushing the look to the extreme. She used Blackmagic Resolve 15 along with the occasional Boris FX Sapphire plugins.

“Everyone involved on this project had a keen attention to detail and was so invested in the final look of the project, which made for such a great experience,” says Rourke. “I have many favorite shots, but I love how the visual of the dead crow on the ground perfectly captures the silver feel. Craig and Vincenzo created such stunning imagery, and I was just happy to be along for the ride. Also, I had no idea that head squishing could be so gleeful and fun.”

In the Tall Grass is now streaming on Netflix.

Harbor adds talent to its London, LA studios

Harbor has added to its London- and LA-based studios. Marcus Alexander joins as VP of picture post, West Coast, and Darren Rae joins as senior colorist; Rae will be supervising all dailies in the UK.

Marcus Alexander started his film career in London almost 20 years ago as an assistant editor before joining Framestore as a VFX editor. He helped Framestore launch its digital intermediate division, producing multiple finishes on a host of tent-pole and independent titles, before joining Deluxe to set up its London DI facility. Alexander then relocated to New York to head up Deluxe New York DI. With the growth in 3D movies, he returned to the UK to supervise stereo post conversions for multiple studios before his segue into VFX supervising.

“I remember watching It Came from Outer Space at a very young age and deciding there and then to work in movies,” says Alexander. “Having always been fascinated with photography and moving images, I take great pride in thorough involvement in my capacity from either a production or creative standpoint. Joining Harbor allows me to use my skills from a post-finishing background along with my production experience in creating both 2D and 3D images to work alongside the best talent in the industry and deliver content we can be extremely proud of.”

Rae began his film career in the UK in 1995 as a sound sync operator at Mike Fraser Neg Cutters. He moved into the telecine department in 1997 as a trainee, and by 1998 he was a dailies colorist working with 16mm and 35mm film. From 2001, Rae spent three years with The Machine Room in London as a telecine operator, then joined Todd AO’s London lab in 2004 as a colorist working on 35mm and 16mm film for drama and commercials, and on 8mm projects for music videos. In 2006, Rae moved into grading dailies at Todd AO parent company Deluxe in Soho, London, moving to Company 3 London in 2007 as senior dailies colorist. In 2009, he was promoted to supervising colorist.

Prior to joining Harbor, Rae was senior colorist for Pinewood Digital, supervising multiple shows and overseeing a team of four, eventually becoming head of grading. Projects include Pokemon Detective Pikachu, Dumbo, Solo: A Star Wars Story, The Mummy, Rogue One, Doctor Strange and Star Wars Episode VII — The Force Awakens.

“My main goal is to make the director of photography feel comfortable. I can work on a big feature film from three months to a year, and the trust the DP has in you is paramount. They need to know that wherever they are shooting in the world, I’m supporting them. I like to get under the skin of the DP right from the start to get a feel for their wants and needs and to provide my own input throughout the entire creative process. You need to interpret their instructions and really understand their vision. As a company, Harbor understands and respects the filmmaker’s process and vision, so it’s the ideal new home for me,” says Rae.

Harbor has also announced that colorists Elodie Ichter and Katie Jordan are now available to work with clients on both the East and West Coasts in North America as well as the UK. Some of the team’s work includes Once Upon a Time in Hollywood, The Irishman, The Hunger Games, The Maze Runner, Maleficent, The Wolf of Wall Street, Anna, Snow White and the Huntsman and Rise of the Planet of the Apes.

Charlieuniformtango names company vets as new partners

Charlieuniformtango principal/CEO Lola Lott has named three of the full-service studio’s most veteran artists as new partners — editors Deedle LaCour and James Rayburn, and Flame artist Joey Waldrip. This is the first time in the company’s almost 25-year history that the partnership has expanded. All three will continue with their current jobs but have received the expanded titles of senior editor/partner and senior Flame artist/partner, respectively. Lott, who retains majority ownership of Charlieuniformtango, will remain principal/CEO, and Jack Waldrip will remain senior editor/co-owner.

“Deedle, Joey and James came to me and Jack with a solid business plan about buying into the company with their futures in mind,” explains Lott. “All have been with Charlieuniformtango almost from the beginning: Deedle for 20 years, Joey for 19 years and James for 18. Jack and I were very impressed and touched that they were interested and willing to come to us with funding and plans for continuing and growing their futures with us.”

So why now, after all these years? “Now is the right time because, while Jack and I still have a passion for this business, we also have employees/talent — who have been with us for over 18 years — with a passion to be partners in this company,” says Lott. “While still young, they have invested in and built their careers within the Tango culture and have the client bonds, maturity and understanding of the business to be able to take Tango to a greater level for the next 20 years. That was mine and Jack’s dream, and they came to us at the perfect time.”

Charlieuniformtango is a full-service creative studio that produces, directs, shoots, edits, mixes, animates and provides motion graphics, color grading, visual effects and finishing for commercials, short films, full-length feature films, documentaries, music videos and digital content.

Main Image: (L-R) Joey Waldrip, James Rayburn, Jack Waldrip, Lola Lott and Deedle LaCour

Colorist Chat: Lucky Post’s Neil Anderson

After joining Lucky Post in Dallas in 2013, right out of film school, Neil Anderson was officially promoted to colorist in 2017. He has worked on a variety of projects during his time at the studio, including work for Canada Dry, Costa, TGI Fridays, The Salvation Army and YETI. He also contributed to Augustine Frizzell’s feature comedy Never Goin’ Back, which premiered at Sundance and was distributed by A24.

YETI

We checked in with Anderson to find out how he works, some favorite projects and what inspires him.

What do you enjoy most about your work?
That’s a really hard question because there are a lot of things I really enjoy about color grading. If I had to choose, I think it comes back to the fact that it’s rewarding to both left- and right-brained people. It truly is both an art and a science.

The satisfaction I get when I first watch a newly graded spot is also very special. A cohesive and mindful color grade absolutely transforms the piece into something greater, and it’s a great feeling to be able to make such a powerful impact.

What’s the most misunderstood aspect of color artistry?
I’m not sure many people stop and think about how amazing it is that we can fine-tune our engineering to something as wild as our eyesight. Our vision is fluid and organic, constantly changing under different constraints and environments, filled with optical illusions and imperfect guesses. There are immensely strange phenomena that drastically change our perception of what we see. Yet we need to make camera systems and displays work with this deeply non-uniform perception. It’s an absolutely massive area of study that we take for granted; I’m thankful for those color scientists out there.

Where do you find your creative inspiration?
I definitely like to glean new ideas and ways of approaching new projects from seeing other great colorists’ work. Sometimes certain commercials come on TV that catch my eye, and I’ll excitedly say to my partner Odelie, “That is damn good color!” Depending on the situation, I might get an eye-roll or two from her.

Tell us about some recent projects, and what made them stand out to you creatively?
Baylor Scott & White Health: I just loved how moody we took these in the end. They are very inspiring stories that we wanted to make feel even more impactful. I think the contrast and color really turned out beautiful.

Is This All There Is?

Is This All There Is? by Welcome Center: This is a recent music video that we filmed in a stunningly dilapidated house. The grit and grain we added in color really bring out the “worst” of it.

Hurdle: This was a documentary feature I worked on that I really enjoyed. The film was shot over a six-month window in the West Bank, so wrangling it while also giving it a distinct look was both difficult and fun.

Light From Light: This is also a feature film, which I finished a few months ago. I really enjoyed the process of developing the look with its wonderful DP, Greta Zozula. We specifically wanted to capture the feeling of paintings by Andrew Wyeth, Thomas Eakins and Johannes Vermeer.

Current bingeable episodics and must see films?
Exhibit A, Mindhunter, Midsommar and The Cold Blue.

When you are not at Lucky Post, where do you like to spend time?
I’m an avid moviegoer so definitely a lot of my time (and money) is spent at the theater. I’m also a huge sports fan; you’ll find me anywhere that carries my team’s games! (Go Pack Go)

Favorite podcast?
The Daily (“The New York Times”)

Current Book?
“Parting the Waters: America in the King Years 1954-1963”

Dumbest thing you laughed at today?
https://bit.ly/2MYs0V1

Song you can’t stop listening to?
John Frusciante — 909 Day