Author Archives: Randi Altman

Words of wisdom from editor Jesse Averna, ACE

We are all living in a world we’ve never had to navigate before. People’s jobs are in flux, others are working from home, and anxiety is a regular part of our lives. Through all the chaos, Jesse Averna has been a calming voice on social media, so postPerspective reached out to ask him to address our readership directly.

Jesse, who was co-founder of the popular Twitter chat and Facebook group @PostChat, currently works at Walt Disney Animation Studios and is a member of the American Cinema Editors.


Hey,

How are you doing? This isn’t an ad. I’m not going to sell you anything or try to convince you of anything. I just want to take the opportunity to check in. Like many of you, I’m a post professional (an editor), currently working from home. If we don’t look out for each other, who will? Please know that it’s okay to not be okay right now. I have to be honest, I’m exhausted. I’m just endlessly reading news and searching for new news and reading posts about news I’ve already read and searching again for news I may have missed…

I want to remind you of a couple things that I think may bring some peace, if you let me. I fear it’s about to get much darker and much scarier, so we need to anchor ourselves to some hope.

You are valuable. The world is literally different because you are here. You have intrinsic value and that will never change. No matter what. You are thought about and loved, despite whatever the voice in your head says. I’m sure your first reaction to reading that is to blow it off, but try and own it. Even for just a moment. It’s true.

You don’t deserve what’s going on, but let it bring some peace that the whole world is going through it together. You may be isolated, but not alone. We are forced to look out for one another by looking out for ourselves. It’s interesting, I feel so separate and vulnerable, but the truth is that the whole planet is feeling and reacting to this as one. We are in sync, whether we know it or not — and that’s encouraging to me. We ALL want to be well and be safe, and we want our neighbors to also be well. We have a rare moment of feeling like a people, like a planet.

If you are feeling anxious, do me a favor tonight. Go outside and look at the stars. Set a timer for five minutes. No entertainment or phone or anything else. Just five minutes. Reset. Feel yourself on a cosmic scale. Small. A blink of an eye. But so, so valuable.

And please give yourself a break. A sanity check. If you need help, please reach out. If you need to nest, do it. If you need to tune out, do it. Take care of yourself. This is an unprecedented moment. It’s okay to not be okay. Once you can, though, see who you can help. This complete shift of reality has made me think about legacy. This is a unique legacy-building moment. That student who reached out to you on LinkedIn asking for advice? You now have time to reply. That non-profit you thought about volunteering your talents to? Now’s your chance. Even just to make the connection. Who can you help? Check in on? You don’t need any excuse in our current state to reach out.

I know I’m just some rando you’re reading on the internet, but I believe you are going to make it through this. You are wonderful. Do everything you can to be safe. The world needs you. It’s a better place because you are here. You know things, have ideas to share, and will make things that none of the rest of us do or have.

Hang in there, my friends, and let me know if you have any thoughts, encouragements or tips for staying sane during this time. I’ll try to compile them into another article to share.

Jesse
@dr0id


Jesse Averna — pictured on his way to donate masks — is a five-time Emmy-winning ACE editor living in LA and currently working in the animation feature world.

Oscar-winning engineer Jim Houston has passed away

The Hollywood Section of the Society of Motion Picture and Television Engineers (SMPTE) announced the passing of its long-time member and SMPTE Fellow Jim Houston. A 34-year veteran of the entertainment industry, Houston held senior engineering positions with several studios and prominent post facilities, including Sony Pictures Entertainment, Pacific Title & Art, Walt Disney Feature Animation and, since February of this year, Samsung Research America. A pioneer in motion imaging standards, computer animation and digital restoration, he won two Academy Awards for Scientific and Engineering Achievement. He died from a heart attack on March 26 in Pasadena. He was 61.

“Jim made a profound impact on SMPTE and the industry in general,” said SMPTE Hollywood Section chair Brian Gaffney. “He was a founding member of the Academy Color Encoding System committee. He wrote influential papers on topics ranging from the color fidelity of high dynamic range images to design considerations for cinemas using laser projection. He attended every industry technical and social event and was a constant presence in the community. He will be missed, and his legacy will last forever in Hollywood.”

Houston was born in Philadelphia and graduated from Cornell University. He began his career with Gould Computer Systems and worked at NASA’s Ames Research Center before getting his start in Hollywood as a technical director with Walt Disney Feature Animation in 1986. In 1992, he won an Academy of Motion Picture Arts and Sciences (AMPAS) Scientific and Engineering Award as part of the team that developed the CAPS production system for film animation. His second such honor came in 2007 for his contributions to the Rosetta process used in digital restoration. In 2014, he was awarded SMPTE’s Technicolor/Herbert T. Kalmus Award for “leadership and contributions in the application of digital technologies to motion picture production processes.” He served as co-chair of AMPAS’ ACES Project Committee and was a member of its Science and Technology Council.

He is survived by his mother, Margaret Houston, and siblings John, Michael, Martin, Kevin and Cathy and their families. Funeral services will be held in Philadelphia. A memorial service will be scheduled for later this year in Los Angeles.

Updated: Product makers offer support to cope with COVID-19 disruption

This is a weird time for our industry and the world. The best we can do is try to keep working and stay safe. For our part, postPerspective will continue to report industry news and tell stories about workflows, artists and tools, in addition to running pieces about how pros are working remotely… and keeping sane.

In fact, if you have a story about how you are working remotely and keeping on keeping on, please share it with us (info@postPerspective.com). Even though we can’t see each other face to face right now, keeping a sense of community has never been more important.

A number of companies are releasing updates, offering discounts, and even making their remote services free for a limited time in order to help everyone keep working through this pandemic. Here is a bit of news from some of those companies, and we will add more companies to this list as the news comes in, so watch this space.

mLogic
mLogic is offering a 15% discount on its mTape Thunderbolt 3 LTO-7 and LTO-8 solutions. The discount applies to orders placed on the mTape website through April 20th. Use discount code mLogicpostPerspective15%.

Xytech
Xytech has launched “Xytech After Dark,” a podcast focusing on trends in the media and broadcasting industries. The first two episodes are now available on iTunes, Spotify and all podcasting platforms.

Xytech’s Greg Dolan says the podcast “is not a forum to sell, but instead to talk about why we create the functionality in MediaPulse and the types of things happening in our industry.”

Hosted by Xytech’s Gregg Sandheinrich, the podcast will feature Xytech staff, along with special guests. The first two episodes cover topics including the recent HPA Tech Retreat (featuring HPA president Seth Hallen), as well as the cancellation of the NAB Show, the value of trade shows and the effects of COVID-19 on the industry.

Nvidia
Nvidia is expanding its free virtual GPU software evaluation to 500 licenses for 90 days to help companies support their remote workers with their existing GPU infrastructure. Nvidia vGPU software licenses — including Quadro Virtual Workstation — enable GPU-accelerated virtualization so that content creators, designers, engineers and others can continue their work. More details are available here. Nvidia has also posted a separate blog on virtual GPUs to help admins who are working to support remote employees.

Object Matrix 
Object Matrix is offering video tips for surviving working from home. The videos, hosted by co-founder Nicholas Pearce, are here.

Adobe
Adobe shared a guide to best practices for working from home. It’s meant to support creators and filmmakers who might be shifting to remote work and need to stay connected with their teams and continue to complete projects. You can find the guide here.

Adobe’s principal Creative Cloud evangelist, Jason Levine, hosted a live stream, Video Workflows With Team Projects, that focuses on remote workflows.

Additionally, Karl Soule, senior technical business development manager, hosted a stream focusing on remote video workflows and collaboration in the enterprise. If you sign up on this page, you can see his presentation.

Streambox
Streambox has introduced a pay-as-you-go software plan for video professionals who use its Chroma 4K, Chroma UHD, Chroma HD and Chroma X streaming encoder/decoder hardware. Since the software has been “decoupled” from the hardware platform, those who own the hardware can rent the software on a monthly basis, pause the subscription between projects and reinstate it as needed. By renting software for a fixed period, creatives can take on jobs without having to pay outright for technology that might have been impractical.

And last week’s offerings as well

Frame.io 
Through the end of March, Frame.io is offering 2TB of free extra storage capacity for 90 days. Those who could use that additional storage to accommodate work-from-home workflows should email rapid-response@frame.io to get it set up.

Frame.io is also offering free Frame.io Enterprise plans for the next 90 days to support educational institutions, nonprofits and health care organizations that have been impacted. Please email rapid-response@frame.io to set up this account.

To help guide companies through this new reality of remote working, Frame.io is launching a new “Workflow From Home” series on YouTube, hosted by Michael Cioni, with the first episode launching Monday, March 23rd. Cioni will walk through everything artists need to keep post production humming as smoothly as possible. Subscribe to the Frame.io YouTube channel to get notified when it’s released.

EditShare
EditShare has made its web-based, remote production and collaboration tool, Flow Media Management, free through July 1st. Flow enables individuals as well as large creative workgroups to collaborate on story development, with capabilities to perform extensive review and approval from anywhere in the world. Those interested can complete this form and one of EditShare’s Flow experts will follow up.

Veritone 
Veritone will extend free access to its core applications — Veritone Essentials, Attribute and Digital Media Hub — for 60 days. Targeted to media and entertainment clients in radio, TV, film, sports and podcasting, Veritone Essentials, Attribute, and Digital Media Hub are designed to make data and content sharing easy, efficient and universal. The solutions give any workforce (whether in the office or remote) tools that accelerate workflows and facilitate collaboration. The solutions are fully cloud-based, which means that staff can access them from any home office in the world as long as there is internet access.

More information about the free access is here. Certain limitations apply. Offer is subject to change without notice.

SNS
In an effort to quickly help EVO users who are suddenly required to work on editing projects from home, SNS has released Nomad for on-the-go, work-from-anywhere, remote workflows. It is a simple utility that runs on any Mac or Windows system that’s connected to EVO.

Nomad helps users repurpose their existing ShareBrowser preview files into proxy files for offline editing. These proxy files are much smaller versions of the source media files, and therefore easier to use for remote work. They take up less space on the computer, take less time to copy and are easier to manage. Users can edit with these proxy files, and after they’re finished putting the final touches on the production, their NLE can export a master file using the full-quality, high-resolution source files.

Nomad is available immediately and free to all EVO customers.

Ftrack
Remote creative collaboration tool ftrack Review is free for all until May 31. This date might extend as the global situation continues to unfold. ftrack Review is an out-of-the-box remote review and approval tool that enables creative teams to collaborate on, review and approve media via their desktop or mobile browser. Contextual comments and annotations eliminate confusion and reduce reliance on email threads. ftrack Review accepts many media formats as well as PDFs. Every ftrack Review workspace receives 250 GB of storage.

DejaSoft
DejaSoft is offering editors 50% off all their DejaEdit licenses through the end of April. In addition, the company will help users implement DejaEdit in the best way possible to suit their workflow.

DejaEdit allows editors to share media files and timelines automatically and securely with remote co-workers around the world, without having to be online continuously. It helps editors working on Avid Nexis, Media Composer and EditShare workflows across studios, production companies and post facilities ensure that media files, bins and timelines are kept up to date across multiple remote edit stations.

Cinedeck 
Cinedeck’s cineXtools allows users to edit and correct their file deliveries from home. From now until April 3rd, pros can get a one-month license of cineXtools free of charge.

Main Image: Courtesy of Adobe


Finishing artist Tim Nagle discusses work on indie film Miss Juneteenth

Lucky Post Flame artist Tim Nagle has a long list of projects under his belt, including collaborations with David Lowery — providing Flame work on the short film Pioneer as well as finishing and VFX work on Lowery’s motion picture A Ghost Story. He is equally at home working on spots, such as campaigns for AT&T, Hershey’s, The Home Depot, Jeep, McDonald’s and Ram.

Nagle began his formal career on the audio side of the business, working as an engineer for Solid State Logic, where he collaborated with clients including Fox, Warner Bros., Skywalker, EA Games and ABC.

Tim Nagle

We reached out to Nagle about his and Lucky Post’s work on the feature film Miss Juneteenth, which premiered at Sundance and was recently honored by SXSW 2020 as the winner of the Louis Black Lone Star award.

Miss Juneteenth was directed (and written) by Channing Godfrey Peoples — her first feature-length film. It focuses on a woman from the south — a bona fide beauty queen once crowned Miss Juneteenth, a title commemorating the day slavery was abolished in Texas. The film follows her journey as she tries to hold onto her elegance while striving to survive. She looks for ways to thrive despite her own shortcomings as she marches, step by step, toward self-realization.

How did the film come to you?
We have an ongoing relationship with Sailor Bear, the film’s producing team of David Lowery, Toby Halbrooks and James Johnston. We’ve collaborated with them on multiple projects, including The Old Man & The Gun, directed by Lowery.

What were you tasked to do?
We were asked to provide dailies transcoding, additional editorial, VFX, color and finishing and ultimately delivery to distribution.

How often did you talk to director Channing Godfrey Peoples?
Channing was in the studio, working side by side with our creatives, including colorist Neil Anderson and me, to get the project completed for the Sundance deadline. It was a massive team effort, and we felt privileged to help Channing with her debut feature.

Without spoilers, what most inspires you about the film?
There’s so much to appreciate in the film — it’s a love letter to Texas, for one. It’s directed by a woman, has a single mother at its center and is a celebration of black culture. The LA Times called it one of the best films to come out of Sundance 2020.

Once you knew the film was premiering at Sundance, what was left to complete and in what amount of time?
This was by far the tightest turnaround we have ever experienced. Everything came down to the wire, sound being the last element. It’s one of the advantages of having a variety of talent and services under one roof — the creative collaboration was immediate, intense and really made possible by our shorthand and proximity.

How important do you think it is for post houses to be diversified in terms of the work they do?
I think diversification is important not only for business purposes but also to keep the artists creatively inspired. Lucky Post’s ongoing commitment to support independent film, both financially and creatively, is an integrated part of our business along with brand-supported work and advertising. Increasingly, as you see greater crossover of these worlds, it just seems like a natural evolution for the business to have fewer silos.

What does it mean to you as a company to have work at Sundance? What kinds of impact do you see — business, morale and otherwise?
Having a project that we put our hands on accepted into Sundance was such an honor. It is unclear what the immediate and direct business impacts might be, but for morale, this is often where the immediate value is clear. The excitement and inspiration we all get from projects like this just naturally makes how we do business better.

What software and hardware did you use?
On this project we started with Assimilate Scratch for dailies creation. Editorial was done in Adobe Premiere. Color was Blackmagic DaVinci Resolve, and finishing was done in Autodesk Flame.

What is a piece of advice that you’d give to filmmakers when considering the post phase of their films?
We love being involved as early as possible — certainly not to get in anyone’s way, but to be in the background supporting the director’s creative vision. I’d say get with a post company that can assist in setting looks and establishing a workflow. With a little bit of foresight, this will create the efficiency you need to deliver in what always ends up being a tight deadline with the utmost quality.

Netflix’s Mindhunter: Skywalker’s audio adds to David Fincher’s vision

By Patrick Birk

Scott Lewis

I was late in discovering David Fincher’s gripping series on serial killers, Mindhunter. But last summer, I noticed the Netflix original lurking in my suggested titles and decided to give it a whirl. I burned through both seasons within a week. The show is both thrilling and chilling, but most of those moments are not achieved through blazing guns, jump scares and pyrotechnics. Instead, it focuses on the inner lives of multiple murderers and the FBI agents whose job it is to understand them through subtle but detail-rich conversation.

Sound plays a crucial role in setting the tone of the series and heightening tension through each narrative arc. I recently spoke to re-recording mixers Scott Lewis and Stephen Urata as well as supervising sound editor Jeremy Molod — all from Skywalker Sound — about their process of creating a haunting and detail-laden soundtrack. Let’s start with Lewis and Urata and then work our way to Molod.

How is working with David Fincher? Does he have any directorial preferences when it comes to sound? I know he’s been big on loud backgrounds in crowded spaces since The Social Network.
Scott Lewis: David is extremely detail-oriented and knowledgeable about sound. So he would give us very in-depth notes about the mix… down to the decibel.

Stephen Urata: That level of attention to detail is one of the more challenging parts of working on a show like Mindhunter.

Working with a director who is so involved in the audio, does that limit your freedom at all?
Lewis: No. It doesn’t curtail your freedom, because when a director has a really clear vision, it’s more about crafting the track to be what he’s looking for. Ultimately, it’s the director’s show, and he has a way of bringing the best work out of people. I’m sure you heard about how he does hundreds of takes with actors to get many options. He takes a similar approach with sound in that we might give him multiple options for a certain scene or give him many different flavors of something to choose from. And he’ll push us to deliver the goods. For example, you might deliver a technically perfect mix, but he’ll dig in until it’s exactly what he wants it to be.

Stephen Urata

Urata: Exactly. It’s not that he’s curtailing or handcuffing us from doing something creative. This project has been one of my favorites because it was just the editorial team and sound design, and then it would come to the mix stage. That’s where it would be just Scott and me in a mix room, just the two of us, and we’d get a shot at our own aesthetic and our own choices. It was really a lot of fun trying to nail down what our favorite version of the mix would be, and David really gave us that opportunity. If he wanted something else, he would have just said, “I want it like this and only do it like this.”

But at the same time, we would do something maybe completely different than he was expecting, and if he liked it, he would say, “I wasn’t thinking that, but if you’re going to go that direction, try this also.” So he wasn’t handcuffing us, he was pushing us.

Do you have an example of something that you guys brought to the table that Fincher wasn’t expecting and asked you to go with it?
Urata: The first thing we did was the train scene. It was the scene in an empty parking garage and there is the sound of an incoming train from two miles away. That was actually the first thing that we did. It was the middle of Episode 2 or something, and that’s where we started.

Where they’re talking to the BTK survivor, Kevin?
Lewis: Exactly.

Urata: He’s fidgeting and really uncomfortable telling his story, and David wanted to see if that scene would work at all, because it really relied heavily on sound. So we got our shot at it. He said, “This is the kind of the direction I want you guys to go in.” Scott and I played off of each other for a good amount of time that first day, trying to figure out what the best version would be and we presented it to him. I don’t remember him having that many notes on that first one, which is rare.

It really paid off. Among the mixes you showed Fincher, did you notice a trend in terms of his preferences?
Lewis: When I say we gave him options, it might be down to something like the Son of Sam scene. Throughout that scene, we used a slight pitch shift to slowly lower his voice over the length of the scene, so that by the time he reveals that he actually isn’t crazy and is playing everybody, his voice drops a register. So when we present him options, it’s things like how much we’re pitching him down over time. It’s a constant review process.

The show takes place in the mid-’70s and early ’80s. Were there any period-specific sounds or mixing tricks you used when it came to diegetic music and things like that?
Lewis: Oh yeah. Ren Klyce is the supervising sound designer on the show, and he’s fantastic. He’s the sound designer on all of David’s films. He is really good about making sure that we stay to the period. So with regard to mixing, panning is something that he’s really focused on because it’s the ‘70s. He’d tell us not to go nuts on the panning, the surrounds, that kind of thing; just keep it kind of down the middle. Also, futzes are a big thing in that show; music futzes, phone futzes … we did a ton of work on making sure that everything was period-specific and sounded right.

Are you using things like impulse responses and Altiverb or worldizing?
Lewis: I used a lot of Speakerphone by Audio Ease as well as EQ and reverb.

What mixing choices did you make to immerse the viewer in Holden’s reality, i.e. the PTSD he experiences?
Lewis: When he’s experiencing anxiety, it’s really important to make sure that we’re telling the story that we’re setting out to tell. Through mixing, you can focus the viewers’ attention on what you want them to track. That could be dialogue in the background of a scene, like the end of Episode 1, when he’s having a panic attack and, in the distance, his boss and Tench are talking. It was very important that you could make out the dialogue there, even though you’re focusing on Holden having a panic attack. So in moments like that, it’s making sure that the viewer is feeling that claustrophobia but also picking up on the story point that we want you to follow.

Lewis: Also, Stephen did something really great there — there are sprinklers in the background and you don’t even notice, but the tension is building through them.

There’s a very intense moment when Holden’s trying to figure out who let their boss know about a missing segment of tape in an interview, and he accuses Greg, who leans back in his chair, and there’s a squeal in there that kind of ramps up the tension.
Urata: David’s really, really homed in on Foley in general — chair squeaks, the type of shoes somebody’s wearing, the squeak of the old wooden floor under their feet. All those things have to play with David. Like when Wendy’s creeping over to the stairwell to listen to her girlfriend and her ex-husband talking. David said, “I want to hear the wooden floor squeaking while she’s sneaking over.”

It’s not just the music crescendoing and making you feel really nervous or scared. It’s also the Foley work happening in the scene: “I want to hear more of that or less of that.” Or more backgrounds just to add to the sound pressure and build to the climax of the scene. David uses all those tools to accomplish the storytelling in the scene with sound.

How much ambience do you have built into the raw Foley tracks that you get, and how much is reverb added after the fact? Things like car door slams have so much body to them.
Urata: Some of those, like door slams, were recorded by Ren Klyce. Instead of just recording a door slam with a mic right next to the door and then adding reverb later on, he actually goes into a huge mansion and slams a huge door from 40 feet away and records that to make it sound really realistic. Sometimes we add it ourselves. I think the most challenging part about all of that is marrying and making all the sounds work together for the specific aesthetic of the soundtrack.

Do you have a go-to digital solution for that? Is it always something different or do you find yourself going to the same place?
Urata: It definitely varies. There’s a classic reverb, a digital version of it: the Lexicon 480. We use that a good amount. It has a really great natural film sound that people are familiar with and it sounds natural. There are other ones but it’s really just another tool to make it. If it doesn’t work, we just have to use something else.

Were there any super memorable ADR moments?
Lewis: I can just tell you that there’s a lot of ADR. Some whole scenes are ADR. After any Fincher show where I’ve mixed both the dialogue and the ADR, I’m 10 times better than I was before I started. Because David’s so focused on storytelling, if there’s a subtle inflection that he’s looking for that he didn’t get on set, he will loop the line to make sure that he gets that nuance.

Did you coordinate with the composer? How do you like to mix the score so that it has a really complementary relationship to the rest of the elements?
Lewis: As re-recording mixers, they don’t involve us in the composition part of it; it just comes to us after they’ve spotted the score.

Jason Hill was the composer, and his score is great. It’s so spooky and eerie, and it complements the sound design and sound effects layers really well, so a lot of it will just sit in there. The score isn’t traditional. He’s not working with big strings and horns all over the place; he’s got a lot of synths and guitars, and he uses a lot of analog gear as well. So when it comes to the mix, you sometimes get anomalies you don’t commonly get, whether it’s hiss or other elements he’s adding for an analog sound.

Lewis: And a lot of times we would keep that in because it’s part of his score.

Now let’s jump in with sound editor Jeremy Molod

As a sound editor, what was it like working with David Fincher?
Jeremy Molod: David and I have done about seven or eight films together, so by the time we started on Season Two of Mindhunter, we pretty much knew each other’s styles. I’m a huge fan of David’s movies. It’s a privilege to work with him because he’s such a good director, and the stuff he creates is so entertaining and beautifully done. I really admire his organization and how detailed he is. He really gets in there and gives us detail that no other director has ever given us.

Jeremy Molod

You worked with him on The Social Network. In college, my sound professors would always cite the famous bar scene, where Mark Zuckerberg and his girlfriend had to shout at each other over the backgrounds.
Molod: I remember that moment well. When we were mixing that scene, because the music was so loud and so pulsating, David said, “I don’t want this to sound like we’re watching a movie about a club; I want this to be like we’re in the club watching this.” To make it realistic, when you’re in the club, you’re straining to hear sounds and people’s voices. He said that’s what it should be like. Our mixer, David Parker, kept pushing the music up louder and louder, so you can barely make out those words.

I feel like I’m seeing iterations of that in Mindhunter as well.
Molod: Absolutely. That makes it more stressful and like you said, gives it a lot more tension.

Scott said that David’s down to the decibel in terms of how he likes his sound mixed. I’m assuming he’s that specific when it comes to the editorial as well?
Molod: That is correct. It’s actually even finer than that, down to the quarter decibel. He literally does that all the time. He gets really, really in there.

He does the same thing with editorial, and what I love about his process is he doesn’t just say, “I want this character to sound old and scared,” he gives real detail. He’ll say, “This guy’s very scared and he’s dirty and his shoelaces are untied and he’s got a snot rag hanging out of his pocket. And you can hear the lint and the Swiss army knife with the toothpick part missing.” He gets into painting a picture; he wants us to literally translate that picture into sound.

So he wanted to make Kevin sound really nervous in the truck scene. Kevin’s in the back and you don’t really see him too much. He’s blurred out. David really wanted to sell his fear by using sound, so we had him tapping his leg nervously, scratching the side of the car, kind of slapping his leg and obviously breathing really heavily and sniffing a lot, and it was those sounds that really helped sell that scene.

So while he does have the acumen and vocabulary within sound to talk to you on a technical level, he’ll give you direction in a similar way to how he would an actor.
Molod: Absolutely, and that’s always how I’ve looked at it. When he’s giving us direction, it’s actually the same way as he’s giving an actor direction to be a character. He’s giving the sound team direction to help those characters and help paint those characters and the scenes.

With that in mind, what was the dialogue editing process like? I’ve heard that his attention to detail really comes into play with inflection of lines. Were you organizing and pre-syncing the alternate takes as closely as you could with the picture selection?
Molod: We did that all the time. The inflection and the intonation and the cadence of the characters’ voices are really important to him, and he’s really good about figuring out which words of which takes he can stitch together. So there might be two sentences that one actor says at one time, and those sentences are actually made up of five different takes. And he does so many takes that we have a wealth of material to choose from.

We’d probably send about five or six versions to David to listen to, and then he would make his notes. That would happen almost every day, and we would start honing in on the performances he liked. Eventually he might say, “I don’t like any of them. You’ve got to loop this guy on the ADR stage.” He likes us to stitch the best little parts together like a puzzle.

What is the ADR stage like at Skywalker?
Molod: We actually did all of our ADR at Disney Studios in LA because David was down there, as were the actors. We did a fair amount of ADR in Mindhunter; there’s lots of it in there.

We usually have three or four microphones running during an ADR session, one of which will be a radio mic. The other three would be booms set in different locations, the same microphones that they use in production. We also throw in an extra [Sennheiser MKH 50] just to have another track of sound to choose from.

The process went great. We’d come back and give him about five or six choices, then he would start making notes, and we’d pin it down to the way he liked it. So by the time we got to the mix stage, the decision was done.

There was a scene where people are walking around talking after a murder had been committed, and what David really wanted was for them to be talking softly about this murder. So we had to go in and loop that whole scene again with the actors performing at a quieter, sustained volume. We couldn’t just turn it down; they had to perform it as if they were not quite whispering but trying to speak a little lower so no one could hear.

To what extent did loop groups play a part in the soundtrack? With the prominence of backgrounds in the show it seems like customization would be helpful, to have time-specific little bits of dialogue that might pop out.
Molod: We’ve used a group called the Loop Squad for all the features, the House of Cards episodes and both seasons of Mindhunter. We would send a list of all of our cues, get on the phone and explain the reasoning and the storylines. All their actors would, on their own, go and research everything that was happening at the time, so if they were just standing by a movie theater, they had something relevant to talk about.

When it came to production sound on the show, which track did you normally find yourself working from?
Molod: In most scenes, they would have a couple of radio mics attached to the actors and several booms; normally, maybe eight different microphones were set up. You would have one general boom over the whole scene, plus a boom close to each character.

We almost always went with one of the booms, unless we were having trouble making out what they were saying; then it depended on which actor was standing closest to the boom. One of the tricks our editors used to make it sound better was to phase-align the two. If the boom wasn’t quite working on its own, and neither was the radio mic, we would make the two play together, so you could hear the dialogue clearly but still get the space of the room.
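The phase trick Molod describes, getting a boom and a radio mic to play together, depends on time-aligning the two signals before summing them; mixed unaligned, the delay between the mics causes comb filtering. Below is a minimal sketch of the underlying idea, illustrative only and not the editors’ actual tooling; the gain values are arbitrary:

```python
import numpy as np

def align_and_blend(boom: np.ndarray, lav: np.ndarray,
                    boom_gain: float = 0.6, lav_gain: float = 0.4) -> np.ndarray:
    """Time-align the lav (radio mic) to the boom, then sum the two.

    Both inputs are mono float arrays at the same sample rate. The lav
    usually leads the boom because it sits closer to the mouth, so we
    delay it to match before mixing.
    """
    n = min(len(boom), len(lav))
    boom, lav = boom[:n], lav[:n]
    # The peak of the full cross-correlation gives the lag of the lav
    # relative to the boom.
    corr = np.correlate(boom, lav, mode="full")
    lag = int(np.argmax(corr)) - (n - 1)
    aligned = np.roll(lav, lag)  # shift the lav by the detected lag
    return boom_gain * boom + lav_gain * aligned
```

In practice a dialogue editor does this by ear or with an alignment plugin, and the blend ratio changes shot by shot; the sketch just shows why the two tracks can reinforce rather than fight each other once they line up.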

Were there any moments that you remember from the production tracks for effects?
Molod: Whenever we could use production effects, we always tried to get those in, because they always sound the most realistic and most pertinent to that scene and that location. If we can maintain any footsteps in the production, we always do because those always sound great.

Any kind of subtle things like creaks, bed creaks, the floor creaking, we always try to salvage those and those help a lot too. Fincher is very, very, very into Foley. We have Foley covering the whole thing, end to end. He gives us notes on everybody’s footsteps and we do tests of each character with different types of shoes on and different strides of walking, and we send it to him.

So much of the show’s drama plays out in characters’ internal worlds. In a lot of the prison interview scenes, I notice door slams here and there that I think serve to heighten the tension. Did you develop a kind of a logical language when it came to that, or did you find it was more intuitive?
Molod: We did have our own language for it, and that was based on Fincher’s direction. When it was really crazy, he wanted to hear the door slams and buzzers and keys jingling and tons of prisoners yelling off-screen. We spent days recording loop-group prisoners, and they would be sprinkled throughout the scene. And when the conversation turned to upsetting subject matter, we might ramp up the voices in the back.


Pat Birk is a musician, sound engineer and post pro at Silver Sound, a boutique sound house based in New York City.

COVID-19: NAB talks plans, more companies offer support, info about remote work

Last Friday, NAB’s president/CEO, Gordon Smith, issued a statement saying that rather than rescheduling the NAB Show, NAB would be unveiling a new digital offering called NAB Show Express and enhancing NAB Show New York later this year. Here is part of what he said.

“First, we are exploring a number of ways to bring the industry together online, both in the short and long term. We know from many years of serving the community with face-to-face events, that connectivity is vital to the health and success of the industry. That’s why we are excited to announce NAB Show Express, targeted to launch in April 2020. This digital experience will provide a conduit for our exhibitors to share product information, announcements and demos, as well as deliver educational content from the original selection of programming slated for the live show in Las Vegas, and create opportunities for the community to interact virtually — all of which adds up to something that brings the NAB Show community together in a new way.

“Second, we will be enhancing NAB Show New York with new programs, partners, and experiences. We have already had numerous conversations with show partners about expanding their participation and have heard from numerous exhibitors interested in enhancing their presence at this fall’s show. NAB Show New York represents the best opportunity for companies to announce and showcase their latest innovations and comes at a perfect time for the industry to gather face-to-face to restart, refocus, and reengage as we move forward together.”

A number of companies are releasing updates and offering discounts and tips for working remotely. Here is a bit of news from some of those companies, and we will add more companies to this list as the news comes in, so watch this space.

mLogic
mLogic is offering a 15% discount on its mTape Thunderbolt 3 LTO-7 and LTO-8 solutions. The discount applies to orders placed on the mTape website through April 20th. Use discount code mLogicpostPerspective15%.

Xytech
Xytech has launched “Xytech After Dark,” a podcast focusing on trends in the media and broadcasting industries. The first two episodes are now available on iTunes, Spotify and all podcasting platforms.

Xytech’s Greg Dolan says the podcast “is not a forum to sell, but instead to talk about why we create the functionality in MediaPulse and the types of things happening in our industry.”

Hosted by Xytech’s Gregg Sandheinrich, the podcast will feature Xytech staff, along with special guests. The first two episodes cover topics including the recent HPA Tech Retreat (featuring HPA president Seth Hallen), as well as the cancellation of the NAB Show, the value of trade shows and the effects of COVID-19 on the industry.

Nvidia
Nvidia is expanding its free virtual GPU software evaluation to 500 licenses for 90 days to help companies support their remote workers with their existing GPU infrastructure. Nvidia vGPU software licenses — including Quadro Virtual Workstation — enable GPU-accelerated virtualization so that content creators, designers, engineers and others can continue their work. More details are available here. Nvidia has also posted a separate blog on virtual GPUs to help admins who are working to support remote employees.

Object Matrix 
Object Matrix is offering video tips for surviving working from home. The videos, hosted by co-founder Nicholas Pearce, are here.

Adobe
Adobe shared a guide to best practices for working from home. It’s meant to support creators and filmmakers who might be shifting to remote work and need to stay connected with their teams and continue to complete projects. You can find the guide here.

Adobe’s principal Creative Cloud evangelist, Jason Levine, hosted a live stream, Video Workflows With Team Projects, that focuses on remote workflows.

Additionally, Karl Soule, senior technical business development manager, hosted a stream focusing on remote video workflows and collaboration in the enterprise. If you sign up on this page, you can see his presentation.

Streambox
Streambox has introduced a pay-as-you-go software plan for video professionals who use its Chroma 4K, Chroma UHD, Chroma HD and Chroma X streaming encoder/decoder hardware. Since the software has been “decoupled” from the hardware platform, those who own the hardware can rent the software on a monthly basis, pause the subscription between projects and reinstate it as needed. By renting software for a fixed period, creatives can take on jobs without having to pay outright for technology that might otherwise have been impractical.

And last week’s offerings as well

Frame.io 
Through the end of March, Frame.io is offering 2TB of free extra storage capacity for 90 days. Those who could use that additional storage to accommodate work-from-home workflows should email rapid-response@frame.io to get it set up.

Frame.io is also offering free Frame.io Enterprise plans for the next 90 days to support educational institutions, nonprofits and health care organizations that have been impacted. Please email rapid-response@frame.io to set up this account.

To help guide companies through this new reality of remote working, Frame.io is launching a new “Workflow From Home” series on YouTube, hosted by Michael Cioni, with the first episode launching Monday, March 23rd. Cioni will walk through everything artists need to keep post production humming as smoothly as possible. Subscribe to the Frame.io YouTube channel to get notified when it’s released.

EditShare
EditShare has made its web-based remote production and collaboration tool, Flow Media Management, free through July 1st. Flow enables individuals as well as large creative workgroups to collaborate on story development, with capabilities to perform extensive review and approval from anywhere in the world. Those interested can complete this form, and one of EditShare’s Flow experts will follow up.

Veritone 
Veritone will extend free access to its core applications — Veritone Essentials, Attribute and Digital Media Hub — for 60 days. Targeted to media and entertainment clients in radio, TV, film, sports and podcasting, Veritone Essentials, Attribute, and Digital Media Hub are designed to make data and content sharing easy, efficient and universal. The solutions give any workforce (whether in the office or remote) tools that accelerate workflows and facilitate collaboration. The solutions are fully cloud-based, which means that staff can access them from any home office in the world as long as there is internet access.

More information about the free access is here. Certain limitations apply. Offer is subject to change without notice.

SNS
In an effort to quickly help EVO users who are suddenly required to work on editing projects from home, SNS has released Nomad for on-the-go, work-from-anywhere, remote workflows. It is a simple utility that runs on any Mac or Windows system that’s connected to EVO.

Nomad helps users repurpose their existing ShareBrowser preview files into proxy files for offline editing. These proxy files are much smaller versions of the source media files, and therefore easier to use for remote work. They take up less space on the computer, take less time to copy and are easier to manage. Users can edit with these proxy files, and after they’re finished putting the final touches on the production, their NLE can export a master file using the full-quality, high-resolution source files.
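Proxy generation along these lines is routinely scripted. The sketch below shows the general approach only, not SNS’s implementation; it assumes ffmpeg is installed, and the file paths are hypothetical:

```python
from pathlib import Path

def proxy_command(src: Path, proxy_dir: Path, height: int = 540) -> list:
    """Build an ffmpeg command that makes a small H.264 proxy of `src`.

    The proxy keeps the aspect ratio (width of -2 rounds to an even
    value), uses a high CRF for a small file and stream-copies the audio
    so relinking back to the full-resolution source stays trivial.
    """
    out = proxy_dir / (src.stem + "_proxy.mp4")
    return [
        "ffmpeg", "-i", str(src),
        "-vf", f"scale=-2:{height}",              # downscale, keep aspect
        "-c:v", "libx264", "-crf", "28", "-preset", "fast",
        "-c:a", "copy",                           # audio untouched
        str(out),
    ]
```

The returned list can be handed to `subprocess.run()`; after the offline edit is done, the NLE relinks to the original high-resolution media for the final export, just as described above.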

Nomad is available immediately and free to all EVO customers.

Ftrack
Remote creative collaboration tool ftrack Review is free for all until May 31; this date may be extended as the global situation continues to unfold. ftrack Review is an out-of-the-box remote review and approval tool that enables creative teams to collaborate on, review and approve media via their desktop or mobile browser. Contextual comments and annotations eliminate confusion and reduce reliance on email threads. ftrack Review accepts many media formats as well as PDFs. Every ftrack Review workspace receives 250 GB of storage.

DejaSoft
DejaSoft is offering editors 50% off all their DejaEdit licenses through the end of April. In addition, the company will help users implement DejaEdit in the best way possible to suit their workflow.

DejaEdit allows editors to share media files and timelines automatically and securely with remote co-workers around the world, without having to be online continuously. It helps editors working on Avid Nexis, Media Composer and EditShare workflows across studios, production companies and post facilities ensure that media files, bins and timelines are kept up to date across multiple remote edit stations.

Cinedeck 
Cinedeck’s cineXtools allows you to edit and correct file deliveries from home.
From now until April 3rd, pros can get a one month license of cineXtools free of charge.

Main Image: Courtesy of Frame.io


Autodesk’s 3ds Max 2021 now available

Autodesk has introduced 3ds Max 2021, a new version of its 3D modeling, animation and rendering software. This latest release offers new tools designed to give 3D artists the ability to work across design visualization and media and entertainment with a fully scriptable baking experience, simpler install process, viewport and rendering improvements, and integrated Python 3 support, among other enhancements.

Highlights include:
• New texture baking experience supports physically based rendering (PBR), overrides and OSL workflows and provides an intuitive new tool set.
• Updated installer allows users to get up and running quickly and easily.
• Integrated support for Python 3 and an improved pymxs API that ensure developers and technical artists can better customize pipelines.
• Native integration with the Arnold Renderer V6.0 offers a high-end rendering experience out of the box, while included scripts efficiently convert V-Ray and Corona files to the Physical Material for added flexibility.
• Performance enhancements simplify the use of PBR workflows across renderers, including with realtime game engines; provide direct access to high-fidelity viewports; improve the OSL user experience; significantly accelerate file I/O; and enhance control over modeling with a new weighted normals modifier.
• Tool set advancements to SketchUp import, Substance, ProSound and FBX streamline the creation and movement of high-quality 3D assets.
• New plugin interop and improvements – from support for AMG and OSL shaders to scene converter extensions – allow for a broader range of plugins to easily hook into 3ds Max while also simplifying plugin development and installation.

Early 3ds Max 2021 adopter Eloi Andaluz Fullà, a freelance VFX artist who participated in the beta, reported, “The revamped viewport, IBL controls and persistent Ambient Occlusion speed up the client review process because I can easily share high-quality viewport images without having to wait for renders. The new bake to texture update is also a huge time-saver because we can easily modify multiple parameters at once, while other updates simplify day-to-day tasks.”

3ds Max 2021 is now available as a stand-alone subscription or with the Autodesk Media & Entertainment Collection.

Workstations Roundtable

By Randi Altman

In our Workstations Special Edition, we spoke to pros working in offline editing, visual effects and finishing about what they need technically in order to keep creating. Here in our Workstations Roundtable, we reached out to both users and those who make computers and related tools, all of whom talk about what they need from their workstations in order to get the job done.

The Foundation’s Director of Engineering, John Stevens 

John Stevens

Located just across the street from the Warner Bros. lot, The Foundation provides post production picture services and workflows in HD, 2K, 4K, UHD, HDR10 and Dolby Vision HDR. They work on many episodic shows, including Black-ish, Grown-ish, Curb Your Enthusiasm and American Soul.

Do you typically buy off the shelf or custom? Both?
Both. It depends on the primary application the system will be running. Typically, we buy off-the-shelf systems that have the CPU and memory configurations we are looking for.

How often do you upgrade your workstations, and what process do you go through in finding the right one?
There is no defined time frame. We look at every system manufacturer’s offerings, look at specs and request demo systems for test after we have narrowed it to a few systems.

How important is the GPU to your work?
The GPU is extremely important, as almost every application uses the GPU to allow for faster processing. A lot of applications allow for multiple GPUs, so I look for systems that will support them.

Curb Your Enthusiasm

What are the questions you ask yourself before buying a new system? And what do you do with your older systems?
What is the primary application that the system is being purchased for? Does the software vendor have a list of certified configurations? Is the application well-threaded, meaning, can the application make efficient use of multiple cores, or does a higher core clock rate make the application perform faster? How many PCI slots are available? What is the power supply capability? What’s the reputation and experience of the manufacturer?

Do you feel mobile workstations are just as powerful for your work as desktops these days?
No. Mobile systems are still limited in expandability.

 

Puget Systems’ Solutions Research & Development, Matt Bach

Based in Auburn, Washington, Puget Systems specializes in high-performance, custom-built computers for media and entertainment.

Matt Bach

What is your definition of a workstation? We know there are a few definitions out there in the world.
While many people tend to focus on the hardware to define what a workstation is, to us it is really whether or not the computer is able to effectively allow you to get your work done. In order to do so, it has to be not only fast but reliable. In the past, you had to purchase very expensive “workstation-class” hardware to get the proper balance of performance and stability, but these days it is more about getting the right brands and models of parts to complement your workflow than just throwing money at the problem.

For users looking to buy a computer but are torn between off-the-shelf and building their own, what would you tell them?
The first thing I would clarify is that there are vastly different kinds of “off-the-shelf” computers. There are the systems you get from a big box store, where you have a handful of choices but no real customization options. Then there are systems from companies like us, where each system is tailor-made to match what applications you use and what you do in those applications. The sticker price on these kinds of systems might appear to be a bit higher, but in reality — because it is the right hardware for you — the actual performance you get per dollar tends to be quite a bit better.

Of course, you can build a system yourself, and in fact, many of our customers used to do exactly that. But when you are a professional trying to get your work done, most people don’t want to spend their time keeping up on the latest hardware, figuring out what exact components they should use and troubleshooting any issues that come up. Time spent fiddling with your computer is time that you could spend getting your job done. Working with a company like us that understands what it is you are doing — and how to quickly get you back up and running — can easily offset any cost of building your own system.

What questions would you suggest pros ask before deciding on the right computer for their work?
This could easily be an entire post all its own, and this is the reason why we highly encourage every customer to talk to one of our consultants — if not on the phone, then at least by email. The right configuration depends on a huge number of factors that are never quite the same from one person to the next. It includes what applications you use and what you do in those applications. For example, if you are a video editor, what resolution, fps and codec do you tend to work with? Do you do any multicam work? What about VFX or motion graphics?

Depending on what applications you use, it is often also the case that you will run into times when you have opposing “optimal” hardware. A program like After Effects prefers CPUs with high per-core performance, while Premiere Pro can benefit from a CPU with more cores. That means there is no single “best” option if you use both of those applications, so it comes down to determining which application is more likely to benefit from more performance in your own personal workflow.
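That per-core versus core-count trade-off is essentially Amdahl’s law: the benefit of extra cores is capped by the fraction of the workload that actually runs in parallel. A quick illustrative calculation (the fractions are made-up examples, not measured figures for any application):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Theoretical speedup on `cores` cores per Amdahl's law."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Hypothetical workloads: an effects-heavy task that is half serial
# versus an export pipeline that parallelizes well.
mostly_serial = amdahl_speedup(0.50, 32)    # ~1.94x: more cores barely help
mostly_parallel = amdahl_speedup(0.95, 32)  # ~12.5x: core count pays off
```

With only half the work parallelizable, even 32 cores deliver less than a 2x speedup, which is why per-core clock speed can matter more than core count for some applications.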

This really only scratches the surface, however. There is also the need to make sure the system supports your existing peripherals (Thunderbolt, 10G networking, etc.), the physical size of the system and upgradability. Not to mention the quality of support from the system manufacturer.

How do you decide on what components to include in your systems … GPUs, for example?
We actually have an entire department (Puget Labs) that is dedicated to this exact question. Not only does hardware change very quickly, but software is constantly evolving as well. A few years back, developers were working on making their applications multi-threaded. Now, much of that dev time has switched over to GPU acceleration. And in the very near future, we expect work in AI and machine learning to be a major focus.

Keeping up with these trends — and how each individual application is keeping up with them — takes a lot of work. We do a huge amount of internal testing that we make available to the public to determine exactly how individual applications benefit from things like more CPU cores, more powerful GPUs or faster storage.

Can you talk about warranties and support? What do you offer?
As for support and warranty, our systems come with lifetime tech support and a one- to three-year parts warranty. What makes us most different from big box stores is that we understand your workflow. We do not want your tech support experience to be finger-pointing between Adobe, Microsoft and Puget Systems. Our goal is to get you up and running, regardless of what the root cause is, and often that means we need to be creative and work with you individually on the best solution to the problem.

 

Goldcrest Post’s Technical Director, Barbary Ahmed

Barbary Ahmed

Goldcrest Post New York, located in the heart of the bustling Meatpacking District, is a full-service post facility offering offline editing and picture and sound finishing. Recent credits include The Laundromat, Godfather of Harlem, Russian Doll, High Flying Bird, Her Smell, Sorry to Bother You, Billions and Unsane.

Do you typically buy off the shelf or custom? Both?
We do both. But for most cases, we do custom builds because color grading workstations need more power, more GPUs and a lot of I/O options.

How often do you upgrade your workstations, and what process do you go through in finding the right one?
This is a long research process. We depend on our trusted vendors, and it also depends on pricing, the availability of items and how quickly we need them.

How important is the GPU to your work?
For color grading and visual effects, using applications such as Autodesk’s Maya and Flame, Blackmagic Resolve and Adobe Premiere, a high-end workstation provides a smoother and faster workflow. 4K/UHD media and above can tax a computer, so having access to a top-of-the-line machine is key for us.

GPUs matter because the video software mentioned above is now able to offload much of the heavy lifting onto the GPU (or even several GPUs), leaving the CPU free to do its job of delegating tasks, applications, APIs, hardware processes, I/O device requests and so on. The CPU just makes sure all the basic tasks run in harmony, while the GPU takes care of crunching the more complex and intensive computation needed by the application. That holds for all but the most basic video work, and certainly for any form of 4K.

What are the questions you ask yourself before buying a new system? And what do you do with your older systems?
There are many questions to ask here: Is this system scalable? Can we upgrade it in the future? What real change will it bring to our workflow? What are others in my industry using? Does my team like it? These are the kind of questions we start with for any job.

In terms of what to do with older systems, there are a couple things that we think about: Can we use it as a secondary system? Can we donate it? Can we turn it into an experimental box? Can we recycle it? These are the kind of questions we ask ourselves.

Do you feel mobile workstations are just as powerful for your work as desktops these days? Especially now, with the coronavirus shutdowns?
During these unprecedented times, it seems that mobile workstations are the only way to keep up with our clients’ needs. But we were innovative about it; we established the capability to conduct most picture and sound post production work remotely. Colorists, conform editors and other staff are now able to work from home or a remote site and connect to the facility’s central storage and main desktop workstations via remote collaboration software.

This allows Goldcrest to ensure theatrical and television projects remain on track while allowing clients to oversee work in as normal a manner as possible under current circumstances.

 

Dell’s M&E Strategist, Client Solutions, Matt Allard

Matt Allard

Dell Technologies helps users create, manage and deliver media through a complete and scalable IT infrastructure, including workstations, monitors, servers, shared storage, switches, virtualization solutions and more, paired with support and services.

What is Dell’s definition of a workstation? We know there are a few definitions.
One of the most important definitions is the International Data Corporation’s (IDC) definition that assesses the overall market for workstations. This definition includes several important elements:

1. Workstations should be highly configurable and include workstation-grade components, including:
a. Workstation-grade CPUs (like Intel Xeon processors)
b. Professional and discrete GPUs, like those in the Nvidia Quadro line and AMD Radeon Pro line
c. Support for ECC memory

2. Workstations must be certified with commonly used professional ISV software, like that from Adobe, Autodesk, Avid, Blackmagic and others.

3. IDC requires a brand that is dedicated and known for workstations.

Beyond the IDC’s requirements, we understand that workstation customers are seeking the utmost in performance and reliability to run the software they use every day. We feel that workstation-grade components and Dell Precision’s engineering deliver that environment. Reliability can also include the security and manageability that large enterprises expect, and our designs provide the hooks that allow IT to manage and maintain workstations across a large studio or media enterprise. Consumer PCs rarely include these commercial-grade IT capabilities.

Additionally, software and technology (such as the Dell Precision Optimizer, our Reliable Memory Technology, Dell Client Command Suite) can extend the performance, reliability and manageability on top of the hardware components in the system.

For users looking to buy a computer but are torn between off the shelf and building their own, what would you tell them?
It’s a common misconception that a computer is just a sum of its parts. It can be better to deal with a vendor that has the supply chain volume and market presence to have advantageous access during times like these, when supply constraints exist on popular CPUs and GPUs. Additionally, most professional ISV software is not qualified or certified on a set of off-the-shelf components, but on specific vendor PC models. If users want absolute confidence that their software will run optimally, using a certified/qualified platform is the best choice. Warranties are also important, but more on that in a bit.

What questions would you suggest pros ask before deciding on the right computer for their work?
The first question is to be clear about the nature of the work you do as a pro, using what software applications in the media and entertainment industry. Your working resolution has a large bearing on the ideal configuration for the workstation. We try to make deciding easier with Dell’s Precision Workstation Advisor, which provides pros an easy way to find configuration choices based on our certification testing and interaction with our ISV partners.

Do you think we are at a time when mobile workstations are as powerful as desktops?
The reality is that it is not challenging to build a desktop configuration that is more powerful than the most powerful mobile workstation. For instance, Dell Precision fixed workstations support configurations with multiple CPUs and GPUs, and those actually require beefier power supplies, more slots and thermal designs that need more physical space than in a reasonably sized mobile.

A more appropriate question might be, can a mobile workstation be an effective tool for M&E professionals who need to be on the road or on a shoot? And the answer to that is a resounding yes.

How do you decide on what components to include in your systems … GPUs, for example?
As mentioned above, workstations tend to be highly configurable, often with multiple options for CPUs, GPUs and other components. We work to stay at the forefront of our suppliers’ roadmap offerings and to provide a variety of options so customers can choose the right price/performance configuration that suits their needs. This is where clear guidance on certified systems for the ISV software a customer is using makes selecting the right configuration easier.

Can you talk about warranties and support?
An advantage of dealing with a Tier 1 workstation vendor like Dell is that pros can pick the right warranty and support level for their business, from basic hardware warranty to our ProSupport with aggressive availability and response times. All Dell Precision fixed workstations come with a three-year Dell Limited Hardware warranty, and users can opt for as many as five years. Precision mobile workstations come with a one-year warranty (except 7000 series mobile, which has three years standard), and users can opt for as many as five years’ warranty with ProSupport.

 

Performance Post’s Owner/President, Fausto Sanchez

Fausto Sanchez

Burbank’s independently owned Performance Post focuses on broadcast television work for clients including Disney, Warner Bros. and NBCUniversal. Credits include TV versions of the Guardians of the Galaxy franchise as well as SD-to-UHD upconversions and frame-rate conversions for HBO’s At the Heart of Gold: Inside the USA Gymnastics Scandal.

Do you typically buy off the shelf or custom? Both?
We look to the major suppliers like HP, Dell and Apple for off-the-shelf products. We also have
purchased custom workstations, and we build our own.

How often do you upgrade your workstations, and what process do you go through in finding the right one?
If we have done our homework well, our workstations can last for three to five years. This timeline is becoming shorter, though, with new technologies such as higher core counts and faster clock speeds.

In evaluating our needs, first we look at the community for best practices. We look to see what has been successful for others. I love that we can get that info and stories here on postPerspective! We look at what the main suppliers are providing. These are great if you have a lot of extra cash. For many of us, the market is always demanding and squeezing everything it can. We are no different. We have bought both preconfigured systems from the primary manufacturers as well as custom systems.

HBO’s At the Heart of Gold: Inside the USA Gymnastics Scandal.

How important is the GPU to your work?
In our editorial workflows — Avid Media Composer, Adobe Premiere, Blackmagic Resolve (for editing) — the GPU is not a big deal because these applications don’t currently rely on it much for basic editing. Mostly, you select the one best suited to your applications. Nvidia has been the mainstay for a long time, but AMD has gained great support, especially in the new Mac Pro workstation.

For color work or encoding, the GPU selection becomes critical. Currently, we are using the Nvidia Titan series GPUs for some of our heaviest processor-intensive workflows.

What are the questions you ask yourself before buying new systems? And what do you do with your older systems?
When buying a new system, obviously the first questions are: What is it for? Can we expand it? How much? What kind of support is there? These questions become key, especially if you decide to build your custom workstation. Our old systems many times are repurposed for other work. Many can function in other duties for years.

Do you feel mobile workstations are just as powerful for your work as desktops these days?
We have had our eye on mobile workstations for some time. Many are extremely powerful and can find a good place for a specific purpose. There can be a few problems with this setup: additional monitor capabilities, external GPUs and external mass storage connectivity. For a lot of work, mobile workstations make sense; if I do not have to connect a lot of peripherals and can work mostly self-contained or cloud-based, they can be great. In many cases, though, you quickly learn that the keyboard, screen and battery life are not conducive to a long-term workflow. They’re just not for us right now.

 

AMD’s Director of VFX/Media & Entertainment, James Knight

James Knight

AMD provides Threadripper and Epyc CPUs that accelerate workflows in M&E.

How does AMD describe a workstation?
Some companies have different definitions of what makes a workstation. 
Essentially AMD thinks of workstations as a combination of powerful CPUs and GPUs that enable professionals to create, produce, analyze, design, visualize, simulate and investigate without having to compromise on power or workload performance to achieve their desired results. In the specific case of media and entertainment, AMD designs and tests products aligned with the workstation ecosystem to enable professionals to do so much more within the same exact deadlines. We are giving them more time to create.

For users looking to buy a computer but are torn between off the shelf and building their own, what would you tell them?
Ultimately, professionals need to choose the best solution to meet their creative goals. We work closely with major OEMs to provide them with the best we have to offer for the market. For example, 64-core Threadripper has certainly been recognized by workstation manufacturers. System builders can offer these new CPUs to achieve great results.

What questions should pros ask before purchasing a workstation, in order to make sure they are getting the right workstation for their needs?
I typically ask professionals to focus on their pain points and how they want the new workstation to resolve those issues. More often than not, they tell me they want more time to create and time to try various renderings. With an optimized workstation matched with an optimal balance of powerful CPUs and reliable GPUs, pros can achieve the results they demand over and over.

What trends have you seen happening in this space over the last couple of years?
As memory technology improves and larger models of higher resolution are created, I’ve seen user expectations increase dramatically, as has their desire to work easily with these files. The demand for reliable tools for creating, editing and producing content has been constantly growing. For example, in the case of movie mastering and encoding, AMD’s 32-core and 64-core Threadripper CPUs have exceeded expectations when working with these large files.

PFX’s Partner/VFX Supervisor, Jan Rybar

Jan Rybar

PFX is a Czech-based company focused on animation, post and visual effects. They work on international projects ranging from short films to commercials, TV series and feature films. The 110-member team works in their studios in Prague.

How often do you upgrade your workstations, and what process do you go through in finding the right one?
We upgrade the workstations themselves maybe every two or three years. We try to select good quality vendors and robust specs so we won’t be forced to replace workstations too often.

Do you also build your own workstations and renderfarms?
Not really — we have a vendor we like and buy all the hardware there. A long time ago, we found out that the reliability of HP and their Z line of workstations is what we need. So 99% of our workstations and blade renderfarms are HP.

How do your needs as a VFX house differ from a traditional post house?
It blends together a lot — it’s more about what the traditional post house specializes in. If it’s focused on animation or film, then the needs are quite similar, which means more based on CPU power. Lately, as we have been involved more and more in realtime engine-based workflows, state-of-the-art GPU technology is crucial. The Last Whale Singer teaser we did was created with the help of the latest GeForce RTX 2080 Ti hardware. This allowed us to work both efficiently and with the desired quality (raytracing).

Can you walk us through your typical workflow and how your workstations and their components play a part?
The workflow is quite similar to any other production: design/concept, sculpting, modeling, rigging, layout, animation, lighting/effects, rendering, compositing, color grading, etc.

The main question these days is whether the project runs in a classic animation pipeline, on a realtime engine pipeline or a hybrid. Based on this, we change our approach and adapt it to the technology. For example, when Telescope Animation works on a scene in Unreal, it requires different technology compared to a team that’s working in Maya/Houdini.

PNY’s Nvidia Quadro Product Marketing Manager, Carl Flygare

Carl Flygare

Nvidia’s Quadro RTX-powered workstations, featuring Nvidia Turing GPU architecture, allow for realtime raytracing, AI and advanced graphics capabilities for visualization pros. PNY is Nvidia’s Quadro channel partner throughout North America, Latin America, Europe and India.

How does PNY describe a workstation? Some folks have different definitions of what makes a workstation.
The traditional definition of the term comes from CAD — a system optimized for computer-aided design — with a professional CPU (e.g., Xeon, Ryzen), generous DRAM capacity with ECC (Error Correction Code), a significant amount of mass storage, a graphics board capable of running a range of pro applications required by a given workflow, and a power supply and system enclosure sufficient to handle all of the above. Markets and use cases also matter.

Contemporary M&E requires realtime cinematic-quality rendering in application viewports, with an AI denoising assist. Retiming video (e.g., from 30 fps to 120 fps) for a slow-motion effect can be done by AI, with results essentially indistinguishable from a slow-motion session on the set. A data scientist would see things differently: GPU Tensor TFLOPS to enable rapid model training that achieves inference accuracy requirements, GPU memory capacity to hold extremely large datasets, and a CPU/GPU combination that offers a balanced architectural approach to performance. With so many different markets and needs, practically speaking, a workstation is a system that allows a professional to do their best work in the least amount of time. Have the hardware address that need, and you’ve got a workstation.

For users looking to buy a computer but are torn between off the shelf and building their own, what would you tell them?
As Henry Ford famously said about the Model T: “Any customer can have a car painted any color that he wants so long as it is black.” That is the off-the-shelf approach to acquiring a workstation. Large Tier 1 OEMs offer extensive product lines and daunting Configure to Order options, but ultimately, all offer similar classes of systems. Off-the-shelf is easy; once you successfully navigate the product line and specifications maze, you order a product, and a box arrives. But building your own system is not for the faint-hearted. Pick up CPU data sheets from Intel or AMD — you can read them for days.

The same applies to GPUs. System memory is easier, but mass storage offers a dizzying array of options. HDD (hard disk drive) or SSD (solid state drive)? RAID (and if so, what kind) or no RAID? How much power supply capacity is required for stable performance? A built-from-scratch workstation can result in a dream system, but with a system of one (or a few), how well will critical applications run on it? What if an essential workflow component doesn’t behave correctly? In many instances this will leave you on your own. Do you want to buy a system to perform the work you went into business to do, or do you want to spend time maintaining a system you need to do your work?

A middle path is available: a vibrant, agile community of specialist system builders with deep market and solutions knowledge. Vendors like Boxx Technologies, Exxact, Rave Computer, Silverdraft Supercomputing and @Xi Computer (among others) come to mind. These companies specialize in workstations (as defined by any of the definitions discussed earlier), have deep vertical knowledge, react quickly to technological advances that provide a performance and productivity edge, and vigorously support what they sell.

What questions would you suggest pros ask before deciding on the right computer for their work?
Where is their current system lacking? How are these deficits affecting creativity and productivity? What use cases does a new system need to perform well? What other parts of my employment environment do I need to interact with, and what do they expect me to provide? These top-line questions transition to many others. What is the model or scene size I need to be able to fit into GPU memory to benefit from full GPU performance acceleration? Will marketing show up in my office or cubicle and ask for a photorealistic render even though a project is early in the design stage? Will a client want to interact with and request changes by using VR? Is a component of singular significance — the GPU — certified and supported by the ISVs that my workflow is built around? Answer these questions first, and you’ll find the remainder of the process goes much more easily. Use case first, last and always!

You guys have a relationship with Nvidia and your system-builder partners use their Nvidia GPUs in their workstations. Can you talk about that?
PNY is Nvidia’s sole authorized channel partner for Nvidia Quadro products throughout North America, Latin America, Europe, the Middle East, Africa and India. Every Quadro board is designed, tested and built by Nvidia, whether it comes from PNY, Dell, HP or Lenovo. The difference is that PNY supports Quadro in any system brand. Tier 1 OEMs only support a Quadro board’s “slot win” in systems they build. This makes PNY a much better choice for GPU upgrades — a great way to extend the life of existing workstations — or when looking for suppliers that can deliver the technical support required for a wonderful out-of-box experience with a new system. It’s true whether the workstation is custom-built or purchased through a PNY Partner that specializes in delivering turnkey systems (workstations) built for professionals.

Can you talk about warranties and support? What do you offer?
PNY offers support for Nvidia in any system brand. We have dedicated Nvidia Quadro technical support reps available by phone or email. PNY never asks for a credit card number before offering product or technical support. We also have full access to Nvidia product and technical specialists should escalation be necessary – and direct access to the same Nvidia bug reporting system used by Nvidia employees around the world.

Finally, what trends do you see in the workstation market currently?
First the good: Nvidia Quadro RTX has enabled a workstation renaissance. It’s driving innovation for design, visualization and data science professionals across all major market segments. An entirely new class of product — the data science workstation — has been developed. Quadro RTX in the data centers and virtual GPU technology can bring the benefits of Quadro RTX to many users while protecting essential intellectual property. This trend toward workstation specialization by use case offers buyers more choices that better fit their specific criteria. Workstations — however defined — have never been more relevant or central to creative pros across the globe. Another good trend is the advent of true mobile workstations and notebooks, including thin and light systems, with up to Quadro RTX 5000 class GPUs.

The bad? With choice comes confusion. So many to choose from. Which best meets my needs? Companies with large IT staff can navigate this maze, but what about small and medium businesses? They can find the expertise necessary to make the right choice with PNY’s extensive portfolio of systems builders. For that matter, enterprises can find solutions built from the chassis up to support a given use case. Workstations are better than ever before and purchasing one can be easier than ever as well.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Maxon to live-stream NAB news and artist presentation

With the Las Vegas NAB Show now cancelled, Maxon will be hosting a virtual NAB presence on C4DLive.com featuring a lineup of working artists. Starting on Monday, April 20, and running through Thursday, April 23, these pros — who were originally slated to appear in Las Vegas — will share production tips, techniques and inspiration and talk about working with Maxon’s Cinema 4D, Red Giant and Redshift product lines.

For over a decade, Maxon has supplemented its physical booth presence with live-streaming presentations. This has allowed show attendees, as well as those unable to attend events in person, to benefit from demos, technology updates and interaction with the guest artists in real time. First up will be CEO Dave McGavran, who will talk about Maxon’s latest news and recent merger with Red Giant.

In terms of artists, Penelope Nederlander will break down her latest end credit animation for Birds of Prey; filmmaker Seth Worley will walk through some of the visual effects shots from his latest short film, Darker Colors; Doug Appleton will share the creative processes behind creating the technology for Spider-Man: Far From Home; Jonathan Winbush will demo importing C4D scenes into Unreal Engine for rendering or VR/AR output; and Veronica Falconieri Hays will share how she builds cellular landscapes and molecular structures in order to convey complex scientific stories.

The line-up of artists also includes Mike “Beeple” Winkelmann, Stu Maschwitz, EJ Hassenfratz, Chris Schmidt, Angie Feret, Kelcey Steele, Daniel “Hashi” Hashimoto, Dan Pierse, Andy Needham, Saida Saetgareeva and many more.

Additional presenters’ info and a live streaming schedule will be available at C4DLive.com.

Main Image: (L-R) Saida Saetgareeva and Penelope Nederlander

Workstations: Offline Editing Workflows

By Karen Moltenbrey

When selecting a workstation, post facilities differ in their opinions about what’s most important, depending on the function the workstations will serve. It goes without saying that everyone wants value. And for some, power is paramount. For others, speed is a top priority. And for others still, reliability reigns supreme. Luckily for users, today’s workstations can check all those boxes.

As Eric Mittan, director of technology at New York’s Jigsaw Productions, is quick to point out, it’s hard to fathom the kinds of upgrades in power we’ve seen in workstations just in the time he has been working with them professionally. He recalls that in 2004, it took an overnight encoding session to author a standard-definition DVD with just one hour of video — and that task was performed on one of the first dual-processor desktops available to the regular consumer. “Nowadays, that kind of video transcode can take 15 minutes on a ‘light’ laptop, to say nothing of the fact that physical media like the DVD has gone the way of the dinosaur,” he says.

Eric Mittan

That is just the tip of the iceberg in terms of the revolution that workstations have undergone in a very short period. Here, we examine the types of workstations that a pair of studios are using for their editing tasks. Jigsaw, a production company, does a large portion of its own post through Apple iMacs that run Avid Media Composer; it is also a client of post houses for work such as color and final deliverables. Meanwhile, another company, Final Cut, is also a Mac-based operation, running Avid Media Composer and Adobe Premiere Pro, although the company’s Flames run on HP workstations.

[Editor’s Note: These interviews were done prior to the coronavirus lockdown.]

Jigsaw Productions
Jigsaw Productions is a documentary television and film company that was founded in 1978 by documentary filmmaker Alex Gibney. It has since transitioned from a company that made one movie at a time to one that is simultaneously producing multiple features and series for distribution by a number of networks and distribution partners.

Today, Jigsaw does production and offline editorial for all its own films and series. “Our commitment is to filmmakers bringing real stories to their audience,” Mittan says. Jigsaw’s film and episodic projects include the political (Client 9: The Rise and Fall of Eliot Spitzer), the musical (History of the Eagles) and the athletic (The Armstrong Lie).

On the technical front, Jigsaw does all the creative editorial in house using Avid’s Media Composer. After Jigsaw’s producers and directors are satisfied with the storytelling, the lion’s share of the more technical work is left to the company’s partners at various post houses, such as Harbor, Technicolor, Light Iron and Final Frame, among others. Those facilities do the color timing and DCP generation in the case of the feature titles. Most of the conform and online work for Jigsaw’s TV series is now done in house and then sent out for color.

“I wouldn’t say for sure that we have mastered the Avid-to-Resolve online workflow, but we have become better at it with each project,” says Mittan. It’s Mittan’s job to support post and offline operations along with the needs of the others in the office. The backbone of the post fleet comprises 26 (2018) 27-inch i7 iMacs with 32GB of RAM. During 2018 and 2019, Jigsaw experienced a period of rapid growth, adding 19 new edit suites. (That was in addition to the original 13 built out before Mittan came aboard in 2017.) There are also some earlier iMac models that are used for lighter tasks, such as screening, occasional transcoding and data transfers, as well as eight Mac mini screening stations and five Mac Pro cylinders for heavy transcoding and conform/online tasks. Another 10 or so 2019 models round out the hardware, though they were purchased with i5 processors, not i7s.

“Jigsaw’s rapid expansion pushed us to buy new machines in addition to replacing a significant portion of our 2012/2013 model Mac Pro and iMac units that had comprised most of our workstations prior to my arrival,” Mittan notes. Each project group at the company is responsible for its own data management and transcoding its own dailies.

Furthermore, Jigsaw has an Avid Nexis shared storage system. “Our editors need to be able to run the latest version of Avid and must maintain and play back multiple streams of DNxHR SQ via a 1Gb connection to our Nexis shared storage. While documentary work tends to be lower resolution and/or lower bandwidth than narrative scripted work, every one of our editors deserves to be able to craft a story with as few technical hiccups as possible,” says Mittan. “Those same workstations frequently need to handle heavy transcodes from interview shoots and research archive gathered each day by production teams.”
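A quick back-of-the-envelope calculation shows why a 1Gb connection comfortably carries a few proxy streams. The numbers here are illustrative assumptions, not figures from Jigsaw's setup: DNxHR SQ at 1080p runs very roughly 120-145 Mb/s depending on frame rate, and real planning should use the codec vendor's published rate tables.

```python
# Rough capacity check: how many proxy streams fit through a 1Gb
# link to shared storage? All figures below are assumptions.

LINK_MBPS = 1000     # nominal 1Gb Ethernet
USABLE = 0.80        # leave headroom for protocol overhead and bursts
STREAM_MBPS = 145    # assumed DNxHR SQ 1080p stream bitrate

usable_mbps = LINK_MBPS * USABLE
streams = int(usable_mbps // STREAM_MBPS)

print(f"Usable bandwidth: {usable_mbps:.0f} Mb/s")
print(f"Concurrent proxy streams (approx.): {streams}")  # about 5
```

Even with conservative headroom, a single editor's handful of streams fits easily, which is why documentary proxy workflows can run over ordinary gigabit links.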

When buying new equipment, Mittan looks to strike a balance between economy and sustainability. While the work at Jigsaw does not always require the latest and greatest of all possible end-user technology, he says, each purchase needs to be made with an eye toward how useful it will remain three to five years into the future.

Salt, Fat, Acid, Heat

While expansion in the past few years resulted in the need for additional purchases, Mittan is hoping to get Jigsaw on a regular schedule of cycling through each of the units over a period of five to six years. Optimally, the edit suite units are used for between three or more years before being downgraded for lighter tasks and eventually used as screening stations for Jigsaw’s producers. Even beyond that, the post machines could see life in years six to eight as office workstations for some of the non-post staff and interns. Although Mittan has yet to access one of the new Mac Pro towers, he is impressed by what he has read and hopes for an acquisition in 2021 to replace the Mac Pro cylinders for online and conform work.

Post at Jigsaw runs Avid Media Composer on the Apple machines. They also use the Adobe Creative Cloud suite for motion graphics within Adobe After Effects and Photoshop. Mittan has also implemented a number of open-source software tools to supplement Jigsaw’s tool kit for assistant editors. That includes command-line tools (like FFmpeg) for video and audio transcodes and Rsync for managed file transfers and verification.

“I’ve even begun to write a handful of custom software scripts that have made short work of tasks common to documentary filmmaking — mostly the kind of common video transcoding jobs that would usually require a paid title but that can be taken care of just as well with free software,” he says.
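A minimal sketch of the kind of helper script described above might look like the following. It only builds FFmpeg command lines for a DNxHD 115 proxy pass; the folder layout, file extensions and exact settings are illustrative assumptions rather than Jigsaw's actual scripts, though the flags themselves (-c:v dnxhd, -b:v, -pix_fmt, -c:a) are standard FFmpeg options.

```python
from pathlib import Path

def proxy_command(src: Path, dst_dir: Path) -> list[str]:
    """Build an FFmpeg argv that transcodes one clip to DNxHD 115 in MXF."""
    dst = dst_dir / (src.stem + ".mxf")
    return [
        "ffmpeg", "-i", str(src),
        "-c:v", "dnxhd", "-b:v", "115M",        # DNxHD at 115 Mb/s
        "-s", "1920x1080", "-r", "30000/1001",  # raster/rate the codec expects
        "-pix_fmt", "yuv422p",
        "-c:a", "pcm_s16le",                    # uncompressed audio for editorial
        str(dst),
    ]

def batch_commands(src_dir: Path, dst_dir: Path,
                   exts=(".mov", ".mp4", ".mxf")):
    """Yield one transcode command per camera file found in src_dir."""
    for clip in sorted(src_dir.iterdir()):
        if clip.suffix.lower() in exts:
            yield proxy_command(clip, dst_dir)
```

Each yielded list can then be handed to subprocess.run(); keeping command construction separate from execution makes a script like this easy to dry-run before it touches any media.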

Additionally, Jigsaw makes frequent use of servers, either functioning as a device for a specific task or for automation.

Jigsaw has done projects for HBO (Robin Williams Come Into My Mind), Showtime (Enemies: The President, Justice & the FBI), Discovery Channel (Why We Hate), A&E (The Clinton Affair) and more, as well as for Netflix (Salt Fat Acid Heat, The Family) — work Mittan describes as an exercise in managing more and more pixels.

The Family

Indeed, documentaries can present big challenges when it comes to dealing with a plethora of media formats. “Documentary work can frequently deal with subjects that have already had a significant media footprint in legacy resolutions. This means that if you’re trying to build a documentary in 4K, you’re going to be dealing with archival footage that is usually HD or SD. You may shoot a handful of new interviews in your new, so-called ‘native’ footage but be overwhelmed by hours upon hours of footage from a VHS collection, or stories that have been downloaded from the website of a TV station in the Midwest,” he adds.

“Working with mixed resolutions means you have to have the capability of running and gunning with your new 4K footage, but the lower resolutions can’t leave your creative editors feeling as though they’ve been left with remnants from another time in history. Blending all of those elements together in a way that tells a cohesive story requires technology that can bring together all of those pieces (and newly generated elements like graphics and reenactments) into a unified piece of media without letting your viewing audience feel the whiplash of frequent resolution changes.”

Miky Wolf

Final Cut
Celebrating its 25th anniversary this year, Final Cut was founded in London by editor Rick Russell. It expanded to New York 20 years ago and to Los Angeles 15 years ago. Across all three offices and several subsidiaries — Significant Others VFX, Machine Sound and The Lofts — Final Cut has more than 100 staff and artists worldwide, offering offline editing, online editing, VFX, graphics, finishing, sound design, mixing and original composition, as well as “dry-hire” facilities for long-form content such as original Netflix series like Sex Education.

Primarily, Final Cut does offline creative editorial. Through Significant Others, it offers online editing and finishing. Although, as editor Miky Wolf notes, there are smaller jobs — such as music videos and digital work — for which the facility “does it all.”

Ryan Johnson

The same can be said of technical supervisor Ryan Johnson, whose job it is to design, implement and maintain the technical infrastructure for Final Cut’s New York and Los Angeles offices. This includes the workstations, software, data storage, backups, networking and security. “The best workstations should be like the best edited films. Something you don’t notice. If you are aware of the workstation while you’re working, it’s typically not a good thing,” Wolf says.

Johnson agrees. “Really, the workstation is just there to facilitate the work. It should be invisible. In fact, ours are mostly hidden under desks and are rarely seen. Mostly, it’s a purpose-built machine, designed less for aesthetics and portability than for reliability and practicality.”

Final Cut’s edit room runs off a Mac Pro with 32GB of RAM; there are two editing monitors, a preview monitor on the desk and a client monitor. The majority of the company’s edit workstations are six-core 2013 Mac Pro “trash cans” with AMD FirePro D500 GPUs and 32GB of RAM. There are approximately 16 of these workstations spread between the NY and LA offices. Moreover, the workstations use little to no local storage since the work resides on Avid’s Nexis servers. Each workstation is connected to a pair of 24-inch LCD displays, while video and audio from the edit software are delivered via Blackmagic Design hardware to an LCD preview monitor on the editor’s desk and to an OLED TV for clients.

The assistant editors all work on 27-inch iMacs of various vintages, mainly 2017 i7 models with 32GB of RAM. For on-set/off-site work, Final Cut keeps a fleet of MacBook Pros, mostly the 2015 Thunderbolt 2 pre-Touch Bar models. These travel with USB 3 SSDs for media storage. Final Cut’s Flames, however, all run on dual 12-core HP Z8s with 128GB of RAM. These machines use local SSD arrays for media storage.

According to Johnson, the workstations (running macOS 10.14.6) mostly are equipped with Avid Media Composer or Adobe Premiere Pro, and the editors sometimes “dabble” in Blackmagic’s DaVinci Resolve (for transcoding or when someone wants to try their hand at editing on it). “We primarily work with compressed proxy footage — typically DNxHD 115 or ProRes LT — at 1080p, so bandwidth requirements aren’t too high. Even lower-spec machines handle a few streams well,” he says. “Sequences that involve many layers or complicated effects will often require rendering, but the machines are fast enough that wait times aren’t too long.”
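Those bandwidth numbers are easy to sanity-check. A rough sketch (using nominal published bitrates for 1080p; actual rates vary with frame rate and content) of per-stream throughput and how many streams fit on a 1Gb link:

```python
# Back-of-the-envelope proxy-bandwidth check. The bitrates below are
# nominal published figures for 1080p, not measured values.
DNXHD_115_MBPS = 115   # DNxHD 115 at 1080p, megabits per second
PRORES_LT_MBPS = 102   # ProRes LT at 1080p29.97, megabits per second
LINK_MBPS = 1000       # a 1Gb/s network link

def streams_per_link(codec_mbps, link_mbps=LINK_MBPS, headroom=0.7):
    """How many simultaneous streams fit, leaving 30% headroom."""
    return int(link_mbps * headroom // codec_mbps)

print(DNXHD_115_MBPS / 8)                 # 14.375 MB/s per stream
print(streams_per_link(DNXHD_115_MBPS))   # 6 streams
print(streams_per_link(PRORES_LT_MBPS))   # 6 streams
```

Even a conservative 70%-utilization budget leaves room for a handful of simultaneous proxy streams per editor, which matches Johnson’s observation that lower-spec machines cope well.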

The editors also use Soundminer’s products for their sound effects library. The assistants perform basic compositing in Adobe After Effects, which the machines handle well, Johnson adds. “However, occasionally they will need to transcode raw/camera original footage to our preferred codec for editing. This is probably the most computationally intensive task for any of the machines, and we’ll try to use newer, faster models for this purpose.”

Stray Dolls feature film

Wherever possible, Final Cut deploys the same types of workstations across all its locations, as maintenance becomes easier when parts are interchangeable, and software compatibility is easier to manage when dealing with a homogeneous collection of machines. Not to mention the political benefit: Everybody gets the same machine, so there’s no workstation envy, so to speak.

Reliability and expandability are the most important factors Johnson considers in a workstation. He acknowledges that the 2013 Mac Pros were a disappointment on both counts: “They had thermal issues from the start — Apple admitted as much — that resulted in unpredictable behavior, and you were stuck with whichever 2013-era GPU you chose when purchasing the machine,” he says. “We expect to get many trouble-free years out of the workstations we buy. They should be easy to fix, maintain and upgrade.”

When selecting workstations for Final Cut, a Macintosh shop, there is not a great deal of choice. “Our choices are quickly narrowed down to whatever Apple happens to be selling,” explains Johnson. “Given the performance tiers of the models available, it is a matter of analyzing our performance needs versus our budget. In an ideal world, the entire staff would be working on the fastest possible machine with the most RAM and so forth, but alas, that is not always in the budget. Therefore, compromise must be found in selecting machines that can capably handle the typical workload and are fast enough not to keep editors and assistants waiting too long for renders.”

The most recent purchases were the new iMacs for the assistants in LA. “For the money, they are great machines, and I’ve found them to be reliable even when pushing them through all-night renders, transcodes, etc. They’re at least as fast as the Mac Pros and, in most applications, even faster,” Johnson points out. He expects to replace the 2013 Mac Pros this year.

Florence and the Machine “Big God” music video

Wolf notes that he must be able to work as efficiently at home as he does at the office, “and that’s one nice thing about the evolution of offline editing. A combination of robust laptops and portable SSDs has allowed us to take the work anywhere.”

Using the above-described setup, Final Cut recently finished a campaign for an advertising client in which the edit started on set in LA, continued in the hotel room and then finished back in NY. “We needed to be able to work remotely, even on the plane home, just to get the first cuts done in time,” Wolf explains. “Agencies expect you to be fast. They schedule presentations assuming we can work around the clock to get stuff together — we need systems that can support us.”

Johnson highlighted another recent project with a tight schedule that involved cutting a multi-camera sequence in UHD from portable SSD storage on a standard iMac. “This would have been impossible just a few years ago,” he adds.

Main Image: Netflix’s Sex Education


Karen Moltenbrey is a veteran writer, covering visual effects and post production.

XenData intros Multi-Site Sync service for cloud object storage

XenData, which provides data storage solutions, has announced its new Multi-Site Sync service for cloud object storage targeting media applications. The service creates a global file system accessible worldwide via XenData Cloud File Gateways. The XenData gateways are optimized for video files, supporting partial file restore and streaming.

Each gateway manages a local disk volume that caches frequently accessed files. The solution scales to 2 billion files, unlimited cloud storage and up to 256TB of local disk cache at each location. It can optimize a company’s productivity by providing global file sharing across multiple facilities combined with reliable local performance through local disk caching.

Each instance of the synchronized gateway runs on a physical or virtual Windows machine and allows access to the global file system on each local network as a standard share using SMB, NFS and FTP network protocols. When a file is written to the cloud object storage via one of the gateways, it immediately appears as a stub file within the global file system on all other gateways.
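As an illustration of that stub-file pattern (a minimal sketch, not XenData’s actual implementation; the class and field names are invented), a write at one site can propagate lightweight placeholders to every peer gateway, with the full file fetched from object storage on first read:

```python
# Illustrative sketch of the stub-file sync pattern described above.
# NOT XenData's implementation: names and structure are invented.
class Gateway:
    def __init__(self, name):
        self.name = name
        self.files = {}    # path -> {"stub": bool, "size": int}
        self.peers = []    # other gateways sharing the global file system

    def write(self, path, size):
        # The full copy lands in this site's cache; peers immediately
        # see a stub entry in the global file system.
        self.files[path] = {"stub": False, "size": size}
        for peer in self.peers:
            peer.files[path] = {"stub": True, "size": size}

    def read(self, path):
        entry = self.files[path]
        if entry["stub"]:           # first read at this site:
            entry["stub"] = False   # restore from cloud object storage
        return entry

ny, la = Gateway("NY"), Gateway("LA")
ny.peers, la.peers = [la], [ny]
ny.write("/projects/cut01.mov", 4_000_000_000)
print(la.files["/projects/cut01.mov"]["stub"])  # True until LA reads it
```

The appeal of the pattern is that metadata travels instantly while heavy media moves only on demand, which is why the local disk cache size matters for each site’s working set.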

The Multi-Site Sync solution currently supports the following cloud object storage services: Amazon Web Services S3, Hot and Cool tiers of Azure Blob Storage and Wasabi S3. It also works with multiple cloud storage accounts, allowing simultaneous use of multiple cloud storage providers within the global file system.

According to XenData, each gateway uses multi-part HTTPS with checksum verification for a fast, reliable and secure connection to the cloud storage. The gateways adhere to the Microsoft security model based on Active Directory, allowing easy installation into existing domains. The Cloud File Gateway software can be installed on Windows Server 2016, Windows Server 2019 and Windows 10 machines.

XenData also offers two optimized edge appliances that include the XenData gateway software: the CX-10, a 1RU rack-mount appliance with a 10TB disk cache, and the X1, a compact unit that includes a 1.92TB SSD cache.

The solution uses cost-effective object storage and is priced from $150 per month for a system that manages up to 10TB of cloud storage and has two gateways; the cost of the cloud object storage itself is additional. Multi-Site Sync is scheduled to be available in May.

My Top Five Ergonomic Workstation Accessories

By Brady Betzel

Instead of writing up my normal “Top Five Workstation Accessories” column this year, I wanted to take a slightly different route and focus on products that might lessen pain and maybe even improve your creative workflow — whether you are working at a studio or, more likely these days, working from home.

As an editor, I sit in a chair for most of my day, and that is on top of my three- to four-hour round-trip commute to work. As aches and pains build up (I’m 36, and I’m sure it doesn’t just get better), I had to start looking for solutions to alleviate the pain I can see coming in the future. In the past I have mentioned products like the Wacom Intuos Pro pen tablet, which is great and helped me lessen wrist pain, or color correction panels such as the Loupedeck, which helps creative workflows while also keeping you from relying solely on the mouse, further easing wrist strain.

This year I wanted to look at how the actual setup of a workstation environment might prevent or alleviate pain. So get out of your seat and move around a little, take a walk around the block, and when you get back, maybe rethink how your workstation environment could become more conducive to a creativity-inspiring flow.

Autonomous SmartDesk 2 
One of the most useful things in my search for flexibility in the edit bay is the standup desk. Originally, I went to Ikea and found a clearance tabletop in the “dents” section and then found a kitchen island stand that was standing height. It has worked great for over 10 years; the only issue is that it isn’t easily adjustable, and sometimes I need to sit to really get my editing “flow” going.

Many companies offer standing desk solutions, including manual options like the classic VariDesk desk riser. If you have been in the offline editing game over the past five to 10 years, then you have definitely seen these come and go. But at almost $400, you might as well look for a robotic standing desk. This is where the Autonomous SmartDesk 2 comes into play. Depending on whether you want the Home Office version, which stands between 29.5 inches and 48 inches, or the Business Office version, which stands between 26 inches and 52 inches, you are looking to spend $379 or $479, respectively (with free shipping included).

The SmartDesk 2 desktop itself is made of MDF (medium-density fibreboard), which helps lower the overall cost but is still sturdy and will hold up to 300 pounds. From black to white oak, there are multiple color options, and the desk can even be a conversation piece in the edit bay. I have the Business version in black along with a matching black chair, and I love that it looks clean and modern. The SmartDesk 2 is operated using a front-facing switch plate complete with up, down and four height-level presets. It operates smoothly and, to be honest, impressively. It gives a touch of class to any environment. Setup took about half an hour, and it came with easy-to-follow instructions, screws/washers and tools.

Keep an eye out for my full review of the Autonomous SmartDesk 2 and ErgoChair 2, but for now think about how a standup desk will at least alleviate some of the sitting you do all day while adding some class and conversation to the edit bay.

Autonomous ErgoChair 2 
Along with a standup desk — and more important, in my opinion — is a good chair. Most offline editors and assistant editors work at a company that either values their posture and buys Herman Miller Aeron chairs, or cheaps out and buys the $49 special at Office Depot. I never quite understood the benefit of saving a few bucks on a chair, especially if a company pays for health insurance — because in the end, they will be paying for it. Not everyone likes or can afford the $1,395 Aeron chairs, but there are options that don’t involve ruining your posture.

Along with the Autonomous SmartDesk 2, you should consider buying the ErgoChair 2, which costs $349 — a similar price to other chairs, like the Secretlab Omega series gaming chair that retails for $359. But the ErgoChair 2 has the best of both worlds: an Aeron chair-feeling mesh back and neck support plus a super-comfortable seat cushion with all the adjustments you could want. Even though I have only had the Autonomous products for a few weeks now, I can already feel the difference when working at home. It seems like a small issue in the grand scheme of things, but being comfortable allows my creativity to flow. The chair took under 30 minutes to build and came with easy-to-follow instructions and good tools, just like the SmartDesk 2.

A Footrest
When I first started in the industry, as soon as I began a freelance job, I would look for an old Sony IMX tape packing box. (Yes, the green tapes. Yes, I worked with tape. And yes, I can operate an MSW-2000 tape deck.) Typically, the boxes would be full of tapes because companies bought hundreds and never used them, and they made great footrests! I would line up a couple boxes under my feet, and it made a huge difference for me. Having a footrest relieves lower back pressure that I find hard to relieve any other way.

As I continue my career into my senior years, I finally discovered that there are actual footstools! Not just old boxes. One of my favorites is on Amazon. It is technically an adjustable nursing footstool but works great for use under a desk. And if you have a baby on the way, it’s a two-for-one deal. Either way, check out the “My Brest Friend” on Amazon. It goes for about $25 with free one-day Amazon Prime shipping. Or if you are a woodworker, you might be able to make your own.

GoFit Muscle Hook 
After sitting in an edit bay for multiple hours, multiple days in a row, I really like to stretch and use a massager to un-stuff my back. One of the best massagers I have seen in multiple edit bays is called the GoFit Muscle Hook.

Luckily for us it’s available at almost any Target or on the Target website for about $25. It’s an alien-looking device that can dig deep into your shoulder blades, neck and back. You can use it a few different ways — large hook for middle-of-the-back issues, smaller hook that I like to use on the neck and upper back, and the neck massage on the bar (that one feels a little weird to me).

There are other massage devices similar to the Muscle Hook, but in my opinion the GoFit Muscle Hook is the best. The plastic composite seems indestructible and almost feels like it could double as a self-defense tool. But it can work out almost any knots you have worked up after a long day. If you don’t buy anything else for self-care, buy the Muscle Hook. You will be glad you did. Anyone who gets one has that look of pain and relief when they use it for the first time.

Foam Roller
Another item that I just started using is a foam roller. You can find them almost anywhere, but I found one on Amazon for $13.95 plus free Amazon Prime one-day shipping. It’s also available on the manufacturer’s website for about $23. Simply put, it’s a high-density foam cylinder that you roll on top of. It sounds a little silly, but once you get one, you will really wonder how you lived without it. I purchased an 18-inch version, but they range from 12 inches to 36 inches. And if you have three young sons at home, they can double as fat lightsabers (but they hurt, so keep an eye out).

Summing Up
In the end, there are so many ways to try keeping a flexible editing lifestyle, from kettlebells to stand-up desks. I’ve found that just getting over the mental hurdle of not wanting to move is the biggest catalyst. There are so many great tech accessories for workstations, but we hardly mention ones that can keep our bodies moving and our creativity flowing. Hopefully, some of these ergonomic accessories for your workstation will spark an idea to move around and get your blood flowing.

For some workout inspiration, Onnit has some great free workouts featuring weird stuff like maces, steel clubs and sandbags, but also kettlebells. The site also has nutritional advice. For foam roller stretches, I would check out the same Onnit Academy site.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Workstations and Visual Effects

By Karen Moltenbrey

A couple of decades or so ago, studios needed the exceptional power of a machine offered by the likes of SGI for complex visual effects. Non-specialized PCs simply were not cut out for this type of work. But then a sea change occurred, and suddenly those big blue and purple boxes were being replaced with options in the form of workstations from companies such as Sun, DEC, HP, IBM and others, which offered users tremendous value for something that could conveniently fit on a desktop.

Those hardware companies began to duke it out, leading to the demise of some and the rise of others. But the big winners in this early war for 3D content creators’ business were the users. With a price point that was affordable, these workstations were embraced by facilities big and small, leading to an explosion of 3D content.

Here, we look at two VFX facilities that have taken different approaches when it comes to selecting workstations for their digital artists. NYC’s Bonfire, a boutique studio, uses a range of Boxx and custom-built machines, along with iMacs. Meanwhile, Digital Domain, an Oscar-winning VFX powerhouse, recently set up a new site in Montreal outfitted with Dell workstations.

Dave Dimeola

Bonfire
Bonfire is a relative newcomer to the industry, founded three years ago by Flame artist Brendan O’Neil, who teamed up with Dave Dimeola to get the venture off the ground. Their goal was to create a boutique-style base for those working in post production. The front end would comprise a beautiful, comfortable space where Autodesk Flame and motion graphics artists could work and interact with clients in comfortable suites within a townhouse setting, while the back end would consist of a cloud-based pipeline.

“We figured that if we combined these two things — the traditional boutique shop with the client experience and the power of a cloud-based pipeline — we’d have something,” says Dimeola, whose prior experience in leveraging the cloud proved invaluable in this regard.

Soon after, Peter Corbett, who had sold his Click 3X creative digital studio in 2018, agreed to be part of Bonfire’s advisory board. Believing Dimeola and O’Neil were on to something, Corbett came aboard as a partner and brought “some essential talent” into the company as well. Currently, Bonfire has 11 people on staff, with great talent across the gamut of production and post — from CG and creative directing to producing and more. One of the first key people whom Corbett brought in was managing director Jason Mayo.

And thanks to the company’s unconventional production pipeline, it is able to expand its services with remote teams as needed. (Currently, Bonfire has more than 3,000 vetted artists in its network, with a core group of around 150 who are constantly on rotation for work.)

“It’s a game-changer in the current climate,” Dimeola says of the company’s setup. The group is performing traditional post work for advertising agencies and direct to client. “We’re doing content, commercials, social content and brand films,” says Dimeola, “anything that requires storytelling and visual communication and is design-centric.” One of the studio’s key offerings is now color grading, handled by colorist Dario Bigi. In terms of visual effects, Dimeola notes that Bonfire can indeed make animals talk or blow things up, although the VFX work Bonfire does “is more seamless, artful, abstract and weird. We get into all those facets of creation.”

Bonfire has approximately 10 workstations at its location in New York and is expanding into the second floor. The company just ordered a new set of customized PCs with Nvidia GeForce RTX 2070 Super graphics cards and some new ultra-powerful Apple iMacs, which will be used for motion graphics work and editing. The core software running on the machines includes the major industry packages: Autodesk’s Maya, Maxon’s Cinema 4D, Side Effects’ Houdini, Autodesk’s Flame, Foundry’s Nuke and Adobe’s Creative Suite, in addition to Thinkbox’s Krakatoa for particle rendering and Autodesk’s 3ds Max for architectural work. The Cinema 4D motion graphics software and the Adobe software will run on the new iMacs, while the more render-intensive projects within Maya, Houdini and Nuke will run on the PC network.

As Dimeola points out, workstations have taken some interesting turns in the past two to three years, and Bonfire has embraced the move from CPU-based systems to ones that are GPU-based. As such, the company’s PC workstations — a mix of Boxx and custom-built machines, each with an AMD Threadripper 2950X CPU, a pair of Asus GeForce RTX 2080 Ti video cards and significant memory — are built around powerful Nvidia RTX GPUs. He attributes the shift to the changing needs of CGI rendering and lighting, which increasingly rely on GPU power.

“We still use CPU power, however, because we feel some of the best lighting is still being done in the CPU with software like [Autodesk’s] Arnold,” Dimeola contends. “But we’re flexible enough to be using GPU-based lighting, like Otoy’s OctaneRender and Maxon’s Redshift for jobs that need to look great but also move very quickly through the pipeline. Some shops own one way of rendering, but we really keep a flexible pipeline so we can pivot and render just about any way we want based on the creative, the look, the schedule. It has to be very flexible in order for us to be efficient.”

Media lives on the SAN, a communal server that is partitioned into segments: one for CGI, another for editing (Flame) and a third for color (Blackmagic DaVinci Resolve). “We partitioned a cloud section for the server, which allows us to have complete control over how we sync media with an external team,” explains Dimeola. “That’s a big part of how we share, collaborate and move assets quickly with a team inside and outside and how we scale for projects. This is unique to Bonfire. It is the innovative part of the post production that I don’t think any other shops are really talking about at this time.”

In addition to the local machines and software, Bonfire also runs applications on virtual machines in the cloud. The key, says Dimeola, is knowing how to create harmony between the internal and external infrastructures. The backbone is built on Amazon Web Services (AWS) and Google Cloud Platform (GCP) and functions much the same way as its internal pipeline does. A proprietary project tracker built by Bonfire enables the teams to manage shots and assets; it also has an array of services and tools that help the staff efficiently manage projects that vary in complexity and scale.

Brooklyn Nets

“There’s no single piece of our pipeline that’s so innovative; rather, it’s the way that we’ve configured it between our talent and infrastructure,” says Dimeola, noting that in addition to being able to take on big projects, the company is able to get its clients what they need in real time and make complete changes literally overnight. Dimeola recalls a recent project for Google requiring intensive CGI fluid simulations. The team sat with the client one day to work out the direction and was able to post fresh edits, which included rendered shots, for the client the very next morning. “[In a traditional setup], that never would have been possible,” he points out.

However, getting the best deal on the cloud services requires additional work. “We play that market like the stock market, where we’re finding the best deals and configurations based on our needs at the time,” Dimeola says, and the result is a dramatic increase in throughput. “You can ramp up a team and be rendering and working 24/7 because you’re using people in different time zones, and you’re never down a machine for rendering and working.”
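The “playing the market” idea reduces to a simple selection problem. A hypothetical sketch (instance names and hourly prices are invented for illustration, not real AWS or GCP rates) of picking the cheapest offering that meets a render job’s needs:

```python
# Hypothetical cloud-render cost shopping. The catalog below is made up
# for illustration; it does not reflect real AWS/GCP instance pricing.
offers = [
    {"name": "gpu-large", "vcpus": 16, "gpus": 2, "usd_per_hr": 1.80},
    {"name": "gpu-spot",  "vcpus": 16, "gpus": 2, "usd_per_hr": 0.55},
    {"name": "cpu-huge",  "vcpus": 64, "gpus": 0, "usd_per_hr": 1.10},
]

def cheapest(offers, min_vcpus=0, min_gpus=0):
    """Return the lowest-priced offer meeting the job's requirements."""
    ok = [o for o in offers
          if o["vcpus"] >= min_vcpus and o["gpus"] >= min_gpus]
    return min(ok, key=lambda o: o["usd_per_hr"]) if ok else None

print(cheapest(offers, min_gpus=1)["name"])    # gpu-spot
print(cheapest(offers, min_vcpus=32)["name"])  # cpu-huge
```

In practice a shop would also weigh spot-instance interruption risk against deadline pressure, but the core loop is just this kind of constrained price comparison, re-run as rates change.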

Best of all, the setup goes unnoticed by the customer. “The client doesn’t feel or see anything different,” says Dimeola. That is, with one exception: “a dramatic change in the cost of doing the work, particularly if they are requiring a lot of speed.”

Digital Domain Montreal
A longtime creative and technological force in the visual effects industry, Digital Domain has crafted a range of work spanning feature films, video games, commercials and virtual reality experiences. With global headquarters in LA, plus locations in Vancouver, Beijing, Shanghai and elsewhere around the globe, the studio has been the driving force behind many memorable and cutting-edge projects, including the Oscar-winning The Curious Case of Benjamin Button and more. In fact, Digital Domain is known for its technological prowess within visual effects, particularly in the area of realistic digital humans, recently recreating a photoreal 3D version of Martin Luther King Jr. for a groundbreaking immersive project.

Michael Quan

A year ago, Digital Domain expanded its North American footprint by opening an office in Montreal, which celebrated its grand opening this past February. The office has approximately 100 employees, with plans to expand in the future. Most of the work produced by Digital Domain is shared by its five worldwide studios, and that trend will continue with Digital Domain Montreal, particularly with the LA and Vancouver offices; it also will tackle regional projects, focusing mostly on features and episodic content.

Setting up the Montreal site’s infrastructure fell to Digital Domain’s internal IT department, including senior systems engineer Michael Quan, who helped outfit the facility with the same classes of machines that the Los Angeles and Vancouver locations use: the Dell Precision R7920 and R7910 rack workstation PCs. “All the studios share common configuration specifications,” he notes. “Having a common specification makes it tremendously easy to move resources around when necessary.”

In fact, the majority of the machines were purchased in the third quarter of 2019. Prior to that, the location was started up with resources from the facility’s other sites; because they use a common configuration, doing so did not present an issue.

Quan notes that the studio is able to cover all aspects of typical VFX production, such as modeling, rigging, animation, lighting, rotoscoping, texture painting, compositing and so forth, using the machines. And with some additional hardware, the office can also leverage those workstations for dailies review, he adds. As for the software, Digital Domain runs the typical programs: Autodesk’s Maya, Foundry’s Mari and Nuke, Chaos’ V-Ray, Maxon’s Redshift, Adobe’s Photoshop and so on, in addition to proprietary software.

Terminator: Dark Fate

As Quan points out, Digital Domain has specific requirements for its workstations, aside from the general CPU, RAM and hard drive specs. The machines must be able to handle the GPUs required by Digital Domain along with additional support devices. While that might seem obvious, each such requirement reduces the number of systems available for evaluation, he notes. Furthermore, the workstations must be rack-mountable and of a “reasonable” size (2U) to fit within the data center as opposed to deskside. Also, since the workstations are deployed in the data center, they must be manageable remotely.

“Preferably, it is commodity hardware, meaning using a vendor that is stable, has a relatively large market share and isn’t using some exotic technology,” Quan says, “so if necessary, we could source from secondary markets.” Unfortunately, the company learned this the hard way in the past by using a vendor that implemented custom power and management hardware; the vendor exited the market, leaving the studio without an option for repair and no secondary market to source defective parts.

Just how long Digital Domain retains its workstations depends on their performance effectiveness: If an artist can no longer work due to a resource inefficiency, Quan says, then a new round of hardware specification is initialized.

“The workstations we use are multi-processor-based, have a relatively high amount of memory and are capable of running the higher-performing professional GPUs that our work requires,” he says. “These days, ‘workstations’ could mean what would normally be called gaming rigs but with more memory, a top-end GPU and a high-clock-speed single processor. It just depends on what software will be used and the hardware configuration that is optimized for that.”

Lost in Space, Season 2

As Quan points out, graphics workstations have evolved to where they have the same capabilities as some low- to mid-class servers. “For example, the Dell R7910/R7920 that we are using definitely could be used as servers, since they share the same hardware capability as their server class,” he says. “It used to be that if you wanted performance, you might have to sacrifice manageability and footprint. Now there are systems deployed with one, eight and 10 GPU configurations in a relatively small footprint, which is a fully remotely manageable system in one of our data centers.” He predicts that workstations are evolving to a point where they will just be a specification. “In the near future, it will just be an abstract for us. Gone will be the days of one workstation equating to one physical system.”

According to Quan, the Montreal studio is still ramping up and has several major projects on the horizon, including feature films from Marvel, Sony, 20th Century Studios and more. Some of Digital Domain’s more recent work includes Avengers: Endgame, Lost in Space (Season 2), Terminator: Dark Fate and several others. Globally, its New Media and Digital Humans groups are doing incredible things, he notes, and the Ads/Games Group is producing some exceptional work as well.

“The workstations at Digital Domain have constantly evolved. We went from generic white boxes to Tier 1 systems, back to white boxes, and now to a more sophisticated Tier 1 data center-targeted ecosystem. With the evolutionary steps we are taking, we are iterating to a more efficient management of these resources,” Quan says. “One of the great advantages of having the workstations placed in a remote location is the security aspects. And on a more human level, the reduction of the fan noises and the beeps all those workstations would have created in the artist locations is notable.”


Karen Moltenbrey is a veteran writer, covering visual effects and post production.

Workstations and Color Grading

By Karen Moltenbrey

A workstation is a major investment for any studio. Today, selecting the right type of machine for the job can be a difficult process. There are many brands and flavors on the market, and some facilities even opt to build their own. Colorists have several tools available to them when it comes to color grading, ranging from software-based systems (which typically use a multiprocessor workstation with a high-end GPU) to those that are hardware-based.

Here, we examine the color workflow of two different facilities: Technicolor Vancouver and NBCUniversal StudioPost in Los Angeles.

[Editor’s note: These interviews were conducted before the coronavirus work limits were put in place.]

Anne Boyle

Technicolor Vancouver
Technicolor is a stalwart in the post industry, with its creative family — including VFX studios MPC, The Mill, Mr. X and Mikros — and wide breadth of post production services offered in many locations around the world. Although Technicolor Vancouver has been established for some time now, it was only within the past two years that the decision was made to offer finishing services again there, with an eye toward becoming more of a boutique operation, albeit one offering top-level effects.

With this in mind, Anne Boyle joined as a senior colorist, and immediately Technicolor Vancouver began a co-production with Technicolor Los Angeles. The plan was for the work to be done in Vancouver, with review and supervision handled in LA. “So we hit the ground running and built out new rooms and bought a lot of new equipment,” says Boyle. “This included investing in FilmLight Baselight, and we quickly built a little boutique post finishing house here.”

This shared-location work setup enabled Technicolor to take advantage of the lucrative tax credits offered in Vancouver. The supervising colorist in LA reviews sessions with the client, after which she and Boyle discuss them, and then Boyle picks up the scene and performs the work based on those conversations or notes in the timeline. A similar process occurs for the Dolby SDR deliverables. “There isn’t much guesswork. It is very seamless,” she says.

“I’ve always used Baselight,” says Boyle, “and was hoping to go that route when I got here, and then this shared project happened, and it was on a Baselight [in LA]. Happily for me, the supervising colorist, Maxine Gervais, insisted that we mirror the exact setup that they had.”

Gervais was using a Baselight X system, so that is what was installed in Vancouver. “It’s multi-GPU (six Nvidia Titan Xp cards) with a huge amount of storage,” says Boyle. “So we put in the same thing and mimicked the infrastructure in LA. They also put in a Baselight Assist station and plan to upgrade it in the coming months to make it color-capable as well.”

The Baselight X turnkey system ships with bespoke storage and processing hardware, although Technicolor Vancouver loaded it with additional storage. For the grading panels, the company went with the top-of-the-line Blackboard. The Vancouver facility also purchased the same displays as LA — Sony BVM-X300s.

Messiah

The mirrored setup was necessary for the shared work on Netflix’s Messiah, an HDR project that dropped January 1. “We had to deliver 10 episodes all at once in 4K, [along with] both the HDR PQ masters and the Dolby SDR deliverable, which were done here as well,” explains Boyle. “So we needed the capability to store all of that and all of those renders. It was quite a VFX-heavy show, too.”

Using Pulse, Technicolor’s internal cloud-based system, the data set is shared between the LA and Vancouver sites. Technicolor staff can pull down the data, and VFX vendors can pull their own VFX shots too. “We had exact mirrors of the data. We were not sending project files back and forth, but rather, we shared them,” Boyle explains. “So anyone could jump on the project, whether in Vancouver or LA, and immediately open the project, and everything would appear instantaneously.”

When it comes to the hardware itself, speed and power are big factors. As Boyle points out, the group handles large files, and slowdowns, render issues and playback hiccups are unacceptable.

Messiah

The color system proved its mettle on Messiah, which required a lot of skin retouching and other beauty work. “The system is dedicated and designed only for colorists,” says Boyle. “And the tools are color-focused.”

Indeed, Boyle has witnessed drastic changes in color workstations over the past several years. File sizes have increased thanks to 8K Red and other camera raw formats, which have driven the need for more powerful machines and more powerful GPUs, particularly with increasingly complex HDR workflows, wherein floating-point processing is necessary for good color. “More work nowadays needs to be performed on the GPU,” she adds. “You just can’t have enough power behind you.”
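The HDR PQ masters Boyle mentions end in a PQ (SMPTE ST 2084) encode for delivery, which is why floating-point precision matters upstream. As a rough illustration only (not Technicolor's actual pipeline), here is a sketch of the standard PQ encoding curve, mapping linear light to a nonlinear signal value:

```python
# Sketch of the SMPTE ST 2084 (PQ) inverse EOTF: linear light -> PQ signal.
# Illustrative only; a real grading pipeline applies this per channel in
# floating point before quantizing to 10- or 12-bit code values.

M1 = 2610 / 16384        # ST 2084 constants
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(y: float) -> float:
    """Map linear luminance y (0.0 = black, 1.0 = 10,000 nits) to PQ [0, 1]."""
    y = max(0.0, min(1.0, y))
    p = y ** M1
    return ((C1 + C2 * p) / (1 + C3 * p)) ** M2

# SDR reference white (100 nits) lands just above half signal:
# pq_encode(100 / 10000) is roughly 0.508
```

Note how steep the curve is near black: small linear values map to large signal ranges, which is exactly where integer math loses precision and floating point does not.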

NBCUniversal StudioPost
NBCUniversal StudioPost knows a thing or two about post production. Not only does the facility provide a range of post, sound and finishing services, but it also offers cutting-edge equipment rentals and custom editorial rooms used by internal and third-party clients.

Danny Bernardino

Specifically, NBCUniversal offers end-to-end picture services that include dailies, editorial, VFX, color correction, duplication and encoding/decoding, data management, QC, sound, sound editorial, sound supervision, mixing and streaming.

Each area has a plethora of workstations and systems needed to perform its given tasks. For the colorists, the facility offers two choices, both on Linux OS: a Blackmagic DaVinci Resolve 16.1.2 (fully loaded with a creative suite of plugins and add-ons) running on an HP Z840 machine, and Autodesk Lustre 2019 running on an HP Z820.

“We look for a top-of-the-line color corrector that has a robust creative tool set as well as one that is technically stable, which is why we prefer Linux-based systems,” says Danny Bernardino, digital colorist at NBCUniversal StudioPost. Furthermore, the facility prefers a color corrector that adapts to new file formats and workflows by frequently updating its versions. Another concern is that the system works in concert with all of the ever-changing display demands, such as UHD, 4K, HDR and Dolby Vision.

Color bay

According to Bernardino, the color systems at NBCUniversal are outfitted with the proper CPU/GPU and SAN storage connectivity to ensure efficient image processing, thereby allowing the color talent to work without interruption. The color suites also are outfitted with production-level video monitors that represent true color. Each has high-quality scopes (waveform, vector and audio) that handle all formats.

When it comes time to select machines for the colorists there, it is a collective process, says senior VP Thomas Thurau. First, the company ascertains the delivery requirements, and then the color talent, engineering and operations staff work together to configure the proper tool sets for the customers’ content. How often the equipment is replaced is contingent on whether new image and display technology has been introduced.

Thurau defines a solid colorist workstation as a robust platform that is Linux-based and has enough card slots or expansion chassis capabilities to handle four or more GPU cards, Fibre Channel cards and more. “All of our systems are in constant demand, from compute to storage, thus we look for systems and hardware that are robust through to delivery,” he notes.

Mr. Robot

NBCUniversal StudioPost is always humming with various work. Some of the more recent projects there include Jaws, which was remastered in UHD/HDR, Casino (UHD/HDR), the How to Train Your Dragon series (UHD/HDR) and an array of Alfred Hitchcock’s more famous films. The company also services broadcast episodic (NBCU and others) and OTT/streaming customers, offering a full suite of services (Avid, picture and sound). These include Law & Order: SVU, Chicago Med, Will & Grace, Four Weddings and a Funeral and Mr. Robot, among others.

“We take incredible pride in all aspects of our color services here at NBCUniversal StudioPost, and we are especially pleased with our HDR grades,” says Thurau.

For those who prefer to do their own work, NBCUniversal has over 185 editorial rooms, ranging from small to large suites, set up with Avid Media Composer.


Karen Moltenbrey is a veteran writer, covering visual effects and post production.

What Makes a Workstation?

By Mike McCarthy

Computer manufacturers charge a premium for their highest-end “workstation” systems, but many people don’t fully understand what really distinguishes a workstation-class system from any other computer. Admittedly, there is no cut-and-dried line, but workstations usually have a few characteristics that make them more suitable for professional applications than regular home or office PCs. They are usually faster, have a greater level of expandability and are more reliable than other PCs. This, of course, makes them more expensive, but depending on what you need them for, they can be well worth the additional cost.

Workstation Graphics
Nearly all workstations offer professional-level, OpenGL-optimized graphics cards at a time when having any discrete GPU at all is becoming rare outside of gaming systems. Nvidia’s Quadro cards and AMD’s Radeon Pro line have more RAM than their gaming counterparts, and their drivers are optimized for professional applications. High-bit-depth color processing used to be the other defining characteristic of professional graphics cards, but HDR imaging has pushed 10-bit color support into consumer GPUs, removing that as a differentiating factor.

Scalability
Most workstations offer greater expandability in the form of more slots for RAM and PCIe cards and more storage and networking options. This allows more flexibility in configuring a system for a specific task or application. Editors need lots of storage, animators need lots of RAM, and VFX artists might need more CPU power. They might all use the same model of workstation but in totally different configurations. The extra card slots also allow hardware upgrades for dedicated tasks: editors might install a video I/O card for SDI interfaces, a sound mixer might need Avid Pro Tools processing cards, and many users will need high-bandwidth network cards (running at 10 gigabits or more) to share data with others they are working with.

Different Classes of Workstation
There are also a variety of classes of workstations available, depending on your budget and needs. Top-end workstations have dual-CPU sockets (and in rare cases, four or more sockets), multiplying the potential processing power and aggregate bandwidth. These systems are adapted from server architectures, with a few changes to improve interactive performance and expansion options. They offer many channels and slots for maximum memory capacity and throughput. Intel has had its Xeon scalable processors in this market for many years, while AMD has recently introduced its EPYC processor line into this segment.

Below that top tier come high-end desktop systems, which offer a single CPU (possibly with up to 32 cores), four to six channels of memory and many PCIe lanes. Intel serves this range with its Core X and Xeon W CPUs, while AMD offers its Threadripper line.

At a lower performance level, some workstations are based on the same CPUs as gaming systems. These systems have much less powerful chipsets, with fewer PCIe lanes for expansion and only two channels of memory, but they still offer very good performance on smaller projects at lower prices. Intel’s Core CPUs and AMD’s Ryzen CPUs fall into this category, with up to eight and 16 cores, respectively. These systems can handle single-threaded workloads as well as higher-end options, but for applications that are well threaded, or when running many tasks at once, the higher-end systems will have a definite benefit.
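The distinction between single-threaded and well-threaded workloads is easy to demonstrate. The sketch below is a generic illustration (not tied to any particular post application): a CPU-bound job split across worker processes, which is the kind of work pattern that actually benefits from a higher core count.

```python
# Generic sketch: a well-threaded (here, multi-process) CPU-bound workload.
# On a high-core-count workstation the chunks run concurrently; a purely
# single-threaded task would see no benefit from the extra cores.
from concurrent.futures import ProcessPoolExecutor

def render_chunk(frame_range: range) -> int:
    """Stand-in for per-frame work (e.g. rendering); returns a checksum."""
    return sum(f * f for f in frame_range)

def render_parallel(total_frames: int, workers: int = 4) -> int:
    step = total_frames // workers
    chunks = [range(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = range((workers - 1) * step, total_frames)  # absorb remainder
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(render_chunk, chunks))

if __name__ == "__main__":
    # Same result either way; the parallel version just uses more cores.
    assert render_parallel(10_000) == render_chunk(range(10_000))
```

The chunk names and sizes here are invented for illustration; the point is that throughput scales with cores only when the work divides this way, which is why render queues reward the higher tiers of workstation.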

Mobile Workstations
Separately, there are also mobile workstations, which are top-end laptop units. These are usually defined by having professional GPUs and — more recently — in many cases by having mobile Xeon CPUs. They usually have lots of RAM and very good integrated display options, occasionally with integrated calibration systems. They use NVMe storage, but that is no longer unique to workstations. They usually have more ports available than consumer systems and a wider variety of configuration options. Many models also have Mil-Spec ruggedness to protect them from damage in the field.

There are also a number of other unique workstation offerings, from all-in-one systems similar to the iMac Pro to tablets and VR backpacks. The one thing these all have in common is that they are designed for professional users and applications that have high processing workloads on either the CPU or GPU.

On the upper end, it is easy to see what makes a workstation different, with dual-socket Xeon processors, high core counts, RAM measured in terabytes, ECC for stability and RAID-based storage controllers for increased bandwidth and security. But what about “low-end” workstations? If low-end workstation core counts and RAM capacity are similar to a high-end gaming system, then what other features do the lower-end workstations bring to the table?

Reliability
Most large-scale workstation manufacturers have invested more engineering and testing time into workstation products. This includes better thermal components to allow the systems to run cooler and quieter under larger loads. Companies like Dell, HP and Lenovo all include their own software for optimizing and tweaking the system for maximum performance with various supported applications. They also work with software companies to certify various configurations to guarantee support for specific applications. All this effort should make workstations more reliable and less likely to crash or error out during important tasks.

Thermal Engineering
Most computers aren’t designed to be run at maximum performance for long periods of time, as many applications aren’t that taxing. For ones that are, many users take breaks, allowing the system to cool down, but an editor might kick off a render queue before heading home, and the system is processing at maximum capacity for the rest of the night or weekend. Cheaply built systems will heat up quickly and then throttle back the performance to prevent overheating, slowing down the task at hand. Workstations are engineered to carry on those intense computing tasks for greater periods of time without exceeding their thermal envelope. And even when not operating at peak processing performance, many workstations are designed to run much quieter, allowing their users to think more clearly or better hear the audio associated with the tasks they are working on.

Windows for Workstations
Microsoft recently released a version of Windows 10 targeted at high-end power users. It supports more CPUs and RAM than Windows 10 Pro, broader storage options and faster networking protocols, among other features. This difference in software support might further differentiate workstation-class systems in the future.

Mac Workstations
Apple has offered a workstation to its high-end users in the form of the Mac Pro. The original “cheese grater” silver tower had Xeon CPUs, ECC memory and a limited number of PCIe slots for expansion. This was replaced by the “trash can” black cylinder Mac Pro, which was, arguably, not a proper workstation. It didn’t have PCIe slots for expansion, it didn’t have hard drive slots for storage, and, most importantly, it didn’t have the thermal engineering to sustain high-performance workloads for an extended period of time. But it was the best Apple had to offer for many years, putting the trash can into places it never otherwise would have been and was not designed for.

The new Mac Pro tower (or rack) has returned a true workstation to Apple’s product portfolio. With a single-socket Xeon CPU, it sits at the peak of the mid-level workstation tier. With more slots than any other Mac ever, it is fully expandable and upgradable. (Even the I/O header can be replaced in the future.) While it would be possible for Apple to release a more powerful dual-socket option in the future, I doubt it will do so because the current Mac Pro should meet the needs of 99% of potential users due to how much multi-core CPUs have improved in the last few years.

Workstations in the Future
I expect the trend of users moving from top-end dual-socket systems to maxed out mid-level systems to continue in both the PC and Mac world as increases in maximum processing performance (and price) exceed the increases in workloads in most workstation tasks. This should increase the market for mid-level workstations, eventually increasing the options available and decreasing their price. We also see the lines blurring between mobile workstations and gaming laptops as the GPUs and drivers become more standardized between them. It will also be interesting to see what impact Intel and Micron’s new Optane persistent memory architecture has on workstations and their applications. Someday soon we might see integrated network interfaces that are faster than 1Gb, which has been standard since 2004. Until then, we will still be using cards to upgrade our workstations to the capabilities we need them to have for the tasks we need to accomplish, which is what they were designed for.

Quick Chat: Scholar’s Will Johnson and William Campbell

By Randi Altman

In celebrating its 10th anniversary, animation and design company Gentleman Scholar has relaunched as Scholar and has put a new emphasis on its live-action work. Started by directors/partners William Campbell and Will Johnson in Los Angeles, the company has grown over the years and now boasts a New York City location as well.

Recent Scholar projects include the animated Timberland Legends Club spot, the live-action and animated Porsche Pop Star and the live-action Acura TLX.

Considering their new name change and website rebrand, we decided to reach out to “The Wills” to find out more about Scholar’s work philosophy and what this change means to the company.

Audi Q3

Why did you decide to rename and relaunch as Scholar?
Will Johnson: After 10 years, it felt like a good time to redefine how the world views us. Not only as a one-stop shop that can handle all of your design and animation needs, but also as a live-action and storytelling powerhouse.

Will Campbell: The new name evokes cleanliness and sophistication and better represents how we have evolved. Gentleman Scholar was fun, quirky and playful. We’re still all of those things, but we feel like we’ve also become more cinematic, more polished and better collaborators who understand production more clearly… which allows us to navigate the industry better as a whole.

Even when it comes to live action and carrying our film into post, we can assess solutions on set more quickly and fluidly, understanding the restrictions or additions we can take with us into the software. Scholar has changed immensely over the past 10 years. We have grown up and become smarter, faster and better. The rebrand is a window to who we have already become and who we plan to be.

How is the business different, and what’s stayed the same?
Johnson: It’s more refined. We’ve learned a lot about how to conduct ourselves in a competitive art world — the positive ways that we approach each project and allowing the stress of the job to kick us in the ass but not let it guide the decisions we make. It’s also about being patient with our team as well as our own decision-making.

Creativity is a process, and “turning it on” every day isn’t always easy. Understanding that not every idea you have is a great idea and how to be comfortable with your creative self is important. To trust in the “why” you are making something versus the “what” that you make. And that’s reflected in the new company name and our new website design. It’s the same us. The same wild bunch of creative explorers intent on pushing the boundaries of design and live action. We are just more certain of who we are and the stories we tell, and therefore more inclusive in our path to get there.

Acura

Campbell: We now have a decade’s worth of work to back up our thoughts and collaborations. This is enormous when you need to show how capable you are, not just in the standard we hold ourselves to visually, but in the quality and sophistication of our evolving storytelling. We have fine-tuned our production processes, enabling the pipelines of our edit, animation, CG and composite teams to more easily embrace the techniques and tools we use to craft the stories we want to tell… so we can be more decisive with the concepts we put on the table. From the software to the hardware, we are more refined.

Can you talk about how the industry has changed over the past 10 years?
Johnson: It’s more spread out than it’s ever been. There is more content that reaches more eyes in more places. From social to OOH to broadcast, the need to pull everyone together and create something that speaks to everyone all at once feels like it’s stronger and more apparent than before. And we’ve seen it all at this point, from vertical campaigns to entirely experiential ones. The era of “do more with less” is here.

Campbell: For us, we were very young when we opened Scholar. We were in our 20s, and everything was a fire drill and we thrived off the chaos. We have learned to harness the inspiration that comes with chaos and channel it into focused, productive creation.

Have you embraced working in the cloud — storage, rendering, review and approval, etc. — and if so, in what way?
Johnson: Yes. We know it’s a fast-paced world, and in the current climate, the globe generally is embracing a cloud-based way of thinking. Luckily, we have an amazing team of technologists, so we can tap into our home-base server from anywhere at any time. From rendering to storage to reviews and approvals, it keeps us all united, focused and organized when we’re moving a million miles a minute in any direction.

Campbell: Scholar has been testing the technology as it is getting better and cheaper, but we are always balancing convenience versus security, and those swing on a job-by-job basis. We’ve written tools to take advantage of storage and rendering resources on both coasts and use Aspera to facilitate file syncing between each office.

Can you talk about the tools you use for your work?
Johnson: The tangible ones are the usual suspects. Adobe’s Creative Suite and 3D tools like Autodesk Maya, Maxon Cinema 4D, Foundry Nuke and all of the animation and time-based ones, like Adobe Premiere and Avid Media Composer. But my favorite tools tend to be the brains and skills of our team… the words on paper and the channeling of art and thought into something tactile. As creators, we lust to make things, and seeing that circuit board of craft and making is something amazing to watch.

Campbell: Scholar has always been a mixed-media studio. We love getting our hands dirty with new software or cameras. We fundamentally want to do what’s right for the job and not rest inside our comfort zone. Thinking about what style is right for a client, not “how do I make my style fit,” is just how we are wired. The tool is always a means to an end. My favorite jobs are the ones where the technique is invisible, and it’s all about the experience.

We are operating in an entirely new world these days with the coronavirus and working remotely. How are you guys embracing the change?
Campbell: With an office on each coast, we have already had to learn to work as a team remotely. The years of unifying groups from a distance and finding ways for technology to bring artists closer together has set the stage for us right now. We have transitioned our workforce to 100% remote. It’s early days yet, but everyone is in good spirits, and we feel as connected as ever, although I do miss our lunch table.

Johnson: We’re definitely thankful for the staff and talent that we surround ourselves with and how they’ve handled their work-from-home routines. The check-ins, the mind melds and the daily (hourly) hangouts have helped. We’re using the change in the world as an opportunity to showcase our adaptability — how we can scale up and down even in the remote world — as a way to continue to grow our relationships and push the creative boundaries.

As people who find it hard to simply sit still, we’ve changed how we approach and talk about a project as each script comes in. The conversations about techniques are important — how we look at animation with a live-action lens, how 2D can become 3D, or vice versa. We’re more easily adaptable and change purely out of the need to discover what’s new.

Main Image: (L-R) Will Johnson and Will Campbell

Colorist Chat: Framestore LA senior colorist Beau Leon

Veteran colorist Beau Leon recently worked with director Spike Jonze on a Beastie Boys documentary and a spot for cannabis retailer MedMen.

What’s your title and company?
I’m a senior colorist at LA’s Framestore.

Spike Jonze’s MedMen

What kind of services does Framestore offer?
Framestore is a multi-Oscar-winning creative studio founded over 30 years ago, and the services offered have evolved considerably over the decades. We work across film, television, advertising, music videos, cinematic data visualization, VR, AR, XR, theme park rides… the list is endless and continues to change as new platforms emerge.

As a colorist, what would surprise people the most about what falls under that title?
Regardless of the creative direction or the equipment used to shoot something, whether it be for film or TV, people might not factor in how much color or tone can dictate the impact a story has on its audience. As a colorist, my role often involves acting as a mediator of sorts between various creative stakeholders to ensure everyone is on the same page about what we’re trying to convey, as it can translate differently through color.

Are you sometimes asked to do more than just color on projects?
Earlier in my career, the process was more collaborative with DPs and directors who would bring color in at the beginning of a project. Now, particularly when it comes to commercials with tighter deadlines and turnarounds, many of those conversations happen during pre-production without grading factored in until later in the pipeline.

Rihanna’s Needed Me

Building strong relationships and working on multiple projects with DPs or directors always allows for more trust and creative control on my end. Some of the best examples I’ve seen of this are on music video projects, like Rihanna’s Needed Me, which I graded here at Framestore for a DP I’d grown up in the industry with. That gave me the opportunity to push the creative boundaries.

What system do you work on?
FilmLight Baselight.

You recently worked on the new Beastie Boys documentary, Beastie Boys Story. Can you talk a bit about what you did and any challenges relating to deadlines?
I’ve been privileged to work with Spike Jonze on a number of projects throughout my career, so going into Beastie Boys Story, we already had a strong dialogue. He’s a very collaborative director and respectful of everyone’s craft and expertise, which can be surprisingly rare within our industry.

Spike Jonze’s Beastie Boys Story

The unique thing about this project was that, with so much old footage being used, it needed to be mastered in HDR as well as reworked for IMAX. And with Spike being so open to different ideas, the hardest part was deciding which direction to choose. Whether you’re a hardcore Beastie Boys fan or not, the documentary will be well worth watching when it airs on Apple TV+ in April.

Any suggestions for getting the most out of a project from a color perspective?
As an audience, our eyes have evolved a great deal over the last few decades. I would argue that most of what we see on TV and film today is extremely oversaturated compared to what we’d experience in our real environment. I think it speaks to how we treat consumers and anticipate what we think they want — colorful, bright and eye-catching. When it’s appropriate, I try to challenge clients to think outside those new norms.

How do you prefer to work with the DP or director?
Whether it’s working with a DP or director, the more involved I can be early on in the conversation, the more seamless the process becomes during post production and ultimately leads to a better end result. In my experience, this type of access is more common when working on music videos.

Most one-off commercial projects see us dealing with an agency more often than the director, but an exception to the rule that comes to mind is on another occasion when I had the chance to collaborate on a project with Spike Jonze for the first ever brand campaign for cannabis retailer MedMen called The New Normal. He placed an important emphasis on grading and was very open to my recommendations and vision.

How do you like getting feedback in terms of the look?
A conversation is always the best way to receive feedback versus a written interpretation of imagery, which tends to become very personal. An example might be when a client wants to create the feeling of a warm climate in a particular scene. Some might interpret that as adding more warm color tones, when in fact, if you think about some of the hottest places you’ve ever visited, the sun shines so fiercely that it casts a bright white hue.

What’s your favorite part of the job?
That’s an easy answer — to me, it’s all about the amazing people you meet in this industry and the creative collaboration that happens as a result. So many of my colleagues over the years have become great friends.

Any least favorites?
There isn’t much that I don’t love about my job, but I have witnessed a change over the years in the way that our industry has begun to undervalue relationships, which I think is a shame.

If you didn’t have this job, what would you be doing instead?
I would be an art teacher. It combines my passion for color and visual inspiration with a forum for sharing knowledge and fostering creativity.

How early did you know this would be your path?
In my early 20s, I started working on dailies (think The Dukes of Hazzard, The Karate Kid, Fantasy Island) at a place in The Valley that had a telecine machine that transferred at a frame rate faster than anywhere else in LA at the time. It was there that I started coloring (without technically realizing that was the job I was doing, or that it was even a profession).

Soon after, I received a call from a company called 525 asking me to join them. They worked on all of the top music videos during the prime “I Want My MTV” era, and after working on music videos as a side hustle at night, I knew that’s where I wanted to be. When I first walked into the building, I was struck by how much more advanced their technology was and immediately felt out of my depth. Luckily, someone saw something in me before I recognized it within myself. I worked on everything from R.E.M.’s “Losing My Religion” to TLC’s “Waterfalls” and The Smashing Pumpkins’ “Tonight, Tonight.” I found such joy in collaborating with some of the most creative and spirited directors in the business, many of whom were inspiring artists, designers and photographers in their spare time.

Where do you find inspiration?
I’m lucky to live in a city like LA with such a rich artistic scene, so I make a point to attend as many gallery openings and exhibitions as I can. Some of my favorite spaces are the Annenberg Space for Photography, the Hammer Museum and Hauser & Wirth. On the weekends I also stop by Arcana bookstore in Culver City, where they source rare books on art and design.

Name three pieces of technology you can’t live without.
I think I would be completely fine if I had to survive without technology.

This industry comes with tight deadlines. How do you de-stress from it all?
After a long day, cooking helps me decompress and express my creativity through a different outlet. I never miss a trip to my local farmer’s market, which also helps to keep me inspired. And when I’m not looking at other people’s art, I’m painting my own abstract pieces at my home studio.

A Closer Look: Delta Soundworks’ Ana Monte and Daniel Deboy

Delta Soundworks was co-founded by Ana Monte and Daniel Deboy back in 2016 in Heidelberg, Germany. This 3D/immersive audio post studio’s projects span installations, virtual reality, 360-degree films and gaming, as well as feature films, documentaries, TV shows and commercials. Its staff includes production sound mixers, recording engineers, sound designers, Foley artists, composers and music producers.

Below the partners answer some questions about their company and how they work.

How did Delta come about?
Ana Monte: Delta Soundworks grew from the combination of my creative background in film sound design and Daniel’s high-level understanding of the science of sound. I studied music industry and technology at California State University, Chico, and I earned my master’s degree in film sound and sound design at the Film Academy Baden-Württemberg, here in Germany.

Daniel is a graduate of the Graz University of Technology, where he focused his studies on 3D audio and music production. He was honored with a Student Award from the German Acoustical Society (DEGA) for his research in the field of 3D sound reproduction. He has also received gold, silver and bronze awards from the Audio Engineering Society (AES) for his music recordings.

Can you talk about some recent projects?
Deboy: I think our biggest current project is working for The Science Dome at the Experimenta, a massive science center in Heilbronn, Germany. It’s a 360-degree theater with a 360-degree projection system and a 29-channel audio system, which is not standard. We create the entire sound production for all the theater’s in-house shows. For one of the productions, our composer Jasmin Reuter wrote a beautiful score, which we recorded with a chamber orchestra. It included a lot of sound design elements, like rally cars. We put all these pieces together and finally mixed them in a 3D format. It was a great ride for us.

Monte: The Science Dome has a very unique format. It’s not a standard planetarium, where everyone is looking up and to the middle, but rather a mixture of theater and planetarium, wherein people look in front, above and behind. For example, there’s a children’s show with pirates who travel to the moon. They begin in the ocean with space projected above them, and the whole video rotates 180 degrees around the audience. It’s a very cool format and something that is pretty unique, not only in Europe, but globally. The partnership with the Experimenta is very important for us because they do their own productions and, eventually, they might license them to other planetariums.

With such a wide array of projects and requirements, tell us about your workflow.
Deboy: Delta is able to quickly and easily adjust to different workflows because we are, or at least love to be, at the edge of what’s possible. We are always happy to take on new and interesting projects, try out new workflows and designs, and look at up-and-coming techniques. I think that’s kind of a unique selling point for us. We are way more flexible than a typical post production house would be, and that includes our work for cinema sound production.

What are some tools you guys use in your work?
Deboy: Avid Pro Tools Ultimate, Reaper, Exponential Audio, iZotope RX 6 and Metric Halo 2882 3D. We’ve also had a license for Nugen Halo Upmix for a while, and we’ve been using it quite a bit for 5.1 production. We rely on it significantly for the Experimenta Science Dome projects because we also work with a lot of external source material from composers who deliver it in stereo format. Also, the Dome is not a 5.1/7.1 theater; it’s 29 channels. So, Upmix really helped us go from a stereo format to something that we could distribute in the room. I was able to adjust all my sources through the plugin and, ultimately, create a 3D mix. Using Nugen, you can really have fun with your audio.

Monte: I use Nugen Halo Upmix for sound design, especially to create atmosphere sounds, like a forest. I plug in my source and Upmix just works. It’s really great; I don’t have to spend hours tweaking the sound just to have it only serve as a bed to add extra elements on top. For example, maybe I want an extra bird chirping over there and then, okay, we’re in the forest now. It works really well for tasks like that.
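Halo Upmix’s actual algorithm is proprietary, but the underlying idea of spreading a stereo signal across many speakers can be sketched in a few lines. The following Python/NumPy toy (the channel count, speaker angles and gain falloff are all illustrative assumptions, not how the plugin works) pans each stereo channel onto a ring of speakers with constant-power gains:

```python
import numpy as np

def naive_upmix_gains(n_channels, source_deg):
    """Constant-power gains spreading a source at `source_deg`
    across n_channels speakers spaced evenly on a circle.
    Each speaker's weight falls off with angular distance from the
    source; the weights are normalized so summed power stays at 1."""
    speakers = np.arange(n_channels) * 360.0 / n_channels
    # smallest angular distance between the source and each speaker
    d = np.abs((speakers - source_deg + 180.0) % 360.0 - 180.0)
    w = np.maximum(0.0, 1.0 - d / 90.0)       # linear falloff, 90-degree spread
    return w / np.sqrt(np.sum(w ** 2))        # constant-power normalization

def upmix_stereo(stereo, n_channels):
    """stereo: (samples, 2) array -> (samples, n_channels) array.
    Left is treated as a source at -30 degrees, right at +30."""
    gl = naive_upmix_gains(n_channels, -30.0)
    gr = naive_upmix_gains(n_channels, +30.0)
    return stereo[:, [0]] * gl + stereo[:, [1]] * gr
```

A real upmixer adds ambience extraction, frequency-dependent processing and decorrelation on top of this, but the constant-power normalization is the same principle that keeps overall loudness stable as a stereo source is distributed across a 29-channel room.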

Behind the Title: Akkurat Studios director Andreas Roth

Originally from Hamburg, Andreas Roth is a graduate of the Filmakademie Baden-Württemberg. He began his directing career at 21 and gained momentum with a film for Dirt Devil that went viral, accumulating over 30 million views on Vimeo and garnering an AICP Show honor that placed the piece in the permanent collection at the Museum of Modern Art in NYC.

NAME: Director Andreas Roth

COMPANY: Akkurat Studios

CAN YOU DESCRIBE YOUR COMPANY?
Akkurat Studios is a creative label that sprang out of Berlin and Los Angeles. We act as an artist-driven production and publishing company, managing talent from a variety of disciplines. We work side by side with brands and agencies to create visionary projects.

Dirt Devil

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE OF DIRECTOR?
Being a psychologist, because you always deal with a bunch of different characters and people. It all comes down to communication; the better that works, the better the results.

WHAT’S YOUR FAVORITE PART OF THE JOB?
I guess it’s the variety the job brings — you always meet new and interesting people. It’s a dream scenario.

WHAT’S YOUR LEAST FAVORITE?
Client or agency politics — when it’s less about the film or final result; instead it’s about marketing tests, numbers and guidelines.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I guess with opening our own shop, Akkurat Studios, I kind of brought all the things I like into one place, meaning: direction, creative direction, publishing (Akkurat Journal), photography, producing and traveling (even though world events at the moment won’t allow that).

L-R: Partners Rocco Kopecny and Andreas Roth

My business partner, Rocco Kopecny, and I have a lot of plans for the upcoming months, which will all carry the Akkurat label but venture into slightly different territories outside the film and advertising jungle.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
In art class in high school, I fell in love with editing and picked up a camera. I guess after writing and shooting my first client commercial at the age of 21, I felt that could be a future job. I created the idea with a good friend of mine — we just pitched it directly to the client and got a cinema release — without much knowledge upfront.

WHAT WAS IT ABOUT DIRECTING THAT ATTRACTED YOU?
I guess I liked the collaboration aspect of things — you need to bring in the best people to realize great results. In the end, a good director knows who he needs around.

WHAT IS IT ABOUT DIRECTING THAT CONTINUES TO KEEP YOU INTERESTED?
You always learn something on each and every project, especially staging and how to work with actors.

HOW DO YOU PICK THE PEOPLE YOU WORK WITH ON A PARTICULAR PROJECT?
I like to work with people I know; it’s kind of a family vibe and makes life easier. From time to time I like to mix things up — normally I’m drawn in by someone’s work online, usually via Instagram.

HOW DO YOU WORK WITH YOUR DP?
The visuals are really important to me. That’s why I like to work closely with my DP — talking about lenses, camera movement and the overall look we aim to achieve. Moods are really helpful and I always create mood boards or even edit short mood films.

DO YOU GET INVOLVED WITH THE POST AT ALL?
I like to be part of the edit if possible because it’s the final stage where you tell the story.

Bucherer

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Lately I’ve been focused on opening our shops in Berlin and Los Angeles, which took some time. Last year I shot a commercial for Bucherer, a Swiss luxury brand. We also produced it with Akkurat.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
I have a few: Dirt Devil, because we really pushed on this one to make it what it is. Herbaria — shooting at Pinewood’s Underwater Stage was a great experience. Also the O’Neill project with big wave surfer Mark Mathews, because it was such an intimate time — just him, the DP and me.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
Sadly, my phone, headphones and laptop. Without those three it’s tricky to survive in the industry. (laughs)

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Tell me! Do you have a secret? For me, it’s normally sports, reading a good book or traveling.

Showrunner Derek Simonds talks USA Network’s The Sinner

By Iain Blair

Three years ago, USA Network’s Golden Globe- and Emmy-nominated series The Sinner snuck up behind viewers, grabbed them by the throat and left them gasping for air while they watched a seemingly innocent man stabbed to death at the beach. The second season pulled no punches either, focusing on the murder of a couple by a young boy.

Derek Simonds (right) on set with Bill Pullman.

Now the anthology is back with a third installment, which once again centers around Detective Harry Ambrose (Bill Pullman) as he begins a routine investigation of a tragic car accident on the outskirts of Dorchester, in upstate New York. Piece by piece, Ambrose gradually uncovers a hidden crime that pulls him into another dangerous and disturbing case focusing on Jamie Burns (Matt Bomer), a Dorchester resident, high school teacher and expectant father. The Season 3 finale airs at the end of the month on USA Network.

Also back is the show’s creator and showrunner Derek Simonds, whose credits include ABC’s limited series When We Rise and ABC’s 2015 limited series The Astronaut Wives Club. He has developed television pilots; wrote, directed and composed the score for his feature film Seven and a Match; and has developed many independent film projects as a writer/producer, including the Oscar-nominated Sony Pictures Classics release Call Me by Your Name.

I recently spoke with Simonds about making the show — which is executive-produced by Jessica Biel (who starred in Season 1) and Michelle Purple through their company Iron Ocean — the Emmys, and his love of post.

Bill Pullman as Detective Lt. Harry Ambrose in “Part II,” Episode 302. (Photo: Peter Kramer/USA Network)

When you created this show, did you always envision it as a tortured human drama, a police procedural, or both?
(Laughs) Both, I guess. My previous writing and development work for film and TV was never procedural-oriented. The opportunity with this show came with the book being developed and Jessica Biel being attached. I was one of many writers vying for the chance to adapt it, and they chose my pitch. The reason the book and the bones of the show appealed to me was the “whydunnit” aspect at the core of Season 1. That really sold me, as I wasn’t very interested in doing a typical procedural mystery or a serial killer drama that was really plot-oriented.

The focus on motive and what trauma could have led to such a rash act — Cora (in Season 1) stabbing the stranger on the beach — that is essentially the mystery, the psychological mystery. So I’ve always been character-oriented in my writing. I love a good story and watching a plot unfold, but really my main interest as a writer and why I came onto the show is because it delves so deeply into character.

Fair to say that Season 3 marks a bit of a shift in the show?
Yes, and I’d say this season is less of a mystery and more of a psychological thriller. It really concerns Detective Harry Ambrose, who’s been in the two earlier seasons, and he encounters this more mundane event — a fatal, tragic car crash — but still just a car crash, something that happens all the time.

As he starts looking into it, he realizes that the survivor is not telling the whole story, and that some of the details just don’t add up. His intuition makes him look deeper, and he ends up getting into this relationship that is part suspect, part detective, part pursuer, part pursuee and part almost-friendship with this character played by Matt Bomer.

It also seems more philosophical in tone than the previous two seasons.
I think you’re right. The idea was born out of thinking about Dostoevsky and questions about “why do we kill?” Could it be for philosophical reasons, not just the result of trauma? Could it be that kind of decision? What is morality? Is it learned or is it invented? So there were all these questions and ideas, and I was also very excited to create a male character — not a helpless child or a woman, not someone so innocent — as the new character and have that character reflect Ambrose’s darker side and impulses back to him. So there was this doppelganger-y twinning going on.

Where do you shoot?
All out of New York City. We have stages in Brooklyn where we have our sets, and then we do a lot of location work all over the city and also just outside in Westchester and Rockland counties. They offer us great areas where we can cheat a more bucolic setting than we’re actually in.

It has more of a cinematic feel and look than most TV shows.
Thank you for noticing! As the creator, I’m biased about that, but I spend a lot of time and energy with my team and DP Radium Cheung and designers to really try and avoid the usual TV tropes and clichés and TV-style lighting and shooting every step of the way.

We try to think of every episode as a little film. In fact, every season is like a long film, as they’re stand-alone stories, and I embark on each season like, “OK, we’re making a 5½-hour movie,” and all the decisions we make are kind of holistic.

Do you like being a showrunner?
It’s a great privilege to be able to tell a story and make the decisions about what that story says, and to be able to make it on the scale that we do. It’s totally thrilling, and I love having a moment at the podium to talk about the culture and what’s on my mind through the characters. But I think it’s also one of the hardest jobs you could ever have. In fact, it’s really like having four or five jobs rolled into one, and it’s really, really exhausting, as you’re running between them at all times. So there’s this feeling that you never have enough time, or enough time to think as deeply in every area as you’d like. It takes its toll physically, but it’s so gratifying to get a new season done and out in the world.

Where do you post?
All in New York at Technicolor Postworks, and we do most of the editing there too, and all of our online work. We do all the sound work at Decibel 11, which is also in Manhattan. As for our VFX, we switch year to year, and this year we’re working with The Molecule.

Do you like the post process?
I love post. There’s so much relief once you finish production and that daily stress of worrying about whether you’ll get what you need is over. You can see what you have.

But you don’t have much time for post in TV as compared with film.
True. It’s an incredibly fast schedule, and as the EP I only have about four or five full days to sit down with the editor and re-cut an episode and consider what the best possible version of it is.

Let’s talk about editing. You have several editors, I assume because of the time factor. How does that work?
I really love the whole editing process, and I spend a lot of time cutting the episodes — 10 hours a day, or more, for those five days, fine-tuning all the cuts before we have to lock them. I’m not the type of showrunner who gives a few notes and goes off to the next room. I’m very hands-on, and we’ve had the same three editors except for one new guy this season.

Everyone comes back, so there’s a growing understanding of the tone and what the show is. So all three editors rotate on the eight episodes. The big editing challenges are refining performance as things become clearer from previous episodes, and running time. We’re a broadcast show, so we don’t have the leeway of a streaming show, and there’s a lot of hair-pulling over how to cut episodes down. That can be very stressful for me, as I feel we might be losing key content that brings a lot of nuance. There’s also keeping a consistent tone and rhythm, and I’m very specific about that.

You’re also a musician, so I assume you must spend a lot of time on sound and the music?
I do. I work in depth on the score with composer Ronit Kirchman, so that’s an aspect of post I really, really love, and where I spend far more time than a typical showrunner does. I understand it and can talk about it, and I have very specific ideas about what I want. But TV’s so different from movies. We do our final mix review in four hours per episode. With a movie you’d have four, five days. So there’s very little time for experimentation, and you have to have a very clear vision of what works.

Derek Simonds

How important are the Emmys to a show like this?
Very, and Jessica Biel was nominated for her role in Season 1, but we haven’t won yet. We have a lot of fans in the industry, but maybe we’re also a bit under the radar, a bit cultish.

The show could easily run for many more years. Will you do more seasons?
I hope so. The beauty of an anthology is that you can constantly refresh the story and introduce new characters, which is very appealing. If we keep going, I think it’ll pivot in a larger way to keep it really fresh. I just never want it to become predictable, where you sense a pattern.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Review: Digital Anarchy’s Transcriptive 2.0

By Barry Goch

Not long ago, I had the opportunity to go behind the scenes at Warner Bros. to cover the UHD HDR remastering of The Wizard of Oz. I had recorded audio of the entire experience so I could get accurate quotes from all involved — about an hour of audio. I then uploaded the audio file to Rev.com and waited. And waited. And waited. A few days later they came back and said they couldn’t do it. I was perplexed! I checked the audio file, and I could clearly hear the voices of the different speakers, but they couldn’t make it work.

That’s when my editor, Randi Altman, suggested Digital Anarchy’s Transcriptive, and it saved the day. What is Transcriptive? It’s an automated, intelligent transcription plugin for Adobe Premiere editors, designed to transcribe video accurately using multiple speech and natural language processing engines.

Well, not only did Transcriptive work, it worked super-fast, and it’s affordable and simple to use … once everything is set up. I spent a lot of time watching Transcriptive’s YouTube videos and then had to create two accounts for the two different AI transcription portals that they use. After a couple of hours of figuring and setup, I was finally good to go.

Digital Anarchy has lots of videos on YouTube about setting up the program. Here is a link to the overview video and a link to 2.0 new features. After getting everything set up, it took less than five minutes from start to finish to transcribe a one-minute video. That includes the coolest part: automatically linking the transcript to the video clip with word-for-word accuracy.

Transcriptive extension

Step by Step
Import your clips into Premiere, select them, and open the Transcriptive Extension.

Tell Transcriptive if you want to use an existing transcript or create a new transcription.

Then choose the AI that you want to transcribe your clip. You see the cost upfront, so no surprises.

Launch app

I picked the Speechmatics AI:

Choosing AI

Once you press continue, Media Encoder launches.

Media Encoder making FLAC file automatically.

And Media Encoder automatically makes a FLAC file and uploads it to the transcription engine you picked.

One minute later, no joke, I had a finished transcription linked word-accurately to my source video clip.

Final Thoughts
The only downside to this is that the transcription isn’t 100% accurate. For example, it heard Lake Tahoe as “Lake Thomas” and my son’s name, Oliver, as “over.”

Final transcription

This lack of accuracy is not a deal breaker for me, especially since I would have been totally out of luck without it on The Wizard of Oz article, which you can read here. For me, the speed and ease of use more than compensate for the lack of accuracy. And, as AIs get better, the accuracy will only improve.

And on February 27, Digital Anarchy released Transcriptive V.2.0.3, which is compatible with Adobe Premiere v14.0.2. The update also includes a new prepaid option that can lower the cost of transcription to $2.40 per hour of footage. Transcriptive’s tight integration with Premiere makes it a must-have for working with transcripts when cutting long- and short-form projects.


Barry Goch is a finishing artist at LA’s The Foundation as well as a UCLA Extension instructor in post production. You can follow him on Twitter at @Gochya.

Seagate’s new IronWolf 510 M.2 NVMe SSD

Seagate Technology has beefed up its high-performance solutions for multi-user NAS environments by adding to its IronWolf SSD product line. IronWolf 510 is an M.2 NVMe SSD with caching speeds of up to 3GB/s for NVMe-compatible systems and is designed for creative pros and businesses that need 24/7 multi-user storage that is cache-enabled.

The IronWolf 510 SSD meets NAS manufacturer requirements of one drive write per day (DWPD), allowing multi-user NAS environments to do more with their data with lasting performance. According to Seagate, IronWolf 510 SSD is reliable with 1.8 million hours mean time between failures (MTBF) in a PCIe form factor, two years of Rescue Data Recovery Services, and a five-year limited warranty. IronWolf Health Management helps analyze drive health and will soon be available on compatible NAS systems.
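As a back-of-the-envelope illustration (not a figure from Seagate’s spec sheet), the one-DWPD rating translates into total endurance like so:

```python
def total_writes_tb(capacity_tb, dwpd, warranty_years):
    """Total data written if the drive absorbs `dwpd` full-capacity
    writes every day for the entire warranty period."""
    return capacity_tb * dwpd * warranty_years * 365

# The 1.92TB model at 1 DWPD over the five-year limited warranty:
print(total_writes_tb(1.92, 1, 5))  # roughly 3,504 TB written
```

By the same arithmetic, the 240GB model would absorb roughly 438TB over its warranty period.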

“We are the first to provide a purpose-built M.2 NVMe for NAS that not only goes beyond SATA performance metrics but also provides three times the endurance when compared to the competition. This meets the required endurance spec of one DWPD which our NAS partners expect for their customers,” says Matt Rutledge, senior VP, devices. “Because of such high endurance, our customers are getting a tough SSD for small business and creative professional NAS environments.”

The IronWolf 510 SSD PCIe Gen3 x4, NVMe 1.3 is available in 240GB ($119.99), 480GB ($169.99), 960GB ($319.99) and 1.92TB ($539.99) capacities and is compatible with leading NAS vendors.

Assimilate intros live grading, video monitoring and dailies tools

Assimilate has launched Live Looks and Live Assist, production tools that give pros speed and specialized features for on-set live grading, look creation, advanced video monitoring and recording.

Live Looks provides an easy-to-set-up environment for video monitoring and live grading that supports any resolution, from standard HD up to 8K workflows. Featuring professional grading and FX/greenscreen tools, it is straightforward to operate and offers a seamless connection into dailies and post workflows. With Live Looks being available on both macOS and Windows, users are, for the first time, free to use the platform and hardware of their choice. You can see their intro video here.

“I interact regularly with DITs to get their direct input about tools that will help them be more efficient and productive on set, and Live Looks and Live Assist are a result of that,” says Mazze Aderhold, product marketing manager at Assimilate. “We’ve bundled unique and essential features with the needed speed to boost their capabilities, enabling them to contribute to time savings and lower costs in the filmmaking workflow.”

Users can run this in a variety of setups — from a laptop to a full-blown on-set DIT rig. Live Looks provides LUT-box control over Flanders, Teradek and TVLogic devices. It also supports video I/O from AJA, Bluefish444 and Blackmagic for image and full-camera metadata capture. There is also now direct reference recording to Apple ProRes on macOS and Windows.

Live Looks goes beyond LUT-box control. Users can process the live camera feed via video I/O, making it possible to do advanced grading, compare looks, manage all metadata, annotate camera input and generate production reports. Its fully color-managed environment ensures the created looks will come out the same in dailies and post. Live Looks provides a seamless path into dailies and post with look-matching in Scratch and CDL-EDL transfer to DaVinci Resolve.

With Live Looks, Assimilate takes its high-end grading tool set beyond Lift, Gamma, Gain and CDL by adding Powerful Curves and an easy-to-use Color Remapper. On-set previews can encompass not just color but realtime texture effects, like Grain, Highlight Glow, Diffusion and Vignette — all GPU-accelerated.

Advanced chroma keying lets users replace greenscreen backgrounds with two clicks. This allows for proper camera angles, greenscreen tracking/anchor point locations and lighting. As with all Assimilate software, users can load and play back any camera format, including raw formats such as Red raw and Apple ProRes raw.

Live Assist has all of the features of Live Looks but also handles basic video-assist tasks, and like Live Looks, it is available on both macOS and Windows. It provides multicam recording and instant playback of all recorded channels and seamlessly combines live grading with video-assist tasks in an easy-to-use UI. Live Assist automatically records camera inputs to file based on the Rec-flag inside the SDI signal, including all live camera metadata. It also extends the range of supported “edit-ready” capture formats: Apple ProRes (Mov), H264 (MP4) and Avid DNxHD/HR (MXF). Operators can then choose whether they want to record the clean signal or record with the grade baked in.

Both Live Looks and Live Assist are available now. Live Looks starts at $89 per month, and Live Assist starts at $325 per month. Both products and free trials are available on the Assimilate site.

Colorist Chat: Keith Shaw on Showtime’s Homeland and the process

By Randi Altman

The long wait for the final season of Showtime’s Homeland seemed to last an eternity, but thankfully the series is now airing, and we here at postPerspective are pretty jazzed about it. Our favorite spies, Carrie and Saul, are back at it, with this season being set in Afghanistan.

Keith Shaw

Year after year, the writing, production and post values on Homeland have been outstanding. One of those post folks is colorist Keith Shaw from FotoKem’s Keep Me Posted, which focuses on finishing services to television.

Shaw’s credits are impressive. In addition to Homeland, his work can be seen on Ray Donovan, Shameless, Animal Kingdom and many others. We reached out to Shaw to find out more about working on Homeland from the first episode to the last. Shaw shares his workflow and what inspires him.

You’ve been on Homeland since the beginning. Can you describe the look of the show and how you’ve worked with DPs David Klein, ASC, and Giorgio Scali, ASC, as well as producer Katie O’Hara?
Working on Homeland from Episode 1 has been a truly amazing experience. Katie, Dave, Giorgio and I are an extremely collaborative group.

One consistent factor of all eight seasons has been the need for the show to look “real.” We don’t have any drastic or aggressively stylized looks, so the goal is to subtly manipulate the color and mood yet make it distinct enough to help support the storyline.

When you first started on the show, how would you describe the look?
The first two seasons were shot by Nelson Cragg, ASC. For those early episodes, the show was a bit grittier and more desaturated. It had a darker, heavier feel to it. There was not as much detail in the dark areas of the image, and the light fell off more quickly on the edges.

Although the locations and looks have changed over the years, what’s been the common thread?
As I mentioned earlier, the show has a realism to it. It’s not super-stylized or affected.

Do the DPs come to the color suite? What kind of notes do you typically get from them?
They do when they are able (which is not often). They are generally on the other side of the world. As far as notes, it depends on the episode. When I’m lucky, I get none. Generally, there are not a lot of notes. That’s the advantage of collaborating on a show from the beginning. You and the DP can “mold” the look of the show together.

You’ve worked on many episodics at Keep Me Posted. Prior to that you were working on features at Warner Bros. Can you talk about how that process differs for you?
In remastering and restoration of feature films, the production stage is complete. It’s not happening simultaneously, and that means the timeline and deadlines aren’t as stressful.

Digital intermediates on original productions, on the other hand, are similar to television because multiple things are happening all at once. There is an overlap between production and post. During color, the cut can be changing, and new effects could be added or updated, but with much tighter deadlines. DI was a great stepping stone for me to move from feature films to television.

Now let’s talk about some more general aspects of the job…

AS A COLORIST, WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
First of all, most people don’t have a clear understanding of what a colorist is or does. Even after 25 years and multiple explanations, my father-in-law still tells everyone I’m an editor.

Being a colorist means you wear many hats — confidante, mediator, therapist, VFX supervisor, scheduler and data manager — in addition to that color thing. For me, it boils down to three main attributes. One, you need to be artistic/creative. Two, you need to be technical. Finally, you need to mediate the decision-making processes. Sometimes that can be the hardest part of all, when there are competing viewpoints and visions between all the parties involved.

WHAT SYSTEM DO YOU WORK ON?
Digital Vision’s Nucoda.

ARE YOU SOMETIMES ASKED TO DO MORE THAN JUST COLOR ON PROJECTS?
Today’s color correctors are incredibly powerful and versatile. In addition to color, I can do light VFX, beauty work, editing or technical fixes when necessary. The clients appreciate the value of saving time and money by taking care of last-minute issues in the color suite.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Building relationships with clients, earning their trust and helping them bring their vision to the screen. I love that special moment when you and the DP are completely in sync — you’re reaching for the knobs before they even ask for a change, and you are finishing each other’s sentences.

WHAT’S YOUR LEAST FAVORITE?
Deadlines. However, they are actually helpful in my case because otherwise I would tweak and re-tweak the smallest details endlessly.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Ray Donovan, Shameless, Animal Kingdom, Single Parents and Bless This Mess are my current shows.

ANY SUGGESTIONS FOR GETTING THE MOST OUT OF A PROJECT FROM A COLOR PERSPECTIVE?
Become a part of the process as early as possible. Establishing looks, LUTs and good communication with the cinematographer are essential.

HOW DO YOU PREFER THE DP OR DIRECTOR TO DESCRIBE THE LOOK THEY WANT?
Each client has a different source of inspiration and way of conveying their vision. I’ve worked from fabric and paint samples, YouTube videos, photographs, magazine ads, movie or television show references, previous work (theirs and/or mine) and so on.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
I can’t pick just one, so I’ll pick two. From my feature mastering work, The Shawshank Redemption. From television, Homeland.

WHERE DO YOU FIND INSPIRATION?
Definitely in photography. My father was a professional photographer and we had our own darkroom. As a kid, I spent countless hours after school and on weekends learning how to plan, take and create great photographs. It is still a favorite hobby of mine to this day.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

VFX studio One Of Us adds CTO Benoit Leveau

Veteran post technologist Benoit Leveau has joined London’s One of Us as CTO. The studio, which is in its 16th year, employs 200 VFX artists.

Leveau, who joins One of Us from Milk VFX, has been in the industry for 18 years, starting out in his native France before moving to MPC in London. He then joined Prime Focus, integrating the company’s Vancouver and Mumbai pipelines with London. In 2013, he joined Milk in its opening year as head of pipeline. He helped to build that department and later led the development of Milk’s cloud rendering system.

The studio, which depends on what it calls “the efficient use of existing technology and the timely adoption of new technology,” says Leveau’s knowledge and experience will ensure that “their artists’ creativity has the technical foundation which allows it to flourish.”

With NAB 2020 canceled, what’s next?

By Randi Altman

After weeks of will-they-or-won’t-they speculation, and many companies announcing they wouldn’t be exhibiting, NAB announced Wednesday that it has canceled its Las Vegas show.

NAB president and CEO Gordon Smith released a statement yesterday that included this bit about what might be next: “We are still weighing the best potential path forward, and we ask you for your patience as we do so. We are committed to exploring all possible alternatives so that we can provide a productive setting where the industry can engage with the latest technology, hear from industry thought leaders and make the game-changing connections that drive our industry forward.”

Some think NAB will be rescheduled, but even the NAB isn’t sure what’s next. They sent this in response to my question about their plans: “We’re in the process of engaging with exhibitors and attendees to gauge their interest in what will be the best path forward for the show. Options under consideration include an event later this year or expanding NAB Show New York in the fall. All of this is, of course, premised on COVID 19 fears being alleviated. We will be in touch with the NAB Show community as decisions are made.”

What is certain is that product makers were prepared to introduce new tools at NAB 2020, and while some might choose to push back their announcements, others are scrambling to find different ways to get their message out. The easy solution is to take everything online — demos, live streaming, etc.

For our part, postPerspective will be covering news from NAB without actually being at NAB. Our NAB video interviews and special Video Newsletters will happen, but instead of coming from the show floor, they will be conducted online. And as news comes in, we’ll be reporting it. So check our site for the latest innovations from what we’re now calling “NAB season.” We’re also trying to think outside the box, so if there’s a way we can help you get your message out, just let us know.

I think everyone will admit that trade shows have been evolving, and the traditional shows have realized that as well. This year, NAB was set to start on a Sunday for the very first time in an effort to expand access to the show floor.

I, for one, am excited to see what’s next. As Plato said, “Necessity is the mother of invention.” Sometimes something bad has to happen to get us to the next step… sooner than we had planned.

Catching up with Jojo Rabbit director Taika Waititi

By Iain Blair

Now available on-demand and on DVD, Jojo Rabbit has had an impressive path to the big screen and beyond. Since it premiered at Toronto last year, Jojo Rabbit went from festival favorite to Oscar darling. Helmed by New Zealander Taika Waititi, and infused with his trademark blend of comedy and pathos, it’s a World War II satire that follows Jojo, a lonely German boy, whose world view is turned upside down when he discovers his single mother is hiding a young Jewish girl in their attic. Aided only by his idiotic imaginary friend, Adolf Hitler (Waititi), Jojo must confront his blind nationalism.

The Oscar-winning adapted screenplay by Waititi, who brought a fresh perspective and impish humor to the usually dead-serious subject matter, is based upon the book “Caging Skies” by Christine Leunens.

The behind-the-scenes team includes director of photography Mihai Malaimare, production designer Ra Vincent, editor Tom Eagles, composer Michael Giacchino and visual effects supervisor Jason Chen.

Here, Waititi, whose diverse credits include the $854 million global blockbuster Thor: Ragnarok, Flight of the Conchords and Hunt for the Wilderpeople, talks about making the film.

Given the current rise of anti-Semitism, and as someone who’s Jewish yourself, is it fair to say this was a very personal endeavor?
It was, but not so much because I’m Jewish. I just think anyone who sees the rise of intolerance and the horrible things people do to each other could tackle this. But I definitely felt a sort of quiet power behind me on this, and it’s also the first time I’ve ever sat down and written a script from page one all the way through. I usually start at the end and then bounce around a lot as I figure out how to cobble it all together. But this time it all flowed so easily, so maybe it was my ancestry coming through and helping me.

What sort of film did you set out to make, as this could so easily have been a very bleak drama?
Right, so I set out to make a film with some hope and humor that was also a very different look at such a dark period and a fairly simple story, one about two kids learning to bridge the gaps between themselves and their cultures and understand each other. I also wanted to tell a story about a kid — who’s been indoctrinated to hate — learning to think for himself.

Tonally, I always wanted it to feel like this. I never wanted to make a straight drama, as I don’t know how to do that, or some ridiculous comedy, as that wouldn’t have any substance to it. So I wanted that flow in and out of drama and comedy, which is more in keeping with human experience anyway to me.

Hitler isn’t even in the novel. Did writing him into the film and playing him yourself help exorcise some ghosts?
I think so. The novel’s very dark, without the humor I wanted in this, and the idea of creating Hitler as the imaginary friend just seemed the perfect way to bring in all the comedy and satire. It’s an old theme — young boy befriends a monster — and there’s something quite poetic about looking at the world through the eyes of children.

Clint Eastwood told me it’s not easy directing yourself. How tough was it?
I don’t find it tough. He probably felt that way because his films are dramas and he’s not having any fun. For me, it’s improvising and being ridiculous, and I’m pretty aware of my acting abilities, so I always give myself the easier roles. This was so much fun to do, and there was no pressure, as I had no intention of doing an authentic portrayal.

Maybe this was a bit harder than other roles I’ve played because you just feel a bit more embarrassed when you’re dressed like Hitler, and everyone’s looking at you like you’re the biggest piece of shit in the world… because you’re dressed like the biggest piece of shit in the world. (Laughs) So I’d always take the mustache off if I didn’t have to be on camera to feel normal again.

DP Mihai Malaimare shot it, and visually, with its saturated colors, it couldn’t be more different from the usual somber, black-and-white WWII film. Talk about the shoot and the look you went for.
I’m glad you noticed that, as usually films about the Nazis make everything look very grim and bleak and gray. But we wanted to capture just how much color and brightness there was in Germany then, and we needed to see it that way through the boy’s eyes — all the excitement and hysteria, like a wonderland of celebration and a giant party.

Mihai shot with the ARRI Alexa SXT, and instead of the standard anamorphic 2X lenses he used the Hawk V-Lite squeeze anamorphic 1.3X lenses that gave us the vivid color saturation we wanted. We shot in these small towns in the Czech Republic that still had all these pre-war buildings perfectly preserved, and the interior sets were built on stages at Prague’s Barrandov Studios, the same place the Nazis used to shoot all their propaganda.

Talk about post and editing with frequent collaborator Tom Eagles, who won the ACE Eddie for cutting this film. How did that work?
While we shot, Tom set up in Prague to deal with dailies and work on scenes and move towards an assembly. When we got back here in LA, we started cutting in offices in Burbank, and then we did all the mixing on the Fox lot.

I imagine finding the right tone and the right balance between comedy and tragedy were the big editing challenges?
They were. Tom and I’ve worked together for a long time, and he’s brilliant and also has a great knack for finding music that fits perfectly. He’s got a good eye for comedy and in the end it took us about nine months to get it all right. We also did a lot of testing – about 14 times – and the reason I do it so much is because I really value the audience’s opinion and feedback, and I want to really understand what they want and don’t want from my films. So then we’d make some changes, maybe test some new jokes, delete a couple of scenes, but the final film was pretty close to the original script. (Read postPerspective’s interview with Eagles.)

Do you like the post process?
I do, but I find editing quite hard, as I don’t like sitting still, and watching the editor pull little pieces together drives me nuts. So I tend to leave and come back, watch what they’ve done and give notes, and then maybe we’ll work together on it for a bit. I also don’t want to see the film too many times, as I get bored and start making changes just to keep interested.

VFX play a role. Talk about working on them with VFX supervisor Jason Chen.
We always knew we’d have to do a lot of cleanup, as it’s a period piece: taking out modern road signs, plus some set extensions and wire removal. Luma did them, and there was nothing major… not like Thor, where we had well over 2,000 shots, I think.

I loved the scene with The Beatles singing “I Want to Hold Your Hand” in German over the worshipful Hitler footage. Can you talk about the importance of sound and music?
It’s huge, and a big part of it came from all the research I did of the Hitler rallies and Hitler Youth. It really hit me watching old footage how all these people — men, women, children — would be screaming and fainting and crying, and how Hitler was like this rock star of the ‘30s; their reaction was the same as with The Beatles. It was the same crowd hysteria. I’d also included Bowie’s “Heroes” in the first draft, as I always wanted contemporary music in it, along with contemporary dialogue, because it is a modern story. It can happen again.

Where did you do the DI, and how important is it to you?
At Company 3 with Tim Stipan, and it’s key for the vivid look we wanted, but I really trust Mihai and Tim. I don’t want to micro-manage the whole DI. I go in and out and give notes.

How important were the Oscars and awards for a film like this?
(Laughs) You’re talking to a New Zealander, and we have this humility that we think is really charming but is probably really annoying. It’s so great to get a Best Picture nomination, and it’s been 10 years since I read the book and began working on it, so it’s been a lot of work. The goal was always to make something positive that promotes love and change, so I feel validated.

What’s next?
I like to keep doing very different things, so I’ve shot this sports film about football, Next Goal Wins, which we’ve started post on in LA.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

HPA Tech Retreat: Cloud workflows in the desert

By Tom Coughlin

At the 2020 HPA Retreat, attendees witnessed an active production of the short The Lost Lederhosen. This film used the Unreal gaming engine to provide impressive graphical details, along with several cameras and an ACES workflow, with much production work done in the cloud. Many of the companies and studios participating in the retreat played a role in the film’s production, and the shooting and post were part of the ongoing presentations and panels on the first official day of the conference. Tuesday’s sessions ended with Joachim Zell from Efilm and Josh Pines from Technicolor showing the completed video.

Shooting The Lost Lederhosen – director Steve Shaw is at the far right.

As you can imagine, several digital storage products were needed for The Lost Lederhosen. Checking out the production rig in the back of the conference room, I saw some G-Tech modular storage units and was told that there was an Isilon storage system on the other side of the wall — a giveaway because of the noise from its fans. In one of the sessions on that first day, it was reported that 5TB of total footage was shot, with 500GB left after conforming using Avid Media Composer with AAF. Editing was done in the cloud with 30TB of Avid Nexis online storage. During dailies, the AWS CLI was used to push files to S3 as a common storage location. PixMover from Pixspan was used to move data to and from LA, along with AWS S3 storage in the San Francisco Bay Area.

Colorfront supported the cloud-based live production of the HPA video and demonstrated its 2020 Express Dailies, which was used for all the dailies and deliverables, as well as Transkoder, which was used for all the VFX pulls. Frame.io was used to move content from cameras to the cloud. A Mac Pro was feeding dual Apple 32-inch Pro Display XDR monitors showing 6K HDR content. Colorfront was also displaying Transkoder 2020 running on a Supermicro workstation with four Nvidia GeForce RTX 2080 Ti GPUs and an AJA Kona 5 video card, outputting to an 85-inch Sony Z9G HDR monitor and an AJA HDR Image Analyzer 12G for video analytic monitoring.

Metadata for video content was an important element in the HPA presentations, which included the ASC MHL (media hash list), which hashes files and folders in a standardized way, with essential file metadata in a human-readable XML format. The ASC MHL is used from data capture and offloading through backup and archiving, and it is an important element in restoring content. The ASC MHL is available on GitHub (https://github.com/ascmitc/mhl) and is still a work in progress.
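The core idea behind such a hash list (hash every file into a standardized, human-readable XML document, then re-hash later to verify that nothing was lost or corrupted in transit or archiving) can be sketched in a few lines of Python. This is a conceptual illustration only, not the official ascmhl reference tool: the element names here are simplified stand-ins for the real schema, and SHA-1 is used only because it is in the standard library (the actual spec also supports faster hashes such as xxHash).

```python
import hashlib
import os
import xml.etree.ElementTree as ET

def file_hash(path, chunk_size=1 << 20):
    """Hash a file in chunks so large media files don't fill memory."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def build_hash_list(root_dir):
    """Return a human-readable XML hash list for every file under root_dir.

    The <hashlist>/<hash> element names are illustrative, not the MHL schema.
    """
    root = ET.Element("hashlist")
    for dirpath, _dirnames, filenames in os.walk(root_dir):
        for name in sorted(filenames):
            full = os.path.join(dirpath, name)
            entry = ET.SubElement(root, "hash")
            ET.SubElement(entry, "path").text = os.path.relpath(full, root_dir)
            ET.SubElement(entry, "size").text = str(os.path.getsize(full))
            ET.SubElement(entry, "sha1").text = file_hash(full)
    return ET.tostring(root, encoding="unicode")

def verify(root_dir, xml_text):
    """Re-hash the files and compare against a previously generated list."""
    root = ET.fromstring(xml_text)
    for entry in root.iter("hash"):
        full = os.path.join(root_dir, entry.findtext("path"))
        if file_hash(full) != entry.findtext("sha1"):
            return False
    return True
```

Generating the list right after an offload and running verify again after a backup or restore catches any file that changed in between, which is exactly the role the MHL plays in a restore workflow.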

The following day, Tech Retreat main conference producer Mark Schubin said that film hasn’t died yet and that Kodak had received orders from Disney, NBCUniversal, Paramount, Sony and Warner Bros. for motion picture film stock. He talked about what might be the world’s smallest camera, a small endoscopy image chip with 200×200 resolution. And he mentioned Microsoft’s Project Silica proof of concept — a 7.5cm x 7.5cm glass plate storing the 75.6GB Superman movie — as a possible long-term storage media.

MovieLabs

The MovieLabs white paper released in August 2019, “The Evolution of Media Creation,” was referenced in several talks during the HPA retreat. The paper, created in cooperation with the major film studios, suggests a path to the future of moviemaking, and that path is in the cloud. You can read it here: https://movielabs.com/production-technology

During the SMPTE ST 2110 IP update, it was said that most new video trucks for NEP in the UK are built for 2110 IP compliance. There are a total of 12 IP-enabled trucks, six IP control rooms and multiple IP flypacks (backpack IP video gear). In a panel organized by the Digital Production Partnership, the DPP’s Mark Harrison gave a presentation that included information on on-site and cloud storage for M&E applications. He spoke about the DPP’s 2020 report and its 10 M&E case studies of companies that have adopted cloud-led production for different reasons. We will look at the digital storage needs for three of these case studies.

It was reported that COPA90 is doing high-volume global content management with a cloud production hub and AI, using the Veritone Digital Media Hub and IBM Cloud Storage.

France TV is doing fast turnaround of high-end drama using cloud-based metadata enrichment with AWS, Azure, a private cloud and local storage before going into Avid Nexis storage, Avid Interplay and Media Composer.

UK’s Jellyfish Pictures is reportedly doing secure distributed high-volume virtualized production using Azure public cloud and a private cloud with PixStor storage.


Distributed Content Delivery
Eluvio’s Michelle Munson gave an update on the company’s distributed content delivery service. During a demo at the company’s booth, she told me that Eluvio’s approach keeps the master copy for distribution in cold storage, with the published serviceable content inherently streamable. By reusing distributed parts of content within the network, storage requirements shrink considerably. In effect, the fabric replaces a hot storage tier, reducing higher-performance storage and network bandwidth requirements.
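The storage saving from reusing identical pieces of content can be illustrated with a toy content-addressed chunk store in Python. To be clear, this is a generic deduplication sketch, not Eluvio’s actual protocol; the chunk size and hash are arbitrary choices for the example.

```python
import hashlib

class ChunkStore:
    """Toy content-addressed store: identical chunks are kept only once."""

    def __init__(self, chunk_size=4):
        self.chunk_size = chunk_size
        self.chunks = {}  # hash -> chunk bytes

    def put(self, data):
        """Store data, returning the list of chunk hashes (a 'recipe')."""
        recipe = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            key = hashlib.sha256(chunk).hexdigest()
            self.chunks[key] = chunk  # a repeat chunk overwrites itself
            recipe.append(key)
        return recipe

    def get(self, recipe):
        """Reassemble the original bytes from a recipe of chunk hashes."""
        return b"".join(self.chunks[key] for key in recipe)
```

Two near-identical assets that share most of their bytes end up sharing most of their chunks, so the store holds far fewer bytes than the sum of the inputs, which is the basic mechanism behind the storage reduction described above.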

In her presentation, Munson said that Eluvio eliminates the need for cloud microservices for content distribution. The blockchain-based network system provides an inherent security model that makes it possible to serve audiences directly over public internet to enable a content fabric. This is not a cloud or a CDN, but rather a data distribution and storage protocol. Rendering is done at the consumer endpoint, allowing consumers to play content just in time with low latency, and monetization happens through secure transactions. MGM is deploying Eluvio’s technology for worldwide content distribution, and some other major media players are also working with the technology.

Renard Jenkins

There are five key principles in the Eluvio content fabric. First, there is no movement of the master copy; a mezzanine copy is used for all servicing. Second, a file-based interface is used for upload and download with underlying objects. Third, streaming and servicing are accomplished from the source in a JIT manner. Fourth, it uses a trustless encryption model over open networks, and fifth, access control and rights management are built in.

Best Practices for Cloud-Based Workflows
MediAnswers’ Chris Lennon and PBS’ Renard Jenkins (who subsequently started work as VP, content transmission, at WarnerMedia) spoke about the right way to do cloud-based workflows, which included local as well as cloud content copies. They gave three principles for survival. First, IT is not IP, and a network should be designed around media use and minimizing packet loss. Second, build or find cloud-native solutions rather than “lift and shift.” Third, linear workflows lead to nonlinear problems.

Universal and the Cloud
Universal’s Annie Chang spoke about tools for the next generation of production, including the use of cloud-based tools such as temporary production storage and an active archive for production assets. She went on to detail future cloud workflows wherein content goes from the camera directly to the cloud (or, if on film, from a digital intermediate post house to the cloud). Editing, dailies distribution and EDL are all done in the cloud, as is final archiving.

Chang said that the move to a mostly cloud-based workflow is already starting at Universal. She reported that DreamWorks Animation (DWA) has built a cloud-native platform that creates workspaces for its artists. Assets are related to each other, and workflows can be kicked off through microservices. She wondered if Universal could repurpose the DWA platform for live-action, VFX assets and workflows.

Universal

Chang discussed an experiment wherein Universal took one shot from Fast & Furious Presents: Hobbs & Shaw (including reference photos, LIDAR scans and camera raw) and demonstrated a VFX pull on premises at DWA while also testing in a public cloud. When Universal ran the content from the cloud and showed it to Universal VFX execs and the VFX producer from Hobbs & Shaw, Chang was told that this was something they had wanted for a decade. Universal is developing the platform this year and plans to test it on a full production in 2021. The company has 10 concurrent projects and is coordinating with multiple industry efforts, including ACES, USC ETC and MovieLabs.

ACES
There was much discussion of the next developments for ACES (Academy Color Encoding System), particularly the implementation of ACES 1.2 and the development of ACES 2.0. A panel at the retreat suggested that practical problems with image matching in the current version of ACES could be solved by using the AMF (ACES Metadata File). But some image matching problems are not ACES-related; rather, they relate to the source of the image and what format is used for comparison. ACES 2.0 development is underway and plans to address these and other issues with the current version of ACES.

Storage
The digital storage exhibitors at the HPA Retreat included Cloudian (local object storage), which demonstrated integration with AWS, Azure, Google and other cloud storage services. Quantum had an exhibit that focused on its media and storage solutions, such as the StorNext workflow storage platform, F-Series NVMe storage, Xcellis high-performance workflow storage appliances and its object storage and tape archive solutions. (Note that Quantum recently acquired Western Digital’s ActiveScale object storage.)

Racktop was advertising its Brickstor all-flash or hybrid HDD/SSD CyberConverged data storage offering, which supports FIPS 140-2 and AES-256 for encryption and compliance. Rohde & Schwarz was demoing IMF-based workflows with its Spycer Node media storage.

Rohde & Schwarz

Scale Logic featured its Atavium data management and orchestration solution. According to the product literature, data entering Atavium is identified, tagged and classified, and it can be searched via metadata or tags whether it resides on premises or in the cloud. Tasks can be automated using a combination of metadata and tags, while a set of APIs, a scheduler and application integration determine the placement of data to reflect the needs of the workflow. Local storage includes nearline HDDs as well as NVMe flash, and DRAM is used for read-ahead cache. The system will work with Spectra Logic’s BlackPearl and integrates with asset management systems.

Seagate Technology was showing storage products, including its Lyve Drive Shuttle for physical data delivery using e-ink and protective cases for shipping storage devices. The company had flyers out on its Seagate Exos modular storage for capacity and the Seagate Nytro modular storage for performance. Pixit Media was partnering with Seagate on its software-defined storage solution.

StorageDNA was showing its analytics-driven data management platform (DNAfabric) that provides data visibility services, including storage capacity and cost as well as data mobility services. Tiger Technology was showing its Tiger Bridge and shared an exhibit space with Nexsan NAS products. Western Digital was showing various G-Tech products, including its G-Speed Shuttle storage systems as well as desktop and mobile HDD and SSD storage devices.


Tom Coughlin is a digital storage analyst and business and technology consultant. His firm, Coughlin Associates, consults and publishes books and market and technology reports (such as the annual Digital Storage in Media and Entertainment Report). He is currently working on his 2020 Digital Storage in Media and Entertainment Survey; feel free to participate: https://www.surveymonkey.com/r/MWXL22N


Review: Litra Pro’s Premium 3 Point Lighting Bundle

By Brady Betzel

With LED lights showing up everywhere these days, it’s not always easy to find the balance between affordability, power output and size. I have previously reviewed itty-bitty LED lights like the Litra Torch, which for its size is amazing. Litra has now expanded its LED offerings, adding the Litra Pro and the Litra Studio.

Litra Studio ($650) is at the top of the Litra mountain with not only varying color temperatures — from 2,000 to 10,000 kelvin with adjustable green/magenta settings — but also RGBWW (RGB + cool white + warm white), CCT (kelvin adjustments), HSL (hue + saturation + lightness), color gel presets, flash effects and more.

But for today’s review, I wanted to focus on the Litra Pro LED, which comes in the Premium 3 Point Lighting Bundle, complete with light stands, lights, soft boxes and a carrying case. I had reached out to Litra about reviewing this bundle because I am tired of lugging around big, clunky lights for quick interviews or small product shoots. And to be honest, it was right before I was heading to Sundance to shoot some interviews for postPerspective, and I didn’t want to check a bag at the airport. (Check out my interviews with editors at Sundance here.)

For the trip, I wanted to bring lights, a Blackmagic Pocket Cinema Camera 6K, my Canon L-series zoom lens, a small tripod and some hard drives, all stuffed into my backpack. I knew I’d be in the snow, so I needed lights that could potentially withstand all types of precipitation. I would also be throwing these lights around, so I needed them to be durable. The Litra Pro lights fit the bill. They measure 2.75 x 2 x 1.2 inches (smaller than a phone), weigh 6 ounces and have upwards of a 10-hour battery life if set to 5% power. Each Litra Pro costs just under $220 but can be purchased in different bundle assortments. Individually, each Litra Pro comes with a rubberized diffuser, a USB-A-to-Micro-USB charging cable (very short, maybe 3 to 4 inches in length), a DSLR mount (for a hot/cold shoe), a GoPro mount and a little zipper bag.

I wish Litra would package not only the GoPro-mount-to-¼”-20 adapter but also the female ¼”-20-to-GoPro-mount adapter, so the light can be attached to something like a tripod. If you don’t already have them, you’d need to purchase the GoPro mounts. Alternatively, it would be nice to have a mini ball-head mount like the one Litra sells separately on its site.

I was sent the Litra Pro Premium 3 Point Lighting Bundle. This essentially gives you everything you need for a standard three-point lighting setup — key light, fill light and back light. In addition, you get three light stands with carrying bags, three soft boxes, a customizable foam-insert carrying case and the standard accessories. This package retails for $779.95, which is a pretty good discount. Bought separately, the components would add up to about $820, not including the light stands, which aren’t available on Litra’s site and cost around $26 for two on Amazon. That means with the bundle you are essentially getting a free carrying case and light stands. The carrying case fits most of the products, except for the light stands. I had some trouble fitting all of the soft boxes along with the original accessories into the carrying case, but with a little force, I got it zipped up.

Do Specs Live Up to Output?
The Litra Pro lights are amazing lights packed into a small package, but with a somewhat expensive price tag — think of the saying, “Better, faster, cheaper: Pick two, because you can’t have all three.” They have a CRI (Color Rendering Index) greater than 95, which on the surface means they will show accurate colors. They can output up to 1,200 lumens (adjustable from 0-100% in 5% increments, either in the app or on the light itself); have a 70-degree beam angle; can be adjusted from 3,000K to 6,000K color temperature in 100K increments; and have zero flicker, no matter the shutter speed (or shutter angle). The top OLED screen displays battery info, Bluetooth connection info, kelvin temperature and brightness values.

One of my two favorite features of the Litra Pro lights is the rugged exterior and the impact it can withstand, based on MIL-STD-810 testing. The Litra Pros can take a lot of punishment, typically more than any filmmaker will dish out. I need lights that can live in a pocket or a backpack, or be mounted on a lighting stand in the rain, and these lights withstand all of those elements.

They stood up to my practical production abuse: dropping, water, snow, rain, general tossing around in my backpack on an airplane, and my three sons — all under 10 — throwing them around. In fact, they are waterproof down to 90 feet (about 27 meters).

My second favorite feature is the ability to control color temperature and brightness among a group of lights simultaneously or individually through the Litra app. When purchasing the 3 Point Lighting Bundle, this makes a lot of sense. Controlling all of the lights from one app simultaneously can allow you to watch your output image on the camera without moving around the room adjusting each light.

When I first started writing this review, the Litra app was one of the most important factors. When I was at Sundance, I needed to change lighting temperatures or brightness levels without leaving my interview position. I wasn’t able to bring an external monitor, so I only had the monitor on the back of the BMPCC6K camera to judge my lighting decisions. But with the updated Litra app, I was able to quickly add the three Litra Pro lights into a group and adjust the temperature and brightness easily. I tested the app on both Android and iOS devices, and as of mid-February, they both worked.

There can be a little lag when adjusting the brightness and temperature of the lights in a group, but they quickly catch up. The Litra app also has common “CTO” (color temperature orange) presets (Daylight 5600K, ⅛ CTO 4900K, ¼ CTO 4500K, ½ CTO 3800K and ¾ CTO 3200K) to quickly jump to the most common color temperatures. If those don’t work, you can also set your own favorites. An interesting function is flashing the lights: you can set a brightness minimum/maximum, a color temperature and a strobe rate in Hz.

When shooting product and interview photography or videography, I like to use diffusion. As I mentioned earlier, the light comes with a rubberized diffusion cover that sits right on the light. But if you need a little more distance between the light and your diffusion to draw out the softness of the light, the Litra 3 Point Lighting Bundle includes soft boxes that snap together and snap onto the Litra Pro. At first, I was a little thrown off by the soft boxes because you have to build them and break them down if you want to travel with them; I guess I was hoping for more of a collapsible setup. But they come with a padded, zippered pouch for transport, and they lay very flat when broken down. They actually work pretty well when snapped together and are pretty durable. The soft boxes are indispensable for interviews. Without them, it is hard to get an even light; the rubberized diffusion will almost get you there, but the soft boxes really spread the light nicely.

Over Christmas, I helped out at an event for The Bumblebee Foundation, which supports families with kids going through pediatric cancer treatments. They needed someone to take pictures, so I grabbed my camera and mounted one of the Litra Pro lights with a soft box onto the hot shoe of my Canon 5D Mark II using the included mount. The Litra Pro was easy to use, and it didn’t startle people like a flash might. It was a really key item to have in that environment.

I also do some product photography/videography for my wife, who sews and makes hair bows, tutus and more. I needed to light a few Girl Scout Cookie hair bows she had made, so I mounted two of the lights using the lighting stands and soft boxes and just stood one of the Litra Pros behind the products. You can see the video here.

What was interesting is that I wanted more light vertically, and because the Litra Pros have two ¼”-20 mounts (one on the bottom and one on the side), I could quickly mount the lights vertically. I never realized how helpful mounting the Litra Pros vertically would be until I actually needed it. At the same time, I had left the lights on at 60-80% power, and after a few minutes, I felt the heat the Litra Pros can put out. It isn’t quite burning, but the Litra Pros do get hot to the touch if left on for a while… just something to keep in mind.

Summing Up
From the military-grade-feeling exterior aluminum construction to the CRI color accuracy, the Litra Pro lights are truly amazing. Whether you use them to light interviews at the 2020 Sundance Film Festival (like I did), add one to a GoPro shoot to take the load off of the sensor with a high ISO, or use them to light product photography, the Litra Pro 3 Point Lighting Bundle is worth your money. They can fit into your pocket and withstand being dropped on the ground or in water.

All in all, this is a great bundle. The Litra Pros are not cheap, but the peace of mind you get knowing they will still work if you drop them or get them wet is worth every penny. When flying to Sundance, I had no fear throwing them around. I was setting up the lighting for my interviews and noticed a water ring on the table from a glass of water. I didn’t think twice and put the Litra Pro right in the water. In fact, when I was shooting some videos for this review, I put the Litra Pros in a vase of water. At first I was nervous, but then I went for it, and they worked flawlessly.

If you are looking for super-compact lighting that is bright enough to use outdoors, light interviews indoors, film underwater, and even double as photography lighting, the Litra Pros are for you. If you are like me and need to do a lot of product videography and interview lighting quickly, the Litra Pro Premium 3 Point Lighting Bundle is where you should look.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Behind the Title: Loyalkaspar EP Scott Lakso

“People probably don’t expect that sometimes being an EP involves jumping into After Effects to render something, or contributing written ideas to strategic and conceptual projects.”

Name: Scott Lakso

Company: New York City’s Loyalkaspar

What does Loyalkaspar do?
We’re a creative branding agency specializing in brand strategy, identity, marketing and production. In human terms: We like to make good work that people will enjoy, and we try to do it for companies that make the world better!

SyFy rebrand

What’s your job title?
Executive Producer

What does that entail?
It entails a little bit of everything you’d expect, but mostly it involves making sure our clients are happy so that they’ll want to keep working with us on new projects. It also means establishing relationships with potential clients. At the office, it means overseeing the team of producers and making sure that everyone is happy and productive. There are a lot of proposals, budgets and timelines as part of that, but all of the nitty-gritty stuff is in service of fostering healthy relationships inside the company and with outside clients.

What would surprise people the most about what falls under that title?
People probably don’t expect that sometimes it involves jumping into After Effects to render something or contributing written ideas to strategic and conceptual projects. The title makes it sound like a reductive position, as in an “executive” producer doesn’t do any of the tasks they used to do as a coordinator or mid-level producer, but it’s actually more of a cumulative role — all of the skills I’ve developed over the 11 years it took to get to EP are still used anytime it seems appropriate.

What’s your favorite part of the job?
My favorite aspect is having the freedom, capacity and trust of the company leadership to do whatever I feel is best for our people, our clients and Loyalkaspar as a whole. Sometimes that’s helping a client out of a bind on short notice, encouraging a staffer to vent over a pint or organizing a spontaneous karaoke night when the time is right… which is more often than you might think.

What’s your least favorite?
When the circumstances of a project or situation require me to work reactively rather than proactively. I’m not a fan of winging it! It feels like driving at night with the headlights turned off. I’m much happier when I can plan a few steps ahead and help everyone avoid the headaches of hazardous speed bumps.

What is your most productive time of day?
Anytime that I can tune out distractions and focus on the task at hand. That’s more about creating a productive window in which to work rather than waiting for a specific time of day.

If you didn’t have this job, what would you be doing instead?
I’d be doing literally any job that NASA would be willing to hire me for, given my lack of astronautics knowledge and experience. So I’d probably be scrubbing dishes in the Cape Canaveral food court or something equally unglamorous.

How did you choose this profession?
“Chose this profession” is a strong phrase, given that I had no idea this kind of work existed until I moved to New York after college. I think I technically stumbled into it. That being said, at some point while stage-managing high school theater, I probably subconsciously chose to go down the path that would lead me to something like this as an adult.

Super Bowl halftime show graphics 2010

Can you name some recent projects?
For the past few months, I’ve been mostly dedicated to the brand identity development for Peacock, the new streaming platform from NBCUniversal. But other recent standout projects have been an interactive film for a museum in Philadelphia and involvement in pitches to the Sesame Workshop and Full Frontal with Samantha Bee.

Do you have a project you are most proud of?
It’s hard to pick only one, but producing the Super Bowl halftime show graphics in 2010 and overseeing our all-encompassing rebrand of SyFy in 2017 are a couple of personal favorites.

Name three pieces of technology you can’t live without.
I’d have a hard time living in a world that didn’t have the technology to enjoy music and movies/television, so let’s say a good screen of some kind, a record player/stereo/iPod and some good headphones.

What social media channels do you follow?
At this point, only Instagram. I don’t think I’m alone in thinking that most social media content makes people feel worse about themselves and the world. At least on Instagram, people seem interested in posting things that others will enjoy rather than just broadcasting whatever will get them the most attention.

Do you listen to music while you work? If so what kind?
I find it impossible to work without music on. In terms of what specifically, almost anything instrumental is good for working to, but I really love old, cheesy music like bossa nova, retro Italian film soundtracks, 1960s/1970s library music, Burt Bacharach, etc. That probably makes me sound pretentious, or maybe like a dork, but I’m not exactly proud of my weird taste in music.

What do you do to de-stress from it all?
There are tons of options! When time permits, traveling and hiking outside of the city (especially outside of the country) are great for stress. I know that exercise is good for stress but that doesn’t make it any more enjoyable, so I have to trick myself into accidentally getting a workout while doing something like being in nature or exploring a foreign country. On a smaller scale, just drinking wine with my wife, going to a movie with my phone turned off or doing anything you can find in a book on “hygge” (like reading in my pajamas or cooking comforting food).

Goldcrest Post’s Jay Tilin has passed away

Jay Tilin, head of production at New York’s Goldcrest Post, passed away last month after a long illness. For 40 years, Tilin worked in the industry as an editor, visual effects artist and executive. His many notable credits include the Netflix series Marco Polo and the HBO series Treme and True Detective.

“Jay was an integral part of New York’s post production community and one of the top conform artists in the world,” said Goldcrest Post managing director Domenic Rom. “He was beloved by our staff and clients as an admired colleague and valued friend. We offer our heartfelt condolences to his family and all who knew him.”

Tilin began his career in 1980 as an editor with Devlin Productions. He also spent many years at The Tape House, Technicolor, Riot and Deluxe, all in New York. He was an early adopter of many now standard post technologies, from the advent of HD video in the 1990s through more recent implementations of 4K and HDR finishing.

His credits also include the HBO series Boardwalk Empire, the Sundance Channel series Hap and Leonard, the PBS documentary The National Parks and the Merchant Ivory feature The City of Your Final Destination. He also contributed to numerous commercials and broadcast promos. A native New Yorker, Tilin earned a degree in broadcasting from SUNY Oswego.

Tilin is survived by his wife Betsy, his children Kelsey and Sam, his mother Sonya and his sister Felice (Trudy).

Maxon plugin allows for integration of Cinema 4D assets into Unity


Maxon is now a Unity Technologies Verified Solutions Partner and is distributing a plugin for Unity called Cineware by Maxon. The new plugin provides developers and creatives with seamless integration of Cinema 4D assets into Unity. Artists can easily create models and animations in Cinema 4D for use in realtime 3D (RT3D), interactive 2D, 3D, VR and AR experiences. The Cineware by Maxon plugin is now available free of charge on the Unity Asset Store.

The plugin is compatible with Cinema 4D Release 21, the latest version of the software, and Unity’s latest release, 2019.3. The plugin does not require a license of Cinema 4D as long as Cinema 4D scenes have been “Saved for Cineware.” By default, imported assets will appear relative to the asset folder or imported asset. The plugin also supports user-defined folder hierarchies.

Cineware by Maxon currently supports the following.

Geometry:
• Vertex Position, Normals, UV, Skinning Weight, Color
• Skin and Binding Rig
• Pose Morphs as Blend Shapes
• Lightmap UV2 Generation on Import

Materials:
• PBR Reflectance Channel Materials conversion
• Albedo/Metal/Rough
• Normal Map
• Bump Map
• Emission

Animated Materials:
• Color including Transparency
• Metalness
• Roughness
• Emission Intensity, Color
• Alpha Cutout Threshold

Lighting:
• Spot, Directional, Point
• Animated properties supported:
• Cone
• Intensity
• Color

Cameras:
• Animated properties
• Field of View (FOV)

Main Image: Courtesy of Cornelius Dämmrich

Editor Anthony Marinelli joins Northern Lights

Editor Anthony Marinelli has joined post studio Northern Lights. Marinelli’s experience spans commercial, brand content, film and social projects. Marinelli comes to Northern Lights from a four-year stint at TwoPointO where he was also a partner. He has previously worked at Kind Editorial, Alkemy X, Red Car, Cut+Run and Crew Cuts.

Marinelli’s work includes projects for Mercedes, FedEx, BMW, Visa, Pepsi, Scotts, Mount Sinai and Verizon. He also edited the Webby Award-winning documentary “Alicia in Africa,” featuring Alicia Keys for Keep a Child Alive.

Marinelli is also active in independent theater and film. He has written and directed many plays and short films, including Acoustic Space, which won Best Short at the 2018 Ridgewood Guild Film Festival and Best Short Screenplay at the Richmond International Film Festival.

Marinelli’s most recent campaigns are for Mount Sinai and Bernie & Phyl’s for DeVito Verdi.

He works on Avid Media Composer and Adobe Premiere. You can watch his reel here.

Director Vincent Lin discusses colorful Seagram’s Escapes spot

By Randi Altman

Valiant Pictures, a New York-based production house, recently produced a commercial spot featuring The Bachelor/Bachelorette host Chris Harrison promoting Seagram’s Escapes and its line of alcohol-based fruit drinks. A new addition to the product line is Tropical Rosé, which was co-developed by Harrison and contains natural passion fruit, dragon fruit and rosé flavors.

Valiant’s Vincent Lin directed the piece, which features Harrison in a tropical-looking room — brightened with sunny pinks and yellows thanks to NYC’s Nice Shoes — describing the rosé and signing off with the Seagram’s Escapes brand slogan, “Keep it colorful!”

Here, director Lin — joined by his DP, Alexander Chinnici — talks about the project’s conception, shoot and post.

How early did you get involved? Did Valiant act as the creative agency on this spot?
Valiant has a long-standing history with the Seagram’s Escapes brand team, and we were fortunate enough to have the opportunity to brainstorm a few ideas with them early on for their launch of Seagram’s Escapes Tropical Rosé with Chris Harrison. The creative concept was developed by Valiant’s in-house creative agency, headed by creative directors Nicole Zizila and Steven Zizila, and me. Seagram’s was very instrumental in the creative for the project, and we collaborated to make sure it felt fresh and new — like an elevated evolution of their “Keep It Colorful” campaign rather than a replacement.

Clearly, it’s meant to have a tropical vibe. Was it shot greenscreen?
We had considered doing this greenscreen, which would open up some interesting options, but it would also pose some challenges. What was important for this campaign creatively was to seamlessly take Chris Harrison to the magical world of Seagram’s Escapes Tropical Rosé. A practical approach was chosen so it didn’t feel too “out of this world,” and the live action still felt real and relatable. We had considered putting Chris in a tropical location — either in greenscreen or on location — but we really wanted to play to Chris’ personality and strengths and have him lead us to this world, rather than throw him into it. Plus, they didn’t sign off on letting us film in the Maldives. I tried (smiles).

L-R: Vincent Lin and Alex Chinnici

What was the spot shot on?
Working with the very talented DP Alex Chinnici, he recommended shooting on the ARRI Alexa for many reasons. I’ll let Alex answer this one.

Alex Chinnici: Some DPs would likely answer with something sexier, like, “I love the look!” But that ignores a lot of the technical realities available to us these days. A lot of these cameras are wonderful. I can manipulate the look, so I choose a camera based on other reasons. Without an on-set live, color-capable DIT, I had to rely on the default LUT seen on set and through post. The Alexa’s default LUT is my preference among the digital cameras. For lighting and everyone on the set, we start in a wonderful place right off the bat. Post houses also know it so well, along with colorists and VFX. Knowing our limitations and expecting not to be entirely involved, I prefer giving these departments the best image/file possible.

Inherently, the color, highlight retention and skin tone are wonderful right off the bat without having to bend over backward for anyone. With the Alexa, you end up being much closer to the end rather than having to jump through hoops to get there like you would with some other cameras. Lastly, the reliability is key. With the little time that we had, and a celebrity talent, I would never put a production through the risk of some new tech. Being in a studio, we had full control but still, I’d rather start in a place of success and only make it better from there.

What about the lenses?
Chinnici: I chose the Zeiss Master Primes for similar reasons. While sharp, they are not overbearing. With some mild filtration and very soft and controlled lighting, I can adjust that in other ways. Plus, I know that post will beautify anything that needs it; giving them a clean, sharp image (especially considering the seltzer can) is key.

I shot at a deeper stop to ensure that the lenses are even cleaner and sharper, although the Master Primes do hold up very well wide open. I also wanted the Seagram’s can to be in focus as much as possible and for us to be able to see the set behind Chris Harrison, as opposed to a very shallow depth of field. I also wanted to ensure little to no flares, solid contrast, sharpness across the field and no surprises.

Thanks Alex. Back to you Vincent. How did you work with Alex to get the right look?
There was a lot of back and forth between Alex and me, and we pulled references to discuss. Ultimately, we knew the two most important things were to highlight Chris Harrison and the product. We also knew we wanted the spot to feel like a progression from the brand’s previous work. We decided the best way to do this was to introduce some dimensionality by giving the set depth with lighting, while keeping a clean, polished and sophisticated aesthetic. We also introduced a bit of camera movement to further pull the audience in and to compose the shots in a way that all the focus would be on Chris Harrison to bring us into that vibrant CG world.

How did you work with Nice Shoes colorist Chris Ryan to make sure the look stayed on point? 
Nice Shoes is always one of our preferred partners, and Chris Ryan was perfect for the job. Our creatives, Nicole and Steven, had worked with him a number of times. As with all jobs, there are certain challenges and limitations, and we knew we had to work fast. Chris is not only detail oriented, creative and a wizard with color correction, but also able to work efficiently.

He worked on a FilmLight Baselight system off the Alexa raw files. The color grading really brought out the saturation to further reinforce the brand’s slogan, “Keep It Colorful,” but also to manage the highlights and whites so it felt inviting and bright throughout, but not at all sterile.

What about the VFX? Can you talk about how that was accomplished? 
Much like the camera work, we wanted to continue giving dimensionality to the spot by having depth in each of our CG shots. Not only depth in space but also in movement and choreography. We wanted the CG world to feel full of life and vibrant in order to highlight key elements of the beverage — the flavors, dragonfruit and passionfruit — and give it a sense of motion that draws you in while making you believe there’s a world outside of it. We wanted the hero to shine in the center and the animation to play out as if a kaleidoscope or tornado was pulling you in closer and closer.

We sought the help of creative production studio Taylor James to build the CG elements. We chose to work with a core of 3ds Max artists who could do a range of tasks using Autodesk 3ds Max and Chaos Group’s V-Ray (we also use Maya and Arnold). We used Foundry Nuke to composite all of the shots and integrate the CGI into the footage. The 3D asset creation, animation and lighting were constructed and rendered in Autodesk Maya, with compositing done in Adobe After Effects.

One of the biggest challenges was making sure the live action felt connected to the CG world, but with each still having its own personality. There is a modern and clean feel to these spots that we wanted to uphold while still making it feel fun and playful with colors and movement. There were definitely a few earlier versions that we went a bit crazy with and had to scale down a bit.

Does a lot of your work feature live action and visual effects combined?
I think of VFX like any film technique: It’s simply a tool for directors and creatives to use. The most essential thing is to understand the brand, if it’s a commercial, and to understand the story you are trying to tell. I’ve been fortunate to do a number of spots that involve live-action and VFX now, but truth be told, VFX almost always sneaks its way in these days.

Even if I do a practical effect, there are limitless possibilities in post production and VFX. Anything from simple cleanup to enhancing, compositing, set building and extending — it’s all possible. It’d be foolish not to consider it as a viable tool. Now, that’s not to say you should rely solely on VFX to fix problems, but if there’s a way it can improve your work, definitely use it. For this particular project, obviously, the CG was crucial to let us really be immersed in a magical world at the level of realism and proximity we desired.

Anything challenging about this spot that you’d like to share?
Chris Harrison was terrible to work with and refused to wear a shirt for some reason … I’m just kidding! Chris was one of the most professional, humblest and kindest celebrity talents that I’ve had the pleasure to work with. This wasn’t a simple endorsement for him; he actually did work closely with Seagram’s Escapes over several months to create and flavor-test the Tropical Rosé beverage.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Blackmagic releases Resolve 16.2, beefs up audio post tools

Blackmagic has updated its color, edit, VFX and audio post tool to Resolve 16.2. This new version features major Fairlight updates for audio post as well as many improvements for color correction, editing and more.

This version brings major updates for editing in the Fairlight audio timeline when using a mouse and keyboard. The new edit selection mode unlocks functionality previously available only via the audio editor on the full Fairlight console, so editing is much faster than before. In addition, the edit selection mode makes adding fades and cuts and even moving clips only a mouse click away. New scalable waveforms let users zoom in without adjusting the volume. Bouncing lets customers render a clip with custom sound effects directly from the Fairlight timeline.

Adding multiple clips is also easier, as users can now add them to the timeline vertically, not just horizontally, making it simpler to add multiple tracks of audio at once. Multichannel tracks can now be converted into linked groups directly in the timeline so users no longer have to change clips manually and reimport. There’s added support for frame boundary editing, which improves file export compatibility for film and broadcast deliveries. Frame boundary editing now adds precision so users can easily trim to frame boundaries without having to zoom all the way in the timeline. The new version supports modifier keys so that clips can be duplicated directly in the timeline using the keyboard and mouse. Users can also copy clips across multiple timelines with ease.

Resolve 16.2 also includes support for the Blackmagic Fairlight Sound Library with new support for metadata-based searches, so customers don’t need to know the filename to find a sound effect. Search results also display both the file name and description, so finding the perfect sound effect is faster and easier than before.

MPEG-H 3D immersive surround sound audio bussing and monitoring workflows are now supported. Additionally, improved pan and balance behavior includes the ability to constrain panning.

Fairlight audio editing also has index improvements. The edit index is now available in the Fairlight page and works as it does in the other pages, displaying a list of all media used; users simply click on a clip to navigate directly to its location in the timeline. The track index now supports drag selections for mute, solo, record enable and lock as well as visibility controls so editors can quickly swipe through a stack of tracks without having to click on each one individually. Audio tracks can also be rearranged by clicking and dragging a single track or a group of tracks in the track index.

This new release also includes improvements in AAF import and export. AAF support has been refined so that AAF sequences can be imported directly to the timeline in use. Additionally, if the project features a different time scale, the AAF data can also be imported with an offset value to match. AAF files that contain multiple channels will also be recognized as linked groups automatically. The AAF export has been updated and now supports industry-standard broadcast wave files. Audio cross-fades and fade handles are now added to the AAF files exported from Fairlight and will be recognized in other applications.

For traditional Fairlight users, this update brings major improvements when importing legacy Fairlight projects, including much faster opening of projects with over 1,000 media files.

Audio mixing is also improved. A new EQ curve preset for clip EQ in the inspector allows removal of troublesome frequencies. New FairlightFX filters include a new meter plug-in that adds a floating meter for any track or bus, so users can keep an eye on levels even if the monitoring panel or mixer are closed. There’s also a new LFE filter designed to smoothly roll off the higher frequencies when mixing low-frequency effects in surround.

Working with immersive sound workflows using the Fairlight audio editor has been updated and now includes dedicated controls for panning up and down. Additionally, clip EQ can now be altered in the inspector on the editor panel. Copy and paste functions have been updated, and now all attributes — including EQ, automation and clip gain — are copied. Sound engineers can set up their preferred workflow, including creating and applying their own presets for clip EQ. Plug-in parameters can also be customized or added so that users have fast access to their preferred tool set.

Clip levels can now be changed relatively, allowing users to adjust the overall gain while respecting existing adjustments. Clip levels can also be reset to unity, easily removing any level adjustments that might have previously been made. Fades can also be deleted directly from the Fairlight Editor, making it faster to do than before. Sound engineers can also now save their preferred track view so that they get the view they want without having to create it each time. More functions previously only available via the keyboard are now accessible using the panel, including layered editing. This also means that automation curves can now be selected via the keyboard or audio panel.

Continuing the extensive improvements to Fairlight audio, there have also been major updates to the audio editor transport control. Track navigation is now improved and even works when nothing is selected. Users can navigate directly to the timecode entry window above the timeline from the audio editor panel, and there is added support for high-frame-rate timecodes. Timecode entry now supports values relative to the current CTI location, so the playhead can move along the timeline relative to the position rather than a set timecode.

Support has also been added so the colon key can be used in place of the user typing 00. Master spill on console faders now lets users spill out all the tracks to a bus fader for quick adjustments in the mix. There’s also more precision with rotary controls on the panel and when using a mouse with a modifier key. Users can also change the layout and select either icon or text-only labels on the Fairlight editor. Legacy Fairlight users can now use the traditional — and perhaps more familiar — Fairlight layout. Moving around the timeline is even quicker with added support for “media left” and “media right” selection keys to jump the playhead forward and back.

This update also improves editing in Resolve. Loading and switching timelines on the edit page is now faster, with improved performance when working with a large number of audio tracks. Compound clips can now be made from in and out points so that editors can be more selective about which media they want to see directly in the edit page. There is also support for previewing timeline audio when performing live overwrites of video-only edits. Now when trimming, the duration will reflect the clip duration as users actively trim, so they can set a specific clip length. There is also a new change transition duration dialog.

The media pool now includes metadata support for audio files with up to 24 embedded channels. Users can also duplicate clips and timelines into the same bin using copy and paste commands. There is new support for running the primary DaVinci Resolve screen as a window when dual-screen mode is enabled. Smart filters now let users sort media based on metadata fields, including keywords and people tags, so users can find the clips they need faster.

The-Artery sees red, creates VFX for Huawei’s AppGallery

The-Artery recently worked on a global campaign for agency LH in Israel, and consumer electronics brand Huawei’s official app distribution platform, AppGallery.

The campaign — set to an original musical track called Explore It by artist Tomer Biran — is meant to show the AppGallery as more than a mobile app store, but rather as a gate to an endless world of digital content that comes with data protection and privacy.

Each scene features the platform’s signature red square logo but shown in a variety of creative ways thanks to The-Artery’s visual effects work. This includes floating Tetris-like cubes that change with the beat of the music, camera focuses, red-seated subway cars with a floating red cube and more.

“Director Eli Sverdlov, editor Noam Weissman and executive producer Kobi Hoffman all have distinct artistic processes that are unforgiving to conventional storytelling,” explains founder/executive creative director Vico Sharabani. “We had ongoing conversations about how to create a deeper connection between the brand and audiences. The agency, LH, gave us the freedom to really explore the fun, convenience and security behind downloading apps on the Huawei AppGallery.”

Filming took place across the globe in Kiev, Ukraine, via production company Jiminy Creative Tel Aviv, while editing, design, animation, visual effects and color grading were all done under one roof in The-Artery’s New York studio. The entire production was completed in only 16 days.

The studio used Autodesk’s Flame and 3ds Max, SideFX Houdini, and Adobe’s After Effects and Photoshop for the visual effects and graphics. Colorist Steve Picano called on Blackmagic’s DaVinci Resolve, and Asaf Bitton provided sound design.

Quick Chat: Editing Leap Day short for Stella Artois

By Randi Altman

To celebrate February 29, otherwise known as Leap Day, beer-maker Stella Artois released a short film featuring real people who discover their time together is valuable in ways they didn’t expect. The short was conceived by VaynerMedia, directed by Division7’s Kris Belman and cut by Union partner/editor Sloane Klevin. Union also supplied Flame work on the piece.

The film begins with the words “There is a crisis sweeping the nation” set on a black screen. Then we see different women standing on the street talking about how easy it is to cancel plans. “You’re just one text away,” says one. “When it’s really cold outside and I don’t want to go out, I use my dog excuse,” says another. That’s when the viewer is told, through text on the screen, that Stella Artois has set out to right this wrong “by showing them the value of their time together.”

The scene changes from the street to a restaurant, where friends are reunited for a meal and a goblet of Stella after not seeing each other for a while. When the check comes, the confused diners have questions, and an employee explains that the menu lists prices in minutes, that Leap Day is a gift of 24 hours, and that people should take advantage of it by “uncancelling plans.”

Prior to February 29, Stella encouraged people to #UnCancel plans and catch up with friends over a beer… paid for by the brand. Using the Stella Leap Day Fund — a $366,000 bank of beer reserved exclusively for those who spend time together (there are 366 days in a Leap Year) — people were able to claim as much as a 24-pack when sharing the film using #UnCancelPromo and tagging someone they would like to catch up with.

Editor Sloane Klevin

For the film short, the diners were captured with hidden cameras. Union editor Klevin, who used an Avid Media Composer 2018.12.03 with EditShare storage, was tasked with finding a story in their candid conversations. We reached out to her to find out more about the project and her process.

How early did you get involved in this project, and what kind of input did you have?
I knew I was probably getting the job about a week before they shot. I had no creative input into the shoot; that really only happens when I’m editing a feature.

What was your process like?
This was an incredibly fast turnaround. They shot on a Wednesday night, and it was finished and online the following Wednesday morning at 12am.

I thought about truncating my usual process in order to make the schedule, but when I saw their shooting breakdown for how they planned to shoot it all in one evening, I knew there wouldn’t be a ton of footage. Knowing this, I could treat the project the way I approach most unscripted longform branded content.

My assistant, Ryan Stacom, transcoded and loaded the footage into the Avid overnight, then grouped the four hidden cameras with the sound from the hidden microphones — and, brilliantly, production had time-of-day timecode on everything. The only thing that was tricky was when two tables were being filmed at once. Those takes had to be separated.

The Simon Says transcription software was used to transcribe the short pre and post interviews we had, and Ryan put markers from the transcripts on those clips so I could jump straight to a keyword or line I was searching for during the edit process. I watched all the verité footage myself and put markers on anything I thought was usable in the spot, typing into the markers what was said.
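As a loose illustration of that jump-to-keyword idea (the transcript entries and function here are hypothetical — this is not the Simon Says or Avid API), a marker search might be sketched like this:

```python
# Hypothetical sketch: jump-to-keyword search over a timecoded transcript.
# Each entry pairs a timecode string with the line spoken at that moment.
transcript = [
    ("01:00:05:12", "So good to finally see you again"),
    ("01:02:17:03", "Wait, the prices are in minutes?"),
    ("01:04:41:20", "We should uncancel more plans"),
]

def find_lines(transcript, keyword):
    """Return (timecode, text) entries whose text contains the keyword."""
    kw = keyword.lower()
    return [(tc, text) for tc, text in transcript if kw in text.lower()]

hits = find_lines(transcript, "minutes")
# one hit, at timecode 01:02:17:03
```

The payoff of front-loading this work is the same as with Avid markers: during the edit, a line is one search away instead of a scrub through hours of verité footage.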

How did you choose the footage you needed?
Sometimes the people had conversations that were neither here nor there, because they had no idea they were being filmed, so I skipped that stuff. Also, I didn’t know if the transcription software would be accurate with so much background noise from the restaurant on the hidden table microphones, so marking the footage myself seemed the best option. I used yellow markers for lines I really liked, and red for stuff we might want to be able to find and audition, but those weren’t necessarily my selects. That way I could open the markers tool and read through my yellow selects at a glance.

Once I’d seen everything, I did a music search of Asche & Spencer’s incredibly intuitive, searchable music library website, downloaded my favorite tracks and started editing. Because of the fast turnaround, the agency was nice enough to send an outline for how they hoped the material might be edited. I explored their road map, which was super helpful, but went with my gut on how to deviate. They gave me two days to edit, which meant I could post for the director first and get his thoughts.

Then I spent the weekend playing with the agency and trying other options. The client saw the cut and gave notes on both days I was with the agency, then we spent Monday and Tuesday color correcting (thanks to Mike Howell at Color Collective), reworking the music track, mixing (with Chris Afzal at Wave Studios), conforming and subtitling.

That was a crazy fast turnaround.
Considering how fast the turnaround was, it went incredibly smoothly. I attribute that to the manageable amount of footage, fantastic casting that got us really great reactions from all the people they filmed, and the amount of communication my producer at Union and the agency producer had in advance.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

 

Western Digital intros WD Gold NVMe SSDs  

Western Digital has introduced its new enterprise-class WD Gold NVMe SSDs designed to help small- and medium-sized companies transition to NVMe storage. The SSDs offer power loss protection and high performance with low latency.

The WD Gold NVMe SSDs will be available in four capacities — 0.96TB, 1.92TB, 3.84TB and 7.68TB — in early Q2 of this year. The WD Gold NVMe SSD is designed to be, according to the company, “the primary storage in servers delivering significantly improved application responsiveness, higher throughput and greater scale than existing SATA devices for enterprise applications.”

These new NVMe SSDs complement the recently launched WD Gold HDDs by providing a high-performance storage tier for applications and data sets that require low latency or high throughput.

The WD Gold NVMe SSDs are designed using Western Digital’s silicon-to-system technology, from its 3D TLC NAND SSD media to its purpose-built firmware and own integrated controller. The drives give users peace of mind knowing they’re protected against power loss and that data paths are safe. Secure boot and secure erase provide users with additional data-management protections, and the devices come with an extended five-year limited warranty.

Krista Liney directs Ghost-inspired promo for ABC’s The Bachelor

Remember the Ghost-inspired promo for ABC’s The Bachelor, which first aired during the 92nd Academy Awards telecast? ABC Entertainment Marketing developed the concept and wrote the script, which features current Bachelor lead Peter Weber in a send-up of the iconic pottery scene in Ghost between Demi Moore and Patrick Swayze. It even includes the Righteous Brothers song, Unchained Melody, which played over that scene in the film.

ABC Entertainment Marketing tapped Canyon Road Films to produce and Krista Liney to direct. Liney captured Peter taking off his shirt, sitting down at the pottery wheel and “getting messy” — a metaphor for how messy his journey to love has been. As he starts to mold the clay, he is joined by one set of hands, then another and another. As the clay collapses, Whoopi Goldberg appears to say, “Peter, you in danger, boy” – a take-off of the line she delivers to Moore’s character in the film.

This marks Liney’s first shoot as a newly signed director coming on board at Canyon Road Films, a Los Angeles-based creative production company that specializes in television promos and entertainment content.

Liney has a perspective from the side of the client and the production house, having previously served as a marketing executive on the network side. “With promos, I aim to create pieces that will cut through the clutter and command attention,” she explains. “For me, it’s all about how I can best build the anticipation and excitement within the viewer.”

The piece was shot on an ARRI Alexa Mini with Primes and Optimo lenses. ABC finished the spot in-house.

Other credits include EP Lara Wickes and DP Eric Schmidt.

Sonnet intros USB to 5GbE adapter for Mac, Windows and Linux

Sonnet Technologies has introduced the Solo5G USB 3 to 5Gb Ethernet (5GbE) adapter. Featuring NBASE-T (multigigabit) Ethernet technology, the Sonnet Solo5G adapter adds 5GbE and 2.5GbE network connectivity to an array of computers, allowing for superfast data transfers over the existing Ethernet network cabling infrastructure found in most buildings today.

Measuring 1.5 inches wide by 3.25 inches deep by 0.7 inches tall, the Solo5G is a compact, fanless 5GbE adapter for Mac, Windows and Linux computers. Equipped with an RJ45 port, the adapter supports 5GbE and 2.5GbE (5GBASE-T and 2.5GBASE-T, respectively) connectivity via common Cat 5e (or better) copper cabling at distances of up to 100 meters. The adapter’s USB port connects to a USB-A, USB-C or Thunderbolt 3 port on the computer and is bus-powered for convenient, energy-efficient and portable operation.

Cat 5e and Cat 6 copper cables — representing close to 100% of the installed cable infrastructure in enterprises worldwide — were designed to carry data at only up to 1Gb per second. NBASE-T Ethernet was developed to boost the speed capability well beyond that limit. Sonnet’s Solo5G takes advantage of that technology.

When used with a multigigabit Ethernet switch or a 10Gb Ethernet switch with NBASE-T support — including models from Buffalo, Cisco, Netgear, QNAP, TrendNet and others — the Sonnet adapter delivers performance gains from 250% to 425% the speed of Gigabit Ethernet without a wiring upgrade. When connecting to a multigigabit Ethernet-compatible switch is not possible, the Solo5G also supports 1Gb/s and 100Mb/s link speeds.
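Those percentages are easy to sanity-check with back-of-the-envelope math. Here is a minimal sketch using line rate only, ignoring protocol overhead, which is why Sonnet’s measured 425% tops out below the raw 500% of a 5Gb/s link:

```python
# Back-of-the-envelope transfer times at NBASE-T link speeds.
# Line-rate only: real throughput is lower, which is why measured gains
# are quoted as 250% to 425% rather than the raw 250% to 500%.
def transfer_seconds(size_gb, link_gbps):
    """Seconds to move size_gb gigabytes over a link_gbps link (8 bits per byte)."""
    return size_gb * 8 / link_gbps

size_gb = 100  # a 100GB media folder
for gbps in (1, 2.5, 5):
    secs = transfer_seconds(size_gb, gbps)
    speedup = transfer_seconds(size_gb, 1) / secs
    print(f"{gbps} Gb/s: {secs:.0f}s ({speedup * 100:.0f}% of GbE)")
# 1 Gb/s: 800s, 2.5 Gb/s: 320s, 5 Gb/s: 160s
```

In other words, a transfer that ties up a Gigabit link for more than 13 minutes drops to under three minutes at 5Gb/s, over the same Cat 5e cable.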

Sonnet’s Solo5G includes 0.5-meter USB-C to USB-C and USB-C to USB-A cables for connecting the adapter to the computer, saving users the expense of buying a second cable separately.

The Solo5G USB-C to 5 Gigabit Ethernet adapter is available now for $79.99.

DP Chat: Carnival Row cinematographer Chris Seager

By Randi Altman

For British DP Chris Seager, BSC, his path to movies and television began at film school. After graduation, he found his way to BBC Television’s film department, working on docs and TV movies, including John Schlesinger’s Cold Comfort Farm, starring Kate Beckinsale, Ian McKellen and Rufus Sewell. Soon after, he left the BBC and started his career as a freelance cinematographer.

Chris Seager

Seager’s CV is long, and includes the films A Kind of Murder, Retreat and New in Town and series such as The Alienist, Watchmen and most recently Amazon Prime’s Carnival Row. In fact, Seager, who has been working on Season 2 of the series, received an ASC Award nomination in the non-commercial TV series category for his work on Episode 5, “Grieve No More.”

The show, which stars Orlando Bloom and Cara Delevingne, follows a variety of mythical creatures who had to flee their homeland for the big city during what looks very much like a fantastical Victorian era. As you can imagine, tensions grow between the local humans and these magical immigrants, who are forced to live in a ghetto known as Carnival Row.

We reached out to DP Seager to find out more about the show’s look, his workflow and what inspires him.

Can you talk about the look of the show?
There had been many discussions — between the producers, writers, Legendary, Amazon, the production designer, costume designer, etc. — before I came on board. The production design team had produced some very fine concept drawings that firmly put the show in a, shall we say, “Victorian period.”

That decision led everyone to research that period: the shape, color and design of buildings, sets, costumes and practical lights. For me, that meant the use of candles, oil and gas lamps and the warmth they generated in terms of the color and quality of the light each source emitted. The variety of locations — from The Row exteriors to government buildings, a brothel, bars, upper class establishments and more — gave me many opportunities to use light sources to full effect.

The nighttime streets in The Row showed dark seedy corners and alleyways intermixed with orange dancing flames from braziers, with the street people warming their hands. The streets were awash with rain and mud, horses and carriages, humans, faes and pucks, all fighting their way through the smoke from the fires. It was backlit with a sultry greenish moonlight that gave the images a cinematic quality and brought the viewer into the period. Daylight was slightly cold and threatening and was mixed with the oil lamp warmth on interiors.

How did the showrunners/producers tell you what they wanted for the look?
It all starts with the scripts. I’m a firm believer that it is the words that conjure up the emotions within the script, and in turn they are echoed in the cinematography. And not only the cinematography, but the production design, the costumes, the makeup, the visual effects, the editing.

In prep, I really like to spend time with the director and production designer going through the script page by page. It’s those first conversations that begin to bring life to the script, the mood of the actors in a scene, their emotions, their fear or anger, are they happy or sad. Just gaining that information from the director plants a suggestion of the feel of that particular scene, whether it’s hard shafts of light, high sun, moody sunset, soft silky light or dark dingy light.

A mood begins to be set and a discussion will take place about the use of the camera — whether it’s still, fast-moving, reflective or perhaps angry. Then comes the choice of the lens package. There are many choices and collectively — through the collaboration with the director — a mood and style emerges, which the team can take on board.

How early do you get involved in a production?
The cheeky answer to that is, probably never early enough. In truth, it does depend on the complexity of the production. You nearly always think you need more prep time, but invariably you just about get enough. Most departments in the filmmaking process would say the same.

Certainly, prep time is very important. It’s when you formulate the style, look and feeling of the piece. It’s also when you have time to meet, discuss, ask questions and get the answers that start to put shape to the project. Then you begin to plan scenes with the director and all the relevant departments that make up the team. On Carnival Row’s first season, I got six weeks of prep before we started shooting.

Can you talk about working with the show’s other departments?
One of the joys of being a cinematographer on a production is working with the other creative departments. Collectively, we are all responsible for giving the show a look. My first contact with other departments is usually the locations team and the production designer. Typically, these two teams have been busy before I arrive on the show, so some locations and set designs have already been looked at or even chosen.

During the first few days of my prep, I get up to speed quickly with their ideas and plans. This is often done with meetings with the showrunner, director, production designer, locations, 1st AD and visual effects to talk through the show’s concepts and journey. Then we discuss script requirements regarding locations or set builds and set extensions (CGI).

Do you enjoy working with the design team?
I love it. Lots of sets had to be built during the shooting of Carnival Row. Some were the mainstay sets, like the Constabulary, Spurnrose House, Balefire, Parliament and the boarding house. Then, of course, the backlot street set and numerous location sets, as well as real locations. Six stages were used at Prague’s Barrandov Studio. Discussions with the production designer were mostly about the size of a set, number of windows, entrances and ceilings, or whether to have them or not. And if you do have them, what’s their height, the set texture, color and darkness, etc. Not a day would go by without a discussion with the design team.

I would also talk with the costume and makeup departments about the colors of the costumes and hair styles, all important aspects of the show. There would be lots of “show and tell” with costumes and props. The makeup effects department was a fun place to visit. It was here where the wings of the fae were designed and built, along with the pucks’ horns and numerous dead bodies.

Can you talk about your camera team?
My A camera operator, Jakub Dvorsky, was a dream to work with. He somehow seemed to instantly understand what I would require with a shot. My gaffer, Enzo Cermak, was also exceptional, as was his team of talented friendly electricians. Thanks to Enzo’s help, I was able to effectively paint a scene.

Earlier you mentioned using candles, oil and gas lamps for lighting. Can you dive into that a bit further?
I liked the idea that the poorer streets of The Row had a cold daylight look to them, interspersed with firelight used for roasting chestnuts, cooking food or just for warmth. That, along with smoke from the fires, gave it a particular look. This cold light was used for the interiors scenes as well. For the Burgue, or well-off areas, we used warmer light with more contrast, and windows on sets were larger. This allowed more light in.

What about the nights?
Night shoots on The Row backlot were backlit or cross-lit with a blue/green moonlight, as referenced earlier. I used ¼ or ½ or sometimes full Wendy lights (tungsten) depending on distance from the set, with ½ blue and ¼ plus green gels. Invariably, if I used a ½ Wendy, one section would have ¼ or ½ diffusion filter as well as the moonlight gels. This gave me options to have a softer moonlight if needed, and I also had the ability to switch off sections of the Wendy lights to get the exposure levels that I wanted.

I would also use an LED tube balloon from a crane over the mid sections of the street set to allow a soft top moonlight. On the busy Row streets, I made use of brazier fires where street sellers cooked food. This gave me the warm light that contrasted with the moonlight. I would then add smoke and the scene would be set.

For interiors, I used a mixture of candles and oil lamps and, occasionally, gas lamps. Candles were used in the Haruspex set to great effect, and also the brothel set. Oil lamps were used in houses and the Constabulary.

Real oil lamps?
Some were real oil lamps and others were look-alike oil lamps. For these I used 650W lighting bulbs dimmed down to around 18% and then a slight flicker was added; the result was a very convincing warm glowing oil lamp look.

You mentioned that the show was shot in Prague?
Yes, Carnival Row was shot in the Czech Republic. The production base was at the historic Barrandov Studio in Prague. Locations in and around Prague were also used.

You shot on the ARRI Alexa Mini. How did you go about choosing the right camera and lenses for this project?
Our frame size was a wide-screen format of 1:2.40, and the lens package was the ARRI Master Primes. The cameras gave us 3.2K upscaled to 4K picture quality. The wide-screen format gave us the ability to use the wonderful width of the frame to our advantage. The Master Primes give us the look they are known for: solid frames with little to no distortion and good contrast.

Why the Alexa Mini?
I’m a fan of the ARRI Alexa range of digital cameras. I was always keen to use ARRI film cameras when film was at its height. When digital cameras started to become the vogue, ARRI brought out the D21 camera. It was a heavy, rather large camera with what would be described today as having limited digital prowess. Strangely, I liked the “look” this D21 gave me and used it on quite a few TV shows up until the ARRI Alexa hit the scene.

The Alexa was a game changer for cinematographers. I believe that the Alexa Mini was designed to be used as its name suggests: as a compact camera to be used in tight corners or on weight-limited camera rigs, like Steadicam and stabilized rigs. However, it was soon being used as the studio camera on many productions, and thanks to upgrades over time, it has become my “must-have” camera. It has a wonderful look, and when used in low light, it seems to have a different life. You can push it and pull it into different exciting looks. It’s my friend.

Any scenes that you are particularly proud of?
There are many, but here is an example of one such scene. It’s in Season 1, Episode 5, directed by Andy Goddard. Philo (Orlando Bloom) revisits his childhood orphanage to investigate a murder. Between me and Andy, we set up a series of shots following Philo into his old dormitory as memories of his childhood come flooding back.

With no words spoken, we tracked with him through the numerous beds in this grey stone room, with haze-filled soft light coming through tall soulless windows. This gave the room a monochrome feel — a going-back-in-time look. We then devised a camera dolly shot that tracks and pans with Philo on the dolly and moving with the camera as if he is floating. As we tracked, we panned the camera along with Philo, and that developed into a flashback of him as a child with his friends. We then cut out to a wider shot to reveal just Philo standing alone, and the flashback has disappeared. We used this technique a couple of times within those scenes and they were both telling and subtle.

 

Is this like any other project you’ve worked on?
A short answer to that is no. Working on Game of Thrones (Season 3) is probably the nearest it gets, but Carnival Row is a very different beast. Each episode had a wonderment about it and was very magical in some way. The whole series was written as a fantasy world, set in a supposedly Victorian age, with humans, pucks and fae all thrown together. It was dark, mysterious, dangerous, intriguing and exciting.

How did you become interested in cinematography?
It all started when I was 11 years old. The family TV, one late September night, suddenly went bang, with a cloud of blue smoke and a flash. I was somehow fascinated by this event, and on my 12th birthday, my parents bought me a wonderfully illustrated book on how television worked from the television studio to the home. I was then on a mission to be involved somehow in the TV/film business. Art was one of my favorite subjects at school and my art teacher encouraged me to take up photography alongside my painting.

At 18 years old, off I went to art school to study photography. I enjoyed my first year, but I somehow became more interested in the team-oriented film and TV crowd. I moved from photography to cinematography, and the rest is history.

What inspires you artistically?
The obvious answer to that is art. Paintings inspire me. They always have. The way an artist uses light, shape, form, darkness, color, technique, composition, aspect ratio and sheer size or smallness of a canvas, how depth is created, senses of emotion, fear, happiness. Photography equally inspires me. Black and white versus color and that “decisive moment” when the shot is taken, a magnificent moment.

How do you stay on top of new technology?
Advancing technology comes at you from seemingly every direction today. The speed of that advancement during the 20th and 21st centuries is outstanding. I started shooting film at school with a clockwork wind-up 16mm Bolex camera with just two lenses, and look where we are now. I love the technical revolution. It’s important to embrace it, otherwise it overtakes you.

I seem to always be reading trade magazines to see the new developments. It’s also important to talk with other cinematographers to discuss their views and experiences on new technology.

Looking back, what technology has changed the way you work?
I suppose the biggest game changer, apart from digital cameras, is the advancement in LED light fixtures. For me, to be able to use light fixtures like the ARRI SkyPanel LED range — it offers low power and low-output heat, bi-color capabilities, dimming and effects… plus it has firmware that lets you produce gel colors across the spectrum — is just awesome. Camera and grip technology have also changed. The use of high-quality, small-footprint drones is one example, along with telescopic cranes with stabilized heads and cable camera systems.

Digital cameras have advanced over the last few years too, with 4K and 6K capability along with ISO changes from the base ISO of 800 to 2500 and 5000. There’s also the on-set color grading facility that enables the cinematographer to put his/her “look” on to the dailies.

What are some of your best practices you try to follow on each job?
Being well prepped and turning up on set early every day is a must for me. I have that nervous adrenaline hit within me at the start of the shooting day, a mixture of excitement and just nerves, which is the way I am. As soon as the rehearsal starts, I’m calm… well, mostly.

Getting into the director’s head [so to speak] is also important for me. Finding out what they like or dislike. We have to be a team, and that includes the 1st AD, production designer, operator and many more. It’s important to remember that filmmaking is a team effort, and I, for one, encourage input from my team.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Foundry Nuke 12.1 offers upgrades across product line

Foundry has released Nuke 12.1, with UI enhancements and tool improvements across the entire Nuke family. The largest update to Blink and BlinkScript in recent years improves Cara VR node performance and introduces new tools for developers, while extended functionality in the timeline-based applications speeds up and enriches artist and team review.

Here are the upgrade highlights:
– New Shuffle node updates the classic checkboxes with an artist-friendly, node-based UI that supports up to eight channels per layer (Nuke’s limit) and consistent channel ordering, offering a more robust tool set at the heart of Nuke’s multi-channel workflow.
– Lens Distortion Workflow improvements: The LensDistortion node in NukeX is updated to have a more intuitive workflow and UI, making it easier and quicker to access the faster and more accurate algorithms and expanded options introduced in Nuke 11.
– Blink and BlinkScript improvements: Nuke’s architecture for GPU-accelerated nodes and the associated API can now store data on the GPU between operations, resulting in what Foundry says are “dramatic performance improvements to chains of nodes with GPU caching enabled.” This new functionality is available to developers using BlinkScript, along with bug fixes and a debug print out on Linux.
– Cara VR GPU performance improvements: The Cara VR nodes in NukeX have been updated to take advantage of the new GPU-caching functionality in Blink, offering performance improvements in viewer processing and rendering when using chains of these nodes together. Foundry’s internal tests on production projects show rendering time that’s up to 2.4 times faster.
– Updated Nuke Spherical Transform and Bilateral: The Cara VR versions of the Spherical Transform and Bilateral nodes have been merged with the Nuke versions of these nodes, adding increased functionality and GPU support in Nuke. Both nodes take advantage of the GPU performance improvements added in Nuke 12.1. They are now available in Nuke and no longer require a NukeX license.
– New ParticleBlinkScript node: NukeX now includes a new ParticleBlinkScript node, allowing developers to write BlinkScripts that operate on particles. Nuke 12.1 ships with more than 15 new gizmos, offering a starting point for artists who work with particle effects and developers looking to use BlinkScript.
– QuickTime audio and surround sound support: Nuke Studio, Hiero and HieroPlayer now support multi-channel audio. Artists can now import MOV containers holding audio on Linux and Windows without needing to extract and import the audio as a separate WAV file.

– Faster HieroPlayer launch and Nuke Flipbook integration: Foundry says new instances of HieroPlayer launch 1.2 times faster on Windows and up to 1.5 times faster on Linux in internal tests, improving the experience for artists using HieroPlayer for review. With Nuke 12.1, artists can also use HieroPlayer as the Flipbook tool for Nuke and NukeX, giving them more control when comparing different versions of their work in progress.
– High DPI Windows and Linux: UI scaling when using high-resolution monitors is now available on Windows and Linux, bringing all platforms in line with high-resolution display support added for macOS in Nuke 12.0 v1.
– Extended ARRI camera support: Nuke 12.1 adds support for ARRI formats, including Codex HDE .arx files, ProRes MXFs and the popular Alexa Mini LF. Foundry also says there are performance gains when debayering footage on CUDA GPUs, and there’s an SDK update.

Review: Loupedeck+ for Adobe’s Creative Cloud — a year later

By Mike McCarthy

It has been a little over a year since Loupedeck first announced support for Adobe’s Premiere Pro and After Effects thanks to its Loupedeck+ hardware interface panel. As you might know, Loupedeck was originally designed for Adobe Lightroom users. When Loupedeck was first introduced, I found myself wishing there was something similar for Premiere, so I was clearly pleased when that became a reality.

I was eager to test it and got one before starting editorial on a large feature film back in January. While I was knee-deep in the film, postPerspective’s Brady Betzel wrote a thorough review of the panel and how to use it in Premiere and Lightroom. My focus has been a bit different, working to find a way to make it a bit more user-friendly and looking for ways to take advantage of the tool’s immense array of possible functions.

Loupedeck+

I was looking forward to using the panel on a daily basis while editing the film (which I can’t name yet, sorry) because I would have three months of consistent time in Premiere to become familiar with it. The assistant editor on the film ordered a Loupedeck+ when he heard I had one. To our surprise, both of the panels sat idle for most of the duration of that project, even though we were using Premiere for 12 to 16 hours a day. There are a few reasons for that, from my own experience and perspective:

1) Using Premiere Pro 12 involved a delay — which made the controls, especially the dials, much less interactive — but that has been solved in Premiere 13. Unfortunately, we were stuck in version 12 on the film for larger reasons.

2) That said, even in Premiere 13, every time you rotate a dial, it sends a series of individual commands to Premiere, so one or two adjustments can fill up your undo history. Pressing a dial resets its value to the default, which alleviates the need to undo that adjustment, but what about the other edit I just made before that? Long gone by that point. If you are just color correcting, this limitation isn’t an issue, but if you are alternating between color adjustments and other fixes as you work through a sequence, it is a potential problem.

Loupedeck+

A solution? Limit each adjustment so that it’s seen as a single action until another value is adjusted or until a second or two go by — similar in principle to linear keyframe thinning, when you use sliders to make audio level adjustments.
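That coalescing idea can be sketched in a few lines of Python. This is purely illustrative, not Loupedeck’s or Adobe’s actual code: rapid ticks on the same parameter collapse into one history entry until the parameter changes or a quiet period elapses.

```python
import time

# Sketch: coalesce rapid adjustments to one parameter into a single undo entry.
class UndoCoalescer:
    def __init__(self, timeout=1.5, clock=time.monotonic):
        self.timeout = timeout   # seconds of quiet before a new entry opens
        self.clock = clock       # injectable clock, for testing
        self.history = []        # list of (param, final_value) undo entries
        self._last_param = None
        self._last_time = None

    def adjust(self, param, value):
        now = self.clock()
        same_param = param == self._last_param
        recent = self._last_time is not None and now - self._last_time < self.timeout
        if same_param and recent:
            # Fold this tick into the open entry instead of adding a new one.
            self.history[-1] = (param, value)
        else:
            self.history.append((param, value))
        self._last_param, self._last_time = param, now

fake_time = [0.0]
c = UndoCoalescer(clock=lambda: fake_time[0])
for v in (1, 2, 3):          # three quick dial ticks on "exposure"
    c.adjust("exposure", v)
    fake_time[0] += 0.1
c.adjust("contrast", 10)     # a different parameter closes the entry
# c.history is now [("exposure", 3), ("contrast", 10)]
```

One undo then steps back past the whole exposure twiddle rather than through every individual tick, which is exactly the behavior the dials need.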

3) Lastly, there was the issue of knowing what each button and dial would do, since there are a lot of them (40 buttons and 14 dials), and they are only marked for their functionality in Lightroom. I also couldn’t figure out how to map it to the functions I wanted to use the most (the intrinsic motion effect values).

The first issue will solve itself as I phase out Premiere 12 once this project is complete. The second could be resolved by some programming work by Loupedeck or Adobe, depending on where the limitations lie. Also, adding direct access to more functions in the Loupedeck utility would make it more useful to my workflows — specifically, access to the motion effect values. But all of that hinges on me being able to remember the functions associated with each control, and those functions being more efficient than doing it with my mouse and keyboard.

What solution works for you?

Dedicated Interface vs. Mouse/Keyboard
The Loupedeck has led to a number of interesting debates about the utility of a dedicated interface for editing compared to a mouse and/or keyboard. I think this is a very interesting topic, as the interface between the system and the user is the heart of what we do. The monitor(s) and speakers are the flip side of that interface, completing the feedback loop. While I have little opinion on speakers because most of my work is visual, I have always been very into having the “best” monitor solutions and figuring out exactly what “best” means.

It used to mean two 24-inch WUXGA panels, and then it meant a 30-inch LCD. Then I discovered that two 30-inch LCDs were too much for me to use effectively. Similarly, 4K had too many pixels for a 27-inch screen in Windows 7. An ultrawide 34-inch 3440×1440 is my current favorite, although my 32-inch 8K display is starting to grow on me now that Windows 10 can usually scale content on it smoothly.

Our monitor is how our computer communicates with us, and the mouse and keyboard are how we communicate with it. The QWERTY keyboard is a relic from the typewriter era, designed to be inefficient, to prevent jamming the keys. Other arrangements have been introduced but have not gained widespread popularity. The mouse is a much more flexible analog input device for giving more nuanced feedback. (Keys are only on or off, no in-between.) But it is not as efficient at discrete tasks as a keyboard shortcut, provided that you can remember it.

Keyboard shortcuts

This conundrum has led to debates about the best or most efficient way of controlling applications on the system, and editors have some pretty strong opinions on the matter. I am not going to settle it once and for all, but I am going to attempt to step back and look at the bigger picture. Many full-time operators who have become accustomed to their applications are very fast using their keyboard shortcuts, and Avid didn’t even support mouse editing on the timeline until a few years ago. This leads many of those operators to think that keyboard shortcuts are the most efficient possible method of operating, dismissing the possibility that there might be better solutions. But I am confident that people starting from scratch could be at least as efficient, if not more so, using an interface that was actually designed for what they are doing.

Loupedeck is by no means the first or only option in that regard. I have had a Contour Shuttle Pro 2 for many years and have used it on rare occasions for certain highly repetitive tasks. Blackmagic sells a number of physical interface options for Resolve, including its new editing keyboard, and there have been many others for color correction, which is the focus of the Loupedeck’s design as well.

Shuttle Pro 2

Many people also use tablets or trackballs as a replacement for the mouse, but that is usually more about ergonomics and doesn’t compete with keyboard functionality. These other dedicated interfaces are designed to replace some of the keyboard and mouse functionality, but none of them totally replaces the QWERTY keyboard, as we will still have to be able to type, to name files, insert titles, etc. But that is what a keyboard is designed to do, compared to pressing spacebar for playback or Ctrl+K to add a layer slice. These functions are tasks that have been assigned to the keyboard for convenience, but they are not intrinsically connected to it.

There is no denying the keyboard is a fairly flexible digital input tool, consistently available on nearly all systems and designed to give your fingers lots of easily accessible options. Editors are hardly the only people repurposing it or attempting to use it to maximize efficiency in ways it wasn’t originally designed for. Gamers wear out their WASD keys because their functionality has nothing to do with their letter values and is entirely based on their position on the board. And while other interfaces have been marketed, most gamers are still using a QWERTY keyboard and mouse as their primary physical interface. People are taught the QWERTY keyboard from an early age to develop unconscious muscle memory and, ideally, to allow them to type as they think.

QWERTY keyboard

Once those unconscious links are developed, it is relatively easy to repurpose them for other uses. You think “T” and you press it without thinking about where it is. This is why the keyboard is so efficient as an input device, even outside of the tasks it was originally designed for. But what is preventing people from becoming as efficient with their other physical interfaces? Time with the device and good design are required. Controls have to be able to be identified by touch, without looking, to make that unconscious link possible, which is the reason for the bumps on your F and J keys. But those mental finger mappings may compete with your QWERTY muscle memory, which you are still going to need to be an effective operator, so certain people might be better off sticking with that.

If you are super-efficient with your keyboard shortcuts, and they do practically everything you need, then you are probably not in the target market for the Loupedeck or other dedicated interfaces. If you aren’t that efficient on your keyboard, or you do more analog tasks (color correction) that don’t take place with the discrete steps provided by a keyboard, then a dedicated interface might be more attractive to you. Ironically, my primary temp color tool on my recent film was Lumetri curves, which aren’t necessarily controlled by the Loupedeck.


Mike’s solution

That was more about contrast because “color” isn’t really my thing, but for someone who uses the tools that the Loupedeck is mapped to, I have no doubt it would be much faster than using a mouse and keyboard for those functions. Mapping the dials to position, scale and opacity values improves my workflow; that currently works great in After Effects, especially in 3D, but not in Premiere Pro (yet). Other functions like slipping and sliding clips are mapped to the Loupedeck dials, but they are not marked, making them very hard to learn. My solution to that is to label them.

Labeling the Loupedeck
I like the Loupedeck, but I have trouble keeping track of the huge variety of functions available, with four possible tasks assigned to each dial per application. Obviously, it would help if the functions were fairly consistent across applications, but currently, by default, they are not. Some simple improvements can be made, but not all of the same functions are available, even between Premiere and After Effects. Labeling the controls would be helpful, even just in the process of learning them, but they change between apps, so I don’t want to take a Sharpie to the console itself.

Loupedeck CT

The solution I devised was to make cutouts, which can be dropped over the controls, with the various functions labeled with color-coded text. There are 14 dials, 40 buttons and four lights that I had to account for in the cutout. I did separate label patterns for Premiere, After Effects and Photoshop. They were initially based on the Loupedeck’s default settings for those applications, but I have created custom cutouts that have more consistent functionality when switching between the various apps.

Loupedeck recently introduced the new Loupedeck CT (Creative Tool), which is selling for $550. At more than twice the price, it is half the size and labels the buttons and dials with LCD screens that change to reflect the functions available for whatever application and workspace you might be in. That dynamic labeling contrasts with the much larger set of statically labeled controls available on the cheaper Loupedeck+.


Mike McCarthy is an online editor/workflow consultant with over 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

The Call of the Wild director Chris Sanders on combining live-action, VFX

By Iain Blair

The Fox family film The Call of the Wild, based on the Jack London tale, tells the story of a big-hearted dog named Buck who is stolen from his California home and transported to the Canadian Yukon during the Gold Rush. Director Chris Sanders called on the latest visual effects and animation technology to bring the animals in the film to life. The film stars Harrison Ford and is based on a screenplay by Michael Green.

Sanders’ crew included two-time Oscar–winning cinematographer Janusz Kaminski; production designer Stefan Dechant; editors William Hoy, ACE, and David Heinz; composer John Powell; and visual effects supervisor Erik Nash.

I spoke with Sanders — who has helmed the animated films Lilo & Stitch, The Croods and How to Train Your Dragon — about making the film, which features a ton of visual effects.

You’ve had a very successful career in animation but wasn’t this a very ambitious project to take on for your live-action debut?
It was. It’s a big story, but I felt comfortable because it has such a huge animated element, and I felt I could bring a lot to the party. I also felt up to the task of learning — and having such an amazing crew made all of that as easy as it could possibly be.

Chris Sanders on set.

What sort of film did you set out to make?
As true a version as we could tell in a family-friendly way. No one’s ever tried to do the whole story. This is the first time. Before, people just focused on the last 30 pages of the novel and focused on the relationship between Buck and John Thornton, played by Harrison. And that makes perfect sense, but what you miss is the whole origin story of how they end up together — how Buck has to learn to become a sled dog, how he meets the wolves and joins their world. I loved all that, and also all the animation needed to bring it all alive.

How early on did you start integrating post and all the visual effects?
Right away, and we began with previs.

Your animation background must have helped with all the previs needed on this. Did you do a lot of previs, and what was the most demanding sequence?
We did a ton. In animation it’s called layout, a rough version, and on this we didn’t arrive on set without having explored the sequence many times in previs. It helped us place the cameras and block it all, and we also improvised and invented on set. But previs was a huge help with any heavy VFX element, like when Thornton’s going down river. We had real canoes in a river in Canada with inertial measurement devices and inertial recorders, and that was the most extensive recording we had to do. Later in post, we had to replace the stuntman in the canoe with Thornton and Buck in an identical canoe with identical movements. That was so intensive.


How was it working with Harrison Ford?
The devotion to his craft and professionalism… he really made me understand what “preparing for a role” really means, and he really focused on Thornton’s back story. The scene where he writes the letter to his wife? Harrison dictated all of that to me and I just wrote it down on top of the script. He invented all that. He did that quite a few times. He made the whole experience exciting and easy.

The film has a sort of retro look. Talk about working with DP Janusz Kaminski.
We talked about the look a lot, and we both wanted to evoke those old Disney films we saw as kids — something very rich with a magical storybook feel to it. We storyboarded a lot of the film, and I used all the skills I’d learned in animation. I’d see sequences a certain way, draw it out, and sometimes we’d keep them and cut them into editorial, which is exactly what you do in animation.

How tough was the shoot? It must have been quite a change of pace for you.
You’re right. It was about 50 days, and it was extremely arduous. It’s the hardest thing I’ve ever done physically, and I was not fully prepared for how exhausted you get — and there’s no time to rest. I’d be driving to set by 4:30am every day, and we’d be shooting by 6am. And we weren’t even in the Yukon — we shot here in California, a mixture of locations doubling for the Yukon and stage work.


Where did you post?
All on the Fox lot, and MPC Montreal did all the VFX. We cut it in relatively small offices. I’m so used to post, as all animation is basically post. I wish it was faster, but you can’t rush it.

You had two editors — William Hoy and David Heinz. How did that work?
We sent them dailies and they divided up the work since we had so much material. Having two great voices is great, as long as everyone’s making the same movie.

What were the big editing challenges?
The creative process in editorial is very different from animation, and I was floored by how malleable this thing was. I wasn’t prepared for that. You could change a scene completely in editorial, and I was blown away at what they could accomplish. It took a long time because we came back with over three hours of material in the first assembly, and we had to crush that down to 90 minutes. So we had to lose a huge amount, and what we kept had to be really condensed, and the narrative would shift a lot. We’d take comedic bits and make them more serious and vice versa.

Visual effects play a key role. Can you talk about working on them with VFX supervisor Erik Nash?
I love working with VFX, and they were huge in this. I believe there are less than 30 shots in the whole film that don’t have some VFX. And apart from creating Buck and most of the other dogs and animals, we had some very complex visual effects scenes, like the avalanche and the sledding sequence.

L-R: Director Chris Sanders and writer Iain Blair

We had VFX people on set at all times. Erik was always there supervising the reference. He’d also advise us on camera angles now and then, and we’d work very closely with him all the time. The cameras were hooked up to send data to our recording units so that we always knew what lens was on what camera at what focal length and aperture, so later the VFX team knew exactly how to lens the scenes with all the set extensions and how to light them.

The music and sound also play a key role, especially for Buck, right?
Yes, because music becomes Buck’s voice. The dogs don’t talk like they do in Lion King, so it was critical. John Powell wrote a beautiful score that we recorded on the Newman Stage at Fox, and then we mixed at 5 Cat Studios.

Where did you do the DI, and how important is it to you?
We did it at Technicolor with colorist Mike Hatzer, and I’m pretty involved. Janusz did the first pass and set the table, and then we fine-tuned it, and I’m very happy with the rich look we got.

Do you want to direct another live-action film?
Yes. I’m much more comfortable with the idea now that I know what goes into it. It’s a challenge, but a welcome one.

What’s next?
I’m looking at all sorts of projects, and I love the idea of doing another hybrid like this.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Behind the Title: Blue Bolt VFX supervisor Richard Frazer

“If we have done our job well, the viewers should never notice the work and instead just be enjoying the storytelling.”

Name: Richard Frazer

Company: London’s BlueBolt

Can you describe your company?
For the last four years, I’ve worked at BlueBolt, a Soho-based visual effects company in London. We work on high-end TV and feature films, and our main area of specialty is creating CG environments and populating them. BlueBolt is a privately owned company run by two women, which is pretty rare. They believe in nurturing good talent and training artists up to help them break through the glass ceiling, if an artist is up for it.

What’s your job title?
I joined as a lead compositor with a view to becoming a 2D supervisor, and now I am one of the studio’s core VFX supervisors.

What does that entail?
It means I oversee all of the visual effects work for a specific TV show or movie — from script stage to final delivery. That includes working with the director and DP in preproduction to determine what they would like to depict on the screen. We then work out what is possible to shoot practically, or if we need to use visual effects to help out.

I’ll then often be on the set during the shoot to make sure we correctly capture everything we need for post work. I’ll work with the VFX producer to calculate the costs and time scales of the VFX work. Finally, I will creatively lead our team of talented artists to create those rendered images and make sure it all fits in with the show in a visually seamless way.

What would surprise people the most about what falls under that title?
The staggering amount of time and effort involved by many talented people to create something that an audience should be totally unaware exists. If we have done our job well, the viewers should never notice the work and instead just be enjoying the storytelling.

How long have you been working in VFX?
For around a decade. I started out as a rotoscope artist in 2008 and then became a compositor. I did my first supervisor job back in 2012.

How has the VFX industry changed in the time you’ve been working?
A big shift has been just how much more visual effects work there is on TV shows and how much the standard of that work has improved. It used to be that TV work was looked down on as the poor cousin of feature film work. But shows like Game of Thrones have set audience expectations so much higher now. I worked on nothing but movies for the first part of my career, but the majority of my work now is on TV shows.

Did a particular film inspire you along this path in entertainment?
I grew up on ‘80s sci-fi and horror, so movies like Aliens and The Thing were definitely inspirations. This was back when effects were almost all done practically, so I wanted to get into model-making or prosthetics. The first time I remember being blown away by digital VFX work was seeing Terminator 2 at the cinema. I’ve ended up doing the type of work I dreamed of as a kid, just in a digital form.

Did you go to film school?
No, I actually studied graphic design. I worked for some time doing animation, video editing and motion graphics. I taught myself compositing for commercials using After Effects. But I always had a love of cinema and decided to try specializing in this area. Almost all of what I’ve learned has been on the job. I think there’s no better training than just throwing yourself at the work, absorbing everything you can from the people around you and just being passionate about what you do.

What’s your favorite part of the job?
Each project has its own unique set of challenges, and every day involves creative problem-solving. I love the process of translating what only exists in someone’s imagination and the journey of creating those images in a way that looks entirely real.

I also love the mix of being at the offices one day creating things that only exist in a virtual world, while the next day I might be on a film set shooting things in the real world. I get to travel to all kinds of random places and get paid to do so!

What’s your least favorite?
There are so many moving parts involved in creating a TV show or movie — so many departments all working together trying to complete the task at hand, as well as factors that are utterly out of your control. You have to have a perfectly clear idea of what needs to be done, but also be able to completely scrap that and come up with another idea at a moment’s notice.

If you didn’t have this job, what would you be doing instead?
Something where I can be creative and make things that physically exist. I’m always in awe of people who build and craft things with their hands.

Can you name some recent projects you have worked on?
Recent work has included Peaky Blinders, The Last Kingdom and Jamestown, as well as a movie called The Rhythm Section.

What is the project that you are most proud of?
I worked on a movie called Under the Skin a few years ago, which was a very technically and creatively challenging project. It was a very interesting piece of sci-fi that people seem to either love or hate, and everyone I ask seems to have a slightly different interpretation of what it was actually about.

What tools do you use day to day?
Almost exclusively Foundry Nuke. I use it for everything from drawing up concepts to reviewing artists’ work. If there’s functionality that I need from it that doesn’t exist, I’ll just write Python code to add features.

Where do you find inspiration now?
In the real world, if you just spend the time observing it in the right way. I often find myself distracted by how things look in certain light. And Instagram — it’s the perfect social media for me, as it’s just beautiful images, artwork and photography.

What do you do to de-stress from it all?
The job can be quite mentally and creatively draining and you spend a lot of time in dark rooms staring at screens, so I try to do the opposite of that. Anything that involves being outdoors or doing something physical — I find cycling or boxing are good ways to unwind.

I recently went on a paragliding trip in the French Alps, which was great, but I found myself looking at all these beautiful views of sunsets over mountains and just analyzing how the sunlight was interacting with the fog and the atmospheric hazing. Apparently, I can never entirely turn off that part of my brain.

Kent Zambrana joins design house ATK PLN as senior producer

Dallas-based design studio ATK PLN has added Kent Zambrana as senior producer. Zambrana has over a decade of experience in production, overseeing teams of artists working on live action, animation and design projects. Over the years, he has worked at a number of creative shops across the agency and production sides of the business where he developed media campaigns for omni-channel video ecosystems, interactive projects and future tech.

Says Zambrana, “ATK PLN’s offerings across design, animation and live action fit nicely within my skill set. I’m looking forward to leveraging my production expertise and direct-to-brand and agency perspectives to better serve their clients and continue to grow their offerings.”

Zambrana, who studied radio, television and film at the University of Texas at Austin, started his pro career in Los Angeles, where he spent four years producing work across The Simpsons properties, including the television series, theme park ride, digital platforms and promotional campaigns. He brought that experience with him and returned to Texas, where he became a supervising producer at Invodo, building out the video content library of the startup’s digital video platform. He then landed as head of production, producing animation, live action, 2D and 3D work. When the company was acquired by Industrial Color in 2018, he led both the Dallas and Austin offices as senior director of production before joining The Marketing Arm to lead their in-house production shop.

How does Zambrana relax? He can be found rehearsing, recording and performing with his indie pop band, Letting Up Despite Great Faults.

Video Coverage: HPA Tech Retreat’s making of The Lost Lederhosen

By Randi Altman

At the HPA Tech Retreat in Rancho Mirage, California, the Supersession was a little different this year. Under the leadership of Joachim (JZ) Zell — who you might know from his day job as VP of technology at EFILM — the Supersession focused on the making of the short film, The Lost Lederhosen, in “near realtime,” in the desert. And postPerspective was there, camera in hand, to interview a few of the folks involved.

Watch our video coverage here.

While production for the film began a month before the Retreat — with Steve Shaw, ASC, directing and DP Roy H. Wagner Jr., ASC, lending his cinematography talents — some scenes were shot the morning of the session with data transfer taking place during lunch and post production in the afternoon. Peter Moss, ASC, and Sam Nicholson, ASC, also provided their time and expertise. After an active day of production, cloud-based post and extreme collaboration, the Supersession ended with the first-ever screening of The Lost Lederhosen, the story of Helga and her friend Hans making their way to Los Angeles, Zell and the HBA (Hollywood Beer Alliance). Check out HPA’s trailer here.

From acquisition to post (and with the use of multiple camera formats, frame rates and lenses), the film’s crew was made up of volunteers, including creatives and technologists from companies such as AWS, Colorfront, Frame.io, Avid, Blackmagic, Red, Panavision, Zeiss, FilmLight, SGO, Stargate, Unreal Engine, Sohonet and many more. One of the film’s goals was to use the cloud as much as possible in order to test out that particular workflow. While there were some minor hiccups along the way, the film got made — at the HPA Tech Retreat — and these industry pros got smarter about working in the cloud, something that will be increasingly employed going forward.

While we were only able to chat with a handful of those pros involved, like any movie, the list of credits and thank-yous is too extensive to mention here — there were dozens of individuals and companies who donated their services and time to make this possible.

Watch our video coverage here.

(A big thank you and shout out to Twain Richardson for editing our videos.)

Main Image Caption: AWS’ Jack Wenzinger and EFILM’s Joachim Zell

Matt Shaw on cutting Conan Without Borders: Ghana and Greenland

By Randi Altman

While Conan O’Brien was airing his traditional one-hour late night talk show on TBS, he and his crew would often go on the road to places like Cuba, South Korea and Armenia for Conan Without Borders — a series of one-hour specials. He would focus on regular folks, not celebrities, and would embed himself into the local culture… and there was often some very mediocre dancing, courtesy of Conan. The shows were funny, entertaining and educational, and he enjoyed doing them.

Conan and Matt on the road.

In 2019, Conan and his crew, Team Coco, switched the nightly show from one hour to a new 30-minute format. The format change allowed them to produce three to four hour-long Conan Without Borders specials per year. Two of the places the show visited last year were Ghana and Greenland. As you might imagine, they shoot a lot of footage, which all must be logged and edited, often while on the road.

Matt Shaw is one of the editors on Conan, and he went on the road with the show when it traveled to Greenland. Shaw’s past credits include Deon Cole’s Black Box and The Pete Holmes Show (both from Conan O’Brien’s Conaco production company) and The Late Late Show with James Corden (including Carpool Karaoke). One of his first gigs for Team Coco was editing Conan Without Borders: Made in Mexico. That led to a full-time editing gig on Conan on TBS and many fun adventures.

We reached out to Shaw to find out more about editing these specials and what challenges he faced along the way.

You recently edited Conan Without Borders — the Greenland and Ghana specials. Can you talk about preparing for a job like that? What kind of turnaround did you have?
Our Ghana special was shot back in June 2019, with the original plan to air in August, but it was pushed back to November 7 because of how fast the Greenland show came up.

In terms of prep for a show like Ghana, we mainly just know the shooting specs and will handle the rest once the crew actually returns. For the most part, that’s the norm. Ideally, we’ll have a working dark week (no nightly Conan show), and the three editors — me, Rob Ashe and Chris Heller — will take the time to offload, sync and begin our first cuts of everything. We’ll have been in contact with the writers on the shoot to get an idea of what pieces were shot and their general notes from the day.

With Greenland, we had to mobilize and adjust everything to accommodate a drastically different shoot/delivery schedule. The Friday before leaving, while we were prepping the Ghana show to screen for an audience, we heard there might be something coming up that would push Ghana back. On Monday, we heard the plan was to go to Greenland on Wednesday evening, after the nightly show, and turn around Greenland in place of Ghana’s audience screening. We had to adjust the nightly show schedule to still have a new episode ready for Thursday while we were in Greenland.

How did you end up on the Greenland trip?
Knowing we’d have only six days from returning from Greenland to finishing the show for broadcast, our lead editor, Rob Ashe, suggested we send an editor to work on location. We were originally looking into sending footage via Aspera from a local TV studio in Nuuk, Greenland, but we just wouldn’t have been able to turn it around fast enough. We decided about two days before the trip began that I’d go and do what I could to offload, backup, sync and do first cuts on everything.

How much footage did you have per episode, and what did they shoot on?
Ghana had close to 17 hours of material shot over five days on Sony Z450s at 4K XAVC, 29.97. Greenland was closer to 12 hours shot over three days on Panasonic HPX 250s, P2 media recording at 1080 60i.

We also used iPhone/iPad/GoPro footage picked up by the rest of the crew as needed for both shows. I also had a DJI Osmo pocket camera to play with when I had a chance, and we used some of that footage during the montage of icebergs.

So you were editing segments while they were still shooting?
In Greenland, I was cutting daily in the hotel. Midday, I’d get a drop of cards, then offload, sync/group and do first cuts on everything. We had a simple offline edit workflow set up where I’d upload my cuts to Frame.io and email my project files to the team — Rob and Chris — in Burbank. They would then download and sync the Frame.io file to a top video layer in the timeline and continue cutting down, with any additional notes from the writers.

Generally, I’d have everything from Day One uploaded by the start of Day Two, etc. It seemed to work out pretty well to set us up for success when we returned. I was also getting notes on requests to help cut a few highlights from our remotes and to put on Team Coco’s Instagram account.

On our return day, we flew to Ilulissat for an iceberg expedition. We had about two hours on the ground before having to return to the airport and fly to Kangerlussuaq, where our chartered plane was waiting to take us back to California. On the flight back, I worked for another four hours or so to sort through the remaining segments and prep everything so we could hit the ground running the following morning. During the flight home, we screened some drone footage from the iceberg trip for Conan, and it really got everyone excited.

What are the challenges of working on the road and with such tight turnarounds?
The night we left for Greenland was preceded by a nightly show in Burbank. After the show ended, we hopped on a plane to fly eight hours to Kangerlussuaq for customs, then another to Nuuk. The minute we landed, we were filming for about three hours before checking into the hotel. I grabbed the morning’s camera cards, went to my room and began cutting. By the time I went to bed, I had cuts done of almost everything from the first day. I’m a terrible sleeper on planes, so the marathon start was pretty insane.

Outside of the lack of sleep, our offload speeds were slower because we were using different cameras than usual, chosen for the sake of traveling lighter; the plane we flew in had specific weight restrictions. We actually had to hire local crew for audio and B and C camera because there wasn’t enough room for everyone in the plane to start.

In general, I think the overall trip went as smoothly as it could have. It would be interesting to see how it would play out for a longer shoot schedule.

What editing system did you use? What was your setup like? What kind of storage were you using?
On the road I had my MacBook Pro (2018 model), and we rented an identical backup machine in case mine died. For storage, we had four 1TB G-Tech USB-C drives and a 4TB G-RAID to back everything up. I had a USB-3.0 P2 card reader as well and multiple backup readers. A Bluetooth mouse and keyboard rounded out the kit, so I could travel with everything in a backpack.
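Shaw doesn’t describe the specific offload tooling he used, but the offload-and-backup step he mentions is typically protected by checksum verification: hash every file on the card, hash its copy on the backup drive, and only format the card once they match. A minimal sketch of that idea in Python — the function names and directory layout here are hypothetical, not Shaw’s actual setup:

```python
import hashlib
from pathlib import Path


def file_sha256(path, chunk_size=1 << 20):
    """Hash a file in 1MB chunks so large camera files don't fill memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


def verify_offload(card_dir, backup_dir):
    """Compare every file on the card against its copy in the backup.

    Returns a list of relative paths that are missing from the backup
    or whose contents don't match the card.
    """
    card_dir, backup_dir = Path(card_dir), Path(backup_dir)
    problems = []
    for src in sorted(card_dir.rglob("*")):
        if not src.is_file():
            continue
        rel = src.relative_to(card_dir)
        dst = backup_dir / rel
        if not dst.is_file() or file_sha256(src) != file_sha256(dst):
            problems.append(str(rel))
    return problems
```

In a kit like the one described, each card drop would be copied to both the working G-Tech drive and the G-RAID, then verified against each before the card goes back into rotation; an empty list from `verify_offload` means the copy is safe to cut from.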

We had to charter a plane in order to fly directly to Greenland. With such a tight turnaround between filming and delivering the actual show, this was the only way to actually make the special happen. Commercial flights fly only a few days per week out of neighboring countries, and once you’re in Greenland, you either have to fly or take a boat from city to city.

Matt Shaw editing on plane.

On the plane, there was a conference table in the back, so I set up there with one laptop and the G-RAID to continue working. The biggest trouble on the plane was making sure everything stayed secure on the table while taking off and making turns. There were a few close calls when everything started to slide away, and I had to reach to make sure nothing was disconnected.

How involved in the editing is Conan? What kind of feedback did you get?
In general, if Conan has specific notes, the writers will hear them during or right after a shoot is finished. Or we’ll test-screen something after a nightly show taping and indirectly get notes from the writers then.

There will be special circumstances, like our cold opens for Comic-Con, when Conan will come to edit and screen a close-to-final cut. And there just might be a run of jokes that isn’t as strong, but he lets us work with the writers to make what we all think is the best version by committee.

Can you point to some of the more challenging segments from Greenland or Ghana?
The entire show is difficult with the delivery-time constraints while handling the nightly show. We’ll be editing the versions for screening sometimes up to 10 minutes before they have to screen for an audience as well as doing all the finishing (audio mix, color as needed, subtitling and deliverables).

For any given special, we’re each cutting our respective remotes during the day while working on any new comedy pieces for that day’s show, then one or two of us will split the work on the nightly show, while the other keeps working with the travel show writers. In the middle of it all, we’ll cut together a mini tease or an unfinished piece to play into that night’s show to promote the specials, so the main challenge is juggling 30 things at a time.

For me, I got to edit this 1980s-style action movie trailer based on an awesome poster Conan had painted by a Ghanaian artist. We had puppets built, a lot of greenscreen and a body double to composite Conan’s head onto for fight scenes. Story-wise, we didn’t have much of a structure to start, but we had to piece something together in the edit and hope it did the ridiculous poster justice.

The Thursday before our show screened for an audience was the first time Mike Sweeney (head writer for the travel shows) had a chance to look at any of the greenscreen footage, and he knew we were test-screening it the following Monday or Tuesday. It started to take shape when one of our graphics/VFX artists, Angus Lyne, sent back some composites. In the end, it came together great and killed with both the audience and our staff, who had already seen anything and everything.

Our other pieces tend to have a linear story, and we try to build the best highlights from any given remote. With something like this trailer, we have to switch our thought process to really build something from scratch. In the case of Greenland and Ghana, I think we put together two really great shows.

How challenging is editing comedy versus drama? Or editing these segments versus other parts of Conan’s world?
In a lot of the comedy we cut, the joke is king. There are always instances when we have blatant continuity errors, jump cuts, etc., but we don’t have to kill ourselves trying to make it work in the moment if it means hurting the joke. Our “man on the street” segments are great examples of this. Obviously, we want something to be as polished and coherent as possible, but there are cases when it just isn’t best, in our opinion, and that’s okay.

That being said, when we do our spoofs of an ad or try to recreate a specific style, we’re going to do everything to make that happen. We recently shot a bit with Nicholas Braun from Succession, where he’s trying to get a job from Conan during his hiatus. This was a mix of improv and scripted material, and we had to match the look of that show. It turned out funny and very much in the vein of Succession.

What about for the Ghana show?
For Ghana, we had a few segments that were extremely serious and emotional. For example, Conan and Sam Richardson visited Osu Castle, a major slave trade port. This segment demands care and needs to breathe so the weight of it can really be expressed, versus earlier in the show, when Conan was buying a Ghana shirt from a street vendor, and we hard-cut to him wearing a shirt 10 sizes too small.

And Greenland?
Greenland is a place deeply affected by climate change. My favorite segment I’ve cut on these travel specials is about the impact the melting ice caps could have on the world. Then there is a montage of the icebergs we saw, followed by Conan attempting to stake a “Sold” sign on an iceberg, signifying he had bought property in Greenland for the US. Originally, the montage had a few jokes within it, but we quickly realized it’s so beautiful we shouldn’t cheapen it. We just let it be beautiful.

Comedy or drama, it’s really about being aware of what you have in front of you and what the end goal is.

What haven’t I asked that’s important?
For me, it’s important to acknowledge how talented our post team is to be able to work simultaneously on a giant special while delivering four shows a week. Being on location for Greenland also gave me a taste of the chaos the whole production team and Team Coco go through, and I think everyone should be proud of what we’re capable of producing.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

The Den editorial boutique launches in Los Angeles

Christjan Jordan, editor of award-winning work for clients including Amazon, GEICO and Hulu, has partnered with industry veteran Mary Ellen Duggan to launch The Den, an independent boutique editorial house in Los Angeles.

Over the course of his career, Jordan has worked with Arcade Edit, Cosmo Street and Rock Paper Scissors, among others. He has edited such spots as Alexa Loses Her Voice for Amazon, Longest Goal Celebration Ever for GEICO, #notspecialneeds for World Down Syndrome Day out of Publicis NY and Super Bowl 2020 ads Tom Brady’s Big Announcement for Hulu and Famous Visitors for Walmart. Jordan’s work has been recognized by the Cannes Lions, AICE, AICP, Clio, D&AD, One Show and Sports Emmy awards.

“With Mary Ellen, agency producers are guided by an industry veteran who knows exactly what agencies and clients are looking for,” says Jordan. “And for me, I love fostering young editors. It’s an interesting time in our industry, and there is a lot of fresh creative talent.”

In her career, Duggan has headed production departments at both KPB and Cliff Freeman on the East Coast and, most recently, Big Family Table in Los Angeles. In addition, she has freelanced all over the country.

“The stars aligned for Christjan and me to work together,” says Duggan. “We had known each other for years and had recently worked on a Hulu campaign together. We had a similar vision for what the editorial experience should be: a high-end boutique editorial house that is nimble, has a roster of diverse talent and a real family vibe.”

Veteran producer Rachel Seitel has joined as partner and head of business development. The Den will be represented by Diane Patrone at The Family on the East Coast and by Ezra Burke and Shane Harris on the West Coast.

The Den’s founding roster also features editor Andrew Ratzlaff and junior editor Hannelore Gomes. The staff works on Avid Media Composer and Adobe Premiere.

ILM’s virtual production platform used on The Mandalorian

To bring The Mandalorian to life, Industrial Light & Magic (ILM) and Epic Games — along with production technology partners Fuse, Lux Machina, Profile Studios, Nvidia and ARRI — have introduced a new way to shoot VFX-heavy projects in collaboration with Jon Favreau’s Golem Creations.

The new virtual production workflow allows filmmakers to capture a significant amount of complex visual effects shots in-camera using realtime game engine technology (Epic’s Unreal Engine) and LED screens to represent dynamic photoreal digital landscapes and sets with creative flexibility previously unimaginable.

Also part of the news, ILM has made its new end-to-end virtual production solution, ILM StageCraft, available for use by filmmakers, agencies and showrunners worldwide.

Over 50 percent of The Mandalorian Season 1 was filmed using this new methodology, eliminating the need for location shoots entirely. Instead, actors performed inside a massive, immersive LED volume: a 20-foot-high, 270-degree semicircular video wall and ceiling surrounding a 75-foot-diameter performance space, where practical set pieces were combined with digital extensions on the screens.

Digital 3D environments created by ILM played back interactively on the LED walls and could be edited in realtime during the shoot. Pixel-accurate camera tracking enabled perspective-correct 3D imagery, rendered at high resolution via systems powered by Nvidia GPUs.

L-R: Jon Favreau and Richard Bluff

The environments were lit and rendered from the perspective of the camera to provide real-time parallax, as if the camera were really capturing the physical environment, with accurate interactive light on the actors and practical sets. This gave showrunner Favreau, executive producer/director Dave Filoni, visual effects supervisor Richard Bluff, cinematographers Greig Fraser and Barry “Baz” Idoine, and the episodic directors the ability to make concrete creative choices for visual effects-driven work during photography and to achieve realtime in-camera composites on set.
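ILM hasn’t published StageCraft’s internals, but rendering “from the perspective of the camera” against a fixed screen is classically done with a generalized (off-axis) perspective projection: the tracked camera position skews the view frustum so the wall displays exactly what the camera would see through it, producing correct parallax. A minimal sketch of that idea, assuming a planar screen and hypothetical corner/eye coordinates:

```python
import math

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])
def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def offaxis_frustum(pa, pb, pc, pe, near):
    """Frustum extents (l, r, b, t) at the near plane for a camera at `pe`
    facing a planar screen with corners pa (lower-left), pb (lower-right)
    and pc (upper-left). As the camera moves, the frustum skews, which is
    what produces correct parallax on a fixed LED wall."""
    vr = normalize(sub(pb, pa))    # screen right axis
    vu = normalize(sub(pc, pa))    # screen up axis
    vn = normalize(cross(vr, vu))  # screen normal, toward the camera
    va, vb, vc = sub(pa, pe), sub(pb, pe), sub(pc, pe)
    d = -dot(va, vn)               # camera-to-screen distance
    l = dot(vr, va) * near / d
    r = dot(vr, vb) * near / d
    b = dot(vu, va) * near / d
    t = dot(vu, vc) * near / d
    return l, r, b, t
```

A camera centered in front of the screen yields a symmetric frustum; move the camera sideways and the frustum becomes asymmetric, shifting the rendered imagery on the wall the way a real window would.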

The technology and workflow required to make in-camera compositing and effects practical for on-set use combined the ingenuity of all the partners involved.

“We’ve been experimenting with these technologies on my past projects and were finally able to bring a group together with different perspectives to synergize film and gaming advances and test the limits of realtime, in-camera rendering,” explains Favreau, adding, “We are proud of what was achieved and feel that the system we built was the most efficient way to bring The Mandalorian to life.”

“Merging our efforts in the space with what Jon Favreau has been working toward using virtual reality and game engine technology in his filmmaking finally gave us the chance to execute the vision,” says Rob Bredow, executive creative director and head of ILM. “StageCraft has grown out of the culmination of over a decade of innovation in the virtual production space at ILM. Seeing our digital sets fully integrated, in real time on stage, providing the kind of in-camera shots we’ve always dreamed of while also providing the majority of the lighting was really a dream come true.”

Bluff adds, “Working with Kim Libreri and his Unreal team, Golem Creations, and the ILM StageCraft team has opened new avenues to both the filmmakers and my fellow key creatives on The Mandalorian, allowing us to shoot principal photography on photoreal, virtual sets that are indistinguishable from their physical counterparts while incorporating physical set pieces and props as needed for interaction. It’s truly a game-changer.”

ILM StageCraft’s production tools combine traditional filmmaking equipment and methodologies with the advantages of a fully digital workflow. With StageCraft, a production can acquire many in-camera finals, giving filmmakers immediate and complete creative control of work typically handed off and reinterpreted in post. This improves the quality of visual effects shots through perfectly integrated elements and reduces the visual effects work left for post, a major benefit considering today’s compressed schedules.