Killing Eve EP Sally Woodward Gentle talks Season 3

By Iain Blair

Killing Eve is more than just one of the most addictive spy thrillers on TV. It’s also a dark comedy, a workplace drama and a globe-trotting actioner that tells the story of two women engaged in an epic game of cat-and-mouse — Eve (Sandra Oh), head of a secret MI6 unit, and Villanelle (Jodie Comer), a beautiful (and strangely likeable) psychopathic assassin whom Eve has been tasked to track down.

Sally Woodward Gentle

The award-winning show continues the story of the two women when it returns for its third season on April 26 on both BBC America and AMC. Season 3 sees Eve back in action after having survived being shot by Villanelle in the Season 2 finale. Eve is now in Rome, and her current MI6 status is in flux after she was manipulated by Carolyn (Fiona Shaw).

Continuing the show’s tradition of passing the baton to a new female writing voice, for Season 3 Suzanne Heathcote serves as lead writer and executive producer, joining executive producers Sally Woodward Gentle, Phoebe Waller-Bridge (who was brought on by Gentle as head writer for Season 1), Lee Morris, Gina Mingacci, Damon Thomas, Jeff Melvoin and Sandra Oh. Killing Eve is produced by Sid Gentle Films and is distributed by Endeavor Content.

I recently spoke with EP Sally Woodward Gentle — the BAFTA-winning and Emmy-nominated EP of the dramas Any Human Heart and The Durrells in Corfu — about making the show (based on the books of Luke Jennings) and her workflow.

What can you tell us about Season 3 without giving too much away?
It’s a much more emotional season. We move Eve and Villanelle’s relationship on, and we get to see more of what Villanelle is really about. At the same time, Eve is really tested. And we bring in lots of new characters, which is very exciting, and Carolyn and Konstantin (Villanelle’s handler, played by Kim Bodnia) have got huge roles to play.

The appeal of two women leads seems obvious now, but were there doubts at first having them play traditional male roles when you first optioned the Luke Jennings novellas?
Not really. In fact, it didn’t even cross my mind. I just felt that people really enjoy having a female assassin, and that it would be great to have another woman chase her. And I didn’t feel that the idea was wildly original. It just felt right, but I knew there were other female assassin shows out there, and I didn’t want people to go, “Oh, there’s La Femme Nikita.” I did feel it was time to do something bolder with it.

Is that how you decided to involve Phoebe Waller-Bridge?
Exactly. I’d read Fleabag and we’d had a meeting, and I just loved her attitude. Back then, she’d only done Fleabag and written some very clever comedy. I loved the idea of putting Luke’s novellas together with her attitude, joie de vivre and love of TV and what it could do. It didn’t feel like, “Wow, this will be earth-shatteringly different!” It just felt like something really interesting to do. Just do it and see what comes out.

You’ve executive produced all three seasons. What are the main challenges of this show?
To keep it feeling really fresh each year, and to not repeat stuff you’ve done. To examine new, different areas of emotional relationships, and to put people under different types of stress. The other big challenge is that we have to turn it around from start to finish in just one year. We have to write all the scripts, shoot them, post them and get them out there in that time. It’s really hard work, both physically and mentally, but a lot of our team’s been here since the start. They love it, and that really helps, and everyone wants to push it a bit harder every season, so we embrace all new ideas.

Is it true that when Sandra Oh was first approached, she didn’t quite believe she was being cast as Eve?
Yeah, she hadn’t pictured herself in the role, but she’s brilliant.

What do Sandra and Jodie bring to the mix?
They bring so much. We were still finishing scripts for Season 1 as we shot, so you can’t help but feed their input into the scripts, and the characters really have so much of the actors’ DNA. They just know them so well and how they’d respond.

You always use great locations. Where did you shoot Season 3?
In Spain, Romania, the UK. We get around!

Where do you post?
All at Molinare in London. We do everything from the edit to the grade, and we do all the sound at Hackenbacker, which is part of Molinare.

Do you like the post process?
I love it and really enjoy it. We have a great post supervisor, Kate Stannard, who’s been on the show since the start. The great thing about post is that you get to rewrite all the raw material and be really creative with it.

Talk about editing. You have several editors, I assume because of the time factor. How does that work?
We have a great team of editors, including Dan Crinnion, who’s been on the show since the start, and an Italian editor, Simone Nesti, who does the assembly and who’s also been with us since the start. As soon as a director has finished shooting, we get right in there with the editor who does that block. We shoot in blocks of two episodes, and each block has its own editor and assistants. It’s not a huge team considering the amount of work.

The show is a real genre mash-up – thriller, comedy, action, emotional drama. How do you handle all the shifting tones?
That’s the big editing challenge and the thing we were most concerned about in Season 1  — that Eve’s and Villanelle’s two stories were too disparate to be knitted together properly. But once you start to get a feel for what the show is and what works and what doesn’t, it flows more easily. For me, if it gets too broad and it doesn’t feel truthful, that’s not good. But then it’s also a big piece of entertainment, so you can be really wild with it. We’re not saving lives, and there’s no massive message. We just want to be truthful about human behavior and be very entertaining.

There’s obviously a lot of attention also paid to the sound and the music.
Thanks for noticing. We have a great production sound team. Nigel Heath is our rerecording mixer, and our aim is not to have the dialogue too clean and out front. We like to keep a lot of texture in the background and make it feel quite immersive. Then in terms of the music, composer David Holmes is quite bold in his choices. That’s very tricky, as we try hard not to be too genre and obvious with the cues, so they don’t just reinforce the visuals and what you should feel. So at a very dark moment the score might be quite celebratory and glorious. We’re constantly trying to flip it.

What about the DI?
It’s incredibly important, and our colorist Gareth Spensley has worked on it since Season 1 so he knows the show really well, and he works very closely with our DP Julian Court, who’s done most of the episodes since the start. Sometimes we have to shoot out of sequence, at different times of year, so you have to match all that. We try to find locations that feel fresh and exciting, and then we try hard not to overstylize the look and keep all the skin tones as natural and real as possible, and then enhance the beauty of the rest of it. At the very start of the show, we thought of pushing the look to get a more “noir” look, but it just didn’t feel right, so we just leant more into the pleasure of the visuals.

How important are the Emmys to a show like this?
Hugely important, for both the show and the actors. I think it’s given us far more visibility.

I heard you already got picked up for Season 4. How far along are you with it?
We’re already writing and nearly have the whole season arc worked out, and we’ll start shooting it at the end of September. Of course, it all depends on what happens with the COVID-19 crisis, but that’s the plan.

Will you do more seasons?
I can see us going on as long as we keep refreshing it and move their relationship along. Then we have all these new characters we’ve created who’ll be there in Season 4 and beyond, so there’s plenty to explore.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Facilis updates storage, MAM offerings for remote workflows

Targeting those who are working remotely, Facilis has released v.8 of its Hub shared storage system and v.3.5 of its FastTracker media asset management software, as well as updates to Object Cloud. Facilis has made these available immediately and free of charge for any eligible existing user with a current software support contract.

Hub v.8 includes:
• Bandwidth Priority delivers full throughput to all workstations during normal operation but prioritizes workstations to maintain greater throughput when the server enters a high-load condition. This priority setting is dynamic and can affect client performance within seconds of being applied.
• SSD and HDD tiering offers dedicated speed for projects needing SSD-level performance, while maintaining a perpetual HDD-based mirror.
• Software-defined Multi-disk Parity can be enabled for up to four drive failures per drive group on a per-virtual-volume basis. This allows owners of aging systems to better protect their assets from data loss due to drive failure.

Facilis FastTracker MAM software features file movement profiles, duplicate file reporting and a secure directory browse interface. FastTracker can now flush and pre-fetch files and folders from cloud and LTO locations through the Object Cloud feature, while reporting the status of archived media. With proxy encoding of indexed assets to an Object Cloud location, FastTracker offers compressed versions of facility media to editors working in the field.

According to the company’s COO, Shane Rodbourn, this release represents over two years of development.

COVID-19: How our industry is stepping up

We’ve been using this space to talk about how companies are discounting products, raising money and introducing technology to help with remote workflows, and about how pros are personally pitching in.

Here are this week’s updates followed by what we’ve gathered to date:

FXhome 
To support those who are putting their lives on the line to provide care and healing to those impacted by the global pandemic, FXhome is adding Partners In Health, Doctors Without Borders and the Center for Disaster Philanthropy as new beneficiaries of the FXhome “Pay What You Want” initiative.

Pay What You Want is a goodwill program inspired by the HitFilm Express community’s desire to contribute to the future development of HitFilm Express, the company’s free video editing and VFX software. Through the initiative, users can contribute financially, and those funds will be allocated for future development and improvements to HitFilm. Additionally, FXhome is contributing a percentage of the proceeds to organizations dedicated to global causes important to the company and its community. The larger the contribution from customers, the more FXhome will donate.

Besides adding the three new health-related beneficiaries, FXhome has extended its campaign to support each new cause from one month to three months, beginning in April and running through the end of June. A percentage of all revenue generated during this period will be donated to each cause.

COVID-19 Film and TV Emergency Relief Fund
Created by The Film and TV Charity in close partnership with the BFI, the new COVID-19 Film and TV Emergency Relief Fund provides support to the many thousands of active workers and freelancers who have been hit hardest by the closure of productions across the UK. The fund has received initial donations totaling £2.5 million from Netflix, the BFI, BBC Studios, BBC Content, WarnerMedia and several generous individuals.

It is being administered by The Film and TV Charity, with support from BFI staff. The Film and TV Charity and the BFI are covering all overheads, enabling donations to go directly to eligible workers and freelancers across film, TV and cinema. One-off grants of between £500 and £2,500 will be awarded based on need. Applications for the one-off grants can be made via The Film and TV Charity’s website. The application process will remain open for two weeks.

The Film and TV Charity also has a new COVID-19 Film and TV Repayable Grants Scheme offering support for industry freelancers waiting for payments under the Government’s Self-employment Income Support Scheme. Interest-free grants of up to £2,000 will be offered to those eligible for Self-employment Income Support but who are struggling with the wait for payments in June. The Covid-19 Film and TV Repayable Grants Scheme opens April 15. Applicants will have one week to make a claim via The Film and TV Charity’s website.

Lenovo
Lenovo is offering a free 120-day license of Mechdyne’s TGX Remote Desktop software, which uses Nvidia Quadro GPUs and a built-in video encoder to compress information on the host workstation and send it to the endpoint device for decoding. This eliminates lag on complex and detailed application files.

Teams can share high-end workstation resources across the business, easily dialing up performance and powerful GPUs from their standard workstations to collaborate remotely with coworkers around the world.

Users keep data and company IP secure on-site, reducing the risk of data breaches, and can remotely administer computer hardware assets from anywhere, anytime.

Users install the trial on their host workstations and the receiver software on their local devices to access their applications and projects as if they were in the office.

Ambidio 
To help sound editors, mixers and other post pros who suddenly find themselves working from home, Ambidio is making its immersive sound technology, Ambidio Looking Glass, available for free. Sound professionals can apply for a free license through Ambidio’s website. Ambidio is also waiving its per-title releasing fee for home entertainment titles during the current cinema shutdown. The waiver applies to new titles that haven’t previously been released on Blu-ray, DVD, digital download or streaming. The free offer is available through May 31.

Ambidio Looking Glass can be used as a monitoring tool for theatrical and television projects requiring immersive sound. Ambidio Looking Glass produces immersive sound that approximates what can be achieved on a studio mix stage, except it is playable through standard stereo speaker systems. Editors and mixers working from home studios can use it to check their work and share it with clients, who can also hear the results without immersive sound playback systems.

“The COVID-19 pandemic is forcing sound editors and mixers to work remotely,” says Ambidio founder Iris Wu. “Many need to finish projects that require immersive sound from home studios that lack complex speaker arrays. Ambidio Looking Glass provides a way for them to continue working with dimensional sound and meet deadlines, even if they can’t get to a mix stage.”

Qumulo
Through July 2020, Qumulo is offering its cloud-native file software for free to public and private-sector medical and health care research organizations that are working to minimize the spread and impact of the COVID-19 virus.

“Research and health care organizations across the world are working tirelessly to find answers and collaborate faster in their COVID-19 vaccine mission,” said Matt McIlwain, chairman of the board of trustees of the Fred Hutchinson Cancer Research Center and managing partner at Madrona Venture Group. “It will be through the work of these professionals, globally sharing and analyzing all available data in the cloud, that a cure for COVID-19 will be discovered.”

Qumulo’s cloud-native file and data services allow organizations to use the cloud to capture, process, analyze and share data with researchers distributed across geographies. Qumulo’s software works seamlessly with the applications medical and health care researchers have been using for decades, as well as with artificial intelligence and analytics services more recently developed in the cloud.

Medical organizations can register to use Qumulo’s file software in the cloud, which will be deployable through the Amazon Web Services and Google Cloud marketplaces.

Goldcrest Post
Goldcrest Post has established the capability to conduct most picture and sound post production work remotely. Colorists, conform editors and other staff are now able to work from home or a remote site and connect to the facility’s central storage and technical resources via remote collaboration software. Clients can monitor work through similar secure, fast and reliable desktop connections.

The service allows Goldcrest to ensure theatrical and television projects remain on track while allowing clients to oversee work in as normal a manner as possible under current circumstances.

Goldcrest has set up a temporary color grading facility at a remote site convenient for its staff colorists. The site includes a color grading control panel, two color-calibrated monitors and a high-speed connection to the main Goldcrest facility. The company has also installed desktop workstations and monitors in the homes of editors and other staff involved in picture conforming and deliverables. Sound mixing is still being conducted on-site, but sound editorial and ancillary sound work is being done from home. In taking these measures, the facility has reduced its on-site staff to a bare minimum while keeping workflow disruption to a minimum.

Ziva Dynamics
Ziva Dynamics is making Ziva VFX character simulation software free for students and educators. The same tools used on Game of Thrones, Hellboy and John Wick: Chapter 3 are now available for noncommercial projects, offering students the chance to learn physics-based character creation before they graduate. Ziva VFX Academic licenses are fully featured and receive the same access and support as other Ziva products.

In addition to the software, Ziva Academic users will now receive free access to Ziva Dynamics’ simulation-ready assets Zeke the Lion (previously $10,000) and Lila the Cheetah. Thanks to Ziva VFX’s Anatomy Transfer feature, the Zeke rig has helped make squirrels, cougars, dogs and more for films like John Wick 3, A Dog’s Way Home and Primal.

Ziva Dynamics will also be providing a free Ziva Academic floating lab license to universities so students can access the software in labs across campuses whenever they want. Ziva VFX Academic licenses are free and open to any fully accredited institution, student, professor or researcher (a $1,800 value). New licenses can be found in the Ziva store and are provided following a few eligibility questions. Academic users on the original paid plan can now increase their license count for free.

OpenDrives 
OpenDrives’ OpenDrives Anywhere is an in-place private cloud model that enables customers with OpenDrives to work on the same project from multiple locations without compromising performance. With existing office infrastructure, teams already have an in-place private cloud and can extend its power to each of their remote professionals. No reinvestment in storage is needed.

Nothing changes from a workflow perspective except physical proximity. With simple adjustments, remote control of existing enterprise workstations can be extended via a secure connection. HP’s ZCentral Remote Boost (formerly RGS) software can facilitate remote access to your workstations over a secure connection, or Teradici can provide both dedicated external hardware and software solutions for this purpose, giving teams the ability to support collaborative workflows at low cost. OpenDrives can also get teams set up quickly: in under two hours on a corporate VPN and in under 24 hours without one.

Prime Focus Technologies 
Prime Focus Technologies (PFT), the technology arm of Prime Focus, has added new features and advanced security enhancements to Clear to help customers embrace the virtual work environment. In terms of security, Clear now has a new-generation HTML5 player enabled with Hollywood-grade DRM encryption. There’s also support for just-in-time visual watermarking embedded within the stream, making streaming through Clear a secure alternative to generating watermarking on the client side.

Clear also has new features that make it easier to use, including direct and faster downloads from S3 and Azure storage, easier partner onboarding and an enhanced admin module with condensed permissions to easily handle custom user roles. A host of new functionalities simplify content acquisition processes and reduce dependencies as much as possible. Likewise, for easier content servicing, there is now automation in content localization, making it easier to perform and review tasks in Clear. For content distribution, PFT has enabled on-demand cloud distribution through Clear using the most commonly used cloud technologies.

Our Previous Update News

Brady and Stephenie Betzel
Many of you know postPerspective contributor and online video editor Brady Betzel from his great reviews and tips pieces. During this crisis, he is helping his wife, Stephenie, make masks for her sister (a nurse) and colleagues working at St. John’s Regional Medical Center in Oxnard, California, as well as for anyone else working on the front lines. She’s sewn over 100 masks so far and isn’t stopping. Creativity and sewing are not new to her; her day job is also about creating. You can check out her work on Facebook and Instagram.

Object Matrix 
Object Matrix co-founder Nick Pearce has another LinkedIn dispatch, this time launching Good News Friday, where folks from around the globe check in with good news!  You can also watch it on YouTube. Pearce and crew are also offering video tips for surviving working from home. The videos, hosted by Pearce, are here.

Conductor
Conductor is waiving charges for orchestrating renders in the cloud. Updated pricing is reflected in the cost calculator on Conductor’s Pricing page. These changes will last at least through May 2020. To help expedite any transition needs, the Conductor team will be on call for virtual render wrangling of cloud submissions, from debugging scenes and scripts to optimizing settings for cost, turnaround time, etc. If you need this option, then email support@conductortech.com.

Conductor is working with partners to set up online training sessions to help studios quickly adopt cloud strategies and workflows. The company will send out further notifications as the sessions are formalized. Conductor staff is also available for one-on-one studio sessions as needed for those with specific pipeline considerations.

Conductor’s president and CEO Mac Moore said this: “The sudden onset of this pandemic has put a tremendous strain on our industry, completely changing the way studios need to operate virtually overnight. Given Conductor was built on the ‘work from anywhere’ premise, I felt it our responsibility to help studios to the greatest extent possible during this critical time.”

Symply
Symply is providing as many remote workers in the industry as possible with a free 90-day license to SymplyConveyor, its secure, high-speed transfer and sync software. Symply techs will be available to install SymplyConveyor remotely on any PC, Mac or Linux workstation pair or server and workstation.

The no-obligation offer is available at gosymply.com. Users sign up, and as long as they are in the industry and have a need, Symply techs will install the software. The number of free 90-day licenses is limited only by Symply’s ability to install them given its limited resources.

Foundry
Foundry has reset its trial database so that users can access a new 30-day trial for all products regardless of the date of their last trial. The company continues to offer unlimited non-commercial use of Nuke and Mari. On the educational side, students who are unable to access school facilities can get a year of free access to Nuke, Modo, Mari and Katana.

Foundry has also announced virtual events, including:

• Foundry LiveStream – A series of talks around projects, pipelines and tools.
• Foundry Webinars – A 30- to 40-minute technical deep dive into Foundry products, workflows and third-party tools.
• Foundry Skill-Ups – A 30-minute guide to improving your skills as a compositor/lighter/texture artist to get to that next level in your career.
• Foundry Sessions – Special conversations with customers sharing insights, tips and tricks.
• Foundry Workflow Wednesdays – 10-minute weekly videos posted on social media showing tips and tricks with Nuke from Foundry experts.

Alibi Music Library
Alibi Music Library is offering free whitelisted licensing of its Alibi Music and Sound FX catalogs to freelancers, agencies and production companies needing to create or update their demo reels during this challenging time.

Those who would like to take advantage of this opportunity can choose Demo Reel 2020 Gratis from the shopping cart feature on Alibi’s website next to any desired track(s). For more info, click here.

2C Creative
Caleb & Calder Sloan’s Awesome Foundation, the charity of 2C Creative founders Chris Sloan and Carla Kaufman Sloan, is running a campaign that will match individual donations (up to $250 each) to charities supporting first responders, organizations and those affected by COVID-19. 2C is a creative agency & production company serving the TV/streaming business with promos, brand integrations, trailers, upfront presentations and other campaigns. So far, the organization’s “COVID-19 Has Met Its Match” campaign has raised more than $50,000. While the initial deadline to participate was April 6, it has now been extended to April 13. To participate, please visit ccawesomefoundation.org for a list of charities already vetted by the foundation, or choose your own. Then simply email a copy of your donation receipt to cncawesomefoundation@gmail.com and they will match it!

Red Giant 
For the filmmaking education community, Red Giant is offering Red Giant Complete — the full set of tools, including Trapcode Suite, Magic Bullet Suite, Universe, VFX Suite and Shooter Suite — free for students and faculty members of a university, college or high school. Instead of buying separate suites or choosing which tools best suit one’s educational needs or budget, students and teachers can get every tool Red Giant makes completely free of charge. All that’s required is a simple verification.

How to get a free Red Giant Complete license if you are a student, teacher or faculty member:
1. Use a school or organization ID or any proof of current employment or enrollment for verification. More information on academic verification is available here.
2. Send your academic verification to academic@redgiant.com.
3. Wait for approval via email before purchasing.
4. Once you get approval, go to the Red Giant Complete Product Page and “buy” your free version. You will only be able to buy the free version if you have been pre-approved.

The free education subscription will last 180 days. When that time period ends, users will need to reverify their academic status to renew their free subscription.

Flanders Scientific
Remote collaboration and review benefit greatly from having the same type of display, calibrated the same way, in both locations. To help facilitate such workflow consistency, FSI is launching a limited-time special on its most popular monitor, the DM240: buy one, get one for $1,000 off.

Nvidia
For those pros needing to power graphics workloads without local hardware, cloud providers, such as Amazon Web Services and Google Cloud, offer Nvidia Quadro Virtual Workstation instances to support remote, graphics-intensive work quickly without the need for any on-prem infrastructure. End-users only need a connected laptop or thin client, as the virtual workstations support the same Nvidia Quadro drivers and features as the physical Quadro GPUs used by pro artists and designers in local workstations.

Additionally, last week Nvidia expanded its free virtual GPU software evaluation to 500 licenses for 90 days to help companies support their remote workers with their existing GPU infrastructure. Nvidia vGPU software licenses — including Quadro Virtual Workstation — enable GPU-accelerated virtualization so that content creators, designers, engineers and others can continue their work. More details are available here. Nvidia has also posted a separate blog on virtual GPUs to help admins who are working to support remote employees.

Harman
Harman is offering a free e-learning program called Learning Sessions in conjunction with Harman Pro University.

The Learning Sessions and the Live Workshop Series provide a range of free on-demand and instructor-led webinars hosted by experts from around the world. The Industry Expert workshops feature tips and tricks from front-of-house engineers, lighting designers, technicians and other industry experts, while the Harman Expert workshops feature in-depth product and solution webinars by Harman product specialists.

• April 7—Lighting for Churches: Live and Video with Lucas Jameson and Chris Pyron
• April 9—Audio Challenges in Esports with Cameron O’Neill
• April 15—Special Martin Lighting Product Launch with Markus Klüesener
• April 16—Lighting Programming Workshop with Susan Rose
• April 23—Performance Manager: Beginner to Expert with Nowell Helms

Apple
Apple is offering free 90-day trials of Final Cut Pro X and Logic Pro X apps for all in order to help those working from home and looking for something new to master, as well as for students who are already using the tools in school but don’t have the apps on their home computers.

Avid
For its part, Avid is offering free temp licenses for remote users of the company’s creative tools. Commercial customers can get a free 90-day license for each registered user of Media Composer | Ultimate, Pro Tools, Pro Tools | Ultimate and Sibelius | Ultimate. For students whose school campuses are closed, any student of an Avid-based learning institution that uses Media Composer, Pro Tools or Sibelius can receive a free 90-day license for the same products.

Aris
Aris, a full-service production and post house based in Los Angeles, is partnering with ThinkLA to offer free online editing classes for those who want to sharpen their skills while staying close to home during this worldwide crisis. The series will be taught by Aris EP/founder Greg Bassenian, an award-winning writer and director who has edited numerous projects for clients including Coca-Cola, Chevy and Zappos.

mLogic
mLogic is offering a 15% discount on its mTape Thunderbolt 3 LTO-7 and LTO-8 solutions. The discount applies to orders placed on the mTape website through April 20th. Use discount code mLogicpostPerspective15%.

Xytech
Xytech has launched “Xytech After Dark,” a podcast focusing on trends in the media and broadcasting industries. The first two episodes are now available on iTunes, Spotify and all podcasting platforms.

Xytech’s Greg Dolan says the podcast “is not a forum to sell, but instead to talk about why we create the functionality in MediaPulse and the types of things happening in our industry.”

Hosted by Xytech’s Gregg Sandheinrich, the podcast will feature Xytech staff, along with special guests. The first two episodes cover topics including the recent HPA Tech Retreat (featuring HPA president Seth Hallen), as well as the cancellation of the NAB Show, the value of trade shows and the effects of COVID-19 on the industry.

Adobe
Adobe shared a guide to best practices for working from home. It’s meant to support creators and filmmakers who might be shifting to remote work and need to stay connected with their teams and continue to complete projects. You can find the guide here.

Adobe’s principal Creative Cloud evangelist, Jason Levine, hosted a live stream, Video Workflows With Team Projects, that focuses on remote workflows.

Additionally, Karl Soule, senior technical business development manager, hosted a stream focusing on remote video workflows and collaboration in the enterprise. If you sign up on this page, you can see his presentation.

Streambox
Streambox has introduced a pay-as-you-go software plan for video professionals who use its Chroma 4K, Chroma UHD, Chroma HD and Chroma X streaming encoder/decoder hardware. Since the software has been “decoupled” from the hardware platform, those who own the hardware can rent the software on a monthly basis, pause the subscription between projects and reinstate it as needed. By renting software for a fixed period, creatives can take on jobs without having to pay outright for technology that might otherwise have been impractical.

Frame.io 
Through the end of March, Frame.io is offering 2TB of free extra storage capacity for 90 days. Those who could use that additional storage to accommodate work-from-home workflows should email rapid-response@frame.io to get it set up.

Frame.io is also offering free Frame.io Enterprise plans for the next 90 days to support educational institutions, nonprofits and health care organizations that have been impacted. Please email rapid-response@frame.io to set up this account.

To help guide companies through this new reality of remote working, Frame.io is launching a new “Workflow From Home” series on YouTube, hosted by Michael Cioni, with the first episode launching Monday, March 23rd. Cioni will walk through everything artists need to keep post production humming as smoothly as possible. Subscribe to the Frame.io YouTube channel to get notified when it’s released.

EditShare
EditShare has made its web-based, remote production and collaboration tool, Flow Media Management, free through July 1st. Flow enables individuals as well as large creative workgroups to collaborate on story development with capabilities to perform extensive review approval from anywhere in the world. Those interested can complete this form and one of EditShare’s Flow experts will follow up.

Veritone 
Veritone will extend free access to its core applications — Veritone Essentials, Attribute and Digital Media Hub — for 60 days. Targeted to media and entertainment clients in radio, TV, film, sports and podcasting, Veritone Essentials, Attribute, and Digital Media Hub are designed to make data and content sharing easy, efficient and universal. The solutions give any workforce (whether in the office or remote) tools that accelerate workflows and facilitate collaboration. The solutions are fully cloud-based, which means that staff can access them from any home office in the world as long as there is internet access.

More information about the free access is here. Certain limitations apply. Offer is subject to change without notice.

SNS
In an effort to quickly help EVO users who are suddenly required to work on editing projects from home, SNS has released Nomad for on-the-go, work-from-anywhere, remote workflows. It is a simple utility that runs on any Mac or Windows system that’s connected to EVO.

Nomad helps users repurpose their existing ShareBrowser preview files into proxy files for offline editing. These proxy files are much smaller versions of the source media files, and therefore easier to use for remote work. They take up less space on the computer, take less time to copy and are easier to manage. Users can edit with these proxy files, and after they’re finished putting the final touches on the production, their NLE can export a master file using the full-quality, high-resolution source files.

Nomad is available immediately and free to all EVO customers.

Ftrack
Remote creative collaboration tool ftrack Review is free for all until May 31. This date might extend as the global situation continues to unfold. ftrack Review is an out-of-the-box remote review and approval tool that enables creative teams to collaborate on, review and approve media via their desktop or mobile browser. Contextual comments and annotations eliminate confusion and reduce reliance on email threads. ftrack Review accepts many media formats as well as PDFs. Every ftrack Review workspace receives 250 GB of storage.

DejaSoft
DejaSoft is offering editors 50% off all their DejaEdit licenses through the end of April. In addition, the company will help users implement DejaEdit in the best way possible to suit their workflow.

DejaEdit allows editors to share media files and timelines automatically and securely with remote co-workers around the world, without having to be online continuously. It helps editors working on Avid Nexis, Media Composer and EditShare workflows across studios, production companies and post facilities ensure that media files, bins and timelines are kept up to date across multiple remote edit stations.

Cinedeck 
Cinedeck’s cineXtools allows editing and correcting of file deliveries from home. From now until April 3rd, pros can get a one-month license of cineXtools free of charge.


Words of wisdom from editor Jesse Averna, ACE

We are all living in a world we’ve never had to navigate before. Some people’s jobs are in flux, others are working from home, and anxiety is a regular part of our lives. Through all the chaos, Jesse Averna has been a calming voice on social media, so postPerspective reached out to ask him to address our readership directly.

Jesse, who was co-founder of the popular Twitter chat and Facebook group @PostChat, works at Disney Animation Studios and is a member of the American Cinema Editors.


Hey,

How are you doing? This isn’t an ad. I’m not going to sell you anything or try to convince you of anything. I just want to take the opportunity to check in. Like many of you, I’m a post professional (an editor) currently working from home. If we don’t look out for each other, who will? Please know that it’s okay not to be okay right now. I have to be honest, I’m exhausted. I’m just endlessly reading news and searching for new news and reading posts about news I’ve already read and searching again for news I might have missed …

I want to remind you of a couple things that I think might bring some peace, if you let me. I fear it’s about to get much darker and much scarier, so we need to anchor ourselves to some hope.

You are valuable. The world is literally different because you are here. You have intrinsic value, and that will never change. No matter what. You are thought about and loved, despite whatever the voice in your head says. I’m sure your first reaction to reading that is to blow it off, but try to own it. Even for just a moment. It’s true.

You don’t deserve what’s going on, but let it bring some peace that the whole world is going through it together. You might be isolated, but you’re not alone. We are forced to look out for one another by looking out for ourselves. It’s interesting; I feel so separate and vulnerable, but the truth is that the whole planet is feeling and reacting to this as one. We are in sync, whether we know it or not — and that’s encouraging to me. We ALL want to be well and be safe, and we want our neighbors to be well also. We have a rare moment of feeling like a people, like a planet.

If you are feeling anxious, do me a favor tonight. Go outside and look at the stars. Set a timer for five minutes. No entertainment or phone or anything else. Just five minutes. Reset. Feel yourself on a cosmic scale. Small. A blink of an eye. But so, so valuable.

And please give yourself a break. A sanity check. If you need help, please reach out. If you need to nest, do it. If you need to tune out, do it. Take care of yourself. This is an unprecedented moment. It’s okay not to be okay. Once you can, though, see who you can help. This complete shift of reality has made me think about legacy. This is a unique legacy-building moment. That student who reached out to you on LinkedIn asking for advice? You now have time to reply. That nonprofit you thought about volunteering your talents to? Now’s your chance. Even just to make the connection. Who can you help? Check in on? You don’t need any excuse in our current state to reach out.

I know I’m just some rando you’re reading on the internet, but I believe you are going to make it through this. You are wonderful. Do everything you can to be safe. The world needs you. It’s a better place because you are here. You know things, have ideas to share and will make things that none of the rest of us do or have.

Hang in there, my friends, and let me know if you have any thoughts, encouragements or tips for staying sane during this time. I’ll try to compile them into another article to share.

Jesse
@dr0id


Jesse Averna  — pictured on his way to donate masks — is a five-time Emmy-winning ACE editor living in LA and working in the animation feature world. 


Finishing artist Tim Nagle discusses work on indie film Miss Juneteenth

Lucky Post Flame artist Tim Nagle has a long list of projects under his belt, including collaborations with David Lowery — providing Flame work on the short film Pioneer as well as finishing and VFX work on Lowery’s motion picture A Ghost Story. He is equally at home working on spots, such as campaigns for AT&T, Hershey’s, The Home Depot, Jeep, McDonald’s and Ram.

Nagle began his formal career on the audio side of the business, working as an engineer for Solid State Logic, where he collaborated with clients including Fox, Warner Bros., Skywalker, EA Games and ABC.

Tim Nagle

We reached out to Nagle about his and Lucky Post’s work on the feature film Miss Juneteenth, which premiered at Sundance and was recently honored by SXSW 2020 as the winner of the Louis Black Lone Star award.

Miss Juneteenth was directed (and written) by Channing Godfrey Peoples — her first feature-length film. It focuses on a woman from the south — a bona fide beauty queen once crowned Miss Juneteenth, a title commemorating the day slavery was abolished in Texas. The film follows her journey as she tries to hold onto her elegance while striving to survive. She looks for ways to thrive despite her own shortcomings as she marches, step by step, toward self-realization.

How did the film come to you?
We have an ongoing relationship with Sailor Bear, the film’s producing team of David Lowery, Toby Halbrooks and James Johnston. We’ve collaborated with them on multiple projects, including The Old Man & The Gun, directed by Lowery.

What were you tasked to do?
We were asked to provide dailies transcoding, additional editorial, VFX, color and finishing and ultimately delivery to distribution.

How often did you talk to director Channing Godfrey Peoples?
Channing was in the studio, working side by side with our creatives, including colorist Neil Anderson and me, to get the project completed for the Sundance deadline. It was a massive team effort, and we felt privileged to help Channing with her debut feature.

Without spoilers, what most inspires you about the film?
There’s so much to appreciate in the film — it’s a love letter to Texas, for one. It’s directed by a woman, has a single mother at its center and is a celebration of black culture. The LA Times called it one of the best films to come out of Sundance 2020.

Once you knew the film was premiering at Sundance, what was left to complete and in what amount of time?
This was by far the tightest turnaround we have ever experienced. Everything came down to the wire, sound being the last element. It’s one of the advantages of having a variety of talent and services under one roof — the creative collaboration was immediate, intense and really made possible by our shorthand and proximity.

How important do you think it is for post houses to be diversified in terms of the work they do?
I think diversification is important not only for business purposes but also to keep the artists creatively inspired. Lucky Post’s ongoing commitment to support independent film, both financially and creatively, is an integrated part of our business along with brand-supported work and advertising. Increasingly, as you see greater crossover of these worlds, it just seems like a natural evolution for the business to have fewer silos.

What does it mean to you as a company to have work at Sundance? What kinds of impact do you see — business, morale and otherwise?
Having a project that we put our hands on accepted into Sundance was such an honor. It is unclear what the immediate and direct business impacts might be, but for morale, this is often where the immediate value is clear. The excitement and inspiration we all get from projects like this just naturally makes how we do business better.

What software and hardware did you use?
On this project we started with Assimilate Scratch for dailies creation. Editorial was done in Adobe Premiere. Color was Blackmagic DaVinci Resolve, and finishing was done in Autodesk Flame.

What is a piece of advice that you’d give to filmmakers when considering the post phase of their films?
We love being involved as early as possible — certainly not to get in anyone’s way,  but to be in the background supporting the director’s creative vision. I’d say get with a post company that can assist in setting looks and establishing a workflow. With a little bit of foresight, this will create the efficiency you need to deliver in what always ends up being a tight deadline with the utmost quality.


Netflix’s Mindhunter: Skywalker’s audio adds to David Fincher’s vision

By Patrick Birk

Scott Lewis

I was late in discovering David Fincher’s gripping series on serial killers, Mindhunter. But last summer, I noticed the Netflix original lurking in my suggested titles and decided to give it a whirl. I burned through both seasons within a week. The show is both thrilling and chilling, but the majority of these moments are not achieved through blazing guns, jump scares and pyrotechnics. It instead focuses on the inner lives of multiple murderers and the FBI agents whose job it is to understand them through subtle but detail-rich conversation.

Sound plays a crucial role in setting the tone of the series and heightening tension through each narrative arc. I recently spoke to rerecording mixers Scott Lewis and Stephen Urata as well as supervising sound editor Jeremy Molod — all from Skywalker Sound — about their process creating a haunting and detail-laden soundtrack. Let’s start with Lewis and Urata and then work our way to Molod.

How is working with David Fincher? Does he have any directorial preferences when it comes to sound? I know he’s been big on loud backgrounds in crowded spaces since The Social Network.
Scott Lewis: David is extremely detail-oriented and knowledgeable about sound. So he would give us very in-depth notes about the mix… down to the decibel.

Stephen Urata: That level of attention to detail is one of the more challenging parts of working on a show like Mindhunter.

Working with a director who is so involved in the audio, does that limit your freedom at all?
Lewis: No. It doesn’t curtail your freedom, because when a director has a really clear vision, it’s more about crafting the track to be what he’s looking for. Ultimately, it’s the director’s show, and he has a way of bringing the best work out of people. I’m sure you heard about how he does hundreds of takes with actors to get many options. He takes a similar approach with sound in that we might give him multiple options for a certain scene or give him many different flavors of something to choose from. And he’ll push us to deliver the goods. For example, you might deliver a technically perfect mix but he’ll dig in until it’s exactly what he wants it to be.

Stephen Urata

Urata: Exactly. It’s not that he’s curtailing or handcuffing us from doing something creative. This project has been one of my favorites because it was just the editorial team and sound design, and then it would come to the mix stage. That’s where it would be just Scott and me in a mix room, just the two of us, and we’d get a shot at our own aesthetic and our own choices. It was really a lot of fun trying to nail down what our favorite version of the mix would be, and David really gave us that opportunity. If he wanted something else, he would have just said, “I want it like this and only do it like this.”

But at the same time, we would do something maybe completely different than he was expecting, and if he liked it, he would say, “I wasn’t thinking that, but if you’re going to go that direction, try this also.” So he wasn’t handcuffing us, he was pushing us.

Do you have an example of something that you guys brought to the table that Fincher wasn’t expecting and asked you to go with it?
Urata: The first thing we did was the train scene. It was a scene in an empty parking garage, and there was the sound of an incoming train from two miles away. That was actually the first thing that we did. It was the middle of Episode 2 or something, and that’s where we started.

Where they’re talking to the BTK survivor, Kevin?
Lewis: Exactly.

Urata: He’s fidgeting and really uncomfortable telling his story, and David wanted to see if that scene would work at all, because it really relied heavily on sound. So we got our shot at it. He said, “This is the kind of direction I want you guys to go in.” Scott and I played off of each other for a good amount of time that first day, trying to figure out what the best version would be, and we presented it to him. I don’t remember him having that many notes on that first one, which is rare.

It really paid off. Among the mixes you showed Fincher, did you notice a trend in terms of his preferences?
Lewis: When I say we gave him options, it might come down to something like the Son of Sam scene. Throughout that scene we used a slight pitch shift to slowly lower his voice over the length of the scene, so that by the time he reveals that he actually isn’t crazy and he’s playing everybody, his voice has dropped a register. So when we present him options, it’s things like how much we’re pitching him down over time. It’s a constant review process.

The show takes place in the mid-’70s and early ’80s. Were there any period-specific sounds or mixing tricks you used when it came to diegetic music and things like that?
Lewis: Oh yeah. Ren Klyce is the supervising sound designer on the show, and he’s fantastic. He’s the sound designer on all of David’s films. He is really good about making sure that we stay to the period. So with regard to mixing, panning is something that he’s really focused on because it’s the ‘70s. He’d tell us not to go nuts on the panning, the surrounds, that kind of thing; just keep it kind of down the middle. Also, futzes are a big thing in that show; music futzes, phone futzes … we did a ton of work on making sure that everything was period-specific and sounded right.

Are you using things like impulse responses and Altiverb or worldizing?
Lewis: I used a lot of Speakerphone by Audio Ease as well as EQ and reverb.

What mixing choices did you make to immerse the viewer in Holden’s reality, i.e. the PTSD he experiences?
Lewis: When he’s experiencing anxiety, it’s really important to make sure that we’re telling the story that we’re setting out to tell. Through mixing, you can focus the viewers’ attention on what you want them to track. So that could be dialogue in the background of a scene, like the end of Episode 1, when he’s having a panic attack, and in the distance, his boss and Tench are talking. It was very important that you make out the dialogue there, even though you’re focusing on Holden having a panic attack. So in moments like that, it’s about making sure that the viewer is feeling that claustrophobia but also picking up on the story point that we want you to follow.

Lewis: Also, Stephen did something really great there — there are sprinklers in the background and you don’t even notice, but the tension is building through them.

There’s a very intense moment when Holden’s trying to figure out who let their boss know about a missing segment of tape in an interview, and he accuses Greg, who leans back in his chair, and there’s a squeal in there that kind of ramps up the tension.
Urata: David’s really, really honed in on Foley in general — chair squeaks, the type of shoes somebody’s wearing, the squeak of the old wooden floor under their feet. All those things have to play with David. Like when Wendy’s creeping over to the stairwell to listen to her girlfriend and her ex-husband talking. David said, “I want to hear the wooden floor squeaking while she’s sneaking over.”

It’s not just the music crescendo-ing and making you feel nervous or scared. It’s also the Foley work happening in the scene, whether he wants to hear more of that or less of that, or more backgrounds to add to the sound pressure building toward the climax of the scene. David uses all of those tools to accomplish the storytelling in the scene with sound.

How much ambience do you have built into the raw Foley tracks that you get, and how much is reverb added after the fact? Things like car door slams have so much body to them.
Urata: Some of those, like door slams, were recorded by Ren Klyce. Instead of just recording a door slam with a mic right next to the door and then adding reverb later on, he actually goes into a huge mansion and slams a huge door from 40 feet away and records that to make it sound really realistic. Sometimes we add it ourselves. I think the most challenging part about all of that is marrying and making all the sounds work together for the specific aesthetic of the soundtrack.
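
The digital counterpart of recording that door in a real space is convolution with a room impulse response, which is the idea behind tools like Altiverb. A minimal sketch, assuming mono files with hypothetical names and matching sample rates:

    import numpy as np
    import soundfile as sf
    from scipy.signal import fftconvolve

    dry, sr = sf.read("door_slam_dry.wav")          # close-miked slam
    ir, sr_ir = sf.read("mansion_hallway_ir.wav")   # impulse response of the space
    assert sr == sr_ir, "resample one file if the rates differ"

    wet = fftconvolve(dry, ir)            # "places" the slam in the room
    wet /= np.max(np.abs(wet))            # normalize to avoid clipping

    mix = 0.6                             # blend dry and wet to taste
    dry_padded = np.pad(dry, (0, len(wet) - len(dry)))
    sf.write("door_slam_in_room.wav", (1 - mix) * dry_padded + mix * wet, sr)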

Do you have a go-to digital solution for that? Is it always something different or do you find yourself going to the same place?
Urata: It definitely varies. There’s a classic reverb, the Lexicon 480, and we use a digital version of it a good amount. It has a really great, natural film sound that people are familiar with. There are other ones, but it’s really just another tool; if it doesn’t work, we just use something else.

Were there any super memorable ADR moments?
Lewis: I can just tell you that there’s a lot of ADR. Some whole scenes are ADR. After any Fincher show where I’ve mixed the dialogue and the ADR, I’m 10 times better than I was before I started. Because David’s so focused on storytelling, if there’s a subtle inflection he’s looking for that he didn’t get on set, he will loop the line to make sure he gets that nuance.

Did you coordinate with the composer? How do you like to mix the score so that it has a really complementary relationship to the rest of the elements?
Lewis: As re-recording mixers, they don’t involve us in the composition part of it; it just comes to us after they’ve spotted the score.

Jason Hill was the composer, and his score is great. It’s so spooky and eerie. It complements the sound design and sound effects layers really well, so a lot of it will just sit in there. It’s not a traditional score. He’s not working with big strings and horns all over the place; he’s got a lot of synths and guitars, and he would use a lot of analog gear as well. So when it comes to the mix, you sometimes get anomalies you don’t commonly get, whether it’s hiss or whatever, elements he’s adding to give it an analog sound.

Lewis: And a lot of times we would keep that in because it’s part of his score.

Now let’s jump in with sound editor Jeremy Molod

As a sound editor, what was it like working with David Fincher?
Jeremy Molod: David and I have done about seven or eight films together, so by the time we started on Season Two of Mindhunter, we pretty much knew each other’s styles. I’m a huge fan of David’s movies. It’s a privilege to work with him because he’s such a good director, and the stuff he creates is so entertaining and beautifully done. I really admire his organization and how detailed he is. He really gets in there and gives us detail that no other director has ever given us.

Jeremy Molod

You worked with him on The Social Network. In college, my sound professors would always cite the famous bar scene, where Mark Zuckerberg and his girlfriend had to shout at each other over the backgrounds.
Molod: I remember that moment well. When we were mixing that scene, because the music was so loud and so pulsating, David said, “I don’t want this to sound like we’re watching a movie about a club; I want this to be like we’re in the club watching this.” To make it realistic, when you’re in the club, you’re straining to hear sounds and people’s voices. He said that’s what it should be like. Our mixer, David Parker, kept pushing the music up louder and louder, so you can barely make out those words.

I feel like I’m seeing iterations of that in Mindhunter as well.
Molod: Absolutely. That makes it more stressful and like you said, gives it a lot more tension.

Scott said that David’s down to the decibel in terms of how he likes his sound mixed. I’m assuming he’s that specific when it comes to the editorial as well?
Molod: That is correct. Actually, it’s even more precise than that: down to the quarter decibel. He literally does that all the time. He gets really, really in there.

He does the same thing with editorial, and what I love about his process is that he doesn’t just say, “I want this character to sound old and scared.” He gives real detail. He’ll say, “This guy’s very scared and he’s dirty and his shoelaces are untied and he’s got a rag and a piece of snot rag hanging out of his pocket. And you can hear the lint and the Swiss army knife with the toothpick part missing.” He gets into painting a picture, and he wants us to literally translate it, to make it sound like the picture he’s painting.

So he wanted to make Kevin sound really nervous in the truck scene. Kevin’s in the back and you don’t really see him too much; he’s blurred out. David really wanted to sell his fear by using sound, so we had him tapping his leg nervously, scratching the side of the car, kind of slapping his leg, and obviously breathing really heavily and sniffing a lot. It was those sounds that really helped sell that scene.

So while he does have the acumen and vocabulary within sound to talk to you on a technical level, he’ll give you direction in a similar way to how he would an actor.
Molod: Absolutely, and that’s always how I’ve looked at it. When he’s giving us direction, it’s actually the same way as he’s giving an actor direction to be a character. He’s giving the sound team direction to help those characters and help paint those characters and the scenes.

With that in mind, what was the dialogue editing process like? I’ve heard that his attention to detail really comes into play with inflection of lines. Were you organizing and pre-syncing the alternate takes as closely as you could with the picture selection?
Molod: We did that all the time. The inflection and the intonation and the cadence of the characters’ voices are really important to him, and he’s really good about figuring out which words of which takes he can stitch together to get it. So there might be two sentences that one actor says at one time, and those sentences are actually made up of five different takes. And he does so many takes that we have a wealth of material to choose from.

We’d probably send about five or six versions to David to listen to, and then he would make his notes. That would happen almost every day, and we would start homing in on the performances he liked. Eventually he might say, “I don’t like any of them. You’ve got to loop this guy on the ADR stage.” He likes us to stitch the best little pieces together like a puzzle.

What is the ADR stage like at Skywalker?
Molod: We actually did all of our ADR at Disney Studios in LA because David was down there, as were the actors. We did a fair amount of ADR on Mindhunter; there’s lots of it in there.

We usually have three or four microphones running during an ADR session, one of which will be a radio mic. The other three would be booms set in different locations, the same microphones that they use in production. We also throw in an extra [Sennheiser MKH 50] just to have it with the track of sound that we could choose from.

The process went great. We’d come back and give him about five or six choices, and then he would start making notes, and we’d pin it down to the way he liked it. So by the time we got to the mix stage, the decisions were already made.

There was a scene where people are walking around talking after a murder has been committed, and what David really wanted was for them to be talking a little softly about this murder. So we had to go in and loop that whole scene again with them performing it at a quieter, more sustained volume. We couldn’t just turn it down. They had to perform it as if they were not quite whispering but trying to speak a little lower so no one could hear.

To what extent did loop groups play a part in the soundtrack? With the prominence of backgrounds in the show it seems like customization would be helpful, to have time-specific little bits of dialogue that might pop out.
Molod: We’ve used a group called the Loop Squad for all the features, the House of Cards seasons and Mindhunter. We would send a list of all of our cues, get on the phone and explain what the reasoning was and what the storylines were. All their actors would, on their own, go and research everything that was happening at the time, so if they were just standing by a movie theater, they had something relevant to the period to talk about.

When it came to production sound on the show, which track did you normally find yourself working from?
Molod: In most scenes, they would have a couple of radio mics attached to the actors, and they’d have several booms. Normally, there were maybe eight different microphones set up. You would have one general boom over the whole thing, plus a boom close to each character.

We almost always went with one of the booms, unless we were having trouble making out what they were saying, and then it depended on which actor was standing closest to the boom. One of the tricks our editors used to make it sound better was to phase the two together. So if the boom wasn’t quite working on its own, and the radio mic wasn’t either, we would make those two play together in a way that accomplished what we wanted: you could hear the line clearly, but you still got the space of the room.
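
As an illustration of that trick, here is a minimal sketch of time-aligning a boom and a radio mic before summing them so the blend doesn’t comb-filter. It is not the editors’ exact method; the file names are hypothetical, and both clips are assumed to be mono at the same sample rate.

    import numpy as np
    import soundfile as sf
    from scipy.signal import correlate

    boom, sr = sf.read("boom.wav")
    lav, _ = sf.read("lav.wav")

    n = min(len(boom), len(lav))
    boom, lav = boom[:n], lav[:n]

    # Find the offset (in samples) that best lines the lav up with the boom.
    corr = correlate(boom, lav, mode="full", method="fft")
    lag = int(np.argmax(corr)) - (n - 1)   # positive lag: lav arrives early, so delay it

    aligned_lav = np.roll(lav, lag)
    blend = 0.7 * boom + 0.3 * aligned_lav  # favor the boom to keep the room's space
    sf.write("dialogue_blend.wav", blend / np.max(np.abs(blend)), sr)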

Were there any moments that you remember from the production tracks for effects?
Molod: Whenever we could use production effects, we always tried to get those in, because they always sound the most realistic and most pertinent to that scene and that location. If we can maintain any footsteps in the production, we always do because those always sound great.

Any kind of subtle things like creaks, bed creaks, the floor creaking, we always try to salvage those and those help a lot too. Fincher is very, very, very into Foley. We have Foley covering the whole thing, end to end. He gives us notes on everybody’s footsteps and we do tests of each character with different types of shoes on and different strides of walking, and we send it to him.

So much of the show’s drama plays out in characters’ internal worlds. In a lot of the prison interview scenes, I notice door slams here and there that I think serve to heighten the tension. Did you develop a kind of a logical language when it came to that, or did you find it was more intuitive?
Molod: No, we did have our own language for it, and that was based on Fincher’s direction. When it was really crazy, he wanted to hear the door slams and buzzers and keys jingling and tons of prisoners yelling off-screen. We spent days recording loop-group prisoners, and they would be sprinkled throughout the scene. And when the conversation turned to upsetting subject matter, we might ramp up the voices in the back.


Pat Birk is a musician, sound engineer and post pro at Silver Sound, a boutique sound house based in New York City.


COVID-19: NAB talks plans, more companies offer support, info about remote work

Last Friday, NAB’s president/CEO, Gordon Smith, issued a statement saying that rather than rescheduling the NAB Show for later this year, NAB would be unveiling a new digital offering called NAB Show Express and enhancing NAB Show New York this fall. Here is part of what he said.

“First, we are exploring a number of ways to bring the industry together online, both in the short and long term. We know from many years of serving the community with face-to-face events, that connectivity is vital to the health and success of the industry. That’s why we are excited to announce NAB Show Express, targeted to launch in April 2020. This digital experience will provide a conduit for our exhibitors to share product information, announcements and demos, as well as deliver educational content from the original selection of programming slated for the live show in Las Vegas, and create opportunities for the community to interact virtually — all of which adds up to something that brings the NAB Show community together in a new way.

“Second, we will be enhancing NAB Show New York with new programs, partners, and experiences. We have already had numerous conversations with show partners about expanding their participation and have heard from numerous exhibitors interested in enhancing their presence at this fall’s show. NAB Show New York represents the best opportunity for companies to announce and showcase their latest innovations and comes at a perfect time for the industry to gather face-to-face to restart, refocus, and reengage as we move forward together.”

A number of companies are releasing updates and offering discounts and tips for working remotely. Here is a bit of news from some of those companies, and we will add more companies to this list as the news comes in, so watch this space.

mLogic
mLogic is offering a 15% discount on its mTape Thunderbolt 3 LTO-7 and LTO-8 solutions. The discount applies to orders placed on the mTape website through April 20th. Use discount code mLogicpostPerspective15%.

Xytech
Xytech has launched “Xytech After Dark,” a podcast focusing on trends in the media and broadcasting industries. The first two episodes are now available on iTunes, Spotify and all podcasting platforms.

Xytech’s Greg Dolan says the podcast “is not a forum to sell, but instead to talk about why we create the functionality in MediaPulse and the types of things happening in our industry.”

Hosted by Xytech’s Gregg Sandheinrich, the podcast will feature Xytech staff, along with special guests. The first two episodes cover topics including the recent HPA Tech Retreat (featuring HPA president Seth Hallen), as well as the cancellation of the NAB Show, the value of trade shows and the effects of COVID-19 on the industry.

Nvidia
Nvidia is expanding its free virtual GPU software evaluation to 500 licenses for 90 days to help companies support their remote workers with their existing GPU infrastructure. Nvidia vGPU software licenses — including Quadro Virtual Workstation — enable GPU-accelerated virtualization so that content creators, designers, engineers and others can continue their work. More details are available here. Nvidia has also posted a separate blog on virtual GPUs to help admins who are working to support remote employees.

Object Matrix 
Object Matrix is offering video tips for surviving working from home. The videos, hosted by co-founder Nicholas Pearce, are here.

Adobe
Adobe shared a guide to best practices for working from home. It’s meant to support creators and filmmakers who might be shifting to remote work and need to stay connected with their teams and continue to complete projects. You can find the guide here.

Adobe’s principal Creative Cloud evangelist, Jason Levine, hosted a live stream — Video Workflows With Team Projects — that focused on remote workflows.

Additionally, Karl Soule, senior technical business development manager, hosted a stream focusing on remote video workflows and collaboration in the enterprise. If you sign up on this page, you can see his presentation.

Streambox
Streambox has introduced a pay-as-you-go software plan for video professionals who use its Chroma 4K, Chroma UHD, Chroma HD and Chroma X streaming encoder/decoder hardware. Since the software has been “decoupled” from the hardware platform, those who own the hardware can rent the software on a monthly basis, pause the subscription between projects and reinstate it as needed. By renting software for a fixed period, creatives can take on jobs without having to pay outright for technology that might have been impractical.

And last week’s offerings as well

Frame.io 
Through the end of March, Frame.io is offering 2TB of free extra storage capacity for 90 days. Those who could use that additional storage to accommodate work from home workflows should email rapid-response@frame.io to get it set up.

Frame.io is also offering free Frame.io Enterprise plans for the next 90 days to support educational institutions, nonprofits and health care organizations that have been impacted. Please email rapid-response@frame.io to set up this account.

To help guide companies through this new reality of remote working, Frame.io is launching a new “Workflow From Home” series on YouTube, hosted by Michael Cioni, with the first episode launching Monday, March 23rd. Cioni will walk through everything artists need to keep post production humming as smoothly as possible. Subscribe to the Frame.io YouTube channel to get notified when it’s released.

EditShare
EditShare has made its web-based, remote production and collaboration tool, Flow Media Management, free through July 1st. Flow enables individuals as well as large creative workgroups to collaborate on story development, with capabilities to perform extensive review and approval from anywhere in the world. Those interested can complete this form, and one of EditShare’s Flow experts will follow up.

Veritone 
Veritone will extend free access to its core applications — Veritone Essentials, Attribute and Digital Media Hub — for 60 days. Targeted to media and entertainment clients in radio, TV, film, sports and podcasting, Veritone Essentials, Attribute, and Digital Media Hub are designed to make data and content sharing easy, efficient and universal. The solutions give any workforce (whether in the office or remote) tools that accelerate workflows and facilitate collaboration. The solutions are fully cloud-based, which means that staff can access them from any home office in the world as long as there is internet access.

More information about the free access is here. Certain limitations apply. Offer is subject to change without notice.

SNS
In an effort to quickly help EVO users who are suddenly required to work on editing projects from home, SNS has released Nomad for on-the-go, work-from-anywhere, remote workflows. It is a simple utility that runs on any Mac or Windows system that’s connected to EVO.

Nomad helps users repurpose their existing ShareBrowser preview files into proxy files for offline editing. These proxy files are much smaller versions of the source media files, and therefore easier to use for remote work. They take up less space on the computer, take less time to copy and are easier to manage. Users can edit with these proxy files, and after they’re finished putting the final touches on the production, their NLE can export a master file using the full-quality, high-resolution source files.

Nomad is available immediately and free to all EVO customers.
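
For readers unfamiliar with proxy workflows, the sketch below shows the general idea of generating small offline-editing proxies with ffmpeg. This is not how Nomad works internally (Nomad repurposes existing ShareBrowser preview files); the paths and encoding settings here are hypothetical.

    import subprocess
    from pathlib import Path

    SOURCE_DIR = Path("/Volumes/EVO/project/source")   # hypothetical source folder
    PROXY_DIR = Path.home() / "project_proxies"
    PROXY_DIR.mkdir(parents=True, exist_ok=True)

    for clip in SOURCE_DIR.glob("*.mov"):
        proxy = PROXY_DIR / f"{clip.stem}_proxy.mp4"
        subprocess.run([
            "ffmpeg", "-i", str(clip),
            "-vf", "scale=-2:540",                      # shrink to 540p
            "-c:v", "libx264", "-crf", "28", "-preset", "fast",
            "-c:a", "aac", "-b:a", "96k",
            str(proxy),
        ], check=True)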

Ftrack
Remote creative collaboration tool ftrack Review is free for all until May 31. This date might extend as the global situation continues to unfold. ftrack Review is an out-of-the-box remote review and approval tool that enables creative teams to collaborate on, review and approve media via their desktop or mobile browser. Contextual comments and annotations eliminate confusion and reduce reliance on email threads. ftrack Review accepts many media formats as well as PDFs. Every ftrack Review workspace receives 250 GB of storage.

DejaSoft
DejaSoft is offering editors 50% off all their DejaEdit licenses through the end of April. In addition, the company will help users implement DejaEdit in the best way possible to suit their workflow.

DejaEdit allows editors to share media files and timelines automatically and securely with remote co-workers around the world, without having to be online continuously. It helps editors working on Avid Nexis, Media Composer and EditShare workflows across studios, production companies and post facilities ensure that media files, bins and timelines are kept up to date across multiple remote edit stations.

Cinedeck 
Cinedeck’s cineXtools allows editing and correcting file deliveries from home. From now until April 3rd, pros can get a one-month license of cineXtools free of charge.

Main Image: Courtesy of Frame.io

Main Image: Courtesy of Adobe


Workstations Roundtable

By Randi Altman

In our Workstations Special Edition, we spoke to pros working in offline editing, visual effects and finishing about what they need technically in order to keep creating. Here in our Workstations Roundtable, we reached out to both users and those who make computers and related tools, all of whom talk about what they need from their workstations in order to get the job done.

The Foundation’s Director of Engineering, John Stevens 

John Stevens

Located just across the street from the Warner Bros. lot, The Foundation provides post production picture services and workflows in HD, 2K, 4K, UHD, HDR10 and Dolby Vision HDR. They work on many episodic shows, including Black-ish, Grown-ish, Curb Your Enthusiasm and American Soul.

Do you typically buy off the shelf or custom? Both?
Both. It depends on the primary application the system will be running. Typically, we buy off-the-shelf systems that have the CPU and memory configurations we are looking for.

How often do you upgrade your workstations, and what process do you go through in finding the right one?
There is no defined time frame. We look at every system manufacturer’s offerings, look at specs and request demo systems for test after we have narrowed it to a few systems.

How important is the GPU to your work?
The GPU is extremely important, as almost every application uses the GPU to allow for faster processing. A lot of applications allow for multiple GPUs, so I look for systems that will support them.

Curb Your Enthusiasm

What are the questions you ask yourself before buying a new system? And what do you do with your older systems?
What is the primary application that the system is being purchased for? Does the software vendor have a list of certified configurations? Is the application well-threaded, meaning, can the application make efficient use of multiple cores, or does a higher core clock rate make the application perform faster? How many PCI slots are available? What is the power supply capability? What’s the reputation and experience of the manufacturer?

Do you feel mobile workstations are just as powerful for your work as desktops these days?
No; mobile systems are limited in expandability.

 

Puget Systems’ Solutions Research & Development, Matt Bach

Based in Auburn, Washington, Puget Systems specializes in high-performance, custom-built computers for media and entertainment.

Matt Bach

What is your definition of a workstation? We know there are a few definitions out there in the world.
While many people tend to focus on the hardware to define what a workstation is, to us it is really whether or not the computer is able to effectively allow you to get your work done. In order to do so, it has to be not only fast but reliable. In the past, you had to purchase very expensive “workstation-class” hardware to get the proper balance of performance and stability, but these days it is more about getting the right brands and models of parts to complement your workflow than just throwing money at the problem.

For users looking to buy a computer but are torn between off-the-shelf and building their own, what would you tell them?
The first thing I would clarify is that there are vastly different kinds of “off-the-shelf” computers. There are the systems you get from a big box store, where you have a handful of choices but no real customization options. Then there are systems from companies like us, where each system is tailor-made to match what applications you use and what you do in those applications. The sticker price on these kinds of systems might appear to be a bit higher, but in reality — because it is the right hardware for you — the actual performance you get per dollar tends to be quite a bit better.

Of course, you can build a system yourself, and in fact, many of our customers used to do exactly that. But when you are a professional trying to get your work done, most people don’t want to spend their time keeping up on the latest hardware, figuring out what exact components they should use and troubleshooting any issues that come up. Time spent fiddling with your computer is time that you could spend getting your job done. Working with a company like us that understands what it is you are doing — and how to quickly get you back up and running — can easily offset any cost of building your own system.

What questions would you suggest pros ask before deciding on the right computer for their work?
This could easily be an entire post all its own, and this is the reason why we highly encourage every customer to talk to one of our consultants — if not on the phone, then at least by email. The right configuration depends on a huge number of factors that are never quite the same from one person to the next. It includes what applications you use and what you do in those applications. For example, if you are a video editor, what resolution, fps and codec do you tend to work with? Do you do any multicam work? What about VFX or motion graphics?

Depending on what applications you use, it is often also the case that you will run into times when you have opposing “optimal” hardware. A program like After Effects prefers CPUs with high per-core performance, while Premiere Pro can benefit from a CPU with more cores. That means there is no single “best” option if you use both of those applications, so it comes down to determining which application is more likely to benefit from more performance in your own personal workflow.
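
A toy way to see the difference Bach describes is to time the same CPU-bound job on one core and then spread across all cores. The sketch below is only illustrative; real editing and VFX applications scale very differently from this synthetic workload.

    import math
    import time
    from multiprocessing import Pool, cpu_count

    def render_tile(seed: int) -> float:
        # Stand-in for a chunk of CPU-bound work, e.g. one tile of a frame.
        return sum(math.sqrt(i) for i in range(seed, seed + 2_000_000))

    if __name__ == "__main__":
        tiles = list(range(32))

        start = time.perf_counter()
        _ = [render_tile(t) for t in tiles]          # one core at a time
        single = time.perf_counter() - start

        start = time.perf_counter()
        with Pool(cpu_count()) as pool:              # spread across all cores
            _ = pool.map(render_tile, tiles)
        multi = time.perf_counter() - start

        print(f"single: {single:.1f}s  parallel: {multi:.1f}s  "
              f"speedup: {single / multi:.1f}x on {cpu_count()} cores")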

This really only scratches the surface, however. There is also the need to make sure the system supports your existing peripherals (Thunderbolt, 10G networking, etc.), the physical size of the system and upgradability. Not to mention the quality of support from the system manufacturer.

How do you decide on what components to include in your systems … GPUs, for example?
We actually have an entire department (Puget Labs) that is dedicated to this exact question. Not only does hardware change very quickly, but software is constantly evolving as well. A few years back, developers were working on making their applications multi-threaded. Now, much of that dev time has switched over to GPU acceleration. And in the very near future, we expect work in AI and machine learning to be a major focus.

Keeping up with these trends — and how each individual application is keeping up with them — takes a lot of work. We do a huge amount of internal testing that we make available to the public to determine exactly how individual applications benefit from things like more CPU cores, more powerful GPUs or faster storage.

Can you talk about warranties and support? What do you offer?
As for support and warranty, our systems come with lifetime tech support and one to three years parts warranty. What makes us the most different from big box stores is that we understand your workflow. We do not want your tech support experience to be finger pointing between Adobe, Microsoft and Puget Systems. Our goal is to get you up and running, regardless of what the root cause is, and often that means we need to be creative and work with you individually on the best solution to the problem.

 

Goldcrest Post’s Technical Director, Barbary Ahmed

Barbary Ahmed

Goldcrest Post New York, located in the heart of the bustling Meatpacking District, is a full-service post facility offering offline editorial along with picture and sound finishing. Recent credits include The Laundromat, Godfather of Harlem, Russian Doll, High Flying Bird, Her Smell, Sorry to Bother You, Billions and Unsane.

Do you typically buy off the shelf or custom? Both?
We do both. But for most cases, we do custom builds because color grading workstations need more power, more GPUs and a lot of I/O options.

How often do you upgrade your workstations, and what process do you go through in finding the right one?
This is technically a long research process. We depend on our trusty vendors, and it also depends on pricing, the availability of items and how quickly we need them.

How important is the GPU to your work?
For color grading and visual effects, using applications such as Autodesk’s Maya and Flame, Blackmagic Resolve and Adobe Premiere, a high-end workstation will provide a smoother and faster workflow. 4K/UHD media and above can tax a computer, so having access to a top-of-the-line machine is a key for us.

The importance of GPUs is that the video software mentioned above is now able to offload much of the heavy lifting onto the GPU (or even several GPUs), leaving the CPU free to do its job of delegating tasks, applications, APIs, hardware processes, I/O device requests and so on. The CPU just makes sure all the basic tasks run in harmony, while the GPU takes care of crunching the more complex and intensive computation needed by the application. That matters for all but the most basic video work, and certainly for any form of 4K.

What are the questions you ask yourself before buying a new system? And what do you do with your older systems?
There are many questions to ask here: Is this system scalable? Can we upgrade it in the future? What real change will it bring to our workflow? What are others in my industry using? Does my team like it? These are the kind of questions we start with for any job.

In terms of what to do with older systems, there are a couple things that we think about: Can we use it as a secondary system? Can we donate it? Can we turn it into an experimental box? Can we recycle it? These are the kind of questions we ask ourselves.

Do you feel mobile workstations are just as powerful for your work as desktops these days? Especially now, with the coronavirus shutdowns?
During these unprecedented times, it seems that mobile workstations are the only way to keep up with our clients’ needs. But we were innovative about it; we established the capability to conduct most picture and sound post production work remotely. Colorists, conform editors and other staff are now able to work from home or a remote site and connect to the facility’s central storage and main desktop workstations via remote collaboration software.

This allows Goldcrest to ensure theatrical and television projects remain on track while allowing clients to oversee work in as normal a manner as possible under current circumstances.

 

Dell’s M&E Strategist, Client Solutions, Matt Allard

Matt Allard

Dell Technologies helps users create, manage and deliver media through a complete and scalable IT infrastructure, including workstations, monitors, servers, shared storage, switches, virtualization solutions and more, paired with support and services.

What is Dell’s definition of a workstation? We know there are a few definitions.
One of the most important definitions is the International Data Corporation’s (IDC) definition that assesses the overall market for workstations. This definition includes several important elements:

1. Workstations should be highly configurable and include workstation-grade components, including:
a. Workstation-grade CPUs (like Intel Xeon processors)
b. Professional and discrete GPUs, like those in the Nvidia Quadro line and AMD Radeon Pro line
c. Support for ECC memory

2. Workstations must be certified with commonly used professional ISV software, like that from Adobe, Autodesk, Avid, Blackmagic and others.

3. IDC requires a brand that is dedicated and known for workstations.

Beyond the IDC’s requirements, we understand that workstation customers are seeking the utmost in performance and reliability to run the software they use every day. We feel that workstation-grade components and Dell Precision’s engineering deliver that environment. Reliability can also include the security and manageability that large enterprises expect, and our designs provide the hooks that allow IT to manage and maintain workstations across a large studio or media enterprise. Consumer PCs rarely include these commercial-grade IT capabilities.

Additionally, software and technology (such as the Dell Precision Optimizer, our Reliable Memory Technology, Dell Client Command Suite) can extend the performance, reliability and manageability on top of the hardware components in the system.

For users looking to buy a computer but are torn between off the shelf and building their own, what would you tell them?
It’s a common misconception that a computer is just a sum of its parts. It can be better to deal with a vendor that has the supply chain volume and market presence to have advantageous access during times like these, when supply constraints exist on popular CPUs and GPUs. Additionally, most professional ISV software is not qualified or certified on a set of off-the-shelf components, but on specific vendor PC models. If users want absolute confidence that their software will run optimally, using a certified/qualified platform is the best choice. Warranties are also important, but more on that in a bit.

What questions would you suggest pros ask before deciding on the right computer for their work?
The first question is to be clear about the nature of the work you do as a pro, using what software applications in the media and entertainment industry. Your working resolution has a large bearing on the ideal configuration for the workstation. We try to make deciding easier with Dell’s Precision Workstation Advisor, which provides pros an easy way to find configuration choices based on our certification testing and interaction with our ISV partners.

Do you think we are at a time when mobile workstations are as powerful as desktops?
The reality is that it is not challenging to build a desktop configuration that is more powerful than the most powerful mobile workstation. For instance, Dell Precision fixed workstations support configurations with multiple CPUs and GPUs, and those actually require beefier power supplies, more slots and thermal designs that need more physical space than in a reasonably sized mobile.

A more appropriate question might be, can a mobile workstation be an effective tool for M&E professionals who need to be on the road or on shoot? And the answer to that is a resounding yes.

How do you decide on what components to include in your systems … GPUs, for example?
As mentioned above, workstations tend to be highly configurable, often with multiple options for CPUs, GPUs and other components. We work to stay at the forefront of our suppliers’ roadmap offerings and to provide a variety of options so customers can choose the right price/performance configuration for their needs. This is where having clear guidance on certified systems for the ISV software a customer is using makes selecting the right configuration easier.

Can you talk about warranties and support?
An advantage of dealing with a Tier 1 workstation vendor like Dell is that pros can pick the right warranty and support level for their business, from basic hardware warranty to our ProSupport with aggressive availability and response times. All Dell Precision fixed workstations come with a three-year Dell Limited Hardware warranty, and users can opt for as many as five years. Precision mobile workstations come with a one-year warranty (except 7000 series mobile, which has three years standard), and users can opt for as many as five years’ warranty with ProSupport.

 

Performance Post’s Owner/President, Fausto Sanchez

Fausto Sanchez

Burbank’s independently owned Performance Post focuses on broadcast television work. It works with Disney, Warner Bros. and NBCUniversal. Credits include TV versions of the Guardians of the Galaxy franchise and SD-to-UHD upconversion and frame rate conversions for HBO’s At the Heart of Gold: Inside the USA Gymnastics Scandal.

Do you typically buy off the shelf or custom? Both?
We look to the major suppliers like HP, Dell and Apple for off-the-shelf products. We also have purchased custom workstations, and we build our own.

How often do you upgrade your workstations, and what process do you go through in finding the right one?
If we have done our homework well, our workstations can last for three to five years. This timeline is becoming shorter, though, with new technologies such as higher core count and clock speed.

In evaluating our needs, first we look at the community for best practices. We look to see what has been successful for others. I love that we can get that info and stories here on postPerspective! We look at what the main suppliers are providing. These are great if you have a lot of extra cash. For many of us, the market is always demanding and squeezing everything it can. We are no different. We have bought both preconfigured systems from the primary manufacturers as well as custom systems.

HBO’s At the Heart of Gold: Inside the USA Gymnastics Scandal.

How important is the GPU to your work?
In our editorial workflows — Avid Media Composer, Adobe Premiere, Blackmagic Resolve (for editing) — GPU use is not a big deal because these applications are currently not relying on GPU so much for basic editing. Mostly, you select the one best for your applications. Nvidia has been the mainstay for a long time, but AMD has gotten great support, especially in the new Mac Pro workstation.

For color work or encoding, the GPU selection becomes critical. Currently, we are using the Nvidia Titan-series GPUs for some of our heaviest processor-intensive workflows.

What are the questions you ask yourself before buying new systems? And what do you do with your older systems?
When buying a new system, obviously the first questions are: What is it for? Can we expand it? How much? What kind of support is there? These questions become key, especially if you decide to build your custom workstation. Our old systems many times are repurposed for other work. Many can function in other duties for years.

Do you feel mobile workstations are just as powerful for your work as desktops these days?
We have had our eye on mobile workstations for some time. Many are extremely powerful and can find a good place for a specific purpose. There can be a few problems in this setup: additional monitor capabilities, external GPU, external mass storage connectivity. For a lot of work, mobile workstations make sense; if I do not have to connect a lot of peripherals and can work mostly self-contained or cloud-based, these can be great. In many cases you quickly learn that the keyboard, screen and battery life are not conducive to a long-term workflow. For the right workflow though, these can be great. They’re just not for us right now.

 

AMD’s Director of VFX/Media & Entertainment, James Knight

James Knight

AMD provides Threadripper and Epyc CPUs that accelerate workflows in M&E.

How does AMD describe a workstation?
Some companies have different definitions of what makes a workstation. 
Essentially AMD thinks of workstations as a combination of powerful CPUs and GPUs that enable professionals to create, produce, analyze, design, visualize, simulate and investigate without having to compromise on power or workload performance to achieve their desired results. In the specific case of media and entertainment, AMD designs and tests products aligned with the workstation ecosystem to enable professionals to do so much more within the same exact deadlines. We are giving them more time to create.

For users looking to buy a computer but are torn between off the shelf and building their own, what would you tell them?
Ultimately, professionals need to choose the best solution to meet their creative goals. We work closely with major OEMs to provide them with the best we have to offer for the market. For example, 64-core Threadripper has certainly been recognized by workstation manufacturers. System builders can offer these new CPUs to achieve great results.

What questions should pros ask before purchasing a workstation, in order to make sure they are getting the right workstation for their needs?
I typically ask professionals to focus on their pain points and how they want the new workstation to resolve those issues. More often than not, they tell me they want more time to create and time to try various renderings. With an optimized workstation matched with an optimal balance of powerful CPUs and reliable GPUs, pros can achieve the results they demand over and over.

What trends have you seen happening in this space over the last couple of years?
As memory technology improves and larger models of higher resolution are created, I’ve seen user expectations increase dramatically, as has their desire to work easily with these files. The demand for reliable tools for creating, editing and producing content has been constantly growing. For example, in the case of movie mastering and encoding, AMD’s 32-core and 64-core Threadripper CPUs have exceeded expectations when working with these large files.

PFX‘s Partner/VFX Supervisor, Jan Rybar 

Jan Rybar

PFX is a Czech-based company focused on animation, post and visual effects. They work on international projects ranging from short films to commercials, TV series and feature films. The 110-member team works in their studios in Prague.

How often do you upgrade your workstations, and what process do you go through in finding the right one?
We upgrade the workstations themselves maybe every two or three years. We try to select good quality vendors and robust specs so we won’t be forced to replace workstations too often.

Do you also build your own workstations and renderfarms?
Not really — we have a vendor we like and buy all the hardware there. A long time ago, we found out that the reliability of HP and their Z line of workstations is what we need. So 99% of our workstations and blade renderfarms are HP.

How do your needs as a VFX house differ from a traditional post house?
It blends together a lot — it’s more about what the traditional post house specializes in. If it’s focused on animation or film, then the needs are quite similar, which means more based on CPU power. Lately, as we have been involved more and more in realtime engine-based workflows, state-of-the-art GPU technology is crucial. The Last Whale Singer teaser we did was created with the help of the latest GeForce RTX 2080ti hardware. This allowed us to work both efficiently and with the desired quality (raytracing).

Can you walk us through your typical workflow and how your workstations and their components play a part?
The workflow is quite similar to any other production: design/concept, sculpting, modeling, rigging, layout, animation, lighting/effects, rendering, compositing, color grading, etc.

The main question these days is whether the project runs in a classic animation pipeline, on a realtime engine pipeline or a hybrid. Based on this, we change our approach and adapt it to the technology. For example, when Telescope Animation works on a scene in Unreal, it requires different technology compared to a team that’s working in Maya/Houdini.

PNY’s Nvidia Quadro Product Marketing Manager, Carl Flygare

Carl Flygare

Nvidia’s Quadro RTX-powered workstations, featuring Nvidia Turing GPU architecture, allow for realtime raytracing, AI and advanced graphics capabilities for visualization pros. PNY is Nvidia’s Quadro channel partner throughout North America, Latin America, Europe and India.

How does PNY describe a workstation? Some folks have different definitions of what makes a workstation.
The traditional definition of the term comes from CAD — a system optimized for computer-aided design — with a professional CPU (e.g., Xeon, Ryzen), generous DRAM capacity with ECC (error correction code), a significant amount of mass storage, a graphics board capable of running the range of pro applications required by a given workflow, and a power supply and system enclosure sufficient to handle all of the above. Markets and use cases also matter.

Contemporary M&E requires realtime cinematic-quality rendering in application viewports, with an AI denoising assist. Retiming video (e.g., from 30 fps to 120 fps) for a slow-motion effect can be done by AI, with results essentially indistinguishable from a slow-motion session on the set. A data scientist would see things differently: GPU tensor TFLOPS enable rapid model training to meet inference accuracy requirements, GPU memory capacity holds extremely large datasets, and a CPU/GPU combination offers a balanced architectural approach to performance. With so many different markets and needs, practically speaking, a workstation is a system that allows a professional to do their best work in the least amount of time. Have the hardware address that need, and you’ve got a workstation.

For users looking to buy a computer but are torn between off the shelf and building their own, what would you tell them?
As Henry Ford famously said about the Model T: “Any customer can have a car painted any color that he wants so long as it is black.” That is the off-the-shelf approach to acquiring a workstation. Large Tier 1 OEMs offer extensive product lines and daunting Configure to Order options, but ultimately, all offer similar classes of systems. Off-the-shelf is easy; once you successfully navigate the product line and specifications maze, you order a product, and a box arrives. But building your own system is not for the faint-hearted. Pick up CPU data sheets from Intel or AMD — you can read them for days.

The same applies to GPUs. System memory is easier, but mass storage offers a dizzying array of options. HDD (hard disk drive) or SSD (solid state drive)? RAID (and if so, what kind) or no RAID? How much power supply capacity is required for stable performance? A built-from-scratch workstation can result in a dream system, but with a system of one (or a few), how well will critical applications run on it? What if an essential workflow component doesn’t behave correctly? In many instances this will leave you on your own. Do you want to buy a system to perform the work you went into business to do, or do you want to spend time maintaining a system you need to do your work?

A middle path is available. A vibrant, agile system-builder community with deep market and solutions knowledge exists. Vendors like Boxx Technologies, Exxact, Rave Computer, Silverdraft Supercomputing and @Xi Computer (among others) come to mind. These companies specialize in workstations (as defined by any of the definitions discussed earlier), have deep vertical knowledge, react quickly to technological advances that provide a performance and productivity edge, and vigorously support what they sell.

What questions would you suggest pros ask before deciding on the right computer for their work?
Where is their current system lacking? How are these deficits affecting creativity and productivity? What use cases does a new system need to perform well? What other parts of my employment environment do I need to interact with, and what do they expect me to provide? These top-line questions transition to many others. What is the model or scene size I need to be able to fit into GPU memory to benefit from full GPU performance acceleration? Will marketing show up in my office or cubicle and ask for a photorealistic render even though a project is early in the design stage? Will a client want to interact with and request changes by using VR? Is a component of singular significance — the GPU — certified and supported by the ISVs that my workflow is built around? Answer these questions first, and you’ll find the remainder of the process goes much more easily. Use case first, last and always!

You guys have a relationship with Nvidia and your system-builder partners use their Nvidia GPUs in their workstations. Can you talk about that?
PNY is Nvidia’s sole authorized channel partner for Nvidia Quadro products throughout North America, Latin America, Europe, the Middle East, Africa and India. Every Quadro board is designed, tested and built by Nvidia, whether it comes from PNY, Dell, HP or Lenovo. The difference is that PNY supports Quadro in any system brand. Tier 1 OEMs only support a Quadro board’s “slot win” in systems they build. This makes PNY a much better choice for GPU upgrades — a great way to extend the life of existing workstations — or when looking for suppliers that can deliver the technical support required for a wonderful out-of-box experience with a new system. It’s true whether the workstation is custom-built or purchased through a PNY Partner that specializes in delivering turnkey systems (workstations) built for professionals.

Can you talk about warranties and support? What do you offer?
PNY offers support for Nvidia in any system brand. We have dedicated Nvidia Quadro technical support reps available by phone or email. PNY never asks for a credit card number before offering product or technical support. We also have full access to Nvidia product and technical specialists should escalation be necessary – and direct access to the same Nvidia bug reporting system used by Nvidia employees around the world.

Finally, what trends do you see in the workstation market currently?
First the good: Nvidia Quadro RTX has enabled a workstation renaissance. It’s driving innovation for design, visualization and data science professionals across all major market segments. An entirely new class of product — the data science workstation — has been developed. Quadro RTX in the data centers and virtual GPU technology can bring the benefits of Quadro RTX to many users while protecting essential intellectual property. This trend toward workstation specialization by use case offers buyers more choices that better fit their specific criteria. Workstations — however defined — have never been more relevant or central to creative pros across the globe. Another good trend is the advent of true mobile workstations and notebooks, including thin and light systems, with up to Quadro RTX 5000 class GPUs.

The bad? With choice comes confusion. So many to choose from. Which best meets my needs? Companies with large IT staff can navigate this maze, but what about small and medium businesses? They can find the expertise necessary to make the right choice with PNY’s extensive portfolio of systems builders. For that matter, enterprises can find solutions built from the chassis up to support a given use case. Workstations are better than ever before and purchasing one can be easier than ever as well.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 


Workstations and Color Grading

By Karen Moltenbrey

A workstation is a major investment for any studio. Today, selecting the right type of machine for the job can be a difficult process. There are many brands and flavors on the market, and some facilities even opt to build their own. Colorists have several tools available to them when it comes to color grading, ranging from software-based systems (which typically use a multiprocessor workstation with a high-end GPU) to those that are hardware-based.

Here, we examine the color workflow of two different facilities: Technicolor Vancouver and NBCUniversal StudioPost in Los Angeles.

[Editor’s note: These interviews were conducted before the coronavirus work limits were put in place.]

Anne Boyle

Technicolor Vancouver
Technicolor is a stalwart in the post industry, with its creative family — including VFX studios MPC, The Mill, Mr. X and Mikros — and wide breadth of post production services offered in many locations around the world. Although Technicolor Vancouver has been established for some time now, it was only within the past two years that the decision was made to offer finishing services again there, with an eye toward becoming more of a boutique operation, albeit one offering top-level effects.

With this in mind, Anne Boyle joined as a senior colorist, and immediately Technicolor Vancouver began a co-production with Technicolor Los Angeles. The plan was for the work to be done in Vancouver, with review and supervision handled in LA. “So we hit the ground running and built out new rooms and bought a lot of new equipment,” says Boyle. “This included investing in FilmLight Baselight, and we quickly built a little boutique post finishing house here.”

This shared-location work setup enabled Technicolor to take advantage of the lucrative tax credits offered in Vancouver. The supervising colorist in LA reviews sessions with the client, after which she and Boyle discuss them, and then Boyle picks up the scene and performs the work based on those conversations or notes in the timeline. A similar process occurs for the Dolby SDR deliverables. “There isn’t much guesswork. It is very seamless,” she says.

“I’ve always used Baselight,” says Boyle, “and was hoping to go that route when I got here, and then this shared project happened, and it was on a Baselight [in LA]. Happily for me, the supervising colorist, Maxine Gervais, insisted that we mirror the exact setup that they had.”

Gervais was using a Baselight X system, so that is what was installed in Vancouver. “It’s multi-GPU (six Nvidia Titan XPs) with a huge amount of storage,” she says. “So we put in the same thing and mimicked the infrastructure in LA. They also put in a Baselight Assist station and plan to upgrade it in the coming months to make it color-capable as well.”

The Baselight X turnkey system ships with bespoke storage and processing hardware, although Technicolor Vancouver loaded it with additional storage. For the grading panels, the company went with the top-of-the-line Blackboard. The Vancouver facility also purchased the same displays as LA — Sony BVM-X300s.

Messiah

The mirrored setup was necessary for the shared work on Netflix’s Messiah, an HDR project that dropped January 1. “We had to deliver 10 episodes all at once in 4K, [along with] both the HDR PQ masters and the Dolby SDR deliverable, which were done here as well,” explains Boyle. “So we needed the capability to store all of that and all of those renders. It was quite a VFX-heavy show, too.”

Using Pulse, Technicolor’s internal cloud-based system, the data set is shared between the LA and Vancouver sites. Technicolor staff can pull down the data, and VFX vendors can pull their own VFX shots too. “We had exact mirrors of the data. We were not sending project files back and forth, but rather, we shared them,” Boyle explains. “So anyone could jump on the project, whether in Vancouver or LA, and immediately open the project, and everything would appear instantaneously.”

When it comes to the hardware itself, speed and power are big factors. As Boyle points out, the group handles large files, and slowdowns, render issues and playback hiccups are unacceptable.

Messiah

The color system proved its mettle on Messiah, which required a lot of skin retouching and other beauty work. “The system is dedicated and designed only for colorists,” says Boyle. “And the tools are color-focused.”

Indeed, Boyle has witnessed drastic changes in color workstations over the past several years. File sizes have increased thanks to Red 8K and raw materials, which have driven the need for more powerful machines and more powerful GPUs, particularly with increasingly complex HDR workflows, wherein floating-point precision is necessary for good color. “More work nowadays needs to be performed on the GPU,” she adds. “You just can’t have enough power behind you.”
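
A quick way to see why floating point matters is to push a gradient down and back up in 8-bit integer versus 32-bit float and count the surviving gray levels. This toy example is ours, not a description of Baselight’s internals:

    import numpy as np

    ramp = np.linspace(0.0, 1.0, 1024, dtype=np.float32)   # a smooth gradient

    # 8-bit path: quantize after pulling exposure down, then push it back up.
    down_8bit = np.round(ramp * 0.25 * 255).astype(np.uint8)
    up_8bit = np.clip(down_8bit.astype(np.float32) / 255 * 4.0, 0, 1)

    # Float path: the same two grades without intermediate quantization.
    up_float = np.clip(ramp * 0.25 * 4.0, 0, 1)

    print("unique levels, 8-bit path:", len(np.unique(up_8bit)))    # roughly 64: visible banding
    print("unique levels, float path:", len(np.unique(up_float)))   # 1024: still smooth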

NBCUniversal StudioPost
NBCUniversal StudioPost knows a thing or two about post production. Not only does the facility provide a range of post, sound and finishing services, but it also offers cutting-edge equipment rentals and custom editorial rooms used by internal and third-party clients.

Danny Bernardino

Specifically, NBCUniversal offers end-to-end picture services that include dailies, editorial, VFX, color correction, duplication and encoding/decoding, data management, QC, sound, sound editorial, sound supervision, mixing and streaming.

Each area has a plethora of workstations and systems needed to perform its given tasks. For the colorists, the facility offers two choices, both on Linux OS: a Blackmagic DaVinci Resolve 16.1.2 (fully loaded with a creative suite of plugins and add-ons) running on an HP Z840 machine, and Autodesk Lustre 2019 running on an HP Z820.

“We look for a top-of-the-line color corrector that has a robust creative tool set as well as one that is technically stable, which is why we prefer Linux-based systems,” says Danny Bernardino, digital colorist at NBCUniversal StudioPost. Furthermore, the facility prefers a color corrector that adapts to new file formats and workflows by frequently updating its versions. Another concern is that the system works in concert with all of the ever-changing display demands, such as UHD, 4K, HDR and Dolby Vision.

Color bay

According to Bernardino, the color systems at NBCUniversal are outfitted with the proper CPU/GPU and SAN storage connectivity to ensure efficient image processing, thereby allowing the color talent to work without interruption. The color suites also are outfitted with production-level video monitors that represent true color. Each has high-quality scopes (waveform, vector and audio) that handle all formats.

When it comes time to select machines for the colorists there, it is a collective process, says senior VP Thomas Thurau. First, the company ascertains the delivery requirements, and then the color talent, engineering and operations staff work together to configure the proper tool sets for the customers’ content. How often the equipment is replaced is contingent on whether new image and display technology has been introduced.

Thurau defines a solid colorist workstation as a robust platform that is Linux-based and has enough card slots or expansion chassis capabilities to handle four or more GPU cards, Fibre Channel cards and more. “All of our systems are in constant demand, from compute to storage, thus we look for systems and hardware that are robust through to delivery,” he notes.

Mr. Robot

NBCUniversal StudioPost is always humming with various work. Some of the more recent projects there include Jaws, which was remastered in UHD/HDR, Casino (UHD/HDR), the How to Train Your Dragon series (UHD/HDR) and an array of Alfred Hitchcock’s more famous films. The company also services broadcast episodic (NBCU and others) and OTT/streaming customers, offering a full suite of services (Avid, picture and sound). This includes Law & Order: SVU, Chicago Med, Will & Grace, Four Weddings and a Funeral and Mr. Robot, among others.

“We take incredible pride in all aspects of our color services here at NBCUniversal StudioPost, and we are especially pleased with our HDR grades,” says Thurau.

For those who prefer to do their own work, NBCUniversal has over 185 editorial rooms, ranging from small to large suites, set up with Avid Media Composer.


Karen Moltenbrey is a veteran writer, covering visual effects and post production.

What Makes a Workstation?

By Mike McCarthy

Computer manufacturers charge a premium for their highest-end “workstation” systems, but many people don’t fully understand what really distinguishes a workstation-class system from any other computer. Admittedly, there is no cut-and-dried line, but workstations usually have a few characteristics that make them more suitable for professional applications than regular home or office PCs. They are usually faster, have a greater level of expandability and are more reliable than other PCs. This, of course, makes them more expensive, but depending on what you need them for, they can be well worth the additional cost.

Workstation Graphics
Nearly all workstations offer professional-level, OpenGL-optimized graphics cards at a time when having any discrete GPU at all is becoming rare outside of gaming systems. Nvidia’s Quadro cards and AMD’s Radeon Pro line have more RAM than their gaming counterparts, and their drivers are optimized for professional applications. High-bit-depth color processing used to be the other defining characteristic of professional graphics cards, but HDR imaging has pushed 10-bit color support into consumer GPUs, removing that as a differentiating factor.

Scalability
Most workstations have a greater level of expandability in the form of more slots for RAM and PCIe cards and more storage and networking options. This allows more flexibility in configuring a system that accommodates a specific task or application. Editors need lots of storage, animators need lots of RAM, and VFX artists might need more CPU power. They might all use the same model of workstation but in totally different configurations. Those extra card slots also allow hardware upgrades for dedicated tasks. Editors might install a video I/O card for SDI interfaces, a sound mixer might need to install Avid Pro Tools processing cards, and many users will need high-bandwidth network cards — running at 10 gigabits or more — to share data with others they are working with.
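As a rough back-of-the-envelope example (my numbers, not McCarthy’s) of why those 10-gigabit-plus cards matter, moving a 1TB media folder over the network takes hours at gigabit speeds and minutes at 10 gigabits, even before protocol overhead:

```python
# Back-of-the-envelope transfer times for a 1TB media folder at different
# link speeds (ideal line rate, ignoring protocol overhead and disk limits).

def transfer_hours(size_tb, link_gbps):
    bits = size_tb * 8e12          # 1 TB = 8e12 bits (decimal terabytes)
    seconds = bits / (link_gbps * 1e9)
    return seconds / 3600

for gbps in (1, 10, 25):
    print(f"{gbps:>2} Gb/s: {transfer_hours(1.0, gbps):.2f} hours")
# 1 Gb/s: ~2.22 hours; 10 Gb/s: ~0.22 hours (about 13 minutes); 25 Gb/s: ~0.09 hours
```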

Different Classes of Workstation
There are also a variety of classes of workstations available, depending on your budget and needs. Top-end workstations have dual-CPU sockets (and in rare cases, four or more sockets), multiplying the potential processing power and aggregate bandwidth. These systems are adapted from server architectures, with a few changes to improve interactive performance and expansion options. They offer many channels and slots for maximum memory capacity and throughput. Intel has had its Xeon Scalable processors in this market for many years, while AMD has recently introduced its EPYC processor line into this segment.

Below that top tier come high-end desktop systems, which offer a single CPU (possibly with up to 32 cores), four to six channels of memory and many PCIe lanes. Intel covers this range with its Core X and Xeon W CPUs, while AMD offers its Threadripper line.

At a lower performance level, some workstations are based on the same CPUs as gaming systems. These systems have much less powerful chipsets, with fewer PCIe lanes for expansion and only two channels of memory, but they still offer very good performance on smaller projects at lower prices. Intel’s Core CPUs and AMD’s Ryzen CPUs fall into this category, with up to eight and 16 cores, respectively. These systems handle single-threaded workloads as well as the higher-end options do, but for applications that are well threaded, or when running many tasks at once, the higher-end systems will have a definite benefit.

Mobile Workstations
Separately, there are also mobile workstations, which are top-end laptop units. These are usually defined by having professional GPUs and — more recently — in many cases by having mobile Xeon CPUs. They usually have lots of RAM and very good integrated display options, occasionally with integrated calibration systems. They use NVMe storage, but that is no longer unique to workstations. They usually have more ports available than consumer systems and a wider variety of configuration options. Many models also have Mil-Spec ruggedness to protect them from damage in the field.

There are also a number of other unique workstation offerings, from all-in-one systems similar to the iMac Pro to tablets and VR backpacks. The one thing these all have in common is that they are designed for professional users and applications that have high processing workloads on either the CPU or GPU.

On the upper end, it is easy to see what makes a workstation different, with dual-socket Xeon processors, high core counts, RAM measured in terabytes, ECC for stability and RAID-based storage controllers for increased bandwidth and security. But what about “low-end” workstations? If low-end workstation core counts and RAM capacity are similar to a high-end gaming system, then what other features do the lower-end workstations bring to the table?

Reliability
Most large-scale workstation manufacturers have invested more engineering and testing time into workstation products. This includes better thermal components to allow the systems to run cooler and quieter under larger loads. Companies like Dell, HP and Lenovo all include their own software for optimizing and tweaking the system for maximum performance with various supported applications. They also work with software companies to certify various configurations to guarantee support for specific applications. All this effort should make workstations more reliable and less likely to crash or error out during important tasks.

Thermal Engineering
Most computers aren’t designed to be run at maximum performance for long periods of time, as many applications aren’t that taxing. For ones that are, many users take breaks, allowing the system to cool down, but an editor might kick off a render queue before heading home, and the system is processing at maximum capacity for the rest of the night or weekend. Cheaply built systems will heat up quickly and then throttle back the performance to prevent overheating, slowing down the task at hand. Workstations are engineered to carry on those intense computing tasks for greater periods of time without exceeding their thermal envelope. And even when not operating at peak processing performance, many workstations are designed to run much quieter, allowing their users to think more clearly or better hear the audio associated with the tasks they are working on.

Windows for Workstations
Microsoft recently released a version of Windows 10 targeted at high-end power users. It supports more CPUs and RAM than Windows 10 Pro, broader storage options and faster networking protocols, among other features. This difference in software support might further differentiate workstation-class systems in the future.

Mac Workstations
Apple has offered a workstation to its high-end users in the form of the Mac Pro. The original “cheese grater” silver tower had Xeon CPUs, ECC memory and a limited number of PCIe slots for expansion. This was replaced by the “trash can” black cylinder Mac Pro, which was, arguably, not a proper workstation. It didn’t have PCIe slots for expansion, it didn’t have hard drive slots for storage, and, most importantly, it didn’t have the thermal engineering to sustain high-performance workloads for an extended period of time. But it was the best Apple had to offer for many years, putting the trash can into places it never otherwise would have been and was not designed for.

The new Mac Pro tower (or rack) has returned a true workstation to Apple’s product portfolio. With a single-socket Xeon CPU, it sits at the peak of the mid-level workstation tier. With more slots than any other Mac ever, it is fully expandable and upgradable. (Even the I/O header can be replaced in the future.) While it would be possible for Apple to release a more powerful dual-socket option in the future, I doubt it will do so because the current Mac Pro should meet the needs of 99% of potential users due to how much multi-core CPUs have improved in the last few years.

Workstations in the Future
I expect the trend of users moving from top-end dual-socket systems to maxed-out mid-level systems to continue in both the PC and Mac world as increases in maximum processing performance (and price) exceed the increases in workloads in most workstation tasks. This should increase the market for mid-level workstations, eventually increasing the options available and decreasing their price. We also see the lines blurring between mobile workstations and gaming laptops as the GPUs and drivers become more standardized between them. It will also be interesting to see what impact Intel and Micron’s new Optane persistent memory architecture has on workstations and their applications. Someday soon we might see integrated network interfaces that are faster than 1Gb, which has been standard since 2004. Until then, we will still be using cards to upgrade our workstations to the capabilities we need them to have for the tasks we need to accomplish, which is what they were designed for.

Quick Chat: Scholar’s Will Johnson and William Campbell

By Randi Altman

In celebrating its 10th anniversary, animation and design company Gentleman Scholar has relaunched as Scholar and has put a new emphasis on its live-action work. Started by directors/partners William Campbell and Will Johnson in Los Angeles, the company has grown over the years and now boasts a New York City location as well.

Recent Scholar projects include the animated Timberland Legends Club spot, the live-action and animated Porsche Pop Star and the live-action Acura TLX.

Considering the name change and website rebrand, we decided to reach out to “The Wills” to find out more about Scholar’s work philosophy and what this change means for the company.

Audi Q3

Why did you decide to rename and relaunch as Scholar?
Will Johnson: After 10 years it felt like a good time to redefine how the world views us — not only as a one-stop shop that can handle all of your design and animation needs, but also as a live-action and storytelling powerhouse.

Will Campbell: The new name evokes cleanliness and sophistication and better represents how we have evolved. Gentleman Scholar was fun, quirky and playful. We’re still all of those things, but we feel like we’ve also become more cinematic, more polished and better collaborators that understand production more clearly… which allows us to navigate the industry better as a whole.

Even when it comes to live action and carrying our film into post, we can assess solutions on-set quicker and more fluidly, understanding the restrictions or additions we can take with us into the software. Scholar has changed immensely over the past 10 years. We have grown up and become smarter, faster and better. The rebrand is a window to who we have already become and who we plan to be.

How is the business different, and what’s stayed the same?
Johnson: It’s more refined. We’ve learned a lot about how to conduct ourselves in a competitive art world — the positive ways we approach each project, letting the stress of the job kick us in the ass without letting it guide the decisions we make. It’s also about being patient with our team as well as our own decision-making.

Creativity is a process, and “turning it on” every day isn’t always easy. Understanding that not every idea you have is a great idea and how to be comfortable with your creative self is important. To trust in the “why” you are making something versus the “what” that you make. And that’s reflected in the new company name and our new website design. It’s the same us. The same wild bunch of creative explorers intent on pushing the boundaries of design and live action. We are just more certain of who we are and the stories we tell, and therefore more inclusive in our path to get there.

Acura

Campbell: We now have a decade’s worth of work to back up our thoughts and collaborations. This is enormous when you need to show how capable you are, not just in the standard we hold ourselves to visually, but in the quality and sophistication of our evolving storytelling. We have fine-tuned our production processes, enabling the pipelines of our edit, animation, CG and composite teams to more easily embrace the techniques and tools we use to craft the stories we want to tell… so we can be more decisive with the concepts we put on the table. From the software to the hardware, we are more refined.

Can you talk about how the industry has changed over the past 10 years?
Johnson: It’s more spread out than it’s ever been. There is more content that reaches more eyes in more places. From social to OOH to broadcast, the need to pull everyone together and create something that speaks to everyone all at once feels like it’s stronger and more apparent than before. And we’ve seen it all at this point, from vertical campaigns to entirely experiential ones. The era of “do more with less” is here.

Campbell: For us, we were very young when we opened Scholar. We were in our 20s, and everything was a fire drill and we thrived off the chaos. We have learned to harness the inspiration that comes with chaos and channel it into focused, productive creation.

Have you embraced working in the cloud — storage, rendering, review and approval, etc. — and if so, in what way?
Johnson: Yes. We know it’s a fast-paced world and in the climate of things, generally the globe is embracing a cloud-based way of thinking. Luckily, we have an amazing team of technologists so we can tap into our home-base server from anywhere at any time. From rendering to storage to reviews and approvals — it keeps us all united, focused and organized when we’re moving a million miles a minute in any different direction.

Campbell: Scholar has been testing the technology as it is getting better and cheaper, but we are always balancing convenience versus security, and those swing on a job-by-job basis. We’ve written tools to take advantage of storage and rendering resources on both coasts and use Aspera to facilitate file syncing between each office.

Can you talk about the tools you use for your work?
Johnson: The tangible ones are the usual suspects. Adobe’s Creative Suite and 3D tools like Autodesk Maya, Maxon Cinema 4D, Foundry Nuke and all of the animation and time-based ones, like Adobe Premiere and Avid Media Composer. But my favorite tools tend to be the brains and skills of our team… the words on paper and the channeling of art and thought into something tactile. As creators, we lust to make things, and seeing that circuit board of craft and making is something amazing to watch.

Campbell: Scholar has always been a mixed-media studio. We love getting our hands dirty with new software or cameras. We fundamentally want to do what’s right for the job and not rest inside our comfort zone. Thinking about what style is right for a client, not “how do I make my style fit,” is just how we are wired. The tool is always a means to an end. My favorite jobs are the ones where the technique is invisible, and it’s all about the experience.

We are operating in an entirely new world these days with the coronavirus and working remotely. How are you guys embracing the change?
Campbell: With an office on each coast, we have already had to learn to work as a team remotely. The years of unifying groups from a distance and finding ways for technology to bring artists closer together have set the stage for us right now. We have transitioned our workforce to 100% remote. It’s early days yet, but everyone is in good spirits, and we feel as connected as ever, although I do miss our lunch table.

Johnson: We’re definitely thankful for the staff and talent that we surround ourselves with and how they’ve handled their work-from-home routines. The check-ins, the mind melds and the daily (hourly) hangouts have helped. We’re using the change in the world as an opportunity to showcase our adaptability — how we can scale up and down even in the remote world — as a way to continue to grow our relationships and push the creative boundaries.

As people who find it hard to simply sit still, we’ve changed how we approach and talk about a project as each script comes in. The conversations about techniques are important — how we look at animation with a live-action lens, how 2D can become 3D, or vice versa. We’re more easily adaptable and change purely out of the need to discover what’s new.

Main Image: (L-R) Will Johnson and Will Campbell

Colorist Chat: Framestore LA senior colorist Beau Leon

Veteran colorist Beau Leon recently worked with director Spike Jonze on a Beastie Boys documentary and a spot for cannabis retailer MedMen.

What’s your title and company?
I’m senior colorist at LA’s Framestore.

Spike Jonze’s MedMen

What kind of services does Framestore offer?
Framestore is a multi-Oscar-winning creative studio founded over 30 years ago, and the services offered have evolved considerably over the decades. We work across film, television, advertising, music videos, cinematic data visualization, VR, AR, XR, theme park rides… the list is endless and continues to change as new platforms emerge.

As a colorist, what would surprise people the most about what falls under that title?
Whatever the creative direction or the equipment used to shoot something, whether it be for film or TV, people might not factor in how much color or tone can dictate the impact a story has on its audience. As a colorist, my role often involves acting as a mediator of sorts between various creative stakeholders to ensure everyone is on the same page about what we’re trying to convey, since intent can translate differently through color.

Are you sometimes asked to do more than just color on projects?
Earlier in my career, the process was more collaborative with DPs and directors who would bring color in at the beginning of a project. Now, particularly when it comes to commercials with tighter deadlines and turnarounds, many of those conversations happen during pre-production without grading factored in until later in the pipeline.

Rihanna’s Needed Me

Building strong relationships and working on multiple projects with DPs or directors always allows for more trust and creative control on my end. Some of the best examples I’ve seen of this are on music video projects, like Rihanna’s Needed Me, which I graded here at Framestore for a DP I’d grown up in the industry with. That gave me the opportunity to push the creative boundaries.

What system do you work on?
FilmLight Baselight.

You recently worked on the new Beastie Boys documentary, Beastie Boys Story. Can you talk a bit about what you did and any challenges relating to deadlines?
I’ve been privileged to work with Spike Jonze on a number of projects throughout my career, so going into Beastie Boys Story, we already had a strong dialogue. He’s a very collaborative director and respectful of everyone’s craft and expertise, which can be surprisingly rare within our industry.

Spike Jonze’s Beastie Boys Story

The unique thing about this project was that, with so much old footage being used, it needed to be mastered in HDR as well as reworked for IMAX. And with Spike being so open to different ideas, the hardest part was deciding which direction to choose. Whether you’re a hardcore Beastie Boys fan or not, the documentary is well worth watching when it airs on Apple TV+ in April.

Any suggestions for getting the most out of a project from a color perspective?
As an audience, our eyes have evolved a great deal over the last few decades. I would argue that most of what we see on TV and film today is extremely oversaturated compared to what we’d experience in our real environment. I think it speaks to how we treat consumers and anticipate what we think they want — colorful, bright and eye-catching. When it’s appropriate, I try to challenge clients to think outside those new norms.

How do you prefer to work with the DP or director?
Whether I’m working with a DP or a director, the more involved I can be early on in the conversation, the more seamless the process becomes during post production, which ultimately leads to a better end result. In my experience, this type of access is more common when working on music videos.

Most one-off commercial projects see us dealing with an agency more often than the director, but one exception that comes to mind is when I had the chance to collaborate with Spike Jonze on The New Normal, the first-ever brand campaign for cannabis retailer MedMen. He placed an important emphasis on grading and was very open to my recommendations and vision.

How do you like getting feedback in terms of the look?
A conversation is always the best way to receive feedback versus a written interpretation of imagery, which tends to become very personal. An example might be when a client wants to create the feeling of a warm climate in a particular scene. Some might interpret that as adding more warm color tones, when in fact, if you think about some of the hottest places you’ve ever visited, the sun shines so fiercely that it casts a bright white hue.

What’s your favorite part of the job?
That’s an easy answer — to me, it’s all about the amazing people you meet in this industry and the creative collaboration that happens as a result. So many of my colleagues over the years have become great friends.

Any least favorites?
There isn’t much that I don’t love about my job, but I have witnessed a change over the years in the way that our industry has begun to undervalue relationships, which I think is a shame.

If you didn’t have this job, what would you be doing instead?
I would be an art teacher. It combines my passion for color and visual inspiration with a forum for sharing knowledge and fostering creativity.

How early did you know this would be your path?
In my early 20s, I started working on dailies (think The Dukes of Hazzard, The Karate Kid, Fantasy Island) at a place in The Valley that had a telecine machine that transferred at a frame rate faster than anywhere else in LA at the time. It was there that I started coloring (without technically realizing that was the job I was doing, or that it was even a profession).

Soon after, I received a call from a company called 525 asking me to join them. They worked on all of the top music videos during the prime “I Want My MTV” era, and after working on music videos as a side hustle at night, I knew that’s where I wanted to be. When I first walked into the building, I was struck by how much more advanced their technology was and immediately felt out of my depth. Luckily, someone saw something in me before I recognized it within myself. I worked on everything from R.E.M.’s “Losing My Religion” to TLC’s “Waterfalls” and The Smashing Pumpkins’ “Tonight, Tonight.” I found such joy in collaborating with some of the most creative and spirited directors in the business, many of whom were inspiring artists, designers and photographers in their spare time.

Where do you find inspiration?
I’m lucky to live in a city like LA with such a rich artistic scene, so I make a point to attend as many gallery openings and exhibitions as I can. Some of my favorite spaces are the Annenberg Space for Photography, the Hammer Museum and Hauser & Wirth. On the weekends I also stop by Arcana bookstore in Culver City, where they source rare books on art and design.

Name three pieces of technology you can’t live without.
I think I would be completely fine if I had to survive without technology.

This industry comes with tight deadlines. How do you de-stress from it all?
After a long day, cooking helps me decompress and express my creativity through a different outlet. I never miss a trip to my local farmer’s market, which also helps to keep me inspired. And when I’m not looking at other people’s art, I’m painting my own abstract pieces at my home studio.

Review: Digital Anarchy’s Transcriptive 2.0

By Barry Goch

Not long ago, I had the opportunity to go behind the scenes at Warner Bros. to cover the UHD HDR remastering of The Wizard of Oz. I had recorded audio of the entire experience so I could get accurate quotes from all involved — about an hour of audio. I then uploaded the audio file to Rev.com and waited. And waited. And waited. A few days later they came back and said they couldn’t do it. I was perplexed! I checked the audio file, and I could clearly hear the voices of the different speakers, but they couldn’t make it work.

That’s when my editor, Randi Altman, suggested Digital Anarchy’s Transcriptive, and it saved the day. What is Transcriptive? It’s an automated, intelligent transcription plugin for Adobe Premiere editors, designed to transcribe video accurately using multiple speech and natural language processing engines.

Well, not only did Transcriptive work, it worked super-fast, and it’s affordable and simple to use … once everything is set up. I spent a lot of time watching Transcriptive’s YouTube videos and then had to create two accounts for the two different AI transcription portals that they use. After a couple of hours of figuring and setup, I was finally good to go.

Digital Anarchy has lots of videos on YouTube about setting up the program. Here is a link to the overview video and a link to the new 2.0 features. Once everything was set up, it took me less than five minutes from start to finish to transcribe a one-minute video. That includes the coolest part: automatically linking the transcript to the video clip with word-for-word accuracy.

Transcriptive extension

Step by Step
Import your video clips into Premiere, select the clips and open the Transcriptive extension.

Tell Transcriptive if you want to use an existing transcript or create a new transcription.

Then choose the AI that you want to transcribe your clip. You see the cost upfront, so no surprises.

Launch app

I picked the Speechmatics AI:

Choosing AI

Once you press continue, Media Encoder launches.

Media Encoder making FLAC file automatically.

And Media Encoder automatically makes a FLAC file and uploads it to the transcription engine you picked.

One minute later, no joke, I had a finished transcription linked word-accurately to my source video clip.
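To give a sense of what that word-accurate linking amounts to under the hood — this is a hypothetical sketch, not Transcriptive’s actual code — each word comes back from the speech engine with a start time, which then needs converting into a frame-accurate timecode marker on the clip:

```python
# Hypothetical sketch (not Transcriptive's internals): turning word-level
# timestamps from a speech-to-text result into frame-accurate clip markers.
# Uses simple non-drop-frame math at an assumed 23.976 fps project rate.

FPS = 23.976
NOMINAL = 24  # frames counted per second for timecode display

def to_timecode(seconds, fps=FPS, nominal=NOMINAL):
    total_frames = int(round(seconds * fps))
    f = total_frames % nominal
    s = (total_frames // nominal) % 60
    m = (total_frames // (nominal * 60)) % 60
    h = total_frames // (nominal * 3600)
    return f"{h:02d}:{m:02d}:{s:02d}:{f:02d}"

# Example speech-engine output: (word, start time in seconds) -- made up here.
words = [("The", 0.12), ("Wizard", 0.31), ("of", 0.58), ("Oz", 0.70)]
markers = [(to_timecode(start), word) for word, start in words]
print(markers)  # each word becomes a marker at its frame position in the clip
```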

Final Thoughts
The only downside to this is that the transcription isn’t 100% accurate. For example, it heard Lake Tahoe as “Lake Thomas” and my son’s name, Oliver, as “over.”

Final transcription

This lack of accuracy is not a deal breaker for me, especially since I would have been totally out of luck without it on The Wizard of Oz article, which you can read here. For me, the speed and ease of use more than compensate for the lack of accuracy. And as the AI engines get better, the accuracy will only improve.

And on February 27, Digital Anarchy released Transcriptive V.2.0.3, which is compatible with Adobe Premiere v14.0.2. The update also includes a new prepaid option that can lower the cost of transcription to $2.40 per hour of footage. Transcriptive’s tight integration with Premiere makes it a must-have for working with transcripts when cutting long- and short-form projects.


Barry Goch is a finishing artist at LA’s The Foundation as well as a UCLA Extension Instructor, Post Production. You can follow him on Twitter at @Gochya

Seagate’s new IronWolf 510 M.2 NVMe SSD

Seagate Technology has beefed up its high-performance solutions for multi-user NAS environments by adding to its IronWolf SSD product line. IronWolf 510 is an M.2 NVMe SSD with caching speeds of up to 3GB/s for NVMe-compatible systems and is designed for creative pros and businesses that need 24/7 multi-user storage that is cache-enabled.

The IronWolf 510 SSD meets NAS manufacturer requirements of one drive write per day (DWPD), allowing multi-user NAS environments to do more with their data with lasting performance. According to Seagate, IronWolf 510 SSD is reliable with 1.8 million hours mean time between failures (MTBF) in a PCIe form factor, two years of Rescue Data Recovery Services, and a five-year limited warranty. IronWolf Health Management helps analyze drive health and will soon be available on compatible NAS systems.

“We are the first to provide a purpose-built M.2 NVMe for NAS that not only goes beyond SATA performance metrics but also provides three times the endurance when compared to the competition. This meets the required endurance spec of one DWPD which our NAS partners expect for their customers,” says Matt Rutledge, senior VP, devices. “Because of such high endurance, our customers are getting a tough SSD for small business and creative professional NAS environments.”

The IronWolf 510 SSD PCIe Gen3 x4, NVMe 1.3 is available in 240GB ($119.99), 480GB ($169.99), 960GB ($319.99) and 1.92TB ($539.99) capacities and is compatible with leading NAS vendors.
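For a sense of scale, the one-DWPD rating converts to total bytes written with simple arithmetic — capacity times drive writes per day times days in the warranty period. This back-of-the-envelope sketch (mine, not from Seagate’s spec sheet) works it out for the 1.92TB model over the five-year warranty:

```python
# Back-of-the-envelope endurance math: DWPD -> total terabytes written (TBW).

def total_tb_written(capacity_tb, dwpd, warranty_years):
    """capacity x drive-writes-per-day x days in the warranty period."""
    return capacity_tb * dwpd * warranty_years * 365

# Example: the 1.92TB IronWolf 510 at 1 DWPD over its five-year warranty.
print(f"{total_tb_written(1.92, 1, 5):.0f} TB written")  # ~3504 TB, about 3.5 PB
```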

Colorist Chat: Keith Shaw on Showtime’s Homeland and the process

By Randi Altman

The long wait for the final season of Showtime’s Homeland seemed to last an eternity, but thankfully the series is now airing, and we here at postPerspective are pretty jazzed about it. Our favorite spies, Carrie and Saul, are back at it, with this season being set in Afghanistan.

Keith Shaw

Year after year, the writing, production and post values on Homeland have been outstanding. One of those post folks is colorist Keith Shaw from FotoKem’s Keep Me Posted, which focuses on finishing services for television.

Shaw’s credits are impressive. In addition to Homeland, his work can be seen on Ray Donovan, Shameless, Animal Kingdom and many others. We reached out to Shaw to find out more about working on Homeland from the first episode to the last. Shaw shares his workflow and what inspires him.

You’ve been on Homeland since the beginning. Can you describe the look of the show and how you’ve worked with DPs David Klein, ASC, and Giorgio Scali, ASC, as well as producer Katie O’Hara?
Working on Homeland from Episode 1 has been a truly amazing experience. Katie, Dave, Giorgio and I are an extremely collaborative group.

One consistent factor of all eight seasons has been the need for the show to look “real.” We don’t have any drastic or aggressively stylized looks, so the goal is to subtly manipulate the color and mood yet make it distinct enough to help support the storyline.

When you first started on the show, how would you describe the look?
The first two seasons were shot by Nelson Cragg, ASC. For those early episodes, the show was a bit grittier and more desaturated. It had a darker, heavier feel to it. There was not as much detail in the dark areas of the image, and the light fell off more quickly on the edges.

Although the locations and looks have changed over the years, what’s been the common thread?
As I mentioned earlier, the show has a realism to it. It’s not super-stylized and affected.

Do the DPs come to the color suite? What kind of notes do you typically get from them?
They do when they are able (which is not often). They are generally on the other side of the world. As far as notes, it depends on the episode. When I’m lucky, I get none. Generally, there are not a lot of notes. That’s the advantage of collaborating on a show from the beginning. You and the DP can “mold” the look of the show together.

You’ve worked on many episodics at Keep Me Posted. Prior to that you were working on features at Warner Bros. Can you talk about how that process differs for you?
In remastering and restoration of feature films, the production stage is complete. It’s not happening simultaneously, and that means the timeline and deadlines aren’t as stressful.

Digital intermediates on original productions, on the other hand, are similar to television because multiple things are happening all at once. There is an overlap between production and post. During color, the cut can be changing, and new effects could be added or updated, but with much tighter deadlines. DI was a great stepping stone for me to move from feature films to television.

Now let’s talk about some more general aspects of the job…

As a colorist, what would surprise people the most about what falls under that title?
First of all, most people don’t have a clear understanding of what a colorist is or does. Even after 25 years and multiple explanations, my father-in-law still tells everyone I’m an editor.

Being a colorist means you wear many hats — confidante, mediator, therapist, VFX supervisor, scheduler and data manager — in addition to that color thing. For me, it boils down to three main attributes. One, you need to be artistic/creative. Two, you need to be technical. Finally, you need to mediate the decision-making processes. Sometimes that can be the hardest part of all, when there are competing viewpoints and visions between all the parties involved.

What system do you work on?
Digital Vision’s Nucoda.

Are you sometimes asked to do more than just color on projects?
Today’s color correctors are incredibly powerful and versatile. In addition to color, I can do light VFX, beauty work, editing or technical fixes when necessary. The clients appreciate the value of saving time and money by taking care of last-minute issues in the color suite.

What’s your favorite part of the job?
Building relationships with clients, earning their trust and helping them bring their vision to the screen. I love that special moment when you and the DP are completely in sync — you’re reaching for the knobs before they even ask for a change, and you are finishing each other’s sentences.

What’s your least favorite?
Deadlines. However, they are actually helpful in my case because otherwise I would tweak and re-tweak the smallest details endlessly.

Can you name some recent projects you have worked on?
Ray Donovan, Shameless, Animal Kingdom, Single Parents and Bless This Mess are my current shows.

Any suggestions for getting the most out of a project from a color perspective?
Become a part of the process as early as possible. Establishing looks, LUTs and good communication with the cinematographer are essential.

How do you prefer the DP or director to describe the look they want?
Each client has a different source of inspiration and way of conveying their vision. I’ve worked from fabric and paint samples, YouTube videos, photographs, magazine ads, movie or television show references, previous work (theirs and/or mine) and so on.

What is the project that you are most proud of?
I can’t pick just one, so I’ll pick two. From my feature mastering work, The Shawshank Redemption. From television, Homeland.

Where do you find inspiration?
Definitely in photography. My father was a professional photographer and we had our own darkroom. As a kid, I spent countless hours after school and on weekends learning how to plan, take and create great photographs. It is still a favorite hobby of mine to this day.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

VFX studio One Of Us adds CTO Benoit Leveau

Veteran post technologist Benoit Leveau has joined London’s One of Us as CTO. The studio, which is in its 16th year, employs 200 VFX artists.

Leveau, who joins One of Us from Milk VFX, has been in the industry for 18 years, starting out in his native France before moving to MPC in London. He then joined Prime Focus, integrating the company’s Vancouver and Mumbai pipelines with London. In 2013, he joined Milk in its opening year as head of pipeline. He helped to build that department and later led the development of Milk’s cloud rendering system.

The studio, which depends on what it calls “the efficient use of existing technology and the timely adoption of new technology,” says Leveau’s knowledge and experience will ensure that “their artists’ creativity has the technical foundation which allows it to flourish.”

Goldcrest Post’s Jay Tilin has passed away

Jay Tilin, head of production at New York’s Goldcrest Post, passed away last month after a long illness. For 40 years, Tilin worked in the industry as an editor, visual effects artist and executive. His many notable credits include the Netflix series Marco Polo and the HBO series Treme and True Detective.

“Jay was an integral part of New York’s post production community and one of the top conform artists in the world,” said Goldcrest Post managing director Domenic Rom. “He was beloved by our staff and clients as an admired colleague and valued friend. We offer our heartfelt condolences to his family and all who knew him.”

Tilin began his career in 1980 as an editor with Devlin Productions. He also spent many years at The Tape House, Technicolor, Riot and Deluxe, all in New York. He was an early adopter of many now standard post technologies, from the advent of HD video in the 1990s through more recent implementations of 4K and HDR finishing.

His credits also include the HBO series Boardwalk Empire, the Sundance Channel series Hap and Leonard, the PBS documentary The National Parks and the Merchant Ivory feature City of Your Final Destination. He also contributed to numerous commercials and broadcast promos. A native New Yorker, Tilin earned a degree in broadcasting from SUNY Oswego.

Tilin is survived by his wife Betsy, his children Kelsey and Sam, his mother Sonya and his sister Felice (Trudy).

Editor Anthony Marinelli joins Northern Lights

Editor Anthony Marinelli has joined post studio Northern Lights. Marinelli’s experience spans commercial, brand content, film and social projects. Marinelli comes to Northern Lights from a four-year stint at TwoPointO where he was also a partner. He has previously worked at Kind Editorial, Alkemy X, Red Car, Cut+Run and Crew Cuts.

Marinelli’s work includes projects for Mercedes, FedEx, BMW, Visa, Pepsi, Scotts, Mount Sinai and Verizon. He also edited the Webby Award-winning documentary “Alicia in Africa,” featuring Alicia Keys for Keep a Child Alive.

Marinelli is also active in independent theater and film. He has written and directed many plays and short films, including Acoustic Space, which won Best Short at the 2018 Ridgewood Guild Film Festival and Best Short Screenplay at the Richmond International Film Festival.

Marinelli’s most recent campaigns are for Mount Sinai and Bernie & Phyl’s for DeVito Verdi.

He works on Avid Media Composer and Adobe Premiere. You can watch his reel here.

Blackmagic releases Resolve 16.2, beefs up audio post tools

Blackmagic has updated its color, edit, VFX and audio post tool to Resolve 16.2. This new version features major Fairlight updates for audio post as well as many improvements for color correction, editing and more.

Editing in the Fairlight audio timeline with a mouse and keyboard gets a major overhaul in this release. The new edit selection mode unlocks functionality previously available only via the audio editor on the full Fairlight console, so editing is much faster than before. In addition, the edit selection mode makes adding fades and cuts and even moving clips only a mouse click away. New scalable waveforms let users zoom in without adjusting the volume. Bouncing lets customers render a clip with custom sound effects directly from the Fairlight timeline.

Adding multiple clips is also easier, as users can now add them to the timeline vertically, not just horizontally, making it simpler to add multiple tracks of audio at once. Multichannel tracks can now be converted into linked groups directly in the timeline so users no longer have to change clips manually and reimport. There’s added support for frame boundary editing, which improves file export compatibility for film and broadcast deliveries. Frame boundary editing now adds precision so users can easily trim to frame boundaries without having to zoom all the way in the timeline. The new version supports modifier keys so that clips can be duplicated directly in the timeline using the keyboard and mouse. Users can also copy clips across multiple timelines with ease.

Resolve 16.2 also includes support for the Blackmagic Fairlight Sound Library with new support for metadata based searches, so customers don’t need to know the filename to find a sound effect. Search results also display both the file name and description, so finding the perfect sound effect is faster and easier than before.

MPEG-H 3D immersive surround sound audio bussing and monitoring workflows are now supported. Additionally, improved pan and balance behavior includes the ability to constrain panning.

Fairlight audio editing also has index improvements. The edit index is now available in the Fairlight page and works as it does in the other pages, displaying a list of all media used; users simply click on a clip to navigate directly to its location in the timeline. The track index now supports drag selections for mute, solo, record enable and lock as well as visibility controls so editors can quickly swipe through a stack of tracks without having to click on each one individually. Audio tracks can also be rearranged by clicking and dragging a single track or a group of tracks in the track index.

This new release also includes improvements in AAF import and export. AAF support has been refined so that AAF sequences can be imported directly to the timeline in use. Additionally, if the project features a different time scale, the AAF data can also be imported with an offset value to match. AAF files that contain multiple channels will also be recognized as linked groups automatically. The AAF export has been updated and now supports industry-standard broadcast wave files. Audio cross-fades and fade handles are now added to the AAF files exported from Fairlight and will be recognized in other applications.

For traditional Fairlight users, this new update makes major improvements in importing old legacy Fairlight projects — including improved speed when opening projects with over 1,000 media files — so projects are imported more quickly.

Audio mixing is also improved. A new EQ curve preset for clip EQ in the inspector allows removal of troublesome frequencies. New FairlightFX filters include a new meter plug-in that adds a floating meter for any track or bus, so users can keep an eye on levels even if the monitoring panel or mixer are closed. There’s also a new LFE filter designed to smoothly roll off the higher frequencies when mixing low-frequency effects in surround.

Working with immersive sound workflows using the Fairlight audio editor has been updated and now includes dedicated controls for panning up and down. Additionally, clip EQ can now be altered in the inspector on the editor panel. Copy and paste functions have been updated, and now all attributes — including EQ, automation and clip gain — are copied. Sound engineers can set up their preferred workflow, including creating and applying their own presets for clip EQ. Plug-in parameters can also be customized or added so that users have fast access to their preferred tool set.

Clip levels can now be changed relatively, allowing users to adjust the overall gain while respecting existing adjustments. Clip levels can also be reset to unity, easily removing any level adjustments that might have previously been made. Fades can also be deleted directly from the Fairlight Editor, making it faster to do than before. Sound engineers can also now save their preferred track view so that they get the view they want without having to create it each time. More functions previously only available via the keyboard are now accessible using the panel, including layered editing. This also means that automation curves can now be selected via the keyboard or audio panel.

Continuing the extensive improvements to Fairlight audio, there have also been major updates to the audio editor transport controls. Track navigation is now improved and even works when nothing is selected. Users can navigate directly to the timecode entry window above the timeline from the audio editor panel, and there is added support for high-frame-rate timecodes. Timecode entry now supports values relative to the current CTI location, so the playhead can move along the timeline relative to its position rather than to a set timecode.

Support has also been added so the colon key can be used in place of the user typing 00. Master spill on console faders now lets users spill out all the tracks to a bus fader for quick adjustments in the mix. There’s also more precision with rotary controls on the panel and when using a mouse with a modifier key. Users can also change the layout and select either icon or text-only labels on the Fairlight editor. Legacy Fairlight users can now use the traditional — and perhaps more familiar — Fairlight layout. Moving around the timeline is even quicker with added support for “media left” and “media right” selection keys to jump the playhead forward and back.

This update also improves editing in Resolve. Loading and switching timelines on the edit page is now faster, with improved performance when working with a large number of audio tracks. Compound clips can now be made from in and out points so that editors can be more selective about which media they want to see directly in the edit page. There is also support for previewing timeline audio when performing live overwrites of video-only edits. Now when trimming, the duration will reflect the clip duration as users actively trim, so they can set a specific clip length. There is also a new change-transition-duration dialog.

The media pool now includes metadata support for audio files with up to 24 embedded channels. Users can also duplicate clips and timelines into the same bin using copy and paste commands. The primary DaVinci Resolve screen can now run as a window when dual-screen mode is enabled. Smart filters now let users sort media based on metadata fields, including keywords and people tags, so users can find the clips they need faster.

Quick Chat: Editing Leap Day short for Stella Artois

By Randi Altman

To celebrate February 29, otherwise known as Leap Day, beer-maker Stella Artois released a short film featuring real people who discover their time together is valuable in ways they didn’t expect. The short was conceived by VaynerMedia, directed by Division7’s Kris Belman and cut by Union partner/editor Sloane Klevin. Union also supplied Flame work on the piece.

The film begins with the words “There is a crisis sweeping the nation” set on a black screen. Then we see different women standing on the street talking about how easy it is to cancel plans. “You’re just one text away,” says one. “When it’s really cold outside and I don’t want to go out, I use my dog excuse,” says another. That’s when the viewer is told, through text on the screen, that Stella Artois has set out to right this wrong “by showing them the value of their time together.”

The scene changes from the street to a restaurant, where friends are reunited for a meal and a goblet of Stella after not seeing each other for a while. When the check comes, the confused diners have questions, and an employee explains that the menu lists prices in minutes, that Leap Day is a gift of 24 hours and that people should take advantage of it by “uncancelling” plans.

Prior to February 29, Stella encouraged people to #UnCancel plans and catch up with friends over a beer… paid for by the brand. Using the Stella Leap Day Fund — a $366,000 bank of beer reserved exclusively for those who spend time together (there are 366 days in a Leap Year) — people were able to claim as much as a 24-pack when sharing the film using #UnCancelPromo and tagging someone they would like to catch up with.

Editor Sloane Klevin

For the film short, the diners were captured with hidden cameras. Union editor Klevin, who used an Avid Media Composer 2018.12.03 with EditShare storage, was tasked with finding a story in their candid conversations. We reached out to her to find out more about the project and her process.

How early did you get involved in this project, and what kind of input did you have?
I knew I was probably getting the job about a week before they shot. I had no creative input into the shoot; that really only happens when I’m editing a feature.

What was your process like?
This was an incredibly fast turnaround. They shot on a Wednesday night, and it was finished and online the following Wednesday morning at 12am.

I thought about truncating my usual process in order to make the schedule, but when I saw their shooting breakdown for how they planned to shoot it all in one evening, I knew there wouldn’t be a ton of footage. Knowing this, I could treat the project the way I approach most unscripted longform branded content.

My assistant, Ryan Stacom, transcoded and loaded the footage into the Avid overnight, then grouped the four hidden cameras with the sound from the hidden microphones — and, brilliantly, production had time-of-day timecode on everything. The only thing that was tricky was when two tables were being filmed at once. Those takes had to be separated.

The Simon Says transcription software was used to transcribe the short pre and post interviews we had, and Ryan put markers from the transcripts on those clips so I could jump straight to a keyword or line I was searching for during the edit process. I watched all the verité footage myself and put markers on anything I thought was usable in the spot, typing into the markers what was said.

How did you choose the footage you needed?
Sometimes the people had conversations that were neither here nor there, because they had no idea they were being filmed, so I skipped that stuff. Also, I didn’t know if the transcription software would be accurate with so much background noise from the restaurant on the hidden table microphones, so markering myself seemed the best option. I used yellow markers for lines I really liked and red for stuff I thought we might want to be able to find and audition, but those weren’t necessarily my selects. That way I could open the markers tool and read through my yellow selects at a glance.

Once I’d seen everything, I did a music search of Asche & Spencer’s incredibly intuitive, searchable music library website, downloaded my favorite tracks and started editing.  Because of the fast turnaround, the agency was nice enough to send an outline for how they hoped the material might be edited. I explored their road map, which was super helpful, but went with my gut on how to deviate. They gave me two days to edit, which meant I could post for the director first and get his thoughts.

Then I spent the weekend playing with the agency and trying other options. The client saw the cut and gave notes on both days I was with the agency, then we spent Monday and Tuesday color correcting (thanks to Mike Howell at Color Collective), reworking the music track, mixing (with Chris Afzal at Wave Studios), conforming and subtitling.

That was a crazy fast turnaround.
Considering how fast the turnaround was, it went incredibly smoothly. I attribute that to the manageable amount of footage, fantastic casting that got us really great reactions from all the people they filmed, and the amount of communication my producer at Union and the agency producer had in advance.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

 

Western Digital intros WD Gold NVMe SSDs  

Western Digital has introduced its new enterprise-class WD Gold NVMe SSDs designed to help small- and medium-sized companies transition to NVMe storage. The SSDs offer power loss protection and high performance with low latency.

The WD Gold NVMe SSDs will be available in four capacities — 0.96TB, 1.92TB, 3.84TB and 7.68TB — in early Q2 of this year. The WD Gold NVMe SSD is designed to be, according to the company, “the primary storage in servers delivering significantly improved application responsiveness, higher throughput and greater scale than existing SATA devices for enterprise applications.”

These new NVMe SSDs complement the recently launched WD Gold HDDs by providing a high-performance storage tier for applications and data sets that require low latency or high throughput.

The WD Gold NVMe SSDs are designed using Western Digital’s silicon-to-system technology, from its 3D TLC NAND SSD media to its purpose-built firmware and own integrated controller. The drives give users peace of mind knowing they’re protected against power loss and that data paths are safe. Secure boot and secure erase provide users with additional data-management protections, and the devices come with an extended five-year limited warranty.

Krista Liney directs Ghost-inspired promo for ABC’s The Bachelor

Remember the Ghost-inspired promo for ABC’s The Bachelor, which first aired during the 92nd Academy Awards telecast? ABC Entertainment Marketing developed the concept and wrote the script, which features current Bachelor lead Peter Weber in a send-up of the iconic pottery scene in Ghost between Demi Moore and Patrick Swayze. It even includes the Righteous Brothers song “Unchained Melody,” which played over that scene in the film.

ABC Entertainment Marketing tapped Canyon Road Films to produce and Krista Liney to direct. Liney captured Peter taking off his shirt, sitting down at the pottery wheel and “getting messy” — a metaphor for how messy his journey to love has been. As he starts to mold the clay, he is joined by one set of hands, then another and another. As the clay collapses, Whoopi Goldberg appears to say, “Peter, you in danger, boy” – a take-off of the line she delivers to Moore’s character in the film.

This marks Liney’s first shoot as a newly signed director coming on board at Canyon Road Films, a Los Angeles-based creative production company that specializes in television promos and entertainment content.

Liney has a perspective from the side of the client and the production house, having previously served as a marketing executive on the network side. “With promos, I aim to create pieces that will cut through the clutter and command attention,” she explains. “For me, it’s all about how I can best build the anticipation and excitement within the viewer.”

The piece was shot on an ARRI Alexa Mini with Primes and Optimo lenses. ABC finished the spot in-house.

Other credits include EP Lara Wickes and DP Eric Schmidt.

Sonnet intros USB to 5GbE adapter for Mac, Windows and Linux

Sonnet Technologies has introduced the Solo5G USB 3 to 5Gb Ethernet (5GbE) adapter. Featuring NBASE-T (multigigabit) Ethernet technology, the Sonnet Solo5G adapter adds 5GbE and 2.5GbE network connectivity to an array of computers, allowing for superfast data transfers over the existing Ethernet network cabling infrastructure found in most buildings today.

Measuring 1.5 inches wide by 3.25 inches deep by 0.7 inches tall, the Solo5G is a compact, fanless 5GbE adapter for Mac, Windows and Linux computers. Equipped with an RJ45 port, the adapter supports 5GbE and 2.5GbE (5GBASE-T and 2.5GBASE-T, respectively) connectivity via common Cat 5e (or better) copper cabling at distances of up to 100 meters. The adapter’s USB port connects to a USB-A, USB-C or Thunderbolt 3 port on the computer and is bus-powered for convenient, energy-efficient and portable operation.

Cat 5e and Cat 6 copper cables — representing close to 100% of the installed cable infrastructure in enterprises worldwide — were designed to carry data at only up to 1Gb per second. NBASE-T Ethernet was developed to boost the speed capability well beyond that limit. Sonnet’s Solo5G takes advantage of that technology.

When used with a multigigabit Ethernet switch or a 10Gb Ethernet switch with NBASE-T support — including models from Buffalo, Cisco, Netgear, QNAP, TrendNet and others — the Sonnet adapter delivers 250% to 425% of the speed of Gigabit Ethernet without a wiring upgrade. When connecting to a multigigabit Ethernet-compatible switch is not possible, the Solo5G also supports 1Gb/s and 100Mb/s link speeds.
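
For a rough sense of what those multipliers mean in practice, here is a minimal back-of-the-envelope sketch of transfer time at different link rates; the 500GB payload and the protocol-efficiency factor are illustrative assumptions, not Sonnet figures.

```python
def transfer_minutes(gigabytes, link_gbps, efficiency=0.85):
    """Rough transfer-time estimate: payload in GB, link rate in Gb/s,
    with an assumed protocol-efficiency factor (illustrative only)."""
    gigabits = gigabytes * 8                 # convert GB to gigabits
    return gigabits / (link_gbps * efficiency) / 60

for rate in (1, 2.5, 5):                     # Gigabit, 2.5GbE and 5GbE links
    print(f"{rate} Gb/s link: ~{transfer_minutes(500, rate):.0f} min for 500 GB")
```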

Sonnet’s Solo5G includes 0.5-meter USB-C to USB-C and USB-C to USB-A cables for connecting the adapter to the computer, saving users the expense of buying a second cable.

The Solo5G USB-C to 5 Gigabit Ethernet adapter is available now for $79.99.

The Den editorial boutique launches in Los Angeles

Christjan Jordan, editor of award-winning work for clients including Amazon, GEICO and Hulu, has partnered with industry veteran Mary Ellen Duggan to launch The Den, an independent boutique editorial house in Los Angeles.

Over the course of his career, Jordan has worked with Arcade Edit, Cosmo Street and Rock Paper Scissors, among others. He has edited such spots as Alexa Loses Her Voice for Amazon, Longest Goal Celebration Ever for GEICO, #notspecialneeds for World Down Syndrome Day out of Publicis NY and Super Bowl 2020 ads Tom Brady’s Big Announcement for Hulu and Famous Visitors for Walmart. Jordan’s work has been recognized by the Cannes Lions, AICE, AICP, Clio, D&AD, One Show and Sports Emmy awards.

“Yes, with Mary Ellen, agency producers are guided by an industry veteran who knows exactly what agencies and clients are looking for,” says Jordan. “And for me, I love fostering young editors. It’s an interesting time in our industry and there is a lot of fresh creative talent.”

In her career, Duggan has headed production departments at both KPB and Cliff Freeman on the East Coast and, most recently, Big Family Table in Los Angeles. In addition, she has freelanced all over the country.

“The stars aligned for Christjan and I to work together,” says Duggan. “We had known each other for years and had recently worked on a Hulu campaign together. We had a similar vision for what we thought the editorial experience should be. A high end boutique editorial that is nimble, has a roster of diverse talent, and a real family vibe.”

Veteran producer Rachel Seitel has joined as partner and head of business development. The Den will be represented by Diane Patrone at The Family on the East Coast and by Ezra Burke and Shane Harris on the West Coast.

The Den’s founding roster also features editor Andrew Ratzlaff and junior editor Hannelore Gomes. The staff works on Avid Media Composer and Adobe Premiere.

LVLY adds veteran editor Bill Cramer

Bill Cramer, an editor known for his comedy and dialogue work, among other genres, has joined the editorial roster at LVLY, a content creation and creative studio based in New York City.

Cramer joins from Northern Lights. Prior to that he had spent many years at Crew Cuts, where he launched his career and built a strong reputation for his work on many ads and campaigns. Clients included ESPN, GMC, LG, Nickelodeon, Hasbro, MLB, Wendy’s and American Express. Check out his reel.

Cramer reports that he wasn’t looking to make a move but that LVLY’s VP/MD, Wendy Brovetto, inspired him. “Wendy and I knew of each other for years, and I’d been following LVLY since they did their top-to-bottom rebranding. I knew they were doing everything from live-action production to podcasting, VFX, design, VR and experiential, and I recognized that joining them would give me more opportunities to flex as an editor. Being at LVLY gives me the chance to take on any project, whether that’s a 30-second commercial, music video or long-form branded content piece; they’re set up to tackle any post production needs, no matter the scale.”

“Bill’s a great comedy/dialogue editor, and that’s something our clients have been looking for,” says Brovetto. “Once I saw the range of his work, it was an easy decision to invite him to join the LVLY team. In addition to being a great editor, he’s a funny guy, and who doesn’t need more humor in their day?”

Cramer, who works on both Avid Media Composer and Adobe Premiere, joins an editorial roster that includes Olivier Wicki, J.P. Damboragian, Geordie Anderson, Noelle Webb, Joe Siegel and Aaron & Bryan.

Senior colorist Tony D’Amore joins Picture Shop

Burbank’s Picture Shop has beefed up its staff with senior colorist Tony D’Amore, who will also serve as a director of creative workflow. In that role, he will oversee a team focusing on color prep and workflow efficiency.

Originally from rural Illinois, D’Amore made the leap to the West Coast to pursue an education, studying film and television at UCLA. He started his career in color in the early ‘90s, gaining valuable experience in the world of post. He has been working closely with color and post workflow since.

While D’Amore has experience working on Autodesk Lustre and FilmLight Baselight, he primarily grades in Blackmagic DaVinci Resolve. D’Amore has contributed color to several Emmy Award-winning shows nominated in the category of “Outstanding Cinematography.”

D’Amore has developed new and efficient workflows for Dolby Vision HDR and HDR10, coloring hundreds of hours of episodic programming for networks including CBS, ABC and Fox, as well as cable and streaming platforms such as HBO, Starz, Netflix, Hulu and Amazon.

D’Amore’s most notable project to date is having colored a Marvel series simultaneously for IMAX and ABC delivery. His list of color credits includes Barry (HBO), Looking for Alaska (Hulu), Legion (FX), Carnival Row (Amazon), Power (Starz), Fargo (FX), Elementary (CBS), Hanna (Amazon) and a variety of Marvel series, including Jessica Jones, Daredevil, The Defenders, Luke Cage and Iron Fist, all of which are available on streaming platforms.

Behind the Title: Dell Blue lead editor Jason Uson

This veteran editor started his career at LA’s Rock Paper Scissors, where he spent four years learning the craft from editors such as Bee Ottinger and Angus Wall. After freelancing at Lost Planet, Spot Welders and Nomad, he held staff positions at Cosmo Street, Harpo Films and Beast Editorial before opening Foundation Editorial, his own post boutique in Austin.

NAME: Jason Uson

COMPANY: Austin, Texas-based Dell Blue

Can you describe what Dell Blue does?
Dell Blue is the in-house agency for Dell Technologies.

What’s your job title?
Senior Lead Creative Editor

What does that entail?
Outside of the projects that I am editing personally, there are multiple campaigns happening simultaneously at all times. I oversee all of them and have my eyes on every edit, fostering and mentoring our junior editors and producers to help them grow in their careers.

I’ve helped establish and maintain the process regarding our workflow and post pipeline. I also work closely with our entire team of creatives, producers, project managers and vendors from the beginning of each project and follow it through from production to post. This enables us to execute the best possible workflow and outcome for every project.

To add another layer to my role, I am also directing spots for Dell when the project is right.

Alienware

That’s a lot! What else would surprise people about what falls under that title?
The number of hours that go into making sure the job gets done and is the best it can be. Editing is a process that takes time. Creating something of value that means something is an art no matter how big or small the job might be. You have to have pride in every aspect of the process. It shows when you don’t.

What’s your favorite part of the job?
I have two favorites. The first is the people. I know that sounds cliché, but it’s true. The team here at Dell is truly something special. We are family. We work together. Play together. Happy Hour together. Respect, support and genuinely care for one another. But, ultimately, we care about the work. We are all aligned to create the best work possible. I am grateful to be surrounded by such a talented and amazing group of humans.

The second, which is equally important to me, is the process of organizing my project, watching all the footage and pulling selects. I make sure I have what I need and check it off my list. Music, sound effects, VO track, graphics and anything else I need to get started. Then I create my first timeline. A blank, empty timeline. Then I take a deep breath and say to myself, “Here we go.” That’s my favorite.

Do you have a least favorite?
My least favorite part is wrapping a project. I spend so much time with my clients and creatives and we really bond while working on a project together. We end on such a high note of excitement and pride in what we’ve done and then, just like that, it’s over. I realize that sounds a bit dramatic. Not to worry, though, because lucky for me, we all come back together in a few months to work on something new and the excitement starts all over again.

What is your most productive time of day?
This also requires a two-part answer. The first is early morning. This is my time to get things done, uninterrupted. I go upstairs and make a fresh cup of coffee. I open my deck doors. I check and send emails, and get my personal stuff done. This clears out all of my distractions for the day before I jump into my edit bay.

The second part is late at night. I get to replay all of the creative decisions from the day and explore other options. Sometimes, I get lucky and find something I didn’t see before.

If you didn’t have this job, what would you be doing instead?
That’s easy. I’d be a chef. I love to cook and experiment with ingredients. And I love to explore and create an amazing dining experience.

I see similarities between editors and chefs. Both aim to create something impactful that elicits an emotional response from the “elements” they are given. For chefs, the ingredients, spices and techniques are creatively brought together to bring a dish to life.

For editors, the “elements” that I am given, in combination with the use of my style, techniques, sound design, graphics and music etc. all give life to a spot.

How early did you know this would be your path?
I had originally moved to Los Angeles with dreams of becoming an actor. Yes, it’s groundbreaking, I know. During that time, I met editor Dana Glauberman (The Mandalorian, Juno, Up in the Air, Thank You for Smoking, Creed II, Ghostbusters: Afterlife). I had lunch with her at the studios one day in Burbank and went on a tour of the backlot. I got to see all the edit bays, film stages, soundstages and machine rooms. To me, this was magic. A total game-changer in an instant.

While I was waiting on that one big role, I got my foot in the door as a PA at editing house Rock Paper Scissors. One night after work, we all went for drinks at a local bar, and every commercial on TV was one that (editors) Angus Wall and Adam Pertofsky had worked on within the last month. I was blown away. Something clicked.

This entire creative world behind the scenes was captivating to me. I made the decision at that moment to lean in and go for it. I asked the assistant editor the following morning if he would teach me — and I haven’t looked back. So, Dana, Angus and Adam… thank you!

Can you name some of your recent projects?
I edited the latest global campaign for Alienware called Everything Counts, which was directed by Tony Kaye. More recently, I worked on the campaign for Dell’s latest and greatest business PC laptop that launches in March 2020, which was directed by Mac Premo.

Dell business PC

Side note: I highly recommend Googling Mac Premo. His work is amazing.

What project are you most proud of?
There are two projects that stand out for me. The first one is the very first spot I ever cut — a Budweiser ad for director Sam Ketay and the Art Institute of Pasadena. During the edit, I thought, “Wow, I think I can do this.” It went on to win a Clio.

The second is the latest global campaign for Alienware, which I mentioned above. Director Tony Kaye is a genius. Tony and I sat in my edit bay for a week exploring and experimenting. His process is unlike any other director I have worked with. This project was extremely challenging on many levels. I honestly started looking at footage in a very different way. I evolved. I learned. And I strive to continue to grow every day.

Name three pieces of technology you can’t live without.
Wow, good question. I guess I’ll be that guy and say my phone. It really is a necessity.

Spotify, for sure. I am always listening to music in my car and trying to match artists with projects that are not even in existence yet.

My Bose noise cancelling headphones.

What social media channels do you follow?
I use Facebook and LinkedIn — mainly to stay up to date on what others are doing and to post my own updates every now and then.

I’m on Instagram quite a bit. Outside of the obvious industry-related accounts I follow, here are a few of my random favorites:

@nuts_about_birds
If you love birds as much as I do, this is a good one to follow.

@sergiosanchezart
This guy is incredible. I have been following his work for a long time. If you are looking for a new tattoo, look no further.

@andrewhagarofficial
I was lucky enough to meet Andrew through my friend @chrisprofera and immediately dove into his music. Amazing. Not to mention his dad is Sammy Hagar. Enough said.

@kaleynelson
She’s a talented photographer based in LA. Her concert stills are impressive.

@zuzubee
I love graffiti art and Zuzu is one of the best. Based in Austin, she has created several murals for me. You can see her work all over the city, as well as installations during SXSW and Austin City Limits, on Bud Light cans, and across the US.

Do you listen to music at work? What types?
I do listen to music when I work but only when I’m going through footage and pulling selects. Classical piano is my go-to. It opens my mind and helps me focus and dive into my footage.

Don’t get me wrong, I love music. But if I am jamming to my favorite, Sammy Hagar, I can’t drive…I mean dive… into my footage. So classical piano for me.

How do you de-stress from it all?
This is an understatement, but there are a few things that help me out. Sometimes during the day, I will take a walk around the block. Get a little vitamin D and fresh air. I look around at things other than my screen. This is something (editors) Tom Muldoon and John Murray at Nomad used to do every day. I always wondered why. Now I know. I come back refreshed and with my mind clear and ready for the next challenge.

I also “like” to hit the gym immediately after I leave my edit bay. Headphones on (Sammy Hagar, obviously), stretch it out and jump on the treadmill for 30 minutes.

All that is good and necessary for obvious reasons, but getting back to cooking… I love being in the kitchen. It’s therapy for me. Whether I am chopping and creating in the kitchen or out on the grill, I love it. And my wife appreciates my cooking. Well, I think she does at least.

Photo Credits: Dell PC and Jason Uson images – Chris Profera

Amazon’s The Expanse Season 4 gets HDR finish

The fourth season of the sci-fi series The Expanse, its first streaming via Amazon Prime Video, was also its first finished in HDR. Deluxe Toronto handled end-to-end post services, including online editorial, sound remixing and color grading. The series was shot on ARRI Alexa Minis.

In preparation for production, cinematographer Jeremy Benning, CSC, shot anamorphic test footage at a quarry that would serve as the filming stand-in for the season’s new alien planet, Ilus. Deluxe Toronto senior colorist Joanne Rourke then worked with Benning, VFX supervisor Bret Culp, showrunner Naren Shankar and regular series director Breck Eisner to develop looks that would convey the location’s uninviting and forlorn nature, keeping the overall look desaturated and removing color from the vegetation. Further distinguishing Ilus from other environments, production chose to display scenes on or above Ilus in a 2.39:1 aspect ratio, while those featuring Earth and Mars remained in a 16:9 format.

“Moving into HDR for Season 4 of our show was something Naren and I have wanted to do for a couple of years,” says Benning. “We did test HDR grading a couple seasons ago with Joanne at Deluxe, but it was not mandated by the broadcaster at the time, so we didn’t move forward. But Naren and I were very excited by those tests and hoped that one day we would go HDR. With Amazon as our new home [after airing on Syfy], HDR was part of their delivery spec, so those tests we had done previously had prepared us for how to think in HDR.

“Watching Season 4 come to life with such new depth, range and the dimension that HDR provides was like seeing our world with new eyes,” continues Benning. “It became even more immersive. I am very much looking forward to doing Season 5, which we are shooting now, in HDR with Joanne.”

Rourke, who has worked on every season of The Expanse, explains, “Jeremy likes to set scene looks on set so everyone becomes married to the look throughout editorial. He is fastidious about sending stills each week, and the intended directive of each scene is clear long before it reaches my suite. This was our first foray into HDR with this show, which was exciting, as it is well suited for the format. Getting that extra bit of detail in the highlights made such a huge visual impact overall. It allowed us to see the comm units, monitors, and plumes on spaceships as intended by the VFX department and accentuate the hologram games.”

After making adjustments and ensuring initial footage was even, Rourke then refined the image by lifting faces and story points and incorporating VFX. This was done with input provided by producer Lewin Webb; Benning; cinematographer Ray Dumas, CSC; Culp; or VFX supervisor Robert Crowther.

To manage the show’s high volume of VFX shots, Rourke relied on Deluxe Toronto senior online editor Motassem Younes and assistant editor James Yazbeck to keep everything in meticulous order. (For that they used the Grass Valley Rio online editing and finishing system.) The pair’s work was also essential to Deluxe Toronto re-recording mixers Steve Foster and Kirk Lynds, who have both worked on The Expanse since Season 2. Once ready, scenes were sent in HDR via Streambox to Shankar for review at Alcon Entertainment in Los Angeles.

“Much of the science behind The Expanse is quite accurate thanks to Naren, and that attention to detail makes the show a lot of fun to work on and more engaging for fans,” notes Foster. “Ilus is a bit like the wild west, so the technology of its settlers is partially reflected in communication transmissions. Their comms have a dirty quality, whereas the ship comms are cleaner-sounding and more closely emulate NASA transmissions.”

Adds Lynds, “One of my big challenges for this season was figuring out how to make Ilus seem habitable and sonically interesting without familiar sounds like rustling trees or bird and insect noises. There are also a lot of amazing VFX moments, and we wanted to make sure the sound, visuals and score always came together in a way that was balanced and hit the right emotions story-wise.”

Foster and Lynds worked side by side on the season’s 5.1 surround mix, with Foster focusing on dialogue and music and Lynds on sound effects and design elements. When each had completed his respective passes using Avid ProTools workstations, they came together for the final mix, spending time on fine strokes, ensuring the dialogue was clear, and making adjustments as VFX shots were dropped in. Final mix playbacks were streamed to Deluxe’s Hollywood facility, where Naren could hear adjustments completed in real time.

In addition to color finishing Season 4 in HDR, Rourke also remastered the three previous seasons of The Expanse in HDR, using her work on Season 4 as a guide and finishing with Blackmagic DaVinci Resolve 15. Throughout the process, she was mindful to pull out additional detail in highlights without altering the original grade.

“I felt a great responsibility to be faithful to the show for the creators and its fans,” concludes Rourke. “I was excited to revisit the episodes and could appreciate the wonderful performances and visuals all over again.”

DP Chat: Watchmen cinematographer Greg Middleton

By Randi Altman

HBO’s Watchmen takes us to new dimensions in this recent interpretation of the popular graphic novel. In this iteration, we spend a lot of our time in Tulsa, Oklahoma, getting to know Regina King’s policewoman Angela Abar, her unconventional family and a shadowy organization steeped in racism called the Seventh Kavalry. We also get a look back — beautiful in black and white — at Abar’s tragic family back story. It was created and written for TV by Lost veteran Damon Lindelof.

Greg Middleton

Greg Middleton, ASC, CSC, who also worked on Game of Thrones and The Killing, was the series cinematographer. We reached out to him to find out about his process, workflow and where he gets inspiration.

When were you brought on to Watchmen, and what type of looks did the showrunner want from the show?
I joined Watchmen after the pilot, starting with Episode 2. A lot of my early prep was devoted to discussions with the showrunner and producing directors on how to develop the look from the pilot going forward. This included some pilot reshoots due to changes in casting and the designing and building of new sets, like the police precinct.

Nicole Kassell (director of Episodes 1, 2 and 8) and series production designer Kristian Milstead and I spent a lot of time breaking down the possibilities of how we could define the various worlds through color and style.

How was the look described to you? What references were you given?
We based the evolution of the look of the show on the scripts, the needs of the structure within the various worlds and on the graphic novel, which we commonly referred to as “the Old Testament.”

As you mention, it’s based on a graphic novel. Did the look give a nod to that? If so, how? Was that part of the discussion?
We attempted to break down the elements of the graphic novel that might translate well and those that would not. It’s an interesting bit of detective work because a lot of the visual cues in the comic are actually a commentary on the style of comics at the time it was published in 1985.

Those cues, if taken literally, would not necessarily work for us, as their context would not be clear. Things like color were very referential to other comics of the time. For example, they used only secondary colors instead of the primaries that were the norm. The graphic novel is also a film noir in many ways, so we got some of our ideas based on that.

What did translate well were compositional elements — tricks of transition like match cuts and the details of story in props, costumes and sets within each frame. We used some split diopters and swing shift lenses to give us some deep focus effects for large foreground objects. In the graphic novel, of course, everything is in focus, so those types of compositions are common!

This must have been fun because of the variety of looks the series has — the black-and-white flashbacks, the stylized version of Tulsa, the look of the mansion in Wales (Europa), Vietnam in modern day. Can you talk about each of the different looks?
Yes, there were so many looks! When we began prep on the series with the second episode, we were also simultaneously beginning to film the scenes in Wales for the “blond man” scenes. We knew that that storyline would have its own particular feel because of the location and its very separateness from the rest of the world.

A more classic traditional proscenium-like framing and style seemed very appropriate. Part of that intent was designed to both confuse and to make very apparent to the audience that we were definitely in another world. Cinematographer Chris Seager, BSC, was filming those scenes as I was still doing tests for the other looks and the rest of our show in Atlanta.

We discussed lenses, camera format, etc. The three major looks we had to design that we knew would go together initially were our “Watchmen” world, the “American hero story” show within the show, and the various flashbacks to 1921 Tulsa and World War I. I was determined to make sure that the main world of the show did not feel overly processed and colorized photographically. We shot many tests and developed a LUT that was mostly film-like. The other important aspects to creating a look are, of course, art direction and production design, and I had a great partner in Kristian Milstead, the production designer who joined the show after the pilot.

This was a new series. Do you enjoy developing the look of a show versus coming on board after the look was established?
I enjoy helping to figure out how to tell the story. For series, helping develop the look photographically in the visual strategy is a big part of that. Even if some of those are established, you still do similar decision-making for shooting individual scenes. However, I much prefer being engaged from the beginning.

So even when you weren’t in Wales, you were providing direction?
As I mentioned earlier, Chris Seager and I spoke and emailed regarding lenses and those choices. It was still early for us in Atlanta, but there were preliminary decisions to be made on how the “blond man” (our code name for Jeremy Irons) world would be compared to our Watchmen world. What I did was consult with my director, Nicole Kassell, on her storyboards for her sequences in Wales.

Were there any scenes or looks that stood out as more challenging than others? Can you describe?
Episode 106 was a huge challenge. We have a lot of long takes involving complex camera moves and dimmer cues as a camera would circle or travel between rooms. Also, we developed the black-and-white look to feel like older black-and-white film.

One scene in June’s apartment involved using the camera on a small Scorpio 10-foot crane and a mini Libra head to accomplish a slow move around the room. Then we had to push between the two actors toward the wall as an in-camera cue of a projected image of the black-and-white movie Trust in the Law reveals itself with a manual iris.

This kind of shot ends up being a dance with at least six people, not including the cast. The entire “nostalgia” part of the episode was done this way. And none of this would have been possible without an incredible cast able to hit these long takes and choreograph themselves with the camera. Jovan Adepo and Danielle Deadwyler were incredible throughout the episode.

I assume you did camera tests. Why did you choose the ARRI Alexa? Why was it right for this? What about lenses, etc.?
I have been working with the Alexa for many years now, so I was aware of what I could do with the camera. I tested a couple of others, but in the end the Alexa Mini was the right choice for us. I also needed a camera that was small so I could go on and off of a gimbal or fit into small places.

How did you work with the colorist? Who was that on this show? Were you in the suite with them?
Todd Bochner was our final colorist at Sim in LA. I shot several camera tests and worked with him in the suite to help develop viewing LUTs for the various worlds of the show. We did the same procedure for the black and white. In the end, we mimicked some techniques similar to black-and-white film (like red filters), except for us, it was adjusting the channels accordingly.
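
As a rough illustration of the kind of channel adjustment Middleton describes, here is a minimal sketch of a red-weighted channel mix, the digital analog of shooting black-and-white film through a red filter; the weights are hypothetical, not the show’s actual LUT.

```python
import numpy as np

def red_filter_bw(rgb, weights=(0.8, 0.15, 0.05)):
    """Collapse an RGB frame (float, 0..1, H x W x 3) to monochrome with a
    red-heavy channel mix, roughly emulating a red filter on black-and-white
    film: skies darken, skin tones lift. Weights are hypothetical."""
    w = np.asarray(weights, dtype=np.float32)
    w /= w.sum()                     # keep overall exposure roughly constant
    mono = rgb.astype(np.float32) @ w
    return np.clip(mono, 0.0, 1.0)
```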

Do you know what they used on the color?
Yes, it was Blackmagic DaVinci Resolve 16.

How did you get interested in cinematography?
I was always making films as a kid, then in school and then in university. In film school, when we split apart the various jobs, I seemed to have some aptitude for cinematography, so after school I decided to try making it my focus. I came to it more out of a love of storytelling and filmmaking and less about photography.

Greg Middleton

What inspires you? Other films?
Films that move me emotionally.

What’s next for you?
A short break! I’ve been very fortunate to have been working a lot lately. A film I shot just before Watchmen called American Woman, directed by Semi Chellas, should be coming out this year.

And what haven’t I asked that’s important?
I think the question all filmmakers should ask themselves is, “Why am I telling this story, and what is unique about the way in which I’m telling it?”


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Sohonet intros ClearView Pivot for 4K remote post

Sohonet is now offering ClearView Pivot, a solution for realtime remote editing, color grading, live screening and finishing reviews at full cinema quality. The new solution will provide connectivity and collaboration services for productions around the world.

ClearView Pivot offers 4K HDR with 12-bit color depth and 4:4:4 chroma sampling for full-color-quality video streaming with ultra-low latency over Sohonet’s private media network, which avoids the extreme compression required by the contention and latency of public internet connections.
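
To see why full-quality 4K review needs either a private network or heavy compression, a quick back-of-the-envelope calculation of the uncompressed data rate helps; the frame size and frame rate below are illustrative assumptions, not Sohonet’s published figures.

```python
# Back-of-the-envelope data rate for uncompressed 4K, 12-bit, 4:4:4 video.
# Frame size and frame rate are illustrative assumptions, not Sohonet figures.
width, height = 4096, 2160
bits_per_pixel = 12 * 3                      # 12 bits per channel, three full channels
fps = 24

gbps = width * height * bits_per_pixel * fps / 1e9
print(f"~{gbps:.1f} Gb/s uncompressed")      # roughly 7.6 Gb/s before any coding
```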

“Studios around the world need a realtime 4K collaboration tool that can process video at lossless color fidelity using the industry-standard JPEG 2000 codec between two locations across a network like ours. Avoiding the headache of the current ‘equipment only’ approach is the only scalable solution,” explains Sohonet CEO Chuck Parker.

Sohonet says its integrated solution is approved by ISE (Independent Security Evaluators) — the industry’s gold standard for security. Sohonet’s solution provides an encrypted stream between each endpoint and provides an auditable usage trail for every session. The Soho Media Network (SMN) connection offers ultra-low latency (measured in milliseconds), and the company says that unlike equipment-only solutions that require the user to navigate firewall and security issues and perform a “solution check” before each session, ClearView Pivot works immediately. As a point-to-multipoint solution, the user can also pivot easily from one endpoint to the next to collaborate with multiple people at the click of a button or even to stream to multiple destinations at the same time.

Sohonet has been working closely with productions on lots and on locations over the past few years in the ongoing development of ClearView Pivot. In those real-world settings, ClearView Pivot has been put through its paces with trials across multiple departments, and the color technologies have been fully inspected and approved by experts across the industry.

MPI restores The Wizard of Oz in 4K HDR

By Barry Goch

The classic Victor Fleming-directed film The Wizard of Oz, which was released by MGM in 1939 and won two of its six Academy Award nominations, has been beautifully restored by Burbank’s Warner Bros. Motion Picture Imaging (MPI).

Bob Bailey

To share its workflow on the film, MPI invited a group of journalists to learn about the 4K UHD HDR restoration of this classic film. The tour guide for our high-tech restoration journey was MPI’s VP of operations and sales Bob Bailey, who walked us through the entire restoration process — from the original camera negative to final color.

The Wizard of Oz, which starred Judy Garland, was shot on a Technicolor three-strip camera system. According to Bailey, it ran three black and white negatives simultaneously. “That is why it is known as three-strip Technicolor. The magazine on top of the camera was triple the width of a normal black and white camera because it contained each roll of negative to capture your red, green and blue records,” explained Bailey.

“When shooting in Technicolor, you weren’t just getting the camera. You would rent a package that included the camera, a camera crew with three assistants, the film, the processing and a Technicolor color consultant.”

George Feltenstein, SVP of theatrical catalog marketing for Warner Bros. Home Entertainment, spoke about why the film was chosen for restoration. “The Wizard of Oz is among the crown jewels that we hold,” he said. “We wanted to embrace the new 4K HDR technology, but nobody’s ever released a film that old using this technology. HDR, or high dynamic range, has a color range that is wider than anything that’s come before it. There are colors [in The Wizard of Oz] that were never reproducible before, so what better a film to represent that color?”

Feltenstein went on to explain that this is the oldest film to get released in the 4K format. He hopes that this is just the beginning and that many of the films in Warner Bros.’ classic library will also be released on 4K HDR and worked on at MPI under Bailey’s direction.

The Process
MPI scanned each of the three-strip Technicolor nitrate film negatives at 8K 16-bit, composited them together and then applied a new color grade. The film was rescanned with the Lasergraphics Director 10K scanner. “We have just under 15 petabytes of storage here,” said Bailey. “That’s working storage, because we’re working on 8K movies since [some places in the world] are now broadcasting 8K.”
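
Conceptually, the compositing step recombines the three scanned black-and-white records into a single color frame. Here is a minimal sketch, assuming the 16-bit scans are already registered and ignoring the alignment and damage repair a real restoration requires; the file paths and library choice are illustrative.

```python
import numpy as np
import imageio.v3 as iio   # assumed available; any 16-bit-capable image reader works

def composite_three_strip(red_path, green_path, blue_path):
    """Stack three scanned black-and-white records into one RGB frame.
    Assumes the 16-bit scans are already registered; a real restoration
    adds per-record alignment, shrinkage correction and damage repair."""
    r = iio.imread(red_path).astype(np.float32) / 65535.0
    g = iio.imread(green_path).astype(np.float32) / 65535.0
    b = iio.imread(blue_path).astype(np.float32) / 65535.0
    return np.stack([r, g, b], axis=-1)      # H x W x 3 color image
```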

Steven Anastasi

Our first stop was to look at the Lasergraphics Director. We then moved on to MPI’s climate-controlled vault, where we were introduced to Steven Anastasi, VP of technical operations at Warner Bros. Anastasi explained that the original negative vault has climate-controlled conditions with 25% humidity at 35 degrees Fahrenheit, which is the combination required for keeping these precious assets safe for future generations. He said there are 2 million assets in the building, including picture and sound.

It was amazing to see film reels for 2001: A Space Odyssey sitting on a shelf right in front of me. In addition to the feature reels, MPI also stores millions of negatives captured throughout the years by Warner productions. “We also have a very large library,” reported Anastasi. “So the original negatives from the set, a lot of unit photography, head shots in some cases and so forth. There are 10 million of these.”

Finally, we were led into the color bay to view the film. Janet Wilson, senior digital colorist at MPI, has overseen every remaster of The Wizard of Oz for the past 20 years. Wilson used a FilmLight Baselight X system for the color grade. The grading suite housed multiple screens: a Dolby Pulsar for the Dolby Vision pass, a Sony X300 and a Panasonic EZ1000 OLED 4K HDR.

“We have every 4K monitor manufactured, and we run the film through all of them,” said Bailey. “We painstakingly go through the process from a post perspective to make sure that our consumers get the best quality product that’s available out in the marketplace.”

“We want the consumer experience on all monitors to be something that’s taken into account,” added Feltenstein. “So we’ve changed our workflow by having a consumer or prosumer monitor in these color correction suites so the colorist has an idea of what people are going to see at home, and that’s helped us make a better product.”

Our first view of the feature was a side-by-side comparison of the black and white scanned negative and the sepia color corrected footage. The first part of the film, which takes place in Kansas, was shot in black and white, and then a sepia look was applied to it. The reveal scene, when Dorothy passes through the door going into Oz, was originally shot in color. For this new release, the team generated a matte so Wilson could add this sepia area to the inside of the house as Dorothy transitioned into Oz.

“So this is an example of some of the stuff that we could do in this version of the restoration,” explained Wilson. “With this version, you can see that the part of the image where she’s supposed to be in the monochrome house is not actually black and white. It was really a color image. So the trick was always to get the interior of the house to look sepia and the exterior to look like all of the colors that it’s supposed to. Our visual effects team here at MPI — Mike Moser and Richie Hiltzik — was able to draw a matte for me so that I could color inside of the house independently of the exterior and make them look right, which was always a really tricky thing to do.”
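
The matte-based trick Wilson describes can be sketched simply: a sepia wash is applied only where the matte is on, leaving the color exterior untouched. The tint values below are illustrative, not MPI’s actual grade.

```python
import numpy as np

def sepia_inside_matte(rgb, matte):
    """Apply a sepia wash only where the matte is on (inside the house),
    leaving the color exterior untouched. `rgb` is float 0..1, H x W x 3;
    `matte` is a 0..1 alpha image. The tint values are illustrative."""
    luma = rgb @ np.array([0.299, 0.587, 0.114], dtype=np.float32)
    sepia = np.clip(np.stack([luma * 1.07, luma * 0.74, luma * 0.43], axis=-1), 0.0, 1.0)
    a = matte[..., None]                     # broadcast the matte over color channels
    return a * sepia + (1.0 - a) * rgb
```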

Wilson referred back to the Technicolor three-strip, explaining that because you’ve got three different pieces of film — the different records — they’re receiving the light in different ways. “So sometimes one will be a little brighter than the other. One will be a little darker than the other, which means that the Technicolor is not a consistent color. It goes a little red, and then it goes a little green, and then it goes a little blue, and then it goes a little red again. So if you stop on any given frame, it’s going to look a little different than the frames around it, which is one of the tricky parts of color correcting Technicolor. When that’s being projected by a film projector, it’s less noticeable than when you’re looking at it on a video monitor, so it takes a lot of little individual corrections to smooth those kinds of things out.”

Wilson reported seeing new things with the 8K scan and 4K display. “The amount of detail that went into this film really shows up.” She said that one of the most remarkable things about the restoration was the amazing detail visible on the characters. For the first time in many generations, maybe ever, you can actually see the detail of the freckles on Dorothy’s face.

In terms of leveraging the expanded dynamic range of HDR, I asked Wilson whether she tried to map the HDR into a sweet spot, so that it’s spectacular yet not overpowering.

“I ended up isolating the very brightest parts of the picture,” she replied. “In this case, it’s mostly the sparkles on their shoes and curving those off so I could run those in, because this movie is not supposed to have modern-day animation levels of brightness. It’s supposed to be much more contained. I wanted to take advantage of brightness and the ability to show the contrast we get from this format, because you can really see the darker parts of the picture. You can really see detail within the Wicked Witch’s dress. I don’t want it to look like it’s not the same film. I want it to replicate that experience of the way this film should look if it was projected on a good print on a good projector.”
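
The “curving off” Wilson describes amounts to a highlight roll-off: values above a knee are compressed toward peak brightness rather than allowed to spike. A minimal sketch, with illustrative numbers only:

```python
import math

def highlight_rolloff(v, knee=0.8, peak=1.0):
    """Soft-clip values above `knee` so the brightest elements (shoe sparkles,
    specular hits) ease toward `peak` instead of spiking to full HDR brightness.
    The knee and peak values are illustrative, not the actual grade."""
    if v <= knee:
        return v
    headroom = peak - knee
    return knee + headroom * (1.0 - math.exp(-(v - knee) / headroom))
```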

Dorothy’s ruby slippers also presented a challenge to Wilson. “They are so red and so bright. They’re so light-reflective, but there were times when they were just a little too distracting. So I had to isolate the slippers and bring them down a little bit so that they weren’t the first and only thing you saw in the image.”

If you are wondering if audio was part of this most recent restoration, the answer is no, but it had been remastered for a previous version. “As early as 1929, MGM began recording its film music using multiple microphones. Those microphone angles allowed the mixer to get the most balanced monophonic mix, and they were preserved,” explained Feltenstein. “Twenty years ago, we created a 5.1 surround mix that was organically made from the original elements that were created in 1939. It is full-frequency, lossless audio, and a beautiful restoration job was made to create that track so you can improve upon what I consider to be close to perfection without anything that would be disingenuous to the production.”

In all, it was an amazing experience to go behind the scenes and see how the wizards of MPI created a new version of this masterpiece for today and preserved it for future generations.

This restored version of The Wizard of Oz is a must-see visual extravaganza, and there is no better way to see it than in UHD, HDR, Dolby Vision or HDR10+. What I saw in person took my breath away, and I hope every movie fan out there can have the opportunity to see this classic film in its never-before-seen glory.

The 4K version of The Wizard of Oz is currently available via an Ultra HD Blu-ray Combo Pack and digital.


Barry Goch is a finishing artist at LA’s The Foundation as well as a UCLA Extension instructor in post production. You can follow him on Twitter at @Gochya.

Behind the Title: Harbor sound editor/mixer Tony Volante

“As re-recording mixer, I take all the final edited elements and blend them together to create the final soundscape.”

Name: Tony Volante

Company: Harbor

Can you describe what Harbor does?
Harbor was founded in 2012 to serve the feature film, episodic and advertising industries. Harbor brings together production and post production under one roof — what we like to call “a unified process allowing for total creative control.”

Since then, Harbor has grown into a global company with locations in New York, Los Angeles and London. Harbor hones every detail throughout the moving-image-making process: live-action, dailies, creative and offline editorial, design, animation, visual effects, CG, sound and picture finishing.

What’s your job title?
Supervising Sound Editor/Re-Recording Mixer

What does that entail?
I supervise the sound editorial crew for motion pictures and TV series along with being the re-recording mixer on many of my projects. I put together the appropriate crew and schedule along with helping to finalize a budget through the bidding process. As re-recording mixer, I take all the final edited elements and blend them together to create the final soundscape.

What would surprise people the most about what falls under that title?
How almost all the sound that someone hears in a movie has been replaced by a sound editor.

What’s your favorite part of the job?
Creatively collaborating with co-workers and hearing it all come together in the final mix.

What is your most productive time of day?
Whenever I can turn off my emails and can concentrate on mixing.

If you didn’t have this job, what would you be doing instead?
Fishing!

When did you know this would be your path?
I played drums in a rock band and got interested in sound at around 18 years old. I was always interested in the “sound” of an album along with the musicality. I found myself buying records based on who had produced and engineered them.

Can you name some recent projects?
Fosse/Verdon (FX) and Boys State, which just won the Grand Jury Prize at Sundance.

How has the industry changed since you began working?
Technology has improved workflows immensely and has helped us with the creative process. It has also opened up the door to accelerating schedules to the point of sacrificing artistic expression and detail.

Name three pieces of technology you can't live without.
Avid Pro Tools, my iPhone and my car’s navigation system.

How do you de-stress from it all?
I stand in the middle of a flowing stream fishing with my fly rod. If I catch something, that’s a bonus!

Alkemy X adds all-female design collective Mighty Oak

Alkemy X has added animation and design collective Mighty Oak to its roster for US commercial representation. Mighty Oak has used its expertise in handmade animation techniques and design combined with live action for brands and networks, including General Electric, Netflix, Luna Bar, HBO, Samsung, NBC, Airbnb, Conde Nast, Adult Swim and The New York Times.

Led by CEO/EP Jess Peterson, head of creative talent Emily Collins and CD Michaela Olsen, the collective has garnered over 3 billion online views. Mighty Oak’s first original short film, Under Covers, premiered at the 2019 Sundance Film Festival. Helmed by Olsen, the quirky stop-motion short features handmade puppets and forced-perspective sets to glimpse into the unsuspecting lives and secrets that rest below the surface of a small town.

“I was immediately struck by the extreme care that Mighty Oak takes on each and every frame of their work,” notes Alkemy X EP Eve Ehrich. “Their handmade style and fresh approach really make for dynamic, memorable animation, regardless of the concept.”

Mighty Oak’s Peterson adds, “We are passionate about collaborating with our clients from the earliest stages, working together to craft original character designs and creating work that is memorable and fun.”

Post house DigitalFilm Tree names Nancy Jundi COO

DigitalFilm Tree (DFT) has named Nancy Jundi as chief operating officer. She brings a wealth of experience to her new role, after more than 20 years working with entertainment and technology companies.

Jundi has been an outside consultant to DFT since 2014 and will now be based in DFT’s Los Angeles headquarters, where she joins founder and CEO Ramy Katrib in pioneering new offerings for DFT.

Jundi began her career in investment banking and asset protection before segueing into the entertainment industry. Her experience includes leading sales and marketing at Runway during its acquisition by The Post Group (TPG). She then joined that team as director of marketing and communications to unify the end-to-end post facilities into TPG’s singular brand narrative. She later co-founded Mode HQ (acquired by Pacific Post) before transitioning into technology and SaaS companies.

Since 2012, Jundi has served as a consultant to companies in industries as varied as financial technology, healthcare, eCommerce, and entertainment, with brands such as LAbite, Traffic Zoom and GoBoon. Most recently, she served as interim senior management for CareerArc.

“Nancy is simply one of the smartest and most creative thinkers I know,” says Katrib. “She is that rare interdisciplinary talent: the creative and technological thinker who can exist in both arenas with clarity and tenacity.”

Marriage Story director Noah Baumbach

By Iain Blair

Writer/director Noah Baumbach first made a name for himself with The Squid and the Whale, his 2005 semi-autobiographical, bittersweet story about his childhood and his parents’ divorce. It launched his career, scoring him an Oscar nomination for Best Original Screenplay.

Noah Baumbach

His latest film, Marriage Story, is also about the disintegration of a marriage — and the ugly mechanics of divorce. Detailed and emotionally complex, the film stars Scarlett Johansson and Adam Driver as the doomed couple.

In all, Marriage Story scooped up six Oscar nominations — Best Picture, Best Actress, Best Actor, Best Supporting Actress, Best Original Screenplay and Best Original Score. Laura Dern walked away with a statue for her supporting role.

The film co-stars Dern, Alan Alda and Ray Liotta. The behind-the-scenes team includes director of photography Robbie Ryan, editor Jennifer Lame and composer Randy Newman.

Just a few days before the Oscars, Baumbach — whose credits also include The Meyerowitz Stories, Frances Ha and Margot at the Wedding — talked to me about making the film and his workflow.

What sort of film did you set out to make?
It’s obviously about a marriage and divorce, but I never really think about a project in specific terms, like a genre or a tone. In the past, I may have started a project thinking it was a comedy but then it morphs into something else. With this, I just tried to tell the story as I initially conceived it, and then as I discovered it along the way. While I didn’t think about tone in any general sense, I became aware as I worked on it that it had all these different tones and genre elements. It had this flexibility, and I just stayed open to all those and followed them.

I heard that you were discussing this with Adam Driver and Scarlett Johansson as you wrote the script. Is that true?
Yes, but it wasn’t daily. I’d reached out to both of them before I began writing it, and luckily they were both enthusiastic and wanted to do it, so I had them as an inspiration and guide as I wrote. Periodically, we’d get together and discuss it and I’d show them some pages to keep them in the loop. They were very generous with conversations about their own lives, their characters. My hope was that when I gave them the finished script it would feel both new and familiar.

What did they bring to the roles?
They were so prepared and helped push for the truth in every scene. Their involvement from the very start did influence how I wrote their roles. Nicole has that long monologue and I don’t know if I’d have written it without Scarlett’s input and knowing it was her. Adam singing “Being Alive” came out of some conversations with him. They’re very specific elements that come from knowing them as people.

You reunited with Irish DP Robbie Ryan, who shot The Meyerowitz Stories. Talk about how you collaborated on the look and why you shot on film?
I grew up with film and feel it’s just the right medium for me. We shot The Meyerowitz Stories on Super 16, and we shot this on 35mm, and we had to deal with all these office spaces and white rooms, so we knew there’d be all these variations on white. So there was a lot of discussion about shades and the palette, along with the production and costume designers, and also how we were going to shoot these confined spaces, because it was what the story required.

You shot on location in New York and LA. How tough was the shoot?
It was challenging, but mainly because of the sheer length of many of the scenes. There’s a lot of choreography in them, and some are quite emotional, so everyone had to really be up for the day, every day. There was no taking it easy one day. Every day felt important for the movie.

Where did you do the post?
All in New York. I have an office in the Village where I cut my last two films, and we edited there again. We mixed on the Warner stage, where I’ve mixed most of my movies. We recorded the music and orchestra in LA.

Do you like the post process?
I really love it. It’s the most fun and the most civilized part of the whole process. You go to work and work on the film all day, have dinner and go home. Writing is always a big challenge, as you’re making it up as you go along, and it can be quite agonizing. Shooting can be fun, but it’s also very stressful trying to get everything you need. I love working with the actors and crew, but you need a high level of energy and endurance to get through it. So then post is where you can finally relax, and while problems and challenges always arise, you can take time to solve them. I love editing, the whole rhythm of it, the logic of it.


Talk about editing with Jennifer Lame. How did that work?
We work so well together, and our process really starts in the script stage. I’ll give her an early draft to get her feedback and, basically, we start editing the script. We’ll go through it and take out anything we know we’re not going to use. Then during the shoot she’ll sometimes come to the set, and we’ll also talk twice a day. We’ll discuss the day’s work before I start, and then at lunch we’ll go over the previous day’s dailies. So by the time we sit down to edit, we’re really in sync about the whole movie. I don’t work off an assembly, so she’ll put together stuff for herself to let me know a scene is working the way we designed it. If there’s a problem, she’ll let me know what we need.

What were the big editing challenges?
Besides the general challenges of getting a scene right, I think for some of the longer ones it was all about finding the right rhythm and pacing. And it was particularly true of this film that the pace of something early on could really affect something later. Then you have to fix the earlier bit first, and sometimes it’s the scene right before. For instance, the scene where Charlie and Nicole have a big argument that turns into a very emotional fight is really informed by the courtroom scene right before it. So we couldn’t get it right until we’d got the courtroom scene right.

A lot of directors do test screenings. Do you?
No, I have people I show it to and get feedback, but I’ve never felt the need for testing.

VFX play a role. What was involved?
The Artery did them. For instance, when Adam cuts his arm we used VFX in addition to the practical effects, and then there’s always cleanup.

Talk about the importance of sound to you as a filmmaker, as it often gets overlooked in this kind of film.
I’m glad you said that because that’s so true, and this doesn’t have obvious sound effects. But the sound design is quite intricate, and Chris Scarabosio (working out of Skywalker Sound), who did Star Wars, did the sound design and mix; he was terrific.

A lot of it was taking the real-world environments in New York and LA and building on that, and maybe taking some sounds out and playing around with all the elements. We spent a lot of time on it, as both the sound and image should be unnoticed in this. If you start thinking, “That’s a cool shot or sound effect,” it takes you out of the movie. Both have to be emotionally correct at all times.

Where did you do the DI and how important is it to you?
We did it at New York’s Harbor Post with colorist Marcy Robinson, who’s done several of my films. It’s very important, but we didn’t do anything too extreme, as there’s not a lot of leeway for changing the look that much. I’m very happy with the look and the way it all turned out.

Congratulations on all the Oscar noms. How important is that for a film like this?
It’s a great honor. We’re all still the kids who grew up watching movies and the Oscars, so it’s a very cool thing. I’m thrilled.

What’s next?
I don’t know. I just started writing, but nothing specific yet.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Bill Baggelaar promoted at Sony Pictures, Sony Innovation Studios

Post industry veteran Bill Baggelaar has been promoted to executive VP and CTO, technology development at Sony Pictures and executive VP and general manager of Sony Innovation Studios. Prior to joining Sony Pictures almost nine years ago, he spent 13 years at Warner Bros. as VP of technology/motion picture imaging and head of technology/feature animation. His new role will start in earnest on April 1.

“I am excited for this new challenge that combines roles as both CTO of Sony Pictures and GM of Sony Innovation Studios,” says Baggelaar. “The CTO’s office works both inside the studio and with the industry to develop key standards and technologies that can be adopted across the various lines of business. Sony Innovation Studios is developing groundbreaking tools, methods and techniques for realtime volumetric virtual production — or as we like to say, ‘the future of movie magic’ — with a level of fidelity and quality that is best in class. With the technicians, engineers and artisans at Sony Innovation Studios combined with our studio technology team, we will be able to bring new experiences and technologies to all areas of production and delivery.”

Baggelaar’s promotion is part of a larger announcement by Sony, which involves a new team established within Sony Pictures — the Entertainment Innovation & Technology Group, Sony Pictures Entertainment, which encompasses the following departments: Sony Innovation Studios (SIS), Technology Development, IP Acceleration and Branded Integration.

The group is headed by Yasuhiro Ito, executive VP, Entertainment Innovation & Technology Group. Don Eklund will be leaving his post as EVP/CTO of technology development at the end of March. Eklund has had a long history with SPE and has been in his current role since 2017, establishing the foundation of the studio’s technology development activities.

“This new role combines my years of experience in production, post and VFX; my work with the broader industry and organizations; and my work with Sony companies around the world over the past eight and a half years — along with my more recent endeavors into virtual production — to create a truly unique opportunity for technical innovation that only Sony can provide,” concludes Baggelaar, who will report directly to Ito.

Kevin Lau heads up advertising, immersive at Digital Domain

Visual effects studio Digital Domain has brought on Kevin Lau as executive creative director of advertising, games and new media. In this newly created position, Lau will oversee all short-form projects and act as a creative partner for agencies and brands.

Lau brings over 18 years of ad-based visual effects and commercial production experience, working on campaigns for brands such as Target, Visa and Sprint.

Most recently, he was the executive creative director and founding partner at Timber, an LA-based studio focused on ads (GMC, Winter Olympics) and music videos (Kendrick Lamar’s Humble). Prior to that, he held creative director positions at Mirada, Brand New School and Superfad. Throughout his career, his work has been honored with multiple awards including Clios, AICP Awards, MTV VMAs and a Cannes Gold Lion for Sprint’s “Now Network” campaign via Goodby.

Lau, who joins Digital Domain EPs Nicole Fina and John Canning as they continue to build the studio’s short-form business, will help unify the vision for the advertising, games and new media/experiential groups, promoting a consistent voice across campaigns.

Lau joins the team as the new media group prepares to unveil its biggest project to date: Time’s The March, a virtual reality recreation of the 1963 March on Washington for Jobs and Freedom. Digital Domain’s experience with digital humans will play a major role in the future of both groups as they continue to build on the photoreal cinematics and in-game characters previously created for Activision, Electronic Arts and Ubisoft.

Quantum to acquire Western Digital’s ActiveScale business  

Quantum has entered into an agreement with Western Digital Technologies, a subsidiary of Western Digital Corp., to acquire its ActiveScale object storage business. The addition of the ActiveScale product line and engineers brings object storage software and erasure coding technology to Quantum’s portfolio and helps the company to expand in the object storage market.

The acquisition will extend the company’s role in storing and managing video and unstructured data using a software-defined approach. The transaction is expected to close by March 31, 2020. Financial terms of the transaction were not disclosed.

What are the benefits of object storage software?
• Scalability: Allows users to store, manage and analyze billions of objects and exabytes of capacity.
• Durability: ActiveScale object storage offers up to 19 nines of data durability using patented erasure coding protection technologies.
• Easy management at scale: Because object storage has a flat namespace (compared to a hierarchical file system structure), managing billions of objects and hundreds of petabytes of capacity is easier than with traditional network-attached storage, which, according to Quantum, reduces operational expenses. (See the sketch after this list for what a flat namespace looks like in practice.)
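
To make the flat-namespace idea concrete, here is a minimal Python sketch of how an application might talk to an S3-compatible object store. The endpoint, bucket, keys and credentials are placeholders for illustration, not details of the ActiveScale or Quantum products.

import boto3

# Hypothetical endpoint and credentials, stand-ins for any S3-compatible object store.
s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.example.com",
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# There is no directory tree to create or traverse: the slashes below are simply
# part of one opaque key in a single flat bucket namespace.
with open("final_v3.mov", "rb") as f:
    s3.put_object(Bucket="media-archive", Key="show-042/ep-103/conform/final_v3.mov", Body=f)

# Listing by key prefix stands in for browsing folders.
resp = s3.list_objects_v2(Bucket="media-archive", Prefix="show-042/ep-103/")
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])

Because every object is addressed by a single key, the store scales by adding keys rather than by deepening a directory tree, which is what makes billions of objects manageable.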

Quantum has been offering object storage, selling and supporting the ActiveScale product line, for over five years. Object storage can be used as an active-archive tier of storage, where StorNext file storage is used for high-performance ingest and processing of data, object storage acts as a lower-cost online content repository, and tape acts as the lowest-cost cold storage tier.

For M&E, object storage is used as a long-term content repository for video content, in movie and TV production, in sports video and even for large corporate video departments. Those working in movie and TV production require very high-performance ingest, editing, processing and rendering of their video files, which is typically done with a file system like StorNext. Once content is finished, it is preserved in an object store, with StorNext data management handling the data movement between file and object tiers.
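
As a rough illustration of that active-archive pattern, the tier-selection logic might look like the Python sketch below. This is a conceptual example with made-up statuses and thresholds, not StorNext's actual policy engine or API.

from dataclasses import dataclass

@dataclass
class Asset:
    path: str
    status: str      # "in_production" or "finished"; illustrative states only
    days_idle: int

def target_tier(asset: Asset) -> str:
    """Pick a storage tier: fast file storage while content is being worked on,
    an object-store repository once it is finished, tape once it goes cold."""
    if asset.status == "in_production":
        return "file"      # high-performance ingest/edit/render tier (e.g., a StorNext file system)
    if asset.days_idle > 365:
        return "tape"      # lowest-cost cold storage
    return "object"        # lower-cost online content repository

print(target_tier(Asset("ep101_final.mov", "finished", days_idle=30)))   # -> object

In a real deployment the data mover applies rules like these automatically, keeping track of which tier actually holds each file's content.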

“Object storage software is an obvious fit with our strategy, our go-to-market focus and within our technology portfolio,” says Jamie Lerner, president/CEO of Quantum. “We are committed to the product, and to making ActiveScale customers successful, and we look forward to engaging with them to solve their most pressing business challenges around storing and managing unstructured data. With the addition of the engineers and scientists that developed the erasure-coded object store software, we can deliver on a robust technical roadmap, including new solutions like an object store built on a combination of disk and tape.”

Colorist Chat: Light Iron supervising colorist Ian Vertovec

“As colorists, we are not just responsible for enhancing each individual shot based on the vision of the filmmakers, but also for helping to visually construct an emotional arc over time.”

NAME: Ian Vertovec

TITLE: Supervising Colorist

COMPANY: Light Iron

CAN YOU DESCRIBE YOUR ROLE IN THE COMPANY?
A Hollywood-based collaborator for motion picture finishing, with a studio in New York City as well.

GLOW

AS A COLORIST, WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
As colorists, we are not just responsible for enhancing each individual shot based on the vision of the filmmakers, but also for helping to visually construct an emotional arc over time. For example, a warm scene feels warmer coming out of a cool scene as opposed to another warm scene. We have the ability and responsibility to nudge the audience emotionally over the course of the film. Using color in this way makes color grading a bit like a cross between photography and editing.

ARE YOU SOMETIMES ASKED TO DO MORE THAN JUST COLOR ON PROJECTS?
Once in a while, I’ll be asked to change the color of an object, like change a red dress to blue or a white car to black. While we do have remarkable tools at our disposal, this isn’t quite the correct way to think about what we can do. Instead of being able to change the color of objects, it’s more like we can change the color of the light shining on objects. So instead of being able to turn a red dress to blue, I can change the light on the dress (and only the dress) to be blue. So while the dress will appear blue, it will not look exactly how a naturally blue dress would look under white light.
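
A toy numeric model makes that distinction clearer. If you treat the graded result as the per-channel product of the light and the surface (a simplification of what a grade actually does, with made-up RGB numbers), the limitation falls out of the arithmetic:

import numpy as np

red_dress_reflectance  = np.array([0.85, 0.10, 0.08])   # a red dress reflects mostly red light
blue_dress_reflectance = np.array([0.08, 0.12, 0.85])   # a naturally blue dress reflects mostly blue

white_light = np.array([1.00, 1.00, 1.00])
blue_light  = np.array([0.25, 0.45, 1.00])               # the "blue light" a colorist can put on the dress

# Perceived color is roughly light * reflectance, channel by channel.
print(red_dress_reflectance * blue_light)    # roughly [0.21, 0.05, 0.08]: dark and muted, not vivid blue
print(blue_dress_reflectance * white_light)  # [0.08, 0.12, 0.85]: how a truly blue dress reads

The red dress under blue light goes dark and desaturated instead of turning into the saturated blue of a dress that actually reflects blue, which is exactly the effect described above.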

WHAT’S YOUR FAVORITE PART OF THE JOB?
There is a moment with new directors, after watching the first finished scene, when they realize they have made a gorgeous-looking movie. It’s their first real movie, which they never fully saw until that moment — on the big screen, crystal clear and polished — and it finally looks how they envisioned it. They are genuinely proud of what they’ve done, as well as appreciative of what you brought out in their work. It’s an authentic filmmaking moment.

WHAT’S YOUR LEAST FAVORITE?
Working on multiple jobs at a time and long days can be very, very draining. It’s important to take regular breaks to rest your eyes.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Something with photography, VFX or design, maybe.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I was doing image manipulation in high school and college before I even knew what color grading was.

Just Mercy

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Just Mercy, Murder Mystery, GLOW, What We Do in the Shadows and Too Old to Die Young.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
Sometimes your perspective and a filmmaker’s perspective for a color grade can be quite divergent. There can be a temptation to take the easy way and either defer or overrule. I find tremendous value in actually working out those differences and seeing where and why you are having a difference of opinion.

It can be a little scary, as nobody wants to be perceived as confrontational, but if you can civilly explain where and why you see a different approach, the result will almost always be better than what either of you thought possible in the first place. It also allows you to work more closely and understand each other’s creative instincts more accurately. Those are the moments I am most proud of — when we worked through an awkward discord and built something better.

WHERE DO YOU FIND INSPIRATION?
I have a fairly extensive library of Pinterest boards — mostly paintings — but it’s real life and being in the moment that I find more interesting. The color of a green leaf at night under a sodium vapor light, or how sunlight gets twisted by a plastic water bottle — that is what I find so cool. Why ruin that with an Insta post?

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
FilmLight Baselight’s Base Grade, FilmLight Baselight’s Texture Equalizer and my Red Hydrogen.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Instagram mostly.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
After working all day on a film, I often don’t feel like watching another movie when I get home because I’ll just be thinking about the color.  I usually unwind with a video game, book or podcast. The great thing about a book or video games is that they demand your 100% attention. You can’t be simultaneously browsing social media or the news  or be thinking about work. You have to be 100% in the moment, and it really resets your brain.

Nomad Editorial hires eclectic editor Dan Maloney

Nomad Editing Company has added editor Dan Maloney to its team. Maloney is best known for his work cutting wry, eclectic comedy spots in addition to more emotional content. While his main tool is Avid Media Composer, he is also well-versed in Adobe Premiere.

“I love that I get to work in so many different styles and genres. It keeps it all interesting,” he says.

Prior to joining Nomad, Maloney cut at studios such as Whitehouse Post, Cut+Run, Spot Welders and Deluxe’s Beast. Throughout his career, Maloney has used his eye for composition on a wide range of films, documentaries, branded content and commercials, including the Tide Interview spot that debuted at Super Bowl XLII.

“My editing style revolves mostly around performance and capturing that key moment,” he says. “Whether I’m doing a comedic or dramatic piece, I try to find that instance where an actor feels ‘locked in’ and expand the narrative out from there.”

According to Nomad editor/partner Jim Ulbrich, “Editing is all about timing and pace. It’s a craft and you can see Dan’s craftsmanship in every frame of his work. Each beat is carefully constructed to perfection across multiple mediums and genres. He’s not simply a comedy editor, visual storyteller, or doc specialist. He’s a skilled craftsman.”

Director James Mangold on Oscar-nominated Ford v Ferrari

By Iain Blair

Filmmaker James Mangold has been screenwriting, producing and directing for years. He has made films about country legends (Walk the Line), cowboys (3:10 to Yuma), superheroes (Logan) and cops (Cop Land), and has tackled mental illness (Girl, Interrupted) as well.

Now he’s turned his attention to race car drivers and endurance racing with his movie Ford v Ferrari, which has earned Mangold an Oscar nomination for Best Picture. The film also received nods for its editing, sound editing and sound mixing.

James Mangold (beard) on set.

The high-octane drama was inspired by a true-life friendship that forever changed racing history. In 1959, Carroll Shelby (Matt Damon) is on top of the world after winning the most difficult race in all of motorsports, the 24 Hours of Le Mans. But his greatest triumph is followed quickly by a crushing blow — the fearless Texan is told by doctors that a grave heart condition will prevent him from ever racing again.

Endlessly resourceful, Shelby reinvents himself as a car designer and salesman working out of a warehouse space in Venice Beach with a team of engineers and mechanics that includes hot-tempered test driver Ken Miles (Christian Bale). A champion British race car driver and a devoted family man, Miles is brilliant behind the wheel, but he’s also blunt, arrogant and unwilling to compromise.

After Shelby’s vehicles make a strong showing at Le Mans against Italy’s venerable Enzo Ferrari, Ford Motor Company recruits the firebrand visionary to design the ultimate race car, a machine that can beat even Ferrari on the unforgiving French track. Determined to succeed against overwhelming odds, Shelby, Miles and their ragtag crew battle corporate interference, the laws of physics and their own personal demons to develop a revolutionary vehicle that will outshine every competitor. The film culminates in the historic showdown between the US and Italy at the grueling 1966 24 Hours of Le Mans.

Mangold’s below-the-line talent, many of whom have collaborated with the director before, includes Academy Award-nominated director of photography Phedon Papamichael; film editors Michael McCusker, ACE, and Andrew Buckland; visual effects supervisor Olivier Dumont; and composers Marco Beltrami and Buck Sanders.

L-R: Writer Iain Blair and Director James Mangold

I spoke with Mangold — whose other films include Logan, The Wolverine and Knight and Day — about making the film and his workflow.

You obviously love exploring very different subject matter in every film you make.
Yes, and I do every movie like a sci-fi film — meaning inventing a new world that has its own rules, customs, language, laws of physics and so on, and you need to set it up so the audience understands and gets it all. It’s like being a world-builder, and I feel every film should have that, as you’re entering this new world, whether it’s Walk the Line or The French Connection. And the rules and behavior are different from our own universe, and that’s what makes the story and characters interesting to me.

What sort of film did you set out to make?
Well, given all that, I wanted to make an exciting racing movie about that whole world, but it’s also that it was a moment when racing was free of all things that now turn me off about it. The cars were more beautiful then, and free of all the branding. Today, the cars are littered with all the advertising and trademarks — and it’s all nauseating to me. I don’t even feel like I’m watching a sport anymore.

When this story took place, it was also a time when all the new technology was just exploding. Racing hasn’t changed that much over the past 20 years. It’s just refining and tweaking to get that tiny edge, but back in the ‘60s they were still inventing the modern race car, and discovering aerodynamics and alternate building materials and methods. It was a brand-new world, so there was this great sense of discovery and charm along with all that.

What were the main technical challenges in pulling it all together?
Trying to do what I felt all the other racing movies hadn’t really done — taking the driving out of the CG world and putting it back in the real world, so you could feel the raw power and the romanticism of racing. A lot of that’s down to the particulates in the air, the vibrations of the camera, the way light moves around the drivers — and the reality of behavior when you’re dealing with incredibly powerful machines. So right from the start, I decided we had to build all the race cars; that was a huge challenge right there.

How early on did you start integrating post and all the VFX?
Day one. I wanted to use real cars and shoot the Le Mans and other races in camera rather than using CGI. But this is a period piece, so we did use a lot of CGI for set extensions and all the crowds. We couldn’t afford 50,000 extras, so just the first six rows or so were people in the stands; the rest were digital.

Did you do a lot of previz?
A lot, especially for Le Mans, as it was such a big, three-act sequence with so many moving parts. We used far less for Daytona. We did a few storyboards, and then my second unit director, Darrin Prescott — who has choreographed car chases and races in such movies as Drive, Deadpool 2, Baby Driver and The Bourne Ultimatum — and I planned it out using matchbox cars.

I didn’t want that “previzy” feeling. Even when I do a lot of previz, whether it’s a Marvel movie or like this, I always tell my previz team “Don’t put the camera anywhere it can’t go.” One of the things that often happens when you have the ability to make your movie like a cartoon in a laboratory — which is what previz is — is that you start doing a lot of gimmicky shots and flying the camera through keyholes and floating like a drone, because it invites you to do all that crazy shit. It’s all very show-offy as a director — “Look at me!” — and a turnoff to me. It takes me out of the story, and it’s also not built off the subjective experience of your characters.

This marks your fifth collaboration with DP Phedon Papamichael, and I noticed there are no big swooping camera moves or the beauty-shot approach you see in all the car commercials.
Yes, we wanted it to look beautiful, but in a real way. There’s so much technology available now, like gyroscopic setups and arms that let you chase the cars in high-speed vehicles down tracks. You can do so much, so why do you need to do more? I’m conservative that way. My goal isn’t to brand myself through my storytelling tricks.

How tough was the shoot?
It was one of the most fun shoots I’ve ever had, with my regular crew and a great cast. But it was also very grueling, as we were outside a lot, often in 115-degree heat in the desert on blacktop. And locations were big challenges. The original Le Mans course doesn’t exist anymore like it used to be, so we used several locations in Georgia to double for it. We shot the races wide-angle anamorphic with a team of a dozen professional drivers, and with anamorphic you can shoot the cars right up into the lens — just inches away from camera, while they’d be doing 150 mph or 160 mph.

Where did you post?
All on the Fox lot at my offices. We scored at Capitol Records and mixed the score in Malibu at my composer’s home studio. I really love the post, and for me it’s all part of the same process — the same cutting and pasting I do when I’m writing, and even when I’m directing. You’re manipulating all these elements and watching it take form — and particularly in this film, where all the sound design and music and dialogue are all playing off one another and are so key. Take the races. By themselves, they look like nothing. It’s just a car whipping by. The power of it all only happens with the editing.

You had two editors — Michael McCusker and Andrew Buckland. How did that work?
Mike’s been with me for 20 years, so he’s kind of the lead. Mike and Drew take and trade scenes, and they’re good friends so they work closely together. I move back and forth between them, which also gives them each some space. It’s very collaborative. We all want it to look beautiful and elegant and well-designed, but no one’s a slave to any pre-existing ideas about structure or pace. (Check out postPerspective‘s interview with the editing duo here.)

What were the big editing challenges?
It’s a car racing movie with drama, so we had to hit you with adrenalin and then hold you with what’s a fairly procedural and process-oriented film about these guys scaling the corporate wall to get this car built and on the track. Most of that’s dramatic scenes. The flashiest editing is the races, which was a huge, year-long effort. Mike was cutting the previz before we shot a foot, and initially we just had car footage, without the actors, so that was a challenge. It all transformed once we added the actors.

Can you talk about working on the visual effects with Method’s VFX supervisor Olivier Dumont?
He did an incredible job, and no one realizes just how many shots there are. They’re really invisible, and that’s what I love — the film feels 100% analog, but of course it isn’t. It’s impossible to build giant race tracks as they were in the ‘60s. But having real foregrounds really helped. We had very few scenes where actors were wandering around in a green void like on so many movies now. So you’re always anchored in the real world, and then all the set extensions were in softer focus or backlit.

This film really lends itself to sound.
Absolutely, as every car has its own signature sound, and as we cut rapidly from interiors to exteriors, from cars to pits and so on. The aural perspective shifts are exciting, but we also tried to keep it simple and not lose the dramatic identity of the story. We even removed sounds in the mix if they weren’t important, so we could focus on what was important.

Where did you do the DI, and how important is it to you?
At Efilm with Skip Kimball (working on Blackmagic DaVinci Resolve), and it was huge on this. Dealing with the 24-hour race, the changing light, rain and night scenes, and having to match five different locations was a nightmare. So we worked on all that and the overall look from early on in the edit.

What’s next?
Don’t know. I’ve got two projects I’m working on. We’ll see.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Talking with Franki Ashiruka of Nairobi’s Africa Post Office

By Randi Altman

After two decades of editing award-winning film and television projects for media companies throughout Kenya and around the world, Franki Ashiruka opened Africa Post Office, a standalone post house in Nairobi, Kenya. The studio provides color grading, animation, visual effects, motion graphics, compositing and more. In addition, they maintain a database of the Kenyan post production community that allows them to ramp up with the right artists when the need arises.

Here she talks about the company, its workflow and being a pioneer in Nairobi’s production industry.

When did you open Africa Post Office, and what was your background prior to starting this studio?
Africa Post Office (APO) opened its doors in February 2017. Prior to starting APO, I was a freelance editor with plenty of experience working with well-established media houses such as Channel 4 (UK), Fox International Channels (UK), 3D Global Leadership (Nigeria), PBS (USA), Touchdown (New Zealand), Greenstone Pictures (New Zealand) and Shadow Films (South Africa).

In terms of Kenya-based projects, I’ve worked with a number of production houses including Quite Bright Films, Fat Rain Films, Film Crew in Africa, Mojo Productions, Multichoice, Zuku, Content House and Ginger Ink Films.

I imagine female-run, independent studios in Africa are rare?
On the contrary, Kenya has reached a point where more and more women are emerging as leaders of their own companies. I actually think there are more women-led film production companies than male-led. The real challenge was that before APO, there was nothing quite like it in Nairobi. Historically, video production here was very vertical — if you shot something, you’d need to also manage post within whatever production house you were working in. There were no standalone post houses until us. That said, with my experience, even though it was hugely daunting, I never thought twice about starting APO. It is what I have always wanted to do, and if being the first company of our kind didn’t intimidate me, being female was never going to be a hindrance.

L-R: Franki Ashiruka, Kevin Kyalo, Carole Kinyua and Evans Wenani

What is the production and post industry like in Nairobi? 
When APO first opened, the workload was commercial-heavy, but in the last two years that has steadily declined. We’re seeing this gap filled by documentary films, corporate work and television series. Feature films are also slowly gaining traction and becoming the focus of many up-and-coming filmmakers.

What services do you provide, and what types of projects do you work on?
APO has a proven track record of successful delivery on hundreds of film and video projects for a diverse range of clients and collaborators, including major corporate entities, NGOs, advertising and PR agencies, and television stations. We also have plenty of experience mastering according to international delivery standards. We’re proud to house a complete end-to-end post ecosystem of offline and online editing suites.

Most importantly, we maintain a very thorough database of the post production community in Kenya.
This is of great benefit to our clients who come to us for a range of services including color grading, animation, visual effects, motion graphics and compositing. We are always excited to collaborate with the right people and get additional perspectives on the job at hand. One of our most notable collaborators is Ikweta Arts (Avatar, Black Panther, Game of Thrones, Hacksaw Ridge), owned and run by Yvonne Muinde. They specialize in providing VFX services with a focus in quality matte painting/digital environments, art direction, concept and post visual development art. We also collaborate with Keyframe (L’Oréal, BMW and Mitsubishi Malaysia) for motion graphics and animations.

Can you name some recent projects and the work you provided?
We are incredibly fortunate to be able to select projects that align with our beliefs and passions.

Our work on the short film Poacher (directed by Tom Whitworth) won us three global Best Editing Awards from the Short to the Point Online Film Festival (Romania, 2018), Feel the Reel International Film Festival (Glasgow, 2018) and Five Continents International Film Festival (Venezuela, 2019).

Other notable work includes three feature documentaries for the Big Story segment on China Global Television Network, directed by Juan Reina (director of the Netflix Original film Diving Into the Unknown); Lion’s Den (Quite Bright Films), an adaptation of ABC’s Shark Tank; and The Great Kenyan Bake Off (Showstopper Media), adapted from the BBC series The Great British Bake Off. We also worked on Disconnect, a feature film produced by Kenya’s Tosh Gitonga (Nairobi Half Life), a director who is passionate about taking Africa’s budding film industry to the next level. We have also worked on a host of television commercials for clients extending across East Africa, including Kenya, Rwanda, South Sudan and Uganda.

What APO is most proud of though, is our clients’ ambitions and determination to contribute toward the growth of the African film industry. This truly resonates with APO’s mantra.

You recently added a MAM and some other gear. Can you talk about the need to upgrade?
Bringing on the EditShare EFS 200 nodes has significantly improved the collaborative possibilities of APO. We reached a point where we were quickly growing, and the old approach just wasn’t going to cut it.

Prior to centralizing our content, projects lived on individual hard disks. This meant that if I was editing and needed my assistant to find me a scene or a clip, or I needed VFX on something, I would have to export individual clips to different workstations. This created workflow redundancies and increased potential for versioning issues, which is something we couldn’t afford to be weighed down with.

The remote capabilities of the EditShare system were very appealing as well. Our color grading collaborator, Nic Apostoli of Comfort and Fame, is based in Cape Town, South Africa. From there, he can access the footage on the server and grade it while the client reviews with us in Nairobi. Flow media asset management also helps in this regard. We’re able to effectively organize and index clips, graphics, versions, etc. into clearly marked folders so there is no confusion about what media should be used. Collaboration among the team members is now seamless regardless of their physical location or tools used, which include the Adobe Creative Suite, Foundry Nuke, Autodesk Maya and Maxon Cinema 4D.

Any advice for others looking to break out on their own and start a post house?
Know what you want to do, and just do it! Thanks Nike …


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

The 71st NATAS Technology & Engineering Emmy Award winners

The National Academy of Television Arts & Sciences (NATAS) has announced the recipients of the 71st Annual Technology & Engineering Emmy Awards. The event will take place in partnership with the National Association of Broadcasters, during the NAB Show on Sunday, April 19 in Las Vegas.

The Technology & Engineering Emmy Awards are awarded to a living individual, a company or a scientific or technical organization for developments and/or standardization involved in engineering technologies that either represent so extensive an improvement on existing methods or are so innovative in nature that they have materially affected television.

A committee of engineers working in television considers technical developments in the industry and determines which, if any, merit an award.

“The Technology & Engineering Emmy Award was the first Emmy Award issued in 1949, and it laid the groundwork for all the other Emmys to come,” says Adam Sharp, CEO/president of NATAS. “We are especially excited to be honoring Yvette Kanouff with our Lifetime Achievement Award in Technology & Engineering.”

Kanouff has held CTO and president roles at various companies in the cable and media industry. Over the years, she has spearheaded transformational technologies, such as video on demand, cloud DVR, digital and on-demand advertising, streaming security and privacy.

And now the Awards recipients:

Pioneering System for Live Performance-Based Animation Using Facial Recognition
– Adobe

HTML5 Development and Deployment of a Full TV Experience on Any Device
– Apple
– Google
– LG
– Microsoft
– Mozilla
– Opera
– Samsung

Pioneering Public Cloud-Based Linear Media Supply Chains
– AWS
– Discovery
– Evertz
– Fox Neo (Walt Disney Television)
– SDVI

Pioneering Development of Large Scale, Cloud Served, Broadcast Quality,
Linear Channel Transmission to Consumers
– Sling TV
– Sony PlayStation Vue
– Zattoo

Early Development of HSM Systems That Created a Pivotal Improvement in Broadcast Workflows
– Dell (Isilon)
– IBM
– Masstech
– Quantum

Pioneering Development and Deployment of Hybrid Fiber Coax Network Architecture
– Cable Labs

Pioneering Development of the CCD Image Sensor
– Bell Labs
– Michael Tompsett

VoCIP (Video over Bonded Cellular Internet)
– Aviwest
– Dejero
– LiveU
– TVU Networks

Ultra-High Sensitivity HDTV Camera
– Canon
– Flovel

Development of Synchronized Multi-Channel Uncompressed Audio Transport Over IP Networks
– ALC NetworX
– Audinate
– Audio Engineering Society
– Kevin Gross
– QSC
– Telos Alliance
– Wheatstone

Emmy statue image courtesy of ATAS/NATAS

Review: HP’s ZBook G6 mobile workstation

By Brady Betzel

In a year that’s seen AMD reveal an affordable 64-core processor with its Threadripper 3, it appears as though we are picking up steam toward next-level computing.

Apple finally released its much-anticipated Mac Pro (which comes with a hefty price tag for the 1.5TB upgrade), and custom-build workstation companies — like Boxx and Puget Systems — can customize good-looking systems to fit any need you can imagine. Additionally, over the past few months, I have seen mobile workstations leveling the playing field with their desktop counterparts.

HP is well-known in the M&E community for its powerhouse workstations. Since I started my career, I have either worked on a Mac Pro or an HP. Both have their strong points. However, for workstation users who must be able to travel with their systems, there have always been some technical abilities you had to give up in exchange for a smaller footprint. That is, until now.

The newly released HP ZBook 15 G6 has become the rising tide that will float all the boats in the mobile workstation market. I know I’ve said it before, but the classification of “workstation” is technically much more than just a term companies throw around. The systems with workstation-level classification (at least from HP) are meant to be powered on and run at high levels 24 hours a day, seven days a week, 365 days a year.

They are built with high-quality, enterprise-level components, such as ECC (error-correcting code) memory. ECC memory will self-correct errors that it sees, preventing things like blue screens of death and other screen freezes. ECC memory comes at a cost, and that is why these workstations are priced a little higher than a standard computer system. In addition, the warranties are a little more inclusive — the HP ZBook 15 G6 comes with a standard three-year/on-site service warranty.
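
For anyone curious about what self-correcting memory means mechanically, here is a toy Hamming(7,4) example in Python. Real ECC DIMMs use wider codes implemented in the memory controller, so this only illustrates the principle of detecting and flipping back a single bad bit.

def encode(d):
    # d is a list of 4 data bits; return a 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4      # parity over codeword positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4      # parity over codeword positions 3, 6, 7
    p3 = d2 ^ d3 ^ d4      # parity over codeword positions 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def decode(c):
    # Recompute the parity checks; the syndrome points at the flipped bit, if any.
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3     # 1-based position of the error, 0 means clean
    if pos:
        c[pos - 1] ^= 1            # correct the single-bit error in place
    return [c[2], c[4], c[5], c[6]]

word = encode([1, 0, 1, 1])
word[5] ^= 1                          # simulate a single-bit memory error
assert decode(word) == [1, 0, 1, 1]   # the original data comes back intact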

Beyond the “workstation” classification, the ZBook 15 G6 is amazingly powerful, brutally strong and incredibly colorful and bright. But what really matters is under the hood. I was sent the HP ZBook 15 G6 that retails for $4,096 and contains the following specs:
– Intel Xeon E-2286M (eight cores/16 threads — 2.4GHz base/5GHz Turbo)
– Nvidia Quadro RTX 3000 (6GB VRAM)
– 15.6-inch UHD HP DreamColor display, anti-glare, WLED backlit, 600 nits, 100% DCI-P3
– 64GB DDR4 2667MHz
– 1TB PCIe Gen 3 x4 NVMe SSD TLC
– FHD webcam 1080p plus IR camera
– HP collaboration keyboard with dual point stick
– Fingerprint sensor
– Smart Card reader
– Intel Wi-Fi 6 AX 200, 802.11ac 2×2 +BT 4.2 combo adapter (vPro)
– HP long-life battery four-cell 90 Wh
– Three-year limited warranty

The ZBook 15 G6 is a high-end mobile workstation with a price that reflects it. However, as I said earlier, true workstations are built to withstand constant use and, in this case, abuse. The ZBook 15 G6 has been designed to pass up to 21 extensive MIL-STD 810G tests, which is essentially worst-case-scenario testing: for instance, drop testing from around four feet, sand and dust testing, radiation testing (the sun beating down on the laptop for an extended period) and much more.

The exterior of the G6 is made of aluminum and built to withstand abuse. The latest G6 is a little bulky/boxy, in my opinion, but I can see why it would hold up to some bumps and bruises, all while working at blazingly fast speeds, so bulk isn’t a huge issue for me. Because of that bulk, you can imagine that this isn’t the lightest laptop either. It weighs in at 5.79 pounds for the lowest end and measures 1 inch by 14.8 inches by 10.4 inches.

On the bottom of the workstation is an easy-to-access panel for performing repairs and upgrades yourself. I really like the bottom compartment. I opened it and noticed I could throw in an additional NVMe drive and an SSD if needed. You can also access memory here. I love this because not only can you perform easy repairs yourself, but you can perform upgrades or part replacements without voiding your warranty on the original equipment. I’m glad to see that HP kept this in mind.

The keyboard is smaller than a full-size version but has a number keypad, which I love using when typing in timecodes. It is such a time-saver for me. (I credit entering in repair order numbers when I fixed computers at Best Buy as a teenager.) On the top of the keyboard are some handy shortcuts if you do web conferences or calls on your computer, including answering and ending calls. The Bang & Olufsen speakers are some of the best laptop speakers I’ve heard. While they aren’t quite monitor-quality, they do have some nice sound on the low end that I was able to fine-tune in the Bang & Olufsen audio control app.

Software Tests
All right, enough of the technical specs. Let’s get on to what people really want to know — how the HP ZBook 15 G6 performs while using apps like Blackmagic’s DaVinci Resolve and Adobe Premiere Pro. I used sample Red and Blackmagic Raw footage that I use a lot in testing. You can grab the Red footage here and the BRaw footage here. Keep in mind you will need to download the BRaw software to edit with BRaw inside of Adobe products, which you can find here.

Performance monitor while exporting in Resolve with VFX.

For testing in Resolve and Premiere, I strung out one minute of 4K, 6K and 8K Red media in one sequence and the 4608×2592 4K and 6K BRaw media in another. During the middle of my testing, Resolve had a giant Red API upgrade to allow for better realtime playback of Red Raw files if you have an Nvidia CUDA-based GPU.

First up is Resolve 16.1.1 and then Resolve 16.1.2. Both sequences are set to UHD (3840×2160) resolution. One sequence of each codec contains just color correction, while another of each codec contains effects and color correction. The Premiere sequence with color and effects contains basic Lumetri color correction, noise reduction (50) and a Gaussian blur with settings of 0.4. In Resolve, the only difference in the color and effects sequence is that the noise reduction is spatial and set to Enhanced, Medium and 25/25.

In Resolve, the 4K Red media would play in realtime while the 6K (RedCode 3:1) would jump down to about 14fps to 15fps, and the 8K (RedCode 7:1) would play at 10fps at full resolution with just color correction. With effects, the 4K media would play at 20fps, 6K at 3fps and 8K at 10fps. The Blackmagic Raw video would play at real time with just color correction and around 3fps to 4fps with effects.

This is where I talk about just how loud the fans in the ZBook 15 G6 can get. When running exports and benchmarks, the fans are noticeable and a little distracting. Obviously, we are running some high-end testing with processor- and GPU-intensive tests but still, the fans were noticeable. However, the bottom of the mobile workstation was not terribly hot, unlike the MacBook Pros I’ve tested before. So my lap was not on fire.

In my export testing, I used those same sequences as before, exporting from Adobe Premiere Pro 2020. I exported UHD files using Adobe Media Encoder in different containers and codecs: H.264 (MOV), H.265 (MOV), ProResHQ, DPX, DCP and MXF OP1a (XDCAM). The MXF OP1a export was at 1920x1080.
Here are my results:

Red (4K, 6K, 8K)
– Color only: H.264 – 5:27; H.265 – 4:45; ProResHQ – 4:29; DPX – 3:37; DCP – 10:38; MXF OP1a – 2:31
– Color, noise reduction (50), Gaussian blur 0.4: H.264 – 4:56; H.265 – 4:56; ProResHQ – 4:36; DPX – 4:02; DCP – 8:20; MXF OP1a – 2:41

Blackmagic Raw
– Color only: H.264 – 2:05; H.265 – 2:19; ProResHQ – 2:04; DPX – 3:33; DCP – 4:05; MXF OP1a – 1:38
– Color, noise reduction (50), Gaussian blur 0.4: H.264 – 1:59; H.265 – 2:22; ProResHQ – 2:07; DPX – 3:49; DCP – 3:45; MXF OP1a – 1:51

What is surprising is that when adding effects like noise reduction and a Gaussian blur in Premiere, the export times stayed similar. While using the ZBook 15 G6, I noticed my export times improved when I upgraded driver versions, so I re-did my tests with the latest Nvidia drivers to make sure I was consistent. The drivers also solved an issue in which Resolve wasn’t reading BRaw properly, so remember to always research drivers.

The Nvidia Quadro RTX 3000 really pulled its weight when editing and exporting in both Premiere and Resolve. In fact, in previous versions of Premiere, I noticed that the GPU was not really being used as well as it should have been. With the Premiere Pro 2020 upgrade it seems like Adobe really upped its GPU usage game — at some points I saw 100% GPU usage.

In Resolve, I performed similar tests, but I exported a DNxHR QuickTime file instead of ProResHQ and an IMF package instead of a DCP. For the most part, they are stock exports in the Deliver page of Resolve, except I forced Video Levels, Forced Debayer and Resizing to Highest Quality. Here are my results from Resolve versions 16.1.1 and 16.1.2 (the 16.1.2 times are in parentheses).

– Red (4K, 6K, 8K) Color Only: H.264 – 2:17 (2:31); H.265 – 2:23 (2:37); DNxHR – 2:59 (3:06); IMF – 6:37 (6:40); DPX – 2:48 (2:45); MXF OP1a – 2:45 (2:33)
– Red (4K, 6K, 8K) Color, Noise Reduction (Spatial, Enhanced, Medium, 25/25), Gaussian Blur 0.4: H.264 – 5:00 (5:15); H.265 – 5:18 (5:21); DNxHR – 5:25 (5:02); IMF – 5:28 (5:11); DPX – 5:23 (5:02); MXF OP1a – 5:20 (4:54)

– Blackmagic Raw Color Only: H.264 – 0:26 (0:25); H.265 – 0:31 (0:30); DNxHR – 0:50 (0:50); IMF – 3:51 (3:36); DPX – 0:46 (0:46); MXF OP1a – 0:23 (0:22)
– Blackmagic Raw Color, Noise Reduction (Spatial, Enhanced, Medium, 25/25), Gaussian Blur 0.4: H.264 – 7:51 (7:53); H.265 – 7:45 (8:01); DNxHR – 7:53 (8:00); IMF – 8:13 (7:56); DPX – 7:54 (8:18); MXF OP1a – 7:58 (7:57)

Interesting to note: Exporting Red footage with color correction only was significantly faster from Resolve, but for Red footage with effects applied, export times were similar between Resolve and Premiere. With the CUDA Red SDK update to Resolve in 16.1.2, I thought I would see a large improvement, but I didn’t. I saw an approximate 10% increase in playback but no improvement in export times.

Puget

Puget Systems has some great benchmarking tools, so I reached out to Matt Bach, Puget Systems’ senior labs technician, about my findings. He suggested that the mobile Xeon could possibly still be the bottleneck for Resolve. In his testing he saw a larger increase in speed with AMD Threadripper 3 and Intel i9-based systems. Regardless, I am kind of going deep on realtime playback of 8K Red Raw media on a mobile workstation — what a time we are in. Nonetheless, Blackmagic Raw footage was insanely fast when exporting out of Resolve, while export time for the Blackmagic Raw footage with effects was higher than I expected. There was a consistent use of the GPU and CPU in Resolve much like in the new version of Premiere 2020, which is a trend that’s nice to see.

In addition to Premiere and Resolve testing, I ran some common benchmarks that provide a good 30,000-foot view of the HP ZBook 15 G6 when comparing it to other systems. I decided to use the Puget Systems benchmarking tools. Unfortunately, at the time of this review, the tools were only working properly with Premiere and After Effects 2019, so I ran the After Effects benchmark using the 2019 version. The ZBook 15 G6 received an overall score of 802, render score of 79, preview score of 75.2 and tracking score of 86.4. These are solid numbers that beat out some desktop systems I have tested.

Corona

To test some 3D applications, I ran the Cinebench R20, which gave a CPU score of 3243, CPU (single core) score of 470 and an M/P ratio of 6.90x. I recently began running the Gooseberry benchmark scene in Blender to get a better sense of 3D rendering performance, and it took 29:56 to export. Using the Corona benchmark, it took 2:33 to render 16 passes, 3,216,368 rays/s. Using Octane Bench the ZBook 15 G6 received a score of 139.79. In the Vray benchmark for CPU, it received 9833 Ksamples, and in the Vray GPU testing, 228 mpaths. I’m not going to lie; I really don’t know a lot about what these benchmarks are trying to tell me, but they might help you decide whether this is the mobile workstation for your work.

Cinebench

One benchmark I thought was interesting between driver updates for the Nvidia Quadro RTX 3000 was the Neat Bench from Neat Video — the noise reduction plugin for video. It measures whether your system should use the CPU, GPU or a combination thereof to run Neat Video. Initially, the best combination result was to use the CPU only (seven cores) at 11.5fps.

After updating to the latest Nvidia drivers, the best combination result was to use the CPU (seven cores) and GPU (Quadro RTX 3000) at 24.2fps, more than double the CPU-only result, just from a driver update. Moral of the story: Always make sure you have the correct drivers!

Summing Up
Overall, the HP ZBook 15 G6 is a powerful mobile workstation that will work well across the board. From 3D to color correction apps, the Xeon processor in combination with the Quadro RTX 3000 will get you running 4K video without a problem. With the HP DreamColor anti-glare display using up to 600 nits of brightness and covering 100% of the DCI-P3 color space, coupled with the HDR option, you can rely on the attached display for color accuracy if you don’t have your output monitor attached. And with features like two USB Type-C ports (Thunderbolt 3 plus DP 1.4 plus USB 3.1 Gen 2), you can connect external monitors for a larger view of your work.

The HP Fast Charge will get you out of a dead battery fiasco with the ability to go from 0% to 50% charge in 45 minutes. All of this for around $4,000 seems to be a pretty low price to pay, especially because it includes a three-year on-site warranty and because the device is certified to work seamlessly with many apps that pros use with HP’s independent software vendor verifications.

If you are looking for a mobile workstation upgrade, are moving from desktop to mobile or want an alternative to a MacBook Pro, you should price a system out online.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

The Mill opens boutique studio in Berlin

Technicolor’s The Mill has officially launched in Berlin. The new boutique studio is situated in the creative hub of Mitte, in the heart of the city, near many of Germany’s agencies, production companies and brands.

The Mill has been working with German clients for years. Recent projects include the Mercedes’ Bertha Benz spot with director Sebastian Strasser; Netto’s The Easter Surprise, directed in-house by The Mill; and BMW The 8 with director Daniel Wolfe. The new studio will bring The Mill’s full range of creative services from color to experiential and interactive, as well as visual effects and design.

The Mill Berlin crew

Creative director Greg Spencer will lead the creative team. He is a multi-award-winning creative, having won several VES, Cannes Lions and British Arrow awards. His recent projects include Carlsberg’s The Lake, PlayStation’s This Could Be You and Eve Cuddly Toy. Spencer also played a role in some of Mill Film’s major titles. He was the 2D supervisor for Les Misérables and also worked on the Lord of the Rings trilogy. His resume also includes campaigns for brands such as Nike and Samsung.

Executive producer Justin Stiebel moves from The Mill London, where he has been since early 2014, to manage client relationships and new business. Since joining the company, Stiebel has produced spots such as Audi’s Next Level and the Mini’s “The Faith of a Few” campaign. He has also collaborated with directors such as Sebastian Strasser, Markus Walter and Daniel Wolfe while working on brands like Mercedes, Audi and BMW.

Sean Costelloe is managing director of The Mill London and The Mill Berlin.

Main Image Caption: (L-R) Justin Stiebel and Greg Spencer

Quantum F1000: a lower-cost NVMe storage option

Quantum is now offering the F1000, a lower-priced addition to the Quantum F-Series family of NVMe storage appliances. Using the software-defined architecture introduced with the F2000, the F1000 offers “ultra-fast streaming” performance and response times at a lower entry price. The F-Series can be used to accelerate the capture, editing and finishing of high-definition content and to accelerate VFX and CGI render speeds up to 100 times for developing augmented and virtual reality.

The Quantum F-Series was designed to handle content such as HD video used for movie, TV and sports production, advertising content or image-based workloads that require high-speed processing. Pros are using F-Series NVMe systems as part of Quantum’s StorNext scale-out file storage cluster and leveraging the StorNext data management capabilities to move data between NVMe storage pools and other storage pools. Users can take advantage of the performance boost NVMe provides for workloads that require it, while continuing to use lower-cost storage for data where performance is less critical.

Quantum F-Series NVMe appliances accelerate pro workloads and also help customers move from Fibre Channel networks to less expensive IP-based networks. User feedback has shown that pros need a lower cost of entry into NVMe technology, which is what led Quantum to develop the F1000. According to Quantum, the F1000 offers performance that is five to 10 times faster than an equivalent SAS SSD storage array at a similar price.

The F1000 is available in two capacity points: 39TB and 77TB. It offers the same connectivity options as the F2000 — 32Gb Fibre Channel or iSER/RDMA using 100Gb Ethernet — and is designed to be deployed as part of a StorNext scale out file storage cluster.

DP Chat: The Grudge’s Zachary Galler

By Randi Altman

Being on set is like coming home for New York-based cinematographer Zachary Galler, who as a child would tag along with his father while he directed television and film projects. The younger Galler started in the industry as a lighting technician and quickly worked his way up to shooting various features and series.

His first feature as a cinematographer, The Sleepwalker, premiered in 2014 and was later distributed by IFC. His second feature, She’s Lost Control, was awarded the C.I.C.A.E. Award at the Berlin International Film Festival later that year. His television credits include all eight episodes of Discovery’s scripted series Manhunt: Unabomber, Hulu’s The Act and USA’s Briarpatch (coming in February). He recently completed the Nicolas Pesce-directed thriller The Grudge, which stars John Cho and Betty Gilpin and is in theaters now.

Tell us about The Grudge. How early did you get involved in planning, and what direction were you given by the director about the look he wanted?
Nick and I worked together on a movie he directed called Piercing. That was our first collaboration, but we discovered that we had very similar ideas and working styles, and we formed a special relationship. Shortly after that project, we started talking about The Grudge, and about a year later we were shooting. We talked a lot about how this movie should feel, and how we could achieve something new and different from anything either of us had done before. We used a lot of look-books and movie references to communicate, so when it came time to shoot we had the visual language down fluently, and that allowed us to keep each other consistent in execution.

How would you describe the look?
Nick really liked the bleach-bypass look from David Fincher’s Se7en, and I thought about a mix of that and (photographer) Bill Henson. We also knew that we had to differentiate between the different storyline threads in the movie, so we had lots to figure out. One of the threads is darker and looks very yellow, while another is warmer and more classic. Another is slightly more desaturated and darker. We did keep the same bleach-bypass look throughout, but adjusted our color temperature, contrast and saturation accordingly. For a horror movie like this, I really wanted to be able to control where the shadow detail turned into black, because some of our scare scenes relied on that, so we made sure to light accordingly and were able to fine-tune most of that in-camera.

How did you work with the director and colorist to achieve that look?
We worked with FotoKem colorist Kostas Theodosiou (who used Blackmagic Resolve). I was shooting a TV show during the main color pass, so I only got to check in to set looks and approve final color, but Nick and Kostas did a beautiful job. Kostas is a master of contrast control and very tastefully helped us ride that line of where there should be detail and where there should not. He was definitely an important part of the collaboration and helped make the movie better.

Where was it shot and how long was the shoot?
We shot the movie in 35 days in Winnipeg, Canada.

How did you go about choosing the right camera and lenses for this project and why these tools?
Nick decided early on that he wanted to shoot this film anamorphic. Panavision has been an important partner for me on most of my projects, and I knew that I loved their glass. We got a range of different lenses from Panavision Toronto to help us differentiate our storylines — we shot one on T Series, one on Primo anamorphics and one on G Series anamorphics. The Alexa Mini was the camera of choice because of its low light sensitivity and more natural feel.

Now more general questions…

How did you become interested in cinematography?
My father was a director, so I would visit him on set a lot when I was growing up. I didn’t know quite what I wanted to do when I was young but I knew that it was being on set. After dropping out of film school, I got a job working in a lighting rental warehouse and started driving trucks and delivering lights to sets in New York. I had always loved taking pictures as a kid and as I worked more and learned more, I realized that what I wanted to do was be a DP. I was very lucky in that I found some great collaborators early on in my career that both pushed me and allowed me to fail. This is the greatest job in the world.

What inspires you artistically? And how do you simultaneously stay on top of advancing technology that serves your vision?
Artistically, I am inspired by painters, photographers and other DPs. There are so many people doing such amazing work right now. As far as technology is concerned, I’m a bit slow with adopting, as I need to hold something in my hands or see what it does before I adopt it. I have been very lucky to get to work with some great crews, and often a camera assistant, gaffer or key grip will bring something new to the table. I love that type of collaboration.

 

DP Zachary Galler (right) and director Nicolas Pesce on the set of Screen Gems’ The Grudge.

What new technology has changed the way you work?
For some reason, I was resistant to using LUTs for a long time. The Grudge was actually the first time I relied on something that wasn’t close to just plain Rec 709. I always figured that if I could get the 709 feeling good when I got into color I’d be in great shape. Now, I realize how helpful they can be, and that you can push much further. I also think that the Astera LED tubes are amazing. They allow you to do so much so fast and put light in places that would be very hard to do with other traditional lighting units.

What are some of your best practices or rules you try to follow on each job?
I try to be pretty laid back on set, and I can only do that because I’m very picky about who I hire in prep. I try and let people run their departments as much as possible and give them as much information as possible — it’s like cooking, where you try and get the best ingredients and don’t do much to them. I’ve been very lucky to have worked with some great crews over the years.

What’s your go-to gear — things you can’t live without?
I really try and keep an open mind about gear. I don’t feel romantically attached to anything, so that I can make the right choices for each project.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Directing Olly’s ‘Happy Inside Out’ campaign

How do you express how vitamins make you feel? Well, production company 1stAveMachine partnered with independent creative agency Yard NYC to develop the stylized “Happy Inside Out” campaign for Olly multivitamin gummies to show just that.

Beauty

The directing duo of Erika Zorzi and Matteo Sangalli, known as Mathery, highlighted the brand’s products and benefits by using rich textures, colors and lighting. They shot on an ARRI Alexa Mini. “Our vision was to tell a cohesive narrative, where each story of the supplements spoke the same visual language,” Mathery explains. “We created worlds where everything is possible and sometimes took each product’s concept to the extreme and other times added some romance to it.”

Each spot imagines various benefits of taking Olly products. The side-scrolling Energy, which features a green palette, shows a woman jumping and doing flips through life’s everyday challenges, including moving through her home to work, doing laundry and going to the movies. Beauty, with its pink color palette, features another woman “feeling beautiful” while turning the heads of a parliament of owls. Meanwhile, Stress, with its purple/blue palette, features a woman tied up in a giant ball of yarn, and as she unspools herself, the things that were tying her up spin away. In the purple-shaded Sleep, a lady lies in bed pulling off layer after layer of sleep masks until she just happily sleeps.

Sleep

The spots were shot with minimal VFX, other than a few greenscreen moments, and the team found itself making decisions on the fly, constantly managing logistics for stunt choreography, animal performances and wardrobe. Jogger Studios provided the VFX using Autodesk Flame for conform, cleanup and composite work. Adobe After Effects was used for all of the end tag animation. Cut+Run edited the campaign.

According to Mathery, “The acrobatic moves and obstacle pieces in the Energy spot were rehearsed on the same day of the shoot. We had to be mindful because the action was physically demanding on the talent. With the Beauty spot, we didn’t have time to prepare with the owls. We had no idea if they would move their heads on command or try to escape and fly around the whole time. For the Stress spot, we experimented with various costume designs and materials until we reached a look that humorously captured the concept.”

The campaign marks Mathery’s second collaboration with Yard NYC and Olly, which brought the directing team into the fold very early on, during the initial stages of the project. This familiarity gave everyone plenty of time to let the ideas breathe.

Recreating the Vatican and Sistine Chapel for Netflix’s The Two Popes

The Two Popes, directed by Fernando Meirelles, stars Anthony Hopkins as Pope Benedict XVI and Jonathan Pryce as current pontiff Pope Francis in a story about one of the most dramatic transitions of power in the Catholic Church’s history. The film follows a frustrated Cardinal Bergoglio (the future Pope Francis) who in 2012 requests permission from Pope Benedict to retire because of his issues with the direction of the church. Instead, facing scandal and self-doubt, the introspective Benedict summons his harshest critic and future successor to Rome to reveal a secret that would shake the foundations of the Catholic Church.

London’s Union was approached in May 2017 and supervised visual effects on location in Argentina and Italy over several months. A large proportion of the film takes place within the walls of Vatican City. The Vatican was not involved in the production and the team had very limited or no access to some of the key locations.

Under the direction of production designer Mark Tildesley, the production replicated parts of the Vatican at Rome’s Cinecittà Studios, including a life-size, open-ceilinged Sistine Chapel, which took two months to build.

The team LIDAR-scanned everything available and set about amassing as much reference material as possible — photographing from a permitted distance, scanning the set builds and buying every photographic book they could lay their hands on.

From this material, the team set about building 3D models — created in Autodesk Maya — of St. Peter’s Square, the Basilica and the Sistine Chapel. The environments team was tasked with texturing all of these well-known locations using digital matte painting techniques, including recreating Michelangelo’s masterpiece on the ceiling of the Sistine Chapel.

The story centers on two key changes of pope, in 2005 and 2013. Those events attracted huge attention, filling St. Peter’s Square with people eager to discover the identity of the new pope and celebrate his ascension, while news crews from around the world camped out to provide coverage for billions of Catholics.

To recreate these scenes, the crew shot at a school in Rome (Ponte Mammolo) that has the same pattern on its floor. A cast of 300 extras was shot in blocks, in different positions and at different times of day, with costume tweaks (including the addition of umbrellas) to build a library that would give the team enough flexibility in post to recreate these moments at various times of day and in different weather conditions.

Union also called on Clear Angle Studios to individually scan 50 extras to provide additional options for the VFX team. It was an ambitious crowd project: the team couldn’t shoot in the actual location, and the end result had to stand up at 4K in very close proximity to the camera. Union designed a Houdini-based system to handle the volume of assets and clothing in a way that let the studio art-direct the extras as individuals, allowed the director to choreograph them and delivered a believable result.

Union also conducted several motion capture shoots in-house to provide specific animation cycles that married with the occasions being recreated. This provided even more authentic-looking crowds for the post team.

Union worked on a total of 288 VFX shots, including greenscreens, set extensions, window reflections, muzzle flashes, fog and rain and a storm that included a lightning strike on the Basilica.

In addition, the team did a significant amount of de-aging work to accommodate the film’s eight-year main narrative timeline as well as a long period in Pope Francis’ younger years.

A Beautiful Day in the Neighborhood director Marielle Heller

By Iain Blair

If you are of a certain age, the red cardigan, the cozy living room and the comfy sneakers can only mean one thing — Mister Rogers! Sony Pictures’ new film, A Beautiful Day in the Neighborhood, is a story of kindness triumphing over cynicism. It stars Tom Hanks and is based on the real-life friendship between Fred Rogers and journalist Tom Junod.

Marielle Heller

In the film, jaded writer Lloyd Vogel (Matthew Rhys), whose character is loosely based on Junod, is assigned a profile of Rogers. Over the course of his assignment, he overcomes his skepticism, learning about empathy, kindness and decency from America’s most beloved neighbor.

A Beautiful Day in the Neighborhood is helmed by Marielle Heller, who most recently directed the film Can You Ever Forgive Me? and whose feature directorial debut was 2015’s The Diary of a Teenage Girl. Heller has also directed episodes of Amazon’s Transparent and Hulu’s Casual.

Behind the scenes, Heller collaborated with DP Jody Lee Lipes, production designer Jade Healy, editor Anne McCabe, ACE, and composer Nate Heller.

I recently spoke with Heller about making the film, which is generating a lot of Oscar buzz, and her workflow.

What sort of film did you set out to make?
I didn’t want to make a traditional biopic, and part of what I loved about the script was it had this larger framing device — that it’s a big episode of Mister Rogers for adults. That was very clever, but it’s also trying to show who he was deep down and what it was like to be around him, rather than just rattling off facts and checking boxes. I wanted to show Fred in action and his philosophy. He believed in authenticity and truth and listening and forgiveness, and we wanted to embody all that in the filmmaking.

It couldn’t be more timely.
Exactly, and it’s weird since it’s taken eight years to get it made.

Is it true Tom Hanks had turned this down several times before, but you got him in a headlock and persuaded him to do it?
(Laughs) The headlock part is definitely true. He had turned it down several times, but there was no director attached. He’s the type of actor who can’t imagine what a project will be until he knows who’s helming it and what their vision is.

We first met at his grandkid’s birthday party. We became friends, and when I came on board as director, the producers told me, “Tom Hanks was always our dream for playing Mister Rogers, but he’s not interested.” I said, “Well, I could just call him and send him the script.” I told Tom I wasn’t interested in doing an imitation or a sketch version, and that I wanted to get his essence right and the tone right. It would be a tightrope to walk, but if we could pull it off, I felt it would be very moving. A week later he was like, “Okay, I’ll do it.” And everyone was like, “How did you get him to finally agree?” I think they were amazed.

What did he bring to the role?
Maybe people think he just breezed into this — he’s a nice guy, Fred’s a nice guy, so it’s easy. But the truth is, Tom’s an incredibly technically gifted actor and one of the hardest-working ones I’ve ever worked with. He does a huge amount of research, and he came in completely prepared, and he loves to be directed, loves to collaborate and loves to do another take if you need it. He just loves the work.

Any surprises working with him?
I just heard that he’s actually related to Fred, and that’s another weird thing. But he truly had to transform for the role because he’s not like Fred. He had to slow everything down to a much slower pace than is normal for him and find Fred’s deliberate way of listening and his stillness and so on. It was pretty amazing considering how much coffee Tom drinks every day.

What did Matthew Rhys bring to his role?
It’s easy to forget that he’s actually the protagonist and the proxy for all the cynicism and neuroticism that many of us feel and carry around. This is what makes it so hard to buy into a Mister Rogers world and philosophy. But Matthew’s an incredibly complex, emotional person, and you always know how much he’s thinking. He’s always three steps ahead of you, he’s very smart, and he’s not afraid of his own anger and exploring it on screen. I put him through the wringer, as he had to go through a major emotional journey as Lloyd.

How important was the miniature model, which is a key part of the film?
It was a huge undertaking, but also the most fun we had on the movie. I grew up building miniatures and little cities out of clay, so figuring it all out — What’s the bigger concept behind it? How do we make it integrate seamlessly into the story? — fascinated me. We spent months figuring out all the logistics of moving between Fred’s set and home life in Pittsburgh and Lloyd’s gritty, New York environment.

While we shot in Pittsburgh, we had a team of people spend 12 weeks building the detailed models, which included the Pittsburgh and Manhattan skylines, the New Jersey suburbs and Fred’s miniature model neighborhood. I’d visit them once a week to check on progress. Our rule of thumb was that we couldn’t do anything Fred and his team couldn’t have done on the “Neighborhood”; we expanded a bit beyond Fred’s miniatures, but never outside the realm of possibility. We had very specific shots and scenes all planned out, and we got to film with the miniatures for a whole week, which was a delight. They really help bridge the gap between the two worlds — Mister Rogers’ and Lloyd’s.

I heard you shot with the same cameras the original show used. Can you talk about how you collaborated with DP Jody Lee Lipes to get the right look?
We tracked down original Ikegami HK-323 cameras, which were used to film the show, and shipped them in from England and brought them to the set in Pittsburgh. That was huge in shooting the show and making it even more authentic. We tried doing it digitally, but it didn’t feel right, and it was Jody who insisted we get the original cameras — and he was so right.

Where did you post?
We did it in New York — the editing at Light Iron, the sound at Harbor and the color at Deluxe.

Do you like the post process?
I do, as it feels like writing. There’s always a bit of a comedown from production, which is so fast-paced; post feels a bit like screeching to a halt. But the plus is you get back to the deep critical thinking needed to rewrite in the edit and to retell the story with the sound and the DI and so on.

I feel very strongly that the last 10% of post is the most important part of the whole process. It’s so tempting to just give up near the end. You’re tired, you’ve lost all objectivity, but it’s critical you keep going.

Talk about editing with Anne McCabe. What were the big editing challenges?
She wasn’t on the set. We sent dailies to her in New York, and she began assembling while we shot. We have a very close working relationship, so she’d be on the phone immediately if there were any concerns. I think finding the right tone was the biggest challenge, and making it emotionally truthful so that you can engage with it. How are you getting information and when? It’s also playing with audiences’ expectations. You have to get used to seeing Tom Hanks as Mister Rogers, so we decided it had to start really boldly and drop you in the deep end — here you go, get used to it! Editing is everything.

There are quite a few VFX. How did that work?
Obviously, there’s the really big VFX sequence when Lloyd goes into his “fever dreams” and imagines himself shrunk down on the set of the neighborhood and inside the castle. We planned that right from the start and did greenscreen — my first time ever — which I loved. And even the practical miniature sets all needed VFX to integrate them into the story. We also had seasonal stuff, period-correct stuff, cleanup and so on. Phosphene in New York did all the VFX.

Talk about the importance of sound and music.
My composer’s also my brother, and he starts very early on, so the music’s always an integral part of post and not just something added at the end. He’s writing while we shoot, and we also had a lot of live music we had to pre-record so we could film it on the day. There’s a lot of singing too, and I wanted it to sound live and not overly produced. So when Tom’s singing live, I wanted to keep that human quality, with all the little mouth sounds and any mistakes; I left all that in purposely. We never used a temp score since I don’t like editing to temp music, and we worked closely with the sound team at Harbor on integrating all of the music, the singing and the whole sound design.

How important is the DI to you?
Hugely important, and we finessed a lot with colorist Sam Daley. When you’re doing a period piece, color is so crucial; it has to feel authentic to that world. Jody and Sam have worked together for a long time, and they worked very hard on the LUT before we began. Every department was aware of the color palette and how we wanted it to look and feel.

What’s next?
I just started a new company called Defiant By Nature, where I’ll be developing and producing TV projects by other people. As for movies, I’m taking a little break.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.