

Words of wisdom from editor Jesse Averna, ACE

We are all living in a world we’ve never had to navigate before. People’s jobs are in flux, others are working from home, and anxiety is a regular part of our lives. Through all the chaos, Jesse Averna has been a calming voice on social media, so postPerspective reached out to ask him to address our readership directly.

Jesse, who was co-founder of the popular Twitter chat and Facebook group @PostChat, currently works at Disney Animation Studios and is a member of the American Cinema Editors.


Hey,

How are you doing? This isn’t an ad. I’m not going to sell you anything or try to convince you of anything. I just want to take the opportunity to check in. Like many of you, I’m a post professional (an editor), currently working from home. If we don’t look out for each other, who will? Please know that it’s okay to not be okay right now. I have to be honest, I’m exhausted. I’m just endlessly reading news and searching for new news and reading posts about news I’ve already read and searching again for news I may have missed…

I want to remind you of a couple things that I think may bring some peace, if you let me. I fear it’s about to get much darker and much scarier, so we need to anchor ourselves to some hope.

You are valuable. The world is literally different because you are here. You have intrinsic value and that will never change. No matter what. You are thought about and loved, despite whatever the voice in your head says. I’m sure your first reaction to reading that is to blow it off, but try and own it. Even for just a moment. It’s true.

You don’t deserve what’s going on, but let it bring some peace that the whole world is going through it together. You may be isolated, but not alone. We are forced to look out for one another by looking out for ourselves. It’s interesting, I feel so separate and vulnerable, but the truth is that the whole planet is feeling and reacting to this as one. We are in sync, whether we know it or not — and that’s encouraging to me. We ALL want to be well and be safe, and we want our neighbors to also be well. We have a rare moment of feeling like a people, like a planet.

If you are feeling anxious, do me a favor tonight. Go outside and look at the stars. Set a timer for five minutes. No entertainment or phone or anything else. Just five minutes. Reset. Feel yourself on a cosmic scale. Small. A blink of an eye. But so, so valuable.

And please give yourself a break. A sanity check. If you need help, please reach out. If you need to nest, do it. You need to tune out, do it. Take care of yourself. This is an unprecedented moment. It’s okay to not be okay. Once you can, though, see who you can help. This complete shift of reality has made me think about legacy. This is a unique legacy-building moment. That student who reached out to you on LinkedIn asking for advice? You now have time to reply. That non-profit you thought about volunteering your talents to? Now’s your chance. Even just to make the connection. Who can you help? Check in on? You don’t need any excuse in our current state to reach out.

I know I’m just some rando you’re reading on the internet, but I believe you are going to make it through this. You are wonderful. Do everything you can to be safe. The world needs you. It’s a better place because you are here. You know things, have ideas to share, and will make things that none of the rest of us do or have.

Hang in there, my friends, and let me know if you have any thoughts, encouragements or tips for staying sane during this time. I’ll try to compile them into another article to share.

Jesse
@dr0id


Jesse Averna — pictured on his way to donate masks — is a five-time Emmy-winning ACE editor living in LA and currently working in the animation feature world.

Finishing artist Tim Nagle discusses work on indie film Miss Juneteenth

Lucky Post Flame artist Tim Nagle has a long list of projects under his belt, including collaborations with David Lowery — providing Flame work on the short film Pioneer as well as finishing and VFX work on Lowery’s motion picture A Ghost Story. He is equally at home working on spots, such as campaigns for AT&T, Hershey’s, The Home Depot, Jeep, McDonald’s and Ram.

Nagle began his formal career on the audio side of the business, working as an engineer for Solid State Logic, where he collaborated with clients including Fox, Warner Bros., Skywalker, EA Games and ABC.

Tim Nagle

We reached out to Nagle about his and Lucky Post’s work on the feature film Miss Juneteenth, which premiered at Sundance and was recently honored by SXSW 2020 as the winner of the Louis Black Lone Star award.

Miss Juneteenth was directed (and written) by Channing Godfrey Peoples — her first feature-length film. It focuses on a woman from the south — a bona fide beauty queen once crowned Miss Juneteenth, a title commemorating the day slavery was abolished in Texas. The film follows her journey as she tries to hold onto her elegance while striving to survive. She looks for ways to thrive despite her own shortcomings as she marches, step by step, toward self-realization.

How did the film come to you?
We have an ongoing relationship with Sailor Bear, the film’s producing team of David Lowery, Toby Halbrooks and James Johnston. We’ve collaborated with them on multiple projects, including The Old Man & The Gun, directed by Lowery.

What were you tasked to do?
We were asked to provide dailies transcoding, additional editorial, VFX, color and finishing and ultimately delivery to distribution.

How often did you talk to director Channing Godfrey Peoples?
Channing was in the studio, working side by side with our creatives, including colorist Neil Anderson and me, to get the project completed for the Sundance deadline. It was a massive team effort, and we felt privileged to help Channing with her debut feature.

Without spoilers, what most inspires you about the film?
There’s so much to appreciate in the film — it’s a love letter to Texas, for one. It’s directed by a woman, has a single mother at its center and is a celebration of black culture. The LA Times called it one of the best films to come out of Sundance 2020.

Once you knew the film was premiering at Sundance, what was left to complete and in what amount of time?
This was by far the tightest turnaround we have ever experienced. Everything came down to the wire, sound being the last element. It’s one of the advantages of having a variety of talent and services under one roof — the creative collaboration was immediate, intense and really made possible by our shorthand and proximity.

How important do you think it is for post houses to be diversified in terms of the work they do?
I think diversification is important not only for business purposes but also to keep the artists creatively inspired. Lucky Post’s ongoing commitment to support independent film, both financially and creatively, is an integrated part of our business along with brand-supported work and advertising. Increasingly, as you see greater crossover of these worlds, it just seems like a natural evolution for the business to have fewer silos.

What does it mean to you as a company to have work at Sundance? What kinds of impact do you see — business, morale and otherwise?
Having a project that we put our hands on accepted into Sundance was such an honor. It is unclear what the direct business impacts might be, but the value for morale is immediately clear. The excitement and inspiration we all get from projects like this just naturally makes how we do business better.

What software and hardware did you use?
On this project we started with Assimilate Scratch for dailies creation. Editorial was done in Adobe Premiere. Color was Blackmagic DaVinci Resolve, and finishing was done in Autodesk Flame.

What is a piece of advice that you’d give to filmmakers when considering the post phase of their films?
We love being involved as early as possible — certainly not to get in anyone’s way, but to be in the background supporting the director’s creative vision. I’d say get with a post company that can assist in setting looks and establishing a workflow. With a little bit of foresight, this will create the efficiency you need to deliver in what always ends up being a tight deadline with the utmost quality.


Workstations: Offline Editing Workflows

By Karen Moltenbrey

When selecting a workstation, post facilities differ in their opinions about what’s most important, depending on the function the workstations will serve. It goes without saying that everyone wants value. And for some, power is paramount. For others, speed is a top priority. And for others still, reliability reigns supreme. Luckily for users, today’s workstations can check all those boxes.

As Eric Mittan, director of technology at New York’s Jigsaw Productions, is quick to point out, it’s hard to fathom the kinds of upgrades in power we’ve seen in workstations just in the time he has been working with them professionally. He recalls that in 2004, it took an overnight encoding session to author a standard-definition DVD with just one hour of video — and that task was performed on one of the first dual-processor desktops available to the regular consumer. “Nowadays, that kind of video transcode can take 15 minutes on a ‘light’ laptop, to say nothing of the fact that physical media like the DVD has gone the way of the dinosaur,” he says.

Eric Mittan

That is just the tip of the iceberg in terms of the revolution that workstations have undergone in a very short period. Here, we examine the types of workstations that a pair of studios are using for their editing tasks. Jigsaw, a production company, does a large portion of its own post on Apple iMacs running Avid Media Composer; it is also a client of post houses for work such as color and final deliverables. Meanwhile, another company, Final Cut, is also a Mac-based operation, running Avid Media Composer and Adobe Premiere Pro, although the company’s Flames run on HP workstations.

[Editor’s Note: These interviews were done prior to the coronavirus lockdown.]

Jigsaw Productions
Jigsaw Productions is a documentary television and film company that was founded in 1978 by documentary filmmaker Alex Gibney. It has since transitioned from making one movie at a time to simultaneously producing multiple features and series for a number of networks and distribution partners.

Today, Jigsaw does production and offline editorial for all its own films and series. “Our commitment is to filmmakers bringing real stories to their audience,” Mittan says. Jigsaw’s film and episodic projects include the political (Client 9: The Rise and Fall of Eliot Spitzer), the musical (History of the Eagles) and the athletic (The Armstrong Lie).

On the technical front, Jigsaw does all the creative editorial in house using Avid’s Media Composer. After Jigsaw’s producers and directors are satisfied with the storytelling, the lion’s share of the more technical work is left to the company’s partners at various post houses, such as Harbor, Technicolor, Light Iron and Final Frame, among others. Those facilities do the color timing and DCP generation in the case of the feature titles. Most of the conform and online work for Jigsaw’s TV series is now done in house and then sent out for color.

“I wouldn’t say for sure that we have mastered the Avid-to-Resolve online workflow, but we have become better at it with each project,” says Mittan. It’s Mittan’s job to support post and offline operations along with the needs of the others in the office. The backbone of the post fleet comprises 26 27-inch i7 iMacs (2018 models) with 32GB of RAM. During 2018 and 2019, Jigsaw experienced a period of rapid growth, adding 19 new edit suites. (That was in addition to the original 13 built out before Mittan came aboard in 2017.) There are also some earlier iMac models that are used for lighter tasks, such as screening, occasional transcoding and data transfers, as well as eight Mac mini screening stations and five Mac Pro cylinders for heavy transcoding and conform/online tasks. Another 10 or so 2019 iMacs round out the hardware, though those were purchased with i5 processors rather than i7s.

“Jigsaw’s rapid expansion pushed us to buy new machines in addition to replacing a significant portion of our 2012/2013 model Mac Pro and iMac units that had comprised most of our workstations prior to my arrival,” Mittan notes. Each project group at the company is responsible for its own data management and transcoding its own dailies.

Furthermore, Jigsaw has an Avid Nexis shared storage system. “Our editors need to be able to run the latest version of Avid and must maintain and play back multiple streams of DNxHR SQ via a 1Gb connection to our Nexis shared storage. While documentary work tends to be lower resolution and/or lower bandwidth than narrative scripted work, every one of our editors deserves to be able to craft a story with as few technical hiccups as possible,” says Mittan. “Those same workstations frequently need to handle heavy transcodes from interview shoots and research archive gathered each day by production teams.”
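To put those numbers in perspective, here is some back-of-the-envelope math (my own estimates, not figures from Jigsaw): DNxHR SQ at 1080p runs on the order of 145Mbps, which leaves a 1Gb client link with headroom for roughly five simultaneous streams.

```python
STREAM_MBPS = 145          # rough DNxHR SQ bitrate at 1080p (estimate)
USABLE_MBPS = 1000 * 0.8   # ~80% of a 1Gb link after protocol overhead

streams = int(USABLE_MBPS // STREAM_MBPS)
print(f"~{streams} simultaneous DNxHR SQ streams per editor")  # -> ~5
```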

When buying new equipment, Mittan looks to strike a balance between economy and sustainability. While the work at Jigsaw does not always require the latest and greatest of all possible end-user technology, he says, each purchase needs to be made with an eye toward how useful it will remain three to five years into the future.

Salt, Fat, Acid, Heat

While expansion in the past few years resulted in the need for additional purchases, Mittan is hoping to get Jigsaw on a regular schedule of cycling through each of the units over a period of five to six years. Optimally, the edit suite units are used for three or more years before being downgraded to lighter tasks and eventually used as screening stations for Jigsaw’s producers. Even beyond that, the post machines could see life in years six to eight as office workstations for some of the non-post staff and interns. Although Mittan has yet to get his hands on one of the new Mac Pro towers, he is impressed by what he has read and hopes for an acquisition in 2021 to replace the Mac Pro cylinders for online and conform work.

Post at Jigsaw runs Avid Media Composer on the Apple machines. They also use the Adobe Creative Cloud suite for motion graphics within Adobe After Effects and Photoshop. Mittan has also implemented a number of open-source software tools to supplement Jigsaw’s tool kit for assistant editors. That includes command-line tools (like FFmpeg) for video and audio transcodes and Rsync for managed file transfers and verification.

“I’ve even begun to write a handful of custom software scripts that have made short work of tasks common to documentary filmmaking — mostly the kind of common video transcoding jobs that would usually require a paid title but that can be taken care of just as well with free software,” he says.
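He doesn’t share the scripts themselves, but a minimal sketch of that kind of tool, a Python wrapper that batch-transcodes a folder of camera files through FFmpeg, might look like this (the folder names and H.264 settings are illustrative assumptions, not Jigsaw’s actual pipeline):

```python
#!/usr/bin/env python3
"""Batch-transcode camera files to small H.264 review copies with FFmpeg.

A sketch in the spirit of the scripts Mittan describes; paths and
encoder settings are illustrative, not Jigsaw's actual pipeline.
Assumes ffmpeg is installed and on the PATH.
"""
import pathlib
import subprocess

SRC = pathlib.Path("dailies_in")
DST = pathlib.Path("dailies_out")
DST.mkdir(exist_ok=True)

for clip in sorted(SRC.glob("*.mov")):
    out = DST / (clip.stem + "_h264.mp4")
    subprocess.run([
        "ffmpeg", "-y", "-i", str(clip),
        "-c:v", "libx264", "-crf", "20", "-preset", "medium",  # quality-based H.264
        "-c:a", "aac", "-b:a", "192k",                         # AAC audio
        str(out),
    ], check=True)
    print(f"done: {out.name}")
```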

Additionally, Jigsaw makes frequent use of servers, either dedicated to a specific task or used for automation.

Jigsaw has done projects for HBO (Robin Williams Come Into My Mind), Showtime (Enemies: The President, Justice & the FBI), Discovery Channel (Why We Hate), A&E (The Clinton Affair) and more, as well as for Netflix (Salt Fat Acid Heat, The Family) — work Mittan describes as an exercise in managing more and more pixels.

The Family

Indeed, documentaries can present big challenges when it comes to dealing with a plethora of media formats. “Documentary work can frequently deal with subjects that have already had a significant media footprint in legacy resolutions. This means that if you’re trying to build a documentary in 4K, you’re going to be dealing with archival footage that is usually HD or SD. You may shoot a handful of new interviews in your new, so-called ‘native’ footage but be overwhelmed by hours upon hours of footage from a VHS collection, or stories that have been downloaded from the website of a TV station in the Midwest,” he adds.

“Working with mixed resolutions means you have to have the capability of running and gunning with your new 4K footage, but the lower resolutions can’t leave your creative editors feeling as though they’ve been left with remnants from another time in history. Blending all of those elements together in a way that tells a cohesive story requires technology that can bring together all of those pieces (and newly generated elements like graphics and reenactments) into a unified piece of media without letting your viewing audience feel the whiplash of frequent resolution changes.”
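One concrete, mechanical piece of that blending job is fitting SD or HD archival material into a UHD frame without distorting it. A minimal FFmpeg sketch (my example, not a recipe from Jigsaw) scales the clip to fit and pillarboxes or letterboxes the remainder:

```python
import subprocess

# Fit an HD/SD archival clip inside a 3840x2160 frame, preserving its
# aspect ratio and padding the rest with black. File names are
# placeholders; assumes ffmpeg is installed.
subprocess.run([
    "ffmpeg", "-y", "-i", "archival_hd.mov",
    "-vf", "scale=3840:2160:force_original_aspect_ratio=decrease,"
           "pad=3840:2160:(ow-iw)/2:(oh-ih)/2",
    "-c:v", "libx264", "-crf", "18",
    "uhd_conform.mp4",
], check=True)
```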

Miky Wolf

Final Cut
Celebrating its 25th anniversary this year, Final Cut was founded in London by editor Rick Russell. It expanded to New York 20 years ago and to Los Angeles 15 years ago. Across all three offices and several subsidiaries — Significant Others VFX, Machine Sound and The Lofts — Final Cut has more than 100 staff and artists worldwide, offering offline editing, online editing, VFX, graphics, finishing, sound design, mixing and original composition, as well as “dry-hire” facilities for long-form content such as original Netflix series like Sex Education.

Primarily, Final Cut does offline creative editorial. Through Significant Others, it offers online editing and finishing, although, as editor Miky Wolf notes, there are smaller jobs — such as music videos and digital work — for which the facility “does it all.”

Ryan Johnson

The same can be said of technical supervisor Ryan Johnson, whose job it is to design, implement and maintain the technical infrastructure for Final Cut’s New York and Los Angeles offices. This includes the workstations, software, data storage, backups, networking and security. “The best workstations should be like the best edited films. Something you don’t notice. If you are aware of the workstation while you’re working, it’s typically not a good thing,” Wolf says.

Johnson agrees. “Really, the workstation is just there to facilitate the work. It should be invisible. In fact, ours are mostly hidden under desks and are rarely seen. Mostly, it’s a purpose-built machine, designed less for aesthetics and portability than for reliability and practicality.”

A typical Final Cut edit room runs off a Mac Pro with 32GB of RAM; there are two editing monitors, a preview monitor on the desk and a client monitor. The majority of the company’s edit workstations are six-core 2013 Mac Pro “trash cans” with AMD FirePro D500 GPUs and 32GB of RAM; approximately 16 of these workstations are spread between the NY and LA offices. The workstations use little to no local storage, since the work resides on Avid Nexis servers. Each workstation is connected to a pair of 24-inch LCD displays, while video and audio from the edit software are delivered via Blackmagic Design hardware to an LCD preview monitor on the editor’s desk and to an OLED TV for clients.

The assistant editors all work on 27-inch iMacs of various vintages, mainly 2017 i7 models with 32GB of RAM. For on-set/off-site work, Final Cut keeps a fleet of MacBook Pros, mostly the 2015 Thunderbolt 2 pre-Touch Bar models. These travel with USB 3 SSDs for media storage. Final Cut’s Flames, however, all run on dual 12-core HP Z8s with 128GB of RAM. These machines use local SSD arrays for media storage.

According to Johnson, the workstations (running macOS 10.14.6) mostly are equipped with Avid Media Composer or Adobe Premiere Pro, and the editors sometimes “dabble” in Blackmagic’s DaVinci Resolve (for transcoding or when someone wants to try their hand at editing on it). “We primarily work with compressed proxy footage — typically DNxHD 115 or ProRes LT — at 1080p, so bandwidth requirements aren’t too high. Even lower-spec machines handle a few streams well,” he says. “Sequences that involve many layers or complicated effects will often require rendering, but the machines are fast enough that wait times aren’t too long.”

The editors also use Soundminer’s products for their sound effects library. The assistants perform basic compositing in Adobe After Effects, which the machines handle well, Johnson adds. “However, occasionally they will need to transcode raw/camera original footage to our preferred codec for editing. This is probably the most computationally intensive task for any of the machines, and we’ll try to use newer, faster models for this purpose.”
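For a rough idea of what that transcode step involves, here is a hedged FFmpeg equivalent of a camera-original-to-DNxHD 115 proxy (DNxHD 115 implies 1080p at 23.976fps, 8-bit 4:2:2; the file names are placeholders, and Final Cut presumably runs this through Media Composer or Media Encoder rather than a script):

```python
import subprocess

# Camera original to a DNxHD 115 editing proxy in a MOV wrapper:
# 1920x1080 at 23.976fps, 8-bit 4:2:2 video with uncompressed PCM audio.
# A sketch only; file names are placeholders, and ffmpeg must be installed.
subprocess.run([
    "ffmpeg", "-y", "-i", "camera_original.mov",
    "-c:v", "dnxhd", "-b:v", "115M",          # DNxHD 115 profile bitrate
    "-s", "1920x1080", "-r", "24000/1001",    # size/rate the profile expects
    "-pix_fmt", "yuv422p",
    "-c:a", "pcm_s16le",
    "proxy_dnxhd115.mov",
], check=True)
```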

Stray Dolls feature film

Wherever possible, Final Cut deploys the same types of workstations across all its locations, as maintenance becomes easier when parts are interchangeable, and software compatibility is easier to manage when dealing with a homogeneous collection of machines. Not to mention the political benefit: Everybody gets the same machine, so there’s no workstation envy, so to speak.

Reliability and expandability are the most important factors Johnson considers in a workstation. He acknowledges that the 2013 Mac Pros were a disappointment on both counts: “They had thermal issues from the start — Apple admitted as much — that resulted in unpredictable behavior, and you were stuck with whichever 2013-era GPU you chose when purchasing the machine,” he says. “We expect to get many trouble-free years out of the workstations we buy. They should be easy to fix, maintain and upgrade.”

When selecting workstations for Final Cut, a Macintosh shop, there is not a great deal of choice. “Our choices are quickly narrowed down to whatever Apple happens to be selling,” explains Johnson. “Given the performance tiers of the models available, it is a matter of analyzing our performance needs versus our budget. In an ideal world, the entire staff would be working on the fastest possible machine with the most RAM and so forth, but alas, that is not always in the budget. Therefore, compromise must be found in selecting machines that can capably handle the typical workload and are fast enough not to keep editors and assistants waiting too long for renders.”

The most recent purchase was the new iMacs for the assistants in LA. “For the money, they are great machines, and I’ve found them to be reliable even when pushing them through all-night renders, transcodes, etc. They’re at least as fast as the Mac Pros and, in most applications, even faster,” Johnson points out. He expects to replace the 2013 Mac Pros this year.

Florence and the Machine “Big God” music video

Wolf notes that he must be able to work as efficiently at home as he does at the office, “and that’s one nice thing about the evolution of offline editing. A combination of robust laptops and portable SSDs has allowed us to take the work anywhere.”

Using the above-described setup, Final Cut recently finished a campaign for an advertising client in which the edit started on set in LA, continued in the hotel room and then finished back in NY. “We needed to be able to work remotely, even on the plane home, just to get the first cuts done in time,” Wolf explains. “Agencies expect you to be fast. They schedule presentations assuming we can work around the clock to get stuff together — we need systems that can support us.”

Johnson highlighted another recent project with a tight schedule that involved cutting a multi-camera sequence in UHD from portable SSD storage on a standard iMac. “This would have been impossible just a few years ago,” he adds.

Main Image: Netflix’s Sex Education


Karen Moltenbrey is a veteran writer, covering visual effects and post production.


My Top Five Ergonomic Workstation Accessories

By Brady Betzel

Instead of writing up my normal “Top Five Workstation Accessories” column this year, I wanted to take a slightly different route and focus on products that might lessen pain and maybe even improve your creative workflow — whether you are working at a studio or, more likely these days, working from home.

As an editor, I sit in a chair for most of my day, and that is on top of my three- to four-hour round-trip commute to work. As aches and pains build up (I’m 36, and I’m sure it doesn’t just get better), I had to start looking for solutions to alleviate the pain I can see coming in the future. In the past I have mentioned products like the Wacom Intuos Pro pen tablet, which is great and helped me lessen wrist pain, and color correction panels such as the Loupedeck, which help creative workflows while also keeping you from relying solely on the mouse, further easing wrist pain.

This year I wanted to look at how the actual setup of a workstation environment might prevent or alleviate pain. So get out of your seat and move around a little, take a walk around the block, and when you get back, maybe rethink how your workstation environment could become more conducive to a creativity-inspiring flow.

Autonomous SmartDesk 2 
One of the most useful things in my search for flexibility in the edit bay is the standup desk. Originally, I went to Ikea and found a clearance tabletop in the “dents” section and then found a kitchen island stand that was standing height. It has worked great for over 10 years; the only issue is that it isn’t easily adjustable, and sometimes I need to sit to really get my editing “flow” going.

Many companies offer standing desk solutions, including manual options like the classic VariDesk desk riser. If you have been in the offline editing game over the past five to 10 years, then you have definitely seen these come and go. But at almost $400, you might as well look for a robotic standing desk. This is where the Autonomous SmartDesk 2 comes into play. Depending on whether you want the Home Office version, which stands between 29.5 inches and 48 inches, or the Business Office version, which stands between 26 inches and 52 inches, you are looking to spend $379 or $479, respectively (with free shipping included).

The SmartDesk 2 desktop itself is made of MDF (medium-density fiberboard), which helps to lower the overall cost but is still sturdy and will hold up to 300 pounds. The desktop comes in multiple colors, from black to white oak, so it can even be a conversation piece in the edit bay. I have the Business version in black along with a matching black chair, and I love that it looks clean and modern. The SmartDesk 2 is operated using a front-facing switch plate complete with up, down and four height-level presets. It operates smoothly and, to be honest, impressively. It gives a touch of class to any environment. Setup took about half an hour, and it came with easy-to-follow instructions, screws/washers and tools.

Keep an eye out for my full review of the Autonomous SmartDesk 2 and ErgoChair 2, but for now think about how a standup desk will at least alleviate some of the sitting you do all day while adding some class and conversation to the edit bay.

Autonomous ErgoChair 2 
Along with a standup desk — and, in my opinion, more important — is a good chair. Most offline editors and assistant editors work at a company that either values their posture and buys Herman Miller Aeron chairs, or cheaps out and buys the $49 special at Office Depot. I never quite understood the benefit of saving a few bucks on a chair, especially if a company pays for health insurance — because in the end, they will be paying for it. Not everyone likes or can afford the $1,395 Aeron chairs, but there are options that don’t involve ruining your posture.

Along with the Autonomous SmartDesk 2, you should consider buying the ErgoChair 2, which costs $349 — a similar price to other chairs, like the Secretlab Omega series gaming chair that retails for $359. But the ErgoChair 2 has the best of both worlds: an Aeron chair-feeling mesh back and neck support plus a super-comfortable seat cushion with all the adjustments you could want. Even though I have only had the Autonomous products for a few weeks now, I can already feel the difference when working at home. It seems like a small issue in the grand scheme of things, but being comfortable allows my creativity to flow. The chair took under 30 minutes to build and came with easy-to-follow instructions and good tools, just like the SmartDesk 2.

A Footrest
When I first started in the industry, as soon as I began a freelance job, I would look for an old Sony IMX tape packing box. (Yes, the green tapes. Yes, I worked with tape. And yes, I can operate an MSW-2000 tape deck.) Typically, the boxes would be full of tapes because companies bought hundreds and never used them, and they made great footrests! I would line up a couple boxes under my feet, and it made a huge difference for me. Having a footrest relieves lower back pressure that I find hard to relieve any other way.

As I continue my career into my senior years, I finally discovered that there are actual footstools! Not just old boxes. One of my favorites is on Amazon. It is technically an adjustable nursing footstool but works great for use under a desk. And if you have a baby on the way, it’s a two-for-one deal. Either way, check out the “My Brest Friend” on Amazon. It goes for about $25 with free one-day Amazon Prime shipping. Or if you are a woodworker, you might be able to make your own.

GoFit Muscle Hook 
After sitting in an edit bay for multiple hours, multiple days in a row, I really like to stretch and use a massager to un-stuff my back. One of the best massagers I have seen in multiple edit bays is called the GoFit Muscle Hook.

Luckily for us it’s available at almost any Target or on the Target website for about $25. It’s an alien-looking device that can dig deep into your shoulder blades, neck and back. You can use it a few different ways — large hook for middle-of-the-back issues, smaller hook that I like to use on the neck and upper back, and the neck massage on the bar (that one feels a little weird to me).

There are other massage devices similar to the Muscle Hook, but in my opinion the GoFit Muscle Hook is the best. The plastic composite seems indestructible and almost feels like it could double as a self-defense tool. But it can work out almost any knots you have worked up after a long day. If you don’t buy anything else for self-care, buy the Muscle Hook. You will be glad you did. Anyone who gets one has that look of pain and relief when they use it for the first time.

Foam Roller
Another item that I just started using was a foam roller. You can find them anywhere for the most part, but I found one on Amazon for $13.95 plus free Amazon Prime one-day shipping. It’s also available on the manufacturer’s website for about $23. Simply, it’s a high-density foam cylinder that you roll on top of. It sounds a little silly, but once you get one, you will really wonder how you lived without one. I purchased an 18-inch version, but they range from 12 inches to 36 inches. And if you have three young sons at home, they can double as fat lightsabers (but they hurt, so keep an eye out).

Summing Up
In the end, there are so many ways to keep a flexible editing lifestyle, from kettlebells to standup desks. I’ve found that just getting over the mental hurdle of not wanting to move is the biggest catalyst. There are so many great tech accessories for workstations, but we hardly mention ones that can keep our bodies moving and our creativity flowing. Hopefully, some of these ergonomic accessories for your workstation will spark an idea to move around and get your blood flowing.

For some workout inspiration, Onnit has some great free workouts featuring weird stuff like maces, steel clubs and sandbags, but also kettlebells. The site also has nutritional advice. For foam roller stretches, I would check out the same Onnit Academy site.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.


Review: Digital Anarchy’s Transcriptive 2.0

By Barry Goch

Not long ago, I had the opportunity to go behind the scenes at Warner Bros. to cover the UHD HDR remastering of The Wizard of Oz. I had recorded audio of the entire experience so I could get accurate quotes from all involved — about an hour of audio. I then uploaded the audio file to Rev.com and waited. And waited. And waited. A few days later they came back and said they couldn’t do it. I was perplexed! I checked the audio file, and I could clearly hear the voices of the different speakers, but they couldn’t make it work.

That’s when my editor, Randi Altman, suggested Digital Anarchy’s Transcriptive, and it saved the day. What is Transcriptive? It’s an automated, intelligent transcription plugin for Adobe Premiere editors, designed to transcribe video automatically and accurately using multiple speech and natural language processing engines.

Well, not only did Transcriptive work, it worked super-fast, and it’s affordable and simple to use … once everything is set up. I spent a lot of time watching Transcriptive’s YouTube videos and then had to create two accounts for the two different AI transcription portals that they use. After a couple of hours of figuring and setup, I was finally good to go.

Digital Anarchy has lots of videos on YouTube about setting up the program. Here is a link to the overview video and a link to the 2.0 new features. After getting everything set up, it took less than five minutes from start to finish to transcribe a one-minute video. That includes the coolest part: automatically linking the transcript to the video clip with word-for-word accuracy.

Transcriptive extension

Step by Step
Import your video clips into Premiere, select them, and open the Transcriptive extension.

Tell Transcriptive if you want to use an existing transcript or create a new transcription.

Then choose the AI that you want to transcribe your clip. You see the cost upfront, so no surprises.

Launch app

I picked the Speechmatics AI:

Choosing AI

Once you press continue, Media Encoder launches.

Media Encoder making FLAC file automatically.

And Media Encoder automatically makes a FLAC file and uploads it to the transcription engine you picked.

One minute later, no joke, I had a finished transcription linked word-accurately to my source video clip.
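Under the hood, the pipeline is straightforward: extract compressed audio, send it to a speech engine, and get back word-level timestamps. Here is a rough sketch of the same flow outside Premiere; the FLAC extraction is standard FFmpeg, while the transcription call is a deliberately hypothetical stand-in, since each engine (Speechmatics included) has its own API:

```python
import subprocess

def extract_flac(video_path: str, flac_path: str) -> None:
    """The step Media Encoder performs for Transcriptive: audio-only FLAC."""
    subprocess.run([
        "ffmpeg", "-y", "-i", video_path,
        "-vn", "-c:a", "flac", flac_path,
    ], check=True)

def transcribe(flac_path: str):
    """Hypothetical stand-in for uploading to a speech engine.

    A real implementation would POST the file to Speechmatics (or another
    engine) and return a list of words with start/end timestamps.
    """
    raise NotImplementedError("call your chosen engine's API here")

extract_flac("interview.mov", "interview.flac")  # file names are placeholders
# words = transcribe("interview.flac")
# The word-level timestamps are what let Transcriptive link the
# transcript to the source clip word for word.
```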

Final Thoughts
The only downside to this is that the transcription isn’t 100% accurate. For example, it heard Lake Tahoe as “Lake Thomas” and my son’s name, Oliver, as “over.”

Final transcription

This lack of accuracy is not a deal breaker for me, especially since I would have been totally out of luck without it on The Wizard of Oz article, which you can read here. For me, the speed and ease of use more than compensate for the lack of accuracy. And, as AIs get better, the accuracy will only improve.

And on February 27, Digital Anarchy released Transcriptive V.2.0.3, which is compatible with Adobe Premiere v14.0.2. The update also includes a new prepaid option that can lower the cost of transcription to $2.40 per hour of footage. Transcriptive’s tight integration with Premiere makes it a must-have for working with transcripts when cutting long- and short-form projects.


Barry Goch is a finishing artist at LA’s The Foundation as well as a UCLA Extension instructor in post production. You can follow him on Twitter at @Gochya.


Goldcrest Post’s Jay Tilin has passed away

Jay Tilin, head of production at New York’s Goldcrest Post, passed away last month after a long illness. For 40 years, Tilin worked in the industry as an editor, visual effects artist and executive. His many notable credits include the Netflix series Marco Polo and the HBO series Treme and True Detective.

“Jay was an integral part of New York’s post production community and one of the top conform artists in the world,” said Goldcrest Post managing director Domenic Rom. “He was beloved by our staff and clients as an admired colleague and valued friend. We offer our heartfelt condolences to his family and all who knew him.”

Tilin began his career in 1980 as an editor with Devlin Productions. He also spent many years at The Tape House, Technicolor, Riot and Deluxe, all in New York. He was an early adopter of many now-standard post technologies, from the advent of HD video in the 1990s through more recent implementations of 4K and HDR finishing.

His credits also include the HBO series Boardwalk Empire, the Sundance Channel series Hap and Leonard, the PBS documentary The National Parks and the Merchant Ivory feature City of Your Final Destination. He also contributed to numerous commercials and broadcast promos. A native New Yorker, Tilin earned a degree in broadcasting from SUNY Oswego.

Tilin is survived by his wife Betsy, his children Kelsey and Sam, his mother Sonya and his sister Felice (Trudy).


Editor Anthony Marinelli joins Northern Lights

Editor Anthony Marinelli has joined post studio Northern Lights. Marinelli’s experience spans commercial, brand content, film and social projects. He comes to Northern Lights from a four-year stint at TwoPointO, where he was also a partner. He has previously worked at Kind Editorial, Alkemy X, Red Car, Cut+Run and Crew Cuts.

Marinelli’s work includes projects for Mercedes, FedEx, BMW, Visa, Pepsi, Scotts, Mount Sinai and Verizon. He also edited the Webby Award-winning documentary “Alicia in Africa,” featuring Alicia Keys for Keep a Child Alive.

Marinelli is also active in independent theater and film. He has written and directed many plays and short films, including Acoustic Space, which won Best Short at the 2018 Ridgewood Guild Film Festival and Best Short Screenplay at the Richmond International Film Festival.

Marinelli’s most recent campaigns are for Mount Sinai and Bernie & Phyl’s for DeVito Verdi.

He works on Avid Media Composer and Adobe Premiere. You can watch his reel here.


Blackmagic releases Resolve 16.2, beefs up audio post tools

Blackmagic has updated its color, edit, VFX and audio post tool to Resolve 16.2. This new version features major Fairlight updates for audio post as well as many improvements for color correction, editing and more.

This new version has major new updates for editing in the Fairlight audio timeline when using a mouse and keyboard. This is because the new edit selection mode unlocks functionality previously only available via the audio editor on the full Fairlight console, so editing is much faster than before. In addition, the edit selection mode makes adding fades and cuts and even moving clips only a mouse click away. New scalable waveforms let users zoom in without adjusting the volume. Bouncing lets customers render a clip with custom sound effects directly from the Fairlight timeline.

Adding multiple clips is also easier, as users can now add them to the timeline vertically, not just horizontally, making it simpler to add multiple tracks of audio at once. Multichannel tracks can now be converted into linked groups directly in the timeline, so users no longer have to change clips manually and reimport. There’s added support for frame boundary editing, which improves file export compatibility for film and broadcast deliveries and adds precision, so users can easily trim to frame boundaries without having to zoom all the way in on the timeline. The new version supports modifier keys so that clips can be duplicated directly in the timeline using the keyboard and mouse. Users can also copy clips across multiple timelines with ease.

Resolve 16.2 also includes support for the Blackmagic Fairlight Sound Library with new support for metadata based searches, so customers don’t need to know the filename to find a sound effect. Search results also display both the file name and description, so finding the perfect sound effect is faster and easier than before.

MPEG-H 3D immersive surround sound audio bussing and monitoring workflows are now supported. Additionally, improved pan and balance behavior includes the ability to constrain panning.

Fairlight audio editing also has index improvements. The edit index is now available in the Fairlight page and works as it does in the other pages, displaying a list of all media used; users simply click on a clip to navigate directly to its location in the timeline. The track index now supports drag selections for mute, solo, record enable and lock, as well as visibility controls, so editors can quickly swipe through a stack of tracks without having to click on each one individually. Audio tracks can also be rearranged by clicking and dragging a single track or a group of tracks in the track index.

This new release also includes improvements in AAF import and export. AAF support has been refined so that AAF sequences can be imported directly to the timeline in use. Additionally, if the project features a different time scale, the AAF data can also be imported with an offset value to match. AAF files that contain multiple channels will also be recognized as linked groups automatically. The AAF export has been updated and now supports industry-standard broadcast wave files. Audio cross-fades and fade handles are now added to the AAF files exported from Fairlight and will be recognized in other applications.

For traditional Fairlight users, this new update makes major improvements to importing legacy Fairlight projects, including faster opening of projects with over 1,000 media files.

Audio mixing is also improved. A new EQ curve preset for clip EQ in the inspector allows removal of troublesome frequencies. New FairlightFX filters include a new meter plug-in that adds a floating meter for any track or bus, so users can keep an eye on levels even if the monitoring panel or mixer are closed. There’s also a new LFE filter designed to smoothly roll off the higher frequencies when mixing low-frequency effects in surround.

Working with immersive sound workflows using the Fairlight audio editor has been updated and now includes dedicated controls for panning up and down. Additionally, clip EQ can now be altered in the inspector on the editor panel. Copy and paste functions have been updated, and now all attributes — including EQ, automation and clip gain — are copied. Sound engineers can set up their preferred workflow, including creating and applying their own presets for clip EQ. Plug-in parameters can also be customized or added so that users have fast access to their preferred tool set.

Clip levels can now be changed relatively, allowing users to adjust the overall gain while respecting existing adjustments. Clip levels can also be reset to unity, easily removing any level adjustments that might have previously been made. Fades can also be deleted directly from the Fairlight editor, making it faster than before. Sound engineers can also now save their preferred track view so that they get the view they want without having to create it each time. More functions previously only available via the keyboard are now accessible using the panel, including layered editing. This also means that automation curves can now be selected via the keyboard or audio panel.

Continuing with the extensive improvements to Fairlight audio, there have also been major updates to the audio editor transport control. Track navigation is now improved and even works when nothing is selected. Users can navigate directly to the timecode entry window above the timeline from the audio editor panel, and there is added support for high-frame-rate timecodes. Timecode entry now supports values relative to the current CTI location, so the playhead can move along the timeline relative to its position rather than to a set timecode.

Support has also been added so the colon key can be used in place of the user typing 00. Master spill on console faders now lets users spill out all the tracks to a bus fader for quick adjustments in the mix. There’s also more precision with rotary controls on the panel and when using a mouse with a modifier key. Users can also change the layout and select either icon or text-only labels on the Fairlight editor. Legacy Fairlight users can now use the traditional — and perhaps more familiar — Fairlight layout. Moving around the timeline is even quicker with added support for “media left” and “media right” selection keys to jump the playhead forward and back.

This update also improves editing in Resolve. Loading and switching timelines on the edit page is now faster, with improved performance when working with a large number of audio tracks. Compound clips can now be made from in and out points so that editors can be more selective about which media they want to see directly in the edit page. There is also support for previewing timeline audio when performing live overwrites of video-only edits. Now when trimming, the duration will reflect the clip duration as users actively trim, so they can set a specific clip length. There is also a new change transition duration dialog.

The media pool now includes metadata support for audio files with up to 24 embedded channels. Users can also duplicate clips and timelines into the same bin using copy and paste commands. The primary DaVinci Resolve screen can now run as a window when dual-screen mode is enabled. Smart filters now let users sort media based on metadata fields, including keywords and people tags, so users can find the clips they need faster.


Quick Chat: Editing Leap Day short for Stella Artois

By Randi Altman

To celebrate February 29, otherwise known as Leap Day, beer-maker Stella Artois released a short film featuring real people who discover their time together is valuable in ways they didn’t expect. The short was conceived by VaynerMedia, directed by Division7’s Kris Belman and cut by Union partner/editor Sloane Klevin. Union also supplied Flame work on the piece.

The film begins with the words “There is a crisis sweeping the nation” set on a black screen. Then we see different women standing on the street talking about how easy it is to cancel plans. “You’re just one text away,” says one. “When it’s really cold outside and I don’t want to go out, I use my dog excuse,” says another. That’s when the viewer is told, through text on the screen, that Stella Artois has set out to right this wrong “by showing them the value of their time together.”

The scene changes from the street to a restaurant where friends are reunited for a meal and a goblet of Stella after not seeing each other for a while. When the check comes, the confused diners ask about it, and an employee explains that the menu lists prices in minutes, that Leap Day is a gift of 24 hours and that people should take advantage of it by “uncancelling” plans.

Prior to February 29, Stella encouraged people to #UnCancel plans and catch up with friends over a beer… paid for by the brand. Using the Stella Leap Day Fund — a $366,000 bank of beer reserved exclusively for those who spend time together (there are 366 days in a Leap Year) — people were able to claim as much as a 24-pack when sharing the film using #UnCancelPromo and tagging someone they would like to catch up with.

Editor Sloane Klevin

For the film short, the diners were captured with hidden cameras. Union editor Klevin, who used an Avid Media Composer 2018.12.03 with EditShare storage, was tasked with finding a story in their candid conversations. We reached out to her to find out more about the project and her process.

How early did you get involved in this project, and what kind of input did you have?
I knew I was probably getting the job about a week before they shot. I had no creative input into the shoot; that really only happens when I’m editing a feature.

What was your process like?
This was an incredibly fast turnaround. They shot on a Wednesday night, and it was finished and online the following Wednesday morning at 12am.

I thought about truncating my usual process in order to make the schedule, but when I saw their shooting breakdown for how they planned to shoot it all in one evening, I knew there wouldn’t be a ton of footage. Knowing this, I could treat the project the way I approach most unscripted longform branded content.

My assistant, Ryan Stacom, transcoded and loaded the footage into the Avid overnight, then grouped the four hidden cameras with the sound from the hidden microphones — and, brilliantly, production had time-of-day timecode on everything. The only thing that was tricky was when two tables were being filmed at once. Those takes had to be separated.

The Simon Says transcription software was used to transcribe the short pre- and post-interviews we had, and Ryan put markers from the transcripts on those clips so I could jump straight to a keyword or line I was searching for during the edit process. I watched all the verité footage myself and put markers on anything I thought was usable in the spot, typing into the markers what was said.

How did you choose the footage you needed?
Sometimes the people had conversations that were neither here nor there, because they had no idea they were being filmed, so I skipped that stuff. Also, I didn’t know if the transcription software would be accurate with so much background noise from the restaurant on the hidden table microphones, so marking the footage myself seemed the best option. I used yellow markers for lines I really liked and red for stuff I thought we might want to be able to find and audition, but those weren’t necessarily my selects. That way I could open the markers tool and read through my yellow selects at a glance.

Once I’d seen everything, I did a music search of Asche & Spencer’s incredibly intuitive, searchable music library website, downloaded my favorite tracks and started editing. Because of the fast turnaround, the agency was nice enough to send an outline for how they hoped the material might be edited. I explored their road map, which was super helpful, but went with my gut on how to deviate. They gave me two days to edit, which meant I could post for the director first and get his thoughts.

Then I spent the weekend playing with the agency and trying other options. The client saw the cut and gave notes on both days I was with the agency, then we spent Monday and Tuesday color correcting (thanks to Mike Howell at Color Collective), reworking the music track, mixing (with Chris Afzal at Wave Studios), conforming and subtitling.

That was a crazy fast turnaround.
Considering how fast the turnaround was, it went incredibly smoothly. I attribute that to the manageable amount of footage, fantastic casting that got us really great reactions from all the people they filmed, and the amount of communication my producer at Union and the agency producer had in advance.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

 

Review: Loupedeck+ for Adobe’s Creative Cloud — a year later

By Mike McCarthy

It has been a little over a year since Loupedeck first announced support for Adobe’s Premiere Pro and After Effects thanks to its Loupedeck+ hardware interface panel. As you might know, Loupedeck was originally designed for Adobe Lightroom users. When Loupedeck was first introduced, I found myself wishing there was something similar for Premiere, so I was clearly pleased when that became a reality.

I was eager to test it and got one before starting editorial on a large feature film back in January. While I was knee-deep in the film, postPerspective’s Brady Betzel wrote a thorough review of the panel and how to use it in Premiere and Lightroom. My focus has been a bit different, working to find a way to make it a bit more user-friendly and looking for ways to take advantage of the tool’s immense array of possible functions.

Loupedeck+

I was looking forward to using the panel on a daily basis while editing the film (which I can’t name yet, sorry) because I would have three months of consistent time in Premiere to become familiar with it. The assistant editor on the film ordered a Loupedeck+ when he heard I had one. To our surprise, both of the panels sat idle for most of the duration of that project, even though we were using Premiere for 12 to 16 hours a day. There are a few reasons for that from my own experience and perspective:

1) Using Premiere Pro 12 involved a delay — which made the controls, especially the dials, much less interactive — but that has been solved in Premiere 13. Unfortunately, we were stuck in version 12 on the film for larger reasons.

2) That said, even in Premiere 13, every time you rotate the dials, it sends a series of individual commands to Premiere, so just one or two adjustments can fill up your undo history. Pressing a dial resets its value to the default, which alleviates the need to undo that adjustment, but what about the other edit I just made before that? Long gone by that point. If you are just color correcting, this limitation isn’t an issue, but if you are alternating between making color adjustments and other fixes as you work through a sequence, this is a potential problem.

Loupedeck+

A solution? Limit each adjustment so that it’s seen as a single action until another value is adjusted or until a second or two go by — similar in principle to linear keyframe thinning when you use sliders to make audio level adjustments. (See the sketch after this list.)

3) Lastly, there was the issue of knowing what each button and dial would do, since there are a lot of them (40 buttons and 14 dials), and they are only marked for their functionality in Lightroom. I also couldn’t figure out how to map it to the functions I wanted to use the most (the intrinsic motion effect values).
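In code terms, the fix suggested above for the dial/undo problem is an event coalescer: buffer the stream of dial ticks and commit one logical action once the dial goes quiet or the user moves to another control. A minimal sketch of the idea (mine, not Loupedeck’s or Adobe’s):

```python
import time

class DialCoalescer:
    """Collapse a burst of dial ticks into one logical adjustment."""

    def __init__(self, quiet_seconds: float = 1.5):
        self.quiet_seconds = quiet_seconds
        self.param = None    # which control is currently being turned
        self.delta = 0.0     # accumulated change for the current gesture
        self.last_tick = 0.0

    def tick(self, param: str, step: float) -> None:
        now = time.monotonic()
        # A different dial, or a long enough pause, ends the gesture.
        if param != self.param or now - self.last_tick > self.quiet_seconds:
            self.commit()
            self.param = param
        self.delta += step
        self.last_tick = now

    def commit(self) -> None:
        if self.param is not None and self.delta:
            # One entry in the undo history per gesture, not per tick.
            print(f"apply {self.param} {self.delta:+.2f} (single undo step)")
        self.param, self.delta = None, 0.0

# Ten rapid ticks on one dial become a single undoable adjustment.
c = DialCoalescer()
for _ in range(10):
    c.tick("exposure", 0.05)
c.commit()
```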

The first issue will solve itself as I phase out Premiere 12 once this project is complete. The second could be resolved by some programming work by Loupedeck or Adobe, depending on where the limitations lie. Also, adding direct access to more functions in the Loupedeck utility would make it more useful to my workflows — specifically, access to the motion effect values. But all of that hinges on me being able to remember the functions associated with each control, and those functions being more efficient than doing it with my mouse and keyboard.

What solution works for you?

Dedicated Interface v. Mouse/Keyboard
The Loupedeck has led to a number of interesting debates about the utility of a dedicated interface for editing compared to a mouse and/or keyboard. I think this is a very interesting topic, as the interface between the system and the user is the heart of what we do. The monitor(s) and speakers are the flip side of that interface, completing the feedback loop. While I have little opinion on speakers because most of my work is visual, I have always been very into having the “best” monitor solutions and figuring out exactly what “best” means.

It used to mean two 24-inch WUXGA panels, and then it meant a 30-inch LCD. Then I discovered that two 30-inch LCDs were too much for me to use effectively. Similarly, 4K had too many pixels for a 27-inch screen in Windows 7. An ultrawide 34-inch 3440×1440 is my current favorite, although my 32-inch 8K display is starting to grow on me now that Windows 10 can usually scale content on it smoothly.

Our monitor is how our computer communicates with us, and the mouse and keyboard are how we communicate with it. The QWERTY keyboard is a relic from the typewriter era, designed to be inefficient, to prevent jamming the keys. Other arrangements have been introduced but have not gained widespread popularity. The mouse is a much more flexible analog input device for giving more nuanced feedback. (Keys are only on or off, no in-between.) But it is not as efficient at discrete tasks as a keyboard shortcut, provided that you can remember it.

Keyboard shortcuts

This conundrum has led to debates about the best or most efficient way of controlling applications on the system, and editors have some pretty strong opinions on the matter. I am not going to settle it once and for all, but I am going to attempt to step back and look at the bigger picture. Many full-time operators who have become accustomed to their applications are very fast using their keyboard shortcuts, and Avid didn’t even support mouse editing on the timeline until a few years ago. This leads many of those operators to think that keyboard shortcuts are the most efficient possible method of operating, dismissing the possibility that there might be better solutions. But I am confident that people starting from scratch could be at least as efficient, if not more so, using an interface that was actually designed for what they are doing.

Loupedeck is by no means the first or only option in that regard. I have had a Contour Shuttle Pro 2 for many years and have used it on rare occasions for certain highly repetitive tasks. Blackmagic sells a number of physical interface options for Resolve, including its new editing keyboard, and there have been many others for color correction, which is the focus of the Loupedeck’s design as well.

Shuttle Pro 2

Many people also use tablets or trackballs as a replacement for the mouse, but that is usually more about ergonomics and doesn’t compete with keyboard functionality. These other dedicated interfaces are designed to replace some of the keyboard and mouse functionality, but none of them totally replaces the QWERTY keyboard, as we will still have to be able to type to name files, insert titles, etc. But that is what a keyboard is designed to do, compared to pressing the spacebar for playback or Ctrl+K to slice a layer. Those functions are tasks that have been assigned to the keyboard for convenience, but they are not intrinsically connected to it.

There is no denying the keyboard is a fairly flexible digital input tool, consistently available on nearly all systems and designed to give your fingers lots of easily accessible options. Editors are hardly the only people repurposing it or attempting to use it to maximize efficiency in ways it wasn’t originally designed for. Gamers wear out their WASD keys because their functionality has nothing to do with their letter values and is entirely based on their position on the board. And while other interfaces have been marketed, most gamers are still using a QWERTY keyboard and mouse as their primary physical interface. People are taught the QWERTY keyboard from an early age to develop unconscious muscle memory and, ideally, to allow them to type as they think.

QWERTY keyboard

Once those unconscious links are developed, it is relatively easy to repurpose them for other uses. You think “T” and you press it without thinking about where it is. This is why the keyboard is so efficient as an input device, even outside of the tasks it was originally designed for. But what is preventing people from becoming as efficient with their other physical interfaces? Time with the device and good design are required. Controls have to be identifiable by touch, without looking, to make that unconscious link possible, which is the reason for the bumps on your F and J keys. But those mental finger mappings may compete with your QWERTY muscle memory, which you are still going to need to be an effective operator, so certain people might be better off sticking with the keyboard.

If you are super-efficient with your keyboard shortcuts, and they do practically everything you need, then you are probably not in the target market for the Loupedeck or other dedicated interfaces. If you aren’t that efficient on your keyboard, or you do more analog tasks (color correction) that don’t take place with the discrete steps provided by a keyboard, then a dedicated interface might be more attractive to you. Ironically, my primary temp color tool on my recent film was Lumetri curves, which aren’t necessarily controlled by the Loupedeck.

Mike’s solution

That was more about contrast because “color” isn’t really my thing, but for someone who uses those tools that the Loupedeck is mapped to, I have no doubt the Loupedeck would be much faster than using mouse and keyboard for those functions. Mapping the dials to the position, scale and opacity values would improve my workflow, and that currently works great in After Effects, especially in 3D, but not in Premiere Pro (yet). Other functions like slipping and sliding clips are mapped to the Loupedeck dials, but they are not marked, making them very hard to learn. My solution to that is to label them.

Labeling the Loupedeck
I like the Loupedeck, but I have trouble keeping track of the huge variety of functions available, with four possible tasks assigned to each dial per application. Obviously, it would help if the functions were fairly consistent across applications, but currently, by default, they are not. Some simple improvements could be made, but not all of the same functions are available, even between Premiere and After Effects. Labeling the controls would be helpful, even just in the process of learning them, but they change between apps, so I don’t want to take a Sharpie to the console itself.

Loupedeck CT

The solution I devised was to make cutouts, which can be dropped over the controls, with the various functions labeled with color-coded text. There are 14 dials, 40 buttons and four lights that I had to account for in the cutout. I did separate label patterns for Premiere, After Effects and Photoshop. They were initially based on the Loupedeck’s default settings for those applications, but I have created custom cutouts that have more consistent functionality when switching between the various apps.

Loupedeck recently introduced the new Loupedeck CT (Creative Tool), which is selling for $550. At more than twice the price, it is half the size and labels its buttons and dials with LCD screens that change to reflect the functions available for whatever application and workspace you might be in. The cheaper Loupedeck+ offers a similar capability across a much larger set of controls, but with static markings.


Mike McCarthy is an online editor/workflow consultant with over 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

Matt Shaw on cutting Conan Without Borders: Ghana and Greenland

By Randi Altman

While Conan O’Brien was airing his traditional one-hour late night talk show on TBS, he and his crew would often go on the road to places like Cuba, South Korea and Armenia for Conan Without Borders — a series of one-hour specials. He would focus on regular folks, not celebrities, and would embed himself into the local culture… and there was often some very mediocre dancing, courtesy of Conan. The shows were funny, entertaining and educational, and he enjoyed doing them.

Conan and Matt on the road.

In 2019, Conan and his crew, Team Coco, switched the nightly show from one hour to a new 30-minute format. The format change allowed them to produce three to four hour-long Conan Without Borders specials per year. Two of the places the show visited last year were Ghana and Greenland. As you might imagine, they shoot a lot of footage, which all must be logged and edited, often while on the road.

Matt Shaw is one of the editors on Conan, and he went on the road with the show when it traveled to Greenland. Shaw’s past credits include Deon Cole’s Black Box and The Pete Holmes Show (both from Conan O’Brien’s Conaco production company) and The Late Late Show with James Corden (including Carpool Karaoke). One of his first gigs for Team Coco was editing Conan Without Borders: Made in Mexico. That led to a full-time editing gig on Conan on TBS and many fun adventures.

We reached out to Shaw to find out more about editing these specials and what challenges he faced along the way.

You recently edited Conan Without Borders — the Greenland and Ghana specials. Can you talk about preparing for a job like that? What kind of turnaround did you have?
Our Ghana special was shot back in June 2019, with the original plan to air in August, but it was pushed back to November 7 because of how fast the Greenland show came up.

In terms of prep for a show like Ghana, we mainly just know the shooting specs and will handle the rest once the crew actually returns. For the most part, that’s the norm. Ideally, we’ll have a working dark week (no nightly Conan show), and the three editors — me, Rob Ashe and Chris Heller — will take the time to offload, sync and begin our first cuts of everything. We’ll have been in contact with the writers on the shoot to get an idea of what pieces were shot and their general notes from the day.

With Greenland, we had to mobilize and adjust everything to accommodate a drastically different shoot/delivery schedule. The Friday before leaving, while we were prepping the Ghana show to screen for an audience, we heard there might be something coming up that would push Ghana back. On Monday, we heard the plan was to go to Greenland on Wednesday evening, after the nightly show, and turn around Greenland in place of Ghana’s audience screening. We had to adjust the nightly show schedule to still have a new episode ready for Thursday while we were in Greenland.

How did you end up on the Greenland trip?
Knowing we’d have only six days between returning from Greenland and finishing the show for broadcast, our lead editor, Rob Ashe, suggested we send an editor to work on location. We were originally looking into sending footage via Aspera from a local TV studio in Nuuk, Greenland, but we just wouldn’t have been able to turn it around fast enough. We decided about two days before the trip began that I’d go and do what I could to offload, back up, sync and do first cuts on everything.

How much footage did you have per episode, and what did they shoot on?
Ghana had close to 17 hours of material shot over five days on Sony Z450s at 4K XAVC, 29.97. Greenland was closer to 12 hours shot over three days on Panasonic HPX 250s, P2 media recording at 1080 60i.

We also used iPhone/iPad/GoPro footage picked up by the rest of the crew as needed for both shows. I also had a DJI Osmo pocket camera to play with when I had a chance, and we used some of that footage during the montage of icebergs.

So you were editing segments while they were still shooting?
In Greenland, I was cutting daily in the hotel. Midday, I’d get a drop of cards, then offload, sync/group and do first cuts on everything. We had a simple offline edit workflow set up where I’d upload my cuts to Frame.io and email my project files to the team — Rob and Chris — in Burbank. They would then download and sync the Frame.io file to a top video layer in the timeline and continue cutting down, with any additional notes from the writers.

Generally, I’d have everything from Day One uploaded by the start of Day Two, and so on. It seemed to work out pretty well and set us up for success when we returned. I was also getting requests to cut a few highlights from our remotes to put on Team Coco’s Instagram account.

On our return day, we flew to Ilulissat for an iceberg expedition. We had about two hours on the ground before having to return to the airport and fly to Kangerlussuaq, where our chartered plane was waiting to take us back to California. On the flight back, I worked for another four hours or so to sort through the remaining segments and prep everything so we could hit the ground running the following morning. During the flight home, we screened some drone footage from the iceberg trip for Conan, and it really got everyone excited.

What are the challenges of working on the road and with such tight turnarounds?
The night we left for Greenland was preceded by a nightly show in Burbank. After the show ended, we hopped on a plane to fly eight hours to Kangerlussuaq for customs, then another to Nuuk. The minute we landed, we were filming for about three hours before checking into the hotel. I grabbed the morning’s camera cards, went to my room and began cutting. By the time I went to bed, I had cuts done of almost everything from the first day. I’m a terrible sleeper on planes, so the marathon start was pretty insane.

Beyond the lack of sleep, our offload speeds were slower because we were using different cameras than usual — for the sake of traveling lighter — since the plane we flew in had specific weight restrictions. We actually had to hire local crew for audio and B and C camera because there wasn’t enough room for everyone in the plane to start.

In general, I think the overall trip went as smoothly as it could have. It would be interesting to see how it would play out for a longer shoot schedule.

What editing system did you use? What was your setup like? What kind of storage were you using?
On the road I had my MacBook Pro (2018 model), and we rented an identical backup machine in case mine died. For storage, we had four 1TB G-Tech USB-C drives and a 4TB G-RAID to back everything up. I had a USB-3.0 P2 card reader as well and multiple backup readers. A Bluetooth mouse and keyboard rounded out the kit, so I could travel with everything in a backpack.

We had to charter a plane in order to fly directly to Greenland. With such a tight turnaround between filming and delivering the actual show, this was the only way to actually make the special happen. Commercial flights fly only a few days per week out of neighboring countries, and once you’re in Greenland, you either have to fly or take a boat from city to city.

Matt Shaw editing on plane.

On the plane, there was a conference table in the back, so I set up there with one laptop and the G-RAID to continue working. The biggest trouble on the plane was making sure everything stayed secure on the table while taking off and making turns. There were a few close calls when everything started to slide away, and I had to reach to make sure nothing was disconnected.

How involved in the editing is Conan? What kind of feedback did you get?
In general, if Conan has specific notes, the writers will hear them during or right after a shoot is finished. Or we’ll test-screen something after a nightly show taping and indirectly get notes from the writers then.

There will be special circumstances, like our cold opens for Comic-Con, when Conan will come to edit and screen a close-to-final cut. And there just might be a run of jokes that isn’t as strong, but he lets us work with the writers to make what we all think is the best version by committee.

Can you point to some of the more challenging segments from Greenland or Ghana?
The entire show is difficult given the delivery-time constraints while also handling the nightly show. We’ll sometimes be editing the versions for screening up to 10 minutes before they have to screen for an audience, as well as doing all the finishing (audio mix, color as needed, subtitling and deliverables).

For any given special, we’re each cutting our respective remotes during the day while working on any new comedy pieces for that day’s show, then one or two of us will split the work on the nightly show, while the other keeps working with the travel show writers. In the middle of it all, we’ll cut together a mini tease or an unfinished piece to play into that night’s show to promote the specials, so the main challenge is juggling 30 things at a time.

For me, I got to edit this 1980s-style action movie trailer based on an awesome poster Conan had painted by a Ghanaian artist. We had puppets built, a lot of greenscreen and a body double to composite Conan’s head onto for fight scenes. Story-wise, we didn’t have much of a structure to start, but we had to piece something together in the edit and hope it did the ridiculous poster justice.

The Thursday before our show screened for an audience was the first time Mike Sweeney (head writer for the travel shows) had a chance to look at any greenscreen footage, and we knew we were test-screening it the following Monday or Tuesday. It started to take shape when one of our graphics/VFX artists, Angus Lyne, sent back some composites. In the end, it came together great and killed with the audience and our staff, who had already seen anything and everything.

Our other pieces seem to have a linear story, and we try to build the best highlights from any given remote. With something like this trailer, we have to switch our thought process to really build something from scratch. In the case of Greenland and Ghana, I think we put together two really great shows.

How challenging is editing comedy versus drama? Or editing these segments versus other parts of Conan’s world?
In a lot of the comedy we cut, the joke is king. There are always instances when we have blatant continuity errors, jump cuts, etc., but we don’t have to kill ourselves trying to make it work in the moment if it means hurting the joke. Our “man on the street” segments are great examples of this. Obviously, we want something to be as polished and coherent as possible, but there are cases when it just isn’t best, in our opinion, and that’s okay.

That being said, when we do our spoofs of whatever ad or try to recreate a specific style, we’re going to do everything to make that happen. We recently shot a bit with Nicholas Braun from Succession where he’s trying to get a job from Conan during his hiatus from the show. This was a mix of improv and scripted, and we had to match the look of Succession. It turned out well: funny and very much in the vein of the show.

What about for the Ghana show?
For Ghana, we had a few segments that were extremely serious and emotional. For example, Conan and Sam Richardson visited Osu Castle, a major slave trade port. This segment demands care and needs to breathe so the weight of it can really be expressed, versus earlier in the show, when Conan was buying a Ghana shirt from a street vendor, and we hard-cut to him wearing a shirt 10 sizes too small.

And Greenland?
Greenland is a place really affected by climate change. My personal favorite segment I’ve cut on these travel specials is about the impact the melting ice caps could have on the world. Then there is a montage of the icebergs we saw, followed by Conan attempting to stake a “Sold” sign on an iceberg, signifying he had bought property in Greenland for the US. Originally, the montage had a few jokes within it, but we quickly realized it’s so beautiful we shouldn’t cheapen it. We just let it be beautiful.

Comedy or drama, it’s really about being aware of what you have in front of you and what the end goal is.

What haven’t I asked that’s important?
For me, it’s important to acknowledge how talented our post team is to be able to work simultaneously on a giant special while delivering four shows a week. Being on location for Greenland also gave me a taste of the chaos the whole production team and Team Coco go through, and I think everyone should be proud of what we’re capable of producing.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

The Den editorial boutique launches in Los Angeles

Christjan Jordan, editor of award-winning work for clients including Amazon, GEICO and Hulu, has partnered with industry veteran Mary Ellen Duggan to launch The Den, an independent boutique editorial house in Los Angeles.

Over the course of his career, Jordan has worked with Arcade Edit, Cosmo Street and Rock Paper Scissors, among others. He has edited such spots as Alexa Loses Her Voice for Amazon, Longest Goal Celebration Ever for GEICO, #notspecialneeds for World Down Syndrome Day out of Publicis NY and Super Bowl 2020 ads Tom Brady’s Big Announcement for Hulu and Famous Visitors for Walmart. Jordan’s work has been recognized by the Cannes Lions, AICE, AICP, Clio, D&AD, One Show and Sports Emmy awards.

“With Mary Ellen, agency producers are guided by an industry veteran who knows exactly what agencies and clients are looking for,” says Jordan. “And for me, I love fostering young editors. It’s an interesting time in our industry, and there is a lot of fresh creative talent.”

In her career, Duggan has headed production departments at both KPB and Cliff Freeman on the East Coast and, most recently, Big Family Table in Los Angeles. In addition, she has freelanced all over the country.

“The stars aligned for Christjan and I to work together,” says Duggan. “We had known each other for years and had recently worked on a Hulu campaign together. We had a similar vision for what we thought the editorial experience should be: a high-end boutique editorial house that is nimble, has a roster of diverse talent and a real family vibe.”

Veteran producer Rachel Seitel has joined as partner and head of business development. The Den will be represented by Diane Patrone at The Family on the East Coast and by Ezra Burke and Shane Harris on the West Coast.

The Den’s founding roster also features editor Andrew Ratzlaff and junior editor Hannelore Gomes. The staff works on Avid Media Composer and Adobe Premiere.

LVLY adds veteran editor Bill Cramer

Bill Cramer, an editor known for his comedy and dialogue work, among other genres, has joined the editorial roster at LVLY, a content creation and creative studio based in New York City.

Cramer joins from Northern Lights. Prior to that he had spent many years at Crew Cuts, where he launched his career and built a strong reputation for his work on many ads and campaigns. Clients included ESPN, GMC, LG, Nickelodeon, Hasbro, MLB, Wendy’s and American Express. Check out his reel.

Cramer reports that he wasn’t looking to make a move but that LVLY’s VP/MD, Wendy Brovetto, inspired him. “Wendy and I knew of each other for years, and I’d been following LVLY since they did their top-to-bottom rebranding. I knew that they’re doing everything from live-action production to podcasting, VFX, design, VR and experiential, and I recognized that joining them would give me more opportunities to flex as an editor. Being at LVLY gives me the chance to take on any project, whether that’s a 30-second commercial, music video or long-form branded content piece; they’re set up to tackle any post production needs, no matter the scale.”

“Bill’s a great comedy/dialogue editor, and that’s something our clients have been looking for,” says Brovetto. “Once I saw the range of his work, it was an easy decision to invite him to join the LVLY team. In addition to being a great editor, he’s a funny guy, and who doesn’t need more humor in their day?”

Cramer, who works on both Avid Media Composer and Adobe Premiere, joins an editorial roster that includes Olivier Wicki, J.P. Damboragian, Geordie Anderson, Noelle Webb, Joe Siegel and Aaron & Bryan.

Behind the Title: Dell Blue lead editor Jason Uson

This veteran editor started his career at LA’s Rock Paper Scissors, where he spent four years learning the craft from editors such as Bee Ottinger and Angus Wall. After freelancing at Lost Planet, Spot Welders and Nomad, he held staff positions at Cosmo Street, Harpo Films and Beast Editorial before opening Foundation Editorial, his own post boutique in Austin.

NAME: Jason Uson

COMPANY: Austin, Texas-based Dell Blue

Can you describe what Dell Blue does?
Dell Blue is the in-house agency for Dell Technologies.

What’s your job title?
Senior Lead Creative Editor

What does that entail?
Outside of the projects that I am editing personally, there are multiple campaigns happening simultaneously at all times. I oversee all of them and have my eyes on every edit, fostering and mentoring our junior editors and producers to help them grow in their careers.

I’ve helped establish and maintain the process regarding our workflow and post pipeline. I also work closely with our entire team of creatives, producers, project managers and vendors from the beginning of each project and follow it through from production to post. This enables us to execute the best possible workflow and outcome for every project.

To add another layer to my role, I am also directing spots for Dell when the project is right.

Alienware

That’s a lot! What else would surprise people about what falls under that title?
The number of hours that go into making sure the job gets done and is the best it can be. Editing is a process that takes time. Creating something of value that means something is an art no matter how big or small the job might be. You have to have pride in every aspect of the process. It shows when you don’t.

What’s your favorite part of the job?
I have two favorites. The first is the people. I know that sounds cliché, but it’s true. The team here at Dell is truly something special. We are family. We work together. Play together. Happy Hour together. Respect, support and genuinely care for one another. But, ultimately, we care about the work. We are all aligned to create the best work possible. I am grateful to be surrounded by such a talented and amazing group of humans.

The second, which is equally important to me, is the process of organizing my project, watching all the footage and pulling selects. I make sure I have what I need and check it off my list. Music, sound effects, VO track, graphics and anything else I need to get started. Then I create my first timeline. A blank, empty timeline. Then I take a deep breath and say to myself, “Here we go.” That’s my favorite.

Do you have a least favorite?
My least favorite part is wrapping a project. I spend so much time with my clients and creatives and we really bond while working on a project together. We end on such a high note of excitement and pride in what we’ve done and then, just like that, it’s over. I realize that sounds a bit dramatic. Not to worry, though, because lucky for me, we all come back together in a few months to work on something new and the excitement starts all over again.

What is your most productive time of day?
This also requires a two-part answer. The first is early morning. This is my time to get things done, uninterrupted. I go upstairs and make a fresh cup of coffee. I open my deck doors. I check and send emails, and get my personal stuff done. This clears out all of my distractions for the day before I jump into my edit bay.

The second part is late at night. I get to replay all of the creative decisions from the day and explore other options. Sometimes, I get lucky and find something I didn’t see before.

If you didn’t have this job, what would you be doing instead?
That’s easy. I’d be a chef. I love to cook and experiment with ingredients. And I love to explore and create an amazing dining experience.

I see similarities between editors and chefs. Both aim to create something impactful that elicits an emotional response from the “elements” they are given. For chefs, the ingredients, spices and techniques are creatively brought together to bring a dish to life.

For editors, the “elements” that I am given, in combination with the use of my style, techniques, sound design, graphics and music etc. all give life to a spot.

How early did you know this would be your path?
I had originally moved to Los Angeles with dreams of becoming an actor. Yes, it’s groundbreaking, I know. During that time, I met editor Dana Glauberman (The Mandalorian, Juno, Up in the Air, Thank You for Smoking, Creed II, Ghostbusters: Afterlife). I had lunch with her at the studios one day in Burbank and went on a tour of the backlot. I got to see all the edit bays, film stages, soundstages and machine rooms. To me, this was magic. A total game-changer in an instant.

While I was waiting on that one big role, I got my foot in the door as a PA at editing house Rock Paper Scissors. One night after work, we all went for drinks at a local bar, and every commercial on TV was one that (editors) Angus Wall and Adam Pertofsky had worked on within the last month. I was blown away. Something clicked.

This entire creative world behind the scenes was captivating to me. I made the decision at that moment to lean in and go for it. I asked the assistant editor the following morning if he would teach me — and I haven’t looked back. So, Dana, Angus and Adam… thank you!

Can you name some of your recent projects?
I edited the latest global campaign for Alienware called Everything Counts, which was directed by Tony Kaye. More recently, I worked on the campaign for Dell’s latest and greatest business PC laptop that launches in March 2020, which was directed by Mac Premo.

Dell business PC

Side note: I highly recommend Googling Mac Premo. His work is amazing.

What project are you most proud of?
There are two projects that stand out for me. The first one is the very first spot I ever cut — a Budweiser ad for director Sam Ketay and the Art Institute of Pasadena. During the edit, I thought, “Wow, I think I can do this.” It went on to win a Clio.

The second is the latest global campaign for Alienware, which I mentioned above. Director Tony Kaye is a genius. Tony and I sat in my edit bay for a week exploring and experimenting. His process is unlike any other director I have worked with. This project was extremely challenging on many levels. I honestly started looking at footage in a very different way. I evolved. I learned. And I strive to continue to grow every day.

Name three pieces of technology you can’t live without.
Wow, good question. I guess I’ll be that guy and say my phone. It really is a necessity.

Spotify, for sure. I am always listening to music in my car and trying to match artists with projects that are not even in existence yet.

My Bose noise cancelling headphones.

What social media channels do you follow?
I use Facebook and LinkedIn — mainly to stay up to date on what others are doing and to post my own updates every now and then.

I’m on Instagram quite a bit. Outside of the obvious industry-related accounts I follow, here are a few of my random favorites:

@nuts_about_birds
If you love birds as much as I do, this is a good one to follow.

@sergiosanchezart
This guy is incredible. I have been following his work for a long time. If you are looking for a new tattoo, look no further.

@andrewhagarofficial
I was lucky enough to meet Andrew through my friend @chrisprofera and immediately dove into his music. Amazing. Not to mention his dad is Sammy Hagar. Enough said.

@kaleynelson
She’s a talented photographer based in LA. Her concert stills are impressive.

@zuzubee
I love graffiti art, and Zuzu is one of the best. Based in Austin, she has created several murals for me. You can see her work all over the city, as well as installations during SXSW and Austin City Limits, on Bud Light cans and across the US.

Do you listen to music at work? What types?
I do listen to music when I work but only when I’m going through footage and pulling selects. Classical piano is my go-to. It opens my mind and helps me focus and dive into my footage.

Don’t get me wrong, I love music. But if I am jamming to my favorite, Sammy Hagar, I can’t drive…I mean dive… into my footage. So classical piano for me.

How do you de-stress from it all?
This is an understatement, but there are a few things that help me out. Sometimes during the day, I will take a walk around the block. Get a little vitamin D and fresh air. I look around at things other than my screen. This is something (editors) Tom Muldoon and John Murray at Nomad used to do every day. I always wondered why. Now I know. I come back refreshed and with my mind clear and ready for the next challenge.

I also “like” to hit the gym immediately after I leave my edit bay. Headphones on (Sammy Hagar, obviously), stretch it out and jump on the treadmill for 30 minutes.

All that is good and necessary for obvious reasons, but getting back to cooking… I love being in the kitchen. It’s therapy for me. Whether I am chopping and creating in the kitchen or out on the grill, I love it. And my wife appreciates my cooking. Well, I think she does at least.

Photo Credits: Dell PC and Jason Uson images – Chris Profera

An online editor’s first time at Sundance

By Brady Betzel

I’ve always wanted to attend the Sundance Film Festival, and my trip last month did not disappoint. Not only is it an iconic industry (and pop-culture) event, but the energy surrounding it is palpable.

Once I got to Park City and walked Main Street — with the sponsored stores (Canon and Lyft among others) and movie theaters, like the Egyptian — I started to feel an excitement and energy that I haven’t felt since I was making videos in high school and college… when there were no thoughts of limits and what I should or shouldn’t do.

A certain indescribable nervousness and love started to bubble up. Sitting in the luxurious Park City Burger King with Steve Hullfish (Art of the Cut) and Joe Herman (Cinemontage) before my second screening of Sundance 2020, Dinner in America, I was thinking about how lucky I was to be in a place packed with creatives. It sounds cliché and trite, but it really is reinvigorating to surround yourself with positive energy — especially if you, like me, can get caught up in cynicism.

It brought me back to my college classes, taught by Daniel Restuccio (another postPerspective writer), at California Lutheran University, where we would cut out pictures from magazines, draw pictures, blow up balloons, eat doughnuts and do whatever we could to get our ideas out in the open.

While Sundance occasionally felt like an amalgamation of the thirsty-hipster Coachella crowd mixed with a high school video production class (but with million-dollar budgets), it still had me excited to create. Sundance 2020 in Park City was a beautiful resurgence of ideas and discussions about how we as an artistic community can offer accessibility to everyone and anyone who wants to tell their own story on screen.

Inclusiveness Panel
After arriving in Park City, my first stop was a panel hosted by Adobe called “Empowering Every Voice in Film and the World.” Maybe it was a combination of the excitement of Sundance and the discussion about accessibility, but it really got me thinking. The panel was expertly hosted by Adobe’s Meagan Keane and included producer/director Yance Ford (Disclosure: Trans Lives on Screen; Oscar-nominated for Strong Island); editor Eileen Meyer (Crip Camp); editor Stacy Goldate (Disclosure: Trans Lives on Screen); and director Crystal Kayiza (See You Next Time).

I walked away feeling inspired and driven to increase my efforts in accessibility. Eileen said one of her biggest opportunities came from the Karen Schmeer Film Editing Fellowship, a year-long fellowship for emerging documentary editors.

Yance drove home the idea of inclusivity and re-emphasized the importance of access to equipment. But it’s not simply about access — you also have to make a great story and figure out things like distribution. I was really struck by all the speakers on stage, but Yance really spoke to me. He feels like the voice we need to represent marginalized groups, and we need to see more content from these creatives. The more content we see, the better.

Crystal spoke about the community needing to tell stories that don’t necessarily have standard plot points and stakes. The idea is to encourage people to create their stories, and for those in power to help and support those stories and trust the filmmakers, regardless of whether they identify with the ideas and themes.

Rebuilding Paradise

Screenings
One screening I attended was Rebuilding Paradise, directed by Ron Howard. He was at the premiere, along with some of the people who lost everything in the Paradise, California, fires. In the first half of November 2018, several fires raged out of control in California. One surrounded the city of Simi Valley and worked its way toward the Pacific Coast. (It was way too close for comfort; I live in Simi Valley. We eventually evacuated but were fine.)

Another fire was in the town of Paradise, which burnt almost the entire city to the ground. Watching Rebuilding Paradise filled me with great sadness for those who lost family members and their homes. Some of the “found footage” was absolutely breathtaking. One piece in particular showed a father racing out of what appears to be hell, surrounded by flames, in his car with his child asking if they were going to die. Absolutely incredible and heart-wrenching.

Dinner in America

Another film I saw was Dinner in America, as referenced earlier in this piece. I love a good dark comedy/drama, so when I got a ticket to Adam Carter Rehmeier’s Dinner in America I was all geared up. Little did I know it would start off with a disgruntled 20-something throwing a chair through a window and lighting the front sidewalk on fire. Kudos to composer John Swihart, who took a pretty awesome opening credit montage and dropped the heat with his soundtrack.

Dinner in America is a mid-‘90s Napoleon Dynamite cross-pollinated with the song “F*** Authority” by Pennywise. Coincidentally, Swihart composed the soundtrack for Napoleon Dynamite. Seriously, the soundtrack to Dinner in America is worth the ticket price alone, in my opinion. It adds so much to the attitude of one of the main characters. The parallel editing, mixed with the fierce anti-authoritarian love story lived by Kyle Gallner and Emily Skeggs, makes for a movie you probably won’t forget.

Adam Rehmeier

During the Q&A at the end, writer, director and editor Rehmeier described how he essentially combined two ideas that led to Dinner in America. As I watched the first 20 minutes, it felt like two separate movies, but once it came together it really paid off. Much like the cult phenomenon Napoleon Dynamite, Dinner in America will resonate with a wide audience. It’s worth watching when it comes to a theater (or streaming platform) near you. In the meantime, check out my video interview with him.

Adobe Productions
During Sundance, Adobe announced an upcoming feature for Premiere called “Productions.” While in Park City, I got a small demo of the new Productions at Adobe’s Sundance Production House. It took about 15 minutes before I realized that Adobe has added the one feature that has set Avid Media Composer apart for over 20 years — bin locking. Heads up, Avid: Adobe is about to release a multi-user workflow that is much easier to understand and use than what previous iterations of Premiere offered.

The only thing that caught me off guard was the nomenclature — Productions and Projects. Productions is the title, but really a “Production” is a project, and what they call a “project” is a bin. If you’re familiar with Media Composer, you can create a project and inside have folders and bins. Bins are what house media links, sequences, graphics and everything else. In the new Productions update, a “Production” will house all of your “Projects” (i.e. a Project with bins).

Additionally, you will be able to lock “Projects.” This means that in a multi-user environment (which can be something like a SAN or even an Avid Nexis), a project and media can live on the shared server and be accessed by multiple users. These users can be named and identified inside of the Premiere Preferences. And much like Blackmagic’s DaVinci Resolve, you can update the “projects” when you want to — individually or all projects at once. On its face, Productions looks like the answer to what every editor has said is one of the only reasons Avid is still such a powerhouse in “Hollywood” — the ability to work relatively flawlessly among tons of editors simultaneously. If what I saw works the way it should, Adobe is looking to take a piece of the multi-user environment pie Avid has controlled for so long.
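For anyone curious what bin or project locking actually involves under the hood, the sketch below shows the general technique of a sidecar lock file on shared storage. This is a generic illustration in Python with hypothetical function names; it is not Adobe’s or Avid’s actual implementation.

```python
import os
import getpass

# Generic lock-file-style "bin locking" on shared storage (illustrative only).
# The first user to create the sidecar .lock file gets write access; everyone
# else opens the project read-only and can see who holds the lock.

def try_lock(project_path, user=None):
    user = user or getpass.getuser()
    lock_path = project_path + ".lock"
    try:
        # O_CREAT | O_EXCL is atomic: exactly one writer can win the race.
        fd = os.open(lock_path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        os.write(fd, user.encode())
        os.close(fd)
        return True   # we hold the lock; open the project read/write
    except FileExistsError:
        with open(lock_path) as f:
            holder = f.read().strip()
        print(f"{os.path.basename(project_path)} is locked by {holder}; opening read-only")
        return False

def release_lock(project_path):
    os.remove(project_path + ".lock")
```

The appeal of this style of locking is that it asks very little of the storage beyond atomic file creation, which is part of why shared-project workflows can run on a plain SAN or NAS volume.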

Summing Up
In the end, the Sundance Film Festival 2020 in Park City was likely a once-in-a-lifetime experience for me. From seeing celebrities, meeting other journalists, getting some free beanies and hand warmers (it was definitely not 70 degrees like California), to attending parties hosted by Canon and Light Iron — Sundance can really reinvigorate your filmmaking energy.

It’s hard to keep going when you get burnt out by just how hard it is to succeed and break through the barriers in film and multimedia creation. But seeing indie films and meeting like-minded creatives, you can get excited to create your own story. And you realize that there are good people out there, and sometimes you just have to fly to Utah to find them.

Walking down Main Street, I found a coffee shop named Atticus Coffee and Tea House. My oldest son’s name is Atticus, so I naturally had to stop in and get him something. I ended up getting him a hat and myself a coffee. It was good. But what I really did was sit out front pretending to shoot b-roll and eavesdropping on some conversations. It really is true that being around thoughtful energy is contagious. And while some parts of Sundance feel like a hipster popularity contest, there are others who are there to improve and absorb culture from all around.

The 2020 Sundance Film Festival’s theme in my eyes was to uplift other people’s stories. As Harper Lee wrote in “To Kill a Mockingbird” when Atticus Finch is talking with Scout: “First of all, if you learn a simple trick, Scout, you’ll get along a lot better with all kinds of folks. You never really understand a person until you consider things from his point of view . . . until you climb into his skin and walk around in it.”


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

DejaEdit collaborative editing platform available worldwide

DejaSoft has expanded the availability of its DejaEdit collaborative editing solution for Avid Media Composer, Avid Nexis and EditShare workflows. Already well-established in Scandinavia and parts of Europe, the software-defined network solution is now accessible across the UK, Europe, Latin America, Middle East, Asia, Africa, China and North America.

DejaEdit allows editors to transfer media files and timelines automatically and securely around the world without having to be online continuously. It effectively acts as a media file synchronizer for multiple remote Avid systems.

DejaEdit allows multi-site post facilities to work as one, enabling multiple remote editors to work together, allowing media exchanges with VFX houses and letting editors easily migrate between office and home or mobile-based editing installations throughout the lifecycle of a project.

DejaEdit is available in two applications: Client and Nexus. The Client version works directly with Media Composer, whereas the Nexus variant further enables synchronization with projects stored on Nexis or EditShare storage systems.

DejaSoft and the DejaEdit platform are a collaboration between CEO Clas Hakeröd and CTO Nikolai Waldman, both editors and post pros and the founders of Can Film, a boutique post facility based in Sweden.

The tool is already being used by Oscar-nominated editor Yorgos Mavropsaridis, ACE, of The Favourite, The Lobster and recently Suicide Tourist; Scandinavian producer Daniel Lägersten, who has produced TV series such as Riverside and The Spiral; editor Rickard Krantz, who used it on The Perfect Patient (aka Quick), which has been nominated for Sweden’s Guldbagge Award (similar to a BAFTA) for editing; and post producer Anna Knochenhauer, known for her work on Euphoria featuring Alicia Vikander, The 100-Year-Old Man Who Climbed Out the Window and Disappeared, Lilya 4-Ever and Together.

Review: Neat Video 5 noise reduction plugin

By Brady Betzel

One of the best (and most underrated) tricks in an online editor’s tool kit is to have good image restoration techniques. Removing digital video imperfections — from flicker to digital video noise — is not easy, and not easy to do well. That is, unless you have good noise reduction software like Neat Video.

While Neat Video might not be that well-known, once you see how simply (or intricately) Neat Video 5 works inside of apps like Blackmagic’s DaVinci Resolve, it will be hard to forget the company’s name.

(While the software was recently updated to 5.1.5 — with expanded GPU support as well as support for new versions of Resolve, Adobe and Nuke — nothing really changes for this review. You can check out a detailed list of the updates here.)

Neat Video 5 is a noise reduction plugin. In a Windows OS environment, Neat Video is compatible with apps like Adobe After Effects, Adobe Premiere Pro, DaVinci Resolve, Avid Media Composer, Vegas, Magix, Edius, Virtual Dub, and the OFX-compatible apps Nuke, Fusion, Scratch, HitFilm, Mamba, Natron, Flame, Baselight and DustBuster. In a macOS environment, Neat Video 5 is compatible with After Effects, Premiere, Final Cut Pro X, Motion 5, OFX, Resolve and Media Composer. In Linux, the software is compatible with OFX-compatible apps and Resolve.

Neat Video 5 comes in three flavors: Demo, Home and Pro. The Demo version works in up to 1280×720 resolution with a watermark. Home is literally made for the home user: It will process video up to 1920×1080 resolutions, it will use up to one GPU, and it is for non-commercial use. The cost is just $74.90 for most apps (Resolve is $89.90). The Pro version has no resolution restrictions, will work on two or more GPUs simultaneously, and can be used commercially. The Pro version starts at $129.90 per app ($159.90 for Resolve). Because Neat Video 5 for OFX works with so many apps, it only comes in Pro ($249.90) and Studio ($349.90) versions. The Studio version adds the ability for a floating license. You can see all of the pricing details here.

If there is one line you should take away from this review, it is this: Neat Video 5 is by far the easiest and best noise reduction software I have used in any application to date. And while this review is focusing on the Resolve version of Neat Video 5, all other apps work in much the same way. You can find Neat Video’s software-specific Quick Start Guides to help. Once you install and register your Neat Video 5 license, removing digital video noise is as easy as applying Neat Video 5 to a node in the color tab, clicking on “Prepare Noise Profile,” clicking on “Auto Profile,” and clicking “Apply.” Then, unless you want to fine-tune your noise reduction, you are done. Obviously, I have somewhat simplified how Neat Video 5 works, but essentially it can be done in as little as three steps per clip, and the results are typically amazing. If they aren’t amazing, you can jump back into Neat Video 5 and manually adjust specifics until the noise reduction looks correct. But I will say that in about 90% of cases, the Auto Profiling will do all of the noise reduction work necessary.

For tinkerers, or for those who need to go far beyond an Auto Profile, you can manually adjust your settings. But taking a step back, Neat Video needs an area of your image that has a uniform color and noise profile to process how it removes noise. The automatic profiling will do its best to find an area, but it doesn’t always work. What you need to keep in mind when building a good noise profile inside of Neat Video is that the area being processed needs to be as uniform as possible (think dark night sky or a wall painted in one color) — meaning no prominent features, a high noise level (something in the high four area is better), the largest possible sample area and no warnings from Neat Video.
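To illustrate what “uniform” means in practice, here is a rough sketch that scans a grayscale frame for the window with the least large-scale structure, i.e., the flattest content, where any remaining pixel variation is mostly noise. It is a generic Python/NumPy approach assumed for illustration; Neat Video’s actual profiling is certainly more sophisticated.

```python
import numpy as np

# Illustrative only: find the flattest window in a grayscale frame, the kind
# of region (night sky, plain wall) that makes a good noise-profile sample.

def most_uniform_patch(gray, size=64, step=32):
    best_xy, best_structure = None, np.inf
    for y in range(0, gray.shape[0] - size + 1, step):
        for x in range(0, gray.shape[1] - size + 1, step):
            patch = gray[y:y + size, x:x + size].astype(np.float64)
            # Average 8x8 blocks first: the variance of the block means
            # measures edges and gradients (structure), not pixel noise.
            blocks = patch.reshape(size // 8, 8, size // 8, 8).mean(axis=(1, 3))
            structure = blocks.var()
            if structure < best_structure:
                best_xy, best_structure = (x, y), structure
    return best_xy  # top-left corner of the flattest window found
```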

So, if your automatic profile doesn’t do the job, you can find an area of your image that meets the above requirements and then build a profile. From there you can use one of the Neat Video 5 features, like “Profile Check.” Profile Check will highlight details that aren’t being affected by Neat Video, giving you a clear representation of what noise is being reduced and whether you need to adjust your profile to better reduce video noise.

At this point you might be wondering where you tweak advanced settings. When you load Neat Video, you will be in Beginner mode. To get into Advanced mode, go to the Tools menu, where you will find a lot of advanced functions that can help you fine-tune your noise profile. And if you still can’t get a good noise reduction profile, you can try the “Generic Profile,” which can help you build a profile even if your video doesn’t have a large enough area of uniform noise. There are also presets — such as light flicker, moire flicker, repeat frame issues, dust and scratch filters (including scan lines), jitter of details, artifact removal and more — that can solve certain problems.

Neat Video 5 is faster than previous generations. As in previous versions, there is even a tool inside of Neat Video’s preferences that will run your CPU and GPU through a benchmark to determine whether you should run on CPU only, GPU only, or a combination of both. In Neat Video 5, if you have trouble with a clip, you can use up to four “Variants” of noise reduction in the new playback window to see how each profile works with your clip.
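The concept behind that benchmark is simple enough to sketch: time the same short workload on each available backend and keep the fastest. The Python below is a generic illustration with hypothetical callables, not Neat Video’s actual tool.

```python
import time

# Illustrative only: "benchmark, then pick a device." Each candidate is a
# callable that filters a fixed number of frames on one backend.

def pick_fastest(candidates, frames=24):
    timings = {}
    for name, run in candidates.items():   # e.g. {"CPU": cpu_run, "GPU": gpu_run}
        start = time.perf_counter()
        run(frames)                        # hypothetical: denoise `frames` frames
        timings[name] = time.perf_counter() - start
    best = min(timings, key=timings.get)
    return best, timings
```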

In terms of playback and rendering, noise reduction is never fast. However, inside of Neat Video the new playback window will typically play back your footage to preview the noise reduction before you jump back into Resolve. Inside of Resolve, even in just 1080p, my sequence would crawl to just a few frames of playback per second. It is one of the most processor- and GPU-intensive tasks you will run on your computer.

In my testing I applied Neat Video 5 to the first node in my color correction tree, followed by a basic color correction, in a one-minute timeline. I took those same clips and compared my Neat Video results to Resolve’s Temporal and Spatial noise reduction tools. In terms of visual results, Neat Video 5 was superior. If that’s not the case for you, jump into YCbCr viewer mode inside of Neat Video 5, isolate each channel and tweak each channel individually so you don’t affect the overall noise reduction more than necessary. Not only did Neat Video 5 handle normal noise in the shadows well, but on clips with very tight lines it was able to keep a lot of the detail while removing the noise. Resolve’s noise reduction tools had a harder time removing noise while keeping detail. Temporal noise reduction really didn’t do much, and while Spatial noise reduction did work, it heavily blurred and distorted the image — essentially not acceptable.

To get a good example of how Neat Video 5 slams a computer system, I exported 1080p MP4s. Resolve’s built-in Temporal noise reduction took 1:03, while the Spatial noise reduction took 1:05. The Neat Video 5 render of the same one-minute timeline took 3:51 — almost four times as long. I was curious how much longer a 4K render would take. Using 4K (UHD) media, I applied a simple color correction and, on a preceding serial node, Neat Video 5. I exported a 4K (UHD) MP4, which took 52 seconds without Neat Video 5 applied and 16:27 with Neat Video applied — nearly 19 times the render time! So while Neat Video 5 is an amazing tool, there is a trade-off in high render times.
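For reference, here is the simple arithmetic behind those overhead figures, using the timings reported above:

```python
# Render-time overhead computed from the timings quoted in this review.
hd_resolve = 63             # 1:03, Resolve temporal NR, 1080p
hd_neat    = 3 * 60 + 51    # 3:51, Neat Video 5, 1080p
uhd_plain  = 52             # 0:52, 4K (UHD) export without Neat Video
uhd_neat   = 16 * 60 + 27   # 16:27, 4K (UHD) export with Neat Video

print(round(hd_neat / hd_resolve, 1))   # ~3.7x Resolve's built-in NR
print(round(uhd_neat / uhd_plain, 1))   # ~19.0x the plain 4K export
```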

To find additional training on more advanced noise reduction techniques in Neat Video, check out the video tutorials. I find myself watching these just because of how much you can learn about noise reduction in general. They aren’t as exciting as watching Game of Thrones or The Handmaid’s Tale, but they will push your knowledge in noise reduction to the next level.

Summing Up
I’ve used Neat Video for a while, so when I was approached to review Version 5, I immediately said yes. Noise reduction is a post skill that not many possess.

If you are an online editor or colorist looking to separate yourself from the pack, learn all the noise reduction techniques you can and definitely check out Neat Video 5. Not only can Neat Video 5 work automatically, but you can fine-tune your noise reduction as much as you want.

And when demoing your color correction services, think about using Neat Video 5 to remove camera noise, flickering and chroma issues; color correcting your footage; and, finally, adding some grain back into your shot. Not only will your footage look better, but you’ll have a technical workflow that will definitely impress clients. Just don’t forget to account for the extra render time.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Oscar-nominated Jojo Rabbit editor Tom Eagles: blending comedy and drama

By Daniel Restuccio

As an editor, Tom Eagles has done it all. He started his career in New Zealand cutting promos before graduating to assistant editor, then editor, on television series such as Secret Agent Men and Spartacus. Eventually he connected with up-and-coming director Taika Waititi and has worked with him on the series What We Do in the Shadows and the critically acclaimed feature Hunt for the Wilderpeople. Their most recent feature collaboration, 20th Century Fox’s Jojo Rabbit, earned Eagles BAFTA and Oscar nominations as well as an ACE Eddie Award win.

Tom Eagles

We recently caught up with him to talk about the unique storytelling style of Taika films, specifically Jojo Rabbit.

(Warning: If you haven’t seen the film yet, there might be some spoilers ahead.)

How did your first conversation with Taika go?
Fairly early on, unprompted, he gave me a list of his top five favorite films. The scope and variety of it was startling, but they were also my top five favorite films. We talked about Stalker, from filmmaker Andrei Tarkovsky, and I was a massive Tarkovsky fan at the time. He also talked about Annie Hall and Badlands.

At that point in time, there weren’t a lot of people doing the type of work that Taika does: that mix of comedy and drama. That was the moment I thought, “I’ve got to work with this guy. I don’t know if I’m going to find anyone else like this in New Zealand.”

How is Jojo different than your previous collaboration on Hunt for the Wilderpeople?
We had a lot more to work with on Jojo. There’s a lot more coverage in a typical scene, while the Wilderpeople was three shots: a master and two singles. With Jojo, we just threw everything at it. Taika’s learned over the years that it’s never a bad thing to have another shot. Same goes for improv. It’s never a bad thing to have a different line. Jojo was a much bigger beast to work on.

Jojo is rooted in a moment in history, which people know well, and they’re used to a certain kind of storytelling around that moment. I think in the Czech Republic, where we shot, they make five World War II movies a year. They had a certain idea of how things should look, and we weren’t doing that. We were doing Taika’s take, so we weren’t doing desaturated, handheld, grim, kitchen sink realism. We were creating this whole other world. I think the challenge was to try and bring people along on that journey.

I saw an early version of the script, and the Hitler character wasn’t in the opening scene. How did that come about?
One of the great things about working with Taika is he always does pick-ups. Normally, it’s something that we figure out that we need during the process of the edit. He rewrote a bunch of different options for the ending of the movie, a few scenes dotted throughout and the opening of the film.

He shot three versions. In one, it was just Jojo on his own, trying to psych himself up. Then there were variations on how much Adolf we would have in the film. What we found when we screened the film up to that point was that people were on board with the film, but it sometimes took them a while to get there … to understand the tone of the film. The moment we put imaginary Adolf in that scene, it was like planting a flag and saying, “This is what this film is. It’s going to be a comedy about Hitler and Nazis, and you’re either with us or you’re walking out, but if you’re with us, you will find out it’s about a lot more than that.”

Some directors sit right at the editor’s elbow, overlooking every cut, and some go away and leave the editor to make a first cut. What was this experience like?
While I’ve experienced both, Taika’s definitely in the latter category. He’s interested in what you have to say and what you might bring to the edit. He also wants to know what people think, so we screen the film a lot. Across the board — it’s not just isolated to me, but anyone he works with — he just wants more ideas.

After the shooting finished, he gave me two weeks. He went and had a break and encouraged me to do what I wanted with the assembly, to cut scenes and to not be too precious about including everything. I did that, but I was still relatively cautious; there were some things I wanted him to see.

We experimented with various structures. We tried an archiving thing for the end of the film. There was a fantasy sequence in which Elsa is talking about the story of the Jews, and we see flights of fancy of what she thinks … a way for her to escape into fantasy. That was an idea of Taika’s. He just left me to it for a couple of weeks, and we looked at it and decided against it in the end. It was a fun process because when he comes back, he’s super fresh. You offer up one idea and he throws five back.

How long was the first cut?
I asked my assistant the other day, and he said it was about two hours and forty minutes, so I guess I have to go with that, which sounds long to me. That might have been the first compile that had all of the scenes in it, and what I showed Taika was probably half an hour shorter. We definitely had a lot to play with.

Do you think there’s going to be a director’s cut?
I think what you see is the director’s cut. There’s not a version of the film that has more stuff in it than we wanted in it. I think it is pretty much the perfect duration. I might have cut a little bit more because I think I just work that way. There were definitely things that we missed, but I wouldn’t put them back in because of what we gained by taking them out.

We didn’t lean that heavily on comedy once we transitioned into drama. The longer you’re away from Jojo and Elsa, that’s when we found that the story would flounder a little bit. It’s interesting because when I initially read the script, I was worried that we would get bored of that room, and that it would feel too much like a stage play. So we added all of this color and widened the world out. We had these scenes where Jojo goes out into the world, but actually the relationship between the two of them — that’s the story. Each scene in that relationship, the kind of gradual progression toward each other, is what’s moving the story forward.

This movie messes with your expectations, in terms of where you think it’s going or even how it’s saying it. How did you go about creating your own rhythms for that style of storytelling?
I was fortunate in that I already had Taika’s other films to lean on, so partly it was just trying to wrestle this genre into his world … into his kind of subgenre of Taika. It’s really just a sensibility a lot of the time. I was aware that I wanted a breathlessness to the pace of things, especially for the first half of the movie in order to match Jojo’s slightly ADD, overexcited character. That slows down a little bit when it needs to and when he’s starting to understand the world around him a little bit more.

Can you talk about the music?
Music also was important. The needle drops. Taika had a bunch of them already. He definitely had The Beatles and Bowie, and it was fleshing out a few more of those. I think I found the Roy Orbison piece. Temp music was also really important. It was quite hard to find stuff. Taika’s brief was: I don’t want it to sound like all the other movies in the genre. As much as we respected Schindler’s List, he didn’t want it to sound like Schindler’s List.

You edited on Avid Media Composer?
We cut on Avid, and it was the first time I really used ScriptSync. I had been wary of it, to be honest. I watch all the dailies through from head to tail and see the performances in context and feel how they affect me. Once that’s done, ScriptSync is great for comparing takes or swapping out a read on a line. Because we had so much improv on this film, we had to go through and enter all of that in manually. Sometimes we’d use PhraseFind to search on a particular word that I’d remembered an actor saying in an ad-lib. It’s a much faster and more efficient way of finding that stuff.

That said, I still periodically go back and watch dailies. As the film starts to solidify, so does what I’m looking for in the dailies, so I’ll always go back and see if there’s anything that I view differently with the new cut in mind.

You mentioned the difference between Wilderpeople and Jojo in terms of coverage. How much more coverage did you have? Were there multiple cameras?
There were two and sometimes three cameras (ARRI Alexa). Some scenes were single camera, but overall there was a lot more material. Some directors get a bit iffy about two cameras, but we just rolled.

If we had the option, we would almost always lean on the A camera, and part of the trick was to try and make it look like his other movies. We wanted the coverage plan to feel simple; it should still feel like a master, a couple of mediums and a couple of singles, all in that very flat framing approach of his. Often, the characters are interacting with each other perpendicular to the camera in these fixed static wides.

Again, one of the things Taika was concerned with was that it should feel like his other movies. Just because we have a dolly, we don’t have to use it every time. We had all of those shots, we had those options, and often it was about paring things back to try and stay in tone.

Does he give you a lot of takes, and does he create different emotional variations within those takes?
We definitely had a lot of takes. And, yes, there would be a great deal of variety of performance, whether it’s him just trying to push an actor and get them to a specific place, or sometimes we just had options.

Was there an average — five takes, 10 takes?
It’s really hard to say. These days everyone just does rolling resets. You look at your bin and you think, “Ah, great, they did five takes, and there are only three set-ups. How long is this going to take me?” But you open it up, and each take is like half an hour long, and they’re reframing on the fly.

With Scarlett Johansson, you do five takes max, probably. But with the kids it would be a lot of rolling resets and sometimes feeding them lines, and just picking up little lines here and there on the fly. Then with the comedians, it was a lot of improv, so it’s hard to quantify takes, but it was a ton of footage.

If you include the archive footage, I think we had 300 to 400 hours. I’m not sure how much of that was our material, but it would’ve been at least 100 hours.

I was impressed by the way you worked the “getting real” scenes: the killing of the rabbit and the hanging scene. How did you conceptualize and integrate those really important moments?
For the hanging scene, I was an advocate for having it as early in the movie as possible. It’s the moment in the film where we’ve had all this comedy and good times [regarding] Nazis, and then it drives home that this film is about Nazis, and this is what Nazis do.

I wanted to keep the rabbit scene fun to a degree because of where it sits in the movie. I know, obviously, it’s quite a freaky scene for a lot of people, but it’s kind of scary in a genre way for me.

Something about those woods always reminds me of Stand by Me. That was the movie that was in my mind, just the idea of those older kids, the bullies, being dicks. Moments like that, and much more so the moment when Jojo finds Elsa, I thought of as a mini horror film within the film. It was really useful to let the scares drive it because we were so much in Jojo’s point of view. It’s taking those genres and injecting a little bit of humor or a little bit of lightness into them to keep them in tone with Taika’s overall sensibility.

I read that you tried to steer clear of the sentimentality. How did you go about doing that?
It’s a question of taste with the performances and things that other people might like. I will often feel I’m feeding the audience, or demanding an emotional response from them. The scene where Jojo finds Rosie is a good example. We shot an option where we see Rosie hanging there. It just felt like too much, like it was bludgeoning people over the head with the horror of the moment. It was enough to see the shoes. Every time we screened the movie, Jojo stands up, we see the shoes and everyone gasps. People have gotten the information that they need.


Dan Restuccio is a writer/director with Realwork Entertainment and part of the Visual Arts faculty at California Lutheran University. He is a former Disney Imagineer. You can reach him at dansweb451@gmail.com.

Editor David Cea joins Chicago’s Optimus  

Chicago-based production and post house Optimus has added editor David Cea to its roster. With 15 years of experience in New York and Chicago, Cea brings a varied portfolio of commercial editing experience to Optimus.

Cea has cut spots for brands such as Bank of America, Chevrolet, Exxon, Jeep, Hallmark, McDonald’s, Microsoft and Target. He has partnered with many agencies, including BBDO, Commonwealth, DDB, Digitas, Hill Holliday, Leo Burnett, Mother and Saatchi & Saatchi.

“I grew up watching movies with my dad and knew I wanted to be a part of that magical process in some way,” explains Cea. “The combination of Goodfellas and Monty Python gave me all the fuel I needed to start my film journey. It wasn’t until I took an editing class in college that I discovered the part of filmmaking I wanted to pursue. The editor is the one who gets to shape the final product and bring out the true soul of the footage.”

After studying film at Long Island’s Hofstra University, Cea met Optimus editor and partner Angelo Valencia while working as his assistant at Whitehouse New York in 2005. Cea then moved on to hone his craft further at Cosmo Street in New York. Chicago became home for him in 2013 as he spent three years at Whitehouse. After heading back east for a couple of years, he returned to Chicago to put down roots.

While Avid Media Composer is Cea’s go-to choice for editing, he is also proficient in Adobe Premiere.

FXhome’s HitFilm Express 14, ‘Pay What You Want’ option

FXhome has a new “Pay What You Want” goodwill program inspired by the HitFilm Express community’s requests to be able to help pay for development of the historically free video editing and VFX software. Pay What You Want gives users the option to contribute financially, ensuring that those funds will be allocated to future development and improvements to HitFilm.

Additionally, FXhome will contribute a percentage of the proceeds of Pay What You Want to organizations dedicated to global causes important to the company and its community. At its launch, the FXhome Pay What You Want initiative will donate a portion of its proceeds to the WWF and the Australia Emergency Bushfire Fund. The larger the contribution from customers, the more FXhome will donate.

HitFilm Express remains a free download; however, first-time customers will now have the option to “Pay What You Want” for the software. They’ll also receive some exclusive discounts on HitFilm add-on packs and effects.

Coinciding with the release of Pay What You Want, FXhome is releasing HitFilm Express 14, the first version of HitFilm Express to be eligible for the Pay What You Want initiative. HitFilm Express 14 features a new and simplified export process, new text controls, a streamlined UI and a host of new features.

For new customers who would like to download HitFilm Express 14 and also contribute to the Pay What You Want program, there are three options available:

• Starter Pack Level: With a contribution of as little as $9, new HitFilm Express 14 customers will also receive a free Starter Pack of software and effects that includes:
o Professional dark mode interface
o Edit tools including Text, Split Screen Masking, PiP, Vertical Video, Action Cam Crop
o Color tools including Exposure, Vibrance, Shadows and Highlights, Custom Gray, Color Phase, Channel Mixer and 16-bit color
o Additional VFX packs including Shatter, 3D Extrusion, Fire, Blood Spray and Animated Lasers
• Content Creator Level: With contributions of $19 or more, users will receive everything included in the Starter Pack, as well as:
o Edit: Repair Pack with Denoise, Grain Removal and Rolling Shutter
o Color: LUT Pack with LUTs and Grading Transfer
o Edit: Beautify Pack with Bilateral Blur and Pro Skin Retouch
• VFX Artist Level: Users who contribute from $39 to $99 get everything in the Starter Pack and Content Creator levels plus:
o Composite Toolkit Pack with Wire Removal, Projector, Clone and Channel Swapper
o Composite Pro-Keying Pack for Chroma Keying
o Motion Audio Visual Pack with Atomic Particles, Audio Spectrum and Audio Waveform
o VFX Neon Lights Pack with Lightsword Ultra (2-Point Auto), Lightsword Ultra (4-Point Manual), Lightsword Ultra (Glow Only) and Neon Path
o VFX Lighting Pack with Anamorphic Lens Flares, Gleam, Flicker and Auto Volumetrics

What’s new in HitFilm Express 14
HitFilm Express 14 adds a number of VFX workflow enhancements to enable even more sophisticated effects for content creators, including a simplified export workflow that allows users to export content directly from the timeline and comps, new text controls, a streamlined UI and a host of new features. Updates include:

• Video Textures for 3D Models: Creators who already have the 3D: Model Render Pack can now use a video layer as a texture on a 3D model to add animated bullet holes, cracked glass or changing textures.
• Improvements to the Export Process: In HitFilm Express 14, the Export Queue is now an Export Panel and is much easier to use. Exporting can also now be done from the timeline and from comps. These “in-context” exports will export the content between the In and Out points if set, or the entire timeline, using the current default preset (which can be changed from the menu).
• Additional Text Controls: Customizing text in HitFilm Express 14 is now even simpler, with Text panel options for All Caps, Small Caps, Subscript and Superscript. Users can also change the character spacing, horizontal or vertical scale, as well as baseline shift (for that Stranger-Things-style titling).
• Usability and Workflow Enhancements: In addition to the new and improved export process, FXhome has also implemented new changes to the interface to further simplify the entire post production process, including a new “composite button” in the media panel, double-click and keyboard shortcuts. A new Masking feature adds new automation to the workflow; when users double-click the Rectangle or Ellipse tools, a centered mask is automatically placed to fill the center of the screen. Masks are also automatically assigned colors, which can be changed to more easily identify different masks.
• Effects: Users can now double-click the effects panel to apply to the selected layer and drop 2D effects directly onto layers in the viewer. Some effects — such as the Chroma Key and Light Flares — can be dropped on a specific point, or users can select a specific color to key by. Users can also now favorite “effects” for quick and easy access to their five most recently used effects from the ‘Effects’ menu in the toolbar.
• Additional Improvements: Users can now use Behavior effects from the editor timeline, click-drag across multiple layers to toggle “solo,” “locked” or “visibility” settings in one action, and access templates directly from the media panel with the new Templates button. Menus have also been added to the tab of each panel to make customization of the interface easier.
• Open Imerge Pro files in HitFilm: Imerge Pro files can now be opened directly from HitFilm as image assets. Any changes made in the Imerge Pro project will be automatically updated with any save, making it easier to change image assets in real time.
• Introducing Light Mode: The HitFilm Express interface is now available in Light Mode and will open in Light Mode the first time you open the software. Users with a pre-existing HitFilm Express license can easily change back to the dark theme if desired.

HitFilm Express 14 is available immediately and is a free download. Customers downloading HitFilm Express 14 for the first time are eligible to participate in the new Pay What You Want initiative. Free effects and software packs offered in conjunction with Pay What You Want are only available at initial download of HitFilm Express 14.

Nomad Editorial hires eclectic editor Dan Maloney

Nomad Editing Company has added editor Dan Maloney to its team. Maloney is best known for his work cutting wry, eclectic comedy spots in addition to more emotional content. While his main tool is Avid Media Composer, he is also well-versed in Adobe Premiere.

“I love that I get to work in so many different styles and genres. It keeps it all interesting,” he says.

Prior to joining Nomad, Maloney cut at studios such as Whitehouse Post, Cut+Run, Spot Welders and Deluxe’s Beast. Throughout his career, Maloney has used his eye for composition on a wide range of films, documentaries, branded content and commercials, including the Tide Interview spot that debuted at Super Bowl XLII.

“My editing style revolves mostly around performance and capturing that key moment,” he says. “Whether I’m doing a comedic or dramatic piece, I try to find that instance where an actor feels ‘locked in’ and expand the narrative out from there.”

According to Nomad editor/partner Jim Ulbrich, “Editing is all about timing and pace. It’s a craft and you can see Dan’s craftsmanship in every frame of his work. Each beat is carefully constructed to perfection across multiple mediums and genres. He’s not simply a comedy editor, visual storyteller, or doc specialist. He’s a skilled craftsman.”

Adobe Premiere Productions: film projects, collaborative workflows

By Mike McCarthy

Adobe has announced a new set of features coming to its NLE Premiere Pro: it now supports “Productions” within Premiere, which allows easier management of sets of projects shared between different users. The announcement, which came during the Sundance Film Festival, is targeted at filmmakers working on large-scale projects with teams of people collaborating on site.

Productions extends and refines Premiere’s existing “Shared Project” model, making it easier to manage work spread across a large number of individual projects, which can become unwieldy with the current implementation.

Shared Projects should not be confused with Team Projects, an online project-sharing toolset for teams in different locations that each have their own local media, or with Adobe Anywhere, a cloud-based streaming editing platform with no local files. Shared Projects are used between users on a local network, usually with high-quality media, with simple mechanisms for passing work between different users. Shared Projects were introduced in Premiere Pro 2018 and included three components. Here, I’m going to tell you what the issues were and how the new Adobe Productions solves them:

1) The ability to add a shortcut to another project into the project panel, which was next to useless. The projects were in no other way connected with each other, and incrementing the target project to a new name (V02) broke the link. The only benefit was to see who might have the linked project open and locked, which brings us to:

2) The ability to lock projects that were open on one system, preventing other users from inadvertently editing them at the same time and overwriting each other’s work, which should have been added a long time ago. This was previously managed through a process called “shout down the hall” before opening projects.

3) And most significantly, the ability to open more than one project at the same time. The previous approach was to import other projects into your existing project, but this resulted in massive project files that took forever to load, among other issues. Opening more than one project at once allowed projects to be broken into smaller individual parts, so different people could more easily work on different parts at the same time.

For the last two-plus years, large films have been able to break down their work into many smaller projects and distribute those projects among numerous users working on various parts. And those users can pass the pieces back and forth without concern for overwriting each other’s work. But there was no central way to control all of those projects, and the master project/Shared Project shortcut system required you either to not version your projects (bad file management) or to re-link every project version to the master project (tedious).

You also end up with lots of copies of your media, because every time an asset is used in a different project, a new copy is made in that project. If you update or edit an asset in one project, it won’t change the copies used in other projects (master clip effects, relinking, reinterpreting footage, proxies, etc.).

Problems Solved
Premiere’s new Production Panel and tool set are designed to solve those problems. First, it gives you a panel to navigate and explore all of the projects within your entire production, however you structure them within your master project folder. You can see who has what open and when.

When you copy an asset into a sequence from another project, it maintains a reference to the source project, so subsequent changes to that asset (color correction, attaching full-res media, etc.) can propagate to the instance in the other project’s sequence, provided both projects are open concurrently so they can sync.

If the source project can’t be found, the child instance is still a freestanding piece of media that fully functions; it just no longer receives synchronized updates from the master copy. (So you don’t have a huge web of interconnected projects that will all fail if one of them is corrupted or deleted.)
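
Conceptually, each copied clip behaves like a reference with a local fallback. The little Python model below is purely illustrative (it is not Adobe’s implementation, and all names are invented), but it captures the behavior described above:

```python
from dataclasses import dataclass
from typing import Optional, Set

@dataclass
class ClipInstance:
    """Toy model of a Production clip copied into another project's sequence."""
    cached_media: str              # freestanding local copy that always works
    source_project: Optional[str]  # link back to the master copy, if known

    def effective_media(self, open_projects: Set[str]) -> str:
        # If the source project is open/reachable, sync from the master copy;
        # otherwise fall back to the cached, fully functional local instance.
        if self.source_project in open_projects:
            return f"synced from {self.source_project}"
        return f"cached copy: {self.cached_media}"

clip = ClipInstance("sc12_t3.mov", "Dailies_Day04.prproj")
print(clip.effective_media({"Dailies_Day04.prproj"}))  # receives updates
print(clip.effective_media(set()))                     # freestanding fallback
```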

All projects in a Production have the same project settings (scratch disks, GPU renderer, etc.), keeping them in sync and allowing you to update those settings across the production and share render files between users. And all files are stored on your local network for maximum performance and security.

In practice, this allows all of the source media to live in dedicated “dailies” projects, possibly a separate project for every day of filming. Then each scene or reel can be its own project, with every instance in the sequences referencing back to a master file in the dailies. Different editors and assistants can be editing different scenes, and all of them can have any source project open concurrently in read-only mode without conflict. As soon as someone saves changes, an icon will alert users that they can update the copy they have open and unlock it to continue working.
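
Because every project in the Production tree is just a file on shared storage, you can even audit the structure from outside Premiere. Here is a minimal Python sketch under two assumptions: the folder layout and names are hypothetical, and Premiere writes a .prlock sidecar next to a .prproj while someone has it open (the convention the lock indicators reflect):

```python
import os

# Hypothetical Production root on the shared edit storage.
PRODUCTION_ROOT = "/Volumes/EditStorage/MyFilm_Production"

def audit_production(root):
    """Walk the Production tree, listing each Premiere project and whether
    a .prlock sidecar (present while someone has it open) sits next to it."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if not name.endswith(".prproj"):
                continue
            lock = os.path.splitext(name)[0] + ".prlock"
            status = "OPEN/LOCKED" if lock in filenames else "available"
            print(f"{status:12} {os.path.join(dirpath, name)}")

audit_production(PRODUCTION_ROOT)
```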

Some Limitations
Moving a sequence from one project to another doesn’t retain a link to the original because that could become a mess quickly. But it would be nice to be able to make edits to “reels” and have those changes reflected in a long-play project that strings those reels together. And with so many projects open at once, it can become difficult to keep track of what sequences go with what project panels.

Ideally, a color-coded panel system would help with that, either with random colors for contrast or with user-assigned colors by type of project. In that case it would still be good to highlight what other panels are associated with the selected panel, since two projects might be assigned the same color.

Summing Up
Regardless of those potential changes, I have been using Shared Projects to its fullest potential on a feature film throughout 2019, and I look forward to the improvements that the new Production panel will bring to my future workflows.


Mike McCarthy is an online editor/workflow consultant with over 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

Quick Chat: Director Sorrel Brae on Rocket Mortgage campaign

By Randi Altman

Production company Native Content and director Sorrel Brae have collaborated once again with Rocket Mortgage’s in-house creative team on two new spots in the ongoing “More Than a House” campaign. Brae and Native had worked together on the campaign’s first four offerings.

The most recent spots are More Than a Tradition and More Than a Bear. More Than a Tradition shows a ‘50s family sitting down to dinner and having a fun time at home. Then the audience sees the same family in modern times, hammering home how traditions become traditions.

More Than a Bear combines fantasy and reality as it shows a human-sized teddy bear on an operating table. Then viewers see a worried boy looking on as his mother repairs his stuffed animal. Each spot opens with the notes of Bob Dylan’s “The Man in Me,” which is featured in all the “More Than a House” spots.

More Than a Bear was challenging, according to Brae, because there was some darker material in this piece as compared to the others — viewers aren’t sure at first if the bear will make it. Brae worked closely with DP Jeff Kim on the lighting and color palette to find a way to keep the tone lighthearted. By embracing primary colors, the two were able to channel a moodier tone and bring viewers inside a scared child’s imagination while still maintaining some playfulness.

We reached out to director Brae to find out more.

Sorrel Brae

What did you shoot these two spots on, and why?
I felt that in order for the comedy to land and the idea to shine, the visual separation between fantasy and reality had to be immediate, even shocking. Shooting on an Alexa Mini, we used different lenses for the two looks: Hawk V-Lite Vintage ’74 anamorphic for epic and cinematic fantasy, and spherical Zeiss and Cooke S4 primes for reality. The notable exception was in the hospital for the teddy bear spot, where our references were the great Spielberg and Zemeckis films from the ‘80s, which are primarily spherical and have a warmer, friendlier feeling.

How did you work with the DP and the colorist on the look? And how would you describe the look of each spot, and the looks within each spot? 
I was fortunate to bring on longtime collaborators DP Jeffrey Kim and colorist Mike Howell for both spots. Over the years, Jeff and I have developed a shorthand for working together. It all starts with defining our intention and deciding how to give the audience the feelings we want them to have.

In Tradition, for example, that feeling is a warm nostalgia for a bygone era that was probably a fantasy then, just as it is now. We looked to period print advertisements, photographs, color schemes, fonts — everything that spoke to that period. Crucial to pulling off both looks in one day was Heidi Adams’ production design. I wanted the architecture of the house to match when cutting between time periods. Her team had to put a contemporary skin on a 1950s interior for us to shoot “reality” and then quickly reset the entire house back to 1950s to shoot “fantasy.”

The intention for More Than a Bear was trickier. From the beginning I worried a cinematic treatment of a traumatic hospital scene wouldn’t match the tone of the campaign. My solution with Jeff was to lean into the look of ‘80s fantasy films like E.T. and Back to the Future with primary colors, gelled lights, a continuously moving camera and tons of atmosphere.

Mike at Color Collective even added a retro Ektachrome film emulation for the hospital and a discontinued Kodak 5287 emulation for the bedroom to complete the look. But the most fun was the custom bear that costume designer Bex Crofton-Atkins created for the scene. My only regret is that the spot isn’t 60 seconds because there’s so much great bear footage that we couldn’t fit into the cut.

What was this edited on? Did you work with the same team on both campaigns?
The first four spots of this campaign were cut by Jai Shukla out of Nomad Edit. Jai did great work establishing the rhythm between fantasy and reality and figuring out how to weave in Bob Dylan’s memorable track for the strongest impact. I’m pretty sure Jai cuts on Avid, which I like to tease him about.

These most recent two spots (Tradition and Teddy Bear) were cut by Zach DuFresne out of Hudson Edit, who did an excellent job navigating scripts with slightly different challenges. Teddy Bear has more character story than any of the others, and Tradition relies heavily on making the right match between time periods. Zach cuts on Premiere, which I’ve also migrated to (from FCP 7) for personal use.

Were any scenes more challenging than the others?
What could be difficult about kids, complex set design, elaborate wardrobe changes and detailed camera moves on a compressed schedule? In truth, it was all equally challenging and rewarding.

Ironically, the shots that gave us the most difficulty probably look the simplest. In Tradition there’s a Steadicam move that introduces us to the contemporary world, has match cuts on either end and travels through most of the set and across most of the cast. Because everyone’s movements had to perfectly align with a non-repeatable camera, that one took longer than expected.

And on Teddy Bear, the simple shot looking up from the patient’s POV as the doctor/mom looms overhead was surprisingly difficult. Because we were on an extremely wide lens (12mm or similar), our actress had to nail her marks down to the millimeter, otherwise it looked weird. We probably shot that one setup 20 times.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Talking with Franki Ashiruka of Nairobi’s Africa Post Office

By Randi Altman

After two decades of editing award-winning film and television projects for media companies throughout Kenya and around the world, Franki Ashiruka opened Africa Post Office, a standalone post house in Nairobi, Kenya. The studio provides color grading, animation, visual effects, motion graphics, compositing and more. In addition, it maintains a database of the Kenyan post production community that allows it to ramp up with the right artists when the need arises.

Here she talks about the company, its workflow and being a pioneer in Nairobi’s production industry.

When did you open Africa Post Office, and what was your background prior to starting this studio?
Africa Post Office (APO) opened its doors in February 2017. Prior to starting APO, I was a freelance editor with plenty of experience working with well-established media houses such as Channel 4 (UK), Fox International Channels (UK), 3D Global Leadership (Nigeria), PBS (USA), Touchdown (New Zealand), Greenstone Pictures (New Zealand) and Shadow Films (South Africa).

In terms of Kenya-based projects, I’ve worked with a number of production houses including Quite Bright Films, Fat Rain Films, Film Crew in Africa, Mojo Productions, Multichoice, Zuku, Content House and Ginger Ink Films.

I imagine female-run, independent studios in Africa are rare?
On the contrary, Kenya has reached a point where more and more women are emerging as leaders of their own companies. I actually think there are more women-led film production companies than male-led. The real challenge was that before APO, there was nothing quite like it in Nairobi. Historically, video production here was very vertical — if you shot something, you’d need to also manage post within whatever production house you were working in. There were no standalone post houses until us. That said, with my experience, even though hugely daunting, I never thought twice about starting APO. It is what I have always wanted to do, and if being the first company of our kind didn’t intimidate me, being female was never going to be a hindrance.

L-R: Franki Ashiruka, Kevin Kyalo, Carole Kinyua and Evans Wenani

What is the production and post industry like in Nairobi? 
When APO first opened, the workload was commercial-heavy, but in the last two years that has steadily declined. We’re seeing this gap filled by documentary films, corporate work and television series. Feature films are also slowly gaining traction and becoming the focus of many up-and-coming filmmakers.

What services do you provide, and what types of projects do you work on?
APO has a proven track record of successful delivery on hundreds of film and video projects for a diverse range of clients and collaborators, including major corporate entities, NGOs, advertising and PR agencies, and television stations. We also have plenty of experience mastering according to international delivery standards. We’re proud to house a complete end-to-end post ecosystem of offline and online editing suites.

Most importantly, we maintain a very thorough database of the post production community in Kenya. This is of great benefit to our clients, who come to us for a range of services including color grading, animation, visual effects, motion graphics and compositing. We are always excited to collaborate with the right people and get additional perspectives on the job at hand. One of our most notable collaborators is Ikweta Arts (Avatar, Black Panther, Game of Thrones, Hacksaw Ridge), owned and run by Yvonne Muinde. They specialize in providing VFX services with a focus on quality matte painting/digital environments, art direction, concept and post visual development art. We also collaborate with Keyframe (L’Oréal, BMW and Mitsubishi Malaysia) for motion graphics and animation.

Can you name some recent projects and the work you provided?
We are incredibly fortunate to be able to select projects that align with our beliefs and passions.

Our work on the short film Poacher (directed by Tom Whitworth) won us three global Best Editing Awards from the Short to the Point Online Film Festival (Romania, 2018), Feel the Reel International Film Festival (Glasgow, 2018) and Five Continents International Film Festival (Venezuela, 2019).

Other notable work includes three feature documentaries for the Big Story segment on China Global Television Network, directed by Juan Reina (director of the Netflix original film Diving Into the Unknown); Lion’s Den (Quite Bright Films), an adaptation of ABC’s Shark Tank; and The Great Kenyan Bake Off (Showstopper Media), adapted from the BBC series The Great British Bake Off. We also worked on Disconnect, a feature film produced by Kenya’s Tosh Gitonga (Nairobi Half Life), a director who is passionate about taking Africa’s budding film industry to the next level. We have also worked on a host of television commercials for clients across East Africa, including Kenya, Rwanda, South Sudan and Uganda.

What APO is most proud of though, is our clients’ ambitions and determination to contribute toward the growth of the African film industry. This truly resonates with APO’s mantra.

You recently added a MAM and some other gear. Can you talk about the need to upgrade?
Bringing on the EditShare EFS 200 nodes has significantly improved the collaborative possibilities of APO. We reached a point where we were quickly growing, and the old approach just wasn’t going to cut it.

Prior to centralizing our content, projects lived on individual hard disks. This meant that if I was editing and needed my assistant to find me a scene or a clip, or I needed VFX on something, I would have to export individual clips to different workstations. This created workflow redundancies and increased potential for versioning issues, which is something we couldn’t afford to be weighed down with.

The remote capabilities of the EditShare system were very appealing as well. Our color grading collaborator, Nic Apostoli of Comfort and Fame, is based in Cape Town, South Africa. From there, he can access the footage on the server and grade it while the client reviews with us in Nairobi. Flow media asset management also helps in this regard. We’re able to effectively organize and index clips, graphics, versions, etc. into clearly marked folders so there is no confusion about what media should be used. Collaboration among the team members is now seamless regardless of their physical location or tools used, which include the Adobe Creative Suite, Foundry Nuke, Autodesk Maya and Maxon Cinema 4D.

Any advice for others looking to break out on their own and start a post house?
Know what you want to do, and just do it! Thanks Nike …


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

ACE Eddie Awards: Parasite and Jojo Rabbit among winners

By Dayna McCallum

The 70th Annual ACE Eddie Awards concluded with wins for Parasite (edited by Jinmo Yang) for Best Edited Feature Film (Dramatic) and Jojo Rabbit (edited by Tom Eagles) for Best Edited Feature Film (Comedy). Yang’s win marks the first time in ACE Eddie Awards history that a foreign language film won the top prize.

The winner of the Best Edited Feature Film (Dramatic) category has gone on to win the Oscar for film editing in 11 of the last 15 years. In other feature categories, Toy Story 4 (edited by Axel Geddes, ACE) won Best Edited Animated Feature Film and Apollo 11 (edited by Todd Douglas Miller) won Best Edited Documentary.

For the second year in a row, Killing Eve won for Best Edited Drama Series (Commercial Television) for “Desperate Times” (edited by Dan Crinnion). Tim Porter, ACE, took home his second Eddie for Game of Thrones “The Long Night” in the Best Edited Drama Series (Non-Commercial Television) category, and Chernobyl “Vichnaya Pamyat” (edited by Jinx Godfrey and Simon Smith) won Best Edited Miniseries or Motion Picture for Television.

Other television winners included Better Things “Easter” (edited by Janet Weinberg) for Best Edited Comedy Series (Commercial Television) and last year’s Eddie winner for Killing Eve, Gary Dollner, ACE, for Fleabag “Episode 2.1” in the Best Edited Comedy Series (Non-Commercial Television) category.

Lauren Shuler Donner received the ACE’s Golden Eddie honor, presented to her by Marvel’s Kevin Feige. In her heartfelt acceptance speech, she noted to an appreciative crowd, “I’ve witnessed many times an editor make chicken salad out of chicken shit.”

Alan Heim and Tina Hirsch received Career Achievement awards, presented by filmmakers Nick Cassavetes and Ron Underwood, respectively. Cathy Repola, national executive director of the Motion Picture Editors Guild, was presented with the ACE Heritage Award. American Cinema Editors president Stephen Rivkin, ACE, presided over the evening’s festivities for the final time, as his second term is ending. Actress D’Arcy Carden, star of NBC’s The Good Place, served as the evening’s host.

Here is the complete list of winners:

BEST EDITED FEATURE FILM (DRAMA):
Parasite 
Jinmo Yang

BEST EDITED FEATURE FILM (COMEDY):
Jojo Rabbit
Tom Eagles

BEST EDITED ANIMATED FEATURE FILM:
Toy Story 4
Axel Geddes, ACE

BEST EDITED DOCUMENTARY (FEATURE):
Apollo 11
Todd Douglas Miller

BEST EDITED DOCUMENTARY (NON-THEATRICAL):
What’s My Name: Muhammad Ali
Jake Pushinsky, ACE

BEST EDITED COMEDY SERIES FOR COMMERCIAL TELEVISION:
Better Things: “Easter”
Janet Weinberg, ACE

BEST EDITED COMEDY SERIES FOR NON-COMMERCIAL TELEVISION:
Fleabag: “Episode 2.1”
Gary Dollner, ACE

BEST EDITED DRAMA SERIES FOR COMMERCIAL TELEVISION: 
Killing Eve: “Desperate Times”
Dan Crinnion

BEST EDITED DRAMA SERIES FOR NON-COMMERCIAL TELEVISION:
Game of Thrones: “The Long Night”
Tim Porter, ACE

BEST EDITED MINISERIES OR MOTION PICTURE FOR TELEVISION:
Chernobyl: “Vichnaya Pamyat”
Jinx Godfrey & Simon Smith

BEST EDITED NON-SCRIPTED SERIES:
VICE Investigates: “Amazon on Fire”
Cameron Dennis, Kelly Kendrick, Joe Matoske, Ryo Ikegami

ANNE V. COATES AWARD FOR STUDENT EDITING:
Chase Johnson – California State University, Fullerton


Main Image: Parasite editor Jinmo Yang

Directing bookend sequences for Portals, a horror anthology film

By Hasraf “HaZ” Dulull

Portals is a genre-bending feature film anthology focusing on a series of worldwide blackouts — after which millions of mysterious objects appear everywhere across the planet. While many flee from the sentient objects, some people are drawn toward and into them with horrifying consequences.

Portals

The film was in the final stages of post when writer/director Liam O’Donnell (Beyond Skyline and the upcoming Skylines film) called to see if I would like to get involved and direct some bookend sequences to add more scope and setup, which the producers felt was very much needed. I loved the premise and the world of the anthology, so I said yes. I pitched an idea for an ending that quickly evolved into an extra segment at the end of the film, which I directed. That’s why there are officially four directors on the show, with me getting executive producer and “end-segment created by” credits.

Two of the other sequences are around 20 to 25 minutes each, and O’Donnell’s sequence was around 35 minutes. The film is 85 minutes long. Eduardo Sanchez and Gregg Hale (The Blair Witch Project) co-directed their segments. So the anthology feature film is really three long segments with my bookend sequences. The only connections among all the stories are the objects that appear, the event itself and the actual “portal,” but everything else was unique to each segment’s story. In terms of production, the only consistencies throughout the anthology were the camera language — that slight hand-held feel — and, of course, the music/sound.

I had to watch the latest cut of the entire anthology film to get my head into that world, but I was given freedom to bring my own style to my sequences. That is exactly the point of an anthology — for each director to bring his or her own sensibilities to the individual segments. Besides Liam, the main producers I worked closely with on this project were Alyssa Devine and Griffin Devine from Pigrat Productions. They are fans of my first feature film, The Beyond, so they really encouraged the grounded tone I had demonstrated in that film.

The portal in Portals.

I’ve been a huge advocate of Blackmagic cameras and technology for a long time. Additionally, I knew I had a lot to shoot in a very short time (two days!), so I needed a camera that was light and flexible yet able to shoot 4K. I brought on cinematographer Colin Emerson, who shoots in a very loose way but always makes his stuff look cinematic. We watched the cut of the film and noticed the consistently loose nature of the cinematography on all the segments. Colin uses the Fig Rig a lot, and I love the way that rig works; the Blackmagic Pocket Cinema Camera 4K fits nicely on it, along with the DSLR lenses he likes to use. The other reason was to be able to use Blackmagic’s new BRaw format too.

We also shot the segment using a skeleton crew, which comprised myself as director/producer; VFX supervisor/1st AD John Sellings, who also did some focus pulling; James De Taranto (sound recording); DP/camera op Colin Emerson; FX makeup artists Kate Griffiths and Jay James; and our two actors, Georgina Blackledge and Dare Emmanuel. I worked with both of them on my feature film The Beyond.

The Post
One thing I wanted to make sure of was that the post team at The Institution in LA would be able to take my Resolve files and literally work from them for the picture post. One of the things I did during prep (before we even cast) was shoot some tests to show what I had in mind in terms of look and feel. We also tested the BRaw and color workflow between my setup in London and the LA team; Colin and I did this during the location recce. This proved extremely useful in making sure we set our camera to the exact specs the post house wanted. So we shot at 23.98fps in 4K (4096×1716, a 2.39:1 crop, since 4096 ÷ 1716 ≈ 2.39) in Blackmagic Design log color space.

HaZ’s segments were captured with the Blackmagic Pocket Cinema Camera.

During the test, I also did some quick color passes to show the producers in LA the tone and mood I was going for, and to make sure everyone was on board before I shot. The look was very post-apocalyptic, as the story is set after the main events have happened. I wanted the locations to contrast with each other: one interior and one exterior with greens.

Colin is used to shooting most of his stuff on the Panasonic GH, but he had the Pocket Cinema Camera and was looking for the right project to use it on. He found he could use all of his usual lenses because the Pocket Cinema Camera has the same mount. Lenses used were the Sigma 18-35mm f/1.8 with a Metabones Speed Booster, the Olympus 12mm f/2 and the Lumix 35-100mm f/2.8.

Colin used the onboard monitor screen on the Pocket Cinema Camera, while I used a tethered external monitor — the Ikan DH5e — for directing. We used a 1TB Samsung external SSD securely attached to the rig cage along with a 64GB CFast card. The resolution we shot in was determined by the tests we did. We set up the rushes for post after each of the two days of the shoot, so during the day we would swap out drives and back things up. At the end of the day, we would bring in all the picture and sound rushes and use the amazing autosync feature in Blackmagic DaVinci Resolve to set it all up. This way, when I headed back home I could start editing right away inside Resolve.
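
Resolve’s built-in Python scripting API can automate this kind of end-of-day setup. Below is a minimal sketch of that API with hypothetical paths and bin names; the timecode-based audio sync itself still happens via Resolve’s auto-sync in the UI:

```python
# Minimal sketch using DaVinci Resolve's Python scripting API.
# Run from Resolve's console (Workspace > Console) or an external Python
# with the scripting module on its path. Paths and names are examples.
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()
media_pool = project.GetMediaPool()
storage = resolve.GetMediaStorage()

# One bin per shoot day, matching how the rushes were backed up.
day_bin = media_pool.AddSubFolder(media_pool.GetRootFolder(), "Day 01 Rushes")
media_pool.SetCurrentFolder(day_bin)

# Import the day's picture and sound rushes in one go, ready for
# Resolve's timecode-based auto-sync in the media pool.
clips = storage.AddItemListToMediaPool("/Volumes/RUSHES/Day01") or []
print(f"Imported {len(clips)} clips into 'Day 01 Rushes'")
```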

Resolve

I have to admit, we were hesitant at first because I was shooting and capturing log in QuickTime ProRes 4444, and I had always avoided DNG raw because of the huge file sizes and data transfer. But the team at Blackmagic has always been so supportive, providing us with support right up until the end of the shoot, and after testing BRaw I was impressed. We had so much control, as all that information is accessible within Resolve. I was able to set the temp look during editing, and the colorist worked from there. Skin tones were of the utmost importance; because of the intimate nature of the drama, I wanted a natural look to the skin tones. I am really happy with the way they came out in the end.

The post team couldn’t believe how cinematic the footage looked when we told them we’d shot it on the Pocket Cinema Camera, since the other segments were shot on cameras like Red. We delivered to the same 4K deliverables spec as the other segments in the film.

HaZ on set, second from right.

I used the AMD Radeon RX Vega 56 version of the Blackmagic eGPU because I wanted to edit on my MacBook Pro (late 2017) and needed the power to run 4K in realtime. I was so impressed with how much power it provided; it was like having a new MacBook Pro without having to buy one. The eGPU also gave me all the connectivity I needed (two Thunderbolt and four USB 3 ports), which is a limitation of the MacBook Pro on its own.

The beauty of keeping everything native was that there wasn’t much work to do when porting; it’s just plug and play, and Resolve detects the eGPU, which you can then set as the default. The BRaw format makes it all manageable to preview and play back in realtime. Also, since it’s native, Resolve doesn’t need to do any transcoding in the background. I have always been a huge fan of the tracking in Resolve, and I was able to do eye effects very easily without them being budgeted or done as VFX shots. I was able to get the VFX render assets from the visual effects artist (Justin Martinez) in LA and do quick slap comps during editing. I love that I can set looks and store them as memories, which I can then recall very quickly to apply to a bunch of shots. This allowed me to have a slick-looking preview rough cut of the film.

Portals

I sent a hard drive containing all the organized rushes to the team in LA while I was doing the final tweaks to the edit. Once the edit was signed off, or if any last-minute notes came in, I would address them and email the team my Resolve file. It was super simple, and the colorist (Oliver Ojeil) and post team (Chad Van Horn and Danny Barone) in LA appreciated the simple workflow, because there really wasn’t any conforming for them to do apart from a one-click relink of media locations; they would just take my Resolve file and start working away with it.

We used practical effects to keep the horror as real and grounded as possible, and used VFX to augment them further. We were fortunate to get special effects makeup artist Kate Griffiths. Given the tight schedule, she was able to create a terrifying effect, which I won’t give away; you need to watch the film to see it! We had to shoot those makeup-FX-heavy shots at the end of the day, which meant we had to be smart about how we scheduled the shoot given the hours-long makeup process. Kate was also on hand to provide effects like the liquid coming out of the eyes, sweat and so on — every detail of which the camera picked up for us so we could bring it out in the grade.

The Skype-style shots at the start of the film (phone and computer monitor shots) had their VFX screen elements placed as a separate layer so the post team in LA could grade them separately and control the filters applied on them. For some of the wide shots showing our characters entering and leaving the portal, we keyframed some movement of the 4K shot along with motion blur to give the effect of in-camera movement. I also used the camera shake within Resolve, which comes with so many options to create bespoke movement on static frames.

Portals is now available on iTunes and other VOD platforms.


HaZ Dulull is known for his sci-fi feature films The Beyond and 2036 Origin Unknown, also in television for his pilot and episodes on Disney’s Fast Layne. He is currently busy on projects at various stages of development and production at his production company, hazfilm.com.

Review: HP’s ZBook G6 mobile workstation

By Brady Betzel

In a year that’s seen AMD reveal an affordable 64-core processor with its Threadripper 3, it appears as though we are picking up steam toward next-level computing.

Apple finally released its much-anticipated Mac Pro (which comes with a hefty price tag for the 1.5TB upgrade), and custom-build workstation companies — like Boxx and Puget Systems — can customize good-looking systems to fit any need you can imagine. Additionally, over the past few months, I have seen mobile workstations leveling the playing field with their desktop counterparts.

HP is well-known in the M&E community for its powerhouse workstations. Since I started my career, I have worked on either a Mac Pro or an HP, and both have their strong points. However, for workstation users who must be able to travel with their systems, there have always been some technical abilities you had to give up in exchange for a smaller footprint. That is, until now.

The newly released HP ZBook 15 G6 is the rising tide that will float all the boats in the mobile workstation market. I know I’ve said it before, but the classification of “workstation” is technically much more than a term companies throw around. Systems with workstation-level classification (at least from HP) are meant to be powered on and run at high levels 24 hours a day, seven days a week, 365 days a year.

They are built with high-quality, enterprise-level components, such as ECC (error correcting code) memory. ECC memory will self-correct errors that it sees, preventing things like blue screens of death and other screen freezes. ECC memory comes at a cost, and that is why these workstations are priced a little higher than a standard computer system. In addition, the warranties are a little more inclusive — the HP ZBook 15 G6 comes with a standard three-year/on-site service warranty.

Beyond the “workstation” classification, the ZBook 15 G6 is amazingly powerful, brutally strong and incredibly colorful and bright. But what really matters is under the hood. I was sent an HP ZBook 15 G6 that retails for $4,096 and contains the following specs:
– Intel Xeon E-2286M (eight cores/16 threads — 2.4GHz base/5GHz Turbo)
– Nvidia Quadro RTX 3000 (6GB VRAM)
– 15.6-inch UHD HP DreamColor display, anti-glare, WLED backlit, 600 nits, 100% DCI-P3
– 64GB DDR4 2667MHz
– 1TB PCIe Gen 3 x4 NVMe SSD TLC
– FHD webcam 1080p plus IR camera
– HP collaboration keyboard with dual point stick
– Fingerprint sensor
– Smart Card reader
– Intel Wi-Fi 6 AX 200, 802.11ac 2×2 +BT 4.2 combo adapter (vPro)
– HP long-life battery four-cell 90 Wh
– Three-year limited warranty

The ZBook 15 G6 is a high-end mobile workstation with a price that reflects it. However, as I said earlier, true workstations are built to withstand constant use and, in this case, abuse. The ZBook 15 G6 has been designed to pass up to 21 extensive MIL-STD 810G tests, which is essentially worst-case-scenario testing: drop testing from around four feet, sand and dust testing, radiation testing (the sun beating down on the laptop for an extended period) and much more.

The exterior of the G6 is made of aluminum and built to withstand abuse. The latest G6 is a little bulky/boxy, in my opinion, but I can see why it would hold up to some bumps and bruises, all while working at blazingly fast speeds, so bulk isn’t a huge issue for me. Because of that bulk, you can imagine that this isn’t the lightest laptop either. It weighs in at 5.79 pounds for the lowest end and measures 1 inch by 14.8 inches by 10.4 inches.

On the bottom of the workstation is an easy-to-access panel for performing repairs and upgrades yourself. I really like the bottom compartment. I opened it and noticed I could throw in an additional NVMe drive and an SSD if needed. You can also access memory here. I love this because not only can you perform easy repairs yourself, but you can perform upgrades or part replacements without voiding your warranty on the original equipment. I’m glad to see that HP kept this in mind.

The keyboard is smaller than a full-size version but has a number keypad, which I love using when typing in timecodes. It is such a time-saver for me. (I credit entering in repair order numbers when I fixed computers at Best Buy as a teenager.) On the top of the keyboard are some handy shortcuts if you do web conferences or calls on your computer, including answering and ending calls. The Bang & Olufsen speakers are some of the best laptop speakers I’ve heard. While they aren’t quite monitor-quality, they do have some nice sound on the low end that I was able to fine-tune in the Bang & Olufsen audio control app.

Software Tests
All right, enough of the technical specs. Let’s get on to what people really want to know: how the HP ZBook 15 G6 performs in apps like Blackmagic’s DaVinci Resolve and Adobe Premiere Pro. I used sample Red and Blackmagic Raw footage that I use a lot in testing. You can grab the Red footage here and the BRaw footage here. (Keep in mind that you will need to download the Blackmagic Raw software to edit with BRaw inside of Adobe products.)

Performance monitor while exporting in Resolve with VFX.

For testing in Resolve and Premiere, I strung out one minute each of 4K, 6K and 8K Red media in one sequence and the 4608×2592 4K and 6K BRaw media in another. In the middle of my testing, Resolve got a giant Red API upgrade that allows for better realtime playback of Red raw files if you have an Nvidia CUDA-based GPU.

First up is Resolve 16.1.1 and then Resolve 16.1.2. Both sequences are set to UHD (3840×2160) resolution. One sequence of each codec contains just color correction, while another of each codec contains effects and color correction. The Premiere sequence with color and effects contains basic Lumetri color correction, noise reduction (50) and a Gaussian blur with settings of 0.4. In Resolve, the only difference in the color and effects sequence is that the noise reduction is spatial and set to Enhanced, Medium and 25/25.

In Resolve, the 4K Red media would play in realtime, while the 6K (RedCode 3:1) would jump down to about 14fps to 15fps and the 8K (RedCode 7:1) would play at 10fps at full resolution with just color correction. With effects, the 4K media would play at 20fps, the 6K at 3fps and the 8K at 10fps. The Blackmagic Raw video would play in realtime with just color correction and at around 3fps to 4fps with effects.

This is where I talk about just how loud the fans in the ZBook 15 G6 can get. When running exports and benchmarks, the fans are noticeable and a little distracting. Obviously, we are running some high-end testing with processor- and GPU-intensive tests but still, the fans were noticeable. However, the bottom of the mobile workstation was not terribly hot, unlike the MacBook Pros I’ve tested before. So my lap was not on fire.

In my export testing, I used those same sequences as before, exporting from Adobe Premiere Pro 2020. I exported UHD files using Adobe Media Encoder in different containers and codecs: H.264 (MOV), H.265 (MOV), ProRes HQ, DPX, DCP and MXF OP1a (XDCAM). The MXF OP1a was a 1920x1080p export.
Here are my results:

Red (4K, 6K, 8K)
– Color only: H.264 – 5:27; H.265 – 4:45; ProRes HQ – 4:29; DPX – 3:37; DCP – 10:38; MXF OP1a – 2:31
– Color, Noise Reduction (50), Gaussian Blur 0.4: H.264 – 4:56; H.265 – 4:56; ProRes HQ – 4:36; DPX – 4:02; DCP – 8:20; MXF OP1a – 2:41

Blackmagic Raw
– Color only: H.264 – 2:05; H.265 – 2:19; ProRes HQ – 2:04; DPX – 3:33; DCP – 4:05; MXF OP1a – 1:38
– Color, Noise Reduction (50), Gaussian Blur 0.4: H.264 – 1:59; H.265 – 2:22; ProRes HQ – 2:07; DPX – 3:49; DCP – 3:45; MXF OP1a – 1:51

What is surprising is that when adding effects like noise reduction and a Gaussian blur in Premiere, the export times stayed similar. While using the ZBook 15 G6, I noticed my export times improved when I upgraded driver versions, so I re-did my tests with the latest Nvidia drivers to make sure I was consistent. The drivers also solved an issue in which Resolve wasn’t reading BRaw properly, so remember to always research drivers.

The Nvidia Quadro RTX 3000 really pulled its weight when editing and exporting in both Premiere and Resolve. In fact, in previous versions of Premiere, I noticed that the GPU was not really being used as well as it should have been. With the Premiere Pro 2020 upgrade it seems like Adobe really upped its GPU usage game — at some points I saw 100% GPU usage.

In Resolve, I performed similar tests, but instead of ProResHQ I exported a DNxHR QuickTime file, and instead of a DCP I exported an IMF package. For the most part, these are stock exports from the Deliver page of Resolve, except that I forced Video Levels and set Force Debayer and Resizing to Highest Quality. Here are my results from Resolve versions 16.1.1 and 16.1.2 (the 16.1.2 results are in parentheses):

– Red (4K, 6K, 8K), Color Only: H.264 – 2:17 (2:31); H.265 – 2:23 (2:37); DNxHR – 2:59 (3:06); IMF – 6:37 (6:40); DPX – 2:48 (2:45); MXF OP1a – 2:45 (2:33)

– Red, Color, Noise Reduction (Spatial, Enhanced, Medium, 25/25), Gaussian Blur 0.4: H.264 – 5:00 (5:15); H.265 – 5:18 (5:21); DNxHR – 5:25 (5:02); IMF – 5:28 (5:11); DPX – 5:23 (5:02); MXF OP1a – 5:20 (4:54)

– Blackmagic Raw, Color Only: H.264 – 0:26 (0:25); H.265 – 0:31 (0:30); DNxHR – 0:50 (0:50); IMF – 3:51 (3:36); DPX – 0:46 (0:46); MXF OP1a – 0:23 (0:22)

– Blackmagic Raw, Color, Noise Reduction (Spatial, Enhanced, Medium, 25/25), Gaussian Blur 0.4: H.264 – 7:51 (7:53); H.265 – 7:45 (8:01); DNxHR – 7:53 (8:00); IMF – 8:13 (7:56); DPX – 7:54 (8:18); MXF OP1a – 7:58 (7:57)

Interesting to note: Exporting Red footage with color correction only was significantly faster from Resolve than from Premiere, but with effects applied, export times were similar between the two apps. With the CUDA Red SDK update in Resolve 16.1.2, I thought I would see a large improvement, but I didn’t. I saw an approximate 10% increase in playback speed but no improvement in export times.
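You can check that “no improvement” conclusion with the same mm:ss arithmetic. A minimal sketch, again hard-coding the Red color-only results from above (16.1.1 first, 16.1.2 second):

    def secs(t):
        # "M:SS" -> seconds
        m, s = t.split(":")
        return int(m) * 60 + int(s)

    results = {"H.264": ("2:17", "2:31"), "H.265": ("2:23", "2:37"),
               "DNxHR": ("2:59", "3:06"), "IMF": ("6:37", "6:40"),
               "DPX": ("2:48", "2:45"), "MXF OP1a": ("2:45", "2:33")}

    for codec, (v1, v2) in results.items():
        change = (secs(v2) - secs(v1)) / secs(v1)
        print(f"{codec}: {change:+.1%}")  # positive = slower in 16.1.2

Four of the six color-only exports actually came out slightly slower in 16.1.2, which backs up the observation that the SDK update helped playback, not exports.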

Puget

Puget Systems has some great benchmarking tools, so I reached out to Matt Bach, Puget Systems’ senior labs technician, about my findings. He suggested that the mobile Xeon could still be the bottleneck for Resolve; in his testing, he saw a larger increase in speed with AMD Threadripper 3 and Intel i9-based systems. Regardless, I am kind of going deep here on realtime playback of 8K Red raw media on a mobile workstation — what a time we are in. Blackmagic Raw footage was insanely fast when exporting out of Resolve, although export times for the Blackmagic Raw footage with effects were higher than I expected. There was consistent use of both the GPU and CPU in Resolve, much like in Premiere Pro 2020, which is a trend that’s nice to see.

In addition to Premiere and Resolve testing, I ran some common benchmarks that provide a good 30,000-foot view of the HP ZBook 15 G6 when comparing it to other systems. I decided to use the Puget Systems benchmarking tools. Unfortunately, at the time of this review, the tools were only working properly with Premiere and After Effects 2019, so I ran the After Effects benchmark using the 2019 version. The ZBook 15 G6 received an overall score of 802, render score of 79, preview score of 75.2 and tracking score of 86.4. These are solid numbers that beat out some desktop systems I have tested.

Corona

To test some 3D applications, I ran Cinebench R20, which gave a CPU score of 3243, a CPU (single core) score of 470 and an M/P ratio of 6.90x. I recently began running the Gooseberry benchmark scene in Blender to get a better sense of 3D rendering performance, and it took 29:56 to render. Using the Corona benchmark, it took 2:33 to render 16 passes at 3,216,368 rays/s. Using OctaneBench, the ZBook 15 G6 received a score of 139.79. In the V-Ray benchmark, the CPU received 9,833 ksamples, and in the V-Ray GPU testing, 228 mpaths. I’m not going to lie; I really don’t know a lot about what these benchmarks are trying to tell me, but they might help you decide whether this is the mobile workstation for your work.
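At least one of those numbers is easy to demystify: Cinebench’s M/P ratio is simply the multi-core score divided by the single-core score, i.e., how much the extra cores multiply single-core performance. A two-line check in Python:

    # Cinebench M/P ratio = multi-core score / single-core score
    multi_core, single_core = 3243, 470
    print(f"M/P ratio: {multi_core / single_core:.2f}x")  # -> 6.90x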

Cinebench

One benchmark that showed an interesting difference between driver updates for the Nvidia Quadro RTX 3000 was Neat Bench from Neat Video — the video noise reduction plugin. It measures whether your system should use the CPU, the GPU or a combination of the two to run Neat Video. Initially, the best combination result was to use the CPU only (seven cores) at 11.5fps.

After updating to the latest Nvidia drivers, the best combination result was to use the CPU (seven cores) and GPU (Quadro RTX 3000) at 24.2fps. That’s roughly a 2.1x speedup from a driver update alone. Moral of the story: Always make sure you have the correct drivers!

Summing Up
Overall, the HP ZBook 15 G6 is a powerful mobile workstation that will work well across the board. From 3D to color correction apps, the Xeon processor in combination with the Quadro RTX 3000 will get you running 4K video without a problem. With the HP DreamColor anti-glare display offering up to 600 nits of brightness and covering 100% of the DCI-P3 color space, plus an HDR option, you can rely on the built-in display for color accuracy if you don’t have your output monitor attached. And with features like two USB Type-C ports (Thunderbolt 3, DP 1.4 and USB 3.1 Gen 2), you can connect external monitors for a larger view of your work.

The HP Fast Charge will get you out of a dead battery fiasco with the ability to go from 0% to 50% charge in 45 minutes. All of this for around $4,000 seems to be a pretty low price to pay, especially because it includes a three-year on-site warranty and because the device is certified to work seamlessly with many apps that pros use with HP’s independent software vendor verifications.

If you are looking for a mobile workstation upgrade, are moving from desktop to mobile or want an alternative to a MacBook Pro, you should price a system out online.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Behind the Title: Film Editor Edward Line

By Randi Altman

This British editor got his start at Final Cut in London, honing his craft and developing his voice before joining Cartel in Santa Monica.

NAME: Edward Line

COMPANY: Cartel

WHAT KIND OF COMPANY IS CARTEL?
Cartel is an editorial and post company based in Santa Monica. We predominantly service the advertising industry but also accommodate long-form projects and other creative content. I joined Cartel as one of the founding editors in 2015.

CAN YOU GIVE US SOME MORE DETAIL ABOUT YOUR JOB?
I assemble the raw material from a film shoot into a sequence that tells the story and communicates the idea of a script. Sometimes I am involved before the shoot and cut together storyboard frames to help the director decide what to shoot. Occasionally, I’ll edit on location if there is a technical element that requires immediate approval for the shoot to move forward.

Edward Line working on Media Composer

During the edit, I work closely with the directors and creative teams to realize their vision of the script or concept and bring their ideas to life. In addition to picture editing, I incorporate sound design, music, visual effects and graphics into the edit. It’s a collaboration between many departments and an opportunity to validate existing ideas and try new ones.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THE FILM EDITOR TITLE?
A big part of my job involves collaborating with others, working with notes and dealing with tricky situations in the cutting room. Part of being a good editor is having the ability to manage people and ideas while not compromising the integrity and craft of the edit. It’s a skill that I’m constantly refining.

WHAT’S YOUR FAVORITE PART OF THE JOB?
I love being instrumental in bringing creative visions together and seeing them realized on screen, while being able to express my individual style and craft.

WHAT’S YOUR LEAST FAVORITE?
Tight deadlines. Shooting on digital formats has allowed productions to shoot more and specify more deliverables. However, giving the editor proportionally more time to process it all is not always a consideration, which can add pressure to the process.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
I am a morning person so I tend to be most productive when I have fresh eyes. I’ve often executed a scene in the first few hours of a day and then spent the rest of the day (and night) fine-tuning it.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I have always had a profound appreciation for design and architecture, and in an alternate universe, I could see myself working in that world.

WHY DID YOU CHOOSE THIS PROFESSION?
I’ve always had ambitions to work in filmmaking and initially worked in TV production after I graduated college. After a few years, I became curious about working in post and found an entry-level job at the renowned editorial company Final Cut in London. I was inspired by the work Final Cut was doing, and although I’d never edited before, I was determined to give editing a chance.

CoverGirl

I spent my weekends and evenings at the office, teaching myself how to edit on Avid Media Composer and learning editing techniques with found footage and music. It was during this experimental process that I fell in love with editing, and I never looked back.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
In the past year I have edited commercials for CoverGirl, Sephora, Bulgari, Carl’s Jr. and Smartcar. I have also cut a short film called Dad Was, which will be submitted to festivals in 2020.

HOW HAVE YOU DEVELOPED NEW SKILLS WHEN CUTTING FOR A SPECIFIC GENRE OR FORMAT?
Cutting music videos allowed me to hone my skills to edit musical performance while telling visual stories efficiently. I learned how to create rhythm and pace through editing and how to engage an audience when there is no obvious narrative. The format provided me with a fertile place to develop my individual editing style and perfect my storytelling skills.

When I started editing commercials, I learned to be more disciplined in visual storytelling, as most commercials are rarely longer than 60 seconds. I learned how to identify nuances in performance and the importance of story beats, specifically when editing comedy. I’ve also worked on numerous films with VFX, animation and puppetry. These films have allowed me to learn about the potential for these visual elements while gaining an understanding of the workflow and process.

More recently, I have been enjoying cutting dialogue in short films. Unlike commercials, this format allows more time for story and character to develop. So when choosing performances, I am more conscious of the emotional signals they send to the audience and overarching narrative themes.

Sephora

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
It’s tough to narrow this down to one project…

Recently, I worked on a commercial for the beauty retailer Sephora that promoted its commitment to diversity and inclusivity. The film Identify As We is a celebration of the non-binary community and features a predominantly transgender cast. The film champions being different and self-expression while challenging traditional perceptions of beauty. I worked tirelessly with the director and creative team to make sure we treated the cast and footage with respect while honoring the message of the campaign.

I’m also particularly proud of a short film that I edited called Wale. The film was selected for over 30 film festivals across the globe and won several awards. The culmination of the film’s success was receiving a BAFTA nomination and being shortlisted for the 91st Academy Awards for Best Live Action Short Film.

WHAT DO YOU USE TO EDIT?
I work on Avid Media Composer, but I have recently started to flirt with Adobe Premiere. I think it’s good to be adaptable, and I’d hate to restrict my ability to work on a project because of software.

Wale

ARE YOU OFTEN ASKED TO DO MORE THAN EDIT? IF SO, WHAT ELSE ARE YOU ASKED TO DO?
Yes, I usually incorporate other elements such as sound design, music and visual effects into my edits as they can be instrumental to the storytelling or communication of an idea. It’s often useful for the creative team and other film departments to see how these elements contribute to the final film, and they can sometimes inform decisions in the edit.

For example, sound can play a major part in accenting a moment or providing a transition to another scene, so I often spend time placing sound effects and sourcing music during the edit process. This helps me visualize the scene in a broader context and provides new perspective if I’ve become overfamiliar with the footage.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
No surprises, but my smartphone! Apart from the obvious functions, it’s a great place to review edits and source music when I’m on the move. I’ve also recently purchased a Bluetooth keyboard and Wacom tablet, which make for a tidy work area.

I’m also enjoying using my “smart thermostat” at home which learns my behavior and seems to know when I’m feeling too hot or cold.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Once I have left the edit bay, I decompress by listening to music on the way home. Once home, I take great pleasure from cooking for myself, friends and family.

Maryann Brandon’s path, and editing Star Wars: The Rise of Skywalker

By Amy Leland

In the interest of full disclosure, I have been a fan of both the Star Wars world and the work of J.J. Abrams for a very long time. I saw Star Wars: Episode IV – A New Hope in the theaters with my big brother when I was five years old, and we were hooked. I don’t remember a time in my life without Star Wars. And I have been a fan of all of Abrams’ work, starting with Felicity. Periodically, I go back and rewatch Felicity, Alias and Lost. I was, in fact, in the middle of Season 2 of Alias and had already purchased my ticket for The Rise of Skywalker when I was assigned this interview.

As a female editor, I have looked up to Maryann Brandon, ACE, and Mary Jo Markey, ACE — longtime Abrams collaborators — for years. A chance to speak with Brandon was more than a little exciting. After getting the fangirl out of my system at the start of the interview, we had a wonderful conversation about her incredible career and this latest Star Wars offering.

After NYU film school and years in New York City’s indie film world, Brandon has not only been an important part of J.J. Abrams’ world — serving as a primary editor on Alias, and then on Mission: Impossible III, Super 8 and two films each in the Star Trek and Star Wars franchises — but has also edited The Jane Austen Book Club, How to Train Your Dragon and Venom, among others.

Maryann Brandon

Let’s dig a bit deeper with Brandon…

How did your path to editing begin?
I started in college, but I wasn’t really editing. I was just a member of the film society. I was recruited by the NYU Graduate Film program in 1981 because they wanted women in the program. And I thought, it’s that or working on Wall Street, and I wasn’t really that great with the money or numbers. I chose film school.

I had no idea what it was going to be like because I don’t come from a film background or a film family. I just grew up loving films. I ended up spending three years just running around Manhattan, making movies with everyone, and everyone did every job. Then, when I got out of school, I had to finish my thesis film, and there was no one to edit it for me. So I ended up editing it myself. I started to meet people in the business because the New York film community was very close-knit. I got offered a paid position in editing, and I stayed.

I met and worked for some really incredible people along the way. I worked as a second assistant on the Francis Ford Coppola film The Cotton Club. I went from that to working as a first assistant on Richard Attenborough’s version of A Chorus Line. I was sent to London and got swept up in the editing part of it. I like telling stories. It became the thing I did. And that’s how it happened.

Who inspired you in those early days?
I was highly influenced by Dede Allen. She was this matriarch of New York at that time, and I was so blown away by her and her personality. I mean, her work spoke for itself, but she was also this incredible person. I think it’s my nature anyway, but I learned from her early on an approach of kindness and caring. I think that’s part of why I stayed in the cutting room.

On set, things tend to become quite fraught sometimes when you’re trying to make something happen, but the cutting room is this calm place of reality, and you could figure stuff out. She was very influential to me, and she was such a kind, caring person. She cared about everyone in the cutting room, and she took time to talk to everyone.

There was also John Bloom, who was the editor on A Chorus Line. We became very close, and he always used to call me over to see what he was doing. I learned tons from him. In those days, we cut on film, so it was running through your fingers.

The truth is everyone I meet influences me a bit. I am fascinated by each person’s approach and why they see things the way they do.

While your resume is eclectic, you’ve worked on many sci-fi and action films. Was that something you were aiming for, or did it happen by chance?
I was lucky enough to meet J.J. Abrams, and I was lucky enough to get on Alias, which was not something I thought I’d want to do. Then I did it because it seemed to suit me at the time. It was a bit of faith and a bit of, “Oh, that makes sense for you, because you grew up loving Twilight Zone and Star Trek.”

Of course, I’d love to do more drama. I did The Jane Austen Book Club and other films like that. One does tend to get sort of suddenly identified as, now I’m the expert on sci-fi and visual effects. Also, I think because there aren’t a lot of women who do that, it’s probably something people notice. But I’d love to do a good comedy. I’d love to do something like Jumanji, which I think is hilarious.

How did this long and wonderful collaboration with J.J. Abrams get started?
Well, my kids were getting older. It was getting harder and harder for me to go on location with the nanny, the dog, the nanny’s kids, my kids, set up a third grade class and figure out how to do it all. A friend of mine who was a producer on Felicity had originally tried to get me to work on that show. She said, “You’ll love J.J. You’ll love (series creator) Matt Reeves. Come and just meet us.” I just thought television is such hard work.

Then he was starting this new show, Alias. My friend said, “You’re going to love it. Just meet him.” And I did. Honestly, I went to an interview with him, and I spent an hour basically laughing at every joke he told me. I thought, “This guy’s never going to hire me.” But he said, “Okay, I’ll see you tomorrow.” That’s how it started.

What was that like?
Alias was so much fun. I didn’t work on Felicity, which was more of a straightforward drama about a college girl growing up. Alias was this crazy, complicated, action-filled show, but also a girl trying to grow up. It was all of those things. It was classic J.J. It was a challenge, and it was really fun because we all discovered it together. There were three other female editors who are amazing — Mary Jo Markey, Kristin Windell, and Virginia Katz — and there was J.J. and Ken Olin, who was a producer in residence there and director. We just found the show together, and that was really fun.

How has your collaboration with J.J. changed over time?
It’s changed in terms of the scope of a project and what we have to do. And, obviously, the level of conflict and communication is pretty easy because we’ve known each other for so long. There aren’t a lot of barriers like, “Hey, I’m trying to get to know you. What do I…?” We just jump right in. Over the years, it’s changed a bit.

On The Rise of Skywalker, I worked with a different co-editor. Mary Jo [Markey, Brandon’s longtime co-editor] was doing something else at the time, so I ended up working with Stefan Grube. The way I had worked with Mary Jo was that we would divide up the film: she’d do her thing and I’d do mine. But because these films are so massive, I preferred not to divide it up, but instead to have both of us work on whatever needed working on at the time to get it done. I proposed this to J.J., and it worked out great. Everything got cut immediately, and we got together periodically to ask him what he thought.

Another thing that changed was, because we needed to turn over our visual effects really quickly, I proposed that I cut on the set, on location, when they were shooting. At first J.J. was like, “We’ve never done this before.” I said, “It’s the only way I’m going to get your eyes on sequences,” because by the time the 12-hour day is over, everyone’s exhausted.

It was great and worked out well. I had this little mobile unit, and the joke was it was always within 10 feet of wherever J.J. was. It was also great because I felt like I was part of the crew, and they felt like they could talk to me. I had the DP asking me questions. I had full access to the visual effects supervisor. We worked out shots on the set. Because everyone could see what we already had, it really was a game-changer.

What are some of the challenges of working on films that are heavy on action, especially with the Star Wars and Star Trek films and all the effects and CGI?
There’s a scene where they arrive on Exegol, and they’re fighting with each other and more ships are arriving. All of that was in my imagination. It was me going, “Okay, that’ll be on the screen for this amount of time.” I was making up so much of it and using the performances and the story as a guide. I worked really closely with the visual effects people, describing what I thought was going to happen. They would then explain that what I thought was going to happen was way too expensive to do.

Luckily I was on the set, so I could work it out with J.J. as we went. Sometimes it’s better for me just to build something that I imagine and work off of that, but it’s hard. It’s like having a blank page and then knowing there’s this one element, and then figuring out what the next one will be.

There are people who are incredibly devoted to the worlds of Star Trek and Star Wars and have very strong feelings about those worlds. Does that add more pressure to the process?
I’m a big fan of Star Trek and Star Wars, as is J.J. I grew up with Star Trek, and it’s very different because Star Trek was essentially a week-to-week serial that featured an adventure, and Star Wars is this world where they’re in one major war the whole time.

Sometimes I would go off on a tangent, and J.J. and my co-editor Stefan would be like, “That’s not in the lore,” and I’d have to pull it back and remember that we do serve a fan base that is loyal to it. When I edit anything, I really try to abandon any kind of preconceived thing I have so I can discover things.

I think there’s a lot of pressure to answer to the first two movies, because this is the third, and you can’t just ignore a story that’s been set up, right? We needed to stay within the boundaries of that world. So yeah, there’s a lot of pressure to do that, for sure. One of the things that Chris Terrio and J.J., as the writers, felt very strongly about was having it be Leia’s final story. That was a labor of love for sure. All of that was like a love letter to her.

I don’t know how much of that had been decided before Carrie Fisher (Leia) died. It was my understanding that you had to reconstruct based on things she shot for the other films.
She died before this film was even written, so all of the footage you see is from Episode 7. It’s all been repurposed, and scenes were written around it. Not just for the sake of writing around the footage, but they created scenes that actually work in the context of the film. A lot of what works is due to Daisy Ridley and the other actors who were in the scenes with her. I mean, they really brought her to life and really sold it. I have to say they were incredible.

With two editors co-editing on set during production, you must have needed an extensive staff of assistant editors. How do you work with assistant editors on something of this scale?
I’ve worked with an assistant editor named Jane Tones on the last couple of films. She is amazing. She was the one who figured out how to make the mobile unit work on set. She’s incredibly gifted, both technologically and story-wise. She was instrumental in organizing everything to do with the edit and getting us around. Stefan’s assistant was Warren Paeff, and he is very experienced. We also had a sound person we carried with us and a couple of other assistants. I had another assistant, Ben Cox, who was such a Star Wars fan that when I said, “I’m happy to hire you, but I only have a second assistant position,” he was like, “I’ll take it!”

What advice do you have for someone starting out or who would like to build the kind of career you’ve made?
I would say, try to get a PA job or a job in the cutting room where you really enjoy the people, and pay attention. If you have ideas, don’t be shy but figure out how to express your ideas. I think people in the cutting room are always looking for anyone with an opinion or reaction because you need to step back from it. It’s a love of film, a love of storytelling and a lot of luck. I work really hard, but I also had a lot of good fortune meeting the people I did.


Amy Leland is a film director and editor. Her short film, Echoes, is now available on Amazon Video. She also has a feature documentary in post, a feature screenplay in development, and a new doc in pre-production. She is an editor for CBS Sports Network and recently edited the feature “Sundown.” You can follow Amy on social media on Twitter at @amy-leland and Instagram at @la_directora.

CVLT adds Joe Simons as lead editor

Bi-coastal production studio CVLT, which offers full-service production and post, has added Joe Simons as lead editor. He will be tasked with growing CVLT’s editorial department. He edits on Adobe Premiere and will be based in the New York studio.

Simons joins CVLT after three years at The Mill, where he edited the “It’s What Connects Us” campaign for HBO, the “Top Artist of the Year” campaign for Spotify and several major campaigns for Ralph Lauren, among many others. Prior to The Mill, he launched his career at PS260 before spending four years at editing house Cut+Run.

Simons’ addition comes at a time when CVLT is growing into a full concept-to-completion creative studio, launching campaigns for top luxury and fashion brands, including Lexus, Peloton and Louis Vuitton.

“Having soaked up everything I could at The Mill and Cut+Run, it was time for me to take that learning and carve my own path,” says Simons.

Maxon and Red Giant to merge

Maxon, developers of pro 3D software solutions, and Red Giant, makers of tools for editors, VFX artists, and motion designers, have agreed to merge under the media and entertainment division of Nemetschek Group. The transaction is expected to close in January 2020, subject to regulatory approval and customary closing conditions.

Maxon, best known for its 3D product Cinema 4D, was formed in 1986 to provide high-end yet accessible 3D software solutions. Artists across the globe rely on Maxon products to create high-end visuals. In April of this year, Maxon acquired Redshift, developer of the GPU-accelerated Redshift render engine.

Since 2002, Red Giant has built its brand through products such as Trapcode, Magic Bullet, Universe, PluralEyes and its line of visual effects software. Its tools are used in the fields of film, broadcast and advertising.

The two companies provide tools for companies including ABC, CBS, NBC, HBO, BBC, Sky, Fox Networks, Turner Broadcasting, NFL Network, WWE, Viacom, Netflix, ITV Creative, Discovery Channel, MPC, Digital Domain, VDO, Sony, Universal, The Walt Disney Company, Blizzard Entertainment, BMW, Facebook, Apple, Google, Vitra, Nike and many more.

Main Photo: L-R: Maxon CEO Dave McGavran and Red Giant CEO Chad Bechert

Behind the title: Cutters editor Steve Bell

“I’ve always done a fair amount of animation design, music rearranging and other things that aren’t strictly editing, but most editors are expected to play a role in aspects of the post process that aren’t strictly editing.”

Name: Steve Bell

What’s your job title?
Editor

Company: Cutters Editorial

Can you describe your company?
Cutters is part of a global group of companies offering offline editing, audio engineering, VFX and picture finishing, production and design – all of which fall under Cutters Studios. Here in New York, we do traditional broadcast TV advertising and online content, as well as longer format work and social media content for brands, directors and various organizations that hire us to develop a concept, shoot and direct.

Cutters New York

What’s your favorite part of the job?
There’s a stage to pretty much every project where I feel I’ve gotten a good enough grasp of the material that I can connect the storytelling dots and see it come to life. I like problem solving and love the feeling you get when you know you’ve “figured it out.”

Depending on the scale of the project, it can start a few hours in, a few days in or a few weeks in, but once it hits you can’t stop until you see the piece finished. It’s like reading a good page-turner; you can’t put it down. That’s the part of the creative process I love and what I like most about my job.

What’s your least favorite?
It’s those times when it becomes clear that I’ve/we’ve probably looked at something too many times to actually make it better. That certainly doesn’t happen on many jobs, but when it does, it’s probably because too many voices have had a say; too many cooks in the kitchen, as they say.

What is your most productive time of the day?
Early in the morning. I’m most clearheaded at the very beginning of the day, and then sometimes toward the very end of a long day. But those times also happen to be when I’m most likely to be alone with what I’m working on and free from other distractions.

If you didn’t have this job, what would you be doing instead? 
Baseball player? Astronaut? Joking. But let’s face it, we all fantasize about fulfilling the childhood dreams that are completely different from what we do. To be truthful I’m sure I’d be doing some kind of writing, because it was my desire to be a writer, particularly of film, that indirectly led me to be an editor.

Why did you choose this profession? How early on did you know this would be your path?
Well the simple answer is probably that I had opportunities to edit professionally at a relatively young age, which forced me to get better at editing way before I had a chance to get better at writing. If I keep editing I may never know if I can write!

Stella Artois

Can you name some recent projects you have worked on?
The Dwyane Wade Budweiser retirement film, Stella Artois holiday spots, a few films for the Schott/Hamilton watch collaboration. We did some fun work for Rihanna’s Savage X Fenty release. Early in the year I did a bunch of lovely spots for Hallmark Hall of Fame programming.

Do you put on a different hat when cutting for a specific genre?
For sure. There are overlapping tasks, but I do believe it takes a different set of skills to do good dramatic storytelling than it takes to do straight comedy, or doc or beauty. Good “Storytelling” (with a capital ‘S’) is helpful in all of it — I’d probably say crucial. But it comes down to the important element that’s used to create the story: emotion, humor, rhythm, etc. And then you need to know when it needs to be raw versus formal, broad versus subtle and so forth. Different hats are needed to get that exactly right.

What is the project that you are most proud of and why?
I’m still proud of the NHL’s No Words spot I worked on with Cliff Skeete and Bruce Jacobson. We’ve become close friends as we’ve collaborated on a lot of work since then for the NHL and others. I love how effective that spot is, and I’m proud that it continues to be referenced in certain circles.

NHL No Words

In a very different vein, I think I’m equally proud of the work I’ve done for the UN General Assembly meetings, especially the film that accompanied Kathy Jetnil-Kijiner’s spoken word performance of her poem “Dear Matafele Peinem” during the opening ceremonies of the UN’s first Climate Change conference. That’s an issue that’s very important to me and I’m grateful for the chance to do something that had an impact on those who saw it.

What do you use to edit?
I’m a Media Composer editor, and it probably goes back to the days when I did freelance work for Avid and had to learn it inside out. The interface at least is second nature to me. Also, the media sharing and networking capabilities of Avid make it indispensable. That said, I appreciate that Premiere has some clear advantages in other ways. If I had to start over I’m not sure I wouldn’t start with Premiere.

What is your favorite plugin?
I use a lot of Boris FX plugins for stabilization, color correction and so forth. I used to use After Effects often, and Boris FX offers a way of achieving some of what I once did exclusively in After Effects.

Are you often asked to do more than edit? If so, what else are you asked to do?
I’ve always done a fair amount of animation design, music rearranging and other things that aren’t strictly editing, but most editors are expected to play a role in aspects of the post process that aren’t strictly “film editing.”

Many of my clients know that I have strong opinions about those things, so I do get asked to participate in music and animation quite often. I’m also sometimes asked to help with the write-ups of what we’ve done in the edit because I like talking about the process and clarifying what I’ve done. If you can explain what you’ve done you’re probably that much more confident about the reasons you did it. It can be a good way to call “bullshit” on yourself.

This is a high stress job with deadlines and client expectations. What do you do to de-stress from it all?
Yeah, right?! It can be stressful, especially when you’re occasionally lucky enough to be busy with multiple projects all at once. I take decompressing very seriously. When I can, I spend a lot of time outdoors — hiking, biking, you name it — not just for the cardio and exercise, which is important enough, but also because it’s important to give your eyes a chance to look off into the distance. There are tremendous physical and psychological benefits to looking to the horizon.

Review: The Sensel Morph hardware interface

By Brady Betzel

As an online editor and colorist, I have tried a lot of hardware interfaces designed for apps like Adobe Premiere, Avid Media Composer, Blackmagic DaVinci Resolve and others. With the exception of professional color correction surfaces like the FilmLight Baselight, the Resolve Advanced Panel and Tangent’s Element color correction panels, it’s hard to get exactly what I need.

While they typically work well, there is always a drawback for my workflow; usually they are missing one key shortcut or feature. Enter Sensel Morph, a self-proclaimed morphable hardware interface. In reality, it is a pressure-sensitive trackpad that uses individually purchasable magnetic rubber overlays and keys for a variety of creative applications. It can also be used as a pressure-sensitive trackpad without any overlays.

For example, inside of the Sensel app you can identify the Morph as a trackpad and click “Send Map to Morph,” and it will turn itself into a large trackpad. If you are a digital painter, you can turn the Morph into “Paintbrush Area” and use a brush and/or your fingers to paint! Once you understand how to enable the different mappings you can quickly and easily Morph between settings.

For this review, I am going to focus on how you can use the Sensel Morph with Adobe Premiere Pro. For the record, you can actually use it with any NLE by creating your own map inside of the Sensel app; the Morph essentially works by sending keyboard shortcuts to the NLE. With that in mind, if you customize your keyboard shortcuts, you will want to either enable the default mapping inside of Premiere or adjust your settings to match the Sensel Morph’s.
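To make the idea concrete, here is a purely hypothetical sketch, in Python, of what such a mapping boils down to. This is not Sensel’s actual map format; it just illustrates the principle that each pad region fires a keystroke, which is why the map has to agree with your NLE’s shortcut settings. The Q, W and ` bindings shown are Premiere Pro defaults:

    # Hypothetical illustration only -- not Sensel's real map format.
    # Each pad region simply emits a keystroke that the NLE interprets.
    premiere_map = {
        "play_pause":   "space",  # Play/Stop toggle
        "ripple_left":  "q",      # Ripple Trim Previous Edit to Playhead
        "ripple_right": "w",      # Ripple Trim Next Edit to Playhead
        "full_screen":  "`",      # Maximize or Restore Active Frame
    }

    def on_pad_press(region):
        # Return the keystroke a given pad region should send.
        return premiere_map.get(region, "")

Swap the values and the same overlay drives any NLE, which is really all the “works with any NLE” claim amounts to.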

Before you plug in your Morph, head over to https://sensel.com/pages/support, where you can get a quick-start guide as well as the Sensel app, which you will need to install before you get working. After it’s downloaded and installed, plug in the Morph via USB and let it charge before using the Bluetooth connection. It took a while for the Morph to fully charge, about two hours, but once I installed the Sensel app, added the Video Editing Overlay and opened Adobe Premiere, I was up and working.

To be honest, I was a little dubious about the Sensel Morph. A lot of these hardware interfaces have come across my desk, and they usually have poor software implementation, or the hardware just doesn’t hold up. But the Sensel Morph broke through my preconceived ideas of hardware controllers for NLEs like Premiere, and for the first time in a long time, I was inspired to use Premiere more often.

It’s no secret that I learned professional editing on Avid Media Composer and Symphony, and most NLEs can’t quite match the fluid, professional feel I’ve had in Symphony, one example being how well the keyboard and Wacom tablet work together. The first time I plugged in the Sensel Morph, laid the Video Editing Overlay on top of it and opened Premiere, I began to have that same feeling, but inside of Premiere!

While there are still things Premiere has issues with, the Sensel Morph really got me feeling good about how well this Adobe NLE worked. To be honest, some of those issues come down to me never learning Premiere’s keyboard shortcuts the way I did in Avid. The Sensel Morph felt like a natural addition to my Premiere editing workflow. For the first time, I reached that “flow state” inside of Premiere that I previously only got into when using Media Composer or Symphony, and I started trimming and editing like a madman. It was kind of shocking to me.

You may be thinking that I am blowing this out of proportion, and maybe I am, a little, but the Morph immediately improved my lazy Premiere editing. In fact, I told someone that Adobe should package these with first-time Premiere users.

I really like the way the timeline navigation works (much like the Touch Bar). I also like the quick Ripple Left/Right commands, and I like how you can quickly switch timelines by pressing the “Timeline” button multiple times to cycle through them. I did feel like I needed a mouse some of the time and a keyboard some of the time, but for about 60% of the time I could edit without them. Much like how I had to force myself to use a Wacom tablet for editing, if you try not to use a mouse, I think you will get by just fine. I did try to use a Wacom stylus with the Sensel Morph, but unfortunately, it did not work.

What improvements could the Sensel Morph make? Specifically in Premiere, I wish they had a full-screen shortcut (“`”) labeled on the Morph. It’s one of those shortcuts I use all the time, whether I want to see my timeline full screen, the effects controls full screen or the Program feed full screen. And while I know I could program it using the Sensel app, the OCD in me wants to see that reflected onto the keys. While we are on the keys subject, or overlay, I do find it a little hard to use when I customize the key presses. Maybe ordering a custom printed overlay could assuage this concern.

One thing I found odd was the GPU usage that the Sensel app needed. My laptop’s fans were kicking on, so I opened up Task Manager and saw that the Sensel app was taking 30% of my Nvidia RTX 2080. Luckily, you really only need it open when changing overlays or turning it into a trackpad, but I found myself leaving it open by accident, which could really hurt performance.
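If you want to keep an eye on that kind of background GPU drain without digging through Task Manager, you can poll the GPU from a script. A minimal sketch, assuming the nvidia-smi command-line tool that ships with Nvidia’s drivers is on your PATH:

    # Print current GPU utilization and memory use via nvidia-smi.
    import subprocess

    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu,memory.used",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(out)  # e.g. "30 %, 2048 MiB" (exact format varies by driver)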

Summing Up
In the end, is the Sensel Morph really worth the $249? The purchase price includes one free overlay of your choice along with a one-year warranty, but additional overlays will set you back $35 to $59, depending on the overlay.

The Video Editing overlay is $35, while the new Buchla Thunder overlay is $59. There are several other options to choose from, including traditional Keyboard, Piano Key, Music Production and even Drum Pad overlays. If you are a one-person band who bounces between Premiere and apps like Ableton, then it’s 100 percent worth it. If you use Premiere a lot, I still think it is worth it. The iPad Mini-like size and weight are really nice, and when using it over Bluetooth you feel untethered. Its sleek, thin design allows you to bring this morphable hardware interface anywhere you take your laptop or tablet.

The Sensel Morph is not like any of the other hardware interfaces I have used. Not only is it extremely mobile, but it works well and is compatible with a lot of content creation apps that pros use daily. They really delivered on this one.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Ford v Ferrari’s co-editors discuss the cut

By Oliver Peters

After a failed attempt to acquire European carmaker Ferrari, an outraged Henry Ford II sets out to trounce Enzo Ferrari on his own playing field — automobile endurance racing. That is the plot of 20th Century Fox’s Ford v Ferrari, directed by James Mangold. When Ford’s early efforts fall short, he turns to independent car designer Carroll Shelby (Matt Damon). Shelby’s outspoken lead test driver Ken Miles (Christian Bale) complicates the situation by making an enemy of Ford senior VP Leo Beebe.

Michael McCusker

Nevertheless, Shelby and his team are able to build one of the greatest race cars ever — the GT40 MkII — setting up a showdown between the two auto legends at the 1966 24 Hours of Le Mans.

The challenge of bringing this clash of personalities to the screen was taken on by director James Mangold (Logan, Wolverine, 3:10 to Yuma) and his team of long-time collaborators.

I recently spoke with film editors Michael McCusker, ACE, (Walk the Line, 3:10 to Yuma, Logan) and Andrew Buckland (The Girl On the Train) — both of whom were recently nominated for an Oscar and ACE Eddie Award for their work on the film — about what it took to bring Ford v Ferrari together.

The post team for this film has worked with James Mangold on quite a few films. Tell me a bit about the relationship.
Michael McCusker: I cut my very first movie, Walk the Line, for Jim 15 years ago and have since cut his last six movies. I was the first assistant editor on Kate & Leopold, which was shot in New York in 2001. That’s where I met Andrew, who was hired as one of the local New York film assistants. We became fast friends. Andrew moved to LA in 2009, and I hired him to assist me on Knight & Day.

Andrew Buckland

I always want to keep myself available for Jim — he chooses good material, attracts great talent and is a filmmaker who works across multiple genres. Since I’ve worked with him, I’ve cut a musical movie, a western, a rom-com, an action movie, a straight-up superhero movie, a dystopian superhero movie and now a racing film.

As a film editor, it must be great not to get typecast for any particular cutting style.
McCusker: Exactly. I worked for David Brenner for years as his first. He was able to cross genres, and that’s what I wanted to do. I knew even then that the most important decisions I would make would be choosing projects. I couldn’t have foreseen that Jim was going to work across all these genres — I simply knew that we worked well together and that the end product was good.

In preparing for Ford v Ferrari, did you study any other recent racing films, like Ron Howard’s Rush?
McCusker: I saw that movie, and liked it. Jim was aware of it, too, but I think he wanted to do something a little more organic. We watched a lot of older racing films, like Steve McQueen’s Le Mans and John Frankenheimer’s Grand Prix.

Jim’s original intention was to play the racing in long takes and bring the audience along for the ride. As he was developing the script, and we were in preproduction, it became clear that there was more drama for him to portray during the racing sequences than he anticipated. So the races took on more of an energized pace.

Energized in what way? Do you mean in how you cut it or in a change of production technique, like more stunt cameras and angles?
McCusker: I was fortunate to get involved about two-and-a-half months prior to the start of production. We were developing the Le Mans race in previs. This required a lot of editing and discussions about shot design and figuring out what the intercutting was going to be during that sequence, which is like the fourth act of the movie.

You’re dealing with Mollie and Peter [Miles’ wife and son] at home watching the race, the pit drama, what’s going on with Shelby and his crew, with Ford and Leo Beebe and also, of course, what’s going on in the car with Ken. It’s a three-act movie unto itself, so Jim was trying to figure out how it was all going to work before he had to shoot it. That’s where I came in. The frenetic pace of Le Mans was more a part of the writing process — and part of the writing process was the previs. The trick was how to make sure we weren’t just following cars around a track. That’s where redundancy can tend to beleaguer an audience in racing movies.

What was the timeline for production and post?
McCusker: I started at the end of May 2018. Production began at the beginning of August and went all the way through to the end of November. We started post in earnest at the beginning of November of last year, took some time off for the holidays, and then showed the film to the studios around February or March.

When did you realize you were going to need help?
McCusker: The challenge was that there was going to be a lot of racing footage, which meant there was going to be a lot of footage, period. I knew I was going to need a strong co-editor, so Andrew was the natural choice. He had been cutting on his own and cutting with me over the years. We share a common approach to editing and have a similar aesthetic.

There was a point when things got really intense and we needed another pair of hands, so I brought in Dirk Westervelt to help out for a couple of months. That kept our noses above water, but the process was really enjoyable. We were never in a crisis mode. We got a great response from preview audiences and, of course, that calms everybody down. At that point it was just about quality control and making sure we weren’t resting on our laurels.

How long was your initial cut, and what was your process for trimming the film down to the present run time?
McCusker: We’re at 2:30:00 right now and I think the first cut was 3:10 or 3:12. The Le Mans section was longer. The front end of the movie had more scenes in it. We ended up lifting some scenes and rearranging others. Plus, the basic trimming of scenes brought the length down.

But nothing was the result of a panic, like, “Oh my God, we’ve got to get to 2:30!” There were no demands by the studio or any pressures we placed upon ourselves to hit a particular running time. I like to say that there’s real time and there’s cinematic time. You can watch Once Upon a Time in America, which is 3:45 and feels like it’s an hour. Or you can watch an 89-minute movie and feel like it’s drudgery. We just wanted to make sure we weren’t overstaying our welcome.

How extensively did you rearrange scenes during the edit? Or did the structure of the film stay pretty much as scripted?
McCusker: To a great degree it stayed as scripted. We had some scenes in the beginning that we felt were a little bit tangential and weren’t serving the narrative directly, and those were cut.

The real endeavor of this movie starts the moment that these two guys [Shelby and Miles] decide to tackle the challenge of developing this car. There’s a scene where Miles sees the car for the first time at LAX. We understood that we had to get to that point in a very efficient way, but also set up all the other characters — their motives and their desires.

It’s an interesting movie, because it starts off with a lot of characters. But then it develops into a movie about two guys and their friendship. So it goes from an ensemble piece to being about Ken and Carroll, while at the same time the scope of the movie is opening up and becoming larger as the racing is going on. For us, the trickiest part was the front end — to make sure we spent enough time with each character so that we understood them, but not so much time that the audience would go, “Enough already! Get on with it!”

Did that help inform your cutting style for this film?
McCusker: I don’t think so. Where it helped was knowing the sound of the broadcasters and race announcers. I liked Chris Economaki and Jim McKay — guys who were broadcasting the races when I was a kid. I was intrigued about how they gave us the narrative of the race. It came in handy while we were making this movie, because we were able to get our hands on some of Jim McKay’s actual coverage of Le Mans and used it in the movie. That brings so much authenticity.

Let’s talk sound. I would imagine the sound design was integral to your rough cuts. How did you tackle that?
Andrew Buckland: We were fortunate to have the sound team on very early, during preproduction. We were cutting in a 5.1 environment, so we wanted to create sound design early. The engine sounds might not have been the exact sounds that would end up in the final mix, but they were adequate to let you experience the scenes as intended. Because we needed to get Jim’s response early, some of the races were cut with the production sound — from the live mics during filming. This allowed Jim and us to quickly see how the scenes would flow.

Other scenes were cut strictly MOS because the sound design would have been way too complicated for the initial cut of the scene. Once the scene was cut visually, we’d hand over the scene to sound supervisor Don Sylvester, who was able to provide us with a set of 5.1 stems. That was great, because we could recut and repurpose those stems for other races.

McCusker: We had developed a strategy with Don to split the sound design into four or five stems to give us enough discrete channels to recut these sequences. The stems were a palette of interior perspectives, exterior perspectives, crowds, car-bys, and so on. By employing this strategy, we didn’t need to continually turn over the cut to sound for patch-up work.

Then, as Don went out and recorded the real cars and was developing the actual sounds for what was going to be used in the mix, he’d generate new stems and we would put them into the Media Composer. This was extremely informative to Jim, because he could experience our Avid temp mix in 5.1 and give notes, which ultimately informed the final sound design and the mix.

What about temp music? Did you also weave that into your rough cuts?
McCusker: Ted Caplan, our music editor, has also worked with Jim for 15 years. He’s a bit of a renaissance man — a screenwriter, a novelist, a one-time musician and a sound designer in his own right. When he sits down to work with music, he’s coming at it from a story point-of-view. He has a very instinctual knowledge of where music should start, and it happens to dovetail into the aesthetic that Jim, Andrew, and I are working toward. None of us like music to lead scenes in a way that anticipates what the scene is going to be about before you experience it.

For this movie, it was challenging to develop what the musical tone of the movie would be. Ted was developing the temp track along with us from a very early stage. We found over time that not one particular musical style was going to work. This is a very complex score. It includes a kind of surf-rock sound with Carroll Shelby in LA, an almost jaunty, lounge jazz sound for Detroit and the Ford executives, and then the hard-driving rhythmic sound for the racing.

The final score was composed by Marco Beltrami and Buck Sanders.

I presume you were housed in multiple cutting rooms at a central facility.
McCusker: We cut at 20th Century Fox, where Jim has a large office space. We cut Logan and Wolverine there before this movie. It has several cutting spaces and I was situated between Andrew and Don. Ted was next to Don and John Berri, our additional editor. Assistants were right around the corner. It makes for a very efficient working environment.

Since the team was cutting with Avid Media Composer, did any of its features stand out to you for this film?
Both: FluidMorph! (laughing)

McCusker: FluidMorph, speed-ramping — we often had to manipulate the shot speeds to communicate the speed of the cars. A lot of these cars were kit cars that could drive safely at a certain speed for photography, but not at race speed. So we had to manipulate the speed a lot to get the sense of action that these cars have.

What about Avid’s ScriptSync? I know a lot of narrative editors love it.
McCusker: I used ScriptSync once a few years ago and I never cut a scene faster. I was so excited. Then I watched it, and it was terrible. To me there’s so much more to editing than hitting the next line of dialogue. I’m more interested in the lines between the lines — subtext. I do understand the value of it in certain applications. For instance, I think it’s great on straight comedy. It’s helpful to get around and find things when you are shooting tons of coverage for a particular joke. But for me, it’s not something I lean on. I mark up my own dailies and find stuff that way.

Tell me a bit more about your organizational process. Do you start with a Kem roll or stringouts of selected takes?
McCusker: I don’t watch dailies, at least in a traditional sense. I don’t start in the morning, watch the dailies and then cut. And I don’t ask my assistants to organize any of my dailies in bins. I come in and grab the scene that I have in front of me. I’ll look at the last take of every set-up quickly, and then I spend an enormous amount of time — particularly on complex scenes — creating a bin structure that I can work with.

Sometimes it’s the beats in a scene, sometimes I organize by shot size, sometimes by character — it depends on what’s driving the scene. I learn my footage by organizing it. I remember shot sizes. I remember what was shot from set-up to set-up. I have a strong visual memory of where things are in a bin. So, if I ask an assistant to do that, then I’m not going to remember it. If there are a lot of resets or restarts in a take, I’ll have the assistant mark those up. But, I’ll go through and mark up beats or pivotal points in a scene, or particularly beautiful moments, and then I’ll start cutting.

Buckland: I’ve adopted a lot of Mike’s methodology, mainly because I assisted Mike on a few films. But it actually works for me, as well. I have a similar aesthetic to Mike.

Was this shot digitally?
McCusker: It was primarily shot with ARRI Alexa 65 LFs, plus some other small-format cameras. A lot of it was shot with old anamorphic lenses on the Alexa that allowed them to give it a bit of a vintage feeling. It’s interesting that as you watch it, you see the effect of the old lenses. There’s a fall-off on the edges, which is kind of cool. There were a couple of places where the subject matter was framed into the curve of the lens, which affects the focus. But we stuck with it, because it feels “of the time.”

Since the film takes place in the 1960s and has a lot of racing sequences, I assume there a lot of VFX?
McCusker: The whole movie is a period film and we would temp certain things in the Avid for the rough cuts. John Berri was wrangling visual effects. He’s a master in the Avid and also Adobe After Effects. He has some clever ways of filling in backgrounds or greenscreens with temp elements to give the director an idea of what’s going to go there. We try to do as much temp work in the Avid as we are capable of doing, but there’s so much 3D visual effects work in this movie that we weren’t able to do that all of the time.

The racing is real. The cars are real. The visual effects work was for a lot of the backgrounds. The movie was shot almost entirely in Los Angeles with some second unit footage shot in Georgia. The modern-day Le Mans track isn’t at all representative of what Le Mans was in 1966, so there was no way to shoot that. Everything had to be doubled and then augmented with visual effects. In addition to Georgia, where they shot most of the actual racing for Le Mans, they went to France to get some shots of the town of Le Mans itself. I think only about four of those shots are left. (laughs)

Any final thoughts about how this film turned out?
McCusker: I’m psyched that people seem to like the film. Our concern was that we had a lot of story to tell. Would we wear audiences out? We continually have people tell us, “That was two and a half hours? We had no idea.” That’s humbling for us and a great feeling. It’s a movie about these really great characters with great scope and great racing. You can put all the big visual effects in a film that you want to, but it’s really about people.

Buckland: I agree. It’s more of a character movie with racing. Also, because I am not a racing fan per se, the character drama really pulled me into the film while working on it.


Oliver Peters is an experienced film and commercial editor/colorist. In addition, he regularly interviews editors for trade publications. He may be contacted through his website at oliverpeters.com.

The 70th annual ACE Eddie Award nominations

The American Cinema Editors (ACE), the honorary society of the world’s top film editors, has announced its nominations for the 70th Annual ACE Eddie Awards recognizing outstanding editing in 11 categories of film, television and documentaries.

For the first time in ACE’s history, three foreign language films are among the nominees, including The Farewell, I Lost My Body and Parasite, despite there not being a specific category for films predominantly in a foreign language.

Winners will be revealed during a ceremony on Friday, January 17 at the Beverly Hilton Hotel and will be presided over by ACE president, Stephen Rivkin, ACE. Final ballots open December 16 and close on January 6.

Here are the nominees:

BEST EDITED FEATURE FILM (DRAMA):
Ford v Ferrari
Michael McCusker, ACE & Andrew Buckland

The Irishman
Thelma Schoonmaker, ACE

Joker 
Jeff Groth

Marriage Story
Jennifer Lame, ACE

Parasite
Jinmo Yang

BEST EDITED FEATURE FILM (COMEDY):
Dolemite Is My Name
Billy Fox, ACE

The Farewell
Michael Taylor & Matthew Friedman

Jojo Rabbit
Tom Eagles

Knives Out
Bob Ducsay

Once Upon a Time in Hollywood
Fred Raskin, ACE

BEST EDITED ANIMATED FEATURE FILM:
Frozen 2
Jeff Draheim, ACE

I Lost My Body
Benjamin Massoubre

Toy Story 4
Axel Geddes, ACE

BEST EDITED DOCUMENTARY (FEATURE):
American Factory
Lindsay Utz

Apollo 11
Todd Douglas Miller

Linda Ronstadt: The Sound of My Voice
Jake Pushinsky, ACE & Heidi Scharfe, ACE

Making Waves: The Art of Cinematic Sound
David J. Turner & Thomas G. Miller, ACE

BEST EDITED DOCUMENTARY (NON-THEATRICAL):
Abducted in Plain Sight
James Cude

Bathtubs Over Broadway
Dava Whisenant

Leaving Neverland
Jules Cornell

What’s My Name: Muhammad Ali
Jake Pushinsky, ACE

BEST EDITED COMEDY SERIES FOR COMMERCIAL TELEVISION:
Better Things: “Easter”
Janet Weinberg, ACE

Crazy Ex-Girlfriend: “I Need To Find My Frenemy” 
Nena Erb, ACE

The Good Place: “Pandemonium” 
Eric Kissack

Schitt’s Creek: “Life is a Cabaret”
Trevor Ambrose

BEST EDITED COMEDY SERIES FOR NON-COMMERCIAL TELEVISION:
Barry: “berkman > block”
Kyle Reiter, ACE

Dead to Me: “Pilot”
Liza Cardinale

Fleabag: “Episode 2.1”
Gary Dollner, ACE

Russian Doll: “The Way Out”
Todd Downing

BEST EDITED DRAMA SERIES FOR COMMERCIAL TELEVISION:
Chicago Med: “Never Going Back To Normal”
David J. Siegel, ACE

Killing Eve: “Desperate Times”
Dan Crinnion

Killing Eve: “Smell Ya Later”
Al Morrow

Mr. Robot: “401 Unauthorized”
Rosanne Tan, ACE

BEST EDITED DRAMA SERIES FOR NON-COMMERCIAL TELEVISION:
Euphoria: “Pilot”
Julio C. Perez IV

Game of Thrones: “The Long Night”
Tim Porter, ACE

Mindhunter: “Episode 2”
Kirk Baxter, ACE

Watchmen: “It’s Summer and We’re Running Out of Ice”
David Eisenberg

BEST EDITED MINISERIES OR MOTION PICTURE FOR TELEVISION:
Chernobyl: “Vichnaya Pamyat”
Jinx Godfrey & Simon Smith

Fosse/Verdon: “Life is a Cabaret”
Tim Streeto, ACE

When They See Us: “Part 1”
Terilyn A. Shropshire, ACE

BEST EDITED NON-SCRIPTED SERIES:
Deadliest Catch: “Triple Jeopardy”
Ben Bulatao, ACE, Rob Butler, ACE, Isaiah Camp, Greg Cornejo, Joe Mikan, ACE

Surviving R. Kelly: “All The Missing Girls”
Stephanie Neroes, Sam Citron, LaRonda Morris, Rachel Cushing, Justin Goll, Masayoshi Matsuda, Kyle Schadt

Vice Investigates: “Amazon on Fire”
Cameron Dennis, Kelly Kendrick, Joe Matoske, Ryo Ikegami

Main Image: Marriage Story

Storage for Editors

By Karen Moltenbrey

Whether you are a small-, medium- or large-size facility, storage is at the heart of your workflow. Consider, for instance, the one-person shop Fin Film Company, which films and edits footage for branding and events, often on water. Then there’s Uppercut, a boutique creative/post studio where collaborative workflow is the key to pushing boundaries on commercials and other similar projects.

Let’s take a look at Uppercut’s workflow first…

Uppercut
Uppercut is a creative editorial boutique shop founded by Micah Scarpelli in 2015 and offering a range of post services. Based in New York and soon Atlanta, the studio employs five editors with their own suites along with an in-house Flame artist who has his own suite.

Taylor Schafer

In contrast to Uppercut’s size, its storage needs are quite large, with five editors working on as many as five projects at a time. Although most of it is commercial work, some of those projects can get heavy in terms of the generated media, which is stored on-site.

So, for its storage needs, the studio employs an EditShare RAID system. “Sometimes we have multiple editors working on one large campaign, and then usually an assistant is working with an editor, so we want to make sure they have access to all the media at the same time,” says Taylor Schafer, an assistant editor at Uppercut.

Additionally, Uppercut uses a Supermicro nearline server to store some of its VFX data, as the Flame artist cannot access the EditShare system on his CentOS operating system. Furthermore, the studio uses LTO-6 archive media in a number of ways. “We use EditShare’s Ark to LTO our partitions once the editors are done with them for their projects. It’s wonderfully integrated with the whole EditShare system. Ark is easy to navigate, and it’s easy to swap LTO tapes in and out, and everything is in one location,” says Schafer.

The studio employs the EditShare Ark to archive its editors’ working files, such as Premiere and Avid projects, graphics, transcodes and so forth. Uppercut also uses BRU (Backup Restore Utility) from Tolis Group to archive larger files that only live on LaCie hard drives and not on EditShare, such as a raw grade. “Then we’re LTO’ing the project and the whole partition with all the working files at the end through Ark,” Schafer explains.

The importance of having a system like this was underscored over the summer when Uppercut underwent a renovation and had to move into temporary office space at Light Iron, New York — without the EditShare system. As a result, the team had to work off of hard drives and Light Iron’s Avid Nexis for some limited projects. “However, due to storage limits, we mainly worked off of the hard drives, and I realized how important a file storage system that has the ability to share data in real time truly is,” Schafer recalls. “It was a pain having to copy everything onto a hard drive, hand it back to the editor to make new changes, copy it again and make sure all the files were up to date, as opposed to using a storage system like ours, where everything is instantly up to date. You don’t have to worry whether something copied over correctly or not.”

She continues: “Even with Nexis, we were limited in our ability to restore old projects, which lived on EditShare.”
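
Schafer’s point about copies is easy to appreciate with a sketch. The Python below is not Uppercut’s tooling, just a generic illustration of the checksum verification an assistant ends up doing by hand (or skipping) whenever media moves between drives instead of living on shared storage; the volume paths are hypothetical:

    import hashlib
    from pathlib import Path

    def sha256(path: Path, chunk: int = 1 << 20) -> str:
        """Hash a file in 1MB chunks so large media files never load into RAM."""
        h = hashlib.sha256()
        with path.open("rb") as f:
            while block := f.read(chunk):
                h.update(block)
        return h.hexdigest()

    def verify_copy(src_dir: Path, dst_dir: Path) -> list:
        """Return source files whose copies are missing or don't match."""
        bad = []
        for src in (p for p in src_dir.rglob("*") if p.is_file()):
            dst = dst_dir / src.relative_to(src_dir)
            if not dst.exists() or sha256(src) != sha256(dst):
                bad.append(src)
        return bad

    # Hypothetical paths for a job copied to a shuttle drive.
    mismatches = verify_copy(Path("/Volumes/EditShare/JobA"),
                             Path("/Volumes/Shuttle/JobA"))
    print(f"{len(mismatches)} file(s) failed verification")

Shared storage makes all of this bookkeeping unnecessary, which is exactly her argument.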

When a new project comes in at Uppercut, the first thing Schafer and her colleagues do is create a partition on EditShare and copy over the working template, whether it’s for Avid or Premiere, on that partition. Then they get their various working files and start the project, copying over the transcodes they receive. As the project progresses, the artists will get graphics and update the partition size as needed. “It’s so easy to change on our end,” notes Schafer. And once the project is completed, she or another assistant will make sure all the files they would possibly need, dating back to day one of the project, are on the EditShare, and that the client files are on the various hard drives and FTP links.

Reebok

“We’ll LTO the partition on EditShare through Ark onto an LTO-6 tape, and once that is complete, then generally we will take the projects or partition off the EditShare,” Schafer continues. The studio has approximately 26TB of RAID storage but, due to the large size of the projects, cannot retain everything on the EditShare long term. It does, however, have a nearline server that hosts its masters and generics, as well as any other file the team might need to send to a client. “We don’t always need to restore. Generally the only time we try to restore is when we need to go back to the actual working files, like the Premiere or Avid project,” she adds.

Uppercut avoids keeping data locally on workstations due to the collaborative workflow.

According to Schafer, the storage setup is easy to use. Recently, Schafer finished a Reebok project she and two editors had been working on. The project initially started in Avid Media Composer, which was preferred by one of the editors. The other editor prefers Premiere but is well-versed on the Avid. After they received the transcodes and all the materials, the two editors started working in tandem using the EditShare. “It was great to use Avid on top of it, having Avid bins to open separately, rather than sharing through a media browser or closing out of entire projects, like you have to do with a Premiere project,” she says. “Avid is nice to work with in situations where we have multiple editors because we can all have the project open at once, as opposed to Premiere projects.”

Later, after the project was finished, the editor who prefers Premiere did a director’s cut in that software. As a result, Schafer had to re-transcode the footage, “which was more complicated because it was shot on 16mm, so it was also digitized and on one large video reel instead of many video files — on top of everything else we were doing,” she notes. She re-transcoded for Premiere and created a Premiere project from scratch, then added more storage on EditShare to make sure the files were all in place and that everything was up to date and working properly. “When we were done, the client had everything; the director had his director’s cut and everything was backed up to our nearline for easy access. Then it was LTO’d through Ark on LTO-6 tapes and taken off EditShare, as well as LTO’d on BRU for the raw and the grade. It is now done, inactive and archived.”

Without question, says Schafer, storage is important in the work she and her colleagues do. “It’s not so much about the storage itself, but the speed of the storage, how easily I’m able to access it, how collaborative it allows me to be with the other people I’m working with. Storage is great when it’s accessible and easy for pretty much anyone to use. It’s not so good when it’s slow or hard to navigate and possibly has tech issues and failures,” Schafer says. “So, when I’m looking for storage, I’m looking for something that is secure, fast and reliable, and most of all, easy to understand, no matter the person’s level of technical expertise.”

Chris Aguilar

Fin Film Company
People can count themselves fortunate when they can mix business with pleasure and integrate their beloved hobby with their work. Such is the case for solo producer/director/editor Chris Aguilar of Fin Film Company in Southern California, which he founded a decade ago. As Aguilar says, he does it all, as does Fin Film, which produces everything from conferences to music videos and commercial/branded content. But his real passion involves outdoor adventure paddle sports, from stand-up paddleboarding to pro paddleboarding.

“That’s been pretty much my niche,” says Aguilar, who got his start doing in-house production (photography, video and so forth) for a paddleboard company. Since then, he has been able to turn his passion and adventures into full-time freelance work. “When someone wants an event video done, especially one involving paddleboard races, I get the phone call and go!”

Like many videographers and editors, Aguilar got his start filming weddings. Always into surfing himself, he would shoot surfing videos of friends “and just have fun with it,” he says of augmenting that work. Eventually, this allowed him to move into areas he is more passionate about, such as surfing events and outdoor sports. Now, Aguilar finds that a lot of his time is spent filming paddleboard events around the globe.

Today, there are many one-person studios with solo producers, directors and editors. And as Aguilar points out, their storage needs might not be on the level of feature filmmakers or even independent TV cinematographers, but that doesn’t negate their need for storage. “I have some pretty wide-ranging storage needs, and it has definitely increased over the years,” he says.

In his work, Aguilar has to avoid cumbersome and heavy equipment, such as Atomos recorders, because of their weight on board the watercraft he uses to film paddleboard events. “I’m usually on a small boat and don’t have a lot of room to haul a bunch of gear around,” he says. Rather, Aguilar uses Panasonic’s AG-CX350 as well as Panasonic’s EVA1 and GH5, and on a typical two-day shoot (the event and interviews), he will fill five to six 64GB cards.

“Because most paddleboard races are long-distance, we’re usually on the water for about five to eight hours,” says Aguilar. “Although I am not rolling cameras the whole time, the footage still adds up pretty quickly.”

As for storage, Aguilar offloads his video onto SSD drives or other kinds of external media. “I call it my working drive, for editing and that kind of thing,” he says. “Once I am done with the edit and other tasks, I have all those source files somewhere.” He calls on the G-Technology G-Drive Mobile SSD 1TB in the field and for some editing, on the company’s ev RaW portable drive for backups and some editing, and on Glyph’s Atom SSD for fieldwork.

For years, that “somewhere” has been a cabinet that was filled with archived files. Indeed, that cabinet is currently holding, in Aguilar’s estimate, 30TB of data, if not more. “That’s just the archives. I have 10 or 11 years of archives sitting there. It’s pretty intense,” he adds. But, as soon as he gets an opportunity, those will be ported to the same cloud backup solution he is using for all his current work.

Yes, he still uses the source cards, but for a typical project involving an end-to-end shoot, Aguilar will use at least a 1TB drive to house all the source cards and all the subsequent work files. “Things have changed. Back in the day, I used hard drives – you should see the cabinet in my office with all these hard drives in it. Thank God for SSDs and other options out there. It’s changed our lives. I can get [some brands of] 1TB SSD for $99 or a little more right now. My workflow has me throwing all the source cards onto something like that that’s dedicated to all those cards, and that becomes my little archive,” explains Aguilar.

He usually uploads the content as fast as possible to keep the data secure. “That’s always the concern, losing it, and that’s where Backblaze comes in,” Aguilar says. Backblaze is a cloud backup solution that is easily deployed across desktops and laptops and managed centrally — a solution Aguilar recently began employing. He also uses Iconik Solutions’ digital management system, which eases the task of looking up video files or pulling archived files from Backblaze. The digital management system sits on top of Backblaze and creates little offline proxies of the larger content, allowing Aguilar to view the entire 10-year archive online in one interface.
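
Iconik generates those proxies inside its own service, so its mechanics aren’t shown here, but the general technique is easy to sketch. Below is a hypothetical stand-in that uses the open-source ffmpeg tool to make small, browsable H.264 proxies before the camera originals go to cloud backup; the folder names are invented:

    import subprocess
    from pathlib import Path

    def make_proxy(source: Path, proxy_dir: Path) -> Path:
        """Create a small H.264 proxy of a camera original with ffmpeg."""
        proxy_dir.mkdir(parents=True, exist_ok=True)
        proxy = proxy_dir / (source.stem + "_proxy.mp4")
        subprocess.run([
            "ffmpeg", "-y", "-i", str(source),
            "-vf", "scale=-2:540",                  # 540p keeps proxies light
            "-c:v", "libx264", "-crf", "28", "-preset", "fast",
            "-c:a", "aac", "-b:a", "96k",
            str(proxy),
        ], check=True)
        return proxy

    # Hypothetical card offload from a race shoot.
    for clip in sorted(Path("/media/race_day_01").glob("*.MOV")):
        make_proxy(clip, Path("/media/proxies/race_day_01"))

The full-resolution originals can stay in deep storage; only the lightweight proxies need to be online for browsing.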

According to Aguilar, his archived files are an important aspect of his work. Since he works so many paddleboard events, he often receives requests for clips from specific racers or races, some dating back years. Prior to using Backblaze, if someone requested footage, it was a challenge to locate it because he’d have to pull that particular hard drive and plug it into the computer, “and if I had been organized that year, I’ll know where that piece of content is because I can find it. If I wasn’t organized that year, I’d be in trouble,” he explains. “At best, though, it would be an hour and a half or more of looking around. Now I can locate and send it in 15 minutes.”

Aguilar says the Iconik digital management system allows him to pull up the content on the interface and drill down to the year of the race, click on it, download it and send it off or share it directly through his interface to the person requesting the footage.

Aguilar went live with this new Backblaze and digital management system storage workflow this year and has been fully on board with it for just the past two to three months. He is still uncovering all the available features and the power under the hood. “Even for a guy who’s got a technical background, I’m still finding things I didn’t know I could do,” he says, and as such, he is still fine-tuning his workflow. “The neat thing with Iconik is that it could actually support online editing straight up, and that’s the next phase of my workflow, to accommodate that.”

Fortunately or unfortunately, at this time Aguilar is just starting to come off his busy season, so now he can step back, explore the new system and transfer onto it all the material on the old source cards in that cabinet of his.

“[The new solution] is more efficient and has reduced costs since I am not buying all these drives anymore. I can reuse them now. But mostly, it has given me peace of mind that I know the data is secure,” says Aguilar. “I have been lucky in my career to be present for a lot of cool moments in the sport of paddling. It’s a small community and a very close-knit group. The peace of mind knowing that this history is preserved, well, that’s something I greatly appreciate. And I know my fellow paddlers also appreciate it.”


Karen Moltenbrey is a veteran writer, covering visual effects and post production.

The Irishman editor Thelma Schoonmaker

By Iain Blair

Editor Thelma Schoonmaker is a three-time Academy Award winner who has worked alongside filmmaker Martin Scorsese for almost 50 years. Simply put, Schoonmaker has been Scorsese’s go-to editor and key collaborator over the course of some 25 films, winning Oscars for Raging Bull, The Aviator and The Departed. The 79-year-old also received a career achievement award from the American Cinema Editors (ACE).

Thelma Schoonmaker

Schoonmaker cut Scorsese’s first feature, 1967’s Who’s That Knocking at My Door, and since 1980’s Raging Bull has worked on all of his features, receiving a number of Oscar nominations along the way. There are too many to name, but some highlights include The King of Comedy, After Hours, The Color of Money, The Last Temptation of Christ, Goodfellas, Casino and Hugo.

Now Scorsese and Schoonmaker have once again turned their attention to the mob with The Irishman, which was nominated for 10 Academy Awards, including one for Schoonmaker’s editing work. Starring Robert De Niro, Al Pacino and Joe Pesci, it’s an epic saga that runs 3.5 hours and focuses on organized crime in post-war America. It’s told through the eyes of World War II veteran Frank Sheeran (De Niro). He’s a hustler and hitman who worked alongside some of the most notorious figures of the 20th century. Spanning decades, the film chronicles one of the greatest unsolved mysteries in American history, the disappearance of legendary union boss Jimmy Hoffa. It also offers a monumental journey through the hidden corridors of organized crime — its inner workings, rivalries and connections to mainstream politics.

But there’s a twist to this latest mob drama that Scorsese directed for Netflix from a screenplay by Steven Zaillian. Gone are the flashy wise guys and the glamour of Goodfellas and Casino. Instead, the film examines the mundane nature of mob killings and the sad price any survivors pay in the end.

Here, Schoonmaker — who in addition to her film editing works to promote the films and writings of her late husband, famed British director Michael Powell (The Red Shoes, Black Narcissus) — talks about cutting The Irishman, working with Scorsese and their long and storied collaboration.

The Irishman must have been very challenging to cut, just in terms of its 3.5-hour length?
Actually, it wasn’t very challenging to cut. It came together much more quickly than some of our other films because Scorsese and Steve Zaillian had created a very strong structure. I think some critics think I came up with this structure, but it was already there in the script. We didn’t have to restructure, which we do sometimes, and only dropped a few minor scenes.

Did you stay in New York cutting while he shot on location, or did you visit the set?
Almost everything in The Irishman was shot in or around New York. The production was moving all over the place, so I never got to the set. I couldn’t afford the time.

When I last interviewed Marty, he told me that editing and post are his favorite parts of filmmaking. When the two of you sit down to edit, is it like having two editors in the room rather than a director and his editor?
Marty’s favorite part of filmmaking is editing, and he directs the editing after he finishes shooting. I do an assembly based on what he tells me in dailies and what I feel, and then we do all the rest of the editing together.

Could you give us some sense of how that collaboration works?
We’ve worked together for almost 50 years, and it’s a wonderful collaboration. He taught me how to edit at first, but then gradually it has become more of a collaboration. The best thing is that we both work for what is best for the film — it never becomes an ego battle.

How long did it take to edit the film, and what were the main challenges?
We edited for a year and the footage was so incredibly rich: the only challenge was to make sure we chose the best of it and took advantage of the wonderful improvisations the actors gave us. It was a complete joy for Scorsese and me to edit this film. After we locked the film, we turned over to ILM so they could do the “youthifying” of the actors. That took about seven months.

Could you talk about finding the overall structure and considerable use of flashbacks to tell the story?
Scorsese had such a strong concept for this film — and one of his most important ideas was to not explain too much. He respects the audience’s ability to figure things out themselves without pummeling them with facts. It was a bold choice and I was worried about it, frankly, at first. But he was absolutely right. He didn’t want the film to feel like a documentary. He wanted to use brushstrokes of history just to show how they affected the characters. The way the characters were developed in the film, particularly Frank Sheeran, the De Niro character, was what was most important.

Could you talk about the pacing, and how you and Marty kept its momentum going?
Scorsese was determined that The Irishman would have a slower pace than many films today. He gave the film a deceptive simplicity. Interestingly, our first audiences had no problem with this — they became gripped by the characters and kept saying they didn’t mind the length and loved the pace. Many of them said they wanted to see the film again right away.

There are several slo-mo sequences. Could you talk about why you used them and to what effect?
The Phantom-camera slow-motion wedding sequence (250fps) near the end of the film was done to give the feeling of a funeral, instead of a wedding, because the De Niro character has just been forced to do the worst thing he will ever do in his life. Scorsese wanted to hold on De Niro’s face and evoke what he is feeling and to study the Italian-American faces of the mobsters surrounding him. Instead of the joy a wedding is supposed to bring, there is a deep feeling of grief.
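
The arithmetic behind that 250fps figure is worth a line. Assuming standard 24fps playback (an assumption, since the interview doesn’t state the playback rate), footage captured at 250fps runs a little over ten times slower than life:

    capture_fps = 250    # Phantom high-speed capture, per Schoonmaker
    playback_fps = 24    # standard theatrical playback (assumed)
    slowdown = capture_fps / playback_fps
    print(f"{slowdown:.1f}x slower")  # ~10.4x: one real second lasts ~10.4 s on screen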

What was the most difficult sequence to cut and why?
The montage where De Niro repeatedly throws guns into the river after he has killed someone took some time to get right. It was very normal at first — and then we started violating the structure and jump cutting and shortening until we got the right feeling. It was fun.

There’s been a lot of talk about the digital de-aging process. How did it impact the edit?
Pablo Helman at ILM came up with the new de-aging process, and it works incredibly well. He would send shots and we would evaluate them and sometimes ask for changes — usually to be sure that we kept the amazing performances of De Niro, Pacino and Pesci intact. Sometimes we would put back in a few wrinkles if it meant we could keep the subtlety of De Niro’s acting, for example. Scorsese was adamant that he didn’t want to have younger actors play the three main parts in the beginning of the film. So he really wanted this “youthifying” process to work — and it does!

There’s a lot of graphic violence. How do you feel about that in the film?
Scorsese made the violence very quick in The Irishman and shot it in a deceptively simple way. There aren’t any complicated camera moves and flashy editing. Sometimes the violence takes place after a simple pan, when you least expect it because of the blandness of the setting. He wanted to show the banality of violence in the mob — that it is a job, and if you do it well, you get rewarded. There’s no morality involved.

Last time we talked, you were using the Lightworks editing system. Do you still use Lightworks, and if so, can you talk about the system’s advantages for you?
I use Lightworks because the editing surface is still the fastest, most efficient and most intuitive to use. Maintaining sync works differently than in all other NLE systems. You don’t correct sync by sync lock — if you go out of sync, Lightworks gives you a red icon with the number of frames that you are out of sync. You get to choose where you want to correct sync. Since editors place sound and picture on the timeline, adjusting sync exactly where you want to is much more efficient.
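
For readers who haven’t used Lightworks, the idea is easy to model. This is an assumed simplification, not the application’s actual internals: picture and sound from the same take share a sync relationship, so drift is just the difference between how far each has been displaced on the timeline.

    def frames_out_of_sync(video_timeline_in: int, video_source_in: int,
                           audio_timeline_in: int, audio_source_in: int) -> int:
        """Positive result = audio is late relative to picture, in frames."""
        video_offset = video_timeline_in - video_source_in
        audio_offset = audio_timeline_in - audio_source_in
        return audio_offset - video_offset

    # Example: picture slides 12 frames away from its sound.
    print(frames_out_of_sync(1000, 500, 1012, 500))  # -> 12 frames out

The editor then chooses which side to slip by those 12 frames, which is the choice Schoonmaker describes.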

You’ve been Marty’s editor since his very first film — a 50-year collaboration. What’s the secret?
I think Scorsese felt when he first met me that I would do what was right for his films — that there wouldn’t be ego battles. We work together extremely well. That’s all there is to it. There couldn’t be a better job.

Do you ever have strong disagreements about the editing?
If we do have disagreements, which is very rare, they are never strong. He is very open to experimentation. Sometimes we will screen two ways and see what the audience says. But that is very rare.

What’s next?
A movie about the Osage Nation in Oklahoma, based on the book “Killers of the Flower Moon” by David Grann.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Alaina Zanotti rejoins Cartel as executive producer

Santa Monica-based editorial and post studio Cartel has named Alaina Zanotti as executive producer to help with business development and to oversee creative operations along with partner and executive producer Lauren Bleiweiss. Additionally, Cartel has bolstered its roster with the signing of comedic editor Kevin Zimmerman.

Kevin Zimmerman

With more than 15 years of experience, Zanotti joins Cartel after working for clients that include BBDO, Wieden+Kennedy, Deutsch, Google, Paramount and Disney. Zanotti most recently served as senior executive producer at Method Studios, where she oversaw business development for global VFX and post. Prior to that stint, she joined Cartel in 2016 to assist the newly established post and editorial house’s growth. Previously, Zanotti spent more than a decade driving operations and raising brand visibility for Method and Company 3.

Editor Zimmerman joins Cartel following a tenure as a freelance editor, during which his comedic timing and entrepreneurial spirit earned him commercial work for Avocados From Mexico and Planters that aired during 2019’s Super Bowl.

Throughout his two-decade career in editorial, Zimmerman has held positions at Spot Welders, NO6, Whitehouse Post and FilmCore, with recent work for Sprite, Kia, hotels.com, Microsoft and Miller Lite, and a PSA for Girls Who Code. Zimmerman has previously worked with Cartel partners Adam Robinson and Leo Scott.

Chris Hellman joins Harbor as ECD of editorial

Harbor has added award-winning editor Chris Hellman as executive creative director of editorial. Hellman brings 35 years of experience editing commercials in collaboration with producers, art directors, writers and directors. He will be based at Harbor in New York but available at its locations in LA and London as well.

During his long and distinguished career, Hellman has garnered multiple Cannes Lions, Addy Awards, Clios, One Show awards, London International Awards, CA Annuals and AICP Awards. He served as senior editor at Crew Cuts for 16 years, was owner/partner and senior editor at Homestead Editorial, and then became senior editor at Cutting Room Films. Hellman later took up the role of creative director of post production with the FCB Health network of agencies. His work has been seen in movie theaters, at concerts and during the Super Bowl, and as short films and spoof commercials on Saturday Night Live.

“Creating great commercial advertising is about collaboration,” says Hellman. “Harbor is evolving to take that collaboration to a new level, offering clients an approach where the editor is brought into the creative process early on, bringing a new paradigm and a singular creative force.”

Hellman’s clients have included AT&T, Verizon, IBM, Intel, ESPN, NFL, MLB, NBA, Nike, Adidas, New Balance, 3M, Starbucks, Coke, Pepsi, Lipton, Tropicana, Audi, BMW, Volvo, Ford, Jaguar, GMC, Chrysler, Porsche, Pfizer, Merck, Novartis, AstraZeneca, Bayer, Johnson & Johnson, General Mills, Unilever, Lancôme, Estée Lauder, Macy’s, TJ Maxx, Tommy Hilfiger, Victoria’s Secret, Lands’ End and The Jon Stewart Show, among many others.

Behind the Title: Logan & Sons director Tom Schlagkamp

This director also loves editing, sound design and working with VFX long before and after the shoot.

Name: Tom Schlagkamp

Company: Logan & Sons, the live-action division of bicoastal content creation studio Logan, which is based in NYC and LA.

Job Title: Director

What’s your favorite part of the job?
I can honestly say I love every detail of the job, even the initial pitch, as it’s the first contact with a new story, a new project and a new challenge. I put a lot of heart into every aspect of a film. The better you’ve prepared in pre-production, the more creative you can be during the shoot; preparation buys you more time and oversight on set and more power to react if anything changes.

Tom Schlagkamp’s short film Dysconnected.

For my European, South African and Asian projects, I’m also very happy to be deeply involved in editing, sound design and post production, as I love working with the material. I usually shoot plenty of footage, so there are more possibilities to work with in editing.

What’s your least favorite?
Not winning a job. That’s why I try to avoid it… (laughs).

If you didn’t have this job, what would you be doing instead?
Well, plan A would be a rock star — specifically, a guitarist in a thrash metal band. Plan B would be the exact opposite: working at my family’s winery — Schlagkamp-Desoye in Germany’s beautiful Mosel Valley. My brother runs this company now, which is in its 11th generation. Our family has grown wine since 1602. The winery also includes a wine museum.

How early on did you know this would be your path?
In Germany, you don’t necessarily jump from high school to college right away, so I took a short time to learn all the basics of filmmaking with as much practical experience as I could get. That included directing music videos and short films while I worked for Germany’s biggest TV station, RTL. There I learned to edit and produced campaigns for shows, in particular movie trailers and campaigns for the TV premieres of blockbuster movies. That was a lot of work and fun at the same time.

What was it about directing that attracted you?
The whole idea of creating something completely new. I loved (and still do) the films of the “New Hollywood” and the Nouvelle Vague — they challenged the regular way of storytelling and created something outstanding that changed filmmaking forever. This fascinated me, and I knew I had to learn the rules first in order to be able to question them, so I started studying at Germany’s film academy, the Filmakademie Baden-Württemberg.

What is it about directing that keeps you interested?
It’s about always moving forward. There are so many more ways you can tell a story and so many stories that have not yet been told, so I love working on as many projects as possible.

Dysconnected

Do you get involved with post at all?
Yes, I love to be part of that whenever the circumstances allow it. As mentioned before, I love editing and sound design as well, but also planning and working with VFX long before and after the shoot is fascinating to me.

Can you name some recent projects you have worked on?
As I answer these questions, I’m sitting at the airport in Berlin, traveling to Johannesburg, South Africa. I’m excited about shooting a series of commercials in the African savanna. I shot many commercials this year, but was also happy that my short film Dysconnected, which I shot in Los Angeles last year, premiered at LA Shorts International Film Festival this summer.

What project are you most proud of?
I loved shooting the Rock ’n’ Roll Manifesto for Visions magazine, because it was the perfect combination of my job as a director and my aforementioned Plan A, making my living as a musician. Also, everybody involved in the project was so into it, and it was the best shooting experience. Winning awards with it in the end was an added bonus.

Rock ‘n’ Roll Manifesto

Name three pieces of technology you can’t live without.
1. Noise cancelling headphones. When I travel, I love listening to music and podcasts, and with these headphones you can dive into that world perfectly.
2. My mobile phone, which I hardly use for phone calls anymore but everything else.
3. My laptop, which is part of every project from the beginning until the end.

What do you do to de-stress from it all?
Cycling, hiking and rock concerts. There is nothing like the silence of being in pure nature and the loudness of heavy guitars and drums at a metal show (laughs).

2019 HPA Award winners announced

The industry came together on November 21 in Los Angeles to celebrate its own at the 14th annual HPA Awards. Awards were given to individuals and teams working in 12 creative craft categories, recognizing outstanding contributions to color grading, sound, editing and visual effects for commercials, television and feature film.

Rob Legato receiving Lifetime Achievement Award from presenter Mike Kanfer. (Photo by Ryan Miller/Capture Imaging)

As was previously announced, renowned visual effects supervisor and creative Robert Legato, ASC, was honored with this year’s HPA Lifetime Achievement Award; Peter Jackson’s They Shall Not Grow Old was presented with the HPA Judges Award for Creativity and Innovation; acclaimed journalist Peter Caranicas was the recipient of the very first HPA Legacy Award; and special awards were presented for Engineering Excellence.

The winners of the 2019 HPA Awards are:

Outstanding Color Grading – Theatrical Feature

WINNER: “Spider-Man: Into the Spider-Verse”
Natasha Leonnet // Efilm

“First Man”
Natasha Leonnet // Efilm

“Roma”
Steven J. Scott // Technicolor

Natasha Leonnet (Photo by Ryan Miller/Capture Imaging)

“Green Book”
Walter Volpatto // FotoKem

“The Nutcracker and the Four Realms”
Tom Poole // Company 3

“Us”
Michael Hatzer // Technicolor


Outstanding Color Grading – Episodic or Non-theatrical Feature

WINNER: “Game of Thrones – Winterfell”
Joe Finley // Sim, Los Angeles

“The Handmaid’s Tale – Liars”
Bill Ferwerda // Deluxe Toronto

“The Marvelous Mrs. Maisel – Vote for Kennedy, Vote for Kennedy”
Steven Bodner // Light Iron

“I Am the Night – Pilot”
Stefan Sonnenfeld // Company 3

“Gotham – Legend of the Dark Knight: The Trial of Jim Gordon”
Paul Westerbeck // Picture Shop

“The Man in The High Castle – Jahr Null”
Roy Vasich // Technicolor


Outstanding Color Grading – Commercial  

WINNER: Hennessy X.O. – “The Seven Worlds”
Stephen Nakamura // Company 3

Zara – “Woman Campaign Spring Summer 2019”
Tim Masick // Company 3

Tiffany & Co. – “Believe in Dreams: A Tiffany Holiday”
James Tillett // Moving Picture Company

Palms Casino – “Unstatus Quo”
Ricky Gausis // Moving Picture Company

Audi – “Cashew”
Tom Poole // Company 3


Outstanding Editing – Theatrical Feature

Once Upon a Time… in Hollywood

WINNER: “Once Upon a Time… in Hollywood”
Fred Raskin, ACE

“Green Book”
Patrick J. Don Vito, ACE

“Rolling Thunder Revue: A Bob Dylan Story by Martin Scorsese”
David Tedeschi, Damian Rodriguez

“The Other Side of the Wind”
Orson Welles, Bob Murawski, ACE

“A Star Is Born”
Jay Cassidy, ACE


Outstanding Editing – Episodic or Non-theatrical Feature (30 Minutes and Under)

VEEP

WINNER: “Veep – Pledge”
Roger Nygard, ACE

“Russian Doll – The Way Out”
Todd Downing

“Homecoming – Redwood”
Rosanne Tan, ACE

“Withorwithout”
Jake Shaver, Shannon Albrink // Therapy Studios

“Russian Doll – Ariadne”
Laura Weinberg


Outstanding Editing – Episodic or Non-theatrical Feature (Over 30 Minutes)

WINNER: “Stranger Things – Chapter Eight: The Battle of Starcourt”
Dean Zimmerman, ACE, Katheryn Naranjo

“Chernobyl – Vichnaya Pamyat”
Simon Smith, Jinx Godfrey // Sister Pictures

“Game of Thrones – The Iron Throne”
Katie Weiland, ACE

“Game of Thrones – The Long Night”
Tim Porter, ACE

“The Bodyguard – Episode One”
Steve Singleton


Outstanding Sound – Theatrical Feature

WINNER: “Godzilla: King of the Monsters”
Tim LeBlanc, Tom Ozanich, MPSE // Warner Bros.
Erik Aadahl, MPSE, Nancy Nugent, MPSE, Jason W. Jennings // E Squared

“Shazam!”
Michael Keller, Kevin O’Connell // Warner Bros.
Bill R. Dean, MPSE, Erick Ocampo, Kelly Oxford, MPSE // Technicolor

“Smallfoot”
Michael Babcock, David E. Fluhr, CAS, Jeff Sawyer, Chris Diebold, Harrison Meyle // Warner Bros.

“Roma”
Skip Lievsay, Sergio Diaz, Craig Henighan, Carlos Honc, Ruy Garcia, MPSE, Caleb Townsend

“Aquaman”
Tim LeBlanc // Warner Bros.
Peter Brown, Joe Dzuban, Stephen P. Robinson, MPSE, Eliot Connors, MPSE // Formosa Group


Outstanding Sound – Episodic or Non-theatrical Feature

WINNER: “The Haunting of Hill House – Two Storms”
Trevor Gates, MPSE, Jason Dotts, Jonathan Wales, Paul Knox, Walter Spencer // Formosa Group

“Chernobyl – 1:23:45”
Stefan Henrix, Stuart Hilliker, Joe Beal, Michael Maroussas, Harry Barnes // Boom Post

“Deadwood: The Movie”
John W. Cook II, Bill Freesh, Mandell Winter, MPSE, Daniel Colman, MPSE, Ben Cook, MPSE, Micha Liberman // NBC Universal

“Game of Thrones – The Bells”
Tim Kimmel, MPSE, Onnalee Blank, CAS, Mathew Waters, CAS, Paula Fairfield, David Klotz

“Homecoming – Protocol”
John W. Cook II, Bill Freesh, Kevin Buchholz, Jeff A. Pitts, Ben Zales, Polly McKinnon // NBC Universal


Outstanding Sound – Commercial 

WINNER: John Lewis & Partners – “Bohemian Rhapsody”
Mark Hills, Anthony Moore // Factory

Audi – “Life”
Doobie White // Therapy Studios

Leonard Cheshire Disability – “Together Unstoppable”
Mark Hills // Factory

New York Times – “The Truth Is Worth It: Fearlessness”
Aaron Reynolds // Wave Studios NY

John Lewis & Partners – “The Boy and the Piano”
Anthony Moore // Factory


Outstanding Visual Effects – Theatrical Feature

WINNER: “The Lion King”
Robert Legato
Andrew R. Jones
Adam Valdez, Elliot Newman, Audrey Ferrara // MPC Film
Tom Peitzman // T&C Productions

“Avengers: Endgame”
Matt Aitken, Marvyn Young, Sidney Kombo-Kintombo, Sean Walker, David Conley // Weta Digital

“Spider-Man: Far From Home”
Alexis Wajsbrot, Sylvain Degrotte, Nathan McConnel, Stephen Kennedy, Jonathan Opgenhaffen // Framestore

“Alita: Battle Angel”
Eric Saindon, Michael Cozens, Dejan Momcilovic, Mark Haenga, Kevin Sherwood // Weta Digital

“Pokémon Detective Pikachu”
Jonathan Fawkner, Carlos Monzon, Gavin Mckenzie, Fabio Zangla, Dale Newton // Framestore


Outstanding Visual Effects – Episodic (Under 13 Episodes) or Non-theatrical Feature

Game of Thrones

WINNER: “Game of Thrones – The Bells”
Steve Kullback, Joe Bauer, Ted Rae
Mohsen Mousavi // Scanline
Thomas Schelesny // Image Engine

“Game of Thrones – The Long Night”
Martin Hill, Nicky Muir, Mike Perry, Mark Richardson, Darren Christie // Weta Digital

“The Umbrella Academy – The White Violin”
Everett Burrell, Misato Shinohara, Chris White, Jeff Campbell, Sebastien Bergeron

“The Man in the High Castle – Jahr Null”
Lawson Deming, Cory Jamieson, Casi Blume, Nick Chamberlain, William Parker, Saber Jlassi, Chris Parks // Barnstorm VFX

“Chernobyl – 1:23:45”
Lindsay McFarlane
Max Dennison, Clare Cheetham, Steven Godfrey, Luke Letkey // DNEG


Outstanding Visual Effects – Episodic (Over 13 Episodes)

Team from The Orville – Outstanding VFX, Episodic, Over 13 Episodes (Photo by Ryan Miller/Capture Imaging)

WINNER: “The Orville – Identity: Part II”
Tommy Tran, Kevin Lingenfelser, Joseph Vincent Pike // FuseFX
Brandon Fayette, Brooke Noska // Twentieth Century FOX TV

“Hawaii Five-0 – Ke iho mai nei ko luna”
Thomas Connors, Anthony Davis, Chad Schott, Gary Lopez, Adam Avitabile // Picture Shop

“9-1-1 – 7.1”
Jon Massey, Tony Pirzadeh, Brigitte Bourque, Gavin Whelan, Kwon Choi // FuseFX

“Star Trek: Discovery – Such Sweet Sorrow Part 2”
Jason Zimmerman, Ante Dekovic, Aleksandra Kochoska, Charles Collyer, Alexander Wood // CBS Television Studios

“The Flash – King Shark vs. Gorilla Grodd”
Armen V. Kevorkian, Joshua Spivack, Andranik Taranyan, Shirak Agresta, Jason Shulman // Encore VFX

The 2019 HPA Engineering Excellence Awards were presented to:

Adobe – Content-Aware Fill for Video in Adobe After Effects

Epic Games – Unreal Engine 4

Pixelworks – TrueCut Motion

Portrait Displays and LG Electronics – CalMan LUT-based Auto-Calibration Integration with LG OLED TVs

Honorable Mentions were awarded to Ambidio for Ambidio Looking Glass; Grass Valley for Creative Grading; and Netflix for Photon.

Julian Clarke on editing Terminator: Dark Fate

By Oliver Peters

Linda Hamilton’s Sarah Connor and Arnold Schwarzenegger’s T-800 are back to save humanity from a dystopian future in this latest installment of the Terminator franchise. James Cameron is also back and brings with him writing and producing credits, which is fitting — Terminator: Dark Fate is in essence Cameron’s sequel to Terminator 2: Judgment Day.

Julian Clarke

Tim Miller (Deadpool) is at the helm to direct the tale. It’s roughly two decades after the time of T2, and a new Rev-9 machine has been sent from an alternate future to kill Dani Ramos (Natalia Reyes), an unsuspecting auto plant worker in Mexico. But the new future’s resistance has sent back Grace (Mackenzie Davis), an enhanced super-soldier, to combat the Rev-9 and save her. They cross paths with Connor, and the story sets off on a mad dash to the finale at Hoover Dam.

Miller brought back much of his Deadpool team, including his VFX shop Blur, DP Ken Seng and editor Julian Clarke. This is also the second pairing of Miller and Clarke with Adobe. Both Deadpool and Terminator: Dark Fate were edited using Premiere Pro. In fact, Adobe was also happy to tie in with the film’s promotion through its own #CreateYourFate trailer remix challenge. Participants could re-edit their own trailer using supplied content from the film.

I recently spoke with Clarke about the challenges and fun of cutting this latest iteration of such an iconic film franchise.

Terminator: Dark Fate picks up two decades after Terminator 2, leaving out the timelines of the subsequent sequels. Was that always the plan, or did it evolve out of the process of making the film?
That had to do with the screenplay. You were written into a corner by the various sequels. We really wanted to bring Linda Hamilton’s character back. With Jim involved, we wanted to get back to first principles and have it based on Cameron’s mythology alone. To get back to the Linda/Arnold character arcs, and then add some new stuff to that.

Many fans were attracted to the franchise by Cameron’s two original Terminator films. Was there a conscious effort at integrating that nostalgia?
I come from a place of deep fandom for Terminator 2. As a teenager I had VHS copies of Aliens and Terminator 2 and watched them on repeat after school! Those films are deeply embedded in my psyche, and both of them have aged well — they still hold up. I watched the sequels, and they just didn’t feel like a Terminator film to me. So the goal was definitely to make it of the DNA of those first two movies. There’s going to be a chase. It’s going to be more grounded. It’s going to get back into the Sarah Connor character and have more heart.

This film tends to have elements of humor unlike most other action films. That must have posed a challenge to set the right tone without getting campy.
The humor thing is interesting. Terminator 2 has a lot of humor throughout. We have a little bit of humor in the first half and then more once Arnold shows up, but that’s really the way it had to be. The Dani Ramos character — who’s your entry point into the movie — is devastated when her whole family is killed. To have a lot of jokes happening would be terrible. It’s not the same in Terminator 2 because John Connor’s stepparents get very little screen time, and they don’t seem that nice. You feel bad for them, but it’s OK that you get into this funny stuff right off the bat. On this one we had to ease into the humor so you could [experience] the gravity of the situation at the start of the movie.

Did you have to do much to alter that balance during the edit?
There were one or two jokes that we nipped out, but it wasn’t like that whole first act was chock full of jokes. The tone of the first act is more like Terminator, which is more of a thriller or horror movie. Then it becomes more like T2 as the action gets bigger and the jokes come in. So the first half is like a bigger Terminator and the second half more like T2.

Deadpool, which Tim Miller also directed, used a very nonlinear story structure, balancing action, comedic moments and drama. Terminator was always designed with a linear, straightforward storyline. Right?
A movie hands you certain editing tools. Deadpool was designed to be nonlinear, with characters in different places, so there are a whole bunch of options for you. Terminator: Dark Fate is more like a road movie. Certain stops along the road are predetermined. You can’t be in Texas before Mexico. So the structural options you had were where to check in with the Rev-9, as well as the inter-scene structure. Once you are in the detention center, who are you cutting to? Sarah? Dani? However, where that is placed in the movie is pretty much set. All you can do is pace it up, pace it down, adjust how to get there. There aren’t a lot of mobile pieces that can be swapped around.

When we had talked after Deadpool, you discussed how you liked the assistants to build string-outs — what some call a Kem roll. Similar action from every take is assembled back to back into one sequence, in order. Did you use that same organizational method on Terminator: Dark Fate?
Sometimes we were so swamped with material that there wasn’t time to create string-outs. I still like to have those. It’s a nice way to quickly see all the pieces that cover a moment. If you are trying to find the one take or action that’s 5% better than another, then it’s good to see them all in a row, rather than trying to keep it all in your head for a five-minute take. There was a lot of footage that we shot in the action scenes, but we didn’t do 11 or 12 takes for a dialogue scene. I didn’t feel like I needed some tool to quickly navigate through the dialogue takes. We would string out the ones that were more complicated.
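
For anyone curious what building a string-out involves mechanically, here is a minimal sketch using the open-source OpenTimelineIO library rather than any particular NLE; the take names, file URLs and durations are all hypothetical:

    import opentimelineio as otio

    FPS = 24

    def build_stringout(takes, name="Scene 42 string-out"):
        """Assemble every take of a moment back to back on one video track."""
        timeline = otio.schema.Timeline(name=name)
        track = otio.schema.Track(name="V1", kind=otio.schema.TrackKind.Video)
        timeline.tracks.append(track)
        for take_name, url, duration_frames in takes:
            track.append(otio.schema.Clip(
                name=take_name,
                media_reference=otio.schema.ExternalReference(target_url=url),
                source_range=otio.opentime.TimeRange(
                    start_time=otio.opentime.RationalTime(0, FPS),
                    duration=otio.opentime.RationalTime(duration_frames, FPS),
                ),
            ))
        return timeline

    takes = [  # hypothetical dailies for one action beat
        ("42A-1", "file:///dailies/42A_t1.mov", 480),
        ("42A-2", "file:///dailies/42A_t2.mov", 512),
        ("42B-1", "file:///dailies/42B_t1.mov", 450),
    ]
    otio.adapters.write_to_file(build_stringout(takes), "scene42_stringout.otio")

The resulting sequence plays every version of the same moment in a row, which is what makes the 5% comparisons Clarke describes possible.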

Depending on the directing style, a series of takes may have increasingly calibrated performances with successive takes. With other directors, each take might be a lot different than the one before and after it. What is your approach to evaluating which is the best take to use?
It’s interesting when you use the earlier takes versus the later takes and what you get from them. The later takes are usually the ones that are most directed. The actors are warmed up and most closely nail what the director has in mind. So they are strong in that regard, but sometimes they can become more self-conscious. Sometimes the first take is more of a throwaway and may have less power, but it feels more real — more off the cuff. Sometimes a delivered dialogue line feels less written, and you’ll buy it more. Other times you’ll want that more dramatic quality of the later takes. My instinct is to first use the later takes, but as you start to revise a scene, you often go back to pieces of the earlier takes to ground it a little more.

How long did the production and post take?
It took a little over 100 days of shooting with a lot of units. I work on a lot of mid-budget films, so this seemed like a really long shoot. It was a little relentless for everyone — even squeezing it into those 100 days. Shooting action with a lot of VFX is slow due to the reset time needed between takes. The ending of the movie is 30 minutes of action in a row. That’s a big job shooting all of that stuff. When they have a couple of units cranking through the dialogue scenes plus shooting action sequences — that’s when I have to work hard to keep up. Once you hit the roadblocks of shooting just those little action pieces, you get a little time to catch up.

We had the usual director’s cut period and finished by the end of this September. The original plan was to finish by the beginning of September, but we needed the time for VFX. So everything piled up with the DI and the mix in order to still hit the release date. September got a little crazy. It seems like a long time — a total of 13 or 14 months — but it still was an absolute sprint to get the movie in shape and get the VFX into the film in time. This might be normal for some of these films, but compared to the other VFX movies I’ve done, it was definitely turning things up a notch!

I imagine that there was a fair amount of previz required to lay out the action for the large VFX and CG scenes. Did you have that to work with as placeholder shots? How did you handle adjusting the cut as the interim and final shots were delivered?
Tim is big into previz with his background in VFX and animation and owning his own VFX company. We had very detailed animatics going into production. Depending on a lot of factors, you still abandon a lot of things. For example, the freeway chases are quite a bit different because when you go there and do it with real cars, they do different things. Or only part of the cars look like they are going fast enough. Those scenes became quite different than the previz.

Others are almost 100% CG, so you can drop in the previz as placeholders. Although, even in those cases, sometimes the finished shot doesn’t feel real enough. In the “cartoon” world of previz, you can do wild camera moves and say, “Wow, that seems cool!” But when you start doing it at photoreal quality, then you go, “This seems really fake.” So we tried to get ahead of that stuff and find what to do with the camera to ground it. Kind of mess it up so it’s not too dynamic and perfect.

How involved were you with shaping the music? Did you use previous Terminator films’ scores as a temp track to cut with?
I was very involved with the music production. I definitely used a lot of temp music. Some of it was ripped from old Terminator movies, but there’s only so much Terminator 2 music you can put in. Those scores used a lot of synthesizers that date the sound. I did use “Desert Suite” from Terminator 2, when Sarah is in the hotel room. I loved having a very direct homage to a Sarah Connor moment while she’s talking about John. Then I begged our composer, Tom Holkenborg (aka Junkie XL), to consider doing a version of it for our movie. So it is essentially the same chord progression.

That was an interesting musical and general question about how much do you lean into the homage thing. It’s powerful when you do it, but if you do it too much, it starts to feel artificial or pandering. So I tried to hit the sweet spot so you knew you were watching a Terminator movie, but not so much that it felt like Terminator karaoke. How many times can you go da-dum-dum-da-da-dum? You have to pick your moments for those Terminator motifs. It’s diminishing returns if you do it too much.

Another inspirational moment for me was another part in Terminator 2. There’s a disturbing industrial sound for the T-1000. It sounds more like a foghorn or something in a factory rather than music, and it created this unnerving quality to the T-1000 scenes, when he’s just scoping things out. So we came up with a modern-day electronic equivalent for the Rev-9 character, and that was very potent.

Was James Cameron involved much in the post production?
He’s quite busy with his Avatar movies. Some of the time he was in New Zealand, some of the time he was in Los Angeles. Depending on where he was and where we were in the process, we would hit milestones, like screenings or the first cut. We would send him versions and download a bunch of his thoughts.

Editing is very much a part of his wheelhouse. Unlike many other directors, he really thinks about this shot, then that shot, then the next shot. His mind really works that way. Sometimes he would give us pretty specific, dialed-in notes on things. Sometimes it would just be bigger suggestions, like, “Maybe the action cutting pattern could be more like this …” So we’d get his thoughts — and, of course, he’s Jim Cameron, and he knows the business and the Terminator franchise — so I listened pretty carefully to that input.

This is the second film that you’ve cut with Premiere Pro. Deadpool was first, and there were challenges using it on such a complex project. What was the experience like this time around?
Whenever you set out to use a new workflow, you’re going to hit some bumps. Not to say Premiere is new, because it’s been around a long time and has millions of users, but it’s unusual to use it on large VFX movies, for specific reasons.

L-R: Matthew Carson and Julian Clarke

On Deadpool, that led to certain challenges, and that’s just what happens when you try to do something new. For instance, we had to split the movie into separate projects for each reel instead of working in one large project. Even so, the size of our project files made it tough. They were so full of media that they would take five minutes to open. Nevertheless, we made it work, and there are lots of benefits to using Adobe over other applications.

In comparison, the interface of Avid Media Composer looks like it was designed 20 years ago, but they have multi-user collaboration nailed, and I love the trim tool. Yet some things are old and creaky. Adobe’s not that at all. It’s nice and elegant in terms of the actual editing process. We got through it and sat down with Adobe to point out things that needed work, and they worked on them. When we started up Terminator, they had a whole new build for us. Project files now opened in 15 seconds. They are about halfway there in terms of multi-user editing. Now everyone can go into a big, shared project, and you can move bins back and forth, although only one user at a time has write access to the master project.

This is not simple software they are writing. Adobe is putting a lot of work into making it a more fitting tool for this type of movie. Even though this film was exponentially larger than Deadpool, from the Adobe side it was a smoother process. Props to them for doing that! The cool part about pioneering this stuff is the amount of work that Adobe is on board to do. They’ll have people work on stuff that is helpful to us, so we get to participate a little in how Adobe’s software gets made.

With two large Premiere Pro projects under your belt, what sort of new features would you like to see Adobe add to the application to make it even better for feature film editors?
They've built the software out from a single-user application into multi-user software, but at the base level it's still inherently single-user. Sometimes your render files get unlinked when you go back and forth between multiple users. There's probably stuff where they have to dig deep into the code to make those minor annoyances go away. Other items I'd like to see — let's not have to use third-party software to send change lists to the mix stage.

I know Premiere Pro integrates beautifully with After Effects, but for me, After Effects is this precise tool for executing shots. I don’t want a fine tool for compositing — I want to work in broad strokes and then have someone come back and clean it up. I would love to have a tracking tool to composite two shots together for a seamless, split screen of two combined takes — features like that.

The After Effects integration and the color correction are awesome features for a single user to execute the film, but I don’t have the time to be the guy to execute the film at that high level. I just have to keep going. I want to be able to do a fast and dirty version so I know it’s not a terrible idea, and then turn to someone else and say, “OK, make that good.” After Effects is cool, but it’s more for VFX editors or single users who are trying to make a film on their own.

After all of these action films, are you ready to do a different type of film, like a period drama?
Funny you should say that. After Deadpool I worked on The Handmaid’s Tale pilot, and it was exactly that. I was working on this beautifully acted, elegant project with tons of women characters and almost everything was done in-camera. It was a lot of parlor room drama and power dynamics. And that was wonderful to work on after all of this VFX/action stuff. Periodically it’s nice to flex a different creative muscle.

It’s not that I only work on science-fiction/VFX projects — which I love — but, in part, people start associating you with a certain genre, and then that becomes an easy thing to pursue and get work for.

Much like acting, if you want to be known for doing a lot of different things, you have to actively pursue it. It’s easy to go where momentum will take you. If you want to be the editor who can cut any genre, you have to make it a mission to pursue those projects that will keep your resume looking diverse. For a brief moment after Deadpool, I might have been able to pivot to a comedy career (laughs). That was a real hybrid, so it was challenging to thread the needle of the different tones of the film and make it feel like one piece.

Any final thoughts on the challenges of editing Terminator: Dark Fate?
The biggest challenge of the film was that, in a way, the film was an ensemble with the Dani character, the Grace character, the Sarah character and Arnold's character — the T-800. All of these characters are protagonists with their own individual arcs. Making sure you were adequately servicing those arcs without grinding the movie to a halt, or going too long without touching base with a character — figuring out how to dial that in was the major challenge of the movie, plus the scale of the VFX and finessing all the action scenes. I learned a lot.


Oliver Peters is an experienced film and commercial editor/colorist. In addition, he regularly interviews editors for trade publications. He may be contacted through his website at oliverpeters.com

Final Cut ups Zoe Schack to editor

Final Cut in LA has promoted Zoe Schack to editor after three years at the studio as an assistant editor. While at Final Cut, Schack has been mentored by Final Cut editors Crispin Struthers, Joe Guest, Jeff Buchanan and Rick Russell.

Schack has edited branded content and commercials for Audi, Infiniti, Doritos and Dollar Shave Club as well as music videos for Swae Lee and Whitney Woerz. She has also worked with a number of high-profile directors, including Dougal Wilson, Ava DuVernay, Michel Gondry, Craig Gillespie and Steve Ayson.

Originally from a small town north of New York, Schack studied film at Rhode Island School of Design and NYU’s Tisch School of the Arts. Her love for documentaries led her to intern with renowned filmmaker Albert Maysles and to produce the Bicycle Film Festival in Portland, Oregon. She edited several short documentaries and a pilot series that were featured in many film festivals.

“It’s been amazing watching Zoe’s growth the last few years,” says Final Cut executive producer Suzy Ramirez. “She’s so meticulous, always doing a deep dive into the footage. Clients love working with her because she makes the process fun. She’s grown here at Final Cut so much already under the guidance of our editors, and her craft keeps evolving. I’m excited to see what’s ahead.”

A post engineer’s thoughts on Adobe MAX, new offerings

By Mike McCarthy

Last week, I had the opportunity to attend Adobe's MAX conference at the LA Convention Center. Adobe showed me, and 15,000 of my closest friends, the newest updates to pretty much all of its Creative Cloud applications, as well as a number of interesting upcoming developments. From a post production perspective, the most significant pieces of news are the releases of Premiere Pro 14 and After Effects 17 (a.k.a. the 2020 releases of those Creative Cloud apps).

The main show ran from Monday to Wednesday, with a number of pre-show seminars and activities the preceding weekend. My experience started off by attending a screening of the new Terminator: Dark Fate film at LA Live, followed by a Q&A with the director and post team. The new Terminator was edited in Premiere Pro, sharing the project assets between a large team of editors and assistants, with extensive use of After Effects, Adobe's newly acquired Substance app and various other tools in the Creative Cloud.

The post team extolled the improvements in shared project support and project opening times since their last Premiere endeavor on the first Deadpool movie. Visual effects editor Jon Carr shared how they used the integration between Premiere and After Effects to facilitate rapid generation of temporary “postvis” effects. This helped the editors tell the story while they were waiting on the VFX teams to finish generating the final CGI characters and renders.

MAX
The conference itself kicked off with a keynote presentation of all of Adobe’s new developments and releases. The 150-minute presentation covered all aspects of the company’s extensive line of applications. “Creativity for All” is the primary message Adobe is going for, and they focused on the tension between creativity and time. So they are trying to improve their products in ways that give their users more time to be creative.

The three prongs of that approach for this iteration of updates were:
– Faster, more powerful, more reliable — fixing time-wasting bugs, improving hardware use.
– Create anywhere, anytime, with anyone — adding functionality via the iPad, and shared Libraries for collaboration.
– Explore new frontiers — specifically in 3D with Adobe's Dimension, Substance and Aero.

Education is also an important focus for Adobe, with 15 million copies of CC in use in education around the world. They are also creating a platform for CC users to stream their working process to viewers who want to learn from them, directly from within the applications. That will probably integrate with the new expanded Creative Cloud app released last month. They also have released integration for Office apps to access assets in CC libraries.

The first application updates they showed off were in Photoshop. They have made the new locked-aspect-ratio scaling a toggle-able behavior, improved the warp tool and improved ways to navigate deep layer stacks by seeing which layers affect particular parts of an image. But the biggest improvement is AI-based object selection, which generates detailed masks from simple box selections or rough lassos. Illustrator now has GPU acceleration, improving performance on larger documents, and a path-simplifying tool to reduce the number of anchor points.

They released Photoshop for the iPad and announced that Illustrator will be following that path as well. Fresco is headed in the other direction and is now available on Windows. That is currently limited to Microsoft Surface products, but I look forward to being able to try it out on my ZBook-X2 at some point. Adobe XD has new features and, as I learned at one of the later sessions, is apparently the best way to move complex Illustrator files into After Effects.

Premiere
Premiere Pro 14 has a number of new features, the most significant one being AI-driven automatic reframe, which allows you to automatically convert your edited project into other aspect ratios for various deliverables. While 16×9 is obviously a standard size, certain web platforms are optimized for square or tall videos. The feature can also be used to reframe content from 2.35 to 16×9 or 4×3, which are frequent delivery requirements for the feature films I work on. My favorite aspect of this new functionality is that the user has complete control over the results.

Unlike other automated features such as warp stabilizer, which offers nothing more than an on/off choice for applying its results, auto-reframe simply generates motion-effect keyframes that can be further edited and customized by the user once the initial AI pass is complete. It also has a nesting feature for retaining existing framing choices, which results in the creation of a new single-layer source sequence. I can envision this being useful for a number of other workflow processes, such as preparing for external color grading or texturing passes.
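
To make the keyframe idea concrete, here is a minimal sketch in Python of what a reframing pass boils down to. Everything in it is hypothetical (Premiere's actual analysis is far more sophisticated), but it shows why the output stays user-editable: the result is just a list of position keyframes.

# Hypothetical sketch of an auto-reframe pass: given subject centers
# (e.g., from an AI detection step), emit horizontal position keyframes
# that keep the subject inside a taller, narrower target window.
def reframe_keyframes(subject_centers, src_w=3840, src_h=2160, target_aspect=9/16):
    """subject_centers: list of (time_in_seconds, subject_x_in_pixels)."""
    crop_w = src_h * target_aspect             # width of the 9:16 window
    half = crop_w / 2
    keyframes = []
    for t, x in subject_centers:
        cx = min(max(x, half), src_w - half)   # never push the window off frame
        keyframes.append((t, cx - half))       # keyframe the window's left edge
    return keyframes

# Subject drifting right across a UHD frame:
print(reframe_keyframes([(0.0, 1000), (1.0, 2000), (2.0, 3500)]))

Because the output is keyframes rather than baked-in pixels, an editor can nudge or retime any of them afterward, which matches the behavior described above.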

They also added better support for multi-channel audio workflows and effects, improved playback performance for many popular video formats, better HDR export options and a variety of changes to make the motion graphics tools more flexible and efficient for users who use them extensively. They also increased the range of values available for clip playback speed and volume, and added support for new camera formats and derivations.

The brains behind After Effects have focused on playback and performance for this release and made some significant gains in that regard. The other big feature that may actually make a difference is content-aware fill for video. This was sneak-previewed at MAX last year and first implemented in the NAB 2019 release of After Effects, but it has been greatly refined in this version and is now twice as fast.

They also greatly improved support for OpenEXR frame sequences, especially those with multiple render-pass channels. The channels can be labeled, and After Effects creates a video contact sheet for viewing all the layers in thumbnail form. EXR playback performance is supposed to be greatly improved as well.
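
For anyone who hasn't poked inside a multi-pass EXR, those channel labels live right in the file header. Here is a quick way to inspect them, assuming the OpenEXR Python bindings are installed; the file name is made up:

# List the labeled render-pass channels in an OpenEXR file.
import OpenEXR

exr = OpenEXR.InputFile("beauty_v001.exr")   # hypothetical render output
for name in sorted(exr.header()["channels"]):
    print(name)   # e.g., diffuse.R, diffuse.G, specular.B, depth.Z ...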

Character Animator is now at 3.0, and they have added keyframing of all editable values, trigger-able reposition “cameras” and trigger-able audio effects, among other new features. And Adobe Rush now supports publishing directly to TikTok.

Content Authenticity Initiative
Outside of individual applications, Adobe has launched the Content Authenticity Initiative in partnership with the NY Times and Twitter. It aims to fight fake news and restore consumer confidence in media. Its three main goals are trust, attribution and authenticity. It aims to show end users who created an image, whether it was edited or altered and, if so, in what ways. Seemingly at odds with that, they also released a new mobile app that edits images upon capture, using AI-powered "lenses" for highly stylized looks, even providing a live view.

This opening keynote was followed by a selection of over 200 different labs and sessions available over the next three days. I attended a couple sessions focused on After Effects, as that is a program I know I don’t use to its full capacity. (Does anyone, really?)

Partners
A variety of other partner companies were showing off their products in the community pavilion. HP was pushing 3D printing and digital manufacturing tools that integrate with Photoshop and Illustrator. Dell has a new 27-inch color-accurate monitor with a built-in colorimeter, presumably to compete with HP's top-end DreamColor displays. Asus also has some new HDR monitors that are Dolby Vision-compatible. One is designed to be portable and is as thin and lightweight as a laptop screen. I have always wondered why that wasn't a standard approach for desktop displays.

Keynotes
Tuesday opened with a keynote presentation from a number of artists of different types, speaking or being interviewed. Jason Levine's talk with M. Night Shyamalan was my favorite part, even though thrillers aren't really my cup of tea. Later, I was able to sit down and talk with Patrick Palmer, Adobe's Premiere Pro product manager, about where Premiere is headed and the challenges of developing HDR creation tools when there is no unified set of standards for final delivery. But I am looking forward to being able to view my work in HDR while I am editing at some point in the future.

One of the highlights of MAX is the 90-minute Sneaks session on Tuesday night, where comedian John Mulaney “helped” a number of Adobe researchers demonstrate new media technologies they are working on. These will eventually improve audio quality, automate animation, analyze photographic authenticity and many other tasks once they are refined into final products at some point in the future.

This was only my second time attending MAX, and with Premiere Rush being released last year, video production was a key part of that show. This year, without that factor, it was much more apparent to me that I was an engineer attending an event catering to designers. Not that this is bad, but I mention it here because it is good to have a better idea of what you are stepping into when you are making decisions about whether to invest in attending a particular event.

Adobe focuses MAX on artists and creatives as opposed to engineers and developers, who have other events that are more focused on their interests and needs. I suppose that is understandable since it is not branded Creative Cloud for nothing. But it is always good to connect with the people who develop the tools I use, and the others who use them with me, which is a big part of what Adobe MAX is all about.


Mike McCarthy is an online editor/workflow consultant with over 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

James Norris joins Nomad in London as editor, partner

Nomad in London has added James Norris as editor and partner. A self-taught, natural editor, Norris started out as a runner for the likes of Working Title, Partizan and Tomboy Films. He then moved to Whitehouse Post as an assistant, where he refined his craft and rose through the ranks to become an editor.

Over the past 15 years, he’s worked across commercials, music videos, features and television. Norris edited Ikea’s Fly Robot Fly spot and Asda’s Get Possessed piece, and has recently cut a new project for Nike. Working within television and film, he also cut an episode of the BAFTA-nominated drama Our World War and feature film We Are Monster.

“I was attracted to Nomad for their vision for the future and their dedication to the craft of editing,” says Norris. “They have a wonderful history but are also so forward-thinking and want to create new, exciting things. The New York and LA offices have seen incredible success over the last few years, and now there’s Tokyo and London too. On top of this, Nomad feels like home already. They’re really lovely people — it really does feel like a family.”

Norris will be cutting on Avid Media Composer at Nomad.


Review: Lenovo Yoga A940 all-in-one workstation

By Brady Betzel

While more and more creators are looking for alternatives to the iMac, iMac Pro and Mac Pro, there are few options with high-quality built-in monitors: the Microsoft Surface Studio, HP Envy and Dell 7000 among them. There are even fewer choices if you want touch and pen capabilities. It's with that need in mind that I decided to review the Lenovo Yoga A940, a 27-inch UHD, pen- and touch-capable Intel Core i7 computer with an AMD Radeon RX 560 GPU.

While I haven’t done a lot of all-in-one system reviews like the Yoga A940, I have had my eyes on the Microsoft Surface Studio 2 for a long time. The only problem is the hefty price tag of around $3,500. The Lenovo’s most appealing feature — in addition to the tech specs I will go over — is its price point: It’s available from $2,200 and up. (I saw Best Buy selling a similar system to the one I reviewed for around $2,299. The insides of the Yoga and the Surface Studio 2 aren’t that far off from each other either, at least not enough to make up for the $1,300 disparity.)

Here are the parts inside the Lenovo Yoga A940: Intel Core i7-8700 3.2GHz processor (up to 4.6GHz with Turbo Boost), six cores (12 threads) and 12MB cache; 27-inch 4K UHD IPS multitouch 100% Adobe RGB display; 16GB DDR4 2666MHz (SODIMM) memory; 1TB 5400 RPM drive plus 256GB PCIe SSD; AMD Radeon RX 560 4GB graphics processor; 25-degree monitor tilt angle; Dolby Atmos speakers; dimensions: 25 inches by 18.3 inches by 9.6 inches; weight: 32.2 pounds; 802.11AC and Bluetooth 4.2 connectivity; side panel inputs: Intel Thunderbolt, USB 3.1, 3-in-1 card reader and audio jack; rear panel inputs: AC-in, RJ45, HDMI and four USB 3.0; Bluetooth active pen (appears to be the Lenovo Active Pen 2); and Qi wireless charging.

Digging In
Right off the bat, I happened to put my Android Galaxy phone on the odd little flat platform located on the right side of the all-in-one workstation, just under the monitor, and saw my phone begin to charge wirelessly. Qi wireless charging is an amazing little addition to the Yoga; it really comes through in a pinch when I need my phone charged and don't have a cable or charging dock around.

Other than that nifty feature, why would you choose a Lenovo Yoga A940 over any other all-in-one system? Well, as mentioned, the price point is very attractive, but you are also getting a near-professional-level system in a very tiny footprint — including Thunderbolt 3 and USB connections, an HDMI port, a network port and an SD card reader. While it would be incredible to have an Intel i9 processor inside the Yoga, the i7 clocks in at 3.2GHz with six cores. Not a beast, but enough to get the job done inside Adobe Premiere and Blackmagic's DaVinci Resolve, though maybe with transcoded files instead of Red raw or the like.

The Lenovo Yoga A940 is outfitted with a front-facing Dolby Atmos audio speaker as well as Dolby Vision technology in the IPS display. The audio could use a little more low end, but it is good. The monitor is surprisingly great — the whites are white and the blacks are black; something not everyone can get right. It has 100% Adobe RGB color coverage and is Pantone-validated. The HDR is technically Dolby Vision and looks great at about 350 nits (not the brightest, but it won’t burn your eyes out either). The Lenovo BT active pen works well. I use Wacom tablets and laptop tablets daily, so this pen had a lot to live up to. While I still prefer the Wacom pen, the Lenovo pen, with 4,096 levels of sensitivity, will do just fine. I actually found myself using the touchscreen with my fingers way more than the pen.

One feature that sets the A940 apart from the other all-in-one machines is the USB Content Creation dial. With the little time I had with the system, I only used it to adjust speaker volume when playing Spotify, but in time I can see myself customizing the dials to work in Premiere and Resolve. The dial has good action and resistance. To customize the dial, you can jump into the Lenovo Dial Customization Assistant.

Besides the Intel i7, there is an AMD Radeon RX 560 with 4GB of memory, two 3W and two 5W speakers, 32 GB of DDR4 2666 MHz memory, a 1 TB 5400 RPM hard drive for storage, and a 256GB PCIe SSD. I wish the 1TB drive was also an SSD, but obviously Lenovo has to keep that price point somehow.

Real-World Testing
I use Premiere Pro, After Effects and Resolve all the time and can gauge the horsepower of a machine through these apps. Whether editing and/or color correcting, the Lenovo A940 is a good middle ground — it won't be running much more than 4K Red raw footage in real time without cutting the debayering quality down to half, if not one-eighth. This system would make a good "offline" edit system, where you transcode your high-res media to a mezzanine codec like DNxHR or ProRes for the edit and then conform back to the highest-resolution footage you have. Or, if you are in Resolve, maybe you could use optimized media for 80% of the workflow until you color. You will really want a system with a higher-end GPU if you want to fluidly cut and color in Premiere and Resolve. That being said, you can make it work with some debayer tweaking and/or transcoding.
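
As a concrete illustration of that offline approach, here is a minimal sketch of a proxy pass, assuming ffmpeg is installed; the folder names are placeholders. Each camera original gets a lighter DNxHR HQ copy for the edit, and the originals come back for the final conform.

# Transcode camera originals to DNxHR HQ proxies for an offline edit.
import subprocess
from pathlib import Path

SOURCE = Path("camera_originals")   # placeholder folder of high-res media
PROXIES = Path("proxies")
PROXIES.mkdir(exist_ok=True)

for clip in SOURCE.glob("*.mov"):
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-c:v", "dnxhd", "-profile:v", "dnxhr_hq",  # DNxHR HQ mezzanine codec
        "-vf", "scale=1920:1080",                   # downscale for a lighter edit
        "-pix_fmt", "yuv422p",
        "-c:a", "pcm_s16le",
        str(PROXIES / (clip.stem + ".mxf")),
    ], check=True)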

In my testing I downloaded some footage from Red’s sample library, which you can find here. I also used some BRAW clips to test inside of Resolve, which can be downloaded here. I grabbed 4K, 6K, and 8K Red raw R3D files and the UHD-sized Blackmagic raw (BRAW) files to test with.

Adobe Premiere
Using the same Red clips as above, I created two one-minute-long UHD (3840×2160) sequences. I also clicked "Set to Frame Size" for all the clips. Sequence 1 contained these clips with a simple contrast, brightness and color-cast correction applied. Sequence 2 contained the same clips with the same color correction applied, plus a 110% resize, 100 sharpen and 20 Gaussian Blur. I then exported them to various codecs via Adobe Media Encoder, using OpenCL for processing. Here are my results:

QuickTime (.mov) H.264, No Audio, UHD, 23.98 Maximum Render Quality, 10 Mb/s:
Color Correction Only: 24:07
Color Correction w/ 110% Resize, 100 Sharpen, 20 Gaussian Blur: 26:11

DNxHR HQX 10-bit UHD
Color Correction Only: 25:42
Color Correction w/ 110% Resize, 100 Sharpen, 20 Gaussian Blur: 27:03

ProRes HQ
Color Correction Only: 24:48
Color Correction w/ 110% Resize, 100 Sharpen, 20 Gaussian Blur: 25:34

As you can see, the export times are pretty long. And let me tell you, once the sequence with the Gaussian Blur and resize kicked in, so did the fans. While it wasn't like a jet taking off, the sound of the fans definitely made my wife and me glance at the system. It was also throwing some heat out the back. Because of the way Premiere works, it relies heavily on the CPU over the GPU. Not that it doesn't embrace the GPU, but, as you will see later, Resolve takes more advantage of it. Either way, Premiere really taxed the Lenovo A940 when using 4K, 6K and 8K Red raw files. Playback in real time wasn't possible except for the 4K files. I probably wouldn't recommend this system for someone working with lots of higher-than-4K raw files; it seems to be simply too much for it to handle. But if you transcode the files down to ProRes, you will be in business.

Blackmagic DaVinci Resolve 16 Studio
Resolve seemed to take better advantage of the AMD Radeon RX 560 GPU in combination with the CPU, as well as the onboard Intel GPU. In this test I added Resolve's amazing built-in spatial noise reduction, so apart from sharing the Red R3D footage, this test and the Premiere test weren't exactly comparing apples to apples. Overall, the export times should, in theory, be significantly higher. I also added some BRAW footage to test for fun, and that footage was way easier to work and color with. Both sequences were UHD (3840×2160) at 23.98. I will definitely be looking into working with more BRAW footage. Here are my results:

Playback: 4K realtime playback at half-res premium debayer; 6K no realtime playback; 8K no realtime playback

H.264 no audio, UHD, 23.98fps, force sizing and debayering to highest quality
Export 1 (Native Renderer)
Export 2 (AMD Renderer)
Export 3 (Intel QuickSync)

Color Only
Export 1: 3:46
Export 2: 4:35
Export 3: 4:01

Color, 110% Resize, Spatial NR: Enhanced, Medium, 25; Sharpening, Gaussian Blur
Export 1: 36:51
Export 2: 37:21
Export 3: 37:13

BRAW 4K (4608×2592) Playback and Export Tests

Playback: Full-res would play at about 22fps; half-res plays at realtime

H.264 No Audio, UHD, 23.98 fps, Force Sizing and Debayering to highest quality
Color Only
Export 1: 1:26
Export 2: 1:31
Export 3: 1:29
Color, 110% Resize, Spatial NR: Enhanced, Medium, 25; Sharpening, Gaussian Blur
Export 1: 36:30
Export 2: 36:24
Export 3: 36:22

DNxHR 10-bit:
Color Correction Only: 3:42
Color, 110% Resize, Spatial NR: Enhanced, Medium, 25; Sharpening, Gaussian Blur: 39:03

One takeaway from the Resolve exports is that the color-only exports were much more efficient than in Premiere, taking just three to four times realtime for the intensive Red R3D files and roughly one and a half times realtime for BRAW.
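
Those ratios are easy to verify from the numbers above, since both test timelines ran exactly one minute. A quick check (the labels are mine):

# Export-time-to-realtime ratios for a 60-second timeline.
def realtime_ratio(export_time, timeline_seconds=60):
    minutes, seconds = export_time.split(":")
    return (int(minutes) * 60 + int(seconds)) / timeline_seconds

print(realtime_ratio("3:46"))    # Resolve, Red R3D color-only: ~3.8x realtime
print(realtime_ratio("1:26"))    # Resolve, BRAW color-only: ~1.4x realtime
print(realtime_ratio("24:07"))   # Premiere, Red R3D color-only: ~24x realtime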

Summing Up
In the end, the Lenovo A940 is a sleek-looking all-in-one touchscreen- and pen-compatible system. While it isn't jam-packed with the latest high-end AMD GPUs or Intel i9 processors, the A940 is a mid-level system with an incredibly good-looking IPS Dolby Vision monitor and Dolby Atmos speakers. It has some other features — like the IR camera, Qi wireless charger and USB dial — that you might not necessarily be looking for but will love to find.

The power adapter is like a large laptop power brick, so you will need somewhere to stash that, but overall the monitor has a really nice 25-degree tilt that is comfortable when using just the touchscreen or pen, or when using the wireless keyboard and mouse.

Because the Lenovo A940 starts at around $2,299, I think it really deserves a look when you're searching for a new system. If you are working primarily in HD video and/or graphics, this is the all-in-one system for you. Check out more at Lenovo's website.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producer’s Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

The editors of Ad Astra: John Axelrad and Lee Haugen

By Amy Leland

The new Brad Pitt film Ad Astra follows astronaut Roy McBride (Pitt) as he journeys deep into space in search of his father, astronaut Clifford McBride (Tommy Lee Jones). The elder McBride disappeared years before, and his experiments in space might now be endangering all life on Earth. Much of the film features Pitt’s character alone in space with his thoughts, creating a happy challenge for the film’s editing team, who have a long history of collaboration with each other and the film’s director James Gray.

L-R: Lee Haugen and John Axelrad

Co-editors John Axelrad, ACE, and Lee Haugen share credits on three previous films — Haugen served as Axelrad’s apprentice editor on Two Lovers, and the two co-edited The Lost City of Z and Papillon. Ad Astra’s director, James Gray, was also at the helm of Two Lovers and The Lost City of Z. A lot can be said for long-time collaborations.

When I had the opportunity to speak with Axelrad and Haugen, I was eager to find out more about how this shared history influenced their editing process and the creation of this fascinating story.

What led you both to film editing?
John Axelrad: I went to film school at USC and graduated in 1990. Like everyone else, I wanted to be a director. Everyone who goes to film school wants that. I then focused on studying cinematography, but several years into film school I realized that I don't like being on the set.

Not long ago, I spoke to Fred Raskin about editing Once Upon a Time… in Hollywood. He originally thought he was going to be a director, but then he figured out he could tell stories in an air-conditioned room.
Axelrad: That’s exactly it. Air conditioning plays a big role in my life; I can tell you that much. I get a lot of enjoyment out of putting a movie together and of being in my own head creatively and really working with the elements that make the magic. In some ways, there are a lot of parallels with the writer when you’re an editor; the difference is I’m not dealing with a blank page and words — I’m dealing with images, sound and music, and how it all comes together. A lot of people say the first draft is the script, the second draft is the shoot, and the third draft is the edit.

L-R: John and Lee at the Papillon premiere.

I started off as an assistant editor, working for some top editors for about 10 years in the ’90s, including Anne V. Coates. I was an assistant on Out of Sight when Anne Coates was nominated for the Oscar. Those 10 years of experience really prepped me for dealing with what it’s like to be the lead editor in charge of a department — dealing with the politics, the personalities and the creative content and learning how to solve problems. I started cutting on my own in the late ‘90s, and in the early 2000s, I started editing feature films.

When did you meet your frequent collaborator James Gray?
Axelrad: I had done a few horror features, and then I hooked up with James on We Own the Night, and that went very well. Then we did Two Lovers after that. That's where Lee Haugen came in — and I'll let him tell his side of the story — but suffice it to say that I've done five films for James Gray, and Lee Haugen rose up through the ranks and became my co-editor on The Lost City of Z. Then we edited the movie Papillon together, so it was just natural that we would do Ad Astra together as a team.

What about you, Lee? How did you wind your way to where we are now?
Lee Haugen: Growing up in Wisconsin, any time I had a school project, like writing a story or an article, I would turn it into a short video or short film instead. Back then I had to shoot on VHS and edit tape to tape by pushing play, hitting record and timing it. It took forever, but that was when I really discovered that I loved editing.

So I went to school with a focus on becoming an editor. After graduating from Wisconsin, I moved to California and found my way into reality television. That was the mid-2000s, the boom of reality television; there were a lot of jobs that offered me the chance to get the hours needed to become a member of the Editors Guild, as well as more experience on Avid Media Composer.

After about a year of that, I realized working the night shift as an assistant editor on reality television shows was not my real passion. I really wanted to move toward features. I was listening to a podcast by Patrick Don Vito (editor of Green Book, among other things), and he mentioned John Axelrad. I met John on an interview for We Own the Night when I first moved out here, but I didn’t get the job. But a year or two later, I called him, and he said, “You know what? We’re starting another James Gray movie next week. Why don’t you come in for an interview?” I started working with John the day I came in. I could not have been more fortunate to find this group of people that gave me my first experience in feature films.

Then I had the opportunity to work on a lower-budget feature called Dope, and that was my first feature editing job by myself. The success of the film at Sundance really helped launch my career. Then things came back around. John was finishing up Krampus, and he needed somebody to go out to Northern Ireland to edit the assembly of The Lost City of Z with James Gray. So, it worked out perfectly, and from there, we’ve been collaborating.

Axelrad: Ad Astra is my third time co-editing with Lee, and I find our working as a team to be a naturally fluid and creative process. It’s a collaboration entailing many months of sharing perspectives, ideas and insights on how best to approach the material, and one that ultimately benefits the final edit. Lee wouldn’t be where he is if he weren’t a talent in his own right. He proved himself, and here we are together.

How has your collaborative process changed and grown from when you were first working together (John, Lee and James) to now, on Ad Astra?
Axelrad: This is my fifth film with James. He's a marvelous filmmaker, and one of the reasons he's so good is that he really understands the subtlety and power of editing. He's very neoclassical in his approach, and he challenges the viewer, since we're all accustomed to faster cutting and faster pacing. With James, it's so much more of a methodical approach. James is very performance-driven. It's all about the character, it's all about the narrative and the story, and we really understand his instincts. Additionally, you need to develop a shorthand and truly understand what the director wants.

Working with Lee, it was just a natural process to have the two of us cutting. I would work on a scene, and then I could say, “Hey Lee, why don’t you take a stab at it?” Or vice versa. When James was in the editing room working with us, he would often work intensely with one of us and then switch rooms and work with the other. I think we each really touched almost everything in the film.

Haugen: I agree with John. Our way of working is very collaborative — that includes John and me, but also our assistant editors and additional editors. It's a process that we feel benefits the film as a whole; when we have different perspectives, it can help us explore different options that can raise the film to another level. And when James comes in, he's extremely meticulous. As John said, he and I both touched every single scene, and I think we've even touched every frame of the film.

Axelrad: To add to what Lee said, about involving our whole editing team, I love mentoring, and I love having my crew feel very involved. Not just technical stuff, but creatively. We worked with a terrific guy, Scott Morris, who is our first assistant editor. Ultimately, he got bumped up during the course of the film and got an additional editor credit on Ad Astra.

We involve everyone, even down to the post assistant. We want to hear their ideas and make them feel like a welcome part of a collaborative environment. They obviously have to focus on their primary tasks, but I think it just makes for a much happier editing room when everyone feels part of a team.

How did you manage an edit that was so collaborative? Did you have screenings of dailies or screenings of cuts?
Axelrad: During dailies it was just James, and we would send edits for him to look at. But James doesn’t really start until he’s in the room. He really wants to explore every frame of film and try all the infinite combinations, especially when you’re dealing with drama and dealing with nuance and subtlety and subtext. Those are the scenes that take the longest. When I put together the lunar rover chase, it was almost easier in some ways than some of the intense drama scenes in the film.

Haugen: As the dailies came in, John and I would each take a scene and do a first cut. And then, once we had something to present, we would call everybody in to watch the scene. We would get everybody’s feedback and see what was working, what wasn’t working. If there were any problems that we could address before moving to the next scene, we would. We liked to get the outside point of view, because once you get further and deeper into the process of editing a film, you do start to lose perspective. To be able to bring somebody else in to watch a scene and to give you feedback is extremely helpful.

One thing that John established with me on Two Lovers — my first editing job on a feature — was allowing me to come and sit in the room during the editing. After my work was done, I was welcome to sit in the back of the room and just observe the interaction between John and James. We continued that process with this film, just to give those people experience and to learn and to observe how an edit room works. That helped me become an editor.

John, you talked about how the action scenes are often easier to cut than the dramatic scenes. It seems like that would be even more true with Ad Astra, because so much of this film is about isolation. How does that complicate the process of structuring a scene when it’s so much about a person alone with his own thoughts?
Axelrad: That was the biggest challenge, but one we were prepared for. To James’ credit, he’s not precious about his written words; he’s not precious about the script. Some directors might say, “Oh no, we need to mold it to fit the script,” but he allows the actors to work within a space. The script is a guide for them, and they bring so much to it that it changes the story. That’s why I always say that we serve the ego of the movie. The movie, in a way, informs us what it wants to be, and what it needs to be. And in the case of this, Brad gave us such amazing nuanced performances. I believe you can sometimes shape the best performance around what is not said through the more nuanced cues of facial expressions and gestures.

So, as an editor, when you can craft something that transcends what is written and what is photographed and achieve a compelling synergy of sound, music and performance — to create heightened emotions in a film — that’s what we’re aiming for. In the case of his isolation, we discovered early on that having voiceover and really getting more interior was important. That wasn’t initially part of the cut, but James had written voiceover, and we began to incorporate that, and it really helped make this film into more of an existential journey.

The further he goes out into space, the deeper we go into his soul, and it’s really a dive into the subconscious. That sequence where he dives underwater in the cooling liquid of the rocket, he emerges and climbs up the rocket, and it’s almost like a dream. Like how in our dreams we have superhuman strength as a way to conquer our demons and our fears. The intent really was to make the film very hypnotic. Some people get it and appreciate it.

As an editor, sound often determines the rhythm of the edit, but one of the things that was fascinating with this film is how deafeningly quiet space likely is. How do you work with the material when it’s mostly silent?
Haugen: Early on, James established that he wanted to make the film as realistic as possible. Sound, or lack of sound, is a huge part of space travel. So the hard part is when you have, for example, the lunar rover chase on the moon, and you play it completely silent; it’s disarming and different and eerie, which was very interesting at first.

But then we started to explore how we could make the sound more realistic, or find a way to amplify the action beats through sound. One way was through Roy's own perception: when things were hitting him or vibrating off of his suit, he could feel the impacts and hear the vibrations of what was going on.

Axelrad: It was very much part of our rhythm, of how we cut it together, because we knew James wanted to be as realistic as possible. We did what we could with the soundscapes that were allowable for a big studio film like this. And, as Lee mentioned, playing it from Roy’s perspective — being in the space suit with him. It was really just to get into his head and hear things how he would hear things.

Thanks to Max Richter’s beautiful score, we were able to hone the rhythms to induce a transcendental state. We had Gary Rydstrom and Tom Johnson mix the movie for us at Skywalker, and they were the ultimate creators of the balance of the rhythms of the sounds.

Did you work with music in the cut?
Axelrad: James loves to temp with classical music. In previous films, we used a lot of Puccini. In this film, there was a lot of Wagner. But Max Richter came in fairly early in the process and developed such beautiful themes, and we began to incorporate his themes. That really set the mood.

When you’re working with your composer and sound designer, you feed off each other. So things that they would do would inspire us, and we would change the edits. I always tell the composers when I work with them, “Hey, if you come up with something, and you think musically it’s very powerful, let me know, and I am more than willing to pitch changing the edit to accommodate.” Max’s music editor, Katrina Schiller, worked in-house with us and was hugely helpful, since Max worked out of London.

We tend not to cut with music at first, because you don't want music acting as a Band-Aid to cover up a problem in the edit. But once we feel the picture is working and the rhythm is going, sometimes the music will just fit perfectly, even as temp music. And if the rhythms match up to what we're doing, then we know that we've done it right.

What is next for the two of you?
Axelrad: I’m working on a lower-budget movie right now, a Lionsgate feature film. The title is under wraps, but it stars Janelle Monáe, and it’s kind of a socio-political thriller.

What about you Lee?
Haugen: I jumped onto another film as well. It’s an independent film starring Zoe Saldana. It’s called Keyhole Garden, and it’s this very intimate drama that takes place on the border between Mexico and America. So it’s a very timely story to tell.


Amy Leland is a film director and editor. Her short film, Echoes, is now available on Amazon Video. She also has a feature documentary in post, a feature screenplay in development and a new doc in pre-production. She is an editor for CBS Sports Network and recently edited the feature Sundown. You can follow Amy on Twitter at @amy-leland and on Instagram at @la_directora.

Updated Apple Final Cut Pro features new Metal engine

Apple has updated Final Cut Pro X with a new Metal engine designed to provide performance gains across a wide range of Mac systems. It takes advantage of the new Mac Pro and the high-resolution, high-dynamic-range viewing experience of the Apple Pro Display XDR. The company has optimized Motion and Compressor with Metal as well.

The Metal-based engine improves playback and accelerates graphics tasks in FCP X, including rendering, realtime effects and exporting on compatible Mac computers. According to Apple, video editors with a 15-inch MacBook Pro will benefit from performance that’s up to 20 percent faster, while editors using an iMac Pro will see gains up to 35 percent.

Final Cut Pro also works with the new Sidecar feature of macOS Catalina, which allows users to extend their Mac workspace by using an iPad as a second display to show the browser or viewer. Video editors can use Sidecar with a cable or they can connect wirelessly.

Final Cut Pro will now support multiple GPUs and up to 28 CPU cores. This means that rendering is up to 2.9 times faster and transcoding is up to 3.2 times faster than on the previous-generation 12-core Mac Pro. And Final Cut Pro uses the new Afterburner card when working with ProRes and ProRes Raw. This allows editors to simultaneously play up to 16 streams of 4K ProRes 422 video or work in 8K resolution with support for up to three streams of 8K ProRes Raw video.
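
To put that 16-stream figure in perspective, here is some rough bandwidth math. The per-stream bitrate is my assumption; roughly 500 Mb/s is a common ballpark for UHD ProRes 422, though the real number varies with frame rate and content.

# Rough aggregate bandwidth for 16 simultaneous UHD ProRes 422 streams.
STREAMS = 16
MBPS_PER_STREAM = 500                  # assumed megabits per second per stream
total_mbps = STREAMS * MBPS_PER_STREAM
print(total_mbps / 8 / 1000, "GB/s")   # ~1 GB/s of sustained decode and I/O

That kind of sustained throughput is why dedicated decode hardware and fast storage matter for heavy multistream ProRes work.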

Pro Display XDR
The Pro Display XDR features a 32-inch Retina 6K display, P3 wide color and extreme dynamic range. Final Cut Pro users can view, edit, grade and deliver HDR video with 1,000 nits of full screen sustained brightness, 1,600 nits peak brightness and a 1,000,000:1 contrast ratio. Pro Display XDR connects to the Mac through a single Thunderbolt cable, and pros using Final Cut Pro on Mac Pro can simultaneously use up to three Pro Display XDR units — two for the Final Cut Pro interface and one as a dedicated professional reference monitor.

Final Cut Pro 10.4.7 is available now as a free update for existing users and for $299.99 for new users on the Mac App Store. Motion 5.4.4 and Compressor 4.4.5 are also available today as free updates for existing users and for $49.99 each for new users on the Mac App Store.

Uppercut ups Tyler Horton to editor

After spending two years as an assistant at New York-based editorial house Uppercut, Tyler Horton has been promoted to editor. This is the first internal talent promotion for Uppercut.

Horton first joined Uppercut in 2017 after a stint as an assistant editor at Whitehouse Post. Stepping up as editor, he has cut notable projects, such as the recent Nike campaign "Letters to Heroes," a series launched in conjunction with the US Open that highlights young athletes meeting their role models, including Serena Williams and Naomi Osaka. He has also cut campaigns for brands such as Asics, Hypebeast, Volvo and MoMA.

“From the beginning, Uppercut was always intentionally a boutique studio that embraced a collective of visions and styles — never just a one-person shop,” says Uppercut EP Julia Williams. “Tyler took initiative from day one to be as hands-on as possible with every project, and we’ve been proud to see him really grow and refine his own voice.”

Horton’s love of film was sparked by watching sports reels and highlight videos. He went on to study film editing, then hit the road to tour with his band for four years before returning to his passion for film.

Wildlife DP Steve Lumpkin on the road and looking for speed

For more than a decade, Steve Lumpkin has been traveling to the Republic of Botswana to capture and celebrate the country’s diverse and protected wildlife population. As a cinematographer and still photographer, Under Prairies Skies Photography‘s Lumpkin will spend a total of 65 days this year filming in the bush for his current project, Endless Treasures of Botswana.

Steve Lumpkin

It’s a labor of love that comes through in his stunning photographs, whether they depict a proud and healthy lioness washed with early-morning sunlight, an indolent leopard draped over a tree branch or a herd of elephants traversing a brilliant green meadow. The big cats hold a special place in Lumpkin’s heart, and documenting Botswana’s largest pride of lions is central to the project’s mission.

“Our team stands witness to the greatest conservation of the natural world on the planet. Botswana has the will and the courage to protect all things wild,” he explains. “I wanted to fund a not-for-profit effort to create both still images and films that would showcase The Republic of Botswana’s success in protecting these vulnerable species. In return, the government granted me a two-year filming permit to bring back emotional, true tales from the bush.”

Lumpkin recently graduated to shooting 4K video in the bush in Apple ProRes Raw, using a Sony FS5 camera and an Atomos Inferno recorder. He brings the raw footage back to his US studio for post, working in Apple Final Cut Pro on an iMac 5K and employing a variety of tools, including Color Grading Central and Neat Video.

Leopard

Until recently, Lumpkin was hitting a performance snag when transferring files from his QNAP TBS-882T NAS storage system to his iMac Pro. "I was only getting read speeds of about 100MB/sec over Thunderbolt, so editing 4K footage was painful," he says. "At the time, I was transitioning to ProRes RAW, and I knew I needed a big performance kick."

On the recommendation of Bob Zelin, video engineering consultant and owner of Rescue 1, Lumpkin installed Sonnet's Solo10G Thunderbolt 3 adapter. The Solo10G uses the 10GbE standard to connect computers via Ethernet cables to high-speed infrastructure and storage systems. "Instantly, I jumped to a transfer rate of more than 880MB per second, a nearly tenfold throughput increase," he says. "The system just screams now — the Solo10G has accelerated every piece of my workflow, from ingest to 4K editing to rendering and output."

“So many colleagues I know are struggling with this exact problem,” he adds. “They need to work with huge files and they’ve got these big storage arrays, but their Thunderbolt 2 or 3 connections alone just aren’t cutting it.”
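
The round numbers behind that claim are worth a quick check; the before-and-after figures come from the quotes above, and the wire-speed math is mine.

# 10GbE line rate vs. the reported real-world transfer speeds.
link_mb_per_s = 10_000 / 8     # 10 Gb/s is 1250 MB/s theoretical maximum
before, after = 100, 880       # MB/s, as reported before and after the upgrade
print(after / before)          # 8.8, the "nearly tenfold" jump
print(after / link_mb_per_s)   # ~0.7, about 70% of the theoretical line rate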

With Lumpkin, everything comes down to the wildlife. He appreciates any tools that help streamline his ability to tell the story of the country and its tremendous success in protecting threatened species. “The work we’re doing on behalf of Botswana is really what it’s all about — in 10 or 15 years, that country might be the only place on the planet where some of these animals still exist.

“Botswana has the largest herd of elephants in Africa and the largest group of wild dogs, of which there are only about 6,000 left,” says Lumpkin. “Products like Sonnet’s Solo10G, Final Cut, the Sony FS5 camera and Atomos Inferno, among others, help our team celebrate Botswana’s recognition as the conservation leader of Africa.”