Tag Archives: editing

Editor Brandy Troxler joins SF’s 1606 Studio

San Francisco editorial boutique 1606 Studio has hired editor Brandy Troxler. Having worked in the Bay Area for more than a decade, Troxler has edited spots for Mini USA, Yelp, Toyota, Texas.gov and others. Most recently, she was an in-house editor at San Francisco agency Heat.

1606 Studio executive producer Jon Ettinger has known Troxler for years; she began her career at Beast Editorial when he was executive producer there. “She’s a great collaborator and a good team player,” he observes. “She fits the vibe established here by our partners, Doug Walker, Connor McDonald and Brian Lagerhausen, which is to work hard and form long-term partnerships with our clients.”

Troxler joined Heat in 2019 after six years at Beast Editorial. A graduate of Elon University in North Carolina, she also worked at Footpath Pictures in Raleigh-Durham, and Barbary Post in San Francisco.

Her first spot for 1606 Studio was a PSA produced for International Women’s Day by UN Women, a United Nations organization working for global equality. It’s from the agency Erich & Kallman and was directed by Caruso Company’s Doug Walker. The spot begins with what appears to be a news broadcast from the 1950s as a male newscaster recites a litany of workplace issues that negatively affect women. As he speaks, the scene around him grows more modern, and it becomes apparent that the issues he is describing still apply today.

“It’s simple, but powerful,” Troxler says. “As a woman of color, it was awesome to have the opportunity to tell that story. First and foremost, I am a storyteller and I like to tell diverse stories. While docu-style is a focus of mine, I quite enjoy cutting comedy as well.”

Apple and Avid offer free temp licenses during COVID-19 crisis

Apple is offering free 90-day trials of its Final Cut Pro X and Logic Pro X apps to help those working from home who are looking for something new to master, as well as students who are already using the tools in school but don’t have the apps on their home computers.

Apple Final Cut Pro X

Apple is extending what is normally a 30-day trial for Final Cut Pro X, while a free trial is new to Logic Pro X. The extension to 90 days is for a limited time and will revert to 30 days across both apps in the future.

Trials for both Final Cut Pro X and Logic Pro X are now available and can be downloaded from each app’s web page. The 90-day extension is also available to customers who have already downloaded the free 30-day trial of Final Cut Pro X.

For its part, Avid is offering free temp licenses for remote users of the company’s creative tools. Commercial customers can get a free 90-day license for each registered user of Media Composer | Ultimate, Pro Tools, Pro Tools | Ultimate and Sibelius | Ultimate. For students whose school campuses are closed, any student of an Avid-based learning institution that uses Media Composer, Pro Tools or Sibelius can receive a free 90-day license for the same products.

The offer is open through April 17.

Main Image: Courtesy of Avid

My Top Five Ergonomic Workstation Accessories

By Brady Betzel

Instead of writing up my normal “Top Five Workstation Accessories” column this year, I wanted to take a slightly different route and focus on products that might lessen pain and maybe even improve your creative workflow — whether you are working at a studio or, more likely these days, working from home.

As an editor, I sit in a chair for most of my day, and that is on top of my three- to four-hour round-trip commute to work. As aches and pains build up (I’m 36, and I’m sure it doesn’t just get better), I’ve had to start looking for solutions to alleviate the pain I can see coming in the future. In the past I have mentioned products like the Wacom Intuos Pro Pen tablet, which is great and helped me lessen wrist pain, and color correction panels such as the Loupedeck, which help creative workflows and keep you from relying solely on the mouse, further lessening wrist pain.

This year I wanted to look at how the actual setup of a workstation environment might prevent or alleviate pain. So get out of your seat and move around a little, take a walk around the block, and when you get back, maybe rethink how your workstation environment could become more conducive to a creativity-inspiring flow.

Autonomous SmartDesk 2 
One of the most useful things in my search for flexibility in the edit bay is the standup desk. Originally, I went to Ikea and found a clearance tabletop in the “dents” section and then found a kitchen island stand that was standing height. It has worked great for over 10 years; the only issue is that it isn’t easily adjustable, and sometimes I need to sit to really get my editing “flow” going.

Many companies offer standing desk solutions, including manual options like the classic VariDesk desk riser. If you have been in the offline editing game over the past five to 10 years, then you have definitely seen these come and go. But at almost $400, you might as well look for a robotic standing desk. This is where the Autonomous SmartDesk 2 comes into play. Depending on whether you want the Home Office version, which stands between 29.5 inches and 48 inches, or the Business Office version, which stands between 26 inches and 52 inches, you are looking to spend $379 or $479, respectively (with free shipping included).

The SmartDesk 2 desktop itself is made of MDF (medium-density fiberboard), which helps lower the overall cost but is still sturdy and will hold up to 300 pounds. From black to white oak, there are multiple color options, so the desk not only helps with aches and pains but can also be a conversation piece in the edit bay. I have the Business version in black along with a matching black chair, and I love that it looks clean and modern. The SmartDesk 2 is operated using a front-facing switch plate complete with up, down and four height-level presets. It operates smoothly and, to be honest, impressively. It gives a touch of class to any environment. Setup took about half an hour, and it came with easy-to-follow instructions, screws/washers and tools.

Keep an eye out for my full review of the Autonomous SmartDesk 2 and ErgoChair 2, but for now think about how a standup desk will at least alleviate some of the sitting you do all day while adding some class and conversation to the edit bay.

Autonomous ErgoChair 2 
Along with a standup desk — and, in my opinion, more important — is a good chair. Most offline editors and assistant editors work at a company that either values their posture and buys Herman Miller Aeron chairs, or cheaps out and buys the $49 special at Office Depot. I never quite understood the benefit of saving a few bucks on a chair, especially if a company pays for health insurance — because in the end, they will be paying for it. Not everyone likes or can afford the $1,395 Aeron chairs, but there are options that don’t involve ruining your posture.

Along with the Autonomous SmartDesk 2, you should consider buying the ErgoChair 2, which costs $349 — a similar price to other chairs, like the Secretlab Omega series gaming chair that retails for $359. But the ErgoChair 2 has the best of both worlds: an Aeron-style mesh back and neck support plus a super-comfortable seat cushion with all the adjustments you could want. Even though I have only had the Autonomous products for a few weeks now, I can already feel the difference when working at home. It seems like a small issue in the grand scheme of things, but being comfortable allows my creativity to flow. The chair took under 30 minutes to build and came with easy-to-follow instructions and good tools, just like the SmartDesk 2.

A Footrest
When I first started in the industry, as soon as I began a freelance job, I would look for an old Sony IMX tape packing box. (Yes, the green tapes. Yes, I worked with tape. And yes, I can operate an MSW-2000 tape deck.) Typically, the boxes would be full of tapes because companies bought hundreds and never used them, and they made great footrests! I would line up a couple boxes under my feet, and it made a huge difference for me. Having a footrest relieves lower back pressure that I find hard to relieve any other way.

As I continue my career into my senior years, I finally discovered that there are actual footstools! Not just old boxes. One of my favorites is on Amazon. It is technically an adjustable nursing footstool but works great for use under a desk. And if you have a baby on the way, it’s a two-for-one deal. Either way, check out the “My Brest Friend” on Amazon. It goes for about $25 with free one-day Amazon Prime shipping. Or if you are a woodworker, you might be able to make your own.

GoFit Muscle Hook 
After sitting in an edit bay for multiple hours, multiple days in a row, I really like to stretch and use a massager to un-stuff my back. One of the best massagers I have seen in multiple edit bays is called the GoFit Muscle Hook.

Luckily for us, it’s available at almost any Target or on the Target website for about $25. It’s an alien-looking device that can dig deep into your shoulder blades, neck and back. You can use it a few different ways — the large hook for middle-of-the-back issues, the smaller hook, which I like to use on the neck and upper back, and the neck massager on the bar (that one feels a little weird to me).

There are other massage devices similar to the Muscle Hook, but in my opinion the GoFit Muscle Hook is the best. The plastic composite seems indestructible and almost feels like it could double as a self-defense tool. But it can work out almost any knots you’ve built up after a long day. If you don’t buy anything else for self-care, buy the Muscle Hook. You will be glad you did. Anyone who gets one has that look of pain and relief when they use it for the first time.

Foam Roller
Another item that I just started using is a foam roller. You can find them almost anywhere, but I found one on Amazon for $13.95 plus free Amazon Prime one-day shipping. It’s also available on the manufacturer’s website for about $23. Simply put, it’s a high-density foam cylinder that you roll on top of. It sounds a little silly, but once you get one, you will really wonder how you lived without it. I purchased an 18-inch version, but they range from 12 inches to 36 inches. And if you have three young sons at home, they can double as fat lightsabers (but they hurt, so keep an eye out).

Summing Up
In the end, there are so many ways to try keeping a flexible editing lifestyle, from kettlebells to stand-up desks. I’ve found that just getting over the mental hurdle of not wanting to move is the biggest catalyst. There are so many great tech accessories for workstations, but we hardly mention ones that can keep our bodies moving and our creativity flowing. Hopefully, some of these ergonomic accessories for your workstation will spark an idea to move around and get your blood flowing.

For some workout inspiration, Onnit has some great free workouts featuring weird stuff like maces, steel clubs and sandbags, but also kettlebells. The site also has nutritional advice. For foam roller stretches, I would check out the same Onnit Academy site.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Editor Anthony Marinelli joins Northern Lights

Editor Anthony Marinelli has joined post studio Northern Lights. Marinelli’s experience spans commercial, brand content, film and social projects. He comes to Northern Lights from a four-year stint at TwoPointO, where he was also a partner. He has previously worked at Kind Editorial, Alkemy X, Red Car, Cut+Run and Crew Cuts.

Marinelli’s work includes projects for Mercedes, FedEx, BMW, Visa, Pepsi, Scotts, Mount Sinai and Verizon. He also edited the Webby Award-winning documentary “Alicia in Africa,” featuring Alicia Keys for Keep a Child Alive.

Marinelli is also active in independent theater and film. He has written and directed many plays and short films, including Acoustic Space, which won Best Short at the 2018 Ridgewood Guild Film Festival and Best Short Screenplay at the Richmond International Film Festival.

Marinelli’s most recent campaigns are for Mount Sinai and Bernie & Phyl’s for DeVito Verdi.

He works on Avid Media Composer and Adobe Premiere. You can watch his reel here.

Blackmagic releases Resolve 16.2, beefs up audio post tools

Blackmagic has updated DaVinci Resolve, its color, editing, VFX and audio post tool, to Version 16.2. This new version features major Fairlight updates for audio post as well as many improvements for color correction, editing and more.

This version brings major updates for editing in the Fairlight audio timeline with a mouse and keyboard. The new edit selection mode unlocks functionality previously available only via the audio editor panel on the full Fairlight console, so editing is much faster than before. In addition, the edit selection mode puts fades, cuts and even clip moves only a mouse click away. New scalable waveforms let users zoom in without adjusting the volume. Bouncing lets customers render a clip with custom sound effects directly from the Fairlight timeline.

Adding multiple clips is also easier, as users can now add them to the timeline vertically, not just horizontally, making it simpler to add multiple tracks of audio at once. Multichannel tracks can now be converted into linked groups directly in the timeline, so users no longer have to change clips manually and reimport. There’s added support for frame boundary editing, which improves file export compatibility for film and broadcast deliveries and adds precision, so users can trim to frame boundaries without having to zoom all the way in on the timeline. The new version supports modifier keys so that clips can be duplicated directly in the timeline using the keyboard and mouse. Users can also copy clips across multiple timelines with ease.
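As a rough illustration of what trimming to a frame boundary means in practice, here is a minimal Python sketch — this is not Resolve’s implementation, and the 48kHz sample rate and 25fps frame rate are just example values. An audio edit point expressed in samples gets snapped to the nearest video-frame boundary so the cut lines up with picture.

def snap_to_frame_boundary(sample_index, sample_rate=48000, frame_rate=25.0):
    """Return the sample index of the video-frame boundary nearest to sample_index."""
    samples_per_frame = sample_rate / frame_rate  # 1,920 samples per frame at 48 kHz / 25 fps
    nearest_frame = round(sample_index / samples_per_frame)
    return int(round(nearest_frame * samples_per_frame))

print(snap_to_frame_boundary(50500))  # an edit at sample 50500 snaps to frame 26, sample 49920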

Resolve 16.2 also includes support for the Blackmagic Fairlight Sound Library, with new support for metadata-based searches, so customers don’t need to know a file’s name to find a sound effect. Search results display both the file name and description, so finding the perfect sound effect is faster and easier than before.

MPEG-H 3D immersive surround sound audio bussing and monitoring workflows are now supported. Additionally, improved pan and balance behavior includes the ability to constrain panning.

Fairlight audio editing also gets index improvements. The edit index is now available in the Fairlight page and works as it does in the other pages, displaying a list of all media used; users simply click on a clip to navigate directly to its location in the timeline. The track index now supports drag selections for mute, solo, record enable and lock as well as visibility controls, so editors can quickly swipe through a stack of tracks without having to click on each one individually. Audio tracks can also be rearranged by clicking and dragging a single track or a group of tracks in the track index.

This new release also includes improvements in AAF import and export. AAF support has been refined so that AAF sequences can be imported directly to the timeline in use. Additionally, if the project features a different time scale, the AAF data can also be imported with an offset value to match. AAF files that contain multiple channels will also be recognized as linked groups automatically. The AAF export has been updated and now supports industry-standard broadcast wave files. Audio cross-fades and fade handles are now added to the AAF files exported from Fairlight and will be recognized in other applications.

For traditional Fairlight users, this update makes major improvements to importing legacy Fairlight projects, including faster opening of projects with over 1,000 media files.

Audio mixing is also improved. A new EQ curve preset for clip EQ in the inspector allows removal of troublesome frequencies. New FairlightFX filters include a meter plug-in that adds a floating meter for any track or bus, so users can keep an eye on levels even if the monitoring panel or mixer is closed. There’s also a new LFE filter designed to smoothly roll off the higher frequencies when mixing low-frequency effects in surround.
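As a rough idea of the kind of roll-off an LFE filter applies, here is a generic one-pole low-pass in Python — not the FairlightFX plugin itself, and the 120Hz cutoff and 48kHz sample rate are only example values.

import math

def one_pole_lowpass(samples, cutoff_hz=120.0, sample_rate=48000):
    """Smoothly attenuate content above cutoff_hz (a simple exponential smoother)."""
    alpha = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / sample_rate)
    out, y = [], 0.0
    for x in samples:
        y += alpha * (x - y)  # each output sample moves only part of the way toward the input
        out.append(y)
    return out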

Working with immersive sound workflows using the Fairlight audio editor has been updated and now includes dedicated controls for panning up and down. Additionally, clip EQ can now be altered in the inspector on the editor panel. Copy and paste functions have been updated, and now all attributes — including EQ, automation and clip gain — are copied. Sound engineers can set up their preferred workflow, including creating and applying their own presets for clip EQ. Plug-in parameters can also be customized or added so that users have fast access to their preferred tool set.

Clip levels can now be changed relatively, allowing users to adjust the overall gain while respecting existing adjustments. Clip levels can also be reset to unity, easily removing any level adjustments that might have previously been made. Fades can also be deleted directly from the Fairlight Editor, making it faster to do than before. Sound engineers can also now save their preferred track view so that they get the view they want without having to create it each time. More functions previously only available via the keyboard are now accessible using the panel, including layered editing. This also means that automation curves can now be selected via the keyboard or audio panel.

Continuing with the extensive improvements to Fairlight audio, there have also been major updates to the audio editor transport control. Track navigation is now improved and even works when nothing is selected. Users can navigate directly to the timecode entry window above the timeline from the audio editor panel, and there is added support for high-frame-rate timecodes. Timecode entry now supports values relative to the current CTI location, so the playhead can move along the timeline relative to its position rather than to a set timecode.

Support has also been added so the colon key can be used in place of typing 00. Master spill on console faders now lets users spill out all the tracks to a bus fader for quick adjustments in the mix. There’s also more precision with rotary controls on the panel and when using a mouse with a modifier key. Users can also change the layout and select either icon or text-only labels on the Fairlight editor. Legacy Fairlight users can now use the traditional — and perhaps more familiar — Fairlight layout. Moving around the timeline is even quicker with added support for “media left” and “media right” selection keys to jump the playhead forward and back.
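For readers unfamiliar with relative timecode entry, the sketch below shows the general idea in Python — an offset such as “+2:00” moves the playhead two seconds forward from its current position instead of jumping to an absolute timecode. This is not Fairlight’s parser, and the 25fps frame rate is an assumption for the example.

FPS = 25  # assumed frame rate for the example

def tc_to_frames(tc, fps=FPS):
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_tc(frames, fps=FPS):
    return "{:02d}:{:02d}:{:02d}:{:02d}".format(
        frames // (fps * 3600), (frames // (fps * 60)) % 60, (frames // fps) % 60, frames % fps)

def move_relative(current_tc, offset_tc, fps=FPS):
    sign = -1 if offset_tc.startswith("-") else 1
    parts = offset_tc.lstrip("+-").split(":")
    parts = ["00"] * (4 - len(parts)) + parts  # pad shorthand like "2:00" out to hh:mm:ss:ff
    return frames_to_tc(tc_to_frames(current_tc, fps) + sign * tc_to_frames(":".join(parts), fps), fps)

print(move_relative("01:00:10:00", "+2:00"))  # 01:00:12:00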

This update also improves editing in Resolve. Loading and switching timelines on the edit page is now faster, with improved performance when working with a large number of audio tracks. Compound clips can now be made from in and out points, so editors can be more selective about which media they want to see directly in the edit page. There is also support for previewing timeline audio when performing live overwrites of video-only edits. When trimming, the duration now reflects the clip duration as users actively trim, so they can set a specific clip length. There is also support for a change-transition-duration dialog.

The media pool now includes metadata support for audio files with up to 24 embedded channels. Users can also duplicate clips and timelines into the same bin using copy and paste commands. There is now support for running the primary DaVinci Resolve screen as a window when dual-screen mode is enabled. Smart filters let users sort media based on metadata fields, including keywords and people tags, so users can find the clips they need faster.

Matt Shaw on cutting Conan Without Borders: Ghana and Greenland

By Randi Altman

While Conan O’Brien was airing his traditional one-hour late night talk show on TBS, he and his crew would often go on the road to places like Cuba, South Korea and Armenia for Conan Without Borders — a series of one-hour specials. He would focus on regular folks, not celebrities, and would embed himself into the local culture… and there was often some very mediocre dancing, courtesy of Conan. The shows were funny, entertaining and educational, and he enjoyed doing them.

Conan and Matt on the road.

In 2019, Conan and his crew, Team Coco, switched the nightly show from one hour to a new 30-minute format. The format change allowed them to produce three to four hour-long Conan Without Borders specials per year. Two of the places the show visited last year were Ghana and Greenland. As you might imagine, they shoot a lot of footage, which all must be logged and edited, often while on the road.

Matt Shaw is one of the editors on Conan, and he went on the road with the show when it traveled to Greenland. Shaw’s past credits include Deon Cole’s Black Box and The Pete Holmes Show (both from Conan O’Brien’s Conaco production company) and The Late Late Show with James Corden (including Carpool Karaoke). One of his first gigs for Team Coco was editing Conan Without Borders: Made in Mexico. That led to a full-time editing gig on Conan on TBS and many fun adventures.

We reached out to Shaw to find out more about editing these specials and what challenges he faced along the way.

You recently edited Conan Without Borders — the Greenland and Ghana specials. Can you talk about preparing for a job like that? What kind of turnaround did you have?
Our Ghana special was shot back in June 2019, with the original plan to air in August, but it was pushed back to November 7 because of how fast the Greenland show came up.

In terms of prep for a show like Ghana, we mainly just know the shooting specs and will handle the rest once the crew actually returns. For the most part, that’s the norm. Ideally, we’ll have a working dark week (no nightly Conan show), and the three editors — me, Rob Ashe and Chris Heller — will take the time to offload, sync and begin our first cuts of everything. We’ll have been in contact with the writers on the shoot to get an idea of what pieces were shot and their general notes from the day.

With Greenland, we had to mobilize and adjust everything to accommodate a drastically different shoot/delivery schedule. The Friday before leaving, while we were prepping the Ghana show to screen for an audience, we heard there might be something coming up that would push Ghana back. On Monday, we heard the plan was to go to Greenland on Wednesday evening, after the nightly show, and turn around Greenland in place of Ghana’s audience screening. We had to adjust the nightly show schedule to still have a new episode ready for Thursday while we were in Greenland.

How did you end up on the Greenland trip?
Knowing we’d only have six days between returning from Greenland and having to finish the show for broadcast, our lead editor, Rob Ashe, suggested we send an editor to work on location. We were originally looking into sending footage via Aspera from a local TV studio in Nuuk, Greenland, but we just wouldn’t have been able to turn it around fast enough. We decided about two days before the trip began that I’d go and do what I could to offload, back up, sync and do first cuts on everything.

How much footage did you have per episode, and what did they shoot on?
Ghana had close to 17 hours of material shot over five days on Sony Z450s at 4K XAVC, 29.97. Greenland was closer to 12 hours shot over three days on Panasonic HPX 250s, P2 media recording at 1080 60i.

We also used iPhone/iPad/GoPro footage picked up by the rest of the crew as needed for both shows. I also had a DJI Osmo pocket camera to play with when I had a chance, and we used some of that footage during the montage of icebergs.

So you were editing segments while they were still shooting?
In Greenland, I was cutting daily in the hotel. Midday, I’d get a drop of cards, then offload, sync/group and do first cuts on everything. We had a simple offline edit workflow set up where I’d upload my cuts to Frame.io and email my project files to the team — Rob and Chris — in Burbank. They would then download and sync the Frame.io file to a top video layer in the timeline and continue cutting down, with any additional notes from the writers.

Generally, I’d have everything from Day One uploaded by the start of Day Two, and so on. It seemed to work out pretty well to set us up for success when we returned. I was also getting requests to cut a few highlights from our remotes to put on Team Coco’s Instagram account.

On our return day, we flew to Ilulissat for an iceberg expedition. We had about two hours on the ground before having to return to the airport and fly to Kangerlussuaq, where our chartered plane was waiting to take us back to California. On the flight back, I worked for another four hours or so to sort through the remaining segments and prep everything so we could hit the ground running the following morning. During the flight home, we screened some drone footage from the iceberg trip for Conan, and it really got everyone excited.
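To give a sense of what the offload-and-verify step Shaw describes can look like, here is a minimal Python sketch of a copy-twice-and-checksum approach. It is not Team Coco’s actual pipeline, and the card and drive paths in the final comment are hypothetical.

import hashlib
import shutil
from pathlib import Path

def md5(path):
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def offload(card, destinations):
    """Copy every file on the card to each destination and confirm the checksums match."""
    for src in Path(card).rglob("*"):
        if not src.is_file():
            continue
        source_sum = md5(src)
        for dest_root in destinations:
            dest = Path(dest_root) / src.relative_to(card)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dest)
            if md5(dest) != source_sum:
                raise RuntimeError("Checksum mismatch: %s -> %s" % (src, dest))

# Hypothetical paths for illustration:
# offload("/Volumes/P2_CARD_01", ["/Volumes/G-RAID/day01", "/Volumes/GTECH_A/day01"])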

What are the challenges of working on the road and with such tight turnarounds?
The night we left for Greenland was preceded by a nightly show in Burbank. After the show ended, we hopped on a plane to fly eight hours to Kangerlussuaq for customs, then another to Nuuk. The minute we landed, we were filming for about three hours before checking into the hotel. I grabbed the morning’s camera cards, went to my room and began cutting. By the time I went to bed, I had cuts done of almost everything from the first day. I’m a terrible sleeper on planes, so the marathon start was pretty insane.

Outside of the lack of sleep, our offload speeds were slower because we were using different cameras than usual — for the sake of traveling lighter — since the plane we flew in had specific weight restrictions. We actually had to hire local crew for audio and B and C camera because there wasn’t enough room for everyone in the plane to start with.

In general, I think the overall trip went as smoothly as it could have. It would be interesting to see how it would play out for a longer shoot schedule.

What editing system did you use? What was your setup like? What kind of storage were you using?
On the road I had my MacBook Pro (2018 model), and we rented an identical backup machine in case mine died. For storage, we had four 1TB G-Tech USB-C drives and a 4TB G-RAID to back everything up. I had a USB-3.0 P2 card reader as well and multiple backup readers. A Bluetooth mouse and keyboard rounded out the kit, so I could travel with everything in a backpack.

We had to charter a plane in order to fly directly to Greenland. With such a tight turnaround between filming and delivering the actual show, this was the only way to actually make the special happen. Commercial flights fly only a few days per week out of neighboring countries, and once you’re in Greenland, you either have to fly or take a boat from city to city.

Matt Shaw editing on plane.

On the plane, there was a conference table in the back, so I set up there with one laptop and the G-RAID to continue working. The biggest trouble on the plane was making sure everything stayed secure on the table while taking off and making turns. There were a few close calls when everything started to slide away, and I had to reach to make sure nothing was disconnected.

How involved in the editing is Conan? What kind of feedback did you get?
In general, if Conan has specific notes, the writers will hear them during or right after a shoot is finished. Or we’ll test-screen something after a nightly show taping and indirectly get notes from the writers then.

There will be special circumstances, like our cold opens for Comic-Con, when Conan will come to edit and screen a close-to-final cut. And there just might be a run of jokes that isn’t as strong, but he lets us work with the writers to make what we all think is the best version by committee.

Can you point to some of the more challenging segments from Greenland or Ghana?
The entire show is difficult with the delivery-time constraints while handling the nightly show. We’ll be editing the versions for screening sometimes up to 10 minutes before they have to screen for an audience as well as doing all the finishing (audio mix, color as needed, subtitling and deliverables).

For any given special, we’re each cutting our respective remotes during the day while working on any new comedy pieces for that day’s show, then one or two of us will split the work on the nightly show, while the other keeps working with the travel show writers. In the middle of it all, we’ll cut together a mini tease or an unfinished piece to play into that night’s show to promote the specials, so the main challenge is juggling 30 things at a time.

For me, I got to edit this 1980s-style action movie trailer based on an awesome poster Conan had painted by a Ghanaian artist. We had puppets built, a lot of greenscreen and a body double to composite Conan’s head onto for fight scenes. Story-wise, we didn’t have much of a structure to start, but we had to piece something together in the edit and hope it did the ridiculous poster justice.

The Thursday before our show screened for an audience was the first time Mike Sweeney (head writer for the travel shows) had a chance to look at any greenscreen footage and knew we were test-screening it the following Monday or Tuesday. It started to take shape when one of our graphics/VFX artists, Angus Lyne, sent back some composites. In the end, it came together great and killed with the audience and our staff, who had already seen anything and everything.

Our other pieces seem to have a linear story, and we try to build the best highlights from any given remote. With something like this trailer, we have to switch our thought process to really build something from scratch. In the case of Greenland and Ghana, I think we put together two really great shows.

How challenging is editing comedy versus drama? Or editing these segments versus other parts of Conan’s world?
In a lot of the comedy we cut, the joke is king. There are always instances when we have blatant continuity errors, jump cuts, etc., but we don’t have to kill ourselves trying to make it work in the moment if it means hurting the joke. Our “man on the street” segments are great examples of this. Obviously, we want something to be as polished and coherent as possible, but there are cases when it just isn’t best, in our opinion, and that’s okay.

That being said, when we do our spoofs of whatever ad or try to recreate a specific style, we’re going to do everything to make that happen. We recently shot a bit with Nicholas Braun from Succession where he’s trying to get a job from Conan during his hiatus from Succession. This was a mix of improv and scripted, and we had to match the look of that show. It turned out well and funny and is in the vein of Succession.

What about for the Ghana show?
For Ghana, we had a few segments that were extremely serious and emotional. For example, Conan and Sam Richardson visited Osu Castle, a major slave trade port. This segment demands care and needs to breathe so the weight of it can really be expressed, versus earlier in the show, when Conan was buying a Ghana shirt from a street vendor, and we hard-cut to him wearing a shirt 10 sizes too small.

And Greenland?
Greenland is a place really affected by climate change. My personal favorite segment I’ve cut on these travel specials is one about the impact the melting ice caps could have on the world. It’s followed by a montage of the icebergs we saw and then by Conan attempting to stake a “Sold” sign on an iceberg, signifying he had bought property in Greenland for the US. Originally, the montage had a few jokes within it, but we quickly realized it’s so beautiful we shouldn’t cheapen it. We just let it be beautiful.

Comedy or drama, it’s really about being aware of what you have in front of you and what the end goal is.

What haven’t I asked that’s important?
For me, it’s important to acknowledge how talented our post team is to be able to work simultaneously on a giant special while delivering four shows a week. Being on location for Greenland also gave me a taste of the chaos the whole production team and Team Coco goes through, and I think everyone should be proud of what we’re capable of producing.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

The Den editorial boutique launches in Los Angeles

Christjan Jordan, editor of award-winning work for clients including Amazon, GEICO and Hulu, has partnered with industry veteran Mary Ellen Duggan to launch The Den, an independent boutique editorial house in Los Angeles.

Over the course of his career, Jordan has worked with Arcade Edit, Cosmo Street and Rock Paper Scissors, among others. He has edited such spots as Alexa Loses Her Voice for Amazon, Longest Goal Celebration Ever for GEICO, #notspecialneeds for World Down Syndrome Day out of Publicis NY and Super Bowl 2020 ads Tom Brady’s Big Announcement for Hulu and Famous Visitors for Walmart. Jordan’s work has been recognized by the Cannes Lions, AICE, AICP, Clio, D&AD, One Show and Sports Emmy awards.

“Yes, with Mary Ellen, agency producers are guided by an industry veteran who knows exactly what agencies and clients are looking for,” says Jordan. “And for me, I love fostering young editors. It’s an interesting time in our industry and there is a lot of fresh creative talent.”

In her career, Duggan has headed production departments at both KPB and Cliff Freeman on the East Coast and, most recently, Big Family Table in Los Angeles. In addition, she has freelanced all over the country.

“The stars aligned for Christjan and I to work together,” says Duggan. “We had known each other for years and had recently worked on a Hulu campaign together. We had a similar vision for what we thought the editorial experience should be. A high end boutique editorial that is nimble, has a roster of diverse talent, and a real family vibe.”

Veteran producer Rachel Seitel has joined as partner and head of business development. The Den will be represented by Diane Patrone at The Family on the East Coast and by Ezra Burke and Shane Harris on the West Coast.

The Den’s founding roster also features editor Andrew Ratzlaff and junior editor Hannelore Gomes. The staff works on Avid Media Composer and Adobe Premiere.

LVLY adds veteran editor Bill Cramer

Bill Cramer, an editor known for his comedy and dialogue work, among other genres, has joined the editorial roster at LVLY, a content creation and creative studio based in New York City.

Cramer joins from Northern Lights. Prior to that he had spent many years at Crew Cuts, where he launched his career and built a strong reputation for his work on many ads and campaigns. Clients included ESPN, GMC, LG, Nickelodeon, Hasbro, MLB, Wendy’s and American Express. Check out his reel.

Cramer reports that he wasn’t looking to make a move but that LVLY’s VP/MD, Wendy Brovetto, inspired him. “Wendy and I knew of each other for years, and I’d been following LVLY since they did their top-to-bottom rebranding. I knew that they’re doing everything from live-action production to podcasting, VFX, design, VR and experiential, and I recognized that joining them would give me more opportunities to flex as an editor. Being at LVLY gives me the chance to take on any project, whether that’s a 30-second commercial, music video or long-form branded content piece; they’re set up to tackle any post production needs, no matter the scale.”

“Bill’s a great comedy/dialogue editor, and that’s something our clients have been looking for,” says Brovetto. “Once I saw the range of his work, it was an easy decision to invite him to join the LVLY team. In addition to being a great editor, he’s a funny guy, and who doesn’t need more humor in their day?”

Cramer, who works on both Avid Media Composer and Adobe Premiere, joins an editorial roster that includes Olivier Wicki, J.P. Damboragian, Geordie Anderson, Noelle Webb, Joe Siegel and Aaron & Bryan.

Behind the Title: Dell Blue lead editor Jason Uson

This veteran editor started his career at LA’s Rock Paper Scissors, where he spent four years learning the craft from editors such as Bee Ottinger and Angus Wall. After freelancing at Lost Planet, Spot Welders and Nomad, he held staff positions at Cosmo Street, Harpo Films and Beast Editorial before opening Foundation Editorial, his own post boutique in Austin.

NAME: Jason Uson

COMPANY: Austin, Texas-based Dell Blue

Can you describe what Dell Blue does?
Dell Blue is the in-house agency for Dell Technologies.

What’s your job title?
Senior Lead Creative Editor

What does that entail?
Outside of the projects that I am editing personally, there are multiple campaigns happening simultaneously at all times. I oversee all of them and have my eyes on every edit, fostering and mentoring our junior editors and producers to help them grow in their careers.

I’ve helped establish and maintain the process regarding our workflow and post pipeline. I also work closely with our entire team of creatives, producers, project managers and vendors from the beginning of each project and follow it through from production to post. This enables us to execute the best possible workflow and outcome for every project.

To add another layer to my role, I am also directing spots for Dell when the project is right.

Alienware

That’s a lot! What else would surprise people about what falls under that title?
The number of hours that go into making sure the job gets done and is the best it can be. Editing is a process that takes time. Creating something of value that means something is an art no matter how big or small the job might be. You have to have pride in every aspect of the process. It shows when you don’t.

What’s your favorite part of the job?
I have two favorites. The first is the people. I know that sounds cliché, but it’s true. The team here at Dell is truly something special. We are family. We work together. Play together. Happy Hour together. Respect, support and genuinely care for one another. But, ultimately, we care about the work. We are all aligned to create the best work possible. I am grateful to be surrounded by such a talented and amazing group of humans.

The second, which is equally important to me, is the process of organizing my project, watching all the footage and pulling selects. I make sure I have what I need and check it off my list. Music, sound effects, VO track, graphics and anything else I need to get started. Then I create my first timeline. A blank, empty timeline. Then I take a deep breath and say to myself, “Here we go.” That’s my favorite.

Do you have a least favorite?
My least favorite part is wrapping a project. I spend so much time with my clients and creatives and we really bond while working on a project together. We end on such a high note of excitement and pride in what we’ve done and then, just like that, it’s over. I realize that sounds a bit dramatic. Not to worry, though, because lucky for me, we all come back together in a few months to work on something new and the excitement starts all over again.

What is your most productive time of day?
This also requires a two-part answer. The first is early morning. This is my time to get things done, uninterrupted. I go upstairs and make a fresh cup of coffee. I open my deck doors. I check and send emails, and get my personal stuff done. This clears out all of my distractions for the day before I jump into my edit bay.

The second part is late at night. I get to replay all of the creative decisions from the day and explore other options. Sometimes, I get lucky and find something I didn’t see before.

If you didn’t have this job, what would you be doing instead?
That’s easy. I’d be a chef. I love to cook and experiment with ingredients. And I love to explore and create an amazing dining experience.

I see similarities between editors and chefs. Both aim to create something impactful that elicits an emotional response from the “elements” they are given. For chefs, the ingredients, spices and techniques are creatively brought together to bring a dish to life.

For editors, the “elements” I am given — combined with my style, techniques, sound design, graphics, music, etc. — all give life to a spot.

How early did you know this would be your path?
I had originally moved to Los Angeles with dreams of becoming an actor. Yes, it’s groundbreaking, I know. During that time, I met editor Dana Glauberman (The Mandalorian, Juno, Up in the Air, Thank You for Smoking, Creed II, Ghostbusters: Afterlife). I had lunch with her at the studios one day in Burbank and went on a tour of the backlot. I got to see all the edit bays, film stages, soundstages and machine rooms. To me, this was magic. A total game-changer in an instant.

While I was waiting on that one big role, I got my foot in the door as a PA at editing house Rock Paper Scissors. One night after work, we all went for drinks at a local bar, and every commercial on TV was one that (editors) Angus Wall and Adam Pertofsky had worked on within the last month. I was blown away. Something clicked.

This entire creative world behind the scenes was captivating to me. I made the decision at that moment to lean in and go for it. I asked the assistant editor the following morning if he would teach me — and I haven’t looked back. So, Dana, Angus and Adam… thank you!

Can you name some of your recent projects?
I edited the latest global campaign for Alienware called Everything Counts, which was directed by Tony Kaye. More recently, I worked on the campaign for Dell’s latest and greatest business PC laptop that launches in March 2020, which was directed by Mac Premo.

Dell business PC

Side note: I highly recommend Googling Mac Premo. His work is amazing.

What project are you most proud of?
There are two projects that stand out for me. The first one is the very first spot I ever cut — a Budweiser ad for director Sam Ketay and the Art Institute of Pasadena. During the edit, I thought, “Wow, I think I can do this.” It went on to win a Clio.

The second is the latest global campaign for Alienware, which I mentioned above. Director Tony Kaye is a genius. Tony and I sat in my edit bay for a week exploring and experimenting. His process is unlike any other director I have worked with. This project was extremely challenging on many levels. I honestly started looking at footage in a very different way. I evolved. I learned. And I strive to continue to grow every day.

Name three pieces of technology you can’t live without.
Wow, good question. I guess I’ll be that guy and say my phone. It really is a necessity.

Spotify, for sure. I am always listening to music in my car and trying to match artists with projects that are not even in existence yet.

My Bose noise cancelling headphones.

What social media channels do you follow?
I use Facebook and LinkedIn — mainly to stay up to date on what others are doing and to post my own updates every now and then.

I’m on Instagram quite a bit. Outside of the obvious industry-related accounts I follow, here are a few of my random favorites:

@nuts_about_birds
If you love birds as much as I do, this is a good one to follow.

@sergiosanchezart
This guy is incredible. I have been following his work for a long time. If you are looking for a new tattoo, look no further.

@andrewhagarofficial
I was lucky enough to meet Andrew through my friend @chrisprofera and immediately dove into his music. Amazing. Not to mention his dad is Sammy Hagar. Enough said.

@kaleynelson
She’s a talented photographer based in LA. Her concert stills are impressive.

@zuzubee
I love graffiti art, and Zuzu is one of the best. Based in Austin, she has created several murals for me. You can see her work all over the city, as well as installations during SXSW and Austin City Limits, on Bud Light cans, and across the US.

Do you listen to music at work? What types?
I do listen to music when I work but only when I’m going through footage and pulling selects. Classical piano is my go-to. It opens my mind and helps me focus and dive into my footage.

Don’t get me wrong, I love music. But if I am jamming to my favorite, Sammy Hagar, I can’t drive…I mean dive… into my footage. So classical piano for me.

How do you de-stress from it all?
This is an understatement, but there are a few things that help me out. Sometimes during the day, I will take a walk around the block. Get a little vitamin D and fresh air. I look around at things other than my screen. This is something (editors) Tom Muldoon and John Murray at Nomad used to do every day. I always wondered why. Now I know. I come back refreshed and with my mind clear and ready for the next challenge.

I also “like” to hit the gym immediately after I leave my edit bay. Headphones on (Sammy Hagar, obviously), stretch it out and jump on the treadmill for 30 minutes.

All that is good and necessary for obvious reasons, but getting back to cooking… I love being in the kitchen. It’s therapy for me. Whether I am chopping and creating in the kitchen or out on the grill, I love it. And my wife appreciates my cooking. Well, I think she does at least.

Photo Credits: Dell PC and Jason Uson images – Chris Profera

Destin Daniel Cretton talks directing Warner’s Just Mercy

By Iain Blair

An emotionally powerful and thought-provoking true story, Just Mercy is the latest film from award-winning filmmaker Destin Daniel Cretton (The Glass Castle, Short Term 12), who directed the film from a screenplay he co-wrote. Based on famed lawyer and activist Bryan Stevenson’s memoir, “Just Mercy: A Story of Justice and Redemption,” which details his crusade to defend, among others, wrongly accused prisoners on death row, it stars Michael B. Jordan and Oscar winners Jamie Foxx and Brie Larson.

The story starts when, after graduating from Harvard, Stevenson (Jordan) — who had his pick of lucrative jobs — instead heads to Alabama to defend those wrongly condemned or who were not afforded proper representation, with the support of local advocate Eva Ansley (Larson).

One of his first cases is that of Walter McMillian (Foxx), who, in 1987, was sentenced to die for the murder of an 18-year-old girl, despite evidence proving his innocence. In the years that follow, Stevenson becomes embroiled in a labyrinth of legal and political maneuverings as well as overt racism as he fights for Walter, and others like him, with the odds — and the system — stacked against them.

This case becomes the main focus of the film, whose cast also includes Rob Morgan as Herbert Richardson, a fellow prisoner who also sits on death row; Tim Blake Nelson as Ralph Myers, whose pivotal testimony against Walter McMillian is called into question; Rafe Spall as Tommy Chapman, the DA who is fighting to uphold Walter’s conviction and sentence; O’Shea Jackson Jr. as Anthony Ray Hinton, another wrongly convicted death row inmate whose cause is taken up by Stevenson; and Karan Kendrick as Walter’s wife, Minnie McMillian.

Cretton’s behind-the-scenes creative team included DP Brett Pawlak, co-writer Andrew Lanham, production designer Sharon Seymour, editor Nat Sanders and composer Joel P. West, all of whom previously collaborated with the director on The Glass Castle.

Destin Daniel Cretton

I spoke with the director about making the film, his workflow and his love of post.

When you read Brian’s book, did you feel compelled to take this on?
I did. His voice and the way he tells the story about these characters, who seem so easy to judge at first. Then he starts peeling off all the layers, and the way he uses humor in certain areas and devastation in others. Somehow it still makes you feel hopeful and inspired to do something about all the injustice – all of it just hit me so hard, and I felt I had to be involved in it some way.

Did you work very closely with him on the film?
I did. Before we even began writing a word, we went to meet him in Montgomery, and he introduced us to the real Anthony Ray Hinton and a bunch of lawyers working on cases. Brian was with us through the whole writing process, filling in the blanks and helping us piece the story together. We did a lot of research, and we had the book, but it obviously couldn’t include everything. Brian gave us all the transcripts of all the hearings, and a lot of the lines were taken directly from those.

This is different from most other courtroom dramas, as the trial’s already happened when the movie begins. What sort of film did you set out to make?
We set out to make the book in as compelling a way as possible. And it’s a story about this young lawyer who’s trying to convince the system and state they made a terrible mistake, with all the ups and downs, and just how long it takes him to succeed. That’s the drama.

What were the main challenges in pulling it all together?
Telling a very intense, true story about people, many of whom are still alive and still doing the work they were doing then. So accuracy was a huge thing, and we all really felt the burden and responsibility to get it right. I felt it more so than any film I’ve ever done because I respect Brian’s work so much. We’re also telling stories about people who were very vulnerable.

Trying to figure out how to tell a narrative that still moved at the right pace and gave you an emotional ride, but stayed completely accurate to the facts and to a legal process that moves incredibly slowly, was very challenging. A big moment for me was when Brian first saw the film and gave me a big hug and a thank you; he told me it was not for how he was portrayed, but for how we took care of his clients. That was his big concern.

What did Jamie and Michael bring to their roles?
They’ve been friends for a long time, so they already had this great natural chemistry, and they were able to play through scenes like two jazz musicians and bring a lot of stuff that wasn’t there on the page.

I heard you actually shot in the south. How tough was the shoot?
Filming in some of the real locations really helped. We were able to shoot in Montgomery — such as the scenes where Brian’s doing his morning jogs, the Baptist church where MLK Jr. was the pastor, and then the cotton fields and places where Walter and his family actually lived. Being there and feeling the weight of history was very important to the whole experience. Then we shot the rest of the film in Atlanta.

Where did you post?
All in LA on the Warner lot.

Do you like the post process?
I love post and I hate it (laughs). And it depends on whether you’re finding a solution to a problem or you’re realizing you have a big problem. Post, of course, is where you make the film and where all the problems are exposed… the problems with all the choices I made on set. Sometimes things are working great, but usually it’s the problems you’re having to face. But working with a good post team is so fulfilling, and you’re doing the final rewrite, and we solved so many things in post on this.

Talk about editing with your go-to Nat Sanders, who got an Oscar nom for his work (with co-editor Joi McMillon) on Moonlight and also cut If Beale Street Could Talk.
Nat wasn’t on set. He began cutting material here in LA while we shot on location in Atlanta and Alabama, and we talked a lot on the phone. He did the first assembly, which was just over three hours long. All the elements were there, but shaping the material and fine-tuning it took nearly a year as we went through every scene, talking them out.

Finding the correct emotional ride and balance was a big challenge, as this film has so many emotional highs and lows, and you can easily tire an audience out. We had to cut some storylines that were working but were sending people on another emotional down when they needed something lighter. The other part of it was performance, and you can craft so much of that in the edit; our leads gave us so many takes and options to play with. Dealing with that is one of Nat’s big strengths. Both of us are meticulous, and we did a lot of test screenings and kept making adjustments.

Writer Iain Blair (left) and director Destin Daniel Cretton.

Nat and I both felt the hardest scene to cut and get right was Herb’s execution scene, because of the specific tone needed. If you went too far in one direction, it felt too much, but if you went too far the other way, it didn’t quite hit the emotional beat it needed. So that took a lot of time, playing around with all the cross-cutting and the music and sound to create the right balance.

All period films need VFX. What was entailed?
Crafty Apes did them, and we did a lot of fixes, added period stuff and did a lot of wig fixes — more than you’d think (laughs). We weren’t allowed to shoot at the real prison, so we had to create all the backdrops and set extensions for the death row sequences.

Can you talk about the importance of sound and music.
It’s always huge for me, and I’ve worked with my composer, Joel, and supervising sound editor/re-recording mixer Onnalee Blank, who was half of the sound team, since the start. For both of them, it was all about finding the right tone to create just the right amount of emotion that doesn’t overdo it, and Joel wrote the score in a very stripped-down way and then got all these jazz musicians to improvise along with the score.

Where did you do the DI and how important is it to you?
That’s huge too, and we did it at Light Iron with colorist Ian Vertovec. He’s worked with my DP on almost every project I’ve done, and he’s so good at grading and giving you a very subtle palette.

What’s next?
We’re currently in preproduction on Shang-Chi and the Legend of the Ten Rings, featuring Marvel’s first Asian superhero. It’s definitely a change of pace after this.


 

Visible Studios produces, posts Dance Monkey music video

If you haven’t heard about the song Dance Monkey by Tones and I, you soon will. Australia’s Visible Studios provided production and post on the video for the song, which has hit number one in more than 30 countries, gone seven times platinum and remained at the top of the Australian charts for 22 weeks. The video has been viewed on YouTube more than half a billion times.

Visible Studios, a full production and post company, is run by producer Tim Whiting and director and editor Nick Kozakis. The company features a team of directors, scriptwriters, designers, motion graphic artists and editors working on films, TV commercials and music videos.

For Dance Monkey, Visible Studios worked directly with Tones and I to develop the idea for the video. The video, which was shot on Red cameras at the beginning of the song’s meteoric rise, was completed in less than a week and on a small budget.

“The Dance Monkey music video was made on an extremely quick turnaround,” says Whiting. “[Tones] was blowing up at the time, and they needed the music video out fast. The video was shot in one day, edited in two, with an extra day and a half for color and VFX.” Visible Studios called on Blackmagic’s DaVinci Resolve Studio for the edit, VFX and color.

Dance Monkey features the singer dressed as Old Tones, an elderly man whisked away by his friends to a golf course to dance and party. On the day of production, the sun was nowhere to be found, and each shot was made against a gray and dismal background. To fix this, the team brought in a sky image as a matte and used Resolve’s match move tool, keyer, lens blur and power windows to turn the gray footage into brilliant sunshine.

“In post we decided to replace the overcast skies with a cloudy blue sky. We ended up doing this all in Resolve’s color page and keyed the grass and plants to make them more lush, and we were there,” says Whiting.
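As a rough sketch of the idea behind that kind of key — bright, desaturated overcast pixels become the matte that lets a sky plate show through — here is a simplified NumPy version. It is not the node tree Visible Studios built in Resolve, and the luma and saturation thresholds are made-up values.

import numpy as np

def replace_sky(frame, sky_plate, luma_min=0.75, sat_max=0.15):
    """frame and sky_plate are float RGB images in [0, 1] with the same shape."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b
    saturation = frame.max(axis=-1) - frame.min(axis=-1)
    # Soft key: approaches 1.0 where a pixel is bright and desaturated (flat overcast sky).
    key = np.clip((luma - luma_min) / 0.1, 0, 1) * np.clip((sat_max - saturation) / 0.05, 0, 1)
    key = key[..., None]
    return key * sky_plate + (1 - key) * frame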

Editor/directors Kozakis and Liam Kelly used Resolve for the entire editing process. “Being able to edit 6K raw footage smoothly on a 4K timeline, at a good quality debayer, means that we don’t have to mess around with proxies and that the footage gets out of the way of the editing process. The recent update for decompression and debayer on Nvidia cards has made this performance even better,” Kozakis says.

 

Review: Neat Video 5 noise reduction plugin

By Brady Betzel

One of the best (and most underrated) tricks in an online editor’s tool kit is good image restoration technique. Removing imperfections — from flicker to digital video noise — is not easy, and it’s even harder to do well. That is, unless you have good noise reduction software like Neat Video.

While Neat Video might not be that well-known, once you see how simply (or intricately) Neat Video 5 works inside of apps like Blackmagic’s DaVinci Resolve, it will be hard to forget the company’s name.

(While the software was recently updated to 5.1.5 — with expanded GPU support as well as support for new versions of Resolve, Adobe and Nuke — nothing really changes for this review. You can check out a detailed list of the updates here.)

Neat Video 5 is a noise reduction plugin. In a Windows OS environment, Neat Video is compatible with apps like Adobe After Effects, Adobe Premiere Pro, DaVinci Resolve, Avid Media Composer, Vegas, Magix, Edius, Virtual Dub, and the OFX-compatible apps Nuke, Fusion, Scratch, HitFilm, Mamba, Natron, Flame, Baselight and DustBuster. In a macOS environment, Neat Video 5 is compatible with After Effects, Premiere, Final Cut Pro X, Motion 5, OFX, Resolve and Media Composer. In Linux, the software is compatible with OFX-compatible apps and Resolve.

Neat Video 5 comes in three flavors: Demo, Home and Pro. The Demo version works in up to 1280×720 resolution with a watermark. Home is made for the home user: It will process video up to 1920×1080 resolution, it will use up to one GPU, and it is for non-commercial use. The cost is just $74.90 for most apps (Resolve is $89.90). The Pro version has no resolution restrictions, will work on two or more GPUs simultaneously, and can be used commercially. The Pro version starts at $129.90 per app ($159.90 for Resolve). Because Neat Video 5 for OFX works with so many apps, it only comes in Pro ($249.90) and Studio ($349.90) versions. The Studio version adds a floating license option. You can see all of the pricing details here.

If there is one line you should take away from this review, it is this: Neat Video 5 is by far the easiest and best noise reduction software I have used in any application to date. And while this review is focusing on the Resolve version of Neat Video 5, all other apps work in much the same way. You can find Neat Video’s software-specific Quick Start Guides to help. Once you install and register your Neat Video 5 license, removing digital video noise is as easy as applying Neat Video 5 to a node in the color tab, clicking on “Prepare Noise Profile,” clicking on “Auto Profile,” and clicking “Apply.” Then, unless you want to fine-tune your noise reduction, you are done. Obviously, I have somewhat simplified how Neat Video 5 works, but essentially it can be done in as little as three steps per clip, and the results are typically amazing. If they aren’t amazing, you can jump back into Neat Video 5 and manually adjust specifics until the noise reduction looks correct. But I will say that in about 90% of cases, the Auto Profiling will do all of the noise reduction work necessary.

For tinkerers, or for those who need to go far beyond an Auto Profile, you can manually adjust your settings. But taking a step back, Neat Video needs an area of your image with uniform color and noise in order to build its noise profile. The automatic profiling will do its best to find such an area, but it doesn’t always work. What you need to keep in mind when building a good noise profile inside of Neat Video is that the sampled area needs to be as uniform as possible (think dark night sky or a wall painted in one color) — meaning no prominent features, a high noise level (something in the high four range is better), the largest possible sample area and no warnings from Neat Video.
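
If the idea of a “uniform area” feels abstract, the toy Python sketch below may help. To be clear, it is my own illustration, not Neat Video’s actual algorithm: it scans a frame in blocks, estimates each block’s low-frequency structure (features) and high-frequency variation (noise), and flags the flattest block as a profiling candidate — exactly the kind of patch you would drag a selection box over in the real plugin.

```python
# Toy illustration only (not Neat Video's algorithm): find the block of a
# frame with the least underlying detail, i.e. a good noise-profiling patch.
import numpy as np

def profile_candidate(frame, block=128, cell=8):
    """Scan a grayscale frame in block-sized squares and return the square
    with the least low-frequency structure, plus a rough noise estimate."""
    best = None
    height, width = frame.shape
    for y in range(0, height - block + 1, block):
        for x in range(0, width - block + 1, block):
            tile = frame[y:y + block, x:x + block].astype(np.float64)
            cells = tile.reshape(block // cell, cell, block // cell, cell)
            structure = cells.mean(axis=(1, 3)).var()   # detail/features
            noise = cells.var(axis=(1, 3)).mean()       # high-frequency noise
            if best is None or structure < best[0]:
                best = (structure, noise, (y, x))
    return best  # (structure, noise_estimate, (y, x) of the flattest block)

# Synthetic example: a noisy frame with a gradient everywhere except one
# featureless "wall" in the top-left corner.
rng = np.random.default_rng(1)
frame = rng.normal(120, 6, (720, 1280)) + np.tile(np.linspace(0, 80, 1280), (720, 1))
frame[:128, :128] = 120 + rng.normal(0, 6, (128, 128))
print(profile_candidate(frame))   # the top-left block wins (lowest structure)
```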

So, if your automatic profile doesn’t do the job, you can find an area of your image that meets the above requirements and then build a profile. From there you can use one of the Neat Video 5 features, like “Profile Check.” Profile Check will highlight details that aren’t being affected by Neat Video, giving you a clear representation of what noise is being reduced and whether you need to adjust your profile to better reduce video noise.

At this point you might be wondering where you tweak advanced settings. When you load Neat Video, you will be in Beginner mode. To get into Advanced mode, go to the Tools menu, where you will see a lot of advanced functions that can help you fine-tune your noise profile. And if you still can’t get a good noise reduction profile, you can try out the “Generic Profile,” which can help you build a profile even if your video doesn’t have a large enough area of uniform noise. There are also presets — such as light flicker, moire flicker, repeat-frame issues, dust and scratch filters (including scan lines), jitter of details, artifact removal and more — that can solve specific problems.

Neat Video 5 is faster than previous generations. As in previous versions, there is even a tool inside of Neat Video’s preferences that will run your CPU and GPU through a benchmark to determine whether you should run on CPU only, GPU only, or a combo of both. In Neat Video 5, if you have trouble with a clip, you can use up to four “Variants” of noise reduction in the new playback window to see how each profile works with your clip.

In terms of playback and rendering, noise reduction is never fast. However, inside of Neat Video the new playback window will typically play back your footage to preview the noise reduction before you jump back into Resolve. Inside of Resolve, even in just 1080p, my sequence would crawl to just a few frames of playback per second. It is one of the most processor- and GPU-intensive tasks you will run on your computer.

In my testing, I applied Neat Video 5 to the first node in my color correction tree, followed by a basic color correction, in a one-minute timeline. I took those same clips and compared my Neat Video results to Resolve’s Temporal and Spatial noise reduction tools. In terms of visual results, Neat Video 5 was superior. If that’s not the case for you, jump into YCbCr viewer mode inside of Neat Video 5, isolate each channel and tweak each one individually so you don’t apply more noise reduction than necessary across the whole image. Not only did Neat Video 5 handle normal noise in the shadows well, but on clips with very tight lines it was able to keep a lot of the detail while removing the noise. Resolve’s noise reduction tools had a harder time removing noise while keeping detail. Temporal noise reduction really didn’t do much, and while Spatial noise reduction did work, it heavily blurred and distorted the image — essentially not acceptable.

To get a good example of how Neat Video 5 slams a computer system, I exported 1080p MP4s. Resolve’s built-in Temporal noise reduction took 1:03, while the Spatial noise reduction took 1:05. The Neat Video 5 render of the same one-minute timeline took 3:51 — almost four times as long. I was curious how much longer a 4K render would take. Using 4K (UHD) media, I applied a simple color correction and, on a previous serial node, applied Neat Video 5. I exported a 4K (UHD) MP4, which took 52 seconds without Neat Video 5 applied and 16:27 with Neat Video applied — nearly 19 times the render time! So while Neat Video 5 is an amazing tool, there is a trade-off in high render times.
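
If you want to sanity-check those multipliers, the arithmetic is simple. Here’s a quick sketch in Python — the times are the ones quoted above; the little helper function is mine:

```python
# Back-of-the-envelope check of the render-time multipliers quoted above.
def to_seconds(mmss):
    """Convert an 'm:ss' time string to seconds."""
    minutes, seconds = mmss.split(":")
    return int(minutes) * 60 + int(seconds)

# One-minute 1080p timeline
print(round(to_seconds("3:51") / to_seconds("1:03"), 1))  # ~3.7x vs. Temporal NR
print(round(to_seconds("3:51") / to_seconds("1:05"), 1))  # ~3.6x vs. Spatial NR

# 4K (UHD) export: 52 seconds without Neat Video vs. 16:27 with it
print(round(to_seconds("16:27") / 52, 1))                 # ~19x the render time
```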

To find additional training on more advanced noise reduction techniques in Neat Video, check out the video tutorials. I find myself watching these just because of how much you can learn about noise reduction in general. They aren’t as exciting as watching Game of Thrones or The Handmaid’s Tale, but they will push your knowledge of noise reduction to the next level.

Summing Up
I’ve used Neat Video for a while, so when I was approached to review Version 5, I immediately said yes. Noise reduction is a post skill that not many possess.

If you are an online editor or colorist looking to separate yourself from the pack, learn all the noise reduction techniques you can and definitely check out Neat Video 5. Not only can Neat Video 5 work automatically, but you can fine-tune your noise reduction as much as you want.

And when demoing your color correction services, think about using Neat Video 5 to remove camera noise, flickering and chroma issues; color correcting your footage; and, finally, adding some grain back into your shot. Not only will your footage look better, but you’ll have a technical workflow that will definitely impress clients. Just don’t forget to account for the extra render time.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Oscar-nominated Jojo Rabbit editor Tom Eagles: blending comedy and drama

By Daniel Restuccio

As an editor, Tom Eagles has done it all. He started his career in New Zealand cutting promos before graduating to assistant editor then editor on television series such as Secret Agent Men and Spartacus. Eventually he connected with up-and-coming director Taika Waititi and has worked with him on the series What We Do in the Shadows and the critically acclaimed feature Hunt for the Wilderpeople. Their most recent feature collaboration, 20th Century Fox’s Jojo Rabbit, earned Eagles BAFTA and Oscar nominations as well as an ACE Eddie Award win.

Tom Eagles

We recently caught up with him to talk about the unique storytelling style of Taika films, specifically Jojo Rabbit.

(Warning: If you haven’t seen the film yet, there might be some spoilers ahead.)

How did your first conversation with Taika go?
Fairly early on, unprompted, he gave me a list of his top five favorite films. The kind of scope and variety of it was startling, but they were also my top five favorite films. We talked about Stalker, from filmmaker Andrei Tarkovsky, and I was a massive Tarkovsky fan at the time. He also talked about Annie Hall and Badlands.

At that point in time, there weren’t a lot of people doing the type of work that Taika does: that mix of comedy and drama. That was the moment I thought, “I’ve got to work with this guy. I don’t know if I’m going to find anyone else like this in New Zealand.”

How is Jojo different than your previous collaboration on Hunt for the Wilderpeople?
We had a lot more to work with on Jojo. There’s a lot more coverage in a typical scene, while the Wilderpeople was three shots: a master and two singles. With Jojo, we just threw everything at it. Taika’s learned over the years that it’s never a bad thing to have another shot. Same goes for improv. It’s never a bad thing to have a different line. Jojo was a much bigger beast to work on.

Jojo is rooted in a moment in history, which people know well, and they’re used to a certain kind of storytelling around that moment. I think in the Czech Republic, where we shot, they make five World War II movies a year. They had a certain idea of how things should look, and we weren’t doing that. We were doing Taika’s take, so we weren’t doing desaturated, handheld, grim, kitchen sink realism. We were creating this whole other world. I think the challenge was to try and bring people along on that journey.

I saw an early version of the script, and the Hitler character wasn’t in the opening scene. How did that come about?
One of the great things about working with Taika is he always does pick-ups. Normally, it’s something that we figure out that we need during the process of the edit. He rewrote a bunch of different options for the ending of the movie, a few scenes dotted throughout and the opening of the film.

He shot three versions. In one, it was just Jojo on his own, trying to psych himself up. Then there were variations on how much Adolf we would have in the film. What we found when we screened the film up to that point was that people were on board with the film, but it sometimes took them a while to get there … to understand the tone of the film. The moment we put imaginary Adolf in that scene, it was like planting a flag and saying, “This is what this film is. It’s going to be a comedy about Hitler and Nazis, and you’re either with us or you’re walking out, but if you’re with us, you will find out it’s about a lot more than that.”

Some directors sit right at the editor’s elbow, overlooking every cut, and some go away and leave the editor to make a first cut. What was this experience like?
While I’ve experienced both, Taika’s definitely in the latter category. He’s interested in what you have to say and what you might bring to the edit. He also wants to know what people think, so we screen the film a lot. Across the board — it’s not just isolated to me, but anyone he works with — he just wants more ideas.

After the shooting finished, he gave me two weeks. He went and had a break and encouraged me to do what I wanted with the assembly, to cut scenes and to not be too precious about including everything. I did that, but I was still relatively cautious; there were some things I wanted him to see.

We experimented with various structures. We tried an archiving thing for the end of the film. There was a fantasy sequence in which Elsa is talking about the story of the Jews, and we see flights of fancy of what she thinks … a way for her to escape into fantasy. That was an idea of Taika’s. He just left me to it for a couple of weeks, and we looked at it and decided against it in the end. It was a fun process because when he comes back, he’s super fresh. You offer up one idea and he throws five back.

How long was the first cut?
I asked my assistant the other day, and he said it was about two hours and forty minutes, so I guess I have to go with that, which sounds long to me. That might have been the first compile that had all of the scenes in it, and what I showed Taika was probably half an hour shorter. We definitely had a lot to play with.

Do you think there’s going to be a director’s cut?
I think what you see is the director’s cut. There’s not a version of the film that has more stuff in it than we wanted in it. I think it is pretty much the perfect duration. I might have cut a little bit more because I think I just work that way. There were definitely things that we missed, but I wouldn’t put them back in because of what we gained by taking them out.

We didn’t lean that heavily on comedy once we transitioned into drama. The longer you’re away from Jojo and Elsa, that’s when we found that the story would flounder a little bit. It’s interesting because when I initially read the script, I was worried that we would get bored of that room, and that it would feel too much like a stage play. So we added all of this color and widened the world out. We had these scenes where Jojo goes out into the world, but actually the relationship between the two of them — that’s the story. Each scene in that relationship, the kind of gradual progression toward each other, is what’s moving the story forward.

This movie messes with your expectations, in terms of where you think it’s going or even how it’s saying it. How did you go about creating your own rhythms for that style of storytelling?
I was fortunate in that I already had Taika’s other films to lean on, so partly it was just trying to wrestle this genre into his world … into his kind of subgenre of Taika. It’s really just a sensibility a lot of the time. I was aware that I wanted a breathlessness to the pace of things, especially for the first half of the movie in order to match Jojo’s slightly ADD, overexcited character. That slows down a little bit when it needs to and when he’s starting to understand the world around him a little bit more.

Can you talk about the music?
Music also was important. The needle drops. Taika had a bunch of them already. He definitely had The Beatles and Bowie, and it was fleshing out a few more of those. I think I found the Roy Orbison piece. Temp music was also really important. It was quite hard to find stuff. Taika’s brief was: I don’t want it to sound like all the other movies in the genre. As much as we respected Schindler’s List, he didn’t want it to sound like Schindler’s List.

You edited on Avid Media Composer?
We cut on Avid, and it was the first time I really used ScriptSync. I had been wary of it, to be honest. I watch all the dailies through from head to tail and see the performances in context and feel how they affect me. Once that’s done, ScriptSync is great for comparing takes or swapping out a read on a line. Because we had so much improv on this film, we had to go through and enter all of that in manually. Sometimes we’d use PhraseFind to search on a particular word that I’d remembered an actor saying in an ad-lib. It’s a much faster and more efficient way of finding that stuff.

That said, I still periodically go back and watch dailies. As the film starts to solidify, so does what I’m looking for in the dailies, so I’ll always go back and see if there’s anything that I view differently with the new cut in mind.

You mentioned the difference between Wilderpeople and Jojo in terms of coverage. How much more coverage did you have? Were there multiple cameras?
There were two and sometimes three cameras (ARRI Alexa). Some scenes were single camera, so there was a lot more material mastered. Some directors get a bit iffy about two cameras, but we just rolled it.

If we had the option, we would almost always lean on the A camera, and part of the trick was to try and make it look like his other movies. We wanted the coverage plan to feel simple; it should still feel like a master, a couple of mediums and a couple of singles, all in that very flat framing approach of his. Often, the characters are interacting with each other perpendicular to the camera in these fixed static wides.

Again, one of the things Taika was concerned with was that it should feel like his other movies. Just because we have a dolly, we don’t have to use it every time. We had all of those shots, we had those options, and often it was about paring things back to try and stay in time.

Does he give you a lot of takes, and does he create different emotional variations within those takes?
We definitely had a lot of takes. And, yes, there would be a great deal of variety of performance, whether it’s him just trying to push an actor and get them to a specific place, or sometimes we just had options.

Was there an average — five takes, 10 takes?
It’s really hard to say. These days everyone just does rolling resets. You look at your bin and you think, “Ah, great, they did five takes, and there’s only three set-ups. How long is it going to take me?” But you open it up, and each take is like half an hour long, and they’re reframing on the fly.

With Scarlett Johansson, you do five takes max, probably. But with the kids it would be a lot of rolling resets and sometimes feeding them lines, and just picking up little lines here and there on the fly. Then with the comedians, it was a lot of improv, so it’s hard to quantify takes, but it was a ton of footage.

If you include the archive footage, I think we had 300 to 400 hours. I’m not sure how much of that was our material, but it would’ve been at least 100 hours.

I was impressed by the way you worked the “getting real” scenes: the killing of the rabbit and the hanging scene. How did you conceptualize and integrate those really important moments?
For the hanging scene, I was an advocate for having it as early in the movie as possible. It’s the moment in the film where we’ve had all this comedy and good times [regarding] Nazis, and then it drives home that this film is about Nazis, and this is what Nazis do.

I wanted to keep the rabbit scene fun to a degree because of where it sits in the movie. I know, obviously, it’s quite a freaky scene for a lot of people, but it’s kind of scary in a genre way for me.

Something about those woods always reminds me of Stand by Me. That was the movie that was in my mind — just the idea of those older kids, the bullies, being dicks. Moments like that and, much more so, the moment when Jojo finds Elsa — I thought of that sequence as a mini horror film within the film. That was really useful to let the scares drive it because we were so much in Jojo’s point of view. It’s taking those genres and interjecting a little bit of humor or a little bit of lightness into them to keep them in tone with Taika’s overall sensibility.

I read that you tried to steer clear of the sentimentality. How did you go about doing that?
It’s a question of taste with the performances and things that other people might like. I will often feel I’m feeding the audience or demanding an emotional response from them. Take the scene where Jojo finds Rosie: we shot an option seeing Rosie hanging there. It just felt too much. It felt like it was really bludgeoning people over the head with the horror of the moment. It was enough to see the shoes. Every time we screened the movie and Jojo stands up, we see the shoes and everyone gasps. I think people have gotten the information that they need.


Dan Restuccio is a writer/director with Realwork Entertainment and part of the Visual Arts faculty at California Lutheran University. He is a former Disney Imagineer. You can reach him at dansweb451@gmail.com.

Editor David Cea joins Chicago’s Optimus  

Chicago-based production and post house Optimus has added editor David Cea to its roster. With 15 years of experience in New York and Chicago, Cea brings a varied portfolio of commercial editing experience to Optimus.

Cea has cut spots for brands such as Bank of America, Chevrolet, Exxon, Jeep, Hallmark, McDonald’s, Microsoft and Target. He has partnered with many agencies, including BBDO, Commonwealth, DDB, Digitas, Hill Holliday, Leo Burnett, Mother and Saatchi & Saatchi.

“I grew up watching movies with my dad and knew I wanted to be a part of that magical process in some way,” explains Cea. “The combination of Goodfellas and Monty Python gave me all the fuel I needed to start my film journey. It wasn’t until I took an editing class in college that I discovered the part of filmmaking I wanted to pursue. The editor is the one who gets to shape the final product and bring out the true soul of the footage.”

After studying film at Long Island’s Hofstra University, Cea met Optimus editor and partner Angelo Valencia while working as his assistant at Whitehouse New York in 2005. Cea then moved on to hone his craft further at Cosmo Street in New York. Chicago became home for him in 2013 as he spent three years at Whitehouse. After heading back east for a couple of years, he returned to Chicago to put down roots.

While Avid Media Composer is Cea’s go-to choice for editing, he is also proficient in Adobe Premiere.

FXhome’s HitFilm Express 14, ‘Pay What You Want’ option

FXhome has a new “Pay What You Want” goodwill program inspired by the HitFilm Express community’s requests to be able to help pay for development of the historically free video editing and VFX software. Pay What You Want gives users the option to contribute financially, ensuring that those funds will be allocated for future development and improvements to HitFilm.

Additionally, FXhome will contribute a percentage of the proceeds of Pay What You Want to organizations dedicated to global causes important to the company and its community. At its launch, the FXhome Pay What You Want initiative will donate a portion of its proceeds to the WWF and the Australia Emergency Bushfire Fund. The larger the contribution from customers, the more FXhome will donate.

HitFilm Express remains a free download; however, first-time customers will now have the option to “Pay What You Want” for the software. They’ll also receive some exclusive discounts on HitFilm add-on packs and effects.

Coinciding with the release of Pay What You Want, FXhome is releasing HitFilm Express 14, the first version of HitFilm Express to be eligible for the Pay What You Want initiative. HitFilm Express 14 features a new and simplified export process, new text controls, a streamlined UI and a host of new features.

For new customers who would like to download HitFilm Express 14 and also contribute to the Pay What You Want program, there are three options available:

• Starter Pack Level: With a contribution of as little as $9, new HitFilm Express 14 customers will also receive a free Starter Pack of software and effects that includes:
o Professional dark mode interface
o Edit tools including Text, Split Screen Masking, PiP, Vertical Video, Action Cam Crop
o Color tools including Exposure, Vibrance, Shadows and Highlights, Custom Gray, Color Phase, Channel Mixer and 16-bit color
o Additional VFX packs including Shatter, 3D Extrusion, Fire, Blood Spray and Animated Lasers
• Content Creator Level: With contributions of $19 or more, users will receive everything included in the Starter Pack, as well as:
o Edit: Repair Pack with Denoise, Grain Removal and Rolling Shutter
o Color: LUT Pack with LUTs and Grading Transfer
o Edit: Beautify Pack with Bilateral Blur and Pro Skin Retouch
• VFX Artist Level: Users who contribute from $39 to $99 get everything in the Starter Pack and Content Creator levels plus:
o Composite Toolkit Pack with Wire Removal, Projector, Clone and Channel Swapper
o Composite Pro-Keying Pack for Chroma Keying
o Motion Audio Visual Pack with Atomic Particles, Audio Spectrum and Audio Waveform
o VFX Neon Lights Pack with Lightsword Ultra (2-Point Auto), Lightsword Ultra (4-Point Manual), Lightsword Ultra (Glow Only) and Neon Path
o VFX Lighting Pack with Anamorphic Lens Flares, Gleam, Flicker and Auto Volumetrics

What’s new in HitFilm Express 14
HitFilm Express 14 adds a number of VFX workflow enhancements to enable even more sophisticated effects for content creators, including a simplified export workflow that allows users to export content directly from the timeline and comps, new text controls, a streamlined UI and a host of new features. Updates include:

• Video Textures for 3D Models: Creators who already have the 3D: Model Render Pack can now use a video layer as a texture on a 3D model to add animated bullet holes, cracked glass or changing textures.
• Improvements to the Export Process: In HitFilm Express 14, the Export Queue has become an Export Panel and is much easier to use. Exporting can also be done directly from the timeline and from comps. These “in-context” exports will export the content between the In and Out points, or the entire timeline, using the current default preset (which can be changed from the menu).
• Additional Text Controls: Customizing text in HitFilm Express 14 is now even simpler, with Text panel options for All Caps, Small Caps, Subscript and Superscript. Users can also change the character spacing, horizontal or vertical scale, as well as baseline shift (for that Stranger-Things-style titling).
• Usability and Workflow Enhancements: In addition to the new and improved export process, FXhome has also implemented changes to the interface to further simplify the entire post production process, including a new “composite button” in the media panel, double-click actions and keyboard shortcuts. A new Masking feature adds automation to the workflow: when users double-click the Rectangle or Ellipse tools, a centered mask is automatically placed to fill the center of the screen. Masks are also automatically assigned colors, which can be changed to more easily identify different masks.
• Effects: Users can now double-click the effects panel to apply to the selected layer and drop 2D effects directly onto layers in the viewer. Some effects — such as the Chroma Key and Light Flares — can be dropped on a specific point, or users can select a specific color to key by. Users can also now favorite “effects” for quick and easy access to their five most recently used effects from the ‘Effects’ menu in the toolbar.
• Additional Improvements: Users can now use Behavior effects from the editor timeline, click-drag across multiple layers to toggle “solo,” “locked” or “visibility” settings in one action, and access templates directly from the media panel with the new Templates button. Menus have also been added to the tab of each panel to make customization of the interface easier.
• Open Imerge Pro files in HitFilm: Imerge Pro files can now be opened directly from HitFilm as image assets. Any changes made in the Imerge Pro project will be automatically updated with any save, making it easier to change image assets in real time.
• Introducing Light Mode: The HitFilm Express interface is now available in Light Mode and will open in Light Mode the first time you open the software. Users with a pre-existing HitFilm Express license can easily change back to the dark theme if desired.

HitFilm Express 14 is available immediately and is a free download. Customers downloading HitFilm Express 14 for the first time are eligible to participate in the new Pay What You Want initiative. Free effects and software packs offered in conjunction with Pay What You Want are only available at initial download of HitFilm Express 14.

Nomad Editorial hires eclectic editor Dan Maloney

Nomad Editing Company has added editor Dan Maloney to its team. Maloney is best known for his work cutting wry, eclectic comedy spots in addition to more emotional content. While his main tool is Avid Media Composer, he is also well-versed in Adobe Premiere.

“I love that I get to work in so many different styles and genres. It keeps it all interesting,” he says.

Prior to joining Nomad, Maloney cut at studios such as Whitehouse Post, Cut+Run, Spot Welders and Deluxe’s Beast. Throughout his career, Maloney has used his eye for composition on a wide range of films, documentaries, branded content and commercials, including the Tide Interview spot that debuted at Super Bowl XLII.

“My editing style revolves mostly around performance and capturing that key moment,” he says. “Whether I’m doing a comedic or dramatic piece, I try to find that instance where an actor feels ‘locked in’ and expand the narrative out from there.”

According to Nomad editor/partner Jim Ulbrich, “Editing is all about timing and pace. It’s a craft and you can see Dan’s craftsmanship in every frame of his work. Each beat is carefully constructed to perfection across multiple mediums and genres. He’s not simply a comedy editor, visual storyteller, or doc specialist. He’s a skilled craftsman.”

Adobe Premiere Productions: film projects, collaborative workflows

By Mike McCarthy

Adobe has announced a new set of features coming to its NLE, Premiere Pro, which now supports “Productions” — a way to more easily manage sets of projects being shared between different users. The announcement, which came during the Sundance Film Festival, is targeted at filmmakers working on large-scale projects with teams of people collaborating on site.

Productions extends and refines Premiere’s existing “Shared Project” model, making it easier to manage work spread across a large number of individual projects, which can become unwieldy with the current implementation.

Shared Projects should not be confused with Team Projects, an online project-sharing toolset used across different locations that each have their own local media, or with Adobe Anywhere, a cloud-based streaming editing platform with no local files. Shared Projects are used between users on a local network, usually with high-quality media, with simple mechanisms for passing work between different users. Shared Projects were introduced in Premiere Pro 2018 and included three components. Here, I’m going to tell you what the issues were and how the new Adobe Productions solves them:

1) The ability to add a shortcut to another project into the project panel, which was next to useless. The projects were in no other way connected with each other, and incrementing the target project to a new name (V02) broke the link. The only benefit was to see who might have the shortcutted project open and locked, which brings us to:

2) The ability to lock projects that were open on one system, preventing other users from inadvertently editing them at the same time and overwriting each other’s work, which should have been added a long time ago. This was previously managed through a process called “shout down the hall” before opening projects.

3) And most significantly, the ability to open more than one project at the same time. The previous approach was to import other projects into your existing project, but this resulted in massive project files that took forever to load, among other issues. Opening more than one project at once allowed projects to be broken into smaller individual parts, and then different people could more easily work on different parts at the same time.

For the last two-plus years, large films have been able to break down their work into many smaller projects and distribute those projects between numerous users who are working on various parts. And those users can pass the pieces back and forth without concern for overwriting each other’s work. But there was no central way to control all of those projects, and the master project/Shared Project shortcut system required you either not to version your projects (bad file management) or to relink every project version to the master project (tedious).

You also end up with lots of copies of your media, as every time an asset is used in a different project, a new copy is made in that project. If you update or edit an asset in one project, it won’t change the copies that are used in other projects (master clip effects, relinking, reinterpreting footage, proxies, etc.).

Problems Solved
Premiere’s new Production Panel and tool set are designed to solve those problems. First, it gives you a panel to navigate and explore all of the projects within your entire production, however you structure them within your master project folder. You can see who has what open and when.

When you copy an asset into a sequence from another project, it maintains a reference to the source project, so subsequent changes to that asset (color correction, attaching full-res media, etc.) can propagate to the instance in the sequence of the other project — if both projects are open concurrently to sync.

If the source project can’t be found, the child instance is still a freestanding piece of media that fully functions; it just no longer receives synchronized updates from the master copy. (So you don’t have a huge web of interconnected projects that will all fail if one of them is corrupted or deleted.)

All projects in a Production have the same project settings (scratch disks, GPU renderer, etc.), keeping them in sync and allowing you to update those settings across the production and share render files between users. And all files are stored on your local network for maximum performance and security.

In the application, this allows all of the source media to be in dedicated “dailies” projects, possibly a separate project for every day of filming. Then each scene or reel can be its own project, with every instance in the sequences referencing back to a master file in the dailies. Different editors and assistants can be editing different scenes, and all of them can have any source project open concurrently in read-only mode without conflict. As soon as someone saves changes, an icon will alert users that they can update the copy they have open and unlock it to continue working.
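
To make that dailies/scene structure and the locking behavior easier to picture, here is a purely conceptual Python sketch of the model — my own illustration, not Adobe’s code or scripting API — with master clips living in dailies projects, scene projects holding references to them, and a simple per-project write lock:

```python
# Conceptual model of the Productions idea described above -- an illustration
# only, not Adobe's implementation or API.
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class MasterClip:
    name: str
    grade: str = "ungraded"              # e.g. a look attached to the master copy

@dataclass
class ClipReference:
    source_project: str                  # which dailies project owns the master
    clip_name: str

@dataclass
class Project:
    name: str
    clips: Dict[str, MasterClip] = field(default_factory=dict)   # dailies masters
    sequence: List[ClipReference] = field(default_factory=list)  # scene/reel refs
    locked_by: Optional[str] = None      # who currently has it open for writing

class Production:
    """One settings block and one lock table shared by every project."""
    def __init__(self, settings):
        self.settings = settings         # scratch disks, GPU renderer, etc.
        self.projects = {}

    def add(self, project):
        self.projects[project.name] = project

    def open(self, name, user, read_only=False):
        project = self.projects[name]
        if not read_only:
            if project.locked_by and project.locked_by != user:
                raise RuntimeError(name + " is locked by " + project.locked_by)
            project.locked_by = user     # the write lock the panel surfaces
        return project

    def resolve(self, ref):
        """Follow a reference back to its master clip. (In the real feature the
        referencing copy keeps working even if the source can't be found.)"""
        return self.projects[ref.source_project].clips[ref.clip_name]

# Dailies projects hold the masters; scene projects only hold references.
prod = Production(settings={"renderer": "GPU", "scratch": "/mnt/fast"})
prod.add(Project("Dailies_Day01", clips={"A001_C003": MasterClip("A001_C003")}))
prod.add(Project("Scene_05", sequence=[ClipReference("Dailies_Day01", "A001_C003")]))

scene = prod.open("Scene_05", user="editor_1")                    # takes the write lock
prod.open("Dailies_Day01", user="assistant_1", read_only=True)    # concurrent, no lock
prod.projects["Dailies_Day01"].clips["A001_C003"].grade = "one-light"
print(prod.resolve(scene.sequence[0]).grade)                      # the grade change propagates
```

The real feature obviously does far more, and does it inside Premiere rather than in Python, but the reference-plus-lock idea is the part that changes day-to-day workflow.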

Some Limitations
Moving a sequence from one project to another doesn’t retain a link to the original because that could become a mess quickly. But it would be nice to be able to make edits to “reels” and have those changes reflected in a long-play project that strings those reels together. And with so many projects open at once, it can become difficult to keep track of what sequences go with what project panels.

Ideally, a color-coded panel system would help with that, either with random colors for contrast or with user-assigned colors by type of project. In that case it would still be good to highlight what other panels are associated with the selected panel, since two projects might be assigned the same color.

Summing Up
Regardless of those potential changes, I have been using Shared Projects to its fullest potential on a feature film throughout 2019, and I look forward to the improvements that the new Production panel will bring to my future workflows.



Mike McCarthy is an online editor/workflow consultant with over 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

ACE Eddie Awards: Parasite and Jojo Rabbit among winners

By Dayna McCallum

The 70th Annual ACE Eddie Awards concluded with wins for Parasite (edited by Jinmo Yang) for Best Edited Feature Film (Dramatic) and Jojo Rabbit (edited by Tom Eagles) for Best Edited Feature Film (Comedy). Yang’s win marks the first time in ACE Eddie Awards history that a foreign language film won the top prize.

The winner of the Best Edited Feature Film (Dramatic) category has gone on to win the Oscar for film editing in 11 of the last 15 years. In other feature categories, Toy Story 4 (edited by Axel Geddes, ACE) won Best Edited Animated Feature Film and Apollo 11 (edited by Todd Douglas Miller) won Best Edited Documentary.

For the second year in a row, Killing Eve won for Best Edited Drama Series (Commercial Television), this time for “Desperate Times” (edited by Dan Crinnion). Tim Porter, ACE, took home his second Eddie for Game of Thrones “The Long Night” in the Best Edited Drama Series (Non-Commercial Television) category, and Chernobyl “Vichnaya Pamyat” (edited by Jinx Godfrey and Simon Smith) won Best Edited Miniseries or Motion Picture for Television.

Other television winners included Better Things “Easter” (edited by Janet Weinberg) for Best Edited Comedy Series (Commercial Television), and last year’s Eddie winner for Killing Eve, Gary Dollner, ACE, for Fleabag “Episode 2.1” in the Best Edited Comedy Series (Non-Commercial Television) category.

Lauren Shuler Donner received ACE’s Golden Eddie honor, presented to her by Marvel’s Kevin Feige. In her heartfelt acceptance speech, she noted to an appreciative crowd, “I’ve witnessed many times an editor make chicken salad out of chicken shit.”

Alan Heim and Tina Hirsch received Career Achievement awards, presented by filmmakers Nick Cassavetes and Ron Underwood, respectively. Cathy Repola, national executive director of the Motion Picture Editors Guild, was presented with the ACE Heritage Award. American Cinema Editors president Stephen Rivkin, ACE, presided over the evening’s festivities for the final time, as his second term is ending. Actress D’Arcy Carden, star of NBC’s The Good Place, served as the evening’s host.

Here is the complete list of winners:

BEST EDITED FEATURE FILM (DRAMA):
Parasite 
Jinmo Yang

BEST EDITED FEATURE FILM (COMEDY):
Jojo Rabbit
Tom Eagles

BEST EDITED ANIMATED FEATURE FILM:
Toy Story 4
Axel Geddes, ACE

BEST EDITED DOCUMENTARY (FEATURE):
Apollo 11
Todd Douglas Miller

BEST EDITED DOCUMENTARY (NON-THEATRICAL):
What’s My Name: Muhammad Ali
Jake Pushinsky, ACE

BEST EDITED COMEDY SERIES FOR COMMERCIAL TELEVISION:
Better Things: “Easter”
Janet Weinberg, ACE

BEST EDITED COMEDY SERIES FOR NON-COMMERCIAL TELEVISION:
Fleabag: “Episode 2.1”
Gary Dollner, ACE

BEST EDITED DRAMA SERIES FOR COMMERCIAL TELEVISION: 
Killing Eve: “Desperate Times”
Dan Crinnion

BEST EDITED DRAMA SERIES FOR NON-COMMERCIAL TELEVISION:
Game of Thrones: “The Long Night”
Tim Porter, ACE

BEST EDITED MINISERIES OR MOTION PICTURE FOR TELEVISION:
Chernobyl: “Vichnaya Pamyat”
Jinx Godfrey & Simon Smith

BEST EDITED NON-SCRIPTED SERIES:
VICE Investigates: “Amazon on Fire”
Cameron Dennis, Kelly Kendrick, Joe Matoske, Ryo Ikegami

ANNE V. COATES AWARD FOR STUDENT EDITING
Chase Johnson – California State University, Fullerton


Main Image: Parasite editor Jinmo Yang

Directing bookend sequences for Portals, a horror anthology film

By Hasraf “HaZ” Dulull

Portals is a genre-bending feature film anthology focusing on a series of worldwide blackouts — after which millions of mysterious objects appear everywhere across the planet. While many flee from the sentient objects, some people are drawn toward and into them with horrifying consequences.

Portals

The film was in the final stages of post when writer/director Liam O’Donnell (Beyond Skyline and the upcoming Skylines film) called to see if I would like to get involved and direct some bookend sequences to add more scope and setup, which the producers felt was very much needed. I loved the premise and the world of the anthology, so I said yes. I pitched an idea for an ending that quickly evolved into an extra segment at the end of the film, which I directed. That’s why there are officially four directors on the show, with me getting executive producer and “end-segment created by” credits.

Two of the other sequences are around 20 to 25 minutes each, and O’Donnell’s sequence was around 35 minutes. The film is 85 minutes long. Eduardo Sanchez and Gregg Hale (The Blair Witch Project) co-directed their segments. So the anthology feature film is really three long segments with my bookend sequences. The only connections among all the stories are the objects that appear, the event itself and the actual “portal,” but everything else was unique to each segment’s story. In terms of production, the only consistencies throughout the anthology were the camera language — that slight hand-held feel — and, of course, the music/sound.

I had to watch the latest cut of the entire anthology film to get my head into that world, but I was given freedom to bring my own style to my sequences. That is exactly the point of an anthology — for each director to bring his or her own sensibilities to the individual segments. Besides Liam, the main producers I worked closely with on this project were Alyssa Devine and Griffin Devine from Pigrat Productions. They are fans of my first feature film, The Beyond, so they really encouraged the grounded tone I had demonstrated in that film.

The portal in Portals.

I’ve been a huge advocate of Blackmagic cameras and technology for a long time. Additionally, I knew I had a lot to shoot in a very short time (two days!), so I needed a camera that was light and flexible yet able to shoot 4K. I brought on cinematographer Colin Emerson, who shoots in a very loose way but always makes his stuff look cinematic. We watched the cut of the film and noticed the consistent loose nature of the cinematography on all the segments. Colin uses the Fig Rig a lot, and I love the way that rig works; the BMD Pocket Cinema Camera 4K fits nicely on it, along with the DSLR lenses he likes to use. The other reason was to be able to use Blackmagic’s new BRaw format.

We also shot the segment using a skeleton crew, which comprised myself as director/producer; VFX supervisor/1st AD John Sellings, who also did some focus pulling; James De Taranto (sound recording); DP/camera op Colin Emerson; FX makeup artists Kate Griffiths and Jay James; and our two actors, Georgina Blackledge and Dare Emmanuel. I worked with both of them on my feature film The Beyond.

The Post
One thing I wanted to make sure of was that the post team at The Institution in LA was able to take my Resolve files and work from them for the picture post. One of the things I did during prep of the project (before we even cast) was to shoot some tests to show what I had in mind in terms of look and feel. We also tested the BRaw and color workflow between my setup in London and the LA team. Colin and I did this during the location recce. This proved to be extremely useful to ensure we set our camera to the exact specs the post house wanted. So we shot at 23.98fps, 4K (4096×1716), 2.39:1 cropped, in Blackmagic Design’s log color space.

HaZ’s segments were captured with the Blackmagic Pocket Cinema Camera.

During the test, I did some quick color tests to show the producers in LA the tone and mood I was going for and to make sure everyone was on board before I shot it. The look was very post apocalyptic, as it’s set after the main events have happened. I wanted the locations to be a contrast with each other, one interior and one exterior with greens.

Colin is used to shooting most of his stuff on the Panasonic GH, but he had the Pocket Cinema Camera and was looking for the right project to use it on. He found he could use all of his usual lenses because the Pocket Cinema Camera has the same mount. Lenses used were the Sigma 18-35mm f/1.8 with Metabones Speedbooster, the Olympus 12mm f/2 and the Lumix 35-100mm f/2.8.

Colin used the onboard monitor screen on the Pocket Cinema Camera, while I used a tethered external monitor — the Ikan DH5e — for directing. We used a 1TB Samsung external SSD securely attached to the rig cage along with a 64GB CFast card. The resolution we shot in was determined by the tests we did. We set up the rushes for post after each of the two days of the shoot, so during the day we would swap out drives and back things up. At the end of the day, we would bring in all the picture and sound rushes and use the amazing autosync feature in Blackmagic DaVinci Resolve to set it all up. This way, when I headed back home I could start editing right away inside Resolve.
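
Resolve also exposes a Python scripting API, so part of that nightly setup can be automated. The sketch below is a hypothetical example — the volume paths and bin names are made up, scripting has to be enabled in Resolve’s preferences, and the audio autosync itself still happened in Resolve’s UI rather than in any script — using documented entry points to pull a day’s rushes into a dated bin:

```python
# Hedged sketch: import a day's picture and sound rushes into a dated bin
# with Resolve's scripting API. Paths and bin names are hypothetical.
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()
media_pool = project.GetMediaPool()
storage = resolve.GetMediaStorage()

DAY = "Day_02"
RUSHES = [
    "/Volumes/SHOOT_SSD/" + DAY + "/BRAW",    # picture rushes (BRaw clips)
    "/Volumes/SHOOT_SSD/" + DAY + "/SOUND",   # sound recordist's WAVs
]

# Make a dated bin and point the media pool at it before importing.
day_bin = media_pool.AddSubFolder(media_pool.GetRootFolder(), DAY)
media_pool.SetCurrentFolder(day_bin)

# Add everything from the day's folders into the current bin.
clips = storage.AddItemListToMediaPool(RUSHES)
print("Imported", len(clips or []), "items into", DAY)
```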

Resolve

I have to admit, we were hesitant at first because I was shooting and capturing log in QuickTime ProRes 4:4:4:4, and I had always avoided DNG raw because of the huge file sizes and data transfer. But the team at Blackmagic has always been so supportive and was with us right up till the end of the shoot, so after testing BRaw I was impressed. We had so much control, as all that information is accessible within Resolve. I was able to set the temp look during editing, and the colorist worked from there. Skin tones were of utmost importance; because of the intimate nature of the drama, I wanted a natural look to the skin tones. I am really happy with the way they came out in the end.

The LA team couldn’t believe how cinematic the footage was when we told them we shot using the Pocket Cinema Camera, since the other segments were shot on cameras like Red. We delivered to the same 4K spec as the other segments in the film.

HaZ on set, second from right.

I used the AMD Radeon RX Vega 56 version of the Blackmagic eGPU. The reason was that I wanted to edit on my MacBook Pro (late 2017) and needed the power to run 4K in realtime. I was so impressed with how much power it provided; it was like having a new MacBook Pro without having to buy one. The eGPU also gave me all the connectivity (two Thunderbolt and four USB 3 ports) I needed — connectivity the MacBook Pro is short on by itself.

The beauty of keeping everything native was that there wasn’t much work to do when porting, as it’s just plug and play. And Resolve detects the eGPU, which you can then set as the default. The BRaw format makes it all so manageable to preview and play back in real time. Also, since it’s native, Resolve doesn’t need to do any transcoding in the background. I have always been a huge fan of the tracking in Resolve, and I was able to do eye effects very easily without them being budgeted or done as VFX shots. I was able to get the VFX render assets from the visual effects artist (Justin Martinez) in LA and do quick slap comps during editing. I love the idea that I can set looks and store them as memories, which I can then recall very quickly to apply to a bunch of shots. This allows me to have a slick-looking preview rough cut of the film.

Portals

I sent a hard drive containing all the organized rushes to the team in LA while I was doing the final tweaks to the edit. Once the edit was signed off, or if any last-minute notes came in, I would address them and email the LA team my Resolve file. It was super simple, and the colorist (Oliver Ojeil) and post team (Chad Van Horn and Danny Barone) in LA appreciated the simple workflow because there really wasn’t any conforming for them to do apart from a one-click relink of media location; they would just take my Resolve file and start working away with it.

We used practical effects to keep the horror as real and grounded as possible and used VFX to augment them further. We were fortunate to be able to get special effects makeup artist Kate Griffiths. Given the tight schedule, she was able to create a terrifying effect, which I won’t give away — you need to watch the film to see it! We had to shoot those makeup-FX-heavy shots at the end of the day, which meant we had to be smart about how we scheduled the shoot given the hours-long makeup process. Kate was also on hand to provide effects like the liquid coming out of the eyes, sweat, etc. — every detail of which the camera picked up for us so we could bring it out in the grade.

The Skype-style shots at the start of the film (phone and computer monitor shots) had their VFX screen elements placed as a separate layer so the post team in LA could grade them separately and control the filters applied on them. For some of the wide shots showing our characters entering and leaving the portal, we keyframed some movement of the 4K shot along with motion blur to give the effect of in-camera movement. I also used the camera shake within Resolve, which comes with so many options to create bespoke movement on static frames.

Portals is now available on iTunes and other VOD platforms.


HaZ Dulull is known for his sci-fi feature films The Beyond and 2036 Origin Unknown, and in television for his pilot and episodes of Disney’s Fast Layne. He is currently busy on projects at various stages of development and production at his production company, hazfilm.com.

Directing Olly’s ‘Happy Inside Out’ campaign

How do you express how vitamins make you feel? Well, production company 1stAveMachine partnered with independent creative agency Yard NYC to develop the stylized “Happy Inside Out” campaign for Olly multivitamin gummies to show just that.

Beauty

The directing duo of Erika Zorzi and Matteo Sangalli, known as Mathery, highlighted the brand’s products and benefits by using rich textures, colors and lighting. They shot on an ARRI Alexa Mini. “Our vision was to tell a cohesive narrative, where each story of the supplements spoke the same visual language,” Mathery explains. “We created worlds where everything is possible and sometimes took each product’s concept to the extreme and other times added some romance to it.”

Each spot imagines various benefits of taking Olly products. The side-scrolling Energy, which features a green palette, shows a woman jumping and doing flips through life’s everyday challenges, from her home to work, doing laundry and going to the movies. Beauty, with its pink color palette, features another woman “feeling beautiful” while turning the heads of a parliament of owls. Meanwhile, Stress, with its purple/blue palette, features a woman tied up in a giant ball of yarn, and as she unspools herself, the things that were tying her up spin away. In the purple-shaded Sleep, a lady lies in bed pulling off layer after layer of sleep masks until she just happily sleeps.

Sleep

The spots were shot with minimal VFX, other than a few greenscreen moments, and the team found itself making decisions on the fly, constantly managing logistics for stunt choreography, animal performances and wardrobe. Jogger Studios provided the VFX using Autodesk Flame for conform, cleanup and composite work. Adobe After Effects was used for all of the end tag animation. Cut+Run edited the campaign.

According to Mathery, “The acrobatic moves and obstacle pieces in the Energy spot were rehearsed on the same day of the shoot. We had to be mindful because the action was physically demanding on the talent. With the Beauty spot, we didn’t have time to prepare with the owls. We had no idea if they would move their heads on command or try to escape and fly around the whole time. For the Stress spot, we experimented with various costume designs and materials until we reached a look that humorously captured the concept.”

The campaign marks Mathery’s second collaboration with Yard NYC and Olly, who brought the directing team into the fold very early on, during the initial stages of the project. This familiarity gave everyone plenty of time to let the ideas breathe.

Behind the Title: Film Editor Edward Line

By Randi Altman

This British editor got his start at Final Cut in London, honing his craft and developing his voice before joining Cartel in Santa Monica.

NAME: Edward Line

COMPANY: Cartel

WHAT KIND OF COMPANY IS CARTEL?
Cartel is an editorial and post company based in Santa Monica. We predominantly service the advertising industry but also accommodate long-form projects and other creative content. I joined Cartel as one of the founding editors in 2015.

CAN YOU GIVE US SOME MORE DETAIL ABOUT YOUR JOB?
I assemble the raw material from a film shoot into a sequence that tells the story and communicates the idea of a script. Sometimes I am involved before the shoot and cut together storyboard frames to help the director decide what to shoot. Occasionally, I’ll edit on location if there is a technical element that requires immediate approval for the shoot to move forward.

Edward Line working on Media Composer

During the edit, I work closely with the directors and creative teams to realize their vision of the script or concept and bring their ideas to life. In addition to picture editing, I incorporate sound design, music, visual effects and graphics into the edit. It’s a collaboration between many departments and an opportunity to validate existing ideas and try new ones.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THE FILM EDITOR TITLE?
A big part of my job involves collaborating with others, working with notes and dealing with tricky situations in the cutting room. Part of being a good editor is having the ability to manage people and ideas while not compromising the integrity and craft of the edit. It’s a skill that I’m constantly refining.

WHAT’S YOUR FAVORITE PART OF THE JOB?
I love being instrumental in bringing creative visions together and seeing them realized on screen, while being able to express my individual style and craft.

WHAT’S YOUR LEAST FAVORITE?
Tight deadlines. Filming with digital formats has allowed productions to shoot more and specify more deliverables. However, providing the editor proportional time to process everything is not always a consideration and can add pressure to the process.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
I am a morning person so I tend to be most productive when I have fresh eyes. I’ve often executed a scene in the first few hours of a day and then spent the rest of the day (and night) fine-tuning it.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I have always had a profound appreciation for design and architecture, and in an alternate universe, I could see myself working in that world.

WHY DID YOU CHOOSE THIS PROFESSION?
I’ve always had ambitions to work in filmmaking and initially worked in TV production after I graduated college. After a few years, I became curious about working in post and found an entry-level job at the renowned editorial company Final Cut in London. I was inspired by the work Final Cut was doing, and although I’d never edited before, I was determined to give editing a chance.

CoverGirl

I spent my weekends and evenings at the office, teaching myself how to edit on Avid Media Composer and learning editing techniques with found footage and music. It was during this experimental process that I fell in love with editing, and I never looked back.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
In the past year I have edited commercials for CoverGirl, Sephora, Bulgari, Carl’s Jr. and Smartcar. I have also cut a short film called Dad Was, which will be submitted to festivals in 2020.

HOW HAVE YOU DEVELOPED NEW SKILLS WHEN CUTTING FOR A SPECIFIC GENRE OR FORMAT?
Cutting music videos allowed me to hone my skills to edit musical performance while telling visual stories efficiently. I learned how to create rhythm and pace through editing and how to engage an audience when there is no obvious narrative. The format provided me with a fertile place to develop my individual editing style and perfect my storytelling skills.

When I started editing commercials, I learned to be more disciplined in visual storytelling, as most commercials are rarely longer than 60 seconds. I learned how to identify nuances in performance and the importance of story beats, specifically when editing comedy. I’ve also worked on numerous films with VFX, animation and puppetry. These films have allowed me to learn about the potential for these visual elements while gaining an understanding of the workflow and process.

More recently, I have been enjoying cutting dialogue in short films. Unlike commercials, this format allows more time for story and character to develop. So when choosing performances, I am more conscious of the emotional signals they send to the audience and overarching narrative themes.

Sephora

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
It’s tough to narrow this down to one project…

Recently, I worked on a commercial for the beauty retailer Sephora that promoted its commitment to diversity and inclusivity. The film Identify As We is a celebration of the non-binary community and features a predominantly transgender cast. The film champions ideas of being different and self-expression while challenging traditional perceptions of beauty. I worked tirelessly with the director and creative team to make sure we treated the cast and footage with respect while honoring the message of the campaign.

I’m also particularly proud of a short film that I edited called Wale. The film was selected for over 30 film festivals across the globe and won several awards. The culmination of the film’s success was receiving a BAFTA nomination and being shortlisted for the 91st Academy Awards for Best Live Action Short Film.

WHAT DO YOU USE TO EDIT?
I work on Avid Media Composer, but I have recently started to flirt with Adobe Premiere. I think it’s good to be adaptable, and I’d hate to restrict my ability to work on a project because of software.

Wale

ARE YOU OFTEN ASKED TO DO MORE THAN EDIT? IF SO, WHAT ELSE ARE YOU ASKED TO DO?
Yes, I usually incorporate other elements such as sound design, music and visual effects into my edits as they can be instrumental to the storytelling or communication of an idea. It’s often useful for the creative team and other film departments to see how these elements contribute to the final film, and they can sometimes inform decisions in the edit.

For example, sound can play a major part in accenting a moment or providing a transition to another scene, so I often spend time placing sound effects and sourcing music during the edit process. This helps me visualize the scene in a broader context and provides new perspective if I’ve become overfamiliar with the footage.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
No surprises, but my smartphone! Apart from the obvious functions, it’s a great place to review edits and source music when I’m on the move. I’ve also recently purchased a Bluetooth keyboard and Wacom tablet, which make for a tidy work area.

I’m also enjoying using my “smart thermostat” at home which learns my behavior and seems to know when I’m feeling too hot or cold.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
After leaving the edit bay, I decompress by listening to music on the way home. Once home, I take great pleasure in cooking for myself, friends and family.

Maryann Brandon's path and editing Star Wars: The Rise of Skywalker

By Amy Leland

In the interest of full disclosure, I have been a fan of both the Star Wars world and the work of J.J. Abrams for a very long time. I saw Star Wars: Episode IV – A New Hope in the theaters with my big brother when I was five years old, and we were hooked. I don't remember a time in my life without Star Wars. And I have been a fan of all of Abrams' work, starting with Felicity. Periodically, I go back and rewatch Felicity, Alias and Lost. I was, in fact, in the middle of Season 2 of Alias and had already purchased my ticket for The Rise of Skywalker when I was assigned this interview.

As a female editor, I have looked up to Maryann Brandon, ACE, and Mary Jo Markey, ACE — longtime Abrams collaborators — for years. A chance to speak with Brandon was more than a little exciting. After getting the fangirl out of my system at the start of the interview, we had a wonderful conversation about her incredible career and this latest Star Wars offering.

After NYU film school and years working in indie film in New York City, Brandon has not only been an important part of J.J. Abrams' world — serving as a primary editor on Alias, and then on Mission: Impossible III, Super 8 and two films each in the Star Trek and Star Wars worlds — but has also edited The Jane Austen Book Club, How to Train Your Dragon and Venom, among others.

Maryann Brandon

Let’s dig a bit deeper with Brandon…

How did your path to editing begin?
I started in college, but I wasn’t really editing. I was just a member of the film society. I was recruited by the NYU Graduate Film program in 1981 because they wanted women in the program. And I thought, it’s that or working on Wall Street, and I wasn’t really that great with the money or numbers. I chose film school.

I had no idea what it was going to be like because I don’t come from a film background or a film family. I just grew up loving films. I ended up spending three years just running around Manhattan, making movies with everyone, and everyone did every job. Then, when I got out of school, I had to finish my thesis film, and there was no one to edit it for me. So I ended up editing it myself. I started to meet people in the business because New York was very close. I got offered a paid position in editing, and I stayed.

I met and worked for some really incredible people along the way. I worked as a second assistant on the Francis Ford Coppola film The Cotton Club. I went from that to working as a first assistant on Richard Attenborough’s version of A Chorus Line. I was sent to London and got swept up in the editing part of it. I like telling stories. It became the thing I did. And that’s how it happened.

Who inspired you in those early days?
I was highly influenced by Dede Allen. She was this matriarch of New York at that time, and I was so blown away by her and her personality. I mean, her work spoke for itself, but she was also this incredible person. I think it’s my nature anyway, but I learned from her early on an approach of kindness and caring. I think that’s part of why I stayed in the cutting room.

On set, things tend to become quite fraught sometimes when you’re trying to make something happen, but the cutting room is this calm place of reality, and you could figure stuff out. She was very influential to me, and she was such a kind, caring person. She cared about everyone in the cutting room, and she took time to talk to everyone.

There was also John Bloom, who was the editor on A Chorus Line. We became very close, and he always used to call me over to see what he was doing. I learned tons from him. In those days, we cut on film, so it was running through your fingers.

The truth is everyone I meet influences me a bit. I am fascinated by each person’s approach and why they see things the way they do.

While your resume is eclectic, you’ve worked on many sci-fi and action films. Was that something you were aiming for, or did it happen by chance?
I was lucky enough to meet J.J. Abrams, and I was lucky enough to get on Alias, which was not something I thought I’d want to do. Then I did it because it seemed to suit me at the time. It was a bit of faith and a bit of, “Oh, that makes sense for you, because you grew up loving Twilight Zone and Star Trek.”

Of course, I’d love to do more drama. I did The Jane Austen Book Club and other films like that. One does tend to get sort of suddenly identified as, now I’m the expert on sci-fi and visual effects. Also, I think because there aren’t a lot of women who do that, it’s probably something people notice. But I’d love to do a good comedy. I’d love to do something like Jumanji, which I think is hilarious.

How did this long and wonderful collaboration with J.J. Abrams get started?
Well, my kids were getting older. It was getting harder and harder for me to go on location with the nanny, the dog, the nanny’s kids, my kids, set up a third grade class and figure out how to do it all. A friend of mine who was a producer on Felicity had originally tried to get me to work on that show. She said, “You’ll love J.J. You’ll love (series creator) Matt Reeves. Come and just meet us.” I just thought television is such hard work.

Then he was starting this new show, Alias. My friend said, “You’re going to love it. Just meet him.” And I did. Honestly, I went to an interview with him, and I spent an hour basically laughing at every joke he told me. I thought, “This guy’s never going to hire me.” But he said, “Okay, I’ll see you tomorrow.” That’s how it started.

What was that like?
Alias was so much fun. I didn’t work on Felicity, which was more of a straightforward drama about a college girl growing up. Alias was this crazy, complicated, action-filled show, but also a girl trying to grow up. It was all of those things. It was classic J.J. It was a challenge, and it was really fun because we all discovered it together. There were three other female editors who are amazing — Mary Jo Markey, Kristin Windell, and Virginia Katz — and there was J.J. and Ken Olin, who was a producer in residence there and director. We just found the show together, and that was really fun.

How has your collaboration with J.J. changed over time?
It’s changed in terms of the scope of a project and what we have to do. And, obviously, the level of conflict and communication is pretty easy because we’ve known each other for so long. There’s not a lot of barriers like, “Hey, I’m trying to get to know you. What do I…?” We just jump right in. Over the years, it’s changed a bit.

On The Rise of Skywalker, I cut this film with a different co-editor. Mary Jo [Markey, Brandon’s longtime co-editor] was doing something else at the time, so I ended up working with Stefan Grube. The way I had worked with Mary Jo was we would divide up the film. She’d do her thing and I’d do mine. But because these films are so massive, I prefer not to divide it up, but instead have both of us work on whatever needs working on at the time to get it done. I proposed this to J.J., and it worked out great. Everything got cut immediately and we got together periodically to ask him what he thought.

Another thing that changed was, because we needed to turn over our visual effects really quickly, I proposed that I cut on the set, on location, when they were shooting. At first J.J. was like, “We’ve never done this before.” I said, “It’s the only way I’m going to get your eyes on sequences,” because by the time the 12-hour day is over, everyone’s exhausted.

It was great and worked out well. I had this little mobile unit, and the joke was it was always within 10 feet of wherever J.J. was. It was also great because I felt like I was part of the crew, and they felt like they could talk to me. I had the DP asking me questions. I had full access to the visual effects supervisor. We worked out shots on the set. Given the fact that you could see what we already had, it really was a game-changer.

What are some of the challenges of working on films that are heavy on action, especially with the Star Wars and Star Trek films and all the effects and CGI?
There’s a scene where they arrive on Exogal, and they’re fighting with each other and more ships are arriving. All of that was in my imagination. It was me going, “Okay, that’ll be on the screen for this amount of time.” I was making up so much of it and using the performances and the story as a guide. I worked really closely with the visual effects people describing what I thought was going to happen. They would then explain that what I thought was going to happen was way too much money to do.

Luckily I was on the set, so I could work it out with J.J. as we went. Sometimes it’s better for me just to build something that I imagine and work off of that, but it’s hard. It’s like having a blank page and then knowing there’s this one element, and then figuring out what the next one will be.

There are people who are incredibly devoted to the worlds of Star Trek and Star Wars and have very strong feelings about those worlds. Does that add more pressure to the process?
I’m a big fan of Star Trek and Star Wars, as is J.J. I grew up with Star Trek, and it’s very different because Star Trek was essentially a week-to-week serial that featured an adventure, and Star Wars is this world where they’re in one major war the whole time.

Sometimes I would go off on a tangent, and J.J. and my co-editor Stefan would be like, “That’s not in the lore,” and I’d have to pull it back and remember that we do serve a fan base that is loyal to it. When I edit anything, I really try to abandon any kind of preconceived thing I have so I can discover things.

I think there’s a lot of pressure to answer to the first two movies, because this is the third, and you can’t just ignore a story that’s been set up, right? We needed to stay within the boundaries of that world. So yeah, there’s a lot of pressure to do that, for sure. One of the things that Chris Terrio and J.J., as the writers, felt very strongly about was having it be Leia’s final story. That was a labor of love for sure. All of that was like a love letter to her.

I don’t know how much of that had been decided before Carrie Fisher (Leia) died. It was my understanding that you had to reconstruct based on things she shot for the other films.
She died before this film was even written, so all of the footage you see is from Episode 7. It’s all been repurposed, and scenes were written around it. Not just for the sake of writing around the footage, but they created scenes that actually work in the context of the film. A lot of what works is due to Daisy Ridley and the other actors who were in the scenes with her. I mean, they really brought her to life and really sold it. I have to say they were incredible.

With two editors co-editing on set during production, you must have needed an extensive staff of assistant editors. How do you work with assistant editors on something of this scale?
I’ve worked with an assistant editor named Jane Tones on the last couple of films. She is amazing. She was the one who figured out how to make the mobile unit work on set. She’s incredibly gifted, both technologically and story-wise. She was instrumental in organizing everything to do with the edit and getting us around. Stefan’s assistant was Warren Paeff, and he is very experienced. We also had a sound person we carried with us and a couple of other assistants. I had another assistant, Ben Cox, who was such a Star Wars fan. When I said, “I’m happy to hire you, but I only have a second assistant position.” He was like, “I’ll take it!”

What advice do you have for someone starting out or who would like to build the kind of career you’ve made?
I would say, try to get a PA job or a job in the cutting room where you really enjoy the people, and pay attention. If you have ideas, don’t be shy but figure out how to express your ideas. I think people in the cutting room are always looking for anyone with an opinion or reaction because you need to step back from it. It’s a love of film, a love of storytelling and a lot of luck. I work really hard, but I also had a lot of good fortune meeting the people I did.


Amy Leland is a film director and editor. Her short film, Echoes, is now available on Amazon Video. She also has a feature documentary in post, a feature screenplay in development, and a new doc in pre-production. She is an editor for CBS Sports Network and recently edited the feature "Sundown." You can follow Amy on Twitter at @amy-leland and on Instagram at @la_directora.

CVLT adds Joe Simons as lead editor

Bi-coastal production studio CVLT, which offers full-service production and post, has added Joe Simons as lead editor. He will be tasked with growing CVLT's editorial department. He edits on Adobe Premiere and will be based in the New York studio.

Simons joins CVLT after three years at The Mill, where he edited the "It's What Connects Us" campaign for HBO, the "Top Artist of the Year" campaign for Spotify and several major campaigns for Ralph Lauren, among many others. Prior to The Mill, he launched his career at PS260 before spending four years at editing house Cut+Run.

Simons’ addition comes at a time when CVLT is growing into a full concept-to-completion creative studio, launching campaigns for top luxury and fashion brands, including Lexus, Peloton and Louis Vuitton.

“Having soaked up everything I could at The Mill and Cut+Run, it was time for me to take that learning and carve my own path,” says Simons.

Uncut Gems directors Josh and Benny Safdie

By Iain Blair

Filmmakers Josh and Benny Safdie have been on the verge of the big time since they started making their own distinctive brand of cinema: one full of anxiety, brashness, untamed egos and sweaty palms. They’ve finally done it with A24’s Uncut Gems.

Following their cinema verité Heaven Knows What — with its look at the New York City heroin subculture — and the crime thriller Good Time, the Safdies return to the mean streets of New York City with their latest, Uncut Gems. The film is a twisty, tense tale that explores the tragic sway of fortune, family and fate.

The Safdies on set.

It stars Adam Sandler in a career-defining performance as Howard Ratner, a profane, charismatic New York City jeweler who’s always on the lookout for the next big score. When he makes a series of high-stakes bets that could lead to the windfall of a lifetime, Howard must perform a high-wire act by balancing business, family and encroaching adversaries on all sides.

Uncut Gems combines relentless pacing with gritty visuals, courtesy of DP Darius Khondji, and a score from Brooklyn-based experimental composer Daniel Lopatin.

In the tradition of ‘70s urban thrillers by Sidney Lumet, William Friedkin and Martin Scorsese (who produced, along with Scott Rudin), the film creates an authentic tapestry of indelible faces, places, sounds and moods.

Behind the camera, the Safdies also assembled a stellar team of go-to collaborators that included co-editor Ronald Bronstein and production designer Sam Lisenco.

I recently sat down with the Safdies, whose credits include Daddy Longlegs, Lenny Cooke and The Pleasure of Being Robbed, to talk about making the film (which is generating awards buzz) and their workflow.

What sort of film did you set out to make?
Josh Safdie: The exact one you see on the screen. It changed a lot along the way, but the cosmic vibe of it and the mélange of characters who don't seem to fit together but do on this journey where we are all on this colorful rock that might as well be a colorful uncut gem – it was all there in the original idea. It's pulpy, cosmic, funny, tense, and it's what we wanted to do.

Benny Safdie: We have veteran actors, first-time actors and non-professionals in the cast, working alongside people we love so much. It’s great to see it all come together like it did.

How tough was it getting Adam Sandler, as I heard he initially passed on it?
Josh: He did. We sent it to him back in 2012, and I’m not sure it even got past “the moat,” as we call it. But once he saw our work, he immediately responded; he called us right after seeing Good Time. The irony is, one of his favorites was Daddy Longlegs, which we’d tried to approach him with. Once we actually made contact and started talking, it was instantly a strong kinship.

Benny: Now it’s like a deep friendship. He really got our need to dig deep on who this character is, and he put in the time and the care.

Any surprises working with him?
Josh: What’s funny is, we had a bunch of jokes written for him, and he then ad-libbed so many little things. He made us all smile every day.

What did he bring to the role of Howard, who could easily be quite unlikeable?
Josh: Exactly, and Adam brought that likeability in a way only he can. We had the benefit of following up his 50-city standup tour, where he did three hours of material every night, and we had a script loaded with dialogue. His mind was so sharp, so by the time he did this — and we were giving him new pages over lunch sometimes — he could just ingest them and regurgitate them and go out on a limb and try out a new joke, and then come back to the dialogue. He was so well oiled in the character that it was second nature to him.

Benny: And you root for him. You want him to succeed. Adam pushed us on stuff, like the family and the kids. He knew it was important to show those relationships, that audiences would want to see and feel that. And he wanted to create a very complicated person. Yes, Howard’s doing some bad things, but you want him to get there.

Was it difficult getting former pro basketball player Kevin Garnett to act in the film?
Josh: It’s always tough when someone is very successful in their own field. When you try to convince them to do acting, they know it’s a lot of work and they don’t need the money, and you’re asking them to play a version of themselves — and there’s the huge time commitment. But Kevin really committed, and he came back a bunch to shoot scenes we needed.

You’re well known for your run-and-gun, guerilla-style of shooting. Was that how you shot this film?
Josh: Yeah, a lot of locations, but we built the office sets. And we got permits for everything.

Benny: But we kept the streets open and mixed in the 80 SAG actors in the background.

How does it work on the set in terms of co-directing?
Josh: On a technical level, we’ll work with our DP on placing the camera. It was a bit different this time since Benny wasn’t also acting, like he did in Good Time. We were co-directing and getting that much closer to the action; you see different parts of a performance that way, and we have each other’s backs. We are able to put our heads together and get a really full picture of what’s happening on set. And if one of us talks to somebody, it’s always coming from both of us. We’ve been working together since we were kids, and we have a huge amount of trust in each other.

The way characters talk over each other, and then all the background chatter, reminded me a lot of Robert Altman and his approach.
Benny: Thanks. He was a huge influence on us. It's using sound as it's heard in real life. We heard this great story about Altman and the film McCabe and Mrs. Miller. About 15 minutes into the premiere, Warren Beatty turned to Altman and asked, "Does the whole movie sound like this?" And Altman replied excitedly, "Yeah!" He was so far ahead of his time, and that's what we tried to emulate.

What’s so great about Altman is that he saw life as a film, and he tried to get the film to ride up parallel to life in that sense. We ended up writing 45 extra pages of dialogue recording — just for the background. Scott Rudin was like, “You wrote a whole other script for background people?” We’d have a character there just to say one line, but it added all these extra layers.

Josh: On top of the non-stop dialogue, Howard’s a real loudmouth; he hears everything. Our post sound team was very special, and it was very educational for us. We began with Oscar-winning re-recording mixer Tom Fleischman, but then he had to go off to do The Irishman, so Skip Lievsay took over. Then Warren Shaw came on, and we worked with the two of them for a very long time.

Thanks to our producers, we had the time to really get in there and go deeper and deeper. I’d say the soundscape they built in Dolby Atmos really achieved something like life, and it also had areas for music and sound design that are so meticulous and rich that we’d watch the movie without the dialogue.

Where did you do the post?
Benny: All in New York. For sound, we started off at Soundtrack and then went to Warner Bros. Sound. We edited at our company offices with co-editor Ronny Bronstein. Brainstorm Digital did most of the crazy visual effects. We worked closely with them and on the whole idea of, “What does the inside of a gemstone look like?”

How does editing work with Ronny?
Josh: He’s often on the set with us, but we didn’t cut a frame until we sat down after the shoot and watched it all. I think that kept it fresh for us. Our assistant editor developed binders with all the script and script supervisor notes, and we didn’t touch it once during the edit. I think coming from documentaries, and that approach to the material, has informed all our editing. You look at what’s in front of you, and that’s what you use to make your film. Who cares what the script says!

One big challenge was the sheer amount of material, even though we only shot for 35 days — that includes the African unit. We had so many setups and perspectives, in things like the auction and the Seder scenes, but the scene we spent the most time writing and editing was the scene between Howard and Kevin in the back room… and we had the least time to shoot it — just over three hours.

L-R: Benny Safdie, Iain Blair and Josh Safdie.

You have a great score by your go-to composer Daniel Lopatin, who records as Oneohtrix Point Never.
Josh: We did the score at his studio in Brooklyn. It’s really another main character, and he did a great job as usual.

The DI must have been vital?
Josh: Yes, and we did all the color at The Mill with colorist Damien van der Cruyssen, who's really great and also ran our dailies. Darius likes to spend a lot of time in the DI experimenting and finding the look, so we ended up doing about a month on it. Usually, we get just four days.

What’s next? A big studio movie?
Josh: Maybe, but we don’t want to abandon what we’ve got going right now. We know what we want. People offer us scripts but I can’t see us doing that.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

A Beautiful Day in the Neighborhood director Marielle Heller

By Iain Blair

If you are of a certain age, the red cardigan, the cozy living room and the comfy sneakers can only mean one thing — Mister Rogers! Sony Pictures’ new film, A Beautiful Day in the Neighborhood, is a story of kindness triumphing over cynicism. It stars Tom Hanks and is based on the real-life friendship between Fred Rogers and journalist Tom Junod.

Marielle Heller

In the film, jaded writer Lloyd Vogel (Matthew Rhys), whose character is loosely based on Junod, is assigned a profile of Rogers. Over the course of his assignment, he overcomes his skepticism, learning about empathy, kindness and decency from America’s most beloved neighbor.

A Beautiful Day in the Neighborhood is helmed by Marielle Heller, who most recently directed the film Can You Ever Forgive Me? and whose feature directorial debut was 2015’s The Diary of a Teenage Girl. Heller has also directed episodes of Amazon’s Transparent and Hulu’s Casual.

Behind the scenes, Heller collaborated with DP Jody Lee Lipes, production designer Jade Healy, editor Anne McCabe, ACE, and composer Nate Heller.

I recently spoke with Heller about making the film, which is generating a lot of Oscar buzz, and her workflow.

What sort of film did you set out to make?
I didn’t want to make a traditional biopic, and part of what I loved about the script was it had this larger framing device — that it’s a big episode of Mister Rogers for adults. That was very clever, but it’s also trying to show who he was deep down and what it was like to be around him, rather than just rattling off facts and checking boxes. I wanted to show Fred in action and his philosophy. He believed in authenticity and truth and listening and forgiveness, and we wanted to embody all that in the filmmaking.

It couldn’t be more timely.
Exactly, and it’s weird since it’s taken eight years to get it made.

Is it true Tom Hanks had turned this down several times before, but you got him in a headlock and persuaded him to do it?
(Laughs) The headlock part is definitely true. He had turned it down several times, but there was no director attached. He’s the type of actor who can’t imagine what a project will be until he knows who’s helming it and what their vision is.

We first met at his grandkid’s birthday party. We became friends, and when I came on board as director, the producers told me, “Tom Hanks was always our dream for playing Mister Rogers, but he’s not interested.” I said, “Well, I could just call him and send him the script,” and then I told Tom I wasn’t interested in doing an imitation or a sketch version, and that I wanted to get to his essence right and the tone right. It would be a tightrope to walk, but if we could pull it off, I felt it would be very moving. A week later he was like, “Okay, I’ll do it.” And everyone was like, “How did you get him to finally agree?” I think they were amazed.

What did he bring to the role?
Maybe people think he just breezed into this — he’s a nice guy, Fred’s a nice guy, so it’s easy. But the truth is, Tom’s an incredibly technically gifted actor and one of the hardest-working ones I’ve ever worked with. He does a huge amount of research, and he came in completely prepared, and he loves to be directed, loves to collaborate and loves to do another take if you need it. He just loves the work.

Any surprises working with him?
I just heard that he’s actually related to Fred, and that’s another weird thing. But he truly had to transform for the role because he’s not like Fred. He had to slow everything down to a much slower pace than is normal for him and find Fred’s deliberate way of listening and his stillness and so on. It was pretty amazing considering how much coffee Tom drinks every day.

What did Matthew Rhys bring to his role?
It’s easy to forget that he’s actually the protagonist and the proxy for all the cynicism and neuroticism that many of us feel and carry around. This is what makes it so hard to buy into a Mister Rogers world and philosophy. But Matthew’s an incredibly complex, emotional person, and you always know how much he’s thinking. He’s always three steps ahead of you, he’s very smart, and he’s not afraid of his own anger and exploring it on screen. I put him through the ringer, as he had to go through this major emotional journey as Lloyd.

How important was the miniature model, which is a key part of the film?
It was a huge undertaking, but also the most fun we had on the movie. I grew up building miniatures and little cities out of clay, so figuring it all out — What’s the bigger concept behind it? How do we make it integrate seamlessly into the story? — fascinated me. We spent months figuring out all the logistics of moving between Fred’s set and home life in Pittsburgh and Lloyd’s gritty, New York environment.

While we shot in Pittsburgh, we had a team of people spend 12 weeks building the detailed models that included the Pittsburgh and Manhattan skylines, the New Jersey suburbs, and Fred’s miniature model neighborhood. I’d visit them once a week to check on progress. Our rule of thumb was we couldn’t do anything that Fred and his team couldn’t do on the “Neighborhood,” and we expanded a bit beyond Fred’s miniatures, but not outside of the realm of possibility. We had very specific shots and scenes all planned out, and we got to film with the miniatures for a whole week, which was a delight. They really help bridge the gap between the two worlds — Mister Rogers’ and Lloyd’s worlds.

I heard you shot with the same cameras the original show used. Can you talk about how you collaborated with DP Jody Lee Lipes, to get the right look?
We tracked down original Ikegami HK-323 cameras, which were used to film the show, and shipped them in from England and brought them to the set in Pittsburgh. That was huge in shooting the show and making it even more authentic. We tried doing it digitally, but it didn’t feel right, and it was Jody who insisted we get the original cameras — and he was so right.

Where did you post?
We did it in New York — the editing at Light Iron, the sound at Harbor and the color at Deluxe.

Do you like the post process?
I do, as it feels like writing. There's always a bit of a comedown for me after production, which is so fast-paced. You really slow down for post; it feels a bit like screeching to a halt for me, but the plus is you get back to the deep critical thinking needed to rewrite in the edit, and to retell the story with the sound and the DI and so on.

I feel very strongly that the last 10% of post is the most important part of the whole process. It’s so tempting to just give up near the end. You’re tired, you’ve lost all objectivity, but it’s critical you keep going.

Talk about editing with Anne McCabe. What were the big editing challenges?
She wasn’t on the set. We sent dailies to her in New York, and she began assembling while we shot. We have a very close working relationship, so she’d be on the phone immediately if there were any concerns. I think finding the right tone was the biggest challenge, and making it emotionally truthful so that you can engage with it. How are you getting information and when? It’s also playing with audiences’ expectations. You have to get used to seeing Tom Hanks as Mister Rogers, so we decided it had to start really boldly and drop you in the deep end — here you go, get used to it! Editing is everything.

There are quite a few VFX. How did that work?
Obviously, there’s the really big VFX sequence when Lloyd goes into his “fever dreams” and imagines himself shrunk down on the set of the neighborhood and inside the castle. We planned that right from the start and did greenscreen — my first time ever — which I loved. And even the practical miniature sets all needed VFX to integrate them into the story. We also had seasonal stuff, period-correct stuff, cleanup and so on. Phosphene in New York did all the VFX.

Talk about the importance of sound and music.
My composer’s also my brother, and he starts very early on so the music’s always an integral part of post and not just something added at the end. He’s writing while we shoot, and we also had a lot of live music we had to pre-record so we could film it on the day. There’s a lot of singing too, and I wanted it to sound live and not overly produced. So when Tom’s singing live, I wanted to keep that human quality, with all the little mouth sounds and any mistakes. I left all that in purposely. We never used a temp score since I don’t like editing to temp music, and we worked closely with the sound guys at Harbor in integrating all of the music, the singing, the whole sound design.

How important is the DI to you?
Hugely important, and we finessed a lot with colorist Sam Daley. When you're doing a period piece, color is so crucial – it has to feel authentic to that world. Jody and Sam have worked together for a long time, and they worked very hard on the LUT before we began. Every department was aware of the color palette and how we wanted it to look and feel.

What’s next?
I just started a new company called Defiant By Nature, where I’ll be developing and producing TV projects by other people. As for movies, I’m taking a little break.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Review: The Sensel Morph hardware interface

By Brady Betzel

As an online editor and colorist, I have tried a lot of hardware interfaces designed for apps like Adobe Premiere, Avid Media Composer, Blackmagic DaVinci Resolve and others. With the exception of professional color correction surfaces like the FilmLight Baselight, the Resolve Advanced Panel and Tangent’s Element color correction panels, it’s hard to get exactly what I need.

While they typically work well, there is always a drawback for my workflow; usually they are missing one key shortcut or feature. Enter Sensel Morph, a self-proclaimed morphable hardware interface. In reality, it is a pressure-sensitive trackpad that uses individually purchasable magnetic rubber overlays and keys for a variety of creative applications. It can also be used as a pressure-sensitive trackpad without any overlays.

For example, inside of the Sensel app you can identify the Morph as a trackpad and click “Send Map to Morph,” and it will turn itself into a large trackpad. If you are a digital painter, you can turn the Morph into “Paintbrush Area” and use a brush and/or your fingers to paint! Once you understand how to enable the different mappings you can quickly and easily Morph between settings.

For this review, I am going to focus on how you can use the Sensel Morph with Adobe Premiere Pro. For the record, you can actually use it with any NLE by creating your own map inside of the Sensel app. The Morph essentially works by sending keyboard shortcuts to the NLE. With that in mind, if you customize your keyboard shortcuts, you are going to want to either enable the default mapping inside of Premiere or adjust your settings to match the Sensel Morph's.

Before you plug in your Morph, head over to https://sensel.com/pages/support, where you can get a quick-start guide as well as the Sensel app, which you will need to install before you get working. After it's downloaded and installed, plug in the Morph via USB and let it charge before using the Bluetooth connection. It took a while for the Morph to fully charge, about two hours, but once I installed the Sensel app, added the Video Editing Overlay and opened Adobe Premiere, I was up and working.

To be honest, I was a little dubious about the Sensel Morph. A lot of these hardware interfaces have come across my desk, and they usually have poor software implementation, or the hardware just doesn’t hold up. But the Sensel Morph broke through my preconceived ideas of hardware controllers for NLEs like Premiere, and for the first time in a long time, I was inspired to use Premiere more often.

It’s no secret that I learned professional editing in Avid Media Composer and Symphony. And most NLEs can’t quite rise to the level of professional experience that I have experienced in Symphony. One of those experiences is how well and fluid the keyboard and Wacom tablet work together. The first time I plugged in the Sensel Morph, overlayed the Video Editing Overlay on top of the Morph and opened Premiere, I began to have that same feeling but inside of Premiere!

While there are still things Premiere has issues with, the Sensel Morph really got me feeling good about how well this Adobe NLE worked. And to be honest, some of those issues relate to me not learning Premiere's keyboard shortcuts like I did in Avid. The Sensel Morph felt like a natural addition to my Premiere editing workflow. It was the first time I started to feel that "flow state" inside of Premiere that I previously got into when using Media Composer or Symphony, and I started trimming and editing like a madman. It was kind of shocking to me.

You may be thinking that I am blowing this out of proportion, and maybe I am, a little, but the Morph immediately improved my lazy Premiere editing. In fact, I told someone that Adobe should bundle these for first-time Premiere users.

I really like the way the timeline navigation works (much like the Touch Bar). I also like the quick Ripple Left/Right commands, and I like how you can quickly switch timelines by pressing the "Timeline" button multiple times to cycle through them. I did feel like I needed the mouse and keyboard some of the time, but for about 60% of the time I could edit without them. Much like how I had to force myself to use a Wacom tablet for editing, if you try not to use a mouse, I think you will get by just fine. I did try to use a Wacom stylus with the Sensel Morph and, unfortunately, it did not work.

What improvements could the Sensel Morph make? Specifically in Premiere, I wish there were a full-screen shortcut ("`") labeled on the Morph. It's one of those shortcuts I use all the time, whether I want to see my timeline full screen, the effects controls full screen or the Program feed full screen. And while I know I could program it using the Sensel app, the OCD in me wants to see that reflected on the keys. While we are on the subject of keys, or rather the overlay, I do find it a little hard to use when I customize the key presses. Maybe ordering a custom-printed overlay could assuage this concern.

One thing I found odd was the GPU usage that the Sensel app needed. My laptop’s fans were kicking on, so I opened up Task Manager and saw that the Sensel app was taking 30% of my Nvidia RTX 2080. Luckily, you really only need it open when changing overlays or turning it into a trackpad, but I found myself leaving it open by accident, which could really hurt performance.

Summing Up
In the end, is the Sensel Morph really worth the $249? The purchase price includes one free overlay of your choice and a one-year warranty, but additional overlays will set you back $35 to $59, depending on the overlay.

The Video Editing overlay is $35, while the new Buchla Thunder overlay is $59. There are several other options to choose from, including traditional Keyboard, Piano Key, Music Production and even Drum Pad overlays. If you are a one-person band who moves between Premiere and apps like Ableton, then it's 100 percent worth it. If you use Premiere a lot, I still think it is worth it. The iPad Mini-like size and weight are really nice, and when using it over Bluetooth, you feel untethered. Its sleek, thin design really allows you to bring this morphable hardware interface with you anywhere you take your laptop or tablet.

The Sensel Morph is not like any of the other hardware interfaces I have used. Not only is it extremely mobile, but it works well and is compatible with a lot of content creation apps that pros use daily. They really delivered on this one.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Ford v Ferrari’s co-editors discuss the cut

By Oliver Peters

After a failed attempt to acquire European carmaker Ferrari, an outraged Henry Ford II sets out to trounce Enzo Ferrari on his own playing field — automobile endurance racing. That is the plot of 20th Century Fox's Ford v Ferrari, directed by James Mangold. When Ford's initial effort falls short, he turns to independent car designer Carroll Shelby (Matt Damon). Shelby's outspoken lead test driver Ken Miles (Christian Bale) complicates the situation by making an enemy out of Ford senior VP Leo Beebe.

Michael McCusker

Nevertheless, Shelby and his team are able to build one of the greatest race cars ever — the GT40 MkII — setting up a showdown between the two auto legends at the 1966 24 Hours of Le Mans.

The challenge of bringing this clash of personalities to the screen was taken on by director James Mangold (Logan, Wolverine, 3:10 to Yuma) and his team of long-time collaborators.

I recently spoke with film editors Michael McCusker, ACE, (Walk the Line, 3:10 to Yuma, Logan) and Andrew Buckland (The Girl On the Train) — both of whom were recently nominated for an Oscar and ACE Eddie Award for their work on the film — about what it took to bring Ford v Ferrari together.

The post team for this film has worked with James Mangold on quite a few films. Tell me a bit about the relationship.
Michael McCusker: I cut my very first movie, Walk the Line, for Jim 15 years ago and have since cut his last six movies. I was the first assistant editor on Kate & Leopold, which was shot in New York in 2001. That’s where I met Andrew, who was hired as one of the local New York film assistants. We became fast friends. Andrew moved to LA in 2009, and I hired him to assist me on Knight & Day.

Andrew Buckland

I always want to keep myself available for Jim — he chooses good material, attracts great talent and is a filmmaker who works across multiple genres. Since I’ve worked with him, I’ve cut a musical movie, a western, a rom-com, an action movie, a straight-up superhero movie, a dystopian superhero movie and now a racing film.

As a film editor, it must be great not to get typecast for any particular cutting style.
McCusker: Exactly. I worked for David Brenner for years as his first. He was able to cross genres, and that’s what I wanted to do. I knew even then that the most important decisions I would make would be choosing projects. I couldn’t have foreseen that Jim was going to work across all these genres — I simply knew that we worked well together and that the end product was good.

In preparing for Ford v Ferrari, did you study any other recent racing films, like Ron Howard’s Rush?
McCusker: I saw that movie, and liked it. Jim was aware of it, too, but I think he wanted to do something a little more organic. We watched a lot of older racing films, like Steve McQueen’s Le Mans and John Frankenheimer’s Grand Prix.

Jim’s original intention was to play the racing in long takes and bring the audience along for the ride. As he was developing the script, and we were in preproduction, it became clear that there was more drama for him to portray during the racing sequences than he anticipated. So the races took on more of an energized pace.

Energized in what way? Do you mean in how you cut it or in a change of production technique, like more stunt cameras and angles?
McCusker: I was fortunate to get involved about two-and-a-half months prior to the start of production. We were developing the Le Mans race in previs. This required a lot of editing and discussions about shot design and figuring out what the intercutting was going to be during that sequence, which is like the fourth act of the movie.

You’re dealing with Mollie and Peter [Miles’ wife and son] at home watching the race, the pit drama, what’s going on with Shelby and his crew, with Ford and Leo Beebe and also, of course, what’s going on in the car with Ken. It’s a three-act movie unto itself, so Jim was trying to figure out how it was all going to work before he had to shoot it. That’s where I came in. The frenetic pace of Le Mans was more a part of the writing process — and part of the writing process was the previs. The trick was how to make sure we weren’t just following cars around a track. That’s where redundancy can tend to beleaguer an audience in racing movies.

What was the timeline for production and post?
McCusker: I started at the end of May 2018. Production began at the beginning of August and went all the way through to the end of November. We started post in earnest at the beginning of November of last year, took some time off for the holidays, and then showed the film to the studios around February or March.

When did you realize you were going to need help?
McCusker: The challenge was that there was going to be a lot of racing footage, which meant there was going to be a lot of footage overall. I knew I was going to need a strong co-editor, so Andrew was the natural choice. He had been cutting on his own and cutting with me over the years. We share a common approach to editing and have a similar aesthetic.

There was a point when things got really intense and we needed another pair of hands, so I brought in Dirk Westervelt to help out for a couple of months. That kept our noses above water, but the process was really enjoyable. We were never in a crisis mode. We got a great response from preview audiences and, of course, that calms everybody down. At that point it was just about quality control and making sure we weren’t resting on our laurels.

How long was your initial cut, and what was your process for trimming the film down to the present run time?
McCusker: We’re at 2:30:00 right now and I think the first cut was 3:10 or 3:12. The Le Mans section was longer. The front end of the movie had more scenes in it. We ended up lifting some scenes and rearranging others. Plus, the basic trimming of scenes brought the length down.

But nothing was the result of a panic, like, "Oh my God, we've got to get to 2:30!" There were no demands by the studio or any pressures we placed upon ourselves to hit a particular running time. I like to say that there's real time and there's cinematic time. You can watch Once Upon a Time in America, which runs 3:45, and it feels like an hour. Or you can watch an 89-minute movie and feel like it's drudgery. We just wanted to make sure we weren't overstaying our welcome.

How extensively did you rearrange scenes during the edit? Or did the structure of the film stay pretty much as scripted?
McCusker: To a great degree it stayed as scripted. We had some scenes in the beginning that we felt were a little bit tangential and weren’t serving the narrative directly, and those were cut.

The real endeavor of this movie starts the moment that these two guys [Shelby and Miles] decide to tackle the challenge of developing this car. There’s a scene where Miles sees the car for the first time at LAX. We understood that we had to get to that point in a very efficient way, but also set up all the other characters — their motives and their desires.

It’s an interesting movie, because it starts off with a lot of characters. But then it develops into a movie about two guys and their friendship. So it goes from an ensemble piece to being about Ken and Carroll, while at the same time the scope of the movie is opening up and becoming larger as the racing is going on. For us, the trickiest part was the front end — to make sure we spent enough time with each character so that we understood them, but not so much time that audience would go, “Enough already! Get on with it!”

Did that help inform your cutting style for this film?
McCusker: I don’t think so. Where it helped was knowing the sound of the broadcasters and race announcers. I liked Chris Economaki and Jim McKay — guys who were broadcasting the races when I was a kid. I was intrigued about how they gave us the narrative of the race. It came in handy while we were making this movie, because we were able to get our hands on some of Jim McKay’s actual coverage of Le Mans and used it in the movie. That brings so much authenticity.

Let’s talk sound. I would imagine the sound design was integral to your rough cuts. How did you tackle that?
Andrew Buckland: We were fortunate to have the sound team on very early during preproduction. We were cutting in a 5.1 environment, so we wanted to create sound design early. The engine sounds might not have been the exact sounds that would end up in the final mix, but they were adequate to let you experience the scenes as intended. Because we needed to get Jim's response early, some of the races were cut with the production sound — from the live mics during filming. This allowed Jim and us to quickly see how the scenes would flow.

Other scenes were cut strictly MOS because the sound design would have been way too complicated for the initial cut of the scene. Once the scene was cut visually, we’d hand over the scene to sound supervisor Don Sylvester, who was able to provide us with a set of 5.1 stems. That was great, because we could recut and repurpose those stems for other races.

McCusker: We had developed a strategy with Don to split the sound design into four or five stems to give us enough discrete channels to recut these sequences. The stems were a palette of interior perspectives, exterior perspectives, crowds, car-bys, and so on. By employing this strategy, we didn’t need to continually turn over the cut to sound for patch-up work.

Then, as Don went out and recorded the real cars and was developing the actual sounds for what was going to be used in the mix, he’d generate new stems and we would put them into the Media Composer. This was extremely informative to Jim, because he could experience our Avid temp mix in 5.1 and give notes, which ultimately informed the final sound design and the mix.

What about temp music? Did you also weave that into your rough cuts?
McCusker: Ted Caplan, our music editor, has also worked with Jim for 15 years. He’s a bit of a renaissance man — a screenwriter, a novelist, a one-time musician and a sound designer in his own right. When he sits down to work with music, he’s coming at it from a story point-of-view. He has a very instinctual knowledge of where music should start, and it happens to dovetail into the aesthetic that Jim, Andrew, and I are working toward. None of us like music to lead scenes in a way that anticipates what the scene is going to be about before you experience it.

For this movie, it was challenging to develop what the musical tone of the movie would be. Ted was developing the temp track along with us from a very early stage. We found over time that not one particular musical style was going to work. This is a very complex score. It includes a kind of surf-rock sound with Carroll Shelby in LA, an almost jaunty, lounge jazz sound for Detroit and the Ford executives, and then the hard-driving rhythmic sound for the racing.

The final score was composed by Marco Beltrami and Buck Sanders.

I presume you were housed in multiple cutting rooms at a central facility.
McCusker: We cut at 20th Century Fox, where Jim has a large office space. We cut Logan and Wolverine there before this movie. It has several cutting spaces and I was situated between Andrew and Don. Ted was next to Don and John Berri, our additional editor. Assistants were right around the corner. It makes for a very efficient working environment.

Since the team was cutting with Avid Media Composer, did any of its features stand out to you for this film?
Both: FluidMorph! (laughing)

McCusker: FluidMorph, speed-ramping — we often had to manipulate the shot speeds to communicate the speed of the cars. A lot of these cars were kit cars that could drive safely at a certain speed for photography, but not at race speed. So we had to manipulate the speed a lot to get the sense of action that these cars have.

What about Avid’s ScriptSync? I know a lot of narrative editors love it.
McCusker: I used ScriptSync once a few years ago and I never cut a scene faster. I was so excited. Then I watched it, and it was terrible. To me there’s so much more to editing than hitting the next line of dialogue. I’m more interested in the lines between the lines — subtext. I do understand the value of it in certain applications. For instance, I think it’s great on straight comedy. It’s helpful to get around and find things when you are shooting tons of coverage for a particular joke. But for me, it’s not something I lean on. I mark up my own dailies and find stuff that way.

Tell me a bit more about your organizational process. Do you start with a Kem roll or stringouts of selected takes?
McCusker: I don’t watch dailies, at least in a traditional sense. I don’t start in the morning, watch the dailies and then cut. And I don’t ask my assistants to organize any of my dailies in bins. I come in and grab the scene that I have in front on me. I’ll look at the last take of every set-up quickly and then I spend an enormous amount of time — particularly on complex scenes — creating a bin structure that I can work with.

Sometimes it’s the beats in a scene, sometimes I organize by shot size, sometimes by character — it depends on what’s driving the scene. I learn my footage by organizing it. I remember shot sizes. I remember what was shot from set-up to set-up. I have a strong visual memory of where things are in a bin. So, if I ask an assistant to do that, then I’m not going to remember it. If there are a lot of resets or restarts in a take, I’ll have the assistant mark those up. But, I’ll go through and mark up beats or pivotal points in a scene, or particularly beautiful moments, and then I’ll start cutting.

Buckland: I’ve adopted a lot of Mike’s methodology, mainly because I assisted Mike on a few films. But it actually works for me, as well. I have a similar aesthetic to Mike.

Was this shot digitally?
McCusker: It was primarily shot with ARRI Alexa 65 LFs, plus some other small-format cameras. A lot of it was shot with old anamorphic lenses on the Alexa that allowed them to give it a bit of a vintage feeling. It’s interesting that as you watch it, you see the effect of the old lenses. There’s a fall-off on the edges, which is kind of cool. There were a couple of places where the subject matter was framed into the curve of the lens, which affects the focus. But we stuck with it, because it feels “of the time.”

Since the film takes place in the 1960s and has a lot of racing sequences, I assume there are a lot of VFX?
McCusker: The whole movie is a period film and we would temp certain things in the Avid for the rough cuts. John Berri was wrangling visual effects. He’s a master in the Avid and also Adobe After Effects. He has some clever ways of filling in backgrounds or greenscreens with temp elements to give the director an idea of what’s going to go there. We try to do as much temp work in the Avid as we are capable of doing, but there’s so much 3D visual effects work in this movie that we weren’t able to do that all of the time.

The racing is real. The cars are real. The visual effects work was for a lot of the backgrounds. The movie was shot almost entirely in Los Angeles, with some second unit footage shot in Georgia. The modern-day Le Mans track isn’t at all representative of what Le Mans was in 1966, so there was no way to shoot that. Everything had to be doubled and then augmented with visual effects. In addition to Georgia, where they shot most of the actual racing for Le Mans, they went to France to get some shots of the actual town of Le Mans. I think only about four of those shots are left. (laughs)

Any final thoughts about how this film turned out?
McCusker: I’m psyched that people seem to like the film. Our concern was that we had a lot of story to tell. Would we wear audiences out? We continually have people tell us, “That was two and a half hours? We had no idea.” That’s humbling for us and a great feeling. It’s a movie about these really great characters with great scope and great racing. You can put all the big visual effects in a film that you want to, but it’s really about people.

Buckland: I agree. It’s more of a character movie with racing. Also, because I am not a racing fan per se, the character drama really pulled me into the film while working on it.


Oliver Peters is an experienced film and commercial editor/colorist. In addition, he regularly interviews editors for trade publications. He may be contacted through his website at oliverpeters.com.

The 70th annual ACE Eddie Award nominations

The American Cinema Editors (ACE), the honorary society of the world’s top film editors, has announced its nominations for the 70th Annual ACE Eddie Awards recognizing outstanding editing in 11 categories of film, television and documentaries.

For the first time in ACE’s history, three foreign-language films are among the nominees (The Farewell, I Lost My Body and Parasite), despite there being no specific category for films predominantly in a foreign language.

Winners will be revealed during a ceremony on Friday, January 17, at the Beverly Hilton Hotel, presided over by ACE president Stephen Rivkin, ACE. Final ballots open December 16 and close on January 6.

Here are the nominees:

BEST EDITED FEATURE FILM (DRAMA):
Ford v Ferrari
Michael McCusker, ACE & Andrew Buckland

The Irishman
Thelma Schoonmaker, ACE

Joker 
Jeff Groth

Marriage Story
Jennifer Lame, ACE

Parasite
Jinmo Yang

BEST EDITED FEATURE FILM (COMEDY):
Dolemite is My Name
Billy Fox, ACE

The Farewell
Michael Taylor & Matthew Friedman

Jojo Rabbit
Tom Eagles

Knives Out
Bob Ducsay

Once Upon a Time in Hollywood
Fred Raskin, ACE

BEST EDITED ANIMATED FEATURE FILM:
Frozen 2
Jeff Draheim, ACE

I Lost My Body
Benjamin Massoubre

Toy Story 4
Axel Geddes, ACE

BEST EDITED DOCUMENTARY (FEATURE):
American Factory
Lindsay Utz

Apollo 11
Todd Douglas Miller

Linda Ronstadt: The Sound of My Voice
Jake Pushinsky, ACE & Heidi Scharfe, ACE

Making Waves: The Art of Cinematic Sound
David J. Turner & Thomas G. Miller, ACE

BEST EDITED DOCUMENTARY (NON-THEATRICAL):
Abducted in Plain Sight
James Cude

Bathtubs Over Broadway
Dava Whisenant

Leaving Neverland
Jules Cornell

What’s My Name: Muhammad Ali
Jake Pushinsky, ACE

BEST EDITED COMEDY SERIES FOR COMMERCIAL TELEVISION:
Better Things: “Easter”
Janet Weinberg, ACE

Crazy Ex-Girlfriend: “I Need To Find My Frenemy” 
Nena Erb, ACE

The Good Place: “Pandemonium” 
Eric Kissack

Schitt’s Creek: “Life is a Cabaret”
Trevor Ambrose

BEST EDITED COMEDY SERIES FOR NON-COMMERCIAL TELEVISION:
Barry: “berkman > block”
Kyle Reiter, ACE

Dead to Me: “Pilot”
Liza Cardinale

Fleabag: “Episode 2.1”
Gary Dollner, ACE

Russian Doll: “The Way Out”
Todd Downing

BEST EDITED DRAMA SERIES FOR COMMERCIAL TELEVISION:
Chicago Med: “Never Going Back To Normal”
David J. Siegel, ACE

Killing Eve: “Desperate Times”
Dan Crinnion

Killing Eve: “Smell Ya Later”
Al Morrow

Mr. Robot: “401 Unauthorized”
Rosanne Tan, ACE

BEST EDITED DRAMA SERIES FOR NON-COMMERCIAL TELEVISION:
Euphoria: “Pilot”
Julio C. Perez IV

Game of Thrones: “The Long Night”
Tim Porter, ACE

Mindhunter: “Episode 2”
Kirk Baxter, ACE

Watchmen: “It’s Summer and We’re Running Out of Ice”
David Eisenberg

BEST EDITED MINISERIES OR MOTION PICTURE FOR TELEVISION:
Chernobyl: “Vichnaya Pamyat”
Jinx Godfrey & Simon Smith

Fosse/Verdon: “Life is a Cabaret”
Tim Streeto, ACE

When They See Us: “Part 1”
Terilyn A. Shropshire, ACE

BEST EDITED NON-SCRIPTED SERIES:
Deadliest Catch: “Triple Jeopardy”
Ben Bulatao, ACE, Rob Butler, ACE, Isaiah Camp, Greg Cornejo, Joe Mikan, ACE

Surviving R. Kelly: “All The Missing Girls”
Stephanie Neroes, Sam Citron, LaRonda Morris, Rachel Cushing, Justin Goll, Masayoshi Matsuda, Kyle Schadt

Vice Investigates: “Amazon on Fire”
Cameron Dennis, Kelly Kendrick, Joe Matoske, Ryo Ikegami

Main Image: Marriage Story

Storage for Editors

By Karen Moltenbrey

Whether you are a small-, medium- or large-size facility, storage is at the heart of your workflow. Consider, for instance, the one-person shop Fin Film Company, which films and edits footage for branding and events, often on water. Then there’s Uppercut, a boutique creative/post studio where collaborative workflow is the key to pushing boundaries on commercials and other similar projects.

Let’s take a look at Uppercut’s workflow first…

Uppercut
Uppercut is a creative editorial boutique shop founded by Micah Scarpelli in 2015 and offering a range of post services. Based in New York and soon Atlanta, the studio employs five editors, each with their own suite, along with an in-house Flame artist who also has his own suite.

Taylor Schafer

In contrast to Uppercut’s size, its storage needs are quite large, with five editors working on as many as five projects at a time. Although most of it is commercial work, some of those projects can get heavy in terms of the generated media, which is stored on-site.

So, for its storage needs, the studio employs an EditShare RAID system. “Sometimes we have multiple editors working on one large campaign, and then usually an assistant is working with an editor, so we want to make sure they have access to all the media at the same time,” says Taylor Schafer, an assistant editor at Uppercut.

Additionally, Uppercut uses a Supermicro nearline server to store some of its VFX data, as the Flame artist cannot access the EditShare system on his CentOS operating system. Furthermore, the studio uses LTO-6 archive media in a number of ways. “We use EditShare’s Ark to LTO our partitions once the editors are done with their projects. It’s wonderfully integrated with the whole EditShare system. Ark is easy to navigate, it’s easy to swap LTO tapes in and out, and everything is in one location,” says Schafer.

The studio employs the EditShare Ark to archive its editors’ working files, such as Premiere and Avid projects, graphics, transcodes and so forth. Uppercut also uses BRU (Backup Restore Utility) from Tolis Group to archive larger files that only live on LaCie hard drives and not on EditShare, such as a raw grade. “Then we’re LTO’ing the project and the whole partition with all the working files at the end through Ark,” Schafer explains.

The importance of having a system like this was driven home over the summer, when Uppercut underwent a renovation and had to move into temporary office space at Light Iron, New York — without the EditShare system. As a result, the team had to work off of hard drives and Light Iron’s Avid Nexis for some limited projects. “However, due to storage limits, we mainly worked off of the hard drives, and I realized how important a file storage system that has the ability to share data in real time truly is,” Schafer recalls. “It was a pain having to copy everything onto a hard drive, hand it back to the editor to make new changes, copy it again and make sure all the files were up to date, as opposed to using a storage system like ours, where everything is instantly up to date. You don’t have to worry whether something copied over correctly or not.”

She continues: “Even with Nexis, we were limited in our ability to restore old projects, which lived on EditShare.”

When a new project comes in at Uppercut, the first thing Schafer and her colleagues do is create a partition on EditShare and copy the working template, whether it’s for Avid or Premiere, onto that partition. Then they get their various working files and start the project, copying over the transcodes they receive. As the project progresses, the artists will get graphics and update the partition size as needed. “It’s so easy to change on our end,” notes Schafer. And once the project is completed, she or another assistant will make sure all the files they could possibly need, dating back to day one of the project, are on the EditShare, and that the client files are on the various hard drives and FTP links.
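That setup step lends itself to light automation. As a minimal, hypothetical sketch only (not Uppercut’s actual tooling), the Python below stamps out a new project folder from a per-NLE working template on a shared volume; the paths, folder names and start_project helper are invented for illustration.

import shutil
from datetime import date
from pathlib import Path

# Hypothetical locations; a real shared-storage mount and template set would differ.
TEMPLATES = Path("/Volumes/SharedStorage/_templates")   # e.g., "avid" and "premiere" subfolders
PROJECTS = Path("/Volumes/SharedStorage/projects")

def start_project(client: str, job: str, nle: str = "premiere") -> Path:
    """Copy the working template for the chosen NLE into a new, dated project folder."""
    dest = PROJECTS / f"{date.today():%Y%m%d}_{client}_{job}"
    if dest.exists():
        raise FileExistsError(f"Project already exists: {dest}")
    shutil.copytree(TEMPLATES / nle, dest)        # working template: bins, sequences, settings
    for sub in ("transcodes", "graphics", "exports"):
        (dest / sub).mkdir(exist_ok=True)         # standing folders for incoming media
    return dest

# Example: start_project("Reebok", "spring_campaign", nle="avid")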

Reebok

“We’ll LTO the partition on EditShare through Ark onto an LTO-6 tape, and once that is complete, then generally we will take the projects or partition off the EditShare,” Schafer continues. The studio has approximately 26TB of RAID storage but, due to the large size of the projects, cannot retain everything on the EditShare long term. Nevertheless, the studio has a nearline server that hosts its masters and generics, as well as any other file the team might need to send to a client. “We don’t always need to restore. Generally the only time we try to restore is when we need to go back to the actual working files, like the Premiere or Avid project,” she adds.

Uppercut avoids keeping data locally on workstations due to the collaborative workflow.

According to Schafer, the storage setup is easy to use. Recently, Schafer finished a Reebok project she and two editors had been working on. The project initially started in Avid Media Composer, which was preferred by one of the editors. The other editor prefers Premiere but is well-versed in Avid. After they received the transcodes and all the materials, the two editors started working in tandem on the EditShare. “It was great to use Avid on top of it, having Avid bins we could open separately instead of closing out of the project and sharing through a media browser, or closing out of entire projects, like you have to do with a Premiere project,” she says. “Avid is nice to work with in situations where we have multiple editors because we can all have the project open at once, as opposed to Premiere projects.”

Later, after the project was finished, the editor who prefers Premiere did a director’s cut in that software. As a result, Schafer had to re-transcode the footage, “which was more complicated because it was shot on 16mm, so it was also digitized and on one large video reel instead of many video files — on top of everything else we were doing,” she notes. She re-transcoded for Premiere and created a Premiere project from scratch, then added more storage on EditShare to make sure the files were all in place and that everything was up to date and working properly. “When we were done, the client had everything; the director had his director’s cut and everything was backed up to our nearline for easy access. Then it was LTO’d through Ark on LTO-6 tapes and taken off EditShare, as well as LTO’d on BRU for the raw and the grade. It is now done, inactive and archived.”

Without question, says Schafer, storage is important in the work she and her colleagues do. “It’s not so much about the storage itself, but the speed of the storage, how easily I’m able to access it, how collaborative it allows me to be with the other people I’m working with. Storage is great when it’s accessible and easy for pretty much anyone to use. It’s not so good when it’s slow or hard to navigate and possibly has tech issues and failures,” Schafer says. “So, when I’m looking for storage, I’m looking for something that is secure, fast and reliable, and most of all, easy to understand, no matter the person’s level of technical expertise.”

Chris Aguilar

Fin Film Company
People can count themselves fortunate when they can mix business with pleasure and integrate their beloved hobby with their work. Such is the case for solo producer/director/editor Chris Aguilar of Fin Film Company in Southern California, which he founded a decade ago. As Aguilar says, he does it all, as does Fin Film, which produces everything from conferences to music videos and commercial/branded content. But his real passion involves outdoor adventure paddle sports, from stand-up paddleboarding to pro paddleboarding.

“That’s been pretty much my niche,” says Aguilar, who got his start doing in-house production (photography, video and so forth) for a paddleboard company. Since then, he has been able to turn his passion and adventures into full-time freelance work. “When someone wants an event video done, especially one involving paddleboard races, I get the phone call and go!”

Like many videographers and editors, Aguilar got his start filming weddings. Always into surfing himself, he would shoot surfing videos of friends “and just have fun with it,” he says of augmenting that work. Eventually, this allowed him to move into areas he is more passionate about, such as surfing events and outdoor sports. Now, Aguilar finds that a lot of his time is spent filming paddleboard events around the globe.

Today, there are many one-person studios with solo producers, directors and editors. And as Aguilar points out, their storage needs might not be on the level of feature filmmakers or even independent TV cinematographers, but that doesn’t negate their need for storage. “I have some pretty wide-ranging storage needs, and it has definitely increased over the years,” he says.

In his work, Aguilar has to avoid cumbersome and heavy equipment, such as Atomos recorders, because of their weight on board the watercraft he uses to film paddleboard events. “I’m usually on a small boat and don’t have a lot of room to haul a bunch of gear around,” he says. Rather, Aguilar uses Panasonic’s AG-CX350 as well as Panasonic’s EVA1 and GH5, and on a typical two-day shoot (the event and interviews), he will fill five to six 64GB cards.

“Because most paddleboard races are long-distance, we’re usually on the water for about five to eight hours,” says Aguilar. “Although I am not rolling cameras the whole time, the footage still adds up pretty quickly.”

As for storage, Aguilar offloads his video onto SSD drives or other kinds of external media. “I call it my ‘working drive’ for editing and that kind of thing,” he says. “Once I am done with the edit and other tasks, I have all those source files somewhere.” He calls on the G-Technology G-Drive Mobile SSD 1TB in the field and for some editing, and on the company’s G-Drive ev RaW portable drive for backups and some editing. He also uses Glyph’s Atom SSD in the field.

For years, that “somewhere” has been a cabinet that was filled with archived files. Indeed, that cabinet is currently holding, in Aguilar’s estimate, 30TB of data, if not more. “That’s just the archives. I have 10 or 11 years of archives sitting there. It’s pretty intense,” he adds. But, as soon as he gets an opportunity, those will be ported to the same cloud backup solution he is using for all his current work.

Yes, he still uses the source cards, but for a typical project involving an end-to-end shoot, Aguilar will use at least a 1TB drive to house all the source cards and all the subsequent work files. “Things have changed. Back in the day, I used hard drives – you should see the cabinet in my office with all these hard drives in it. Thank God for SSDs and other options out there. It’s changed our lives. I can get [some brands of] 1TB SSD for $99 or a little more right now. My workflow has me throwing all the source cards onto a drive like that, dedicated to those cards, and that becomes my little archive,” explains Aguilar.

He usually uploads the content as fast as possible to keep the data secure. “That’s always the concern, losing it, and that’s where Backblaze comes in,” Aguilar says. Backblaze is a cloud backup solution that is easily deployed across desktops and laptops and managed centrally — a solution Aguilar recently began employing. He also uses Iconik Solutions’ digital management system, which eases the task of looking up video files or pulling archived files from Backblaze. The digital management system sits on top of Backblaze and creates little offline proxies of the larger content, allowing Aguilar to view the entire 10-year archive online in one interface.
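The pattern Aguilar describes (offload the cards to a working drive, then push everything to cloud backup) hinges on verifying each copy before a card is reused. The Python below is a rough, hypothetical sketch of that offload-and-verify step, not a description of his actual setup; the paths and the offload_card helper are invented, and the resulting checksum manifest is simply something a later cloud-upload step (Backblaze B2, rclone or similar) could re-verify.

import hashlib
import shutil
from pathlib import Path

def sha256(path: Path, chunk: int = 1 << 20) -> str:
    """Hash a file in 1MB chunks so large camera files never load fully into memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        while block := f.read(chunk):
            digest.update(block)
    return digest.hexdigest()

def offload_card(card: Path, archive: Path) -> None:
    """Copy every file from a camera card to the archive drive, verify each copy
    against its original and write a manifest for later re-verification."""
    manifest = []
    for src in sorted(card.rglob("*")):
        if not src.is_file():
            continue
        dst = archive / src.relative_to(card)
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)                    # copy2 preserves timestamps
        src_hash, dst_hash = sha256(src), sha256(dst)
        if src_hash != dst_hash:
            raise IOError(f"Checksum mismatch, do not wipe the card: {src}")
        manifest.append(f"{dst_hash}  {dst.relative_to(archive)}")
    (archive / "checksums.sha256").write_text("\n".join(manifest) + "\n")

# Example: offload_card(Path("/Volumes/CARD_A001"), Path("/Volumes/WorkSSD/race_A001"))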

According to Aguilar, his archived files are an important aspect of his work. Since he works so many paddleboard events, he often receives requests for clips from specific racers or races, some dating back years. Prior to using Backblaze, if someone requested footage, it was a challenge to locate it because he’d have to pull that particular hard drive and plug it into the computer, “and if I had been organized that year, I’ll know where that piece of content is because I can find it. If I wasn’t organized that year, I’d be in trouble,” he explains. “At best, though, it would be an hour and a half or more of looking around. Now I can locate and send it in 15 minutes.”

Aguilar says the Iconik digital management system allows him to pull up the content on the interface and drill down to the year of the race, click on it, download it and send it off or share it directly through his interface to the person requesting the footage.

Aguilar went live with this new Backblaze and digital management system storage workflow this year and has been fully on board with it for just the past two to three months. He is still uncovering all the available features and the power under the hood. “Even for a guy who’s got a technical background, I’m still finding things I didn’t know I could do,” he says, and as such he is still fine-tuning his workflow. “The neat thing with Iconik is that it could actually support online editing straight up, and that’s the next phase of my workflow, to accommodate that.”

Fortunately or unfortunately, at this time Aguilar is just starting to come off his busy season, so now he can step back and explore the new system, and transfer onto it all the material from the old source cards in that cabinet of his.

“[The new solution] is more efficient and has reduced costs since I am not buying all these drives anymore. I can reuse them now. But mostly, it has given me peace of mind that I know the data is secure,” says Aguilar. “I have been lucky in my career to be present for a lot of cool moments in the sport of paddling. It’s a small community and a very close-knit group. The peace of mind knowing that this history is preserved, well, that’s something I greatly appreciate. And I know my fellow paddlers also appreciate it.”


Karen Moltenbrey is a veteran writer, covering visual effects and post production.

The Irishman editor Thelma Schoonmaker

By Iain Blair

Editor Thelma Schoonmaker is a three-time Academy Award winner who has worked alongside filmmaker Martin Scorsese for almost 50 years. Simply put, Schoonmaker has been Scorsese’s go-to editor and key collaborator over the course of some 25 films, winning Oscars for Raging Bull, The Aviator and The Departed. The 79-year-old also received a career achievement award from the American Cinema Editors (ACE).

Thelma Schoonmaker

Schoonmaker cut Scorsese’s first feature, 1967’s Who’s That Knocking at My Door, and since 1980’s Raging Bull has worked on all of his features, receiving a number of Oscar nominations along the way. There are too many to name, but some highlights include The King of Comedy, After Hours, The Color of Money, The Last Temptation of Christ, Goodfellas, Casino and Hugo.

Now Scorsese and Schoonmaker have once again turned their attention to the mob with The Irishman, which was nominated for 10 Academy Awards, including one for Schoonmaker’s editing work. Starring Robert De Niro, Al Pacino and Joe Pesci, it’s an epic saga that runs 3.5 hours and focuses on organized crime in post-war America. It’s told through the eyes of World War II veteran Frank Sheeran (De Niro), a hustler and hitman who worked alongside some of the most notorious figures of the 20th century. Spanning decades, the film chronicles one of the greatest unsolved mysteries in American history, the disappearance of legendary union boss Jimmy Hoffa. It also offers a monumental journey through the hidden corridors of organized crime — its inner workings, rivalries and connections to mainstream politics.

But there’s a twist to this latest mob drama that Scorsese directed for Netflix from a screenplay by Steven Zaillian. Gone are the flashy wise guys and the glamour of Goodfellas and Casino. Instead, the film examines the mundane nature of mob killings and the sad price any survivors pay in the end.

Here, Schoonmaker — who in addition to her film editing works to promote the films and writings of her late husband, famed British director Michael Powell (The Red Shoes, Black Narcissus) — talks about cutting The Irishman, working with Scorsese and their long and storied collaboration.

The Irishman must have been very challenging to cut, just in terms of its 3.5-hour length?
Actually, it wasn’t very challenging to cut. It came together much more quickly than some of our other films because Scorsese and Steve Zaillian had created a very strong structure. I think some critics think I came up with this structure, but it was already there in the script. We didn’t have to restructure, which we do sometimes, and only dropped a few minor scenes.

Did you stay in New York cutting while he shot on location, or did you visit the set?
Almost everything in The Irishman was shot in or around New York. The production was moving all over the place, so I never got to the set. I couldn’t afford the time.

When I last interviewed Marty, he told me that editing and post are his favorite parts of filmmaking. When the two of you sit down to edit, is it like having two editors in the room rather than a director and his editor?
Marty’s favorite part of filmmaking is editing, and he directs the editing after he finishes shooting. I do an assembly based on what he tells me in dailies and what I feel, and then we do all the rest of the editing together.

Could you give us some sense of how that collaboration works?
We’ve worked together for almost 50 years, and it’s a wonderful collaboration. He taught me how to edit at first, but then gradually it has become more of a collaboration. The best thing is that we both work for what is best for the film — it never becomes an ego battle.

How long did it take to edit the film, and what were the main challenges?
We edited for a year, and the footage was so incredibly rich; the only challenge was to make sure we chose the best of it and took advantage of the wonderful improvisations the actors gave us. It was a complete joy for Scorsese and me to edit this film. After we locked the film, we turned it over to ILM so they could do the “youthifying” of the actors. That took about seven months.

Could you talk about finding the overall structure and considerable use of flashbacks to tell the story?
Scorsese had such a strong concept for this film — and one of his most important ideas was to not explain too much. He respects the audience’s ability to figure things out themselves without pummeling them with facts. It was a bold choice and I was worried about it, frankly, at first. But he was absolutely right. He didn’t want the film to feel like a documentary. He wanted to use brushstrokes of history just to show how they affected the characters. The way the characters were developed in the film, particularly Frank Sheeran, the De Niro character, was what was most important.

Could you talk about the pacing, and how you and Marty kept its momentum going?
Scorsese was determined that The Irishman would have a slower pace than many films today. He gave the film a deceptive simplicity. Interestingly, our first audiences had no problem with this — they became gripped by the characters and kept saying they didn’t mind the length and loved the pace. Many of them said they wanted to see the film again right away.

There are several slo-mo sequences. Could you talk about why you used them and to what effect?
The Phantom-camera slow-motion wedding sequence (250fps) near the end of the film was done to give the feeling of a funeral, instead of a wedding, because the De Niro character has just been forced to do the worst thing he will ever do in his life. Scorsese wanted to hold on De Niro’s face and evoke what he is feeling and to study the Italian-American faces of the mobsters surrounding him. Instead of the joy a wedding is supposed to bring, there is a deep feeling of grief.

What was the most difficult sequence to cut and why?
The montage where De Niro repeatedly throws guns into the river after he has killed someone took some time to get right. It was very normal at first — and then we started violating the structure and jump cutting and shortening until we got the right feeling. It was fun.

There’s been a lot of talk about the digital de-aging process. How did it impact the edit?
Pablo Helman at ILM came up with the new de-aging process, and it works incredibly well. He would send shots and we would evaluate them and sometimes ask for changes — usually to be sure that we kept the amazing performances of De Niro, Pacino and Pesci intact. Sometimes we would put back in a few wrinkles if it meant we could keep the subtlety of De Niro’s acting, for example. Scorsese was adamant that he didn’t want to have younger actors play the three main parts in the beginning of the film. So he really wanted this “youthifying” process to work — and it does!

There’s a lot of graphic violence. How do you feel about that in the film?
Scorsese made the violence very quick in The Irishman and shot it in a deceptively simple way. There aren’t any complicated camera moves and flashy editing. Sometimes the violence takes place after a simple pan, when you least expect it because of the blandness of the setting. He wanted to show the banality of violence in the mob — that it is a job, and if you do it well, you get rewarded. There’s no morality involved.

Last time we talked, you were using the Lightworks editing system. Do you still use Lightworks, and if so, can you talk about the system’s advantages for you?
I use Lightworks because the editing surface is still the fastest, most efficient and most intuitive to use. Maintaining sync is different from all other NLE systems. You don’t correct sync by sync lock — if you go out of sync, Lightworks gives you a red icon with the number of frames you are out of sync. You get to choose where you want to correct sync. Since editors place sound and picture on the timeline, adjusting sync exactly where you want to is much more efficient.

You’ve been Marty’s editor since his very first film — a 50-year collaboration. What’s the secret?
I think Scorsese felt when he first met me that I would do what was right for his films — that there wouldn’t be ego battles. We work together extremely well. That’s all there is to it. There couldn’t be a better job.

Do you ever have strong disagreements about the editing?
If we do have disagreements, which is very rare, they are never strong. He is very open to experimentation. Sometimes we will screen two ways and see what the audience says. But that is very rare.

What’s next?
A movie about the Osage Nation in Oklahoma, based on the book “Killers of the Flower Moon” by David Grann.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Director Robert Eggers talks about his psychological thriller The Lighthouse

By Iain Blair

Writer/director Robert Eggers burst onto the scene when his feature film debut, The Witch, won the Directing Award in the US Dramatic category at the 2015 Sundance Film Festival. He followed up that success by co-writing and directing another supernatural, hallucinatory horror film, The Lighthouse, which is set in the maritime world of the late 19th century.

L-R: Director Robert Eggers and cinematographer Jarin Blaschke on set.

The story begins when two lighthouse keepers (Willem Dafoe and Robert Pattinson) arrive on a remote island off the coast of New England for their month-long stay. But that stay gets extended as they’re trapped and isolated due to a seemingly never-ending storm. Soon, the two men engage in an escalating battle of wills, as tensions boil over and mysterious forces (which may or may not be real) loom all around them.

The Lighthouse has the power of an ancient myth. To tell this tale, which was shot in black and white, Eggers called on many of those who helped him create The Witch, including cinematographer Jarin Blaschke, production designer Craig Lathrop, composer Mark Korven and editor Louise Ford.

I recently talked to Eggers, who got his professional start directing and designing experimental and classical theater in New York City, about making the film, his love of horror and the post workflow.

Why does horror have such an enduring appeal?
My best argument is that there’s darkness in humanity, and we need to explore that. And horror is great at doing that, from the Gothic to a bad slasher movie. While I may prefer authors who explore the complexities in humanity, others may prefer schlocky films with jump scares that make you spill your popcorn, which still give them that dose of darkness. Those films may not be seriously probing the darkness, but they can relate to it.

This film seems more psychological than simple horror.
We’re talking about horror, but I’m not even sure that this is a horror film. I don’t mind the label, even though most wannabe auteurs are like, “I don’t like labels!” It started with an idea my brother Max had for a ghost story set in a lighthouse, which is not what this movie became. But I loved the idea, which was based on a true story. It immediately evoked a black and white movie on 35mm negative with a boxy aspect ratio of 1.19:1, like the old movies, and a fusty, dusty, rusty, musty atmosphere — the pipe smoke and all the facial hair — so I just needed a story that went along with all of that. (Laughs) We were also thinking a lot about influences and writers from the time — like Poe, Melville and Stevenson — and soaking up the jargon of the day. There were also influences like Prometheus and Proteus and God knows what else.

Casting the two leads was obviously crucial. What did Willem and Robert bring to their roles?
Absolute passion and commitment to the project and their roles. Who else but Willem can speak like a North Atlantic pirate stereotype and make it totally believable? Robert has this incredible intensity, and together they play so well against each other and are so well suited to this world. And they have two of the best faces ever in cinema.

What were the main technical challenges in pulling it all together, and is it true you actually built the lighthouse?
We did. We built everything, including the 70-foot tower — a full-scale working lighthouse, along with its house and outbuildings — on Cape Forchu in Nova Scotia, which is this very dramatic outcropping of volcanic rock. Production designer Craig Lathrop and his team did an amazing job, and the reason we did that was because it gave us far more control than if we’d used a real lighthouse.

We scouted a lot but just couldn’t find one that suited us, and the few that did were far too remote to access. We needed road access and a place with the right weather, so in the end it was better to build it all. We also shot some of the interiors there as well, but most of them were built on soundstages and warehouses in Halifax since we knew it’d be very hard to shoot interiors and move the camera inside the lighthouse tower itself.

Your go-to DP, Jarin Blaschke, shot it. Talk about how you collaborated on the look and why you used black and white.
I love the look of black and white, because it’s both dreamlike and also more realistic than color in a way. It really suited both the story and the way we shot it, with the harsh landscape and a lot of close-ups of Willem and Robert. Jarin shot the film on the Panavision Millennium XL2, and we also used vintage Baltar lenses from the 1930s, which gave the film a great look, as they make the sea, water and sky all glow and shimmer more. He also used a custom cyan filter by Schneider Filters that gave us that really old-fashioned look. And shooting in black and white kept the overall look very bleak at all times.

How tough was the shoot?
It was pretty tough, and all the rain and pounding wind you see onscreen is pretty much real. Even on the few sunny days we had, the wind was just relentless. The shoot was about 32 days, and we were out in the elements in March and April of last year, so it was freezing cold and very tough for the actors. It was very physically demanding.

Where did you post?
We did it all in New York at Harbor Post, with some additional ADR work at Goldcrest in London with Robert.

Do you like the post process?
I love post, and after the very challenging shoot, it was such a relief to just get in a warm, dry, dark room and start cutting and pulling it all together.

Talk about editing with Louise Ford, who also cut The Witch. How did that work?
She was with us on the shoot at a bed and breakfast, so I could check in with her at the end of the day. But it was so tough shooting that I usually waited until the weekends to get together and go over stuff. Then when we did the stage work at Halifax, she had an edit room set up there, and that was much easier.

What were the big editing challenges?
The DP and I developed such a specific and detailed cinema language without a ton of coverage and with little room for error that we painted ourselves into a corner. So that became the big challenge… when something didn’t work. It was also about getting the running time down but keeping the right pace since the performances dictate the pace of the edit. You can’t just shorten stuff arbitrarily. But we didn’t leave a lot of stuff on the cutting room floor. The assembly was just over two hours and the final film isn’t much shorter.

All the sound effects play a big role. Talk about the importance of sound and working on them with sound designer Damian Volpe, whose credits include Can You Ever Forgive Me?, Leave No Trace, Mudbound, Drive, Winter’s Bone and Margin Call.
It’s hugely important in this film, and Louise and I did a lot of work in the picture edit to create temp sound for Damian to inspire him. He was so relentless in building up the sound design, even creating weird sounds to go with the actual light and with the score by Mark Korven (who did The Witch) and all the brass and unusual instrumentation he used on this. So the result is both experimental and also quite traditional, I think.

There are quite a few VFX shots. Who did them, and what was involved?
We had MELS and Oblique in Quebec, and Brainstorm Digital in New York also did some. The big one was that the movie is set on an island, but we shot on a peninsula that also had a lighthouse further north, which unfortunately didn’t look at all correct. We framed it out as much as we could, but we had to erase it some of the time. And our period-correct sea ship broke down and had to be towed around by other ships, so there was a lot of cleanup there, as well as of all the safety cables we had to use for cliff shots with the actors.

Where did you do the DI, and how important is it to you?
We did it at Harbor with colorist Joe Gawler, and it was hugely important although it was fairly simple because there’s very little latitude on the Double-X film stock we used. We did a lot of fine detail work to finesse it, but it was a lot quicker than if it’d been in color.

Did the film turn out the way you hoped?
No, they always change and surprise you, but I’m very proud of what we did.

What’s next?
I’m prepping another period piece, but it’s not a horror film. That’s all I can say.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Julian Clarke on editing Terminator: Dark Fate

By Oliver Peters

Linda Hamilton’s Sarah Connor and Arnold Schwarzenegger’s T-800 are back to save humanity from a dystopian future in this latest installment of the Terminator franchise. James Cameron is also back and brings with him writing and producing credits, which is fitting — Terminator: Dark Fate is in essence Cameron’s sequel to Terminator 2: Judgment Day.

Julian Clarke

Tim Miller (Deadpool) is at the helm to direct the tale. It’s roughly two decades after the time of T2, and a new Rev-9 machine has been sent from an alternate future to kill Dani Ramos (Natalia Reyes), an unsuspecting auto plant worker in Mexico. But the new future’s resistance has sent back Grace (Mackenzie Davis), an enhanced super-soldier, to combat the Rev-9 and save her. They cross paths with Connor, and the story sets off on a mad dash to the finale at Hoover Dam.

Miller brought back much of his Deadpool team, including his VFX shop Blur, DP Ken Seng and editor Julian Clarke. This is also the second pairing of Miller and Clarke with Adobe. Both Deadpool and Terminator: Dark Fate were edited using Premiere Pro. In fact, Adobe was also happy to tie in with the film’s promotion through its own #CreateYourFate trailer remix challenge. Participants could re-edit their own trailer using supplied content from the film.

I recently spoke with Clarke about the challenges and fun of cutting this latest iteration of such an iconic film franchise.

Terminator: Dark Fate picks up two decades after Terminator 2, leaving out the timelines of the subsequent sequels. Was that always the plan, or did it evolve out of the process of making the film?
That had to do with the screenplay. You were written into a corner by the various sequels. We really wanted to bring Linda Hamilton’s character back. With Jim involved, we wanted to get back to first principles and have it based on Cameron’s mythology alone. To get back to the Linda/Arnold character arcs, and then add some new stuff to that.

Many fans were attracted to the franchise by Cameron’s two original Terminator films. Was there a conscious effort at integrating that nostalgia?
I come from a place of deep fandom for Terminator 2. As a teenager I had VHS copies of Aliens and Terminator 2 and watched them on repeat after school! Those films are deeply embedded in my psyche, and both of them have aged well — they still hold up. I watched the sequels, and they just didn’t feel like a Terminator film to me. So the goal was definitely to make it of the DNA of those first two movies. There’s going to be a chase. It’s going to be more grounded. It’s going to get back into the Sarah Connor character and have more heart.

This film tends to have elements of humor unlike most other action films. That must have posed a challenge to set the right tone without getting campy.
The humor thing is interesting. Terminator 2 has a lot of humor throughout. We have a little bit of humor in the first half and then more once Arnold shows up, but that’s really the way it had to be. The Dani Ramos character — who’s your entry point into the movie — is devastated when her whole family is killed. To have a lot of jokes happening would be terrible. It’s not the same in Terminator 2 because John Connor’s stepparents get very little screen time, and they don’t seem that nice. You feel bad for them, but it’s OK that you get into this funny stuff right off the bat. On this one we had to ease into the humor so you could [experience] the gravity of the situation at the start of the movie.

Did you have to do much to alter that balance during the edit?
There were one or two jokes that we nipped out, but it wasn’t like that whole first act was chock full of jokes. The tone of the first act is more like Terminator, which is more of a thriller or horror movie. Then it becomes more like T2 as the action gets bigger and the jokes come in. So the first half is like a bigger Terminator and the second half more like T2.

Deadpool, which Tim Miller also directed, used a very nonlinear story structure, balancing action, comedic moments and drama. Terminator was always designed with a linear, straightforward storyline. Right?
A movie hands you certain editing tools. Deadpool was designed to be nonlinear, with characters in different places, so there are a whole bunch of options for you. Terminator: Dark Fate is more like a road movie. The destination of certain paths along the road is predetermined. You can’t be in Texas before Mexico. So the structural options you had were where to check in with the Rev-9, as well as the inter-scene structure. Once you are in the detention center, who are you cutting to? Sarah? Dani? However, where that is placed in the movie is pretty much set. All you can do is pace it up, pace it down, adjust how to get there. There aren’t a lot of movable pieces that can be swapped around.

When we had talked after Deadpool, you discussed how you liked the assistants to build string-outs — what some call a Kem roll. Similar action is assembled back to back into a sequence in order from every take. Did you use that same organizational method on Terminator: Dark Fate?
Sometimes we were so swamped with material that there wasn’t time to create string-outs. I still like to have those. It’s a nice way to quickly see all the pieces that cover a moment. If you are trying to find the one take or action that’s 5% better than another, then it’s good to see them all in a row, rather than trying to keep it all in your head for a five-minute take. There was a lot of footage that we shot in the action scenes, but we didn’t do 11 or 12 takes for a dialogue scene. I didn’t feel like I needed some tool to quickly navigate through the dialogue takes. We would string out the ones that were more complicated.

Depending on the directing style, a series of takes may have increasingly calibrated performances with successive takes. With other directors, each take might be a lot different than the one before and after it. What is your approach to evaluating which is the best take to use?
It’s interesting when you use the earlier takes versus the later takes and what you get from them. The later takes are usually the ones that are most directed. The actors are warmed up and most closely nail what the director has in mind. So they are strong in that regard, but sometimes they can become more self-conscious. So sometimes the first take is more thrown away and may have less power but feels more real — more off the cuff. Sometimes a delivered dialogue line feels less written, and you’ll buy it more. Other times you’ll want that more dramatic quality of the later takes. My instinct is to first use the later takes, but as you start to revise a scene, you often go back to pieces of the earlier takes to ground it a little more.

How long did the production and post take?
It took a little over 100 days of shooting with a lot of units. I work on a lot of mid-budget films, so this seemed like a really long shoot. It was a little relentless for everyone — even squeezing it into those 100 days. Shooting action with a lot of VFX is slow due to the reset time needed between takes. The ending of the movie is 30 minutes of action in a row. That’s a big job shooting all of that stuff. When they have a couple of units cranking through the dialogue scenes plus shooting action sequences — that’s when I have to work hard to keep up. Once you hit the roadblocks of shooting just those little action pieces, you get a little time to catch up.

We had the usual director’s cut period and finished by the end of this September. The original plan was to finish by the beginning of September, but we needed the time for VFX. So everything piled up with the DI and the mix in order to still hit the release date. September got a little crazy. It seems like a long time — a total of 13 or 14 months — but it still was an absolute sprint to get the movie in shape and get the VFX into the film in time. This might be normal for some of these films, but compared to the other VFX movies I’ve done, it was definitely turning things up a notch!

I imagine that there was a fair amount of previz required to lay out the action for the large VFX and CG scenes. Did you have that to work with as placeholder shots? How did you handle adjusting the cut as the interim and final shots were delivered?
Tim is big into previz with his background in VFX and animation and owning his own VFX company. We had very detailed animatics going into production. Depending on a lot of factors, you still abandon a lot of things. For example, the freeway chases are quite a bit different because when you go there and do it with real cars, they do different things. Or only part of the cars look like they are going fast enough. Those scenes became quite different than the previz.

Others are almost 100% CG, so you can drop in the previz as placeholders. Although, even in those cases, sometimes the finished shot doesn’t feel real enough. In the “cartoon” world of previz, you can do wild camera moves and say, “Wow, that seems cool!” But when you start doing it at photoreal quality, then you go, “This seems really fake.” So we tried to get ahead of that stuff and find what to do with the camera to ground it. Kind of mess it up so it’s not too dynamic and perfect.

How involved were you with shaping the music? Did you use previous Terminator films’ scores as a temp track to cut with?
I was very involved with the music production. I definitely used a lot of temp music. Some of it was ripped from old Terminator movies, but there’s only so much Terminator 2 music you can put in. Those scores used a lot of synthesizers that date the sound. I did use “Desert Suite” from Terminator 2, when Sarah is in the hotel room. I loved having a very direct homage to a Sarah Connor moment while she’s talking about John. Then I begged our composer, Tom Holkenborg (aka Junkie XL), to consider doing a version of it for our movie. So it is essentially the same chord progression.

That was an interesting musical and general question: how much do you lean into the homage thing? It’s powerful when you do it, but if you do it too much, it starts to feel artificial or pandering. So I tried to hit the sweet spot so you knew you were watching a Terminator movie, but not so much that it felt like Terminator karaoke. How many times can you go da-dum-dum-da-da-dum? You have to pick your moments for those Terminator motifs. It’s diminishing returns if you do it too much.

Another inspirational moment for me was another part in Terminator 2. There’s a disturbing industrial sound for the T-1000. It sounds more like a foghorn or something in a factory rather than music, and it created this unnerving quality to the T-1000 scenes, when he’s just scoping things out. So we came up with a modern-day electronic equivalent for the Rev-9 character, and that was very potent.

Was James Cameron involved much in the post production?
He’s quite busy with his Avatar movies. Some of the time he was in New Zealand, some of the time he was in Los Angeles. Depending on where he was and where we were in the process, we would hit milestones, like screenings or the first cut. We would send him versions and download a bunch of his thoughts.

Editing is very much a part of his wheelhouse. Unlike many other directors, he really thinks about this shot, then that shot, then the next shot. His mind really works that way. Sometimes he would give us pretty specific, dialed-in notes on things. Sometimes it would just be bigger suggestions, like, “Maybe the action cutting pattern could be more like this …” So we’d get his thoughts — and, of course, he’s Jim Cameron, and he knows the business and the Terminator franchise — so I listened pretty carefully to that input.

This is the second film that you’ve cut with Premiere Pro. Deadpool was first, and there were challenges using it on such a complex project. What was the experience like this time around?
Whenever you set out to use a new workflow, there are going to be growing pains — not to say Premiere is new, because it’s been around a long time and has millions of users, but it’s unusual to use it on large VFX movies, for specific reasons.

L-R: Matthew Carson and Julian Clarke

On Deadpool, that led to certain challenges, and that’s just what happens when you try to do something new. We had to split the movie into separate projects for each reel, instead of one large project. Even so, the size of our project files made it tough. They were so full of media that they would take five minutes to open. Nevertheless, we made it work, and there are lots of benefits to using Adobe over other applications.

In comparison, the interface to Avid Media Composer looks like it was designed 20 years ago, but they have multi-user collaboration nailed, and I love the trim tool. Yet some things are old and creaky. Adobe’s not that at all. It’s nice and elegant in terms of the actual editing process. We got through it and sat down with Adobe to point out things that needed work, and they worked on them. When we started up Terminator, they had a whole new build for us. Project files now opened in 15 seconds. They are about halfway there in terms of multi-user editing. Now everyone can go into a big, shared project, and you can move bins back and forth, although only one user at a time has write access to the master project.

This is not simple software they are writing. Adobe is putting a lot of work into making it a more fitting tool for this type of movie. Even though this film was exponentially larger than Deadpool, from the Adobe side it was a smoother process. Props to them for doing that! The cool part about pioneering this stuff is the amount of work that Adobe is on board to do. They’ll have people work on stuff that is helpful to us, so we get to participate a little in how Adobe’s software gets made.

With two large Premiere Pro projects under your belt, what sort of new features would you like to see Adobe add to the application to make it even better for feature film editors?
They’ve built out the software from being a single-user application to being multi-user software, but the inherent software at the base level is still single-user. Sometimes your render files get unlinked when you go back and forth between multiple users. There’s probably stuff where they have to dig deep into the code to make those minor annoyances go away. Other items I’d like to see — let’s not use third-party software to send change lists to the mix stage.

I know Premiere Pro integrates beautifully with After Effects, but for me, After Effects is this precise tool for executing shots. I don’t want a fine tool for compositing — I want to work in broad strokes and then have someone come back and clean it up. I would love to have a tracking tool to composite two shots together for a seamless, split screen of two combined takes — features like that.

The After Effects integration and the color correction are awesome features for a single user to execute the film, but I don’t have the time to be the guy to execute the film at that high level. I just have to keep going. I want to be able to do a fast and dirty version so I know it’s not a terrible idea, and then turn to someone else and say, “OK, make that good.” After Effects is cool, but it’s more for VFX editors or single users who are trying to make a film on their own.

After all of these action films, are you ready to do a different type of film, like a period drama?
Funny you should say that. After Deadpool I worked on The Handmaid’s Tale pilot, and it was exactly that. I was working on this beautifully acted, elegant project with tons of women characters and almost everything was done in-camera. It was a lot of parlor room drama and power dynamics. And that was wonderful to work on after all of this VFX/action stuff. Periodically it’s nice to flex a different creative muscle.

It’s not that I only work on science-fiction/VFX projects — which I love — but, in part, people start associating you with a certain genre, and then that becomes an easy thing to pursue and get work for.

Much like acting, if you want to be known for doing a lot of different things, you have to actively pursue it. It’s easy to go where momentum will take you. If you want to be the editor who can cut any genre, you have to make it a mission to pursue those projects that will keep your resume looking diverse. For a brief moment after Deadpool, I might have been able to pivot to a comedy career (laughs). That was a real hybrid, so it was challenging to thread the needle of the different tones of the film and make it feel like one piece.

Any final thoughts on the challenges of editing Terminator: Dark Fate?
The biggest challenge of the film was that, in a way, it was an ensemble, with the Dani character, the Grace character, the Sarah character and Arnold’s character, the T-800. All of these characters are protagonists with their own individual arcs. Feeling that you were adequately servicing those arcs without grinding the movie to a halt or going too long without touching base with a character — finding out how to dial that in was the major challenge of the movie, plus the scale of the VFX and finessing all the action scenes. I learned a lot.


Oliver Peters is an experienced film and commercial editor/colorist. In addition, he regularly interviews editors for trade publications. He may be contacted through his website at oliverpeters.com.

Final Cut ups Zoe Schack to editor

Final Cut in LA has promoted Zoe Schack to editor after three years at the studio as an assistant editor. While at Final Cut, Schack has been mentored by Final Cut editors Crispin Struthers, Joe Guest, Jeff Buchanan and Rick Russell.

Schack has edited branded content and commercials for Audi, Infiniti, Doritos and Dollar Shave Club as well as music videos for Swae Lee and Whitney Woerz. She has also worked with a number of high-profile directors, including Dougal Wilson, Ava DuVernay, Michel Gondry, Craig Gillespie and Steve Ayson.

Originally from a small town north of New York, Schack studied film at Rhode Island School of Design and NYU’s Tisch School of the Arts. Her love for documentaries led her to intern with renowned filmmaker Albert Maysles and to produce the Bicycle Film Festival in Portland, Oregon. She edited several short documentaries and a pilot series that were featured in many film festivals.

“It’s been amazing watching Zoe’s growth the last few years,” says Final Cut executive producer Suzy Ramirez. “She’s so meticulous, always doing a deep dive into the footage. Clients love working with her because she makes the process fun. She’s grown here at Final Cut so much already under the guidance of our editors, and her craft keeps evolving. I’m excited to see what’s ahead.”

Todd Phillips talks directing Warner Bros.’ Joker

By Iain Blair

Filmmaker Todd Phillips began his career in comedy, most notably with the blockbuster franchise The Hangover, which racked up $1.4 billion at the box office globally. He then leveraged that clout and left his comedy comfort zone to make the genre-defying War Dogs.

Todd Phillips directing Joaquin Phoenix

Joker puts comedy even further in his rearview mirror. This bleak, intense, disturbing and chilling tragedy has earned over $1 billion worldwide since its release, making it the seventh-highest-grossing film of 2019 and the highest-grossing R-rated film of all time. Not surprisingly, Joker was celebrated by the Academy, earning a total of 11 Oscar nods, including two for Phillips.

Directed, co-written and produced by Phillips (nominated for Directing and Screenplay), Joker is the filmmaker’s original vision of the infamous DC villain — an origin story infused with the character’s more traditional mythologies. Phillips’ exploration of Arthur Fleck, who is portrayed — and fully inhabited — by three-time Oscar-nominee Joaquin Phoenix, is of a man struggling to find his way in Gotham’s fractured society. Longing for any light to shine on him, he tries his hand as a stand-up comic but finds the joke always seems to be on him. Caught in a cyclical existence between apathy, cruelty and, ultimately, betrayal, Arthur makes one bad decision after another that brings about a chain reaction of escalating events in this powerful, allegorical character study.

Phoenix is joined by Oscar-winner Robert De Niro, who plays TV host Murray Franklin, and a cast that includes Zazie Beetz, Frances Conroy, Brett Cullen, Marc Maron, Josh Pais and Leigh Gill.

Behind the scenes, Phillips was joined by a couple of frequent collaborators in DP Lawrence Sher, ASC, and editor Jeff Groth. Also on the journey were Oscar-nominated co-writer Scott Silver, production designer Mark Friedberg and Oscar-winning costume designer Mark Bridges. Hildur Guðnadóttir provided the music.

Joker was produced by Phillips and actor/director Bradley Cooper, under their Joint Effort banner, and Emma Tillinger Koskoff.

I recently talked to Phillips, whose credits include Borat (for which he earned an Oscar nod for Best Adapted Screenplay), Due Date, Road Trip and Old School, about making the film, his love of editing and post.

You co-wrote this very complex, timely portrait of a man and a city. Was that the appeal for you?
Absolutely, 100 percent. While it takes place in the late ‘70s and early ‘80s, and we wrote it in 2016, it was very much about making a movie that deals with issues happening right now. Movies are often mirrors of society, and I feel this is exactly that.

Do you think that’s why so many people have been offended by it?
I do. It’s really resonated with audiences. I know it’s also been somewhat divisive, and a lot of people were saying, “You can’t make a movie about a guy like this — it’s irresponsible.” But do we want to pretend that these people don’t exist? When you hold up a mirror to society, people don’t always like what they see.

Especially when we don’t look so good.
(Laughs) Exactly.

This is a million miles away from the usual comic-book character and cartoon violence. What sort of film did you set out to make?
We set out to make a tragedy, which isn’t your usual Hollywood approach these days, for sure.

It’s hard to picture any other actor pulling this off. What did Joaquin bring to the role?
When Scott and I wrote it, we had him in mind. I had a picture of him as my screensaver on my laptop — and he still is. And then when I pitched this, it was with him in mind. But I didn’t really know him personally, even though we created the character “in his voice.” Everything we wrote, I imagined him saying. So he was really in the DNA of the whole film as we wrote it, and he brought the vulnerability and intensity needed.

You’d assume that he’d jump at this role, but I heard it wasn’t so simple getting him.
You’re right. Getting him was a bit of a thing because it wasn’t something he was looking to do — to be in a movie set in the comic book world. But we spent a lot of time talking about it, what it would be, what it means and what it says about society today and the lack of empathy and compassion that we have now. He really connected with those themes.

Now, looking back, it seems like an obvious thing for him to do, but it’s hard for actors because the business has changed so much and there’s so many of these superhero movies and comic book films now. Doing them is a big thing for an actor, because then you’re in “that group,” and not every actor wants to be in that group because it follows you, so to speak. A lot of actors have done really well in superhero movies and have done other things too, but it’s a big step and commitment for an actor. And he’d never really been in this kind of film before.

What were the main technical challenges in pulling it all together?
I really wanted to shoot on location all around New York City, and that was a big challenge because it’s far harder than it sounds. But it was so important to the vibe and feel of the movie. So many superhero movies use lots of CGI, but I needed that gritty reality of the actual streets. And I think that’s why it’s so unsettling to people because it does feel so real. Luckily, we had Emma Tillinger Koskoff, who’s one of the great New York producers. She was key in getting locations.

Did you do a lot of previz?
I don’t usually do that much. We did it once for War Dogs and it worked well, but it’s a really slow and annoying process to some extent. As crazy as it sounds, we tried it once on the big Murray Franklin scene with De Niro at the end, which is not a scene you’d normally previz — it’s just two guys sitting on a couch. But it was a 12-page scene with so many camera angles, so we began to previz it and then just abandoned it half-way through. The DP and I were like, “This isn’t worth it. We’ll just do it like we always do and just figure it out as we go.” But previz is an amazing tool. It just needed more time and money than we had, and definitely more patience than I have.

Where did you post?
We started off at my house, where Jeff and I had an Avid setup. We also had a satellite office at 9000 Sunset, where all the assistants were. VFX and our VFX supervisor Edwin Rivera were also based out of there along with our music editor, and that’s where most of it was done. Our supervising sound editor was Alan Robert Murray, a two-time Oscar-winner for his work on American Sniper and Letters From Iwo Jima, and we did the Atmos sound mix on the lot at Warners with Tom Ozanich and Dean Zupancic.

Talk about editing with Jeff Groth. What were the big editing challenges?
There are a lot of delusions in Arthur’s head, so it was a big challenge to know when to hide them and when to reveal them. The scene order in the final film is pretty different from the scripted order, and that’s all about deciding when to reveal information. When you write the script, every scene seems important, and everything has to happen in this order, but when you edit, it’s like, “What were we thinking? This could move here, we can cut this, and so on.”

Todd Phillips on set with Robert De Niro

That’s what’s so fun about editing and why I love it and post so much. I see my editor as a co-writer. I think every director loves editing the most, because let’s face it — directors are all control freaks, and you have the most control in post and the editing room. So for me at least, I direct movies and go through all the stress of production and shooting just to get to the editing room. It’s all stuff I just have to deal with so I can then sit down and actually make the movie. So it’s the final draft of the script and I very much see it as a writing exercise.

Post is your last shot at getting the script right, and the most fun part of making a movie is the first 10 to 12 weeks of editing. The worst part is the final stretch of post, all that detail work and watching the movie 400 times. You get sick of it, and it’s so hard to be objective. This ended up taking 20 weeks before we had the first cut. Usually you get 10 for the director’s cut, but I asked Warners for more time and they were like, “OK.”

Visual effects play a big role in the film. How many were there?
More than you’d think, but they’re not flashy. I told Edwin early on, if you do your job right, no one will guess there are any VFX shots at all. He had a great team, and we used various VFX houses, including Scanline, Shade and Branch.

There’s a lot of blood, and I’m guessing that was all enhanced a lot?
In fact, there was no real blood — not a drop — used on set, and that amazes people when I tell them. That’s one of the great things about VFX now — you can do all the blood work in post. For instance, traditionally, when you film a guy being shot on the subway, you have all the blood spatters and for take two, you have to clean all that up and repaint the walls and reset, and it takes 45 minutes. This way, with VFX, you don’t have to deal with any of that. You just do a take, do it again until it’s right, and add all the blood in post. That’s so liberating.

L-R: Iain Blair and Todd Phillips

What was the most difficult VFX shot to do?
I’d say the scene with Randall at his apartment, and all that blood tracking on the walls and on Arthur’s face and hands is pretty amazing, and we spent the most time on all that, getting it right.

Where did you do the DI, and how important is it to you?
At Company 3 with my regular colorist Jill Bogdanowicz, and it’s vital for the look. I only began doing DIs on the first Hangover, and the great thing about it is you can go in and surgically fix anything. And if you have a great DP like Larry Sher, who’s shot the last six movies for me, you don’t get lost in the maze of possibilities, and I trust him more than I trust myself sometimes.

We shot it digitally, though the original plan was to shoot 65mm large format and, when that fell through, 35mm. Then Larry and I did a lot of tests and decided we’d shoot digital and make it look like film. And thanks to the way he lit and all the work he and Jill did, it has this weird photochemical feel and look. It’s not quite film, but it’s definitely not digital. It’s somewhere in the middle, its own thing.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

James Norris joins Nomad in London as editor, partner

Nomad in London has added James Norris as editor and partner. A self-taught, natural editor, James started out running for the likes of Working Title, Partizan and Tomboy Films. He then moved to Whitehouse Post as an assistant where he refined his craft and rose through the ranks to become an editor.

Over the past 15 years, he’s worked across commercials, music videos, features and television. Norris edited Ikea’s Fly Robot Fly spot and Asda’s Get Possessed piece, and has recently cut a new project for Nike. Working within television and film, he also cut an episode of the BAFTA-nominated drama Our World War and feature film We Are Monster.

“I was attracted to Nomad for their vision for the future and their dedication to the craft of editing,” says Norris. “They have a wonderful history but are also so forward-thinking and want to create new, exciting things. The New York and LA offices have seen incredible success over the last few years, and now there’s Tokyo and London too. On top of this, Nomad feels like home already. They’re really lovely people — it really does feel like a family.”

Norris will be cutting on Avid Media Composer at Nomad.


Good Company adds director Daniel Iglesias Jr.

Filmmaker Daniel Iglesias Jr., whose reel spans narrative storytelling to avant-garde fashion films with creativity and an eccentric visual style, has signed with full-service creative studio Good Company.

Iglesias’ career started while he was attending Chapman University’s renowned film school, where he earned a BFA in screen acting. At the same time, he and his friend Zack Sekuler began crafting images for their friends in the alt-rock band The Neighbourhood. Iglesias’ career took off after he directed his first music video for the band’s breakout hit “Sweater Weather,” which reached over 310 million views. He continues working behind the camera for The Neighbourhood and other artists like X Ambassadors and AlunaGeorge.

Iglesias uses elements of surrealism and a blend of avant-garde and commercial compositions, often stemming from innovative camera techniques. His work includes projects for clients like Ralph Lauren, Steve Madden, Skyy Vodka and Chrysler and the Vogue film Death Head Sphinx.

One of his most celebrated projects was a two-minute promo for Margaux the Agency. Designed as a “living magazine,” Margaux Vol 1 merges creative blocking, camera movement and effects to create a kinetic visual catalog that is both classic and contemporary. The piece took home Best Picture at the London Fashion Film Festival, along with awards from the Los Angeles Film Festival, the International Fashion Film Awards and Promofest in Spain.

Iglesias’ first project since joining Good Company was Ikea’s Kama Sutra commercial for Ogilvy NY, a tongue-in-cheek exploration of the boudoir. Now he is working on a project for Paper Magazine and Tiffany.

“We all see the world through our own lens; through film, I can unscrew my lens and pop it onto other people and, in effect, change their point of view or even the depth of culture,” he says. “That’s why the medium excites me — I want to show people my lens.”

We reached out to Iglesias to learn a bit more about how he works.

How do you go about picking the people you work with?
I do have a couple DPs and PDs I like to work with on the regular, depending on the job, and sometimes it makes sense to work with someone new. If it’s someone new that I haven’t worked with before, I typically look at three things to get a sense of how right they are for the project: image quality, taste and versatility. Then it’s a phone call or meeting to discuss the project in person so we can feel out chemistry and execution strategy.

Do you trust your people completely in terms of what to shoot on, or do you like to get involved in that process as well?
I’m a pretty hands-on and involved director, but I think it’s important to know what you don’t know and delegate/trust accordingly. I think it’s my job as a director to communicate, as detailed and effectively as possible, an accurate explanation of the vision (because nobody sees the vision of the project better than I do). Then I must understand that the DPs/PDs/etc. have a greater knowledge of their field than I do, so I must trust them to execute (because nobody understands how to execute in their fields better than they do).

Since Good Company also provides post, how involved do you get in that process?
I would say I edit 90% of my work. If I’m not editing it myself, then I still oversee the creative in post. It’s great to have such a strong post workflow with Good Company.

The editors of Ad Astra: John Axelrad and Lee Haugen

By Amy Leland

The new Brad Pitt film Ad Astra follows astronaut Roy McBride (Pitt) as he journeys deep into space in search of his father, astronaut Clifford McBride (Tommy Lee Jones). The elder McBride disappeared years before, and his experiments in space might now be endangering all life on Earth. Much of the film features Pitt’s character alone in space with his thoughts, creating a happy challenge for the film’s editing team, who have a long history of collaboration with each other and the film’s director James Gray.

L-R: Lee Haugen and John Axelrad

Co-editors John Axelrad, ACE, and Lee Haugen share credits on three previous films — Haugen served as Axelrad’s apprentice editor on Two Lovers, and the two co-edited The Lost City of Z and Papillon. Ad Astra’s director, James Gray, was also at the helm of Two Lovers and The Lost City of Z. A lot can be said for long-time collaborations.

When I had the opportunity to speak with Axelrad and Haugen, I was eager to find out more about how this shared history influenced their editing process and the creation of this fascinating story.

What led you both to film editing?
John Axelrad: I went to film school at USC and graduated in 1990. Like everyone else, I wanted to be a director. Everyone that goes to film school wants that. Then I focused on studying cinematography, but then I realized several years into film school that I don’t like being on the set.

Not long ago, I spoke to Fred Raskin about editing Once Upon a Time… in Hollywood. He originally thought he was going to be a director, but then he figured out he could tell stories in an air-conditioned room.
Axelrad: That’s exactly it. Air conditioning plays a big role in my life; I can tell you that much. I get a lot of enjoyment out of putting a movie together and of being in my own head creatively and really working with the elements that make the magic. In some ways, there are a lot of parallels with the writer when you’re an editor; the difference is I’m not dealing with a blank page and words — I’m dealing with images, sound and music, and how it all comes together. A lot of people say the first draft is the script, the second draft is the shoot, and the third draft is the edit.

L-R: John and Lee at the Papillon premiere.

I started off as an assistant editor, working for some top editors for about 10 years in the ’90s, including Anne V. Coates. I was an assistant on Out of Sight when Anne Coates was nominated for the Oscar. Those 10 years of experience really prepped me for dealing with what it’s like to be the lead editor in charge of a department — dealing with the politics, the personalities and the creative content and learning how to solve problems. I started cutting on my own in the late ‘90s, and in the early 2000s, I started editing feature films.

When did you meet your frequent collaborator James Gray?
Axelrad: I had done a few horror features, and then I hooked up with James on We Own the Night, and that went very well. Then we did Two Lovers after that. That’s where Lee Haugen came in — and I’ll let him tell his side of the story — but suffice it to say that I’ve done five films for James Gray, and Lee Haugen rose up through the ranks and became my co-editor on The Lost City of Z. Then we edited the movie Papillon together, so it was just natural that we would do Ad Astra together as a team.

What about you, Lee? How did you wind your way to where we are now?
Lee Haugen: Growing up in Wisconsin, any time I had a school project, like writing a story or writing an article, I would change it into a short video or short film instead. Back then I had to shoot on VHS tape and edit tape to tape by pushing play and hitting record and timing it. It took forever, but that was when I really found out that I loved editing.

So I went to school with a focus on wanting to be an editor. After graduating from Wisconsin, I moved to California and found my way into reality television. That was the mid-2000s and it was the boom of reality television; there were a lot of jobs that offered me the chance to get in the hours needed for becoming a member of the Editors Guild as well as more experience on Avid Media Composer.

After about a year of that, I realized working the night shift as an assistant editor on reality television shows was not my real passion. I really wanted to move toward features. I was listening to a podcast by Patrick Don Vito (editor of Green Book, among other things), and he mentioned John Axelrad. I met John on an interview for We Own the Night when I first moved out here, but I didn’t get the job. But a year or two later, I called him, and he said, “You know what? We’re starting another James Gray movie next week. Why don’t you come in for an interview?” I started working with John the day I came in. I could not have been more fortunate to find this group of people that gave me my first experience in feature films.

Then I had the opportunity to work on a lower-budget feature called Dope, and that was my first feature editing job by myself. The success of the film at Sundance really helped launch my career. Then things came back around. John was finishing up Krampus, and he needed somebody to go out to Northern Ireland to edit the assembly of The Lost City of Z with James Gray. So, it worked out perfectly, and from there, we’ve been collaborating.

Axelrad: Ad Astra is my third time co-editing with Lee, and I find our working as a team to be a naturally fluid and creative process. It’s a collaboration entailing many months of sharing perspectives, ideas and insights on how best to approach the material, and one that ultimately benefits the final edit. Lee wouldn’t be where he is if he weren’t a talent in his own right. He proved himself, and here we are together.

How has your collaborative process changed and grown from when you were first working together (John, Lee and James) to now, on Ad Astra?
Axelrad: This is my fifth film with James. He’s a marvelous filmmaker, and one of the reasons he’s so good is that he really understands the subtlety and power of editing. He’s very neoclassical in his approach, and he challenges the viewer since we’re all accustomed to faster cutting and faster pacing. But with James, it’s so much more of a methodical approach. James is very performance-driven. It’s all about the character, it’s all about the narrative and the story, and we really understand his instincts. Additionally, you need to develop a shorthand language and truly understand what the director wants.

Working with Lee, it was just a natural process to have the two of us cutting. I would work on a scene, and then I could say, “Hey Lee, why don’t you take a stab at it?” Or vice versa. When James was in the editing room working with us, he would often work intensely with one of us and then switch rooms and work with the other. I think we each really touched almost everything in the film.

Haugen: I agree with John. Our way of working is very collaborative — that includes John and me, but also our assistant editors and additional editors. It’s a process that we feel benefits the film as a whole; when we have different perspectives, it can help us explore different options that can raise the film to another level. And when James comes in, he’s extremely meticulous. And as John said, he and I both touched every single scene, and I think we’ve even touched every frame of the film.

Axelrad: To add to what Lee said, about involving our whole editing team, I love mentoring, and I love having my crew feel very involved. Not just technical stuff, but creatively. We worked with a terrific guy, Scott Morris, who is our first assistant editor. Ultimately, he got bumped up during the course of the film and got an additional editor credit on Ad Astra.

We involve everyone, even down to the post assistant. We want to hear their ideas and make them feel like a welcome part of a collaborative environment. They obviously have to focus on their primary tasks, but I think it just makes for a much happier editing room when everyone feels part of a team.

How did you manage an edit that was so collaborative? Did you have screenings of dailies or screenings of cuts?
Axelrad: During dailies it was just James, and we would send edits for him to look at. But James doesn’t really start until he’s in the room. He really wants to explore every frame of film and try all the infinite combinations, especially when you’re dealing with drama and dealing with nuance and subtlety and subtext. Those are the scenes that take the longest. When I put together the lunar rover chase, it was almost easier in some ways than some of the intense drama scenes in the film.

Haugen: As the dailies came in, John and I would each take a scene and do a first cut. And then, once we had something to present, we would call everybody in to watch the scene. We would get everybody’s feedback and see what was working, what wasn’t working. If there were any problems that we could address before moving to the next scene, we would. We liked to get the outside point of view, because once you get further and deeper into the process of editing a film, you do start to lose perspective. To be able to bring somebody else in to watch a scene and to give you feedback is extremely helpful.

One thing that John established with me on Two Lovers — my first editing job on a feature — was allowing me to come and sit in the room during the editing. After my work was done, I was welcome to sit in the back of the room and just observe the interaction between John and James. We continued that process with this film, just to give those people experience and to learn and to observe how an edit room works. That helped me become an editor.

John, you talked about how the action scenes are often easier to cut than the dramatic scenes. It seems like that would be even more true with Ad Astra, because so much of this film is about isolation. How does that complicate the process of structuring a scene when it’s so much about a person alone with his own thoughts?
Axelrad: That was the biggest challenge, but one we were prepared for. To James’ credit, he’s not precious about his written words; he’s not precious about the script. Some directors might say, “Oh no, we need to mold it to fit the script,” but he allows the actors to work within a space. The script is a guide for them, and they bring so much to it that it changes the story. That’s why I always say that we serve the ego of the movie. The movie, in a way, informs us what it wants to be, and what it needs to be. And in the case of this, Brad gave us such amazing nuanced performances. I believe you can sometimes shape the best performance around what is not said through the more nuanced cues of facial expressions and gestures.

So, as an editor, when you can craft something that transcends what is written and what is photographed and achieve a compelling synergy of sound, music and performance — to create heightened emotions in a film — that’s what we’re aiming for. In the case of his isolation, we discovered early on that having voiceover and really getting more interior was important. That wasn’t initially part of the cut, but James had written voiceover, and we began to incorporate that, and it really helped make this film into more of an existential journey.

The further he goes out into space, the deeper we go into his soul, and it’s really a dive into the subconscious. That sequence where he dives underwater in the cooling liquid of the rocket, he emerges and climbs up the rocket, and it’s almost like a dream. Like how in our dreams we have superhuman strength as a way to conquer our demons and our fears. The intent really was to make the film very hypnotic. Some people get it and appreciate it.

As an editor, sound often determines the rhythm of the edit, but one of the things that was fascinating with this film is how deafeningly quiet space likely is. How do you work with the material when it’s mostly silent?
Haugen: Early on, James established that he wanted to make the film as realistic as possible. Sound, or lack of sound, is a huge part of space travel. So the hard part is when you have, for example, the lunar rover chase on the moon, and you play it completely silent; it’s disarming and different and eerie, which was very interesting at first.

But then we started to explore how we could make this sound more realistic or find a way to amplify the action beats through sound. One way was, when things were hitting him or things were vibrating off of his suit, he could feel the impacts and he could hear the vibrations of different things going on.

Axelrad: It was very much part of our rhythm, of how we cut it together, because we knew James wanted to be as realistic as possible. We did what we could with the soundscapes that were allowable for a big studio film like this. And, as Lee mentioned, playing it from Roy’s perspective — being in the space suit with him. It was really just to get into his head and hear things how he would hear things.

Thanks to Max Richter’s beautiful score, we were able to hone the rhythms to induce a transcendental state. We had Gary Rydstrom and Tom Johnson mix the movie for us at Skywalker, and they were the ultimate creators of the balance of the rhythms of the sounds.

Did you work with music in the cut?
Axelrad: James loves to temp with classical music. In previous films, we used a lot of Puccini. In this film, there was a lot of Wagner. But Max Richter came in fairly early in the process and developed such beautiful themes, and we began to incorporate his themes. That really set the mood.

When you’re working with your composer and sound designer, you feed off each other. So things that they would do would inspire us, and we would change the edits. I always tell the composers when I work with them, “Hey, if you come up with something, and you think musically it’s very powerful, let me know, and I am more than willing to pitch changing the edit to accommodate.” Max’s music editor, Katrina Schiller, worked in-house with us and was hugely helpful, since Max worked out of London.

We tend not to want to cut with music because initially you want the edit not to have music as a Band-Aid to cover up a problem. But once we feel the picture is working, and the rhythm is going, sometimes the music will just fit perfectly, even as temp music. And if the rhythms match up to what we’re doing, then we know that we’ve done it right.

What is next for the two of you?
Axelrad: I’m working on a lower-budget movie right now, a Lionsgate feature film. The title is under wraps, but it stars Janelle Monáe, and it’s kind of a socio-political thriller.

What about you Lee?
Haugen: I jumped onto another film as well. It’s an independent film starring Zoe Saldana. It’s called Keyhole Garden, and it’s this very intimate drama that takes place on the border between Mexico and America. So it’s a very timely story to tell.


Amy Leland is a film director and editor. Her short film, Echoes, is now available on Amazon Video. She also has a feature documentary in post, a feature screenplay in development, and a new doc in pre-production. She is an editor for CBS Sports Network and recently edited the feature “Sundown.” You can follow Amy on Twitter at @amy-leland and on Instagram at @la_directora.

Charlieuniformtango names company vets as new partners

Charlieuniformtango principal/CEO Lola Lott has named three of the full-service studio’s most veteran artists as new partners — editors Deedle LaCour and James Rayburn, and Flame artist Joey Waldrip. This is the first time in the company’s almost 25-year history that the partnership has expanded. All three will continue with their current jobs but have received the expanded titles of senior editor/partner and senior Flame artist/partner, respectively. Lott, who retains majority ownership of Charlieuniformtango, will remain principal/CEO, and Jack Waldrip will remain senior editor/co-owner.

“Deedle, Joey and James came to me and Jack with a solid business plan about buying into the company with their futures in mind,” explains Lott. “All have been with Charlieuniformtango almost from the beginning: Deedle for 20 years, Joey for 19 years and James for 18. Jack and I were very impressed and touched that they were interested and willing to come to us with funding and plans for continuing and growing their futures with us.”

So why now, after all these years? “Now is the right time because, while Jack and I still have a passion for this business, we also have employees/talent — who have been with us for over 18 years — with a passion to be partners in this company,” says Lott. “While still young, they have invested and built their careers within the Tango culture and have the client bonds, maturity and understanding of the business to be able to take Tango to a greater level for the next 20 years. That was Jack’s and my dream, and they came to us at the perfect time.”

Charlieuniformtango is a full-service creative studio that produces, directs, shoots, edits, mixes, animates and provides motion graphics, color grading, visual effects and finishing for commercials, short films, full-length feature films, documentaries, music videos and digital content.

Main Image: (L-R) Joey Waldrip, James Rayburn, Jack Waldrip, Lola Lott and Deedle LaCour

HPA Awards name 2019 creative nominees

The HPA Awards Committee has announced the nominees in the creative categories for the 2019 HPA Awards. The HPA Awards honor outstanding achievement and artistic excellence by the individuals and teams who help bring stories to life. Launched in 2006, the awards recognize outstanding work in color grading, editing, sound and visual effects for episodic content, spots and feature films.

The winners of the 14th Annual HPA Awards will be announced at a gala ceremony on November 21 at the Skirball Cultural Center in Los Angeles.

The 2019 HPA Awards Creative Category nominees are:

Outstanding Color Grading – Theatrical Feature

-“First Man”

Natasha Leonnet // Efilm

-“Roma”

Steven J. Scott // Technicolor

-“Green Book”

Walter Volpatto // FotoKem

-“The Nutcracker and the Four Realms”

Tom Poole // Company 3

-“Us”

Michael Hatzer // Technicolor

-“Spider-Man: Into the Spider-Verse”

Natasha Leonnet // Efilm


Outstanding Color Grading – Episodic or Non-theatrical Feature

-“The Handmaid’s Tale – Liars”

Bill Ferwerda // Deluxe Toronto

-“The Marvelous Mrs. Maisel – Vote for Kennedy, Vote for Kennedy”

Steven Bodner // Light Iron

-“Game of Thrones – Winterfell”

Joe Finley // Sim, Los Angeles

-“I am the Night – Pilot”

Stefan Sonnenfeld // Company 3

-“Gotham – Legend of the Dark Knight: The Trial of Jim Gordon”

Paul Westerbeck // Picture Shop

-“The Man in the High Castle – Jahr Null”

Roy Vasich // Technicolor


Outstanding Color Grading – Commercial  

-Zara – “Woman Campaign Spring Summer 2019”

Tim Masick // Company 3

-Tiffany & Co. – “Believe in Dreams: A Tiffany Holiday”

James Tillett // Moving Picture Company

-Hennessy X.O. – “The Seven Worlds”

Stephen Nakamura // Company 3

-Palms Casino – “Unstatus Quo”

Ricky Gausis // Moving Picture Company

-Audi – “Cashew”

Tom Poole // Company 3


Outstanding Editing – Theatrical Feature

-“Once Upon a Time… in Hollywood”

Fred Raskin, ACE

-“Green Book”

Patrick J. Don Vito, ACE

-“Rolling Thunder Revue: A Bob Dylan Story by Martin Scorsese”

David Tedeschi, Damian Rodriguez

-“The Other Side of the Wind”

Orson Welles, Bob Murawski, ACE

-“A Star Is Born”

Jay Cassidy, ACE


Outstanding Editing – Episodic or Non-theatrical Feature (30 Minutes and Under)

-“Russian Doll – The Way Out”

Todd Downing

-“Homecoming – Redwood”

Rosanne Tan, ACE

-“Veep – Pledge”

Roger Nygard, ACE

-“Withorwithout”

Jake Shaver, Shannon Albrink // Therapy Studios

-“Russian Doll – Ariadne”

Laura Weinberg


Outstanding Editing – Episodic or Non-theatrical Feature (Over 30 Minutes)

-“Stranger Things – Chapter Eight: The Battle of Starcourt”

Dean Zimmerman, ACE, Katheryn Naranjo

-“Chernobyl – Vichnaya Pamyat”

Simon Smith, Jinx Godfrey // Sister Pictures

-“Game of Thrones – The Iron Throne”

Katie Weiland, ACE

-“Game of Thrones – The Long Night”

Tim Porter, ACE

-“The Bodyguard – Episode One”

Steve Singleton


Outstanding Sound – Theatrical Feature

-“Godzilla: King of the Monsters”

Tim LeBlanc, Tom Ozanich, MPSE // Warner Bros.

Erik Aadahl, MPSE, Nancy Nugent, MPSE, Jason W. Jennings // E Squared

-“Shazam!”

Michael Keller, Kevin O’Connell // Warner Bros.

Bill R. Dean, MPSE, Erick Ocampo, Kelly Oxford, MPSE // Technicolor

-“Smallfoot”

Michael Babcock, David E. Fluhr, CAS, Jeff Sawyer, Chris Diebold, Harrison Meyle // Warner Bros.

-“Roma”

Skip Lievsay, Sergio Diaz, Craig Henighan, Carlos Honc, Ruy Garcia, MPSE, Caleb Townsend

-“Aquaman”

Tim LeBlanc // Warner Bros.

Peter Brown, Joe Dzuban, Stephen P. Robinson, MPSE, Eliot Connors, MPSE // Formosa Group


Outstanding Sound – Episodic or Non-theatrical Feature

-“Chernobyl – 1:23:45”

Stefan Henrix, Stuart Hilliker, Joe Beal, Michael Maroussas, Harry Barnes // Boom Post

-“Deadwood: The Movie”

John W. Cook II, Bill Freesh, Mandell Winter, MPSE, Daniel Coleman, MPSE, Ben Cook, MPSE, Micha Liberman // NBC Universal

-“Game of Thrones – The Bells”

Tim Kimmel, MPSE, Onnalee Blank, CAS, Mathew Waters, CAS, Paula Fairfield, David Klotz

-“The Haunting of Hill House – Two Storms”

Trevor Gates, MPSE, Jason Dotts, Jonathan Wales, Paul Knox, Walter Spencer // Formosa Group

-“Homecoming – Protocol”

John W. Cook II, Bill Freesh, Kevin Buchholz, Jeff A. Pitts, Ben Zales, Polly McKinnon // NBC Universal


Outstanding Sound – Commercial 

-John Lewis & Partners – “Bohemian Rhapsody”

Mark Hills, Anthony Moore // Factory

-Audi – “Life”

Doobie White // Therapy Studios

-Leonard Cheshire Disability – “Together Unstoppable”

Mark Hills // Factory

-New York Times – “The Truth Is Worth It: Fearlessness”

Aaron Reynolds // Wave Studios NY

-John Lewis & Partners – “The Boy and the Piano”

Anthony Moore // Factory


Outstanding Visual Effects – Theatrical Feature

-“Avengers: Endgame”

Matt Aitken, Marvyn Young, Sidney Kombo-Kintombo, Sean Walker, David Conley // Weta Digital

-“Spider-Man: Far From Home”

Alexis Wajsbrot, Sylvain Degrotte, Nathan McConnel, Stephen Kennedy, Jonathan Opgenhaffen // Framestore

-“The Lion King”

Robert Legato

Andrew R. Jones

Adam Valdez, Elliot Newman, Audrey Ferrara // MPC Film

Tom Peitzman // T&C Productions

-“Alita: Battle Angel”

Eric Saindon, Michael Cozens, Dejan Momcilovic, Mark Haenga, Kevin Sherwood // Weta Digital

-“Pokemon Detective Pikachu”

Jonathan Fawkner, Carlos Monzon, Gavin Mckenzie, Fabio Zangla, Dale Newton // Framestore


Outstanding Visual Effects – Episodic (Under 13 Episodes) or Non-theatrical Feature

-“Game of Thrones – The Long Night”

Martin Hill, Nicky Muir, Mike Perry, Mark Richardson, Darren Christie // Weta Digital

-“The Umbrella Academy – The White Violin”

Everett Burrell, Misato Shinohara, Chris White, Jeff Campbell, Sebastien Bergeron

-“The Man in the High Castle – Jahr Null”

Lawson Deming, Cory Jamieson, Casi Blume, Nick Chamberlain, William Parker, Saber Jlassi, Chris Parks // Barnstorm VFX

-“Chernobyl – 1:23:45”

Lindsay McFarlane

Max Dennison, Clare Cheetham, Steven Godfrey, Luke Letkey // DNEG

-“Game of Thrones – The Bells”

Steve Kullback, Joe Bauer, Ted Rae

Mohsen Mousavi // Scanline

Thomas Schelesny // Image Engine


Outstanding Visual Effects – Episodic (Over 13 Episodes)

-“Hawaii Five-O – Ke iho mai nei ko luna”

Thomas Connors, Anthony Davis, Chad Schott, Gary Lopez, Adam Avitabile // Picture Shop

-“9-1-1 – 7.1”

Jon Massey, Tony Pizadeh, Brigitte Bourque, Gavin Whelan, Kwon Choi // FuseFX

-“Star Trek: Discovery – Such Sweet Sorrow Part 2”

Jason Zimmerman, Ante Dekovic, Aleksandra Kochoska, Charles Collyer, Alexander Wood // CBS Television Studios

-“The Flash – King Shark vs. Gorilla Grodd”

Armen V. Kevorkian, Joshua Spivack, Andranik Taranyan, Shirak Agresta, Jason Shulman // Encore VFX

-“The Orville – Identity: Part II”

Tommy Tran, Kevin Lingenfelser, Joseph Vincent Pike // FuseFX

Brandon Fayette, Brooke Noska // Twentieth Century Fox TV


In addition to the nominations announced today, the HPA Awards will present a small number of special awards. Visual effects supervisor and creative Robert Legato (The Lion King, The Aviator, Hugo, Harry Potter and the Sorcerer’s Stone, Titanic, Avatar) will receive the HPA Award for Lifetime Achievement.

Winners of the Engineering Excellence Award include Adobe, Epic Games, Pixelworks, Portrait Displays Inc. and LG Electronics. The recipient of the Judges Award for Creativity and Engineering, a juried honor, will be announced in the coming weeks. All awards will be bestowed at the HPA Awards gala.

For more information or to buy tickets to the 2019 HPA Awards, click here.



Uppercut ups Tyler Horton to editor

After spending two years as an assistant at New York-based editorial house Uppercut, Tyler Horton has been promoted to editor. This is the first internal talent promotion for Uppercut.

Horton first joined Uppercut in 2017 after a stint as an assistant editor at Whitehouse Post. Since stepping up as editor, he’s cut notable projects, such as the recent Nike campaign “Letters to Heroes,” a series launched in conjunction with the US Open that highlights young athletes meeting their role models, including Serena Williams and Naomi Osaka. He has also cut campaigns for brands such as Asics, Hypebeast, Volvo and MoMA.

“From the beginning, Uppercut was always intentionally a boutique studio that embraced a collaboration of visions and styles — never just a one-person shop,” says Uppercut EP Julia Williams. “Tyler took initiative from day one to be as hands-on as possible with every project, and we’ve been proud to see him really grow and refine his own voice.”

Horton’s love of film was sparked by watching sports reels and highlight videos. He went on to study film editing, then hit the road to tour with his band for four years before returning to his passion for film.

Behind the Title: Chapeau CD Lauren Mayer-Beug

This creative director loves the ideation process at the start of a project, when anything is possible, as well as saving some of those ideas for future use.

COMPANY: LA’s Chapeau Studios

CAN YOU DESCRIBE YOUR COMPANY?
Chapeau fluidly provides visual effects, editorial, design, photography and story development, backed by experience in web development and software and app engineering.

WHAT’S YOUR JOB TITLE?
Creative Director

WHAT DOES THAT ENTAIL?
It often entails seeing a job through from start to finish. I look at it like making a painting or a sculpture.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
Perhaps just how hands-on the process actually is. And how analog I am, considering we work in such a tech-driven environment.

Beats

WHAT’S YOUR FAVORITE PART OF THE JOB?
Thinking. I’m always thinking big picture to small details. I love the ideation process at the start of a project when anything is possible. Saving some of those ideas for future use, learning about what you want to do through that process. I always learn more about myself through every ideation session.

WHAT’S YOUR LEAST FAVORITE?
Letting go of the details that didn’t get addressed. Not everything is going to be perfect, and since it’s a learning process, there is inevitably something that will catch your eye.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
My mind goes to so many buckets. A published children’s book author with a kick-ass coffee shop. A coffee bean buyer so I could travel the world.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I always skewed in this direction. My thinking has always been in the mindset of idea coaxer and gatherer. I was put in that position in my mid-20s and realized I liked it (with lots to learn, of course), and I’ve run with it ever since.

IS THERE A PROJECT YOU ARE MOST PROUD OF?
That’s hard to say. Every project is really so different. A lot of what I’m most proud of is behind the scenes… the process that will go into what I see as bigger things. With Chapeau, I will always love the Facebook projects, all the pieces that came together — both on the engineering side and the fun creative elements.

Facebook

What I’m most excited about is our future stuff. There’s a ton on the sticky board that we aim to accomplish in the very near future. Thinking about how much is actually being set in motion is mind-blowing, humbling and — dare I say — makes me outright giddy. That is why I’m here, to tell these new stories — stories that take part in forming the new landscape of narrative.

WHAT TOOLS DO YOU USE DAY TO DAY?
Anything Adobe. My most effective tool is the good-old pen to paper. That works clearly in conveying ideas and working out the knots.

WHERE DO YOU FIND INSPIRATION?
I’m always looking for inspiration and find it everywhere, as many other creatives do. However, nature is where I’ve always found my greatest inspiration. I’m constantly taking photos of interesting moments to save for later. Oftentimes I will refer back to those moments in my work. When I need a reset I hike, run or bike. Movement helps.

I’m always going outside to look at how the light interacts with the environment. Something I’ve become known for at work is going out of my way to see a sunset (or sunrise). They know me to be the first one on the roof for a particularly enchanting magic hour. I’m always staring at the clouds — the subtle color combinations and my fascination with how colors look the way they do only by context. All that said, I often have my nose in a graphic design book.

The overall mood that emerges from gathering and creating the ever-popular Pinterest board is so helpful. Seeing the mood, color-wise and texturally, never gets old. Suddenly, you have a fully formed example of where your mind is at, something you could never have talked your way through.

Then, of course, there are people. People/peers and what they are capable of will always amaze me.

An editor’s recap of EditFestLA

By Barry Goch

In late August, I attended my first American Cinema Editors’ EditFest on the Disney lot, and I didn’t know what to expect. However, I was very happy indeed to have spent the day learning from top-notch editors discussing our craft.

Joshua Miller from C&I Studios

The day started with a presentation by Joshua Miller from C&I Studios on DaVinci Resolve. Over the past few releases, Blackmagic has added many new editor-specific and -requested features.

The first panel, “From the Cutting Room to the Red Carpet: ACE Award Nominees Discuss Their Esteemed Work,” was moderated by Margot Nack, senior manager at Adobe. The panel included Heather Capps (Portlandia); Nena Erb, ACE (Insecure); Robert Fisher, ACE (Spider-Man: Into the Spider-Verse); Eric Kissack (The Good Place) and Cindy Mollo, ACE (Ozark). As in film school, we would watch a scene, and then its editor would break it down and discuss their choices. For example, we watched a very dramatic scene from Ozark, and then Mollo described how she amplified a real baby’s crying with sound design to layer on more tension. She also had the music in the scene start at a precise moment to guide the viewer’s emotional state.

The second panel, “Reality vs. Scripted Editing: Demystifying the Difference,” was moderated by Avid’s Matt Feury and featured panelists Maura Corey, ACE (Good Girls, America’s Got Talent); Tom Costantino, ACE (The Orville, Intervention); Jamie Nelsen, ACE (Black-ish, Project Runway) and Molly Shock, ACE (Naked and Afraid, RuPaul’s Drag Race All Stars). The consensus of the panel was that an editor can create stories from reality or from a script. The panel also noted that an editor can be quickly pigeonholed by their credits — it’s often hard to look past the credits and discover the person. However, it’s way more important to be able to “gel” with an editor as a person, since the creative is going to spend many hours with the editor. As with the previous panel, we were also treated to short clips and behind-the-scenes discussions. For example, Shock told of how she crafted a dramatic scene of an improvised shelter getting washed away during a flood in the middle of a jungle at night — all while the participants were completely naked.

Joe Walker, ACE, and Bobbie O’Steen

The next panel was “Inside the Cutting Room with Bobbie O’Steen: A Conversation with Joe Walker, ACE.” O’Steen, who authored “The Invisible Cut” and “Cut to the Chase,” led Walker, whose credits include Widows, Blade Runner 2049, Arrival, Sicario and 12 Years a Slave, in a wide-ranging conversation about his career, enlivened with clips from his films. In what could be called “evolution of a scene,” Walker broke down the casino lounge scene in Blade Runner 2049, from previs to dailies, and then talked about how the VFX evolved during the edit and how he shaped the scene to final.

The final panel, “The Lean Forward Moment: A Tribute to Norman Hollyn, ACE,” was moderated by Alan Heim, ACE, president of the Motion Picture Editors Guild, and featured assistant editor Ashley Alizor; Reine-Claire Dousarkissian, associate professor of the practice of cinematic arts at USC; editor Saira Haider (Creed II); and Thomas G. Miller, ACE, professor of the practice of cinema arts at USC.

I had the pleasure of interviewing Norm for postPerspective, and he was the kind of man you meet once and never forget — a kind and giving spirit who we lost too soon. The panelists each had a story about how wonderful Norm was, and they honored him by sharing a favorite scene with the audience and explaining how it impacted them through his teaching. Norm’s colleague at USC, Dousarkissian, chose a scene from the 1952 noir film Sudden Fear, with Jack Palance and Joan Crawford. It’s amazing how much tension can be created by a simple wind-up toy.

I thoroughly enjoyed my experience at EditFest. So often we see VFX breakdowns, which are amazing things, but to see and hear how scenes and story beats are crafted by the best in the business was a treat. I’m already looking forward to attending next year.


Barry Goch is a finishing artist at LA’s The Foundation, as well as a UCLA Extension instructor in post production. You can follow him on Twitter at @Gochya.

Behind the Title: Bindery editor Matt Dunne

Name: Matt Dunne

Company: Bindery

Can you describe your company?
Bindery is an indie film and content studio based in NYC. We model ourselves after independent film studios, tackling every phase of a project from concept all the way through finishing. Our work varies from branded web content and national broadcast commercials to shorts and feature films.

What’s your job title?
Senior Editor

What does that entail?
I’m part of all things post at Bindery. I get involved early on in projects to help ensure we have a workflow set up, and if I’m the editor I’ll often get a chance to work with the director on conceptualizing the piece. When I get to go on set I’m able to become the hub of the production side. I’ll work with the director and DP to make sure the image is what they want, and I’ll start assembling the edit as they are shooting.

Most of my time is spent in an edit suite with a director and clients working through their concept and really bringing their story to life. An advantage of working with Bindery is that I’m able to sit and work with directors before they shoot and sometimes even before a concept is locked. There’s a level of trust that’s developed and we get to work through ideas and plan for anything that may come up later on during the post process. Even though post is the last stage of a film project, it needs to be involved in the beginning. I’m a big believer in that. From the early stages to the very end, I get to touch a lot of projects.

What would surprise people the most about what falls under that title?
I’m a huge tech nerd and gear head, so with the help of two other colleagues I help maintain the post infrastructure of Bindery. When we expanded the office we had to rewire everything and I recently helped put a new server together. That’s something I never imagined myself doing.

Editors also become a sounding board for creatives. I think it’s partially because we are good listeners and partially because we have couches in our suites. People like to come in and riff an idea or work through something out loud, even if you aren’t the editor on that project. I think half of being a good editor is just being able to listen.

What’s your favorite part of the job?
Working in an open environment that nurtures ideas and creativity. I love working with people that want to push their product and encourage one another to do the same. It’s really special getting to play a role in it all.

What’s your least favorite?
I think anything that takes me away from the editing process. Any sort of hardware or software issue will completely kill your momentum and at times it can be difficult to get that back.

What’s your most productive time of the day?
Early in the morning. I’m usually walking around the post department checking the stations, double checking processes that took place overnight or maintaining the server. Opposite that I’ve always felt very productive late at night. If I’m not actively editing in the office, then I’m usually rolling the footage back in my head that I screened during the day to try and piece it together away from the computer.

If you didn’t have this job, what would you be doing instead?
I would be running a dog sanctuary for senior and abused dogs.

How early on did you know this would be your path?
I first fell in love with post production when I was a kid. It was when Jurassic Park was in theaters and Fox would run these amazing behind-the-scenes specials. There was this incredible in-depth coverage of how things in the film industry are done. I was too young to see the movie, but I remember just devouring the content. That’s when I knew I wanted to be part of that scene.

Neurotica

Can you name some recent projects you have worked on?
I recently got to help finish a pilot for a series we released called Neurotica. We were lucky enough to premiere it at Tribeca this past season, and getting to see that on the big screen with the people who helped make it was a real thrill for me.

I also just finished cutting a JBL spot where we built soundscapes for Yankees player Aaron Judge and captured him as he listened and was taken on a journey through his career, past and present. The original concept was a bit different than the final deliverable, but because of the way it was shot we were able to re-conceptualize the piece in the edit. There was a lot of room to play and experiment with that one.

Do you put on a different hat when cutting for a specific genre? Can you elaborate?
Absolutely. With every job there comes a different approach and tools you need to use. If I’m cutting something more narrative focused I’ll make sure I have the script notes up, break my project out by scene and spend a lot of time auditioning different takes to make a scene work. Docu-style is a different approach entirely.

I’ll spend more time prepping that by location or subject and then break that down further. There’s even more back and forth when cutting doc. On a scripted project you have an idea of what the story flow is, but when you’re tasked with finding the edit you’re very much jumping around the story as it evolves. Whether it’s comedy, music or any type of genre, I’m always getting a chance to flex a different editing muscle.

1800 Tequila

What is the project you are most proud of?
There are a few, but one of my favorite collaborative experiences was when we worked with Billboard and 1800 Tequila to create a branded documentary series following Christian Scott aTunde Adjuah. It was five episodes shot in New York, Philadelphia and New Orleans, and the edit was happening simultaneously with production.

As the crew traveled and mapped out their days, I was able to screen footage, assemble and collaborate with the director on ideas that we thought could really enhance the piece. I was on the phone with him when they went back to NOLA for the last shoot and we were writing story beats that we needed to gather to make Episode 1 and 2 work more seamlessly now that the story had evolved. Being able to rework sections of earlier episodes before we were wrapped with production was an amazing opportunity.

What do you use to edit?
Software-wise I’m all in on the Adobe Creative Suite. I’ve been meaning to learn Resolve a bit more since I’ve been spending more and more time with it as a powerful tool in our workflow.

What is your favorite plugin?
Neat Video is a denoiser that’s really incredible. I’ve been able to work with low-light footage that would otherwise be unusable.

Are you often asked to do more than edit? If so, what else are you asked to do?
Since Bindery is involved in every stage of the process, I get this great opportunity to work with audio designers and colorists to see the project all the way through. I love learning by watching other people work.

Name three pieces of technology you can’t live without.
My phone. I think that’s a given at this point. A great pair of headphones, and a really comfortable chair that lets me recline as far back as possible for those really demanding edits.

What do you do to de-stress from it all?
I met my wife back in college and we’ve been best friends ever since, so spending any amount of time with her helps to wash away the stress. We also just bought our first house in February, so there are plenty of projects for me to focus all of my stress into.

Review: Dell’s Precision T5820 workstation

By Brady Betzel

Multimedia creators are looking for faster, more robust computer systems, and computing power is increasing across all brands and products. Whether it’s an iMac Pro with a built-in 5K screen or a Windows-based, Nvidia-powered PC workstation, there are many options to consider. Many of today’s content creation apps are operating-system-agnostic, but that’s not necessarily true of hardware — mainly GPUs. So for those looking at purchasing a new system, I am going to run through one of Dell’s Windows-based offerings: the Dell Precision T5820 workstation.

The most important distinction between a “standard” computer system and a workstation is the enterprise-level quality and durability of internal parts. While you might build or order a custom-built system for less money, you will most likely not get the same back-end assurances that “workstations” bring to the party. Workstations aren’t always the fastest, but they are built with zero downtime and hardware/software functionality in mind. So while non-workstations might use high-quality components, like an Nvidia RTX 2080 Ti (a phenomenal graphics card), they aren’t necessarily meant to run 24 hours a day, 365 days a year. On the other hand, the Nvidia Quadro series GPUs are enterprise-level graphics cards that are meant to run constantly with low failure rates. This is just one example, but I think you get the point: Workstations run constantly and are warrantied against breakdowns — typically.

Dell Precision T5820
Dell has a long track record of building everyday computer systems that work. Even more impressive are its next-level workstation computers that not only stand up to constant use and abuse but are also certified with independent software vendors (ISVs). ISV is a designation that suggests Dell has not only tested but supports the end-user’s primary software choices. For instance, in the nonlinear editing software space I found out that Dell had tested the Precision T5820 workstation with Adobe Premiere Pro 13.x in Windows 10 and has certified that the AMD Radeon Pro WX 2100 and 3100 GPUs with 18.Q3.1 drivers are approved.

You can see for yourself here. Dell also has driver suggestions from some recent versions of Avid Media Composer, as well as other software packages. That being said, Dell not only tests but will support hardware configurations in the approved software apps.

Beyond the ISV certifications and the included three-year hardware warranty with on-site/in-home service after remote diagnostics, how does the Dell Precision T5820 perform? Well, it’s fast and well-built.

The specs are as follows:
– Intel Xeon W-2155 3.3GHz, 4.5GHz Turbo, 10-core, 13.75MB cache with hyperthreading
– Windows 10 Pro for Workstations (four cores plus) — this is an additional cost
– Precision 5820 Tower with 950W chassis
– Nvidia Quadro P4000, 8GB, four DisplayPorts (5820T)
– 64GB (8x8GB) 2666MHz DDR4 RDIMM ECC memory
– Intel vPro technology enabled
– Dell Ultra-Speed Drive Duo PCIe SSD x8 Card, 1 M.2 512GB PCIe NVMe class 50 Solid State Drive (boot drive)
– 3.5-inch 2TB 7200rpm SATA hard drive (secondary drive)
– Wireless keyboard and mouse
– 1Gb network interface card
– USB 3.1 G2 PCIe card (two Type C ports, one DisplayPort)
– Three years hardware warranty with onsite/in-home service after remote diagnosis

All of this costs around $5,200 without tax or shipping and not including any sale prices.

The Dell Precision T5820 is the mid-level workstation offering from Dell that finds the balance between affordability, performance and reliability — kind of the “better, cheaper, faster” concept. It is one of the quietest Dell workstations I have tested. Besides the spinning hard drive that was included on the model I was sent, there aren’t many loud cards or fans that distract me when I turn on the system. Dell is touting the new multichannel thermal design for advanced cooling and acoustics.

The actual 5820 case is about the size of a mid-sized tower system but feels much slimmer. I even cracked open the case to tinker around with the internal components. The inside fans and multichannel cooling are sturdy and even a little hard to remove without some force — not necessarily a bad thing. You can tell that Dell made it so that when something fails, it is a relatively simple replacement. The insides are very modular. The front of the 5820 has an optical drive, some USB ports (including two USB-C ports) and an audio port. If you get fancy, you can order the systems with what Dell calls “Flex Bays” in the front. You can potentially add up to six 2.5-inch or five 3.5-inch drives and front-accessible storage of up to four M.2 or U.2 PCIe NVMe SSDs. The best part about the front Flex Bays is that, if you choose to use M.2 or U.2 media, they are hot-swappable. This is great for editing projects that you want to archive to an M.2 or save to your Blackmagic DaVinci Resolve cache and remove later.

In the back of the workstation, you get audio in/out, one serial port, PS/2, Ethernet and six USB 3.1 Gen 1 Type A ports. This particular system was outfitted with an optional USB 3.1 Gen 2 10Gb/s Type C card with one DisplayPort passthrough. This is used for the Dell UltraSharp 32-inch 4K (UHD) USB-C monitor that I received along with the T5820.

The large Dell UltraSharp 32-inch monitor (U3219Q) offers a slim footprint and a USB-C connection that is very intriguing, but they aren’t giving them away. They cost $879.99 if ordered through Dell.com. With the ultra-minimal Infinity Edge bezel, 400 nits of brightness for HDR content, up to UHD (3840×2160) resolution, 60Hz refresh rate and multiple input/output connections, you can see all of your work in one large IPS panel. For those of you who want to run two computers off one monitor, this Dell UltraSharp has a built-in KVM switch function. Anyone with a MacBook Pro featuring USB-C/Thunderbolt 3 ports can in theory use one USB-C cable to connect and charge. I say “in theory” only because I don’t have a new MacBook Pro to test it on. But for PCs, you can still use the USB-C as a hub.

The monitor comes equipped with a DisplayPort 1.4, HDMI, four USB 3.0 Type A ports and a USB-C port. Because I use my workstation mainly for video and photo editing, I am always concerned with proper calibration. Dell rates the U3219Q at 99% sRGB, 95% DCI-P3 and 99% Rec. 709 coverage, so if you are using Resolve and outputting through a DeckLink, you will be able to get some decent accuracy and even use it for HDR. Over the years, I have really fallen in love with Dell monitors. They don’t break the bank, and they deliver crisp and accurate images, so there is a lot to love. Check out more of this monitor here.

Performance
Working in media creation I jump around between a bunch of apps and plugins, from Media Composer to Blackmagic’s DaVinci Resolve and even from Adobe After Effects to Maxon’s Cinema 4D. So I need a system that can not only handle CPU-focused apps like After Effects but GPU-weighted apps like Resolve. With the Intel Xeon and Nvidia Quadro components, this system should work just fine. I ran some tests in Premiere Pro, After Effects and Resolve. In fact, I used Puget Systems’ benchmarking tool with Premiere and After Effects projects. You can find one for Premiere here. In addition, I used the classic 3D benchmark Cinebench R20 from Maxon, and even did some of my own benchmarks.

In Premiere, I was able to play 4K H.264 (50Mb/s and 100Mb/s 10-bit) and ProRes files (HQ and 4444) in realtime at full resolution. Red raw 4K played back at full-quality debayer. But as the Puget Systems Premiere benchmark shows, 8K (as well as heavily effected clips) started to bog the system down. With 4K, the addition of Lumetri color correction slowed down playback and export a little bit — just a few frames under realtime, though it was close. At half quality I was essentially playing in realtime. According to the Puget Systems benchmark, the overall CPU score was much higher than the GPU score. Adobe relies heavily on single-core processing; while certain effects, like resizes and blurs, will open up the GPU pipes, I saw the CPU (single-core) kicking in here.

In the Premiere Pro tests, the T5820 really shined when working with mezzanine codecs like ProRes (HQ and 4444) and even with Red 4K raw media. It seemed to slow down when multiple layers of effects, such as color correction and blurs, were stacked on top of each other.

In After Effects, I again used Puget Systems’ benchmark — this time the After Effects-specific version. Overall, the After Effects score was a B or B-, which isn’t terrible considering it was up against the prosumer powerhouse Nvidia RTX 2080 (Puget Systems used the 2080-based reference system as the 100% score). Tracking on the Dell T5820 scored around 90%, while the render and preview scores were around 80%. While this is just what it says — a benchmark — it’s a great way to compare machines against that reference configuration: an Intel i9, an RTX 2080 GPU, 64GB of memory and more.

In Resolve 16 Beta 7, I ran multiple tests on the same 4K (UHD), 29.97fps Red Raw media that Puget Systems used in its benchmarks. I created four 10-minute sequences:
Sequence 1: no effects or LUTs
Sequence 2: three layers of Resolve OpenFX Gaussian blurs on adjustment layers in the Edit tab
Sequence 3: five serial nodes of Blur Radius (at 1.0) created in the Color tab
Sequence 4: in the Color tab, spatial noise reduction was set at 25 radius to medium, blur set to 1.0 and sharpening in the Blur tab set to zero (it starts at 0.5).

Sequence 1, without any effects, would play at full debayer quality in realtime and export at a few frames above realtime, averaging about 33fps. Sequence 2, with Resolve’s OpenFX Gaussian blur applied three times to the entire frame via adjustment layers in the Edit tab, would play back in realtime and export at between 21.5fps and 22.5fps. Sequence 3, with five serial nodes of blur radius set at 1.0 in the Color tab, would play in realtime and export at about 23fps. Once I added a sixth serial blur node, the system would no longer lock onto realtime playback. Sequence 4 — with spatial noise reduction set at 25 radius to medium, blur set to 1.0 and sharpening in the Blur tab set to zero in the Color tab — would play back at 1fps to 2fps and export at 6.5fps.
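To put those frame rates in context, here is a small, purely illustrative Python sketch (not part of any benchmark tool) that converts the export speeds above into "times realtime" figures and estimated export times for the 10-minute, 29.97fps UHD timelines used in these tests.

```python
# Illustrative arithmetic only: converts the review's export frame rates into
# "times realtime" ratios and estimated export times for a 10-minute UHD
# timeline at 29.97fps. The fps values are taken from the tests above.
TIMELINE_FPS = 29.97
TIMELINE_MINUTES = 10
total_frames = TIMELINE_FPS * TIMELINE_MINUTES * 60

export_fps = {
    "Sequence 1 (no effects)": 33.0,
    "Sequence 2 (3x OpenFX Gaussian blur)": 22.0,  # midpoint of 21.5-22.5fps
    "Sequence 3 (5 serial blur nodes)": 23.0,
    "Sequence 4 (spatial noise reduction)": 6.5,
}

for name, fps in export_fps.items():
    ratio = fps / TIMELINE_FPS              # above 1.0 means faster than realtime
    minutes = total_frames / fps / 60       # estimated export time in minutes
    print(f"{name}: {ratio:.2f}x realtime, ~{minutes:.1f} min to export")
```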

All of these exports were QuickTime-based H.264s exported using the Nvidia encoder (the native encoder would slow things down by 10 frames or so). The settings were UHD resolution; “automatic — best” quality; disabled frame reordering; force sizing to highest quality; force debayer to highest quality; and no audio. Once I stacked two layers of raw Red 4K media, I started to drop below realtime playback, even without color correction or effects. I even tried to play back some 8K media: I got about 14fps at full-res premium debayer, 14fps to 16fps at half-res premium, 25fps at half-res good, and 29.97fps (realtime) at quarter-res good.

Using the recently upgraded Maxon Cinebench R20 benchmark, I found the workstation performing adequately, around the fourth-place spot. Keep in mind, there are thousands of possible result combinations depending on CPU, GPU, memory and more; these are only sample results that 3D artists can compare against their own systems. The Cinebench R20 results were CPU: 4682, CPU (single-core): 436 and MP ratio: 10.73x. If you Google or check out some threads of Cinebench R20 result comparisons, you will eventually find results to measure mine against; I’d call my results a B to B+. A much higher-end Intel Xeon or i9 or an AMD Threadripper processor would really punch this system up a weight class.
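For readers new to Cinebench, the MP (multiprocessor) ratio is simply the multi-core score divided by the single-core score, a rough indication of how well the 10-core Xeon scales. A quick, purely illustrative check of the numbers above:

```python
# The Cinebench MP ratio is the multi-core score divided by the single-core
# score. Using the scores reported above:
cpu_multi, cpu_single = 4682, 436
print(f"MP ratio: {cpu_multi / cpu_single:.2f}x")  # ~10.74x, in line with the reported 10.73x
```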

Summing Up
The Dell Precision T5820 workstation comes with a lot of enterprise-level benefits that simply don’t come with your average consumer system. The components are meant to be run constantly, and Dell has tested its systems against current industry applications using the hardware in these systems to identify the best optimizations and driver packages with these ISVs. Should anything fail, Dell’s three-year warranty (which can be upgraded) will get you up and running fast. Before taxes and shipping, the Dell T5820 I was sent for review would retail for just under $5,200 (maybe even a little more with the DVD drive, recovery USB drive, keyboard and mouse). This is definitely not the system to look at if you are a DIYer or an everyday user who does not need to be running 24 hours a day, seven days a week.

But in a corporate environment, where time is money and no one wants to be searching for answers, the Dell T5820 workstation with accompanying three-year ProSupport with next-day on-site service will be worth the $5,200. Furthermore, it’s invaluable that optimization with applications such as the Adobe Creative Suite is built-in, and Dell’s ProSupport team has direct experience working in those professional apps.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

 

Fred Raskin talks editing and Once Upon a Time… in Hollywood

By Amy Leland

Once Upon a Time… in Hollywood is marketed in a style similar to its predecessors — “the ninth film from Quentin Tarantino.” It is also the third film with Fred Raskin, ACE, as Tarantino’s editor. Having previously edited Django Unchained and The Hateful Eight, as well as working as assistant editor on the Kill Bill films, Raskin has had the opportunity to collaborate with a filmmaker who has always made it clear how much he values collaboration.

On top of this remarkable director/editor relationship, Raskin has also lent his editing hand to a slew of other incredibly popular films, including three entries in the Fast & Furious saga and both Guardians of the Galaxy films. I had the chance to talk with him about his start, his transition to editor and his work on Once Upon a Time… in Hollywood. A tribute to Hollywood’s golden age, the film stars Brad Pitt as the stunt double for a faded actor, played by Leonardo DiCaprio, as they try to find work in a changing industry.

Fred Raskin

How did you get your start as an editor?
I went to film school at NYU to become a director, but I had this realization about midway through that I might not get a directing gig immediately upon graduation, so perhaps I should focus on a craft. Editing was always my favorite part of the process, and I think that of all the crafts, it’s the closest to directing. You’re crafting performances, you’re figuring out how you’re going to tell the story visually… and you can do all of this from the comfort of an air-conditioned room.

I told all of my friends in school, if you need an editor for your projects, please consider me. While continuing to make my own stuff, I also cut my friends’ projects. Maybe a month after I graduated, a friend of mine got a job as an assistant location manager on a low-budget movie shooting in New York. He said, “Hey, they need an apprentice editor on this movie. There’s no pay, but it’s probably good experience. Are you interested?” I said, “Sure.” The editor and I got along really well. He asked me if I was going to move out to LA, because that’s really where the work is. He then said, “When you get out to LA, one of my closest friends in the world is Rob Reiner’s editor, Bob Leighton. I’ll introduce the two of you.”

So that’s what I did, and this kind of ties into Once Upon a Time… in Hollywood, because when I made the move to LA, I called Bob Leighton, who invited me to lunch with his two assistants, Alan Bell and Danny Miller. We met at Musso & Frank. So the first meeting that I had was at this classic, old Hollywood restaurant. Cut to 23 years later, and I’m on the set of a movie that’s shooting at Musso & Frank. It’s a scene between Al Pacino and Leonardo DiCaprio, arguably the two greatest actors of their generations, and I’m editing it. I thought back to that meeting, and actually got kind of emotional.

So Bob’s assistants introduced me to people. That led to an internship, which led to a paying apprentice gig, which led to me getting into the union. I then spent nine years as an assistant editor before working my way up to editor.

When you were starting out, were there any particular filmmakers or editors who influenced the types of stories you wanted to tell?
Growing up, I was a big genre guy. I read Fangoria magazine and gravitated to horror, action and sci-fi. Those were the kinds of movies I made when I was in film school. So when I got out to LA, Bob Leighton got a pretty good sense as to what my tastes were, and he gave me the numbers of a couple of friends of his, Mark Goldblatt and Mark Helfrich, who are huge action/sci-fi editors. I spoke with them, and that was just a real thrill because I was so familiar with their work. Now we are all colleagues, and I pinch myself regularly.

 You have edited many action and VFX films. Has that presented particular challenges to your way of working as an editor?
The challenges, honestly, are more ones of time management because when you’re on a big visual effects movie, at a certain point in the schedule you’re spending two to four hours a day watching visual effects. Then you have to make adjustments to the edit to accommodate for how things look when the finished visual effects come in. It’s extremely time-consuming, and when you’re not only dealing with visual effects, but also making changes to the movie, you have to figure out a way to find time for all of this.

Every project has its own specific set of challenges. Yes, the big Marvel movies have a ton of visual effects, and you want to make sure that they look good. The upside is that Marvel has a lot of money, so when you want to experiment with a new visual effect or something, they’re usually able to support your ideas. You can come up with a concept while you’re sitting behind the Avid and actually get to see it become a reality. It’s very exciting.

Let’s talk about the world of Tarantino. A big part of his legacy was his longtime collaboration with editor Sally Menke, who tragically passed away. How were you then brought in? I’m assuming it has something to do with your assistant editor credit on Kill Bill?
Yes. I assisted Sally for seven years. There were a couple of movies that we worked on together, and then she brought me in for the Kill Bill movies. And that’s when I met Quentin. She taught me how an editing room is supposed to work. When she finished a scene, she would bring me and the other assistants into the room and get our thoughts. It was a welcoming, family-like environment, which I think Quentin really leaned into as well.

While he’s shooting, Quentin doesn’t come into the editing room. He comes in during post, but during production, he’s really focused on shooting the movie. On Kill Bill, I didn’t meet him until a few weeks after the shoot ended. He started coming in, and whenever he and Sally worked on a scene together, they would bring us in and get our thoughts. I learned pretty quickly that the more feedback you’re able to give, the more appreciated it will be. Quentin has said that at least part of the reason why he went with me on Django Unchained was because I was so open with my comments. Also, as the whole world knows, Quentin is a huge movie lover. We frequently would find ourselves talking about movies. He’d be walking through the hall, and we’d just strike up a conversation, and so I think he saw in me a kindred spirit. He really kept me in the family after Kill Bill.

I got my first big editing break right after Kill Bill ended. I cut a movie called Annapolis, which Justin Lin directed. I was no longer on Quentin’s crew, but we still crossed paths a lot. Over the years we’d just bump into each other at the New Beverly Cinema, the revival house that he now owns. We’d talk about whatever we’d seen lately. So he always kept me in mind. When he and Sally finished the rough cuts on Death Proof and Inglourious Basterds, he invited me to come to their small friends-and-family screenings, which was a tremendous honor.

On Django, you were working with a director who had the same collaborator in Sally Menke for such a long time. What was it like in those early days working on Django?
It was without question the most daunting experience that I have gone through in Hollywood. We’re talking about an incredibly talented editor, Sally, whose shoes I had to attempt to fill, and a filmmaker for whom I had the utmost respect.

Some of the western town stuff was shot at movie ranches just outside of LA, and we would do dailies screenings in a trailer there. I made sure that I sat near him with a list of screening notes. I really just took note of where he laughed. That was the most important thing. Whatever he laughed at, it meant that this was something that he liked. There was a PA on set when they went to New Orleans. I stayed in LA, but I asked her to write down where he laughed.

I’m a fan of his. When I went to see Reservoir Dogs, I remember walking out of the theater and thinking, “Well, that’s like the most exciting filmmaker that I’ve seen in quite some time.” Now I’m getting the chance to work with him. And I’ll say because of my fandom, I have a pretty good sense as to his style and his sense of humor. I think that that all helped me when I was in the process of putting the scenes together on Django. I was very confident in my work when I started showing him stuff on that movie.

Now, seven years later, you are on your third film with him. Have you found a different kind of rhythm working with him than you had on that first film?
I would say that a couple of little things have changed. I personally have gained some confidence in how I approach stuff with him. If there was something that I wasn’t sure was working, or that maybe I felt was extraneous, in Django, I might have had some hesitation about expressing it because I wouldn’t want to offend him. But now both of us are coming from the perspective of just wanting to make the best movie that we possibly can. I’m definitely more open than I might have been back then.

Once Upon a Time… in Hollywood has an interesting blend of styles and genres. The thing that stands out is that it is a period piece. Beyond that, you have the movies and TV shows within the movie that give you additional styles. And there is a “horror movie” scene.
Right, the Spahn Ranch sequence.

That was so creepy! I really had that feeling the whole time of, “They can’t possibly kill off Brad Pitt’s character this early, can they?”
That’s the idea. That’s what you’re supposed to be feeling.

When you are working with all of those overlapping styles, do you have to approach the work a different way?
The style of the films within the film was influenced by the movies of the era to some degree. There wasn’t anything stylistically that had us trying to make the movie itself feel like a movie from 1969. For example, Leonardo DiCaprio’s character, Rick Dalton, is playing the heavy on a western TV show called Lancer in the movie. Quentin referred to the Lancer stuff as, “Lancer is my third western, after Django and The Hateful Eight.” He didn’t direct that show as though it was a TV western from the late ’60s. He directed it like it was a Quentin Tarantino western from 2019. Quentin’s style is really all his own.

There are no rules when you’re working on a Quentin Tarantino movie because he knows everything that’s come before, and he is all about pushing the boundaries of what you can do — which is both tremendously exciting and a little scary, like is this going to work for everyone? The idea that we have a narrator who appears once in the first 10 minutes of the movie and then doesn’t appear again until the last 40 minutes, is that something that’s going to throw people off? His feeling is like, yeah, there are going to be some people out there who are going to feel that it’s weird, but they’re also going to understand it. That’s the most important thing. He’s a firm believer in doing whatever we need to do to tell the story as clearly and as concisely as possible. That voiceover narration serves that purpose. Weird or not.

You said before that he doesn’t come into the edit during production. What is your work process during production? Are you beginning the rough cut? And if so, are you sending him things, or are you really not collaborating with him on that process at all until post begins?
This movie was shot in LA, so for the first half of the shoot, we would do regular dailies screenings. I’d sit next to him and write down whatever he laughed at. That process that began on Django has continued. Then I’ll take those notes. Then I assemble the material as we’re shooting, but I don’t show him any of it. I’m not sending him cuts. He doesn’t want to see cuts. I don’t think he wants the distractions of needing to focus on editing.

On this movie, there were only two occasions when he did come into the editing room during production. The movie takes place over the course of three days, and at the end of the second day, the characters are watching Rick on the TV show The F.B.I., which was a real show and that episode was called “All the Streets Are Silent.” The character of Michael Murtaugh was played in the original episode by a young Burt Reynolds. They found a location that matched pretty perfectly and reshot only the shots that had Burt Reynolds in them. They reshot with Leonardo DiCaprio, as Rick Dalton, playing that character. He had to come into the editing room to see how it played and how it matched, and it matched remarkably well. I think that people watching the movie probably assume that Quentin shot the whole thing, or that we used some CG technology to get Leo into the shots. But no, they just figured out exactly the shots that they needed to shoot, and that was all the new material. The rest was from the original episode.

 The other time he came into the edit during production was the sequence in which Bruce Lee and Cliff have their fight. The whole dialogue scene that opens that sequence, it all plays out in one long take. So he was very excited to see how that shot played out. But one of the things that we had spoken about over the course of working together is when you do a long take, the most important thing is what that cut is going to be at the end of the long take. How can we make that cut the most impactful? In this case, the cut is to Cliff throwing Bruce Lee into the car. He wanted to watch the whole scene play out, and then see how that cut worked. When I showed it to him, I had my finger on the stop button so that after that cut, I would stop it so he wouldn’t see anything more and wouldn’t get tempted to get sucked into maybe giving notes. I reached to stop, but he was like, “No, no, no let it play out.” He watched the fight scene, and he was like, “That’s fantastic.” He was very happy.

Once you were in post, what were some of the particular challenges of this film?
One of the really important things is how integral sound was to the process of making this movie. First there were the movies and shows within the movie. When we’re watching the scenes from Bounty Law, the ‘50s Western that Rick starred in, it wasn’t just about the 4×3, black and white photography, but also how we treated the sound. Our sound editorial team and our sound mixing team did an amazing job of getting that stuff to sound like a 16-millimeter print. Like, they put just the right amount of warble into the dialogue, and it makes it feel very authentic. Also, all the Bounty Law stuff is mono, not this wide stereo thing that would not be appropriate for the material from that era.

And I mentioned the Spahn Ranch sequence, when for 20 minutes the movie turns into an all-out horror movie. One of Quentin’s rules for me when I’m putting my assembly together is that he generally does not want me cutting with music. He frequently has specific ideas in his head about what the music is going to be, and he doesn’t want to see something that’s not the way he imagined it. That’s going to take him out of it, and he won’t be able to enjoy the sequence.

When I was putting the Spahn Ranch sequence together, I knew that I had to make it suspenseful without having music to help me. So, I turned to our sound editors, Wylie Stateman and Leo Marcil, and said, “I want this to sound like The Texas Chain Saw Massacre, like I want to have low tones and creaking wood and metal wronks. Let’s just feel the sense of dread through this sequence.” They really came through.

And what ended up happening is, I don’t know if Quentin’s intention originally was to play it without music, but ultimately all the music in the scene comes from what Dakota Fanning’s character, Squeaky, is watching on the TV. Everything else is just sound effects, which were then mixed into the movie so beautifully by Mike and Chris Minkler. There’s just a terrific sense of dread to that sequence, and I credit the sound effects as much as I do the photography.

This film was cut on Avid. Have you always cut on Avid? Do you ever cut on anything else?
When I was in film school, I cut on film. In fact, I took the very first Avid class that NYU offered. That was my junior year, which was long before there were such things as film options or anything. It was really just kind of the basics, a basic Avid Media Composer.

I’ve worked on Final Cut Pro a few times. That’s really the only other nonlinear digital editing system that I’ve used. I’ve never actually used Premiere.

At this point my whole sound effects and music library is Avid-based, and I’m just used to using the Avid. I have a keyboard where all of my keys are mapped, and I find, at this point, that it’s very intuitive for me. I like working with it.

This movie was shot on film, and we printed dailies from the negative. But the negative was also scanned at 4K, and then those 4K scans were down-converted to DNxHD 115, an HD-resolution codec on the Avid. So we were editing in HD, and we could do screenings from that material when we needed to. But we would also do screenings on film.

Wow, so even with your rough cuts, you were turning them around to film cuts again?
Yeah. Once production ended, and Quentin came into the editing room, when we refined a scene to his liking, I would immediately turn that over to my Avid assistant, Chris Tonick. He would generate lists from that cut and would turn it over to our film assistants, Bill Fletcher and Andrew Blustain. They would conform the film print to match the edit that we had in the Avid so that we were capable of screening the movie on film whenever we wanted to. There was always going to be a one- or two-day lag time, depending on when we finished cutting on the Avid. But we were able to get it up there pretty quickly.

Sometimes if you have something like opticals or titles, you wouldn’t be able to generate those for film quickly enough. So if we wanted to screen something immediately, we would have to do it digitally. But as long as we had a couple of days, we would be able to put it up on film, and we did end up doing one of our test screenings on 35 millimeter, which was really great. It added one more layer of authenticity to the movie, getting to see it projected on film.

For a project of this scope, how many assistants do you work with, and how do you like to work with those assistants?
Our team consists of post production supervisor Tina Anderson, who really oversees everything. She runs the editing room. She figures out what we’re going to need. She’s got this long list of items that she goes down every day, and makes sure that we are prepared for whatever is going to come our way. She’s really remarkable.

My first assistant Chris Tonick is the Avid assistant. He cut a handful of scenes during production, and I would occasionally ask him to do some sound work. But primarily during production, he was getting the dailies prepped — getting them into the Avid for me and laying out my bins the way I like them.

In post, we added an Avid second named Brit DeLillo, who would help Chris when we needed to do turnovers for sound or visual effects, music, all of those people.

Then we had our film crew, Bill Fletcher and Andrew Blustain. They were syncing dailies during production, and then they were conforming the film print during post.

Last, but certainly not least, we had Alana Feldman, our post PA, who made sure we had everything we needed.

And honestly, for everybody on the crew, their most important role beyond the work that they were hired to do, was to be an audience member for us whenever we finished a scene. That tradition I experienced as an assistant working under Sally is the tradition that we’ve continued. Whenever we finish a sequence, we bring the whole crew up and show them the scene. We want people to react. We want to hear how they’re responding. We want to know what’s working and what isn’t working. Being good audience members is actually a key part of the job.

L-R: Quentin Tarantino, post supervisor Tina Anderson, first assistant editor (Film) Bill Fletcher, Fred Raskin, 2nd assistant editor (Film) Andrew Blustain, 2nd assistant editor (Avid) Brit DeLillo, post assistant Alana Feldman, producer Shannon McIntosh, 1st assistant editor (Avid) Chris Tonick, assistant to producer Ryan Jaeger and producer David Heyman

When you’re looking for somebody to join your team as an assistant, what are you looking for?
There are a few things. One obvious thing, right off the bat, is someone who is personable. Is this someone I’m going to want to have lunch with every day for months on end? Generally, especially working on a Quentin Tarantino movie, somebody with a good knowledge of film history who has a love of movies is going to be appreciated in that environment.

The other thing that I would say honestly  — and this might sound funny — is having the ability to see the future. And I don’t mean that I need psychic film assistants. I mean they need to be able to figure out what we’re going to need later on down the line and be prepared for it.

If I turn over a sequence, they should be looking at it and realizing, oh, there are some visual effects in here that we’re going to have to address, so we have to alert the visual effects companies about this stuff, or at least ask me if it’s something that I want.

If there were somebody who thought to themselves, “I want a career like Fred Raskin’s. I want to edit these kinds of cool films,” what advice would you give them as they’re starting out?
I have three standard pieces of advice that I give to everyone. My experience, I think, is fairly unique. I’ve been incredibly fortunate to get to work with some of my favorite filmmakers. The way my story unfolded … not everybody is going to have the opportunities I’ve had.

But my standard pieces of advice are, number one — and I mentioned this earlier — be personable. You’re working with people you’re going to share space with for many months on end. You want to be the kind of person with whom they’re going to want to spend time. You want to be able to get along with everyone around you. And you know, sometimes you’ve got some big personalities to deal with, so you have to be the type who can navigate that.

Then I would say, watch everything you possibly can. Quentin is obviously an extreme example, but most filmmakers got into this business because they love movies. And so the more you know about movies, and the more you’re able to talk about movies, the more those filmmakers are going to respect you and want to work with you. This kind of goes hand in hand with being personable.

The other piece of advice — and I know this sounds like a no-brainer — if you’re going for an interview with a filmmaker, make sure you’ve familiarized yourself with that person’s work. Be able to talk with them about their movies. They’re going to appreciate that you took the time to explore their work. Everybody wants to talk about the work they’ve done, so if you’re able to engage them on that level, I think it’s going to reflect well on you.

Absolutely. That’s great advice.


Amy Leland is a film director and editor. Her short film, “Echoes”, is now available on Amazon Video. She also has a feature documentary in post, a feature screenplay in development, and a new doc in pre-production. She is an editor for CBS Sports Network and recently edited the feature “Sundown.” You can follow Amy on social media on Twitter at @amy-leland and Instagram at @la_directora.

Harbor expands to LA and London, grows in NY

New York-based Harbor has expanded into Los Angeles and London and has added staff and locations in New York. Industry veteran Russ Robertson joins Harbor’s new Los Angeles operation as EVP of sales, features and episodic after a 20-year career with Deluxe and Panavision. Commercial director James Corless and operations director Thom Berryman will spearhead Harbor’s new UK presence following careers with Pinewood Studios, where they supported clients such as Disney, Netflix, Paramount, Sony, Marvel and Lucasfilm.

Harbor’s LA-based talent pool includes color grading from Yvan Lucas, Elodie Ichter, Katie Jordan and Billy Hobson. Some of the team’s projects include Once Upon a Time … in Hollywood, The Irishman, The Hunger Games, The Maze Runner, Maleficent, The Wolf of Wall Street, Snow White and the Huntsman and Rise of the Planet of the Apes.

Paul O’Shea, formerly of MPC Los Angeles, heads the visual effects teams, tapping lead CG artist Yuichiro Yamashita for 3D out of Harbor’s Santa Monica facility and 2D creative director Q Choi out of Harbor’s New York office. The VFX artists have worked with brands such as Nike, McDonald’s, Coke, Adidas and Samsung.

Harbor’s Los Angeles studio supports five grading theaters for feature film, episodic and commercial productions, offering private connectivity to Harbor NY and Harbor UK, with realtime color-grading sessions, VFX reviews and options to conform and final-deliver in any location.

The new UK operation, based out of London and Windsor, will offer in-lab and near-set dailies services along with automated VFX pulls and delivery through Harbor’s Anchor system. The UK locations will draw from Harbor’s US talent pool.

Meanwhile, the New York operation has grown its talent roster and Soho footprint to six locations, with a recently expanded offering for creative advertising. Veteran artists on the commercial team include editors Bruce Ashley and Paul Kelly, VFX supervisor Andrew Granelli, colorist Adrian Seery, and sound mixers Mark Turrigiano and Steve Perski.

Harbor’s feature and episodic offering continues to expand, with NYC-based artists available in Los Angeles and London.

Digital Arts expands team, adds Nutmeg Creative talent

Digital Arts, an independently owned New York-based post house, has added several former Nutmeg Creative talent and production staff members to its roster — senior producer Lauren Boyle, sound designer/mixers Brian Beatrice and Frank Verderosa, colorist Gary Scarpulla, finishing editor/technical engineer Mark Spano and director of production Brian Donnelly.

“Growth of talent, technology, and services has always been part of the long-term strategy for Digital Arts, and we’re fortunate to welcome some extraordinary new talent to our staff,” says Digital Arts owner Axel Ericson. “Whether it’s long-form content for film and television, or working with today’s leading agencies and brands creating dynamic content, we have the talent and technology to make all of our clients’ work engaging, and our enhanced services bring their creative vision to fruition.”

Brian Donnelly, Lauren Boyle and Mark Spano.

As part of this expansion, Digital Arts will unveil additional infrastructure featuring an ADR stage/mix room. The current facility boasts several state-of-the-art audio suites, a 4K finishing theater/mixing dubstage, four color/finishing suites and expansive editorial and production space, which is spread over four floors.

The former Nutmeg team has hit the ground running, working with their longtime ad agency, network, animation and film studio clients. Gary Scarpulla worked on color for HBO’s Veep and Los Espookys, while Frank Verderosa has been working with agency Ogilvy on several Ikea campaigns. Beatrice mixed spots for Tom Ford’s cosmetics line.

In addition, Digital Arts’ in-house theater/mixing stage has proven to be a valuable resource for some of the most popular TV productions, including recent commentary sessions for the legendary HBO series Game of Thrones and the final season of Veep.

Especially noteworthy is colorist Ericson’s and finishing editor Mark Spano’s collaboration with Oscar-winning directors Karim Amer and Jehane Noujaim to bring to fruition the Netflix documentary The Great Hack.

Digital Arts also recently expanded its offerings to include production services. The company has already delivered projects for agencies Area 23, FCB Health and TCA.

“Digital Arts’ existing infrastructure was ideally suited to leverage itself into end-to-end production,” Donnelly says. “Now we can deliver from shoot to post.”

Tools employed across post include Avid Pro Tools with D-Control ES and S3 surfaces for audio, and Avid Media Composer, Adobe Premiere Pro and Blackmagic DaVinci Resolve for editing. Color grading is handled in Resolve.

Main Image: (L-R) Frank Verderosa, Brian Beatrice and Gary Scarpulla

 

Blackmagic: Resolve 16.1 in public beta, updates Pocket Cinema Camera

Blackmagic Design has announced DaVinci Resolve 16.1, an updated version of its edit, color, visual effects and audio post software that features updates to the new cut page, further speeding up the editing process.

With Resolve 16, introduced at NAB 2019, now in final release, the Resolve 16.1 public beta is available for download from the Blackmagic Design website. This new public beta will help Blackmagic continue to develop new ideas while collaborating with users to ensure those ideas are refined for real-world workflows.

The Resolve 16.1 public beta features changes to the bin that now make it possible to place media in various folders and isolate clips from being used when viewing them in the source tape, sync bin or sync window. Clips will appear in all folders below the current level, and as users navigate around the levels in the bin, the source tape will reconfigure in real time. There’s even a menu for directly selecting folders in a user’s project.
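For editors who organize large projects this way, Resolve also exposes a Python scripting API that can build the same bin-folder structure programmatically. The following is only a minimal sketch, assuming scripting is enabled in Preferences and the DaVinciResolveScript module is on the Python path; the folder names and file paths are hypothetical.

```python
# A minimal sketch: build per-day bin folders and import clips into them,
# the kind of structure the cut page's source tape then mirrors.
# Folder names and media paths below are hypothetical.
import DaVinciResolveScript as dvr_script

resolve = dvr_script.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()
media_pool = project.GetMediaPool()
root = media_pool.GetRootFolder()

# Create one sub-folder per shoot day and import that day's clips into it.
shoot_days = {
    "Day 01": ["/media/day01/A001_C001.mov"],
    "Day 02": ["/media/day02/B001_C001.mov"],
}
for day, clips in shoot_days.items():
    folder = media_pool.AddSubFolder(root, day)
    media_pool.SetCurrentFolder(folder)
    media_pool.ImportMedia(clips)
```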

Also new in this public beta is the smart indicator. The new cut page in DaVinci Resolve 16 introduced multiple new smart features, which work by estimating where the editor wants to add an edit or transition and then applying it without the editor having to waste time placing exact in and out points. The software guesses what the editor wants to do and just does it — it adds the insert edit or transition to the edit closest to where the editor has placed the CTI.

But a problem can arise in complex edits, where it is hard to know what the software would do and which edit it would place the effect or clip into. That’s the reason for the beta version’s new smart indicator. The smart indicator provides a small marker in the timeline so users get constant feedback and always know where DaVinci Resolve 16.1 will place edits and transitions. The new smart indicator constantly live-updates as the editor moves around the timeline.

One of the most common items requested by users was a faster way to cut clips in the timeline, so now DaVinci Resolve 16.1 includes a “cut clip” icon in the user interface. Clicking on it will slice the clips in the timeline at the CTI point.

Multiple changes have also been made to the new DaVinci Resolve Editor Keyboard, including a new adaptive scroll feature on the search dial, which automatically slows down scrolling when editors are hunting for an in point. The live trimming buttons have been renamed to match the labels of the functions on the edit page; they are now trim in, trim out, transition duration, slip in and slip out. The function keys along the top of the keyboard are now used for various editing functions.

There are additional edit modes on the function keys, allowing users to access more types of edits directly from dedicated keys on the keyboard. There’s also a new transition window on the F4 key; pressing and rotating the search dial allows instant selection from all the transition types in DaVinci Resolve. Users who need quick picture-in-picture effects can use F5 to apply them instantly.

Sometimes when editing projects with tight deadlines, there is little time to keep replaying the edit to see where it drags. DaVinci Resolve 16.1 features something called a Boring Detector that highlights the timeline where any shot is too long and might be boring for viewers. The Boring Detector can also show jump cuts, where shots are too short. This tool allows editors to reconsider their edits and make changes. The Boring Detector is helpful when using the source tape. In that case, editors can perform many edits without playing the timeline, so the Boring Detector serves as an alternative live source of feedback.

Another one of the most requested features of DaVinci Resolve 16.1 is the new sync bin. The sync bin is a digital assistant editor that constantly sorts through thousands of clips to find only what the editor needs and then displays them synced to the point in the timeline the editor is on. The sync bin will show the clips from all cameras on a shoot stacked by camera number. Also, the viewer transforms into a multi-viewer so users can see their options for clips that sync to the shot in the timeline. The sync bin uses date and timecode to find and sync clips, and by using metadata and locking cameras to time of day, users can save time in the edit.
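The matching logic is easy to picture. Below is a toy Python sketch (not Blackmagic's implementation) of the core idea: given clips tagged with a camera number, a start timecode and a duration, find every clip that covers the frame the editor is parked on and stack the results by camera number. The frame rate, clip data and timecodes are made up for illustration.

```python
# Toy illustration of the sync-bin idea: match clips to the timeline position
# by timecode overlap, then stack the hits by camera number.
from dataclasses import dataclass
from typing import List

FPS = 24  # assumed project frame rate for this example

def tc_to_frames(tc: str, fps: int = FPS) -> int:
    """Convert an HH:MM:SS:FF timecode string to an absolute frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

@dataclass
class Clip:
    camera: int
    start_tc: str
    duration_frames: int

def clips_at(position_tc: str, clips: List[Clip]) -> List[Clip]:
    """Return every clip whose timecode range covers the given position."""
    pos = tc_to_frames(position_tc)
    hits = [c for c in clips
            if tc_to_frames(c.start_tc) <= pos < tc_to_frames(c.start_tc) + c.duration_frames]
    return sorted(hits, key=lambda c: c.camera)  # stacked by camera number

cameras = [Clip(1, "10:00:00:00", 2400), Clip(2, "10:00:10:00", 2400), Clip(3, "09:59:50:00", 1200)]
for clip in clips_at("10:00:15:00", cameras):
    print(f"Camera {clip.camera}, starting at {clip.start_tc}")
```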

According to Blackmagic, the sync bin changes how multi-camera editing can be completed. Editors can scroll off the end of the timeline and keep adding shots. When using the DaVinci Resolve Editor Keyboard, editors can hold the camera number and rotate the search dial to “live overwrite” the clip into the timeline, making editing faster.

The closeup edit feature has been enhanced in DaVinci Resolve 16.1. It now does face detection and analysis and will zoom the shot based on face positioning to ensure the person is nicely framed.

If pros are using shots from cameras without timecode, the new sync window lets them sort and sync clips from multiple cameras. The sync window supports sync by timecode and can also detect audio and sync clips by sound. These clips will display a sync icon in the media pool so editors can tell which clips are synced and ready for use. Manually syncing clips using the new sync window allows workflows such as multiple action cameras to use new features such as source overwrite editing and the new sync bin.

Blackmagic Pocket Cinema Camera
Besides releasing the DaVinci Resolve 16.1 public beta, Blackmagic also updated the Blackmagic Pocket Cinema Camera. Blackmagic not only upgraded the camera from 4K to 6K resolution but also changed the mount to the widely used Canon EF style. Previous iterations of the Pocket Cinema Camera used a Micro Four Thirds mount, but many users chose to purchase a Micro Four Thirds-to-Canon EF adapter, which easily runs over $500 new. Because of the mount change in the Pocket Cinema Camera 6K, users can avoid buying the adapter and — if they shoot with Canon EF — can use the same lenses.

Speed controls now available in Premiere Rush V.1.2

Adobe has added a new panel in Premiere Rush called Speed, which allows users to manipulate the speed of their footage while maintaining control over the audio pitch, range, ramp speed and duration of the edited clip. Adobe’s Premiere Rush team says speed control has been the feature most requested by users.

Basic speed adjustments: A clip’s speed is displayed as a percentage value, with 100% being realtime. Values below 100% result in slow motion, and values above 100% create fast motion. To adjust the speed, users simply open the speed panel, select “Range Speed” and drag the slider. Or they can tap on the speed percentage next to the slider and enter a specific value.

Speed ranges: Speed ranges allow users to adjust the speed within a specific section of a clip. To create a range, users drag the blue handles on the clip in the timeline or in the speed panel under “Range.” The speed outside the range is 100%, while speed inside the range is adjustable.

Ramping: Rush’s adjustable speed ramps make it possible to progressively speed up or slow down into or out of a range. Ramping helps smooth out speed changes that might otherwise seem jarring.

Duration adjustments: For precise control, users can manually set a clip’s duration. After setting the duration, Rush will do the math and adjust the clip speed to the appropriate value — a feature that is especially useful for time lapses.
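That math is straightforward to illustrate. Here is a small, purely hypothetical Python sketch (not Rush's actual code) showing how a target duration maps to a playback speed percentage.

```python
# Illustrative only: the arithmetic behind a duration-based speed adjustment.
# The playback speed is the ratio of original duration to target duration,
# expressed as a percentage (100% = realtime).
def speed_for_duration(original_seconds: float, target_seconds: float) -> float:
    return original_seconds / target_seconds * 100.0

print(speed_for_duration(40.0, 10.0))  # 400.0 -> a 40-second source squeezed into 10 seconds
print(speed_for_duration(5.0, 10.0))   # 50.0  -> a 5-second clip stretched to 10 seconds (slow motion)
```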

Maintain Pitch: Typically, speeding up footage will raise the audio’s pitch (think mouse voice), while slowing down footage will lower it (think deep robot voice). Maintain Pitch in the speed panel takes care of the problem by preserving the original pitch of the audio at any speed.
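The pitch side of this is also easy to quantify. The sketch below (again illustrative, not Adobe's implementation) shows the semitone shift a naive speed change would cause if pitch were not preserved, which is exactly what Maintain Pitch corrects for.

```python
# Illustrative only: how uncorrected speed changes shift pitch. Doubling the
# speed raises pitch by an octave (12 semitones); halving it drops an octave.
# Maintain Pitch time-stretches the audio so this shift does not happen.
import math

def pitch_shift_semitones(speed_percent: float) -> float:
    return 12.0 * math.log2(speed_percent / 100.0)

print(pitch_shift_semitones(200.0))  # +12.0 semitones (the "mouse voice")
print(pitch_shift_semitones(50.0))   # -12.0 semitones (the "deep robot voice")
```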

As with everything in Rush, speed adjustments will transfer seamlessly when opening a Rush project in Premiere Pro.

The Umbrella Academy‘s Emmy-nominated VFX supe Everett Burrell

By Iain Blair

If all ambitious TV shows with a ton of visual effects aspire to be cinematic, then Netflix’s The Umbrella Academy has to be the gold standard. The acclaimed sci-fi, superhero, adventure mash-up was just Emmy-nominated for its season-ending episode “The White Violin,” which showcased a full range of spectacular VFX. This included everything from the fully-CG Dr. Pogo to blowing up the moon and a mansion to the characters’ varied superpowers. Those VFX, mainly created by movie powerhouse Weta Digital in New Zealand and Spin VFX in Toronto, indeed rival anything in cinema. This is partly thanks to Netflix’s 4K pipeline.

The Umbrella Academy is based on the popular, Eisner Award-winning comics and graphic novels created and written by Gerard Way (“My Chemical Romance”), illustrated by Gabriel Bá, and published by Dark Horse Comics.

The story starts when, on the same day in 1989, 43 infants are born to unconnected women who showed no signs of pregnancy the day before. Seven are adopted by Sir Reginald Hargreeves, a billionaire industrialist, who creates The Umbrella Academy and prepares his “children” to save the world. But not everything went according to plan. In their teenage years, the family fractured and the team disbanded. Now, six of the surviving members reunite upon the news of Hargreeves’ death. Luther, Diego, Allison, Klaus, Vanya and Number Five work together to solve a mystery surrounding their father’s death. But the estranged family once again begins to come apart due to divergent personalities and abilities, not to mention the imminent threat of a global apocalypse.

The live-action series stars Ellen Page, Tom Hopper, Emmy Raver-Lampman, Robert Sheehan, David Castañeda, Aidan Gallagher, Cameron Britton and Mary J. Blige. It is produced by Universal Content Productions for Netflix. Steve Blackman (Fargo, Altered Carbon) is the executive producer and showrunner, with additional executive producers Jeff F. King, Bluegrass Television, and Mike Richardson and Keith Goldberg from Dark Horse Entertainment.

Everett Burrell

I spoke with senior visual effects supervisor and co-producer Everett Burrell (Pan’s Labyrinth, Altered Carbon), who has an Emmy for his work on Babylon 5, about creating the VFX and the 4K pipeline.

Congratulations on being nominated for the first season-ending episode “The White Violin,” which showcased so many impressive visual effects.
Thanks. We’re all really proud of the work.

Have you started season two?
Yes, and we’re already knee-deep in the shooting up in Canada. We shoot in Toronto, where we’re based, as well as Hamilton, which has this great period look. So we’re up there quite a bit. We’re just back here in LA for a couple of weeks working on editorial with Steve Blackman, the executive producer and showrunner. Our offices are in Encino, in a merchant bank building. I’m a co-producer as well, so I also deal a lot with editorial — more than normal.

Have you planned out all the VFX for the new season?
To a certain extent. We’re working on the scripts and have a good jump on them. We definitely plan to blow the first season out of the water in terms of what we come up with.

What are the biggest challenges of creating all the VFX on the show?
The big one is the sheer variety of VFX, which are all over the map in terms of the various types. They go from a completely animated talking CG chimpanzee Dr. Pogo to creating a very unusual apocalyptic world, with scenes like blowing up the moon and, of course, all the superpowers. One of the hardest things we had to do — which no one will ever know just watching it — was a ton of leaf replacement on trees.

Digital leaves via Montreal’s Folks.

When we began shooting, it was winter and there were no leaves on the trees. When we got to editorial we realized that the story spans just eight days, so it wouldn’t make any sense if in one scene we had no leaves and in the next we had leaves. So we had to add every single leaf to the trees for all of the first five episodes, which was a huge amount of work. The way we did it was to go back to all the locations and re-shoot all the trees from the same angles once they were in bloom. Then we had to composite all that in. Folks in Montreal did all of it, and it was very complicated. Lola did a lot of great work on Hargreeves, getting his young look for the early 1900s and cleaning up the hair and wrinkles and making it all look totally realistic. That was very tricky too.

Netflix is ahead of the curve thanks to its 4K policy. Tell us about the pipeline.
For a start, we shoot with the ARRI Alexa 65, which is a very robust cinema camera that was used on The Revenant. With its 65mm sensor, it’s meant for big-scope, epic movies, and we decided to go with it to give our show that great cinema look. The depth of field is like film, and it can also emulate film grain for this fantastic look. That camera shoots natively at 5K — it won’t go any lower. That means we’re at a much higher resolution than any other show out there.

And you’re right, Netflix requires a 4K master as future-proofing for streaming and so on. Those very high standards then trickle down to us and all the VFX. We also use a very unique system developed by Deluxe and Efilm called Portal, which basically stores the entire show in the cloud on a server somewhere, and we can get background plates to the vendors within 10 minutes. It’s amazing. Back in the old days, you’d have to make a request and maybe within 24 or 48 hours, you’d get those plates. So this system makes it almost instantaneous, and that’s a lifesaver.

   
Method blows up the moon.

How closely do you work with Steve Blackman and the editors?
I think Steve said it best: “There’s no daylight between the two of us.” We’re linked at the hip pretty much all the time. He comes to my office if he has issues, and I go to his if we have complications; we resolve all of it together in probably the best creative relationship I’ve ever had. He relies on me and counts on me, and I trust him completely. Bottom line, if we need to write ourselves out of a sticky situation, he’s also the head writer, so he’ll just go off and rewrite a scene to help us out.

How many VFX do you average for each show?
We average between 150 and 200 per episode. Last season we did nearly 2,000 in total, so it’s a huge amount for a TV show, and there’s a lot of data being pushed. Luckily, I have an amazing team, including my production manager Misato Shinohara. She’s just the best and really takes care of all the databases, and manages all the shot data, reference, slates and so on. All that stuff we take on set has to go into this massive database, and just maintaining that is a huge job.

Who are the main VFX vendors?
The VFX are mainly created by Weta in New Zealand and Spin VFX in Toronto. Weta did all the Pogo stuff. Then we have Folks, Lola, Marz, Deluxe Toronto, DigitalFilm Tree in LA… and then Method Studios in Vancouver did great work on our end-of-the-world apocalyptic sequence. They blew up the moon and had a chunk of it hitting the Earth, along with all the surrounding imagery. We started R&D on that pretty early to get a jump on it. We gave them storyboards and they did previz. We used that as a cut to get iterations of it all. There were a lot of particle simulations, which was pretty intense.

Weta created Dr. Pogo

What have been the most difficult VFX sequences to create?
Just dealing with Pogo is obviously very demanding, and we had to come up with a fast shortcut to dealing with the photo-real look as we just don’t have the time or budget they have for the Planet of the Apes movies. The big thing is integrating him in the room as an actor with the live actors, and that was a huge challenge. We used just two witness cameras to capture our Pogo body performer. All the apocalyptic scenes were also very challenging because of the scale, and then those leaves were very hard to do and make look real. That alone took us a couple of months. And we might have the same problem this year, as we’re shooting in the summer through fall, and I’m praying that the leaves don’t start falling before we wrap.

What have been the main advances in technology that have really helped you pull off some of the show’s VFX?
I think the rendering and the graphics cards are the big ones, and the hardware talks together much more efficiently now. Even just a few years ago, it might have taken weeks and weeks to render a Pogo. Now we can do it in a day. Weta developed new software for creating the texture and fabric of Pogo’s clothes. They also refined their hair programs.

 

I assume as co-producer that you’re very involved with the DI?
I am… and keeping track of all that and making sure we keep pushing the envelope. We do the DI at Company 3 with colorist Jill Bogdanowicz, who’s a partner in all of this. She brings so much to the show, and her work is a big part of why it looks so good. I love the DI. It’s where all the magic happens, and I get in there early with Jill and take care of the VFX tweaks. Then Steve comes in and works on contrast and color tweaks. By the time Steve gets there, we’re probably 80% of the way there already.

What can fans expect from season two?
Bigger, better visual effects. We definitely pay attention to the fans. They love the graphic novel, so we’re getting more of that into the show.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.