
Category Archives: Editing

Talking with Franki Ashiruka of Nairobi’s Africa Post Office

By Randi Altman

After two decades of editing award-winning film and television projects for media companies throughout Kenya and around the world, Franki Ashiruka opened Africa Post Office, a standalone post house in Nairobi, Kenya. The studio provides color grading, animation, visual effects, motion graphics, compositing and more. In addition, they maintain a database of the Kenyan post production community that allows them to ramp up with the right artists when the need arises.

Here she talks about the company, its workflow and being a pioneer in Nairobi’s production industry.

When did you open Africa Post Office, and what was your background prior to starting this studio?
Africa Post Office (APO) opened its doors in February 2017. Prior to starting APO, I was a freelance editor with plenty of experience working with well-established media houses such as Channel 4 (UK), Fox International Channels (UK), 3D Global Leadership (Nigeria), PBS (USA), Touchdown (New Zealand), Greenstone Pictures (New Zealand) and Shadow Films (South Africa).

In terms of Kenya-based projects, I’ve worked with a number of production houses including Quite Bright Films, Fat Rain Films, Film Crew in Africa, Mojo Productions, Multichoice, Zuku, Content House and Ginger Ink Films.

I imagine female-run, independent studios in Africa are rare?
On the contrary, Kenya has reached a point where more and more women are emerging as leaders of their own companies. I actually think there are more women-led film production companies than male-led. The real challenge was that before APO, there was nothing quite like it in Nairobi. Historically, video production here was very vertical — if you shot something, you’d need to also manage post within whatever production house you were working in. There were no standalone post houses until us. That said, with my experience, even though hugely daunting, I never thought twice about starting APO. It is what I have always wanted to do, and if being the first company of our kind didn’t intimidate me, being female was never going to be a hindrance.

L-R: Franki Ashiruka, Kevin Kyalo, Carole Kinyua and Evans Wenani

What is the production and post industry like in Nairobi? 
When APO first opened, the workload was commercial-heavy, but in the last two years that has steadily declined. We’re seeing this gap filled by documentary films, corporate work and television series. Feature films are also slowly gaining traction and becoming the focus of many up-and-coming filmmakers.

What services do you provide, and what types of projects do you work on?
APO has a proven track record of successful delivery on hundreds of film and video projects for a diverse range of clients and collaborators, including major corporate entities, NGOs, advertising and PR agencies, and television stations. We also have plenty of experience mastering according to international delivery standards. We’re proud to house a complete end-to-end post ecosystem of offline and online editing suites.

Most importantly, we maintain a very thorough database of the post production community in Kenya.
This is of great benefit to our clients who come to us for a range of services including color grading, animation, visual effects, motion graphics and compositing. We are always excited to collaborate with the right people and get additional perspectives on the job at hand. One of our most notable collaborators is Ikweta Arts (Avatar, Black Panther, Game of Thrones, Hacksaw Ridge), owned and run by Yvonne Muinde. They specialize in providing VFX services with a focus on quality matte painting/digital environments, art direction, concept and post visual development art. We also collaborate with Keyframe (L’Oréal, BMW and Mitsubishi Malaysia) for motion graphics and animations.

Can you name some recent projects and the work you provided?
We are incredibly fortunate to be able to select projects that align with our beliefs and passions.

Our work on the short film Poacher (directed by Tom Whitworth) won us three global Best Editing Awards from the Short to the Point Online Film Festival (Romania, 2018), Feel the Reel International Film Festival (Glasgow, 2018) and Five Continents International Film Festival (Venezuela, 2019).

Other notable work includes three feature documentaries for the Big Story segment on China Global Television Network, directed by Juan Reina (director of the Netflix Original film Diving Into the Unknown); Lion’s Den (Quite Bright Films), an adaptation of ABC’s Shark Tank; and The Great Kenyan Bake Off (Showstopper Media), adapted from the BBC series The Great British Bake Off. We also worked on Disconnect, a feature film produced by Kenya’s Tosh Gitonga (Nairobi Half Life), a director who is passionate about taking Africa’s budding film industry to the next level. We have also worked on a host of television commercials for clients across East Africa, including Kenya, Rwanda, South Sudan and Uganda.

What APO is most proud of though, is our clients’ ambitions and determination to contribute toward the growth of the African film industry. This truly resonates with APO’s mantra.

You recently added a MAM and some other gear. Can you talk about the need to upgrade?
Bringing on the EditShare EFS 200 nodes has significantly improved the collaborative possibilities of APO. We reached a point where we were quickly growing, and the old approach just wasn’t going to cut it.

Prior to centralizing our content, projects lived on individual hard disks. This meant that if I was editing and needed my assistant to find me a scene or a clip, or I needed VFX on something, I would have to export individual clips to different workstations. This created workflow redundancies and increased potential for versioning issues, which is something we couldn’t afford to be weighed down with.

The remote capabilities of the EditShare system were very appealing as well. Our color grading collaborator, Nic Apostoli of Comfort and Fame, is based in Cape Town, South Africa. From there, he can access the footage on the server and grade it while the client reviews with us in Nairobi. Flow media asset management also helps in this regard. We’re able to effectively organize and index clips, graphics, versions, etc. into clearly marked folders so there is no confusion about what media should be used. Collaboration among the team members is now seamless regardless of their physical location or tools used, which include the Adobe Creative Suite, Foundry Nuke, Autodesk Maya and Maxon Cinema 4D.
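To make that concrete, here is a toy sketch (my own, not EditShare Flow's actual data model) of a version-aware clip index: each clip name maps to its versions, and lookups always resolve to the latest one, which is the kind of bookkeeping that prevents the versioning mix-ups described above. All names and paths are hypothetical.

```python
# Toy sketch of a version-aware media index; EditShare Flow's real data model
# is far richer. All clip names and paths here are hypothetical.
from collections import defaultdict

index = defaultdict(list)   # clip name -> list of (version, path)

def register(clip: str, version: int, path: str) -> None:
    index[clip].append((version, path))

def latest(clip: str) -> str:
    """Return the path of the highest-numbered version of a clip."""
    return max(index[clip])[1]

register("EP01_SC04_GFX", 1, "/server/gfx/EP01_SC04_v001.mov")
register("EP01_SC04_GFX", 2, "/server/gfx/EP01_SC04_v002.mov")
print(latest("EP01_SC04_GFX"))   # -> /server/gfx/EP01_SC04_v002.mov
```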

Any advice for others looking to break out on their own and start a post house?
Know what you want to do, and just do it! Thanks Nike …


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

ACE Eddie Awards: Parasite and Jojo Rabbit among winners

By Dayna McCallum

At the 70th Annual ACE Eddie Awards, Parasite (edited by Jinmo Yang) won Best Edited Feature Film (Dramatic) and Jojo Rabbit (edited by Tom Eagles) won for Best Edited Feature Film (Comedy). The ACE Eddies recognized the best editing of 2019 in 11 categories of film, television and documentaries.  This marks the first time in ACE Eddie Awards history that a foreign language film won the top prize.

American Cinema Editors president Stephen Rivkin, ACE, presided over the evening’s festivities with actress D’Arcy Carden, star of NBC’s The Good Place, serving as the evening’s host.

Here is the list of winners:

BEST EDITED FEATURE FILM (DRAMATIC):
Parasite 
Jinmo Yang

BEST EDITED FEATURE FILM (COMEDY):
Jojo Rabbit
Tom Eagles

BEST EDITED ANIMATED FEATURE FILM:
Toy Story 4
Axel Geddes, ACE

BEST EDITED DOCUMENTARY (FEATURE):
Apollo 11
Todd Douglas Miller

BEST EDITED DOCUMENTARY (NON-THEATRICAL):
What’s My Name: Muhammad Ali
Jake Pushinsky, ACE

BEST EDITED COMEDY SERIES FOR COMMERCIAL TELEVISION:
Better Things: “Easter”
Janet Weinberg, ACE

BEST EDITED COMEDY SERIES FOR NON-COMMERCIAL TELEVISION:
Fleabag: “Episode 2.1”
Gary Dollner, ACE

BEST EDITED DRAMA SERIES FOR COMMERCIAL TELEVISION: 
Killing Eve: “Desperate Times”
Dan Crinnion

BEST EDITED DRAMA SERIES FOR NON-COMMERCIAL TELEVISION:
Game of Thrones: “The Long Night”
Tim Porter, ACE

BEST EDITED MINISERIES OR MOTION PICTURE FOR TELEVISION:
Chernobyl: “Vichnaya Pamyat”
Jinx Godfrey & Simon Smith

BEST EDITED NON-SCRIPTED SERIES:
VICE Investigates: “Amazon on Fire”
Cameron Dennis, Kelly Kendrick, Joe Matoske, Ryo Ikegami

ANNE V. COATES AWARD FOR STUDENT EDITING:
Chase Johnson – California State University, Fullerton


Main Image: Parasite editor Jinmo Yang


Directing bookend sequences for Portals, a horror anthology film

By Hasraf “HaZ” Dulull

Portals is a genre-bending feature film anthology focusing on a series of worldwide blackouts — after which millions of mysterious objects appear everywhere across the planet. While many flee from the sentient objects, some people are drawn toward and into them with horrifying consequences.

Portals

The film was in the final stages of post when writer/director Liam O’Donnell (Beyond Skyline and the upcoming Skylines film) called to see if I would like to get involved and direct some bookend sequences to add more scope and setup, which the producers felt was very much needed. I loved the premise and the world of the anthology, so I said yes. I pitched an idea for an ending that quickly evolved into an extra segment at the end of the film, which I directed. That’s why there are officially four directors on the show, with me getting executive producer and “end-segment created by” credits.

Two of the other sequences are around 20 to 25 minutes each, and O’Donnell’s sequence was around 35 minutes. The film is 85 minutes long. Eduardo Sanchez and Gregg Hale (The Blair Witch Project) co-directed their segments. So the anthology feature film is really three long segments with my bookend sequences. The only connections among all the stories are the objects that appear, the event itself and the actual “portal,” but everything else was unique to each segment’s story. In terms of production, the only consistencies throughout the anthology were the camera language — that slight hand-held feel — and, of course, the music/sound.

I had to watch the latest cut of the entire anthology film to get my head into that world, but I was given freedom to bring my own style to my sequences. That is exactly the point of an anthology — for each director to bring his or her own sensibilities to the individual segments. Besides Liam, the main producers I worked closely with on this project were Alyssa Devine and Griffin Devine from Pigrat Productions. They are fans of my first feature film, The Beyond, so they really encouraged the grounded tone I had demonstrated in that film.

The portal in Portals.

I’ve been a huge advocate of Blackmagic cameras and technology for a long time. Additionally, I knew I had a lot to shoot in a very short time span (two days!), so I needed a camera that was light and flexible yet able to shoot 4K. I brought on cinematographer Colin Emerson, who shoots in a very loose way but always makes his stuff look cinematic. We watched the cut of the film and noticed the consistent loose nature of the cinematography on all the segments. Colin uses the Fig Rig a lot, and I love the way that rig works. The BMD Pocket Cinema Camera 4K fits nicely on it, along with the DSLR lenses he likes to use. The other reason was to be able to use Blackmagic’s new BRaw format too.

We also shot the segment using a skeleton crew, which comprised me as director/producer; VFX supervisor/1st AD John Sellings, who also did some focus pulling; James De Taranto (sound recording); DP/camera op Colin Emerson; FX makeup artists Kate Griffiths and Jay James; and our two actors, Georgina Blackledge and Dare Emmanuel. I worked with both of them on my feature film The Beyond.

The Post
One thing I wanted to make sure of was that the post team at The Institution in LA could take my Resolve files and work from them directly for the picture post. One of the things I did during prep (before we even cast) was to shoot some tests to show what I had in mind in terms of look and feel. We also tested the BRaw and color workflow between my setup in London and the LA team; Colin and I did this during the location recce. This proved extremely useful in making sure we set our camera to the exact specs the post house wanted. So we shot at 23.98fps, 4K (4096×1716) cropped to 2:39, in Blackmagic Design log color space.
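As a quick sanity check on that spec (my own sketch, not part of the production's workflow), the numbers line up: a 4096×1716 frame is, to rounding, the 2.39:1 scope crop of the 4096-pixel 4K-DCI width.

```python
# Minimal sketch verifying the delivery spec quoted above; the variable names
# are mine, not from the production.
WIDTH, HEIGHT = 4096, 1716      # 4K-DCI width, 2:39-cropped height
FPS = 23.98                     # project frame rate

aspect = WIDTH / HEIGHT         # ~2.387, which rounds to the "2:39" scope ratio
print(f"{aspect:.3f}:1 at {FPS}fps")   # -> 2.387:1 at 23.98fps
```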

HaZ’s segments were captured with the Blackmagic Pocket Cinema Camera.

During the test, I did some quick color tests to show the producers in LA the tone and mood I was going for and to make sure everyone was on board before I shot it. The look was very post apocalyptic, as it’s set after the main events have happened. I wanted the locations to be a contrast with each other, one interior and one exterior with greens.

Colin is used to shooting most of his stuff on the Panasonic GH, but he had the Pocket Cinema Camera and was looking for the right project to use it on. He found he could use all of his usual lenses because the Pocket Cinema Camera has the same mount. Lenses used were the Sigma 18-35mm f/1.8 with Metabones Speedbooster, the Olympus 12mm f/2 and the Lumix 35-100mm f/2.8.

Colin used the onboard monitor screen on the Pocket Cinema Camera, while I used a tethered external monitor — the Ikan DH5e — for directing. We used a 1TB Samsung external SSD securely attached to the rig cage along with a 64GB CFast card. The resolution we shot in was determined by the tests we did. We set up the rushes for post after each of the two days of the shoot, so during the day we would swap out drives and back things up. At the end of the day, we would bring in all the picture and sound rushes and use the amazing autosync feature in Blackmagic DaVinci Resolve to set it all up. This way, when I headed back home I could start editing right away inside Resolve.
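Conceptually, auto-sync pairs each picture clip with the sound take whose timecode starts on the same frame. The sketch below illustrates only that idea; it is not Resolve's implementation or scripting API, and the clip names and helper function are hypothetical.

```python
# Hedged illustration of timecode-based auto-sync; Resolve's own matching is
# internal to the app. Clip names and the to_frames helper are hypothetical.
def to_frames(tc: str, fps: int = 24) -> int:
    """Convert 'HH:MM:SS:FF' timecode to an absolute frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

picture = {"A001_C001": "12:31:04:10", "A001_C002": "12:45:17:02"}
sound   = {"SND_0001":  "12:31:04:10", "SND_0002":  "12:45:17:02"}

# Pair each picture clip with the sound clip that starts on the same frame.
sound_by_frame = {to_frames(tc): name for name, tc in sound.items()}
pairs = {clip: sound_by_frame.get(to_frames(tc)) for clip, tc in picture.items()}
print(pairs)  # {'A001_C001': 'SND_0001', 'A001_C002': 'SND_0002'}
```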

Resolve

I have to admit, we were hesitant at first because I was shooting and capturing log in QuickTime ProRes 4444, and I always avoided DNG raw because of the huge file size and data transfer. But the team at Blackmagic has always been so supportive and provided us with support right up till the end of the shoot, so after testing BRaw I was impressed. We had so much control, as all that information is accessed within Resolve. I was able to set the temp look during editing, and the colorist worked from there. Skin tones were of utmost importance; because of the intimate nature of the drama, I wanted a natural look to the skin tones. I am really happy with the way they came out at the end.

The LA team couldn’t believe how cinematic the footage was when we told them we shot using the Pocket Cinema Camera, since the other segments were shot on cameras like Red. We delivered to the same 4K deliverables spec as the other segments in the film.

HaZ on set, second from right.

I used the AMD Radeon RX Vega 56 version of the Blackmagic eGPU. I chose it because I wanted to edit on my MacBook Pro (late 2017) and needed the power to run 4K in realtime. I was so impressed with how much power it provided; it was like having a new MacBook Pro without having to buy one. The eGPU also gave me all the connectivity I needed (two Thunderbolt and four USB 3 ports), which is a limitation of the MacBook Pro on its own.

The beauty of keeping everything native was that there wasn’t much work to do when porting, as it’s just plug and play. Resolve detects the eGPU, which you can then set as default. The BRaw format makes it all so manageable to preview and play back in real time. Also, since it’s native, Resolve doesn’t need to do any transcoding in the background. I have always been a huge fan of the tracking in Resolve, and I was able to do eye effects very easily without them being budgeted or done as VFX shots. I was able to get the VFX render assets from the visual effects artist (Justin Martinez) in LA and do quick slap comps during editing. I love the idea that I can set looks and store them as memories, which I can then recall very quickly to apply to a bunch of shots. This allows me to have a slick-looking preview rough cut of the film.

Portals

I sent a hard drive containing all the organized rushes to the team in LA while I was doing the final tweaks to the edit. Once the edit was signed off, or if any last-minute notes came in, I would address them and email my updated Resolve file. It was super simple, and the colorist (Oliver Ojeil) and post team (Chad Van Horn and Danny Barone) in LA appreciated the simple workflow because there really wasn’t any conforming for them to do apart from a one-click relink of the media location; they would just take my Resolve file and start working away with it.
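That relink step is essentially a path re-root: every clip keeps its place in the folder tree, and only the volume prefix changes. Here is a minimal sketch of the idea, with hypothetical volume names; it is not The Institution's actual pipeline.

```python
# Sketch of the one-click relink idea: clip paths stay relative to a media
# root, so moving the rushes to a new drive only swaps the root prefix.
# Both volume names below are hypothetical.
from pathlib import PurePosixPath

OLD_ROOT = PurePosixPath("/Volumes/HAZ_RUSHES")      # editor's drive in London
NEW_ROOT = PurePosixPath("/Volumes/APO_LA_MEDIA")    # hypothetical LA volume

def relink(path: str) -> str:
    """Re-root one clip path onto the new media volume."""
    rel = PurePosixPath(path).relative_to(OLD_ROOT)
    return str(NEW_ROOT / rel)

print(relink("/Volumes/HAZ_RUSHES/DAY01/A001_C001.braw"))
# -> /Volumes/APO_LA_MEDIA/DAY01/A001_C001.braw
```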

We used practical effects to keep the horror as real and grounded as possible, and used VFX to augment further. We were fortunate to get special effects makeup artist Kate Griffiths. Given the tight schedule, she was able to create a terrifying effect, which I won’t give away. You need to watch the film to see it! We had to shoot those makeup-FX-heavy shots at the end of the day, which meant we had to be smart about how we scheduled the shoot given the hours-long makeup process. Kate was also on hand to provide effects like the liquid coming out of the eyes, sweat, etc. — every detail of which the camera picked up for us so we could bring it out in the grade.

The Skype-style shots at the start of the film (phone and computer monitor shots) had their VFX screen elements placed as a separate layer so the post team in LA could grade them separately and control the filters applied on them. For some of the wide shots showing our characters entering and leaving the portal, we keyframed some movement of the 4K shot along with motion blur to give the effect of in-camera movement. I also used the camera shake within Resolve, which comes with so many options to create bespoke movement on static frames.

Portals is now available on iTunes and other VOD platforms.


HaZ Dulull is known for his sci-fi feature films The Beyond and 2036 Origin Unknown, as well as for his television work on the pilot and episodes of Disney’s Fast Layne. He is currently busy on projects at various stages of development and production at his production company, hazfilm.com.


Review: HP’s ZBook G6 mobile workstation

By Brady Betzel

In a year that’s seen AMD reveal an affordable 64-core processor with its Threadripper 3, it appears as though we are picking up steam toward next-level computing.

Apple finally released its much-anticipated Mac Pro (which comes with a hefty price tag for the 1.5TB upgrade), and custom-build workstation companies — like Boxx and Puget Systems — can customize good-looking systems to fit any need you can imagine. Additionally, over the past few months, I have seen mobile workstations leveling the playing field with their desktop counterparts.

HP is well-known in the M&E community for its powerhouse workstations. Since I started my career, I have worked on either a Mac Pro or an HP. Both have their strong points. However, for workstation users who must be able to travel with their systems, there have always been some technical abilities to give up in exchange for a smaller footprint. That is, until now.

The newly released HP ZBook 15 G6 has become the rising tide that will float all the boats in the mobile workstation market. I know I’ve said it before, but the classification of “workstation” is technically much more than a term companies throw around. Systems with workstation-level classification (at least from HP) are meant to be powered on and run at high levels 24 hours a day, seven days a week, 365 days a year.

They are built with high-quality, enterprise-level components, such as ECC (error correcting code) memory. ECC memory will self-correct errors that it sees, preventing things like blue screens of death and other screen freezes. ECC memory comes at a cost, and that is why these workstations are priced a little higher than a standard computer system. In addition, the warranties are a little more inclusive — the HP ZBook 15 G6 comes with a standard three-year/on-site service warranty.

Beyond the “workstation” classification, the ZBook 15 G6 is amazingly powerful, brutally strong and incredibly colorful and bright. But what really matters is under the hood. I was sent the HP ZBook 15 G6 that retails for $4,096 and contains the following specs:
– Intel Xeon E-2286M (eight cores/16 threads — 2.4GHz base/5GHz Turbo)
– Nvidia Quadro RTX 3000 (6GB VRAM)
– 15.6-inch UHD HP DreamColor display, anti-glare, WLED backlit, 600 nits, 100% DCI-P3
– 64GB DDR4 2667MHz
– 1TB PCIe Gen 3 x4 NVMe SSD TLC
– FHD webcam 1080p plus IR camera
– HP collaboration keyboard with dual point stick
– Fingerprint sensor
– Smart Card reader
– Intel Wi-Fi 6 AX200, 802.11ac 2×2 + BT 4.2 combo adapter (vPro)
– HP long-life battery four-cell 90 Wh
– Three-year limited warranty

The ZBook 15 G6 is a high-end mobile workstation with a price that reflects it. However, as I said earlier, true workstations are built to withstand constant use and, in this case, abuse. The ZBook 15 G6 has been designed to pass up to 21 extensive MIL-STD 810G tests, which is essentially worst-case scenario testing. For instance, drop testing of around four feet, sand and dust testing, radiation testing (the sun beating down on the laptop for an extended period) and much more.

The exterior of the G6 is made of aluminum and built to withstand abuse. The latest G6 is a little bulky/boxy, in my opinion, but I can see why it would hold up to some bumps and bruises, all while working at blazingly fast speeds, so bulk isn’t a huge issue for me. Because of that bulk, you can imagine that this isn’t the lightest laptop either. It weighs in at 5.79 pounds for the lowest end and measures 1 inch by 14.8 inches by 10.4 inches.

On the bottom of the workstation is an easy-to-access panel for performing repairs and upgrades yourself. I really like the bottom compartment. I opened it and noticed I could throw in an additional NVMe drive and an SSD if needed. You can also access memory here. I love this because not only can you perform easy repairs yourself, but you can perform upgrades or part replacements without voiding your warranty on the original equipment. I’m glad to see that HP kept this in mind.

The keyboard is smaller than a full-size version but has a number keypad, which I love using when typing in timecodes. It is such a time-saver for me. (I credit entering repair order numbers when I fixed computers at Best Buy as a teenager.) On the top of the keyboard are some handy shortcuts if you do web conferences or calls on your computer, including answering and ending calls. The Bang & Olufsen speakers are some of the best laptop speakers I’ve heard. While they aren’t quite monitor-quality, they do have some nice sound on the low end that I was able to fine-tune in the Bang & Olufsen audio control app.

Software Tests
All right, enough of the technical specs. Let’s get on to what people really want to know — how the HP ZBook 15 G6 performs while using apps like Blackmagic’s DaVinci Resolve and Adobe Premiere Pro. I used sample Red and Blackmagic Raw footage that I use a lot in testing. You can grab the Red footage here and the BRaw footage here. Keep in mind you will need to download the BRaw software to edit with BRaw inside of Adobe products, which you can find here.

Performance monitor while exporting in Resolve with VFX.

For testing in Resolve and Premiere, I strung out one minute each of 4K, 6K and 8K Red media in one sequence and the 4608×2592 4K and 6K BRaw media in another. In the middle of my testing, Resolve got a major Red SDK upgrade that allows for better realtime playback of Red raw files if you have an Nvidia CUDA-based GPU.

First up is Resolve 16.1.1 and then Resolve 16.1.2. Both sequences are set to UHD (3840×2160) resolution. One sequence of each codec contains just color correction, while another of each codec contains effects and color correction. The Premiere sequence with color and effects contains basic Lumetri color correction, noise reduction (50) and a Gaussian blur with settings of 0.4. In Resolve, the only difference in the color and effects sequence is that the noise reduction is spatial and set to Enhanced, Medium and 25/25.

In Resolve, the 4K Red media would play in realtime while the 6K (RedCode 3:1) would jump down to about 14fps to 15fps, and the 8K (RedCode 7:1) would play at 10fps at full resolution with just color correction. With effects, the 4K media would play at 20fps, 6K at 3fps and 8K at 10fps. The Blackmagic Raw video would play at real time with just color correction and around 3fps to 4fps with effects.

This is where I talk about just how loud the fans in the ZBook 15 G6 can get. When running exports and benchmarks, the fans are noticeable and a little distracting. Obviously, we are running some high-end testing with processor- and GPU-intensive tests but still, the fans were noticeable. However, the bottom of the mobile workstation was not terribly hot, unlike the MacBook Pros I’ve tested before. So my lap was not on fire.

In my export testing, I used those same sequences as before, exporting from Adobe Premiere Pro 2020. I exported UHD files using Adobe Media Encoder in different containers and codecs: H.264 (MOV), H.265 (MOV), ProResHQ, DPX, DCP and MXF OP1a (XDCAM). The MXF OP1a export was 1920×1080p.
Here are my results:

Red (4K, 6K, 8K)
– Color only: H.264 – 5:27; H.265 – 4:45; ProResHQ – 4:29; DPX – 3:37; DCP – 10:38; MXF OP1a – 2:31
– Color, noise reduction (50), Gaussian blur 0.4: H.264 – 4:56; H.265 – 4:56; ProResHQ – 4:36; DPX – 4:02; DCP – 8:20; MXF OP1a – 2:41

Blackmagic Raw
– Color only: H.264 – 2:05; H.265 – 2:19; ProResHQ – 2:04; DPX – 3:33; DCP – 4:05; MXF OP1a – 1:38
– Color, noise reduction (50), Gaussian blur 0.4: H.264 – 1:59; H.265 – 2:22; ProResHQ – 2:07; DPX – 3:49; DCP – 3:45; MXF OP1a – 1:51
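For easier comparison, here is a small sketch (mine, not part of the review's methodology) that converts those m:ss figures to seconds and computes a speed ratio, using the two H.264 color-only results as the example.

```python
# Sketch for reading the timings above: convert "m:ss" strings to seconds and
# compute how much faster one export is than another. The two values are the
# H.264 color-only results quoted in this review.
def secs(t: str) -> int:
    m, s = t.split(":")
    return int(m) * 60 + int(s)

red_h264, braw_h264 = secs("5:27"), secs("2:05")
print(red_h264, braw_h264)                             # 327 125
print(f"BRaw was {red_h264 / braw_h264:.1f}x faster")  # ~2.6x
```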

What is surprising is that when adding effects like noise reduction and a Gaussian blur in Premiere, the export times stayed similar. While using the ZBook 15 G6, I noticed my export times improved when I upgraded driver versions, so I re-did my tests with the latest Nvidia drivers to make sure I was consistent. The drivers also solved an issue in which Resolve wasn’t reading BRaw properly, so remember to always research drivers.

The Nvidia Quadro RTX 3000 really pulled its weight when editing and exporting in both Premiere and Resolve. In fact, in previous versions of Premiere, I noticed that the GPU was not really being used as well as it should have been. With the Premiere Pro 2020 upgrade it seems like Adobe really upped its GPU usage game — at some points I saw 100% GPU usage.

In Resolve, I performed similar tests, but instead of ProResHQ I exported a DNxHR QuickTime file, and instead of a DCP, an IMF package. For the most part, these are stock exports in the Deliver page of Resolve, except that I forced video levels and set debayer and sizing to highest quality. Here are my results from Resolve versions 16.1.1 and 16.1.2 (16.1.2 results are in parentheses):

Red (4K, 6K, 8K)
– Color only: H.264 – 2:17 (2:31); H.265 – 2:23 (2:37); DNxHR – 2:59 (3:06); IMF – 6:37 (6:40); DPX – 2:48 (2:45); MXF OP1a – 2:45 (2:33)
– Color, noise reduction (spatial, enhanced, medium, 25/25), Gaussian blur 0.4: H.264 – 5:00 (5:15); H.265 – 5:18 (5:21); DNxHR – 5:25 (5:02); IMF – 5:28 (5:11); DPX – 5:23 (5:02); MXF OP1a – 5:20 (4:54)

Blackmagic Raw
– Color only: H.264 – 0:26 (0:25); H.265 – 0:31 (0:30); DNxHR – 0:50 (0:50); IMF – 3:51 (3:36); DPX – 0:46 (0:46); MXF OP1a – 0:23 (0:22)
– Color, noise reduction (spatial, enhanced, medium, 25/25), Gaussian blur 0.4: H.264 – 7:51 (7:53); H.265 – 7:45 (8:01); DNxHR – 7:53 (8:00); IMF – 8:13 (7:56); DPX – 7:54 (8:18); MXF OP1a – 7:58 (7:57)

Interesting to note: Exporting Red footage with color correction only was significantly faster from Resolve, but for Red footage with effects applied, export times were similar between Resolve and Premiere. With the CUDA Red SDK update to Resolve in 16.1.2, I thought I would see a large improvement, but I didn’t. I saw an approximate 10% increase in playback but no improvement in export times.

Puget

Puget Systems has some great benchmarking tools, so I reached out to Matt Bach, Puget Systems’ senior labs technician, about my findings. He suggested that the mobile Xeon could possibly still be the bottleneck for Resolve. In his testing he saw a larger increase in speed with AMD Threadripper 3 and Intel i9-based systems. Regardless, I am kind of going deep on realtime playback of 8K Red Raw media on a mobile workstation — what a time we are in. Nonetheless, Blackmagic Raw footage was insanely fast when exporting out of Resolve, while export time for the Blackmagic Raw footage with effects was higher than I expected. There was a consistent use of the GPU and CPU in Resolve much like in the new version of Premiere 2020, which is a trend that’s nice to see.

In addition to Premiere and Resolve testing, I ran some common benchmarks that provide a good 30,000-foot view of the HP ZBook 15 G6 when comparing it to other systems. I decided to use the Puget Systems benchmarking tools. Unfortunately, at the time of this review, the tools were only working properly with Premiere and After Effects 2019, so I ran the After Effects benchmark using the 2019 version. The ZBook 15 G6 received an overall score of 802, render score of 79, preview score of 75.2 and tracking score of 86.4. These are solid numbers that beat out some desktop systems I have tested.

Corona

To test some 3D applications, I ran the Cinebench R20, which gave a CPU score of 3243, CPU (single core) score of 470 and an M/P ratio of 6.90x. I recently began running the Gooseberry benchmark scene in Blender to get a better sense of 3D rendering performance, and it took 29:56 to export. Using the Corona benchmark, it took 2:33 to render 16 passes, 3,216,368 rays/s. Using Octane Bench the ZBook 15 G6 received a score of 139.79. In the Vray benchmark for CPU, it received 9833 Ksamples, and in the Vray GPU testing, 228 mpaths. I’m not going to lie; I really don’t know a lot about what these benchmarks are trying to tell me, but they might help you decide whether this is the mobile workstation for your work.

Cinebench

One benchmark I thought was interesting between driver updates for the Nvidia Quadro RTX 3000 was the Neat Bench from Neat Video — the noise reduction plugin for video. It measures whether your system should use the CPU, GPU or a combination thereof to run Neat Video. Initially, the best combination result was to use the CPU only (seven cores) at 11.5fps.

After updating to the latest Nvidia drivers, the best combination result was to use the CPU (seven cores) and GPU (Quadro RTX 3000) at 24.2fps, more than double the CPU-only result, just from a driver update. Moral of the story: always make sure you have the correct drivers!

Summing Up
Overall, the HP ZBook 15 G6 is a powerful mobile workstation that will work well across the board. From 3D to color correction apps, the Xeon processor in combination with the Quadro RTX 3000 will get you running 4K video without a problem. With the HP DreamColor anti-glare display using up to 600 nits of brightness and covering 100% of the DCI-P3 color space, coupled with the HDR option, you can rely on the attached display for color accuracy if you don’t have your output monitor attached. And with features like two USB Type-C ports (Thunderbolt 3 plus DP 1.4 plus USB 3.1 Gen 2), you can connect external monitors for a larger view of your work

The HP Fast Charge will get you out of a dead battery fiasco with the ability to go from 0% to 50% charge in 45 minutes. All of this for around $4,000 seems to be a pretty low price to pay, especially because it includes a three-year on-site warranty and because the device is certified to work seamlessly with many apps that pros use with HP’s independent software vendor verifications.

If you are looking for a mobile workstation upgrade, are moving from desktop to mobile or want an alternative to a MacBook Pro, you should price a system out online.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.


Behind the Title: Film Editor Edward Line

By Randi Altman

This British editor got his start at Final Cut in London, honing his craft and developing his voice before joining Cartel in Santa Monica.

NAME: Edward Line

COMPANY: Cartel

WHAT KIND OF COMPANY IS CARTEL?
Cartel is an editorial and post company based in Santa Monica. We predominantly service the advertising industry but also accommodate long-form projects and other creative content. I joined Cartel as one of the founding editors in 2015.

CAN YOU GIVE US SOME MORE DETAIL ABOUT YOUR JOB?
I assemble the raw material from a film shoot into a sequence that tells the story and communicates the idea of a script. Sometimes I am involved before the shoot and cut together storyboard frames to help the director decide what to shoot. Occasionally, I’ll edit on location if there is a technical element that requires immediate approval for the shoot to move forward.

Edward Line working on Media Composer

During the edit, I work closely with the directors and creative teams to realize their vision of the script or concept and bring their ideas to life. In addition to picture editing, I incorporate sound design, music, visual effects and graphics into the edit. It’s a collaboration between many departments and an opportunity to validate existing ideas and try new ones.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THE FILM EDITOR TITLE?
A big part of my job involves collaborating with others, working with notes and dealing with tricky situations in the cutting room. Part of being a good editor is having the ability to manage people and ideas while not compromising the integrity and craft of the edit. It’s a skill that I’m constantly refining.

WHAT’S YOUR FAVORITE PART OF THE JOB?
I love being instrumental in bringing creative visions together and seeing them realized on screen, while being able to express my individual style and craft.

WHAT’S YOUR LEAST FAVORITE?
Tight deadlines. Filming with digital formats has allowed productions to shoot more and specify more deliverables. However, providing the editor proportional time to process everything is not always a consideration and can add pressure to the process.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
I am a morning person so I tend to be most productive when I have fresh eyes. I’ve often executed a scene in the first few hours of a day and then spent the rest of the day (and night) fine-tuning it.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I have always had a profound appreciation for design and architecture, and in an alternate universe, I could see myself working in that world.

WHY DID YOU CHOOSE THIS PROFESSION?
I’ve always had ambitions to work in filmmaking and initially worked in TV production after I graduated college. After a few years, I became curious about working in post and found an entry-level job at the renowned editorial company Final Cut in London. I was inspired by the work Final Cut was doing, and although I’d never edited before, I was determined to give editing a chance.

CoverGirl

I spent my weekends and evenings at the office, teaching myself how to edit on Avid Media Composer and learning editing techniques with found footage and music. It was during this experimental process that I fell in love with editing, and I never looked back.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
In the past year I have edited commercials for CoverGirl, Sephora, Bulgari, Carl’s Jr. and Smartcar. I have also cut a short film called Dad Was, which will be submitted to festivals in 2020.

HOW HAVE YOU DEVELOPED NEW SKILLS WHEN CUTTING FOR A SPECIFIC GENRE OR FORMAT?
Cutting music videos allowed me to hone my skills to edit musical performance while telling visual stories efficiently. I learned how to create rhythm and pace through editing and how to engage an audience when there is no obvious narrative. The format provided me with a fertile place to develop my individual editing style and perfect my storytelling skills.

When I started editing commercials, I learned to be more disciplined in visual storytelling, as most commercials are rarely longer than 60 seconds. I learned how to identify nuances in performance and the importance of story beats, specifically when editing comedy. I’ve also worked on numerous films with VFX, animation and puppetry. These films have allowed me to learn about the potential for these visual elements while gaining an understanding of the workflow and process.

More recently, I have been enjoying cutting dialogue in short films. Unlike commercials, this format allows more time for story and character to develop. So when choosing performances, I am more conscious of the emotional signals they send to the audience and overarching narrative themes.

Sephora

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
It’s tough to narrow this down to one project…

Recently, I worked on a commercial for the beauty retailer Sephora that promoted its commitment to diversity and inclusivity. The film Identify As We is a celebration of the non-binary community and features a predominantly transgender cast. The film champions ideas of being different and self-expression while challenging traditional perceptions of beauty. I worked tirelessly with the director and creative team to make sure we treated the cast and footage with respect while honoring the message of the campaign.

I’m also particularly proud of a short film that I edited called Wale. The film was selected for over 30 film festivals across the globe and won several awards. The culmination of the film’s success was receiving a BAFTA nomination and being shortlisted for the 91st Academy Awards for Best Live Action Short Film.

WHAT DO YOU USE TO EDIT?
I work on Avid Media Composer, but I have recently started to flirt with Adobe Premiere. I think it’s good to be adaptable, and I’d hate to restrict my ability to work on a project because of software.

Wale

ARE YOU OFTEN ASKED TO DO MORE THAN EDIT? IF SO, WHAT ELSE ARE YOU ASKED TO DO?
Yes, I usually incorporate other elements such as sound design, music and visual effects into my edits as they can be instrumental to the storytelling or communication of an idea. It’s often useful for the creative team and other film departments to see how these elements contribute to the final film, and they can sometimes inform decisions in the edit.

For example, sound can play a major part in accenting a moment or providing a transition to another scene, so I often spend time placing sound effects and sourcing music during the edit process. This helps me visualize the scene in a broader context and provides new perspective if I’ve become overfamiliar with the footage.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
No surprises, but my smartphone! Apart from the obvious functions, it’s a great place to review edits and source music when I’m on the move. I’ve also recently purchased a Bluetooth keyboard and Wacom tablet, which make for a tidy work area.

I’m also enjoying using my “smart thermostat” at home which learns my behavior and seems to know when I’m feeling too hot or cold.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Once I have left the edit bay, I decompress by listening to music on the way home. Once home, I take great pleasure from cooking for myself, friends and family.


Maryann Brandon’s path, and editing Star Wars: The Rise of Skywalker

By Amy Leland

In the interest of full disclosure, I have been a fan of both the Star Wars world and the work of J.J. Abrams for a very long time. I saw Star Wars: Episode IV – A New Hope  in the theaters with my big brother when I was five years old, and we were hooked. I don’t remember a time in my life without Star Wars. And I have been a fan of all of Abrams’ work, starting with Felicity. Periodically, I go back and rewatch Felicity, Alias and Lost. I was, in fact, in the middle of Season 2 of Alias and had already purchased my ticket for The Rise of Skywalker when I was assigned this interview.

As a female editor, I have looked up to Maryann Brandon, ACE, and Mary Jo Markey, ACE — longtime Abrams collaborators — for years. A chance to speak with Brandon was more than a little exciting. After getting the fangirl out of my system at the start of the interview, we had a wonderful conversation about her incredible career and this latest Star Wars offering.

After NYU film school and early years in New York City’s indie film world, Brandon has not only been an important part of J.J. Abrams’ world — serving as a primary editor on Alias, and then on Mission: Impossible III, Super 8 and two films each in the Star Trek and Star Wars franchises — but has also edited The Jane Austen Book Club, How to Train Your Dragon and Venom, among others.

Maryann Brandon

Let’s dig a bit deeper with Brandon…

How did your path to editing begin?
I started in college, but I wasn’t really editing. I was just a member of the film society. I was recruited by the NYU Graduate Film program in 1981 because they wanted women in the program. And I thought, it’s that or working on Wall Street, and I wasn’t really that great with the money or numbers. I chose film school.

I had no idea what it was going to be like because I don’t come from a film background or a film family. I just grew up loving films. I ended up spending three years just running around Manhattan, making movies with everyone, and everyone did every job. Then, when I got out of school, I had to finish my thesis film, and there was no one to edit it for me. So I ended up editing it myself. I started to meet people in the business because the New York film scene was very close-knit. I got offered a paid position in editing, and I stayed.

I met and worked for some really incredible people along the way. I worked as a second assistant on the Francis Ford Coppola film The Cotton Club. I went from that to working as a first assistant on Richard Attenborough’s version of A Chorus Line. I was sent to London and got swept up in the editing part of it. I like telling stories. It became the thing I did. And that’s how it happened.

Who inspired you in those early days?
I was highly influenced by Dede Allen. She was this matriarch of New York at that time, and I was so blown away by her and her personality. I mean, her work spoke for itself, but she was also this incredible person. I think it’s my nature anyway, but I learned from her early on an approach of kindness and caring. I think that’s part of why I stayed in the cutting room.

On set, things tend to become quite fraught sometimes when you’re trying to make something happen, but the cutting room is this calm place of reality, and you could figure stuff out. She was very influential to me, and she was such a kind, caring person. She cared about everyone in the cutting room, and she took time to talk to everyone.

There was also John Bloom, who was the editor on A Chorus Line. We became very close, and he always used to call me over to see what he was doing. I learned tons from him. In those days, we cut on film, so it was running through your fingers.

The truth is everyone I meet influences me a bit. I am fascinated by each person’s approach and why they see things the way they do.

While your resume is eclectic, you’ve worked on many sci-fi and action films. Was that something you were aiming for, or did it happen by chance?
I was lucky enough to meet J.J. Abrams, and I was lucky enough to get on Alias, which was not something I thought I’d want to do. Then I did it because it seemed to suit me at the time. It was a bit of faith and a bit of, “Oh, that makes sense for you, because you grew up loving Twilight Zone and Star Trek.”

Of course, I’d love to do more drama. I did The Jane Austen Book Club and other films like that. One does tend to get sort of suddenly identified as, now I’m the expert on sci-fi and visual effects. Also, I think because there aren’t a lot of women who do that, it’s probably something people notice. But I’d love to do a good comedy. I’d love to do something like Jumanji, which I think is hilarious.

How did this long and wonderful collaboration with J.J. Abrams get started?
Well, my kids were getting older. It was getting harder and harder for me to go on location with the nanny, the dog, the nanny’s kids, my kids, set up a third grade class and figure out how to do it all. A friend of mine who was a producer on Felicity had originally tried to get me to work on that show. She said, “You’ll love J.J. You’ll love (series creator) Matt Reeves. Come and just meet us.” I just thought television is such hard work.

Then he was starting this new show, Alias. My friend said, “You’re going to love it. Just meet him.” And I did. Honestly, I went to an interview with him, and I spent an hour basically laughing at every joke he told me. I thought, “This guy’s never going to hire me.” But he said, “Okay, I’ll see you tomorrow.” That’s how it started.

What was that like?
Alias was so much fun. I didn’t work on Felicity, which was more of a straightforward drama about a college girl growing up. Alias was this crazy, complicated, action-filled show, but also a girl trying to grow up. It was all of those things. It was classic J.J. It was a challenge, and it was really fun because we all discovered it together. There were three other female editors who are amazing — Mary Jo Markey, Kristin Windell, and Virginia Katz — and there was J.J. and Ken Olin, who was a producer in residence there and director. We just found the show together, and that was really fun.

How has your collaboration with J.J. changed over time?
It’s changed in terms of the scope of a project and what we have to do. And, obviously, the level of conflict and communication is pretty easy because we’ve known each other for so long. There’s not a lot of barriers like, “Hey, I’m trying to get to know you. What do I…?” We just jump right in. Over the years, it’s changed a bit.

On The Rise of Skywalker, I cut this film with a different co-editor. Mary Jo [Markey, Brandon’s longtime co-editor] was doing something else at the time, so I ended up working with Stefan Grube. The way I had worked with Mary Jo was we would divide up the film. She’d do her thing and I’d do mine. But because these films are so massive, I prefer not to divide it up, but instead have both of us work on whatever needs working on at the time to get it done. I proposed this to J.J., and it worked out great. Everything got cut immediately and we got together periodically to ask him what he thought.

Another thing that changed was, because we needed to turn over our visual effects really quickly, I proposed that I cut on the set, on location, when they were shooting. At first J.J. was like, “We’ve never done this before.” I said, “It’s the only way I’m going to get your eyes on sequences,” because by the time the 12-hour day is over, everyone’s exhausted.

It was great and worked out well. I had this little mobile unit, and the joke was it was always within 10 feet of wherever J.J. was. It was also great because I felt like I was part of the crew, and they felt like they could talk to me. I had the DP asking me questions. I had full access to the visual effects supervisor. We worked out shots on the set. Given the fact that you could see what we already had, it really was a game-changer.

What are some of the challenges of working on films that are heavy on action, especially with the Star Wars and Star Trek films and all the effects and CGI?
There’s a scene where they arrive on Exegol, and they’re fighting with each other and more ships are arriving. All of that was in my imagination. It was me going, “Okay, that’ll be on the screen for this amount of time.” I was making up so much of it and using the performances and the story as a guide. I worked really closely with the visual effects people, describing what I thought was going to happen. They would then explain that what I thought was going to happen was way too much money to do.

Luckily I was on the set, so I could work it out with J.J. as we went. Sometimes it’s better for me just to build something that I imagine and work off of that, but it’s hard. It’s like having a blank page and then knowing there’s this one element, and then figuring out what the next one will be.

There are people who are incredibly devoted to the worlds of Star Trek and Star Wars and have very strong feelings about those worlds. Does that add more pressure to the process?
I’m a big fan of Star Trek and Star Wars, as is J.J. I grew up with Star Trek, and it’s very different because Star Trek was essentially a week-to-week serial that featured an adventure, and Star Wars is this world where they’re in one major war the whole time.

Sometimes I would go off on a tangent, and J.J. and my co-editor Stefan would be like, “That’s not in the lore,” and I’d have to pull it back and remember that we do serve a fan base that is loyal to it. When I edit anything, I really try to abandon any kind of preconceived thing I have so I can discover things.

I think there’s a lot of pressure to answer to the first two movies, because this is the third, and you can’t just ignore a story that’s been set up, right? We needed to stay within the boundaries of that world. So yeah, there’s a lot of pressure to do that, for sure. One of the things that Chris Terrio and J.J., as the writers, felt very strongly about was having it be Leia’s final story. That was a labor of love for sure. All of that was like a love letter to her.

I don’t know how much of that had been decided before Carrie Fisher (Leia) died. It was my understanding that you had to reconstruct based on things she shot for the other films.
She died before this film was even written, so all of the footage you see is from Episode 7. It’s all been repurposed, and scenes were written around it. Not just for the sake of writing around the footage, but they created scenes that actually work in the context of the film. A lot of what works is due to Daisy Ridley and the other actors who were in the scenes with her. I mean, they really brought her to life and really sold it. I have to say they were incredible.

With two editors co-editing on set during production, you must have needed an extensive staff of assistant editors. How do you work with assistant editors on something of this scale?
I’ve worked with an assistant editor named Jane Tones on the last couple of films. She is amazing. She was the one who figured out how to make the mobile unit work on set. She’s incredibly gifted, both technologically and story-wise. She was instrumental in organizing everything to do with the edit and getting us around. Stefan’s assistant was Warren Paeff, and he is very experienced. We also had a sound person we carried with us and a couple of other assistants. I had another assistant, Ben Cox, who was such a Star Wars fan. When I said, “I’m happy to hire you, but I only have a second assistant position,” he was like, “I’ll take it!”

What advice do you have for someone starting out or who would like to build the kind of career you’ve made?
I would say, try to get a PA job or a job in the cutting room where you really enjoy the people, and pay attention. If you have ideas, don’t be shy but figure out how to express your ideas. I think people in the cutting room are always looking for anyone with an opinion or reaction because you need to step back from it. It’s a love of film, a love of storytelling and a lot of luck. I work really hard, but I also had a lot of good fortune meeting the people I did.


Amy Leland is a film director and editor. Her short film, Echoes, is now available on Amazon Video. She also has a feature documentary in post, a feature screenplay in development, and a new doc in pre-production. She is an editor for CBS Sports Network and recently edited the feature “Sundown.” You can follow Amy on social media on Twitter at @amy-leland and Instagram at @la_directora.


CVLT adds Joe Simons as lead editor

Bi-coastal production studio CVLT, which offers full-service production and post, has added Joe Simons as lead editor. He will be tasked with growing CVLT’s editorial department. He edits on Adobe Premiere and will be based in the New York studio.

Simons joins CVLT after three years at The Mill, where he edited the “It’s What Connects Us” campaign for HBO, the “Top Artist of the Year” campaign for Spotify and several major campaigns for Ralph Lauren, among many others. Prior to The Mill, he launched his career at PS260 before spending four years at editing house Cut+Run.

Simons’ addition comes at a time when CVLT is growing into a full concept-to-completion creative studio, launching campaigns for top luxury and fashion brands, including Lexus, Peloton and Louis Vuitton.

“Having soaked up everything I could at The Mill and Cut+Run, it was time for me to take that learning and carve my own path,” says Simons.


Maxon and Red Giant to merge

Maxon, developers of pro 3D software solutions, and Red Giant, makers of tools for editors, VFX artists, and motion designers, have agreed to merge under the media and entertainment division of Nemetschek Group. The transaction is expected to close in January 2020, subject to regulatory approval and customary closing conditions.

Maxon, best known for its 3D product Cinema 4D, was formed in 1986 to provide high-end yet accessible 3D software solutions. Artists across the globe rely on Maxon products to create high-end visuals. In April of this year, Maxon acquired Redshift, developer of the GPU-accelerated Redshift render engine.

Since 2002, Red Giant has built its brand through products such as Trapcode, Magic Bullet, Universe, PluralEyes and its line of visual effects software. Its tools are used in the fields of film, broadcast and advertising.

The two companies provide tools for companies including ABC, CBS, NBC, HBO, BBC, Sky, Fox Networks, Turner Broadcasting, NFL Network, WWE, Viacom, Netflix, ITV Creative, Discovery Channel, MPC, Digital Domain, VDO, Sony, Universal, The Walt Disney Company, Blizzard Entertainment, BMW, Facebook, Apple, Google, Vitra, Nike and many more.

Main Photo: L-R: Maxon CEO Dave McGavran and Red Giant CEO Chad Bechert


Behind the title: Cutters editor Steve Bell

“I’ve always done a fair amount of animation design, music rearranging and other things that aren’t strictly editing, but most editors are expected to play a role in aspects of the post process that aren’t strictly editing.”

Name: Steve Bell

What’s your job title?
Editor

Company: Cutters Editorial

Can you describe your company?
Cutters is part of a global group of companies offering offline editing, audio engineering, VFX and picture finishing, production and design – all of which fall under Cutters Studios. Here in New York, we do traditional broadcast TV advertising and online content, as well as longer format work and social media content for brands, directors and various organizations that hire us to develop a concept, shoot and direct.

Cutters New York

What’s your favorite part of the job?
There’s a stage to pretty much every project where I feel I’ve gotten a good enough grasp of the material that I can connect the storytelling dots and see it come to life. I like problem solving and love the feeling you get when you know you’ve “figured it out.”

Depending on the scale of the project, it can start a few hours in, a few days in or a few weeks in, but once it hits you can’t stop until you see the piece finished. It’s like reading a good page-turner; you can’t put it down. That’s the part of the creative process I love and what I like most about my job.

What’s your least favorite?
It’s those times when it becomes clear that I’ve/we’ve probably looked at something too many times to actually make it better. That certainly doesn’t happen on many jobs, but when it does, it’s probably because too many voices have had a say; too many cooks in the kitchen, as they say.

What is your most productive time of the day?
Early in the morning. I’m most clearheaded at the very beginning of the day, and then sometimes toward the very end of a long day. But those times also happen to be when I’m most likely to be alone with what I’m working on and free from other distractions.

If you didn’t have this job, what would you be doing instead? 
Baseball player? Astronaut? Joking. But let’s face it, we all fantasize about fulfilling the childhood dreams that are completely different from what we do. To be truthful I’m sure I’d be doing some kind of writing, because it was my desire to be a writer, particularly of film, that indirectly led me to be an editor.

Why did you choose this profession? How early on did you know this would be your path?
Well the simple answer is probably that I had opportunities to edit professionally at a relatively young age, which forced me to get better at editing way before I had a chance to get better at writing. If I keep editing I may never know if I can write!

Stella Artois

Can you name some recent projects you have worked on?
The Dwyane Wade Budweiser retirement film, Stella Artois holiday spots, a few films for the Schott/Hamilton watch collaboration. We did some fun work for Rihanna’s Savage X Fenty release. Early in the year I did a bunch of lovely spots for Hallmark Hall of Fame programming.

Do you put on a different hat when cutting for a specific genre?
For sure. There are overlapping tasks, but I do believe it takes a different set of skills to do good dramatic storytelling than it takes to do straight comedy, or doc or beauty. Good “Storytelling” (with a capital ‘S’) is helpful in all of it — I’d probably say crucial. But it comes down to the important element that’s used to create the story: emotion, humor, rhythm, etc. And then you need to know when it needs to be raw versus formal, broad versus subtle and so forth. Different hats are needed to get that exactly right.

What is the project that you are most proud of and why?
I’m still proud of the NHL’s No Words spot I worked on with Cliff Skeete and Bruce Jacobson. We’ve become close friends as we’ve collaborated on a lot of work since then for the NHL and others. I love how effective that spot is, and I’m proud that it continues to be referenced in certain circles.

NHL No Words

In a very different vein, I think I’m equally proud of the work I’ve done for the UN General Assembly meetings, especially the film that accompanied Kathy Jetnil-Kijiner’s spoken word performance of her poem “Dear Matafele Peinam” during the opening ceremonies of the UN’s first Climate Change conference. That’s an issue that’s very important to me, and I’m grateful for the chance to do something that had an impact on those who saw it.

What do you use to edit?
I’m a Media Composer editor, and it probably goes back to the days when I did freelance work for Avid and had to learn it inside out. The interface at least is second nature to me. Also, the media sharing and networking capabilities of Avid make it indispensable. That said, I appreciate that Premiere has some clear advantages in other ways. If I had to start over I’m not sure I wouldn’t start with Premiere.

What is your favorite plugin?
I use a lot of Boris FX plugins for stabilization, color correction and so forth. I used to use After Effects often, and Boris FX offers a way of achieving some of what I once did exclusively in After Effects.

Are you often asked to do more than edit? If so, what else are you asked to do?
I’ve always done a fair amount of animation design, music rearranging and other things that aren’t strictly editing, but most editors are expected to play a role in aspects of the post process that aren’t strictly “film editing.”

Many of my clients know that I have strong opinions about those things, so I do get asked to participate in music and animation quite often. I’m also sometimes asked to help with the write-ups of what we’ve done in the edit because I like talking about the process and clarifying what I’ve done. If you can explain what you’ve done you’re probably that much more confident about the reasons you did it. It can be a good way to call “bullshit” on yourself.

This is a high stress job with deadlines and client expectations. What do you do to de-stress from it all?
Yeah, right?! It can be stressful, especially when you’re occasionally lucky enough to be busy with multiple projects all at once. I take decompressing very seriously. When I can, I spend a lot of time outdoors — hiking, biking, you name it — not just for the cardio and exercise, which is important enough, but also because it’s important to give your eyes a chance to look off into the distance. There are tremendous physical and psychological benefits to looking to the horizon.

Review: The Sensel Morph hardware interface

By Brady Betzel

As an online editor and colorist, I have tried a lot of hardware interfaces designed for apps like Adobe Premiere, Avid Media Composer, Blackmagic DaVinci Resolve and others. With the exception of professional color correction surfaces like the FilmLight Baselight, the Resolve Advanced Panel and Tangent’s Element color correction panels, it’s hard to get exactly what I need.

While they typically work well, there is always a drawback for my workflow; usually they are missing one key shortcut or feature. Enter Sensel Morph, a self-proclaimed morphable hardware interface. In reality, it is a pressure-sensitive trackpad that uses individually purchasable magnetic rubber overlays and keys for a variety of creative applications. It can also be used as a pressure-sensitive trackpad without any overlays.

For example, inside of the Sensel app you can identify the Morph as a trackpad and click “Send Map to Morph,” and it will turn itself into a large trackpad. If you are a digital painter, you can turn the Morph into “Paintbrush Area” and use a brush and/or your fingers to paint! Once you understand how to enable the different mappings you can quickly and easily Morph between settings.

For this review, I am going to focus on how you can use the Sensel Morph with Adobe Premiere Pro. For the record, you can actually use it with any NLE by creating your own map inside of the Sensel app. The Morph essentially works with keyboard shortcuts for NLEs. With that in mind, if you customize your keyboard shortcuts you are going to want to enable the default mapping inside of Premiere or adjust your settings to match the Sensel Morph’s settings.
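
To make that concrete, here is a minimal sketch in Python of the kind of region-to-shortcut table such a custom map boils down to. The region names and dictionary format are hypothetical, invented purely for illustration; the real map is built graphically inside the Sensel app, not in code. The shortcuts themselves (space, Q, W, Ctrl+K, backtick) are Premiere Pro’s Windows defaults.

```python
# Hypothetical sketch: what a control-surface map conceptually reduces to,
# i.e. named regions that emit NLE keyboard shortcuts. The region names and
# structure are invented for illustration -- the real map is created inside
# the Sensel app. Shortcuts shown are Premiere Pro Windows defaults.
PREMIERE_MAP = {
    "play_pause": "space",        # Play/Stop toggle
    "ripple_trim_prev": "q",      # Ripple Trim Previous Edit to Playhead
    "ripple_trim_next": "w",      # Ripple Trim Next Edit to Playhead
    "add_edit": "ctrl+k",         # Add Edit (cut at the playhead)
    "maximize_frame": "`",        # Toggle full screen on the active panel
}

def key_for(region: str) -> str:
    """Return the keystroke a pressed region should emit."""
    try:
        return PREMIERE_MAP[region]
    except KeyError:
        raise ValueError(f"no shortcut mapped for region {region!r}")

if __name__ == "__main__":
    for region in ("play_pause", "ripple_trim_next"):
        print(f"{region} -> {key_for(region)}")
```

The review’s point still applies: if your own Premiere shortcuts differ from the defaults, one side of that table has to change to match the other.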

Before you plug in your Morph, you will need to click over to https://sensel.com/pages/support, where you can get a quick-start guide in addition to the Sensel app you will need to install before you get working. After it’s downloaded and installed, you will want to plug in the Morph via USB and let it charge before using the Bluetooth connection. It took a while for the Morph to fully charge, about two hours, but once I installed the Sensel app, added the Video Editing Overlay and opened Adobe Premiere, I was up and working.

To be honest, I was a little dubious about the Sensel Morph. A lot of these hardware interfaces have come across my desk, and they usually have poor software implementation, or the hardware just doesn’t hold up. But the Sensel Morph broke through my preconceived ideas of hardware controllers for NLEs like Premiere, and for the first time in a long time, I was inspired to use Premiere more often.

It’s no secret that I learned professional editing in Avid Media Composer and Symphony, and most NLEs can’t quite match the professional experience I’ve had in Symphony. One part of that experience is how fluidly the keyboard and Wacom tablet work together. The first time I plugged in the Sensel Morph, overlaid the Video Editing Overlay on top of the Morph and opened Premiere, I began to have that same feeling, but inside of Premiere!

While there are still things Premiere has issues with, the Sensel Morph really got me feeling good about how well this Adobe NLE worked. And to be honest, some of those issues relate to me not learning Premiere’s keyboard shortcuts like I did in Avid. The Sensel Morph felt like a natural addition to my Premiere editing workflow. It was the first time I started to feel that “flow state” inside of Premiere that I previously got into when using Media Composer or Symphony, and I started trimming and editing like a madman. It was kind of shocking to me.

You may be thinking that I am blowing this out of proportion, and maybe I am, a little, but the Morph immediately improved my lazy Premiere editing. In fact, I told someone that Adobe should package these with first-time Premiere users.

I really like the way the timeline navigation works (much like the touch bar). I also like the quick Ripple Left/Right commands, and I like how you can quickly switch timelines by pressing the “Timeline” button multiple times to cycle through them. I did feel like I needed a mouse some of the time and a keyboard some of the time, but for about 60% of the time I could edit without them. Much like how I had to force myself to use a Wacom tablet for editing, if you try not to use a mouse I think you will get by just fine. I did try to use a Wacom stylus with the Sensel Morph and, unfortunately, it did not work.

What improvements could the Sensel Morph make? Specifically in Premiere, I wish there were a full-screen shortcut (“`”) labeled on the Morph. It’s one of those shortcuts I use all the time, whether I want to see my timeline full screen, the effects controls full screen or the Program feed full screen. And while I know I could program it using the Sensel app, the OCD in me wants to see that reflected on the keys. While we are on the subject of the keys and overlay, I do find the overlay a little hard to use when I customize the key presses. Maybe ordering a custom-printed overlay could assuage this concern.

One thing I found odd was how much GPU power the Sensel app needed. My laptop’s fans were kicking on, so I opened Task Manager and saw that the Sensel app was taking 30% of my Nvidia RTX 2080. Luckily, you really only need the app open when changing overlays or turning the Morph into a trackpad, but I found myself leaving it open by accident, which could really hurt performance.

Summing Up
In the end, is the Sensel Morph really worth the $249? It does come with one free overlay of your choice and a one-year warranty at that price, but additional overlays will set you back $35 to $59 each, depending on the overlay.

The Video Editing overlay is $35, while the new Buchla Thunder overlay is $59. There are several other options to choose from, including traditional Keyboard, Piano Key, Music Production and Drum Pad overlays. If you are a one-person band that goes between Premiere and apps like Ableton, then it’s 100 percent worth it. If you use Premiere a lot, I still think it is worth it. The iPad Mini-like size and weight are really nice, and when using it over Bluetooth you feel untethered. Its sleek, thin design allows you to bring this morphable hardware interface anywhere you take your laptop or tablet.

The Sensel Morph is not like any of the other hardware interfaces I have used. Not only is it extremely mobile, but it works well and is compatible with a lot of content creation apps that pros use daily. They really delivered on this one.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Ford v Ferrari’s co-editors discuss the cut

By Oliver Peters

After a failed attempt to acquire European carmaker Ferrari, an outraged Henry Ford II sets out to trounce Enzo Ferrari on his own playing field — automobile endurance racing. That is the plot of 20th Century Fox’s Ford v Ferrari, directed by James Mangold. In the end, Ford’s effort falls short, leading him to independent car designer Carroll Shelby (Matt Damon). Shelby’s outspoken lead test driver Ken Miles (Christian Bale) complicates the situation by making an enemy out of Ford senior VP Leo Beebe.

Michael McCusker

Nevertheless, Shelby and his team are able to build one of the greatest race cars ever — the GT40 MkII — setting up a showdown between the two auto legends at the 1966 24 Hours of Le Mans.

The challenge of bringing this clash of personalities to the screen was taken on by director James Mangold (Logan, Wolverine, 3:10 to Yuma) and his team of long-time collaborators.

I recently spoke with film editors Michael McCusker, ACE, (Walk the Line, 3:10 to Yuma, Logan) and Andrew Buckland (The Girl On the Train) — both of whom were recently nominated for an Oscar and ACE Eddie Award for their work on the film — about what it took to bring Ford v Ferrari together.

The post team for this film has worked with James Mangold on quite a few films. Tell me a bit about the relationship.
Michael McCusker: I cut my very first movie, Walk the Line, for Jim 15 years ago and have since cut his last six movies. I was the first assistant editor on Kate & Leopold, which was shot in New York in 2001. That’s where I met Andrew, who was hired as one of the local New York film assistants. We became fast friends. Andrew moved to LA in 2009, and I hired him to assist me on Knight & Day.

Andrew Buckland

I always want to keep myself available for Jim — he chooses good material, attracts great talent and is a filmmaker who works across multiple genres. Since I’ve worked with him, I’ve cut a musical movie, a western, a rom-com, an action movie, a straight-up superhero movie, a dystopian superhero movie and now a racing film.

As a film editor, it must be great not to get typecast for any particular cutting style.
McCusker: Exactly. I worked for David Brenner for years as his first. He was able to cross genres, and that’s what I wanted to do. I knew even then that the most important decisions I would make would be choosing projects. I couldn’t have foreseen that Jim was going to work across all these genres — I simply knew that we worked well together and that the end product was good.

In preparing for Ford v Ferrari, did you study any other recent racing films, like Ron Howard’s Rush?
McCusker: I saw that movie, and liked it. Jim was aware of it, too, but I think he wanted to do something a little more organic. We watched a lot of older racing films, like Steve McQueen’s Le Mans and John Frankenheimer’s Grand Prix.

Jim’s original intention was to play the racing in long takes and bring the audience along for the ride. As he was developing the script, and we were in preproduction, it became clear that there was more drama for him to portray during the racing sequences than he anticipated. So the races took on more of an energized pace.

Energized in what way? Do you mean in how you cut it or in a change of production technique, like more stunt cameras and angles?
McCusker: I was fortunate to get involved about two-and-a-half months prior to the start of production. We were developing the Le Mans race in previs. This required a lot of editing and discussions about shot design and figuring out what the intercutting was going to be during that sequence, which is like the fourth act of the movie.

You’re dealing with Mollie and Peter [Miles’ wife and son] at home watching the race, the pit drama, what’s going on with Shelby and his crew, with Ford and Leo Beebe and also, of course, what’s going on in the car with Ken. It’s a three-act movie unto itself, so Jim was trying to figure out how it was all going to work before he had to shoot it. That’s where I came in. The frenetic pace of Le Mans was more a part of the writing process — and part of the writing process was the previs. The trick was how to make sure we weren’t just following cars around a track. That’s where redundancy can tend to beleaguer an audience in racing movies.

What was the timeline for production and post?
McCusker: I started at the end of May 2018. Production began at the beginning of August and went all the way through to the end of November. We started post in earnest at the beginning of November of last year, took some time off for the holidays, and then showed the film to the studios around February or March.

When did you realize you were going to need help?
McCusker: The challenge was that there was going to be a lot of racing footage, which meant there was going to be a lot of footage. I knew I was going to need a strong co-editor, so Andrew was the natural choice. He had been cutting on his own and cutting with me over the years. We share a common approach to editing and have a similar aesthetic.

There was a point when things got really intense and we needed another pair of hands, so I brought in Dirk Westervelt to help out for a couple of months. That kept our noses above water, but the process was really enjoyable. We were never in a crisis mode. We got a great response from preview audiences and, of course, that calms everybody down. At that point it was just about quality control and making sure we weren’t resting on our laurels.

How long was your initial cut, and what was your process for trimming the film down to the present run time?
McCusker: We’re at 2:30:00 right now and I think the first cut was 3:10 or 3:12. The Le Mans section was longer. The front end of the movie had more scenes in it. We ended up lifting some scenes and rearranging others. Plus, the basic trimming of scenes brought the length down.

But nothing was the result of a panic, like, “Oh my God, we’ve got to get to 2:30!” There were no demands by the studio or any pressures we placed upon ourselves to hit a particular running time. I like to say that there’s real time and there’s cinematic time. You can watch Once Upon a Time in America, which is 3:45, and it feels like an hour. Or you can watch an 89-minute movie and feel like it’s drudgery. We just wanted to make sure we weren’t overstaying our welcome.

How extensively did you rearrange scenes during the edit? Or did the structure of the film stay pretty much as scripted?
McCusker: To a great degree it stayed as scripted. We had some scenes in the beginning that we felt were a little bit tangential and weren’t serving the narrative directly, and those were cut.

The real endeavor of this movie starts the moment that these two guys [Shelby and Miles] decide to tackle the challenge of developing this car. There’s a scene where Miles sees the car for the first time at LAX. We understood that we had to get to that point in a very efficient way, but also set up all the other characters — their motives and their desires.

It’s an interesting movie, because it starts off with a lot of characters. But then it develops into a movie about two guys and their friendship. So it goes from an ensemble piece to being about Ken and Carroll, while at the same time the scope of the movie is opening up and becoming larger as the racing is going on. For us, the trickiest part was the front end — to make sure we spent enough time with each character so that we understood them, but not so much time that audience would go, “Enough already! Get on with it!”

Did that help inform your cutting style for this film?
McCusker: I don’t think so. Where it helped was knowing the sound of the broadcasters and race announcers. I liked Chris Economaki and Jim McKay — guys who were broadcasting the races when I was a kid. I was intrigued by how they gave us the narrative of the race. It came in handy while we were making this movie, because we were able to get our hands on some of Jim McKay’s actual coverage of Le Mans and use it in the movie. That brings so much authenticity.

Let’s talk sound. I would imagine the sound design was integral to your rough cuts. How did you tackle that?
Andrew Buckland: We were fortunate to have the sound team on very early during preproduction. We were cutting in a 5.1 environment, so we wanted to create sound design early. The engine sounds might not have been the exact sounds that would end up in the final mix, but they were adequate to let you experience the scenes as intended. Because we needed to get Jim’s response early, some of the races were cut with the production sound — from the live mics during filming. This allowed Jim and us to quickly see how the scenes would flow.

Other scenes were cut strictly MOS because the sound design would have been way too complicated for the initial cut of the scene. Once the scene was cut visually, we’d hand over the scene to sound supervisor Don Sylvester, who was able to provide us with a set of 5.1 stems. That was great, because we could recut and repurpose those stems for other races.

McCusker: We had developed a strategy with Don to split the sound design into four or five stems to give us enough discrete channels to recut these sequences. The stems were a palette of interior perspectives, exterior perspectives, crowds, car-bys, and so on. By employing this strategy, we didn’t need to continually turn over the cut to sound for patch-up work.

Then, as Don went out and recorded the real cars and was developing the actual sounds for what was going to be used in the mix, he’d generate new stems and we would put them into the Media Composer. This was extremely informative to Jim, because he could experience our Avid temp mix in 5.1 and give notes, which ultimately informed the final sound design and the mix.

What about temp music? Did you also weave that into your rough cuts?
McCusker: Ted Caplan, our music editor, has also worked with Jim for 15 years. He’s a bit of a renaissance man — a screenwriter, a novelist, a one-time musician and a sound designer in his own right. When he sits down to work with music, he’s coming at it from a story point-of-view. He has a very instinctual knowledge of where music should start, and it happens to dovetail into the aesthetic that Jim, Andrew, and I are working toward. None of us like music to lead scenes in a way that anticipates what the scene is going to be about before you experience it.

For this movie, it was challenging to develop what the musical tone of the movie would be. Ted was developing the temp track along with us from a very early stage. We found over time that not one particular musical style was going to work. This is a very complex score. It includes a kind of surf-rock sound with Carroll Shelby in LA, an almost jaunty, lounge jazz sound for Detroit and the Ford executives, and then the hard-driving rhythmic sound for the racing.

The final score was composed by Marco Beltrami and Buck Sanders.

I presume you were housed in multiple cutting rooms at a central facility.
McCusker: We cut at 20th Century Fox, where Jim has a large office space. We cut Logan and Wolverine there before this movie. It has several cutting spaces and I was situated between Andrew and Don. Ted was next to Don and John Berri, our additional editor. Assistants were right around the corner. It makes for a very efficient working environment.

Since the team was cutting with Avid Media Composer, did any of its features stand out to you for this film?
Both: FluidMorph! (laughing)

McCusker: FluidMorph, speed-ramping — we often had to manipulate the shot speeds to communicate the speed of the cars. A lot of these cars were kit cars that could drive safely at a certain speed for photography, but not at race speed. So we had to manipulate the speed a lot to get the sense of action that these cars have.
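
As a back-of-the-envelope illustration of that retime math (my own numbers, not the production’s): a car filmed at a safe 40 mph reads as 120 mph if the clip plays at three times speed, which an editor would dial in as a 300% speed change. A quick Python sketch:

```python
# Rough sketch of speed-ramp arithmetic (illustrative numbers only):
# to make a car filmed at a safe speed read as race speed, the clip
# plays back faster by the ratio of the two speeds.

def motion_effect_percent(safe_mph: float, desired_mph: float) -> float:
    """Playback rate needed, as an NLE-style motion-effect percentage."""
    if safe_mph <= 0:
        raise ValueError("safe speed must be positive")
    return 100.0 * desired_mph / safe_mph

# A kit car filmed at 40 mph that should read as 120 mph:
print(motion_effect_percent(40, 120))  # -> 300.0, i.e. play the clip at 3x

# Overcranking works the same way in reverse: footage shot at 48fps
# and played back at 24fps runs at 50%, or half speed.
print(100.0 * 24 / 48)                 # -> 50.0
```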

What about Avid’s ScriptSync? I know a lot of narrative editors love it.
McCusker: I used ScriptSync once a few years ago and I never cut a scene faster. I was so excited. Then I watched it, and it was terrible. To me there’s so much more to editing than hitting the next line of dialogue. I’m more interested in the lines between the lines — subtext. I do understand the value of it in certain applications. For instance, I think it’s great on straight comedy. It’s helpful to get around and find things when you are shooting tons of coverage for a particular joke. But for me, it’s not something I lean on. I mark up my own dailies and find stuff that way.

Tell me a bit more about your organizational process. Do you start with a KEM roll or stringouts of selected takes?
McCusker: I don’t watch dailies, at least in a traditional sense. I don’t start in the morning, watch the dailies and then cut. And I don’t ask my assistants to organize any of my dailies in bins. I come in and grab the scene that I have in front of me. I’ll look at the last take of every set-up quickly and then I spend an enormous amount of time — particularly on complex scenes — creating a bin structure that I can work with.

Sometimes it’s the beats in a scene, sometimes I organize by shot size, sometimes by character — it depends on what’s driving the scene. I learn my footage by organizing it. I remember shot sizes. I remember what was shot from set-up to set-up. I have a strong visual memory of where things are in a bin. So, if I ask an assistant to do that, then I’m not going to remember it. If there are a lot of resets or restarts in a take, I’ll have the assistant mark those up. But, I’ll go through and mark up beats or pivotal points in a scene, or particularly beautiful moments, and then I’ll start cutting.

Buckland: I’ve adopted a lot of Mike’s methodology, mainly because I assisted Mike on a few films. But it actually works for me, as well. I have a similar aesthetic to Mike.

Was this shot digitally?
McCusker: It was primarily shot with ARRI Alexa 65 LFs, plus some other small-format cameras. A lot of it was shot with old anamorphic lenses on the Alexa that allowed them to give it a bit of a vintage feeling. It’s interesting that as you watch it, you see the effect of the old lenses. There’s a fall-off on the edges, which is kind of cool. There were a couple of places where the subject matter was framed into the curve of the lens, which affects the focus. But we stuck with it, because it feels “of the time.”

Since the film takes place in the 1960s and has a lot of racing sequences, I assume there were a lot of VFX?
McCusker: The whole movie is a period film and we would temp certain things in the Avid for the rough cuts. John Berri was wrangling visual effects. He’s a master in the Avid and also Adobe After Effects. He has some clever ways of filling in backgrounds or greenscreens with temp elements to give the director an idea of what’s going to go there. We try to do as much temp work in the Avid as we are capable of doing, but there’s so much 3D visual effects work in this movie that we weren’t able to do that all of the time.

The racing is real. The cars are real. The visual effects work was for a lot of the backgrounds. The movie was shot almost entirely in Los Angeles, with some second-unit footage shot in Georgia. The modern-day Le Mans track isn’t at all representative of what Le Mans was in 1966, so there was no way to shoot that. Everything had to be doubled and then augmented with visual effects. In addition to Georgia, where they shot most of the actual racing for Le Mans, they went to France to get some shots of the actual town of Le Mans. I think only about four of those shots are left. (laughs)

Any final thoughts about how this film turned out?
McCusker: I’m psyched that people seem to like the film. Our concern was that we had a lot of story to tell. Would we wear audiences out? We continually have people tell us, “That was two and a half hours? We had no idea.” That’s humbling for us and a great feeling. It’s a movie about these really great characters with great scope and great racing. You can put all the big visual effects in a film that you want to, but it’s really about people.

Buckland: I agree. It’s more of a character movie with racing. Also, because I am not a racing fan per se, the character drama really pulled me into the film while working on it.


Oliver Peters is an experienced film and commercial editor/colorist. In addition, he regularly interviews editors for trade publications. He may be contacted through his website at oliverpeters.com.

The 70th annual ACE Eddie Award nominations

The American Cinema Editors (ACE), the honorary society of the world’s top film editors, has announced its nominations for the 70th Annual ACE Eddie Awards recognizing outstanding editing in 11 categories of film, television and documentaries.

For the first time in ACE’s history, three foreign language films are among the nominees, including The Farewell, I Lost My Body and Parasite, despite there not being a specific category for films predominantly in a foreign language.

Winners will be revealed during a ceremony on Friday, January 17 at the Beverly Hilton Hotel and will be presided over by ACE president, Stephen Rivkin, ACE. Final ballots open December 16 and close on January 6.

Here are the nominees:

BEST EDITED FEATURE FILM (DRAMA):
Ford v Ferrari
Michael McCusker, ACE & Andrew Buckland

The Irishman
Thelma Schoonmaker, ACE

Joker 
Jeff Groth

Marriage Story
Jennifer Lame, ACE

Parasite
Jinmo Yang

BEST EDITED FEATURE FILM (COMEDY):
Dolemite is My Name
Billy Fox, ACE

The Farewell
Michael Taylor & Matthew Friedman

Jojo Rabbit
Tom Eagles

Knives Out
Bob Ducsay

Once Upon a Time in Hollywood
Fred Raskin, ACE

BEST EDITED ANIMATED FEATURE FILM:
Frozen 2
Jeff Draheim, ACE

I Lost My Body
Benjamin Massoubre

Toy Story 4
Axel Geddes, ACE

BEST EDITED DOCUMENTARY (FEATURE):
American Factory
Lindsay Utz

Apollo 11
Todd Douglas Miller

Linda Ronstadt: The Sound of My Voice
Jake Pushinsky, ACE & Heidi Scharfe, ACE

Making Waves: The Art of Cinematic Sound
David J. Turner & Thomas G. Miller, ACE

BEST EDITED DOCUMENTARY (NON-THEATRICAL):
Abducted in Plain Sight
James Cude

Bathtubs Over Broadway
Dava Whisenant

Leaving Neverland
Jules Cornell

What’s My Name: Muhammad Ali
Jake Pushinsky, ACE

BEST EDITED COMEDY SERIES FOR COMMERCIAL TELEVISION:
Better Things: “Easter”
Janet Weinberg, ACE

Crazy Ex-Girlfriend: “I Need To Find My Frenemy” 
Nena Erb, ACE

The Good Place: “Pandemonium” 
Eric Kissack

Schitt’s Creek: “Life is a Cabaret”
Trevor Ambrose

BEST EDITED COMEDY SERIES FOR NON-COMMERCIAL TELEVISION:
Barry: “berkman > block”
Kyle Reiter, ACE

Dead to Me: “Pilot”
Liza Cardinale

Fleabag: “Episode 2.1”
Gary Dollner, ACE

Russian Doll: “The Way Out”
Todd Downing

BEST EDITED DRAMA SERIES FOR COMMERCIAL TELEVISION:
Chicago Med: “Never Going Back To Normal”
David J. Siegel, ACE

Killing Eve: “Desperate Times”
Dan Crinnion

Killing Eve: “Smell Ya Later”
Al Morrow

Mr. Robot: “401 Unauthorized”
Rosanne Tan, ACE

BEST EDITED DRAMA SERIES FOR NON-COMMERCIAL TELEVISION:
Euphoria: “Pilot”
Julio C. Perez IV

Game of Thrones: “The Long Night”
Tim Porter, ACE

Mindhunter: “Episode 2”
Kirk Baxter, ACE

Watchmen: “It’s Summer and We’re Running Out of Ice”
David Eisenberg

BEST EDITED MINISERIES OR MOTION PICTURE FOR TELEVISION:
Chernobyl: “Vichnaya Pamyat”
Jinx Godfrey & Simon Smith

Fosse/Verdon: “Life is a Cabaret”
Tim Streeto, ACE

When They See Us: “Part 1”
Terilyn A. Shropshire, ACE

BEST EDITED NON-SCRIPTED SERIES:
Deadliest Catch: “Triple Jeopardy”
Ben Bulatao, ACE, Rob Butler, ACE, Isaiah Camp, Greg Cornejo, Joe Mikan, ACE

Surviving R. Kelly: “All The Missing Girls”
Stephanie Neroes, Sam Citron, LaRonda Morris, Rachel Cushing, Justin Goll, Masayoshi Matsuda, Kyle Schadt

Vice Investigates: “Amazon on Fire”
Cameron Dennis, Kelly Kendrick, Joe Matoske, Ryo Ikegami

Main Image: Marriage Story

Storage for Editors

By Karen Moltenbrey

Whether you are a small-, medium- or large-size facility, storage is at the heart of your workflow. Consider, for instance, the one-person shop Fin Film Company, which films and edits footage for branding and events, often on water. Then there’s Uppercut, a boutique creative/post studio where collaborative workflow is the key to pushing boundaries on commercials and other similar projects.

Let’s take a look at Uppercut’s workflow first…

Uppercut
Uppercut is a creative editorial boutique founded by Micah Scarpelli in 2015 that offers a range of post services. Based in New York and soon Atlanta, the studio employs five editors with their own suites, along with an in-house Flame artist who has his own suite.

Taylor Schafer

In contrast to Uppercut’s size, its storage needs are quite large, with five editors working on as many as five projects at a time. Although most of it is commercial work, some of those projects can get heavy in terms of the generated media, which is stored on-site.

So, for its storage needs, the studio employs an EditShare RAID system. “Sometimes we have multiple editors working on one large campaign, and then usually an assistant is working with an editor, so we want to make sure they have access to all the media at the same time,” says Taylor Schafer, an assistant editor at Uppercut.

Additionally, Uppercut uses a Supermicro nearline server to store some of its VFX data, as the Flame artist cannot access the EditShare system on his CentOS operating system. Furthermore, the studio uses LTO-6 archive media in a number of ways. “We use EditShare’s Ark to LTO our partitions once the editors are done with them for their projects. It’s wonderfully integrated with the whole EditShare system. Ark is easy to navigate, and it’s easy to swap LTO tapes in and out, and everything is in one location,” says Schafer.

The studio employs the EditShare Ark to archive its editors’ working files, such as Premiere and Avid projects, graphics, transcodes and so forth. Uppercut also uses BRU (Backup Restore Utility) from Tolis Group to archive larger files that only live on LaCie hard drives and not on EditShare, such as a raw grade. “Then we’re LTO’ing the project and the whole partition with all the working files at the end through Ark,” Schafer explains.

The importance of having a system like this was punctuated over the summer when Uppercut underwent a renovation and had to move into temporary office space at Light Iron, New York — without the EditShare system. As a result, the team had to work off of hard drives and Light Iron’s Avid Nexis for some limited projects. “However, due to storage limits, we mainly worked off of the hard drives, and I realized how important a file storage system that has the ability to share data in real time truly is,” Schafer recalls. “It was a pain having to copy everything onto a hard drive, hand it back to the editor to make new changes, copy it again and make sure all the files were up to date, as opposed to using a storage system like ours, where everything is instantly up to date. You don’t have to worry whether something copied over correctly or not.”

She continues: “Even with Nexis, we were limited in our ability to restore old projects, which lived on EditShare.”
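
Schafer’s worry about whether something “copied over correctly” is exactly what checksum-verified offloads address. As a generic sketch of that idea in Python (my own illustration, not part of Uppercut’s EditShare or Nexis tooling), hash the file before and after the copy and refuse to continue if the two digests differ:

```python
# Minimal sketch of a checksum-verified copy -- the generic idea behind
# verified offload tools. Illustrative only; not Uppercut's actual setup.
import hashlib
import shutil
from pathlib import Path

def sha256_of(path: Path, chunk: int = 1 << 20) -> str:
    """Stream a file through SHA-256 without loading it all into RAM."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def verified_copy(src: Path, dst: Path) -> None:
    """Copy src to dst, then re-hash the destination to prove it matches."""
    before = sha256_of(src)
    dst.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(src, dst)
    after = sha256_of(dst)
    if before != after:
        raise IOError(f"checksum mismatch copying {src} -> {dst}")

# Hypothetical usage (the paths are made up for illustration):
# verified_copy(Path("A001C003.mxf"), Path("/mnt/shared/job/A001C003.mxf"))
```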

When a new project comes in at Uppercut, the first thing Schafer and her colleagues do is create a partition on EditShare and copy over the working template, whether it’s for Avid or Premiere, on that partition. Then they get their various working files and start the project, copying over the transcodes they receive. As the project progresses, the artists will get graphics and update the partition size as needed. “It’s so easy to change on our end,” notes Schafer. And once the project is completed, she or another assistant will make sure all the files they would possibly need, dating back to day one of the project, are on the EditShare, and that the client files are on the various hard drives and FTP links.

Reebok

“We’ll LTO the partition on EditShare through Ark onto an LTO-6 tape, and once that is complete, then generally we will take the projects or partition off the EditShare,” Schafer continues. The studio has approximately 26TB of RAID storage but, due to the large size of the projects, cannot retain everything on the EditShare long term. Nevertheless, the studio has a nearline server that hosts its masters and generics, as well as any other file the team might need to send to a client. “We don’t always need to restore. Generally the only time we try to restore is when we need to go back to the actual working files, like the Premiere or Avid project,” she adds.

Uppercut avoids keeping data locally on workstations due to the collaborative workflow.

According to Schafer, the storage setup is easy to use. Recently, Schafer finished a Reebok project she and two editors had been working on. The project initially started in Avid Media Composer, which was preferred by one of the editors. The other editor prefers Premiere but is well-versed on the Avid. After they received the transcodes and all the materials, the two editors started working in tandem using the EditShare. “It was great to use Avid on top of it, having Avid bins to open separately and not having to close out of the project and sharing through a media browser or closing out of entire projects, like you have to do with a Premiere project,” she says. “Avid is nice to work with in situations where we have multiple editors because we can all have the project open at once, as opposed to Premiere projects.”

Later, after the project was finished, the editor who prefers Premiere did a director’s cut in that software. As a result, Schafer had to re-transcode the footage, “which was more complicated because it was shot on 16mm, so it was also digitized and on one large video reel instead of many video files — on top of everything else we were doing,” she notes. She re-transcoded for Premiere and created a Premiere project from scratch, then added more storage on EditShare to make sure the files were all in place and that everything was up to date and working properly. “When we were done, the client had everything; the director had his director’s cut and everything was backed up to our nearline for easy access. Then it was LTO’d through Ark on LTO-6 tapes and taken off EditShare, as well as LTO’d on BRU for the raw and the grade. It is now done, inactive and archived.”

Without question, says Schafer, storage is important in the work she and her colleagues do. “It’s not so much about the storage itself, but the speed of the storage, how easily I’m able to access it, how collaborative it allows me to be with the other people I’m working with. Storage is great when it’s accessible and easy for pretty much anyone to use. It’s not so good when it’s slow or hard to navigate and possibly has tech issues and failures,” Schafer says. “So, when I’m looking for storage, I’m looking for something that is secure, fast and reliable, and most of all, easy to understand, no matter the person’s level of technical expertise.”

Chris Aguilar

Fin Film Company
People can count themselves fortunate when they can mix business with pleasure and integrate their beloved hobby with their work. Such is the case for solo producer/director/editor Chris Aguilar of Fin Film Company in Southern California, which he founded a decade ago. As Aguilar says, he does it all, as does Fin Film, which produces everything from conferences to music videos and commercial/branded content. But his real passion involves outdoor adventure paddle sports, from stand-up paddleboarding to pro paddleboarding.

“That’s been pretty much my niche,” says Aguilar, who got his start doing in-house production (photography, video and so forth) for a paddleboard company. Since then, he has been able to turn his passion and adventures into full-time freelance work. “When someone wants an event video done, especially one involving paddleboard races, I get the phone call and go!”

Like many videographers and editors, Aguilar got his start filming weddings. Always into surfing himself, he would shoot surfing videos of friends “and just have fun with it,” he says of augmenting that work. Eventually, this allowed him to move into areas he is more passionate about, such as surfing events and outdoor sports. Now, Aguilar finds that a lot of his time is spent filming paddleboard events around the globe.

Today, there are many one-person studios with solo producers, directors and editors. And as Aguilar points out, their storage needs might not be on the level of feature filmmakers or even independent TV cinematographers, but that doesn’t negate their need for storage. “I have some pretty wide-ranging storage needs, and it has definitely increased over the years,” he says.

In his work, Aguilar has to avoid cumbersome and heavy equipment, such as Atomos recorders, because of their weight on board the watercraft he uses to film paddleboard events. “I’m usually on a small boat and don’t have a lot of room to haul a bunch of gear around,” he says. Rather, Aguilar uses Panasonic’s AG-CX350 as well as Panasonic’s EVA1 and GH5, and on a typical two-day shoot (the event and interviews), he will fill five to six 64GB cards.

“Because most paddleboard races are long-distance, we’re usually on the water for about five to eight hours,” says Aguilar. “Although I am not rolling cameras the whole time, the weight still adds up pretty quickly.”

As for storage, Aguilar offloads his video onto SSD drives or other kinds of external media. “I call it my working drive, for editing and that kind of thing,” he says. “Once I am done with the edit and other tasks, I have all those source files somewhere.” He uses G-Technology’s 1TB G-Drive Mobile SSD in the field and for some editing, that company’s G-Drive ev RaW portable drive for backups and some editing, and Glyph’s Atom SSD in the field as well.

For years, that “somewhere” has been a cabinet that was filled with archived files. Indeed, that cabinet is currently holding, in Aguilar’s estimate, 30TB of data, if not more. “That’s just the archives. I have 10 or 11 years of archives sitting there. It’s pretty intense,” he adds. But, as soon as he gets an opportunity, those will be ported to the same cloud backup solution he is using for all his current work.

Yes, he still uses the source cards, but for a typical project involving an end-to-end shoot, Aguilar will use at least a 1TB drive to house all the source cards and all the subsequent work files. “Things have changed. Back in the day, I used hard drives – you should see the cabinet in my office with all these hard drives in it. Thank God for SSDs and other options out there. It’s changed our lives. I can get [some brands of] 1TB SSD for $99 or a little more right now. My workflow has me throwing all the source cards onto a drive like that, dedicated to those cards, and that becomes my little archive,” explains Aguilar.

He usually uploads the content as fast as possible to keep the data secure. “That’s always the concern, losing it, and that’s where Backblaze comes in,” Aguilar says. Backblaze is a cloud backup solution that is easily deployed across desktops and laptops and managed centrally — a solution Aguilar recently began employing. He also uses Iconik Solutions’ digital management system, which eases the task of looking up video files or pulling archived files from Backblaze. The digital management system sits on top of Backblaze and creates little offline proxies of the larger content, allowing Aguilar to view the entire 10-year archive online in one interface.
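
The proxy idea itself is easy to reproduce with off-the-shelf tools. Here is a rough sketch (my own example, not Iconik’s or Backblaze’s actual pipeline) of batching small H.264 offline proxies from Python, assuming ffmpeg is installed and on the PATH:

```python
# Rough sketch of generating small offline proxies with ffmpeg -- the same
# general idea as the browse proxies a media asset manager builds.
# Illustrative only; assumes ffmpeg is installed and on the PATH.
import subprocess
from pathlib import Path

def make_proxy(src: Path, out_dir: Path) -> Path:
    """Transcode src to a lightweight 540p H.264 proxy for browsing."""
    out_dir.mkdir(parents=True, exist_ok=True)
    proxy = out_dir / (src.stem + "_proxy.mp4")
    subprocess.run(
        [
            "ffmpeg", "-y", "-i", str(src),
            "-vf", "scale=-2:540",                 # 540p, even-width output
            "-c:v", "libx264", "-crf", "28", "-preset", "fast",
            "-c:a", "aac", "-b:a", "96k",
            str(proxy),
        ],
        check=True,
    )
    return proxy

# Hypothetical usage over an archive folder (paths are made up):
# for clip in Path("/archive/2019_races").glob("**/*.MP4"):
#     make_proxy(clip, Path("/proxies/2019_races"))
```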

According to Aguilar, his archived files are an important aspect of his work. Since he works so many paddleboard events, he often receives requests for clips from specific racers or races, some dating back years. Prior to using Backblaze, if someone requested footage, it was a challenge to locate it because he’d have to pull that particular hard drive and plug it into the computer, “and if I had been organized that year, I’ll know where that piece of content is because I can find it. If I wasn’t organized that year, I’d be in trouble,” he explains. “At best, though, it would be an hour and a half or more of looking around. Now I can locate and send it in 15 minutes.”

Aguilar says the Iconik digital management system allows him to pull up the content on the interface and drill down to the year of the race, click on it, download it and send it off or share it directly through his interface to the person requesting the footage.

Aguilar went live with this new Backblaze and digital management system storage workflow this year and has been fully on board with it for just the past two to three months. He is still uncovering all the available features and the power under the hood. “Even for a guy who’s got a technical background, I’m still finding things I didn’t know I could do,” he says, and as such he is still fine-tuning his workflow. “The neat thing with Iconik is that it could actually support online editing straight up, and that’s the next phase of my workflow, to accommodate that.”

Fortunately or unfortunately, Aguilar is just starting to come off his busy season, so now he can step back, explore the new system and transfer onto it all the material on the old source cards in that cabinet of his.

“[The new solution] is more efficient and has reduced costs since I am not buying all these drives anymore. I can reuse them now. But mostly, it has given me peace of mind that I know the data is secure,” says Aguilar. “I have been lucky in my career to be present for a lot of cool moments in the sport of paddling. It’s a small community and a very close-knit group. The peace of mind knowing that this history is preserved, well, that’s something I greatly appreciate. And I know my fellow paddlers also appreciate it.”


Karen Moltenbrey is a veteran writer, covering visual effects and post production.

The Irishman editor Thelma Schoonmaker

By Iain Blair

Editor Thelma Schoonmaker is a three-time Academy Award winner who has worked alongside filmmaker Martin Scorsese for almost 50 years. Simply put, Schoonmaker has been Scorsese’s go-to editor and key collaborator over the course of some 25 films, winning Oscars for Raging Bull, The Aviator and The Departed. The 79-year-old also received a career achievement award from the American Cinema Editors (ACE).

Thelma Schoonmaker

Schoonmaker cut Scorsese’s first feature, 1967’s Who’s That Knocking at My Door, and since 1980’s Raging Bull has worked on all of his features, receiving a number of Oscar nominations along the way. There are too many to name, but some highlights include The King of Comedy, After Hours, The Color of Money, The Last Temptation of Christ, Goodfellas, Casino and Hugo.

Now Scorsese and Schoonmaker have once again turned their attention to the mob with The Irishman, which was nominated for 10 Academy Awards, including one for Schoonmaker’s editing work. Starring Robert De Niro, Al Pacino and Joe Pesci, it’s an epic saga that runs 3.5 hours and focuses on organized crime in post-war America. It’s told through the eyes of World War II veteran Frank Sheeran (De Niro), a hustler and hitman who worked alongside some of the most notorious figures of the 20th century. Spanning decades, the film chronicles one of the greatest unsolved mysteries in American history, the disappearance of legendary union boss Jimmy Hoffa. It also offers a monumental journey through the hidden corridors of organized crime — its inner workings, rivalries and connections to mainstream politics.

But there’s a twist to this latest mob drama that Scorsese directed for Netflix from a screenplay by Steven Zaillian. Gone are the flashy wise guys and the glamour of Goodfellas and Casino. Instead, the film examines the mundane nature of mob killings and the sad price any survivors pay in the end.

Here, Schoonmaker — who in addition to her film editing works to promote the films and writings of her late husband, famed British director Michael Powell (The Red Shoes, Black Narcissus) — talks about cutting The Irishman, working with Scorsese and their long and storied collaboration.

The Irishman must have been very challenging to cut, just in terms of its 3.5-hour length?
Actually, it wasn’t very challenging to cut. It came together much more quickly than some of our other films because Scorsese and Steve Zaillian had created a very strong structure. I think some critics think I came up with this structure, but it was already there in the script. We didn’t have to restructure, which we do sometimes, and only dropped a few minor scenes.

Did you stay in New York cutting while he shot on location, or did you visit the set?
Almost everything in The Irishman was shot in or around New York. The production was moving all over the place, so I never got to the set. I couldn’t afford the time.

When I last interviewed Marty, he told me that editing and post are his favorite parts of filmmaking. When the two of you sit down to edit, is it like having two editors in the room rather than a director and his editor?
Marty’s favorite part of filmmaking is editing, and he directs the editing after he finishes shooting. I do an assembly based on what he tells me in dailies and what I feel, and then we do all the rest of the editing together.

Could you give us some sense of how that collaboration works?
We’ve worked together for almost 50 years, and it’s a wonderful collaboration. He taught me how to edit at first, but then gradually it has become more of a collaboration. The best thing is that we both work for what is best for the film — it never becomes an ego battle.

How long did it take to edit the film, and what were the main challenges?
We edited for a year and the footage was so incredibly rich: the only challenge was to make sure we chose the best of it and took advantage of the wonderful improvisations the actors gave us. It was a complete joy for Scorsese and me to edit this film. After we locked the film, we turned over to ILM so they could do the “youthifying” of the actors. That took about seven months.

Could you talk about finding the overall structure and considerable use of flashbacks to tell the story?
Scorsese had such a strong concept for this film — and one of his most important ideas was to not explain too much. He respects the audience’s ability to figure things out themselves without pummeling them with facts. It was a bold choice and I was worried about it, frankly, at first. But he was absolutely right. He didn’t want the film to feel like a documentary. He wanted to use brushstrokes of history just to show how they affected the characters. The way the characters were developed in the film, particularly Frank Sheeran, the De Niro character, was what was most important.

Could you talk about the pacing, and how you and Marty kept its momentum going?
Scorsese was determined that The Irishman would have a slower pace than many films today. He gave the film a deceptive simplicity. Interestingly, our first audiences had no problem with this — they became gripped by the characters and kept saying they didn’t mind the length and loved the pace. Many of them said they wanted to see the film again right away.

There are several slo-mo sequences. Could you talk about why you used them and to what effect?
The Phantom camera slow-motion wedding sequence (250fps) near the end of the film was done to give the feeling of a funeral instead of a wedding, because the De Niro character has just been forced to do the worst thing he will ever do in his life. Scorsese wanted to hold on De Niro’s face and evoke what he is feeling, and to study the Italian-American faces of the mobsters surrounding him. Instead of the joy a wedding is supposed to bring, there is a deep feeling of grief.

What was the most difficult sequence to cut and why?
The montage where De Niro repeatedly throws guns into the river after he has killed someone took some time to get right. It was very normal at first — and then we started violating the structure and jump cutting and shortening until we got the right feeling. It was fun.

There’s been a lot of talk about the digital de-aging process. How did it impact the edit?
Pablo Helman at ILM came up with the new de-aging process, and it works incredibly well. He would send shots and we would evaluate them and sometimes ask for changes — usually to be sure that we kept the amazing performances of De Niro, Pacino and Pesci intact. Sometimes we would put back in a few wrinkles if it meant we could keep the subtlety of De Niro’s acting, for example. Scorsese was adamant that he didn’t want to have younger actors play the three main parts in the beginning of the film. So he really wanted this “youthifying” process to work — and it does!

There’s a lot of graphic violence. How do you feel about that in the film?
Scorsese made the violence very quick in The Irishman and shot it in a deceptively simple way. There aren’t any complicated camera moves and flashy editing. Sometimes the violence takes place after a simple pan, when you least expect it because of the blandness of the setting. He wanted to show the banality of violence in the mob — that it is a job, and if you do it well, you get rewarded. There’s no morality involved.

Last time we talked, you were using the Lightworks editing system. Do you still use Lightworks, and if so, can you talk about the system’s advantages for you?
I use Lightworks because the editing surface is still the fastest, most efficient and most intuitive to use. Maintaining sync is different from all other NLE systems. You don’t correct sync by sync lock — if you go out of sync, Lightworks gives you a red icon showing the number of frames you are out of sync. You get to choose where you want to correct it. Since editors place sound and picture on the timeline, adjusting sync exactly where you want to is much more efficient.

You’ve been Marty’s editor since his very first film — a 50-year collaboration. What’s the secret?
I think Scorsese felt when he first met me that I would do what was right for his films — that there wouldn’t be ego battles. We work together extremely well. That’s all there is to it. There couldn’t be a better job.

Do you ever have strong disagreements about the editing?
If we do have disagreements, which is very rare, they are never strong. He is very open to experimentation. Sometimes we will screen two ways and see what the audience says. But that is very rare.

What’s next?
A movie about the Osage Nation in Oklahoma, based on the book “Killers of the Flower Moon” by David Grann.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Alaina Zanotti rejoins Cartel as executive producer

Santa Monica-based editorial and post studio Cartel has named Alaina Zanotti as executive producer to help with business development and to oversee creative operations along with partner and executive producer Lauren Bleiweiss. Additionally, Cartel has bolstered its roster with the signing of comedic editor Kevin Zimmerman.

Kevin Zimmerman

With more than 15 years of experience, Zanotti joins Cartel after working for clients that include BBDO, Wieden+Kennedy, Deutsch, Google, Paramount and Disney. Zanotti most recently served as senior executive producer at Method Studios, where she oversaw business development for global VFX and post. Prior to that stint, she joined Cartel in 2016 to assist the newly established post and editorial house’s growth. Previously, Zanotti spent more than a decade driving operations and raising brand visibility for Method and Company 3.

Editor Zimmerman joins Cartel following a tenure as a freelance editor, during which his comedic timing and entrepreneurial spirit earned him commercial work for Avocados From Mexico and Planters that aired during 2019’s Super Bowl.

Throughout his two-decade career in editorial, Zimmerman has held positions at Spot Welders, NO6, Whitehouse Post and FilmCore, with recent work for Sprite, Kia, hotels.com, Microsoft and Miller Lite, and a PSA for Girls Who Code. Zimmerman has previously worked with Cartel partners Adam Robinson and Leo Scott.

Chris Hellman joins Harbor as ECD of editorial

Harbor has added award-winning editor Chris Hellman as executive creative director of editorial. Hellman brings 35 years of experience editing commercials in collaboration with producers, art directors, writers and directors. He will be based at Harbor in New York but available at its locations in LA and London as well.

During his long and distinguished career, Hellman has garnered multiple Cannes Lions, Addy Awards, Clios, One Show awards, London International Awards, CA annuals and AICP Awards. He served as senior editor at Crew Cuts for 16 years, was owner/partner and senior editor at Homestead Editorial, and then became senior editor at Cutting Room Films. Hellman later took up the role of creative director of post production with the FCB Health network of agencies. His work has been seen in movie theaters, during concerts and at the Super Bowl, and as short films and spoof commercials on Saturday Night Live.

“Creating great commercial advertising is about collaboration,” says Hellman. “Harbor is evolving to take that collaboration to a new level, offering clients an approach where the editor is brought into the creative process early on, bringing a new paradigm and a singular creative force.”

Hellman’s clients have included AT&T, Verizon, IBM, Intel, ESPN, NFL, MLB, NBA, Nike, Adidas, New Balance, 3M, Starbucks, Coke, Pepsi, Lipton, Tropicana, Audi, BMW, Volvo, Ford, Jaguar, GMC, Chrysler, Porsche, Pfizer, Merck, Novartis, AstraZeneca, Bayer, Johnson & Johnson, General Mills, Unilever, Lancôme, Estée Lauder, Macy’s, TJ Maxx, Tommy Hilfiger, Victoria’s Secret, Lands’ End and The Jon Stewart Show, among many others.

Behind the Title: Logan & Sons director Tom Schlagkamp

This director also loves editing, sound design and working with VFX long before and after the shoot.

Name: Tom Schlagkamp

Company: Logan & Sons, the live-action division of bicoastal content creation studio Logan, which is based in NYC and LA.

Job Title: Director

What’s your favorite part of the job?
I can honestly say I love every detail of the job, even the initial pitch, as it’s the first contact with a new story, a new project and a new challenge. I put a lot of heart into every aspect of a film. The better you’ve prepared in preproduction, the more creative you can be during the shoot — it gives you more time and oversight while shooting and more power to react if anything changes.

Tom Schlagkamp’s short film Dysconnected.

For my European, South African and Asian projects, I’m also very happy to be deeply involved in editing, sound design and post production, as I love working with the material. I usually shoot plenty of footage, so there are more possibilities to work with in the edit.

What’s your least favorite?
Not winning a job — that’s why I try to avoid that… (laughs).

If you didn’t have this job, what would you be doing instead?
Well, plan A would be a rock star — specifically, a guitarist in a thrash metal band. Plan B would be the exact opposite: working at my family’s winery — Schlagkamp-Desoye, in Germany’s beautiful Mosel Valley. My brother now runs the company, which is in its 11th generation; our family has grown wine since 1602. The winery also includes a wine museum.

How early on did you know this would be your path?
In Germany, you don’t necessarily jump from high school to college right away, so I took some time to learn the basics of filmmaking with as much practical experience as I could get. That included directing music videos and short films while I worked for Germany’s biggest TV station, RTL. There I learned to edit and produced campaigns for shows — in particular, movie trailers and campaigns for the TV premieres of blockbuster movies. That was a lot of work and a lot of fun at the same time.

What was it about directing that attracted you?
The whole idea of creating something completely new. I loved (and still do) the films of the “New Hollywood” and the Nouvelle Vague — they challenged the regular way of storytelling and created something outstanding that changed filmmaking forever. This fascinated me, and I knew I had to learn the rules first in order to be able to question them, so I started studying at Germany’s film academy, the Filmakademie Baden-Württemberg.

What is it about directing that keeps you interested?
It’s about always moving forward. There are so many more ways you can tell a story and so many stories that have not yet been told, so I love working on as many projects as possible.

Dysconnected

Do you get involved with post at all?
Yes, I love to be part of that whenever the circumstances allow it. As mentioned before, I love editing and sound design as well, but also planning and working with VFX long before and after the shoot is fascinating to me.

Can you name some recent projects you have worked on?
As I answer these questions, I’m sitting at the airport in Berlin, traveling to Johannesburg, South Africa. I’m excited about shooting a series of commercials in the African savanna. I shot many commercials this year, but was also happy that my short film Dysconnected, which I shot in Los Angeles last year, premiered at LA Shorts International Film Festival this summer.

What project are you most proud of?
I loved shooting the Rock ’n’ Roll Manifesto for Visions magazine because it was the perfect combination of my job as a director and my aforementioned plan A, making my living as a musician. Also, everybody involved in the project was so into it, and it was the best shooting experience. Winning awards with it in the end was an added bonus.

Rock ‘n’ Roll Manifesto

Name three pieces of technology you can’t live without.
1. Noise cancelling headphones. When I travel, I love listening to music and podcasts, and with these headphones you can dive into that world perfectly.
2. My mobile phone, which I hardly use for phone calls anymore but everything else.
3. My laptop, which is part of every project from the beginning until the end.

What do you do to de-stress from it all?
Cycling, hiking and rock concerts. There is nothing like the silence of being in pure nature and the loudness of heavy guitars and drums at a metal show (laughs).

2019 HPA Award winners announced

The industry came together on November 21 in Los Angeles to celebrate its own at the 14th annual HPA Awards. Awards were given to individuals and teams working in 12 creative craft categories, recognizing outstanding contributions to color grading, sound, editing and visual effects for commercials, television and feature film.

Rob Legato receiving Lifetime Achievement Award from presenter Mike Kanfer. (Photo by Ryan Miller/Capture Imaging)

As was previously announced, renowned visual effects supervisor and creative Robert Legato, ASC, was honored with this year’s HPA Lifetime Achievement Award; Peter Jackson’s They Shall Not Grow Old was presented with the HPA Judges Award for Creativity and Innovation; acclaimed journalist Peter Caranicas was the recipient of the very first HPA Legacy Award; and special awards were presented for Engineering Excellence.

The winners of the 2019 HPA Awards are:

Outstanding Color Grading – Theatrical Feature

WINNER: “Spider-Man: Into the Spider-Verse”
Natasha Leonnet // Efilm

“First Man”
Natasha Leonnet // Efilm

“Roma”
Steven J. Scott // Technicolor

Natasha Leonnet (Photo by Ryan Miller/Capture Imaging)

“Green Book”
Walter Volpatto // FotoKem

“The Nutcracker and the Four Realms”
Tom Poole // Company 3

“Us”
Michael Hatzer // Technicolor


Outstanding Color Grading – Episodic or Non-theatrical Feature

WINNER: “Game of Thrones – Winterfell”
Joe Finley // Sim, Los Angeles

“The Handmaid’s Tale – Liars”
Bill Ferwerda // Deluxe Toronto

“The Marvelous Mrs. Maisel – Vote for Kennedy, Vote for Kennedy”
Steven Bodner // Light Iron

“I Am the Night – Pilot”
Stefan Sonnenfeld // Company 3

“Gotham – Legend of the Dark Knight: The Trial of Jim Gordon”
Paul Westerbeck // Picture Shop

“The Man in The High Castle – Jahr Null”
Roy Vasich // Technicolor


Outstanding Color Grading – Commercial  

WINNER: Hennessy X.O. – “The Seven Worlds”
Stephen Nakamura // Company 3

Zara – “Woman Campaign Spring Summer 2019”
Tim Masick // Company 3

Tiffany & Co. – “Believe in Dreams: A Tiffany Holiday”
James Tillett // Moving Picture Company

Palms Casino – “Unstatus Quo”
Ricky Gausis // Moving Picture Company

Audi – “Cashew”
Tom Poole // Company 3


Outstanding Editing – Theatrical Feature

Once Upon a Time… in Hollywood

WINNER: “Once Upon a Time… in Hollywood”
Fred Raskin, ACE

“Green Book”
Patrick J. Don Vito, ACE

“Rolling Thunder Revue: A Bob Dylan Story by Martin Scorsese”
David Tedeschi, Damian Rodriguez

“The Other Side of the Wind”
Orson Welles, Bob Murawski, ACE

“A Star Is Born”
Jay Cassidy, ACE


Outstanding Editing – Episodic or Non-theatrical Feature (30 Minutes and Under)

VEEP

WINNER: “Veep – Pledge”
Roger Nygard, ACE

“Russian Doll – The Way Out”
Todd Downing

“Homecoming – Redwood”
Rosanne Tan, ACE

“Withorwithout”
Jake Shaver, Shannon Albrink // Therapy Studios

“Russian Doll – Ariadne”
Laura Weinberg


Outstanding Editing – Episodic or Non-theatrical Feature (Over 30 Minutes)

WINNER: “Stranger Things – Chapter Eight: The Battle of Starcourt”
Dean Zimmerman, ACE, Katheryn Naranjo

“Chernobyl – Vichnaya Pamyat”
Simon Smith, Jinx Godfrey // Sister Pictures

“Game of Thrones – The Iron Throne”
Katie Weiland, ACE

“Game of Thrones – The Long Night”
Tim Porter, ACE

“The Bodyguard – Episode One”
Steve Singleton


Outstanding Sound – Theatrical Feature

WINNER: “Godzilla: King of the Monsters”
Tim LeBlanc, Tom Ozanich, MPSE // Warner Bros.
Erik Aadahl, MPSE, Nancy Nugent, MPSE, Jason W. Jennings // E Squared

“Shazam!”
Michael Keller, Kevin O’Connell // Warner Bros.
Bill R. Dean, MPSE, Erick Ocampo, Kelly Oxford, MPSE // Technicolor

“Smallfoot”
Michael Babcock, David E. Fluhr, CAS, Jeff Sawyer, Chris Diebold, Harrison Meyle // Warner Bros.

“Roma”
Skip Lievsay, Sergio Diaz, Craig Henighan, Carlos Honc, Ruy Garcia, MPSE, Caleb Townsend

“Aquaman”
Tim LeBlanc // Warner Bros.
Peter Brown, Joe Dzuban, Stephen P. Robinson, MPSE, Eliot Connors, MPSE // Formosa Group


Outstanding Sound – Episodic or Non-theatrical Feature

WINNER: “The Haunting of Hill House – Two Storms”
Trevor Gates, MPSE, Jason Dotts, Jonathan Wales, Paul Knox, Walter Spencer // Formosa Group

“Chernobyl – 1:23:45”
Stefan Henrix, Stuart Hilliker, Joe Beal, Michael Maroussas, Harry Barnes // Boom Post

“Deadwood: The Movie”
John W. Cook II, Bill Freesh, Mandell Winter, MPSE, Daniel Colman, MPSE, Ben Cook, MPSE, Micha Liberman // NBC Universal

“Game of Thrones – The Bells”
Tim Kimmel, MPSE, Onnalee Blank, CAS, Mathew Waters, CAS, Paula Fairfield, David Klotz

“Homecoming – Protocol”
John W. Cook II, Bill Freesh, Kevin Buchholz, Jeff A. Pitts, Ben Zales, Polly McKinnon // NBC Universal


Outstanding Sound – Commercial 

WINNER: John Lewis & Partners – “Bohemian Rhapsody”
Mark Hills, Anthony Moore // Factory

Audi – “Life”
Doobie White // Therapy Studios

Leonard Cheshire Disability – “Together Unstoppable”
Mark Hills // Factory

New York Times – “The Truth Is Worth It: Fearlessness”
Aaron Reynolds // Wave Studios NY

John Lewis & Partners – “The Boy and the Piano”
Anthony Moore // Factory


Outstanding Visual Effects – Theatrical Feature

WINNER: “The Lion King”
Robert Legato
Andrew R. Jones
Adam Valdez, Elliot Newman, Audrey Ferrara // MPC Film
Tom Peitzman // T&C Productions

“Avengers: Endgame”
Matt Aitken, Marvyn Young, Sidney Kombo-Kintombo, Sean Walker, David Conley // Weta Digital

“Spider-Man: Far From Home”
Alexis Wajsbrot, Sylvain Degrotte, Nathan McConnel, Stephen Kennedy, Jonathan Opgenhaffen // Framestore

“Alita: Battle Angel”
Eric Saindon, Michael Cozens, Dejan Momcilovic, Mark Haenga, Kevin Sherwood // Weta Digital

“Pokémon Detective Pikachu”
Jonathan Fawkner, Carlos Monzon, Gavin Mckenzie, Fabio Zangla, Dale Newton // Framestore


Outstanding Visual Effects – Episodic (Under 13 Episodes) or Non-theatrical Feature

Game of Thrones

WINNER: “Game of Thrones – The Bells”
Steve Kullback, Joe Bauer, Ted Rae
Mohsen Mousavi // Scanline
Thomas Schelesny // Image Engine

“Game of Thrones – The Long Night”
Martin Hill, Nicky Muir, Mike Perry, Mark Richardson, Darren Christie // Weta Digital

“The Umbrella Academy – The White Violin”
Everett Burrell, Misato Shinohara, Chris White, Jeff Campbell, Sebastien Bergeron

“The Man in the High Castle – Jahr Null”
Lawson Deming, Cory Jamieson, Casi Blume, Nick Chamberlain, William Parker, Saber Jlassi, Chris Parks // Barnstorm VFX

“Chernobyl – 1:23:45”
Lindsay McFarlane
Max Dennison, Clare Cheetham, Steven Godfrey, Luke Letkey // DNEG


Outstanding Visual Effects – Episodic (Over 13 Episodes)

Team from The Orville – Outstanding VFX, Episodic, Over 13 Episodes (Photo by Ryan Miller/Capture Imaging)

WINNER: “The Orville – Identity: Part II”
Tommy Tran, Kevin Lingenfelser, Joseph Vincent Pike // FuseFX
Brandon Fayette, Brooke Noska // Twentieth Century FOX TV

“Hawaii Five-0 – Ke iho mai nei ko luna”
Thomas Connors, Anthony Davis, Chad Schott, Gary Lopez, Adam Avitabile // Picture Shop

“9-1-1 – 7.1”
Jon Massey, Tony Pirzadeh, Brigitte Bourque, Gavin Whelan, Kwon Choi // FuseFX

“Star Trek: Discovery – Such Sweet Sorrow Part 2”
Jason Zimmerman, Ante Dekovic, Aleksandra Kochoska, Charles Collyer, Alexander Wood // CBS Television Studios

“The Flash – King Shark vs. Gorilla Grodd”
Armen V. Kevorkian, Joshua Spivack, Andranik Taranyan, Shirak Agresta, Jason Shulman // Encore VFX

The 2019 HPA Engineering Excellence Awards were presented to:

Adobe — Content-Aware Fill for Video in Adobe After Effects

Epic Games — Unreal Engine 4

Pixelworks — TrueCut Motion

Portrait Displays and LG Electronics — CalMan LUT based Auto-Calibration Integration with LG OLED TVs

Honorable Mentions were awarded to Ambidio for Ambidio Looking Glass, Grass Valley for Creative Grading and Netflix for Photon.

Julian Clarke on editing Terminator: Dark Fate

By Oliver Peters

Linda Hamilton’s Sarah Connor and Arnold Schwarzenegger’s T-800 are back to save humanity from a dystopian future in this latest installment of the Terminator franchise. James Cameron is also back, bringing writing and producing credits, which is fitting — Terminator: Dark Fate is in essence Cameron’s sequel to Terminator 2: Judgment Day.

Julian Clarke

Tim Miller (Deadpool) is at the helm to direct the tale. It’s roughly two decades after the time of T2, and a new Rev-9 machine has been sent from an alternate future to kill Dani Ramos (Natalia Reyes), an unsuspecting auto plant worker in Mexico. But the new future’s resistance has sent back Grace (Mackenzie Davis), an enhanced super-soldier, to combat the Rev-9 and save her. They cross paths with Connor, and the story sets off on a mad dash to the finale at Hoover Dam.

Miller brought back much of his Deadpool team, including his VFX shop Blur, DP Ken Seng and editor Julian Clarke. This is also the second pairing of Miller and Clarke with Adobe. Both Deadpool and Terminator: Dark Fate were edited using Premiere Pro. In fact, Adobe was also happy to tie in with the film’s promotion through its own #CreateYourFate trailer remix challenge. Participants could re-edit their own trailer using supplied content from the film.

I recently spoke with Clarke about the challenges and fun of cutting this latest iteration of such an iconic film franchise.

Terminator: Dark Fate picks up two decades after Terminator 2, leaving out the timelines of the subsequent sequels. Was that always the plan, or did it evolve out of the process of making the film?
That had to do with the screenplay. You were written into a corner by the various sequels. We really wanted to bring Linda Hamilton’s character back. With Jim involved, we wanted to get back to first principles and have it based on Cameron’s mythology alone. To get back to the Linda/Arnold character arcs, and then add some new stuff to that.

Many fans were attracted to the franchise by Cameron’s two original Terminator films. Was there a conscious effort at integrating that nostalgia?
I come from a place of deep fandom for Terminator 2. As a teenager I had VHS copies of Aliens and Terminator 2 and watched them on repeat after school! Those films are deeply embedded in my psyche, and both of them have aged well — they still hold up. I watched the sequels, and they just didn’t feel like a Terminator film to me. So the goal was definitely to make it of the DNA of those first two movies. There’s going to be a chase. It’s going to be more grounded. It’s going to get back into the Sarah Connor character and have more heart.

This film tends to have elements of humor unlike most other action films. That must have posed a challenge to set the right tone without getting campy.
The humor thing is interesting. Terminator 2 has a lot of humor throughout. We have a little bit of humor in the first half and then more once Arnold shows up, but that’s really the way it had to be. The Dani Ramos character — who’s your entry point into the movie — is devastated when her whole family is killed. To have a lot of jokes happening would be terrible. It’s not the same in Terminator 2 because John Connor’s stepparents get very little screen time, and they don’t seem that nice. You feel bad for them, but it’s OK that you get into this funny stuff right off the bat. On this one we had to ease into the humor so you could [experience] the gravity of the situation at the start of the movie.

Did you have to do much to alter that balance during the edit?
There were one or two jokes that we nipped out, but it wasn’t like that whole first act was chock full of jokes. The tone of the first act is more like Terminator, which is more of a thriller or horror movie. Then it becomes more like T2 as the action gets bigger and the jokes come in. So the first half is like a bigger Terminator and the second half more like T2.

Deadpool, which Tim Miller also directed, used a very nonlinear story structure, balancing action, comedic moments and drama. Terminator was always designed with a linear, straightforward storyline. Right?
A movie hands you certain editing tools. Deadpool was designed to be nonlinear, with characters in different places, so there are a whole bunch of options for you. Terminator: Dark Fate is more like a road movie. Certain stops along the road are predetermined. You can’t be in Texas before Mexico. So the structural options you had were where to check in with the Rev-9, as well as the inter-scene structure. Once you are in the detention center, who are you cutting to? Sarah? Dani? However, where that is placed in the movie is pretty much set. All you can do is pace it up, pace it down and adjust how you get there. There aren’t a lot of movable pieces that can be swapped around.

When we talked after Deadpool, you discussed how you liked the assistants to build string-outs — what some call a KEM roll — where similar action from every take is assembled back to back into a single sequence. Did you use that same organizational method on Terminator: Dark Fate?
Sometimes we were so swamped with material that there wasn’t time to create string-outs. I still like to have those. It’s a nice way to quickly see all the pieces that cover a moment. If you are trying to find the one take or action that’s 5% better than another, then it’s good to see them all in a row, rather than trying to keep it all in your head for a five-minute take. There was a lot of footage that we shot in the action scenes, but we didn’t do 11 or 12 takes for a dialogue scene. I didn’t feel like I needed some tool to quickly navigate through the dialogue takes. We would string out the ones that were more complicated.

Depending on the directing style, a series of takes may have increasingly calibrated performances with successive takes. With other directors, each take might be a lot different than the one before and after it. What is your approach to evaluating which is the best take to use?
It’s interesting when you use the earlier takes versus the later takes and what you get from them. The later takes are usually the most directed: the actors are warmed up and most closely nail what the director has in mind. So they are strong in that regard, but sometimes they can become more self-conscious. The first take, by contrast, can be more of a throwaway: it may have less power, but it feels more real — more off the cuff. Sometimes a delivered dialogue line feels less written, and you’ll buy it more. Other times you’ll want that more dramatic quality of the later takes. My instinct is to use the later takes first, but as you revise a scene, you often go back to pieces of the earlier takes to ground it a little more.

How long did the production and post take?
It took a little over 100 days of shooting with a lot of units. I work on a lot of mid-budget films, so this seemed like a really long shoot. It was a little relentless for everyone — even squeezing it into those 100 days. Shooting action with a lot of VFX is slow due to the reset time needed between takes. The ending of the movie is 30 minutes of action in a row. That’s a big job shooting all of that stuff. When they have a couple of units cranking through the dialogue scenes plus shooting action sequences — that’s when I have to work hard to keep up. Once you hit the roadblocks of shooting just those little action pieces, you get a little time to catch up.

We had the usual director’s cut period and finished by the end of this September. The original plan was to finish by the beginning of September, but we needed the time for VFX. So everything piled up with the DI and the mix in order to still hit the release date. September got a little crazy. It seems like a long time — a total of 13 or 14 months — but it still was an absolute sprint to get the movie in shape and get the VFX into the film in time. This might be normal for some of these films, but compared to the other VFX movies I’ve done, it was definitely turning things up a notch!

I imagine that there was a fair amount of previz required to lay out the action for the large VFX and CG scenes. Did you have that to work with as placeholder shots? How did you handle adjusting the cut as the interim and final shots were delivered?
Tim is big into previz, with his background in VFX and animation and owning his own VFX company. We had very detailed animatics going into production. Depending on a lot of factors, you still abandon a lot of things. For example, the freeway chases are quite a bit different, because when you go there and do it with real cars, they do different things. Or the cars only look like they’re going fast enough part of the time. Those scenes became quite different from the previz.

Others are almost 100% CG, so you can drop in the previz as placeholders. Although, even in those cases, sometimes the finished shot doesn’t feel real enough. In the “cartoon” world of previz, you can do wild camera moves and say, “Wow, that seems cool!” But when you start doing it at photoreal quality, then you go, “This seems really fake.” So we tried to get ahead of that stuff and find what to do with the camera to ground it. Kind of mess it up so it’s not too dynamic and perfect.

How involved were you with shaping the music? Did you use previous Terminator films’ scores as a temp track to cut with?
I was very involved with the music production. I definitely used a lot of temp music. Some of it was ripped from old Terminator movies, but there’s only so much Terminator 2 music you can put in — those scores used a lot of synthesizers that date the sound. I did use “Desert Suite” from Terminator 2, when Sarah is in the hotel room. I loved having a very direct homage to a Sarah Connor moment while she’s talking about John. Then I begged our composer, Tom Holkenborg (aka Junkie XL), to consider doing a version of it for our movie. So it is essentially the same chord progression.

That raised an interesting question, musically and in general, about how much you lean into the homage thing. It’s powerful when you do it, but if you do it too much, it starts to feel artificial or pandering. So I tried to hit the sweet spot where you knew you were watching a Terminator movie, but not so much that it felt like Terminator karaoke. How many times can you go da-dum-dum-da-da-dum? You have to pick your moments for those Terminator motifs. It’s diminishing returns if you do it too much.

Another inspirational moment for me was another part in Terminator 2. There’s a disturbing industrial sound for the T-1000. It sounds more like a foghorn or something in a factory rather than music, and it created this unnerving quality to the T-1000 scenes, when he’s just scoping things out. So we came up with a modern-day electronic equivalent for the Rev-9 character, and that was very potent.

Was James Cameron involved much in the post production?
He’s quite busy with his Avatar movies. Some of the time he was in New Zealand, some of the time he was in Los Angeles. Depending on where he was and where we were in the process, we would hit milestones, like screenings or the first cut. We would send him versions and download a bunch of his thoughts.

Editing is very much a part of his wheelhouse. Unlike many other directors, he really thinks about this shot, then that shot, then the next shot. His mind really works that way. Sometimes he would give us pretty specific, dialed-in notes on things. Sometimes it would just be bigger suggestions, like, “Maybe the action cutting pattern could be more like this …” So we’d get his thoughts — and, of course, he’s Jim Cameron, and he knows the business and the Terminator franchise — so I listened pretty carefully to that input.

This is the second film that you’ve cut with Premiere Pro. Deadpool was first, and there were challenges using it on such a complex project. What was the experience like this time around?
There are always challenges whenever you set out to use a new workflow — not that Premiere is new, because it’s been around a long time and has millions of users, but it’s unusual to use it on large VFX movies, for specific reasons.

L-R: Matthew Carson and Julian Clarke

On Deadpool, that led to certain challenges, and that’s just what happens when you try to do something new — for instance, we had to split the movie into separate projects for each reel instead of one large project. Even then, the size of our project files made it tough. They were so full of media that they would take five minutes to open. Nevertheless, we made it work, and there are lots of benefits to using Adobe over other applications.

In comparison, the interface of Avid Media Composer looks like it was designed 20 years ago, but they have multi-user collaboration nailed, and I love the trim tool. Yet some things are old and creaky. Adobe’s not that at all — it’s nice and elegant in terms of the actual editing process. We got through it and sat down with Adobe to point out things that needed work, and they worked on them. When we started up Terminator, they had a whole new build for us. Project files now opened in 15 seconds. They are about halfway there in terms of multi-user editing. Now everyone can go into a big, shared project, and you can move bins back and forth, although only one user at a time has write access to the master project.
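That single-writer model is a classic pattern for shared storage. Here is a generic sketch of how such a rule is often enforced with an advisory lock file; this is my own illustration, not Adobe's actual mechanism, and the file names are hypothetical:

```python
# Generic single-writer lock sketch -- not Adobe's implementation.
import os

LOCK_PATH = "master_project.lock"  # hypothetical lock file beside the project

def acquire_write_access(user: str) -> bool:
    """Atomically create the lock file; fail if another user holds it."""
    try:
        fd = os.open(LOCK_PATH, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    except FileExistsError:
        return False  # someone else has write access to the master project
    os.write(fd, user.encode())
    os.close(fd)
    return True

def release_write_access() -> None:
    os.remove(LOCK_PATH)

if acquire_write_access("editor1"):
    try:
        ...  # write to the master project
    finally:
        release_write_access()
```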

This is not simple software they are writing. Adobe is putting a lot of work into making it a more fitting tool for this type of movie. Even though this film was exponentially larger than Deadpool, from the Adobe side it was a smoother process. Props to them for doing that! The cool part about pioneering this stuff is the amount of work that Adobe is on board to do. They’ll have people work on stuff that is helpful to us, so we get to participate a little in how Adobe’s software gets made.

With two large Premiere Pro projects under your belt, what sort of new features would you like to see Adobe add to the application to make it even better for feature film editors?
They’ve built the software out from a single-user application into multi-user software, but at the base level it is still single-user. Sometimes your render files get unlinked when you go back and forth between multiple users. There’s probably stuff where they have to dig deep into the code to make those minor annoyances go away. Another item I’d like to see: let’s not have to use third-party software to send change lists to the mix stage.

I know Premiere Pro integrates beautifully with After Effects, but for me, After Effects is this precise tool for executing shots. I don’t want a fine tool for compositing — I want to work in broad strokes and then have someone come back and clean it up. I would love to have a tracking tool to composite two shots together for a seamless, split screen of two combined takes — features like that.

The After Effects integration and the color correction are awesome features for a single user to execute the film, but I don’t have the time to be the guy to execute the film at that high level. I just have to keep going. I want to be able to do a fast and dirty version so I know it’s not a terrible idea, and then turn to someone else and say, “OK, make that good.” After Effects is cool, but it’s more for VFX editors or single users who are trying to make a film on their own.

After all of these action films, are you ready to do a different type of film, like a period drama?
Funny you should say that. After Deadpool I worked on The Handmaid’s Tale pilot, and it was exactly that. I was working on this beautifully acted, elegant project with tons of women characters and almost everything was done in-camera. It was a lot of parlor room drama and power dynamics. And that was wonderful to work on after all of this VFX/action stuff. Periodically it’s nice to flex a different creative muscle.

It’s not that I only work on science-fiction/VFX projects — which I love — but, in part, people start associating you with a certain genre, and then that becomes an easy thing to pursue and get work for.

Much like acting, if you want to be known for doing a lot of different things, you have to actively pursue it. It’s easy to go where momentum will take you. If you want to be the editor who can cut any genre, you have to make it a mission to pursue those projects that will keep your resume looking diverse. For a brief moment after Deadpool, I might have been able to pivot to a comedy career (laughs). That was a real hybrid, so it was challenging to thread the needle of the different tones of the film and make it feel like one piece.

Any final thoughts on the challenges of editing Terminator: Dark Fate?
The biggest challenge of the film was that, in a way, it was an ensemble piece, with the Dani character, the Grace character, the Sarah character and Arnold’s character — the T-800. All of these characters are protagonists with their own individual arcs. Finding out how to adequately service those arcs without grinding the movie to a halt, or going too long without touching base with a character, was the major challenge of the movie — plus the scale of the VFX and finessing all the action scenes. I learned a lot.


Oliver Peters is an experienced film and commercial editor/colorist. In addition, he regularly interviews editors for trade publications. He may be contacted through his website at oliverpeters.com

Final Cut ups Zoe Schack to editor

Final Cut in LA has promoted Zoe Schack to editor after three years as an assistant editor at the studio. While at Final Cut, Schack has been mentored by Final Cut editors Crispin Struthers, Joe Guest, Jeff Buchanan and Rick Russell.

Schack has edited branded content and commercials for Audi, Infiniti, Doritos and Dollar Shave Club as well as music videos for Swae Lee and Whitney Woerz. She has also worked with a number of high-profile directors, including Dougal Wilson, Ava DuVernay, Michel Gondry, Craig Gillespie and Steve Ayson.

Originally from a small town north of New York, Schack studied film at Rhode Island School of Design and NYU’s Tisch School of the Arts. Her love for documentaries led her to intern with renowned filmmaker Albert Maysles and to produce the Bicycle Film Festival in Portland, Oregon. She edited several short documentaries and a pilot series that were featured in many film festivals.

“It’s been amazing watching Zoe’s growth the last few years,” says Final Cut executive producer Suzy Ramirez. “She’s so meticulous, always doing a deep dive into the footage. Clients love working with her because she makes the process fun. She’s grown here at Final Cut so much already under the guidance of our editors, and her craft keeps evolving. I’m excited to see what’s ahead.”

A post engineer’s thoughts on Adobe MAX, new offerings

By Mike McCarthy

Last week, I had the opportunity to attend Adobe’s MAX conference at the LA Convention Center. Adobe showed me, and 15,000 of my closest friends, the newest updates to pretty much all of its Creative Cloud applications, as well as a number of interesting upcoming developments. From a post production perspective, the most significant news is the release of Premiere Pro 14 and After Effects 17 (a.k.a. the 2020 versions of those Creative Cloud apps).

The main show ran from Monday to Wednesday, with a number of pre-show seminars and activities the preceding weekend. My experience started with a screening of the new film Terminator: Dark Fate at LA Live, followed by a Q&A with the director and post team. The new Terminator was edited in Premiere Pro, with project assets shared among a large team of editors and assistants and with extensive use of After Effects, Adobe’s newly acquired Substance tools and various other parts of the Creative Cloud.

The post team extolled the improvements in shared project support and project opening times since their last Premiere endeavor on the first Deadpool movie. Visual effects editor Jon Carr shared how they used the integration between Premiere and After Effects to facilitate rapid generation of temporary “postvis” effects. This helped the editors tell the story while they were waiting on the VFX teams to finish generating the final CGI characters and renders.

MAX
The conference itself kicked off with a keynote presentation of all of Adobe’s new developments and releases. The 150-minute presentation covered all aspects of the company’s extensive line of applications. “Creativity for All” is the primary message Adobe is going for, and they focused on the tension between creativity and time. So they are trying to improve their products in ways that give their users more time to be creative.

The three prongs of that approach for this iteration of updates were:
– Faster, more powerful, more reliable — fixing time-wasting bugs, improving hardware use.
– Create anywhere, anytime, with anyone — adding functionality via the iPad, and shared Libraries for collaboration.
– Explore new frontiers — specifically in 3D, with Adobe’s Dimension, Substance and Aero

Education is also an important focus for Adobe, with 15 million copies of CC in use in education around the world. They are also creating a platform for CC users to stream their working process to viewers who want to learn from them, directly from within the applications. That will probably integrate with the new expanded Creative Cloud app released last month. They also have released integration for Office apps to access assets in CC libraries.

The first application updates they showed off were in Photoshop. They have made the new locked aspect ratio scaling a toggleable behavior, improved the warp tool and improved ways to navigate deep layer stacks by seeing which layers affect particular parts of an image. But the biggest improvement is AI-based object selection, which generates detailed masks from simple box selections or rough lassos. Illustrator now has GPU acceleration, improving performance on larger documents, and a path-simplifying tool to reduce the number of anchor points.

They released Photoshop for the iPad and announced that Illustrator will follow that path as well. Fresco is headed in the other direction and is now available on Windows. It is currently limited to Microsoft Surface products, but I look forward to trying it out on my ZBook-X2 at some point. Adobe XD has new features and, as I learned at one of the sessions later, is apparently the best way to move complex Illustrator files into After Effects.

Premiere
Premiere Pro 14 has a number of new features, the most significant being AI-driven automatic reframing, which lets you automatically convert your edited project to other aspect ratios for various deliverables. While 16×9 is obviously a standard size, certain web platforms are optimized for square or tall videos. The feature can also be used to reframe content from 2.35 to 16×9 or 4×3, which are frequent delivery requirements for the feature films I work on. My favorite aspect of this new functionality is that the user has complete control over the results.

Unlike other automated features such as Warp Stabilizer, which offers only an on/off choice for applying its results, the auto-reframe function simply generates motion-effect keyframes that can be further edited and customized by the user once the initial AI pass is complete. It also has a nesting feature for retaining existing framing choices, which creates a new single-layer source sequence. I can envision this being useful for a number of other workflow processes — such as preparing for external color grading or texturing passes.
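To make the keyframe idea concrete, here is a minimal sketch of what such a pass could produce. It is my own illustration of the concept, not Adobe's algorithm: given a tracked subject center for each frame, reframing 16×9 UHD to a square crop amounts to writing one clamped position keyframe per frame.

```python
# Illustrative sketch of auto-reframe output: pan keyframes that keep a
# tracked subject inside a square crop of a 16x9 frame. This mimics the
# idea behind Premiere's auto-reframe, not Adobe's actual algorithm.

SRC_W, SRC_H = 3840, 2160   # 16x9 UHD source
CROP_W = SRC_H              # square deliverable: 2160x2160

def reframe_keyframes(subject_x):
    """subject_x: hypothetical tracked subject center per frame (pixels).
    Returns one (frame, crop_left) keyframe per frame, clamped to the source."""
    keyframes = []
    for frame, x in enumerate(subject_x):
        left = min(max(int(x - CROP_W / 2), 0), SRC_W - CROP_W)
        keyframes.append((frame, left))
    return keyframes

# Subject drifts right; the crop window follows, then stops at the frame edge.
print(reframe_keyframes([1900, 1960, 2050, 2400, 3100]))
```

Because the output is ordinary keyframes rather than a baked result, smoothing or overriding any of them by hand is straightforward, which is exactly the kind of control described above.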

They also added better support for multi-channel audio workflows and effects, improved playback performance for many popular video formats, better HDR export options and a variety of changes to make the motion graphics tools more flexible and efficient for users who rely on them extensively. They increased the range of values available for clip playback speed and volume as well, and added support for new camera formats and derivations.

The brains behind After Effects have focused on improving playback and performance for this release and have made some significant improvements in that regard. The other big feature that actually may make a difference is content-aware fill for video. This was sneak previewed at MAX last year and first implemented in the NAB 2019 release of After Effects, but it should be greatly refined and improved in this version since it’s now twice as fast.

They also greatly improved support for OpenEXR frame sequences, especially those with multiple render-pass channels. The channels can be labeled, and a video contact sheet lets you view all the layers in thumbnail form. EXR playback performance is supposed to be greatly improved as well.
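For anyone curious what multiple render-pass channels look like on disk, here is a small sketch of my own that lists the named channels in an EXR using the third-party OpenEXR Python bindings; the file name is hypothetical.

```python
# List the named channels (render passes) stored in a multi-channel EXR.
# Requires the third-party OpenEXR package (pip install OpenEXR).
import OpenEXR

exr = OpenEXR.InputFile("shot010_comp.0001.exr")  # hypothetical file
header = exr.header()

# Beyond R/G/B/A, a VFX render often carries passes such as diffuse,
# specular, depth (Z) and motion vectors as extra named channels.
for name in sorted(header["channels"]):
    print(name)
```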

Character Animator is now at 3.0, and they have added keyframing of all editable values, trigger-able reposition “cameras” and trigger-able audio effects, among other new features. And Adobe Rush now supports publishing directly to TikTok.

Content Authenticity Initiative
Outside of individual applications, Adobe has launched the Content Authenticity Initiative in partnership with the NY Times and Twitter. It aims to fight fake news and restore consumer confidence in media. Its three main goals are trust, attribution and authenticity. The idea is to show end users who created an image, whether it was edited or altered and, if so, in what ways. Seemingly at odds with that, they also released a new mobile app that edits images upon capture, using AI-empowered “lenses” for highly stylized looks, even providing a live view.

This opening keynote was followed by a selection of over 200 labs and sessions over the next three days. I attended a couple of sessions focused on After Effects, as that is a program I know I don’t use to its full capacity. (Does anyone, really?)

Partners
A variety of other partner companies were showing off their products in the community pavilion. HP was pushing 3D printing and digital manufacturing tools that integrate with Photoshop and Illustrator. Dell has a new 27-inch color-accurate monitor with a built-in colorimeter, presumably to compete with HP’s top-end DreamColor displays. Asus also has some new HDR monitors that are Dolby Vision-compatible. One is designed to be portable and is as thin and lightweight as a laptop screen. I have always wondered why that wasn’t a standard approach for desktop displays.

Keynotes
Tuesday opened with a keynote presentation from a number of artists of different types, speaking or being interviewed. Jason Levine’s talk with M. Night Shyamalan was my favorite part, even though thrillers aren’t really my cup of tea. Later, I was able to sit down with Patrick Palmer, Adobe’s Premiere Pro product manager, to talk about where Premiere is headed and the challenges of developing HDR creation tools when there is no unified set of standards for final delivery. I am looking forward to being able to view my work in HDR while I am editing at some point in the future.

One of the highlights of MAX is the 90-minute Sneaks session on Tuesday night, where comedian John Mulaney “helped” a number of Adobe researchers demonstrate new media technologies they are working on. These will eventually improve audio quality, automate animation, analyze photographic authenticity and many other tasks once they are refined into final products at some point in the future.

This was only my second time attending MAX, and with Premiere Rush released last year, video production was a key part of that show. This year, without that factor, it was much more apparent to me that I was an engineer attending an event catering to designers. That isn’t bad, but I mention it because it is good to know what you are stepping into when deciding whether to invest in attending a particular event.

Adobe focuses MAX on artists and creatives as opposed to engineers and developers, who have other events more focused on their interests and needs. I suppose that is understandable; it isn’t branded Creative Cloud for nothing. But it is always good to connect with the people who develop the tools I use, and with the others who use them alongside me, which is a big part of what Adobe MAX is all about.


Mike McCarthy is an online editor/workflow consultant with over 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

James Norris joins Nomad in London as editor, partner

Nomad in London has added James Norris as editor and partner. A self-taught, natural editor, Norris started out running for the likes of Working Title, Partizan and Tomboy Films. He then moved to Whitehouse Post as an assistant, where he refined his craft and rose through the ranks to become an editor.

Over the past 15 years, he’s worked across commercials, music videos, features and television. Norris edited Ikea’s Fly Robot Fly spot and Asda’s Get Possessed piece, and he has recently cut a new project for Nike. In television and film, he cut an episode of the BAFTA-nominated drama Our World War and the feature film We Are Monster.

“I was attracted to Nomad for their vision for the future and their dedication to the craft of editing,” says Norris. “They have a wonderful history but are also so forward-thinking and want to create new, exciting things. The New York and LA offices have seen incredible success over the last few years, and now there’s Tokyo and London too. On top of this, Nomad feels like home already. They’re really lovely people — it really does feel like a family.”

Norris will be cutting on Avid Media Composer at Nomad.


Review: Lenovo Yoga A940 all-in-one workstation

By Brady Betzel

While more and more creators are looking for alternatives to the iMac, iMac Pro and Mac Pro, there are only a few options with high-quality built-in monitors (the Microsoft Surface Studio, HP Envy and Dell 7000 among them), and even fewer if you want touch and pen capabilities. It’s with that need in mind that I decided to review the Lenovo Yoga A940, a 27-inch UHD, pen- and touch-capable Intel Core i7 computer with an AMD Radeon RX 560 GPU.

While I haven’t done many all-in-one system reviews like the Yoga A940, I have had my eye on the Microsoft Surface Studio 2 for a long time. The only problem is its hefty price tag of around $3,500. The Lenovo’s most appealing feature — in addition to the tech specs I will go over — is its price point: it starts at about $2,200. (I saw Best Buy selling a system similar to the one I reviewed for around $2,299, and the insides of the Yoga and the Surface Studio 2 aren’t far enough apart to make up for the $1,300 disparity.)

Here are the parts inside the Lenovo Yoga A940:
– Intel Core i7-8700 processor, 3.2GHz (up to 4.6GHz with Turbo Boost), six cores (12 threads), 12MB cache
– 27-inch 4K UHD IPS multitouch display, 100% Adobe RGB
– 16GB DDR4 2666MHz (SODIMM) memory
– 1TB 5400 RPM hard drive plus 256GB PCIe SSD
– AMD Radeon RX 560 4GB graphics processor
– 25-degree monitor tilt angle
– Dolby Atmos speakers
– Dimensions: 25 x 18.3 x 9.6 inches; weight: 32.2 pounds
– 802.11ac and Bluetooth 4.2 connectivity
– Side-panel inputs: Intel Thunderbolt, USB 3.1, 3-in-1 card reader and audio jack
– Rear-panel inputs: AC-in, RJ45, HDMI and four USB 3.0
– Bluetooth active pen (appears to be the Lenovo Active Pen 2)
– Qi wireless charging platform

Digging In
Right off the bat, I happened to put my Android Galaxy phone on the odd little flat platform located on the right side of the all-in-one workstation, just under the monitor, and I saw my phone begin to charge wirelessly. Qi wireless charging is an amazing little addition to the Yoga; it really comes through in a pinch when I need my phone charged and don’t have a cable or charging dock around.

Other than that nifty feature, why would you choose a Lenovo Yoga A940 over any other all-in-one system? Well, as mentioned, the price point is very attractive, and you are also getting a near-professional-level system in a very tiny footprint — including Thunderbolt 3 and USB connections, an HDMI port, a network port and an SD card reader. While it would be incredible to have an Intel i9 processor inside the Yoga, the i7 clocks in at 3.2GHz with six cores. Not a beast, but enough to get the job done inside Adobe Premiere Pro and Blackmagic’s DaVinci Resolve, though perhaps with transcoded files instead of Red raw or the like.

The Lenovo Yoga A940 is outfitted with front-facing Dolby Atmos audio speakers as well as Dolby Vision technology in the IPS display. The audio could use a little more low end, but it is good. The monitor is surprisingly great — the whites are white and the blacks are black, something not everyone gets right. It has 100% Adobe RGB color coverage and is Pantone-validated. The HDR is technically Dolby Vision and looks great at about 350 nits (not the brightest, but it won’t burn your eyes out either). The Lenovo Bluetooth active pen works well. I use Wacom tablets and laptop tablets daily, so this pen had a lot to live up to. While I still prefer the Wacom pen, the Lenovo pen, with 4,096 levels of sensitivity, will do just fine. I actually found myself using the touchscreen with my fingers far more than the pen.

One feature that sets the A940 apart from other all-in-one machines is the USB content creation dial. In the little time I had with the system, I only used it to adjust speaker volume when playing Spotify, but in time I can see myself customizing the dial to work in Premiere and Resolve. The dial has good action and resistance; to customize it, you jump into the Lenovo Dial Customization Assistant.

Besides the Intel i7, there is an AMD Radeon RX 560 with 4GB of memory, two 3W and two 5W speakers, 32 GB of DDR4 2666 MHz memory, a 1 TB 5400 RPM hard drive for storage, and a 256GB PCIe SSD. I wish the 1TB drive was also an SSD, but obviously Lenovo has to keep that price point somehow.

Real-World Testing
I use Premiere Pro, After Effects and Resolve all the time and can gauge the horsepower of a machine through these apps. Whether editing or color correcting, the Lenovo A940 is a good middle ground — it won’t play much more than 4K Red raw footage in real time without cutting the debayer quality down to half, if not one-eighth. This system would make a good “offline” edit system, where you transcode your high-res media to a mezzanine codec like DNxHR or ProRes for editing and then conform back to your highest-resolution footage for finishing. Or, if you are in Resolve, you could use optimized media for 80% of the workflow until you color. You will really want a system with a higher-end GPU if you want to fluidly cut and color in Premiere and Resolve. That being said, you can make it work with some debayer tweaking and/or transcoding.
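For reference, that offline transcode step is often handled outside the NLE as well. Here is one possible sketch using Python to drive ffmpeg; it is my own example, not something from this review. The folder names are hypothetical, and ffmpeg cannot debayer R3D itself, so this assumes the camera originals have already been exported to a format it can read:

```python
# Sketch: batch-transcode masters to 10-bit DNxHR HQX mezzanine files.
# Assumes ffmpeg is installed and the sources are ffmpeg-readable
# (R3D would first need an export from REDCINE-X or the NLE).
import pathlib
import subprocess

SRC = pathlib.Path("masters")   # hypothetical input folder
DST = pathlib.Path("proxies")   # hypothetical output folder
DST.mkdir(exist_ok=True)

for clip in SRC.glob("*.mov"):
    out = DST / (clip.stem + "_dnxhr.mxf")
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-c:v", "dnxhd", "-profile:v", "dnxhr_hqx",  # 10-bit DNxHR HQX
        "-pix_fmt", "yuv422p10le",
        "-c:a", "pcm_s24le",
        str(out),
    ], check=True)
```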

In my testing I downloaded some footage from Red’s sample library, which you can find here. I also used some BRAW clips to test inside of Resolve, which can be downloaded here. I grabbed 4K, 6K, and 8K Red raw R3D files and the UHD-sized Blackmagic raw (BRAW) files to test with.

Adobe Premiere
Using the same Red clips as above, I created two one-minute-long UHD (3840×2160) sequences and clicked “Set to Frame Size” for all the clips. Sequence 1 contained the clips with a simple contrast, brightness and color-cast correction applied. Sequence 2 contained the same clips with the same color correction plus a 110% resize, 100 sharpen and 20 Gaussian Blur. I then exported them to various codecs via Adobe Media Encoder, using OpenCL for processing. Here are my results:

QuickTime (.mov) H.264, no audio, UHD, 23.98, Maximum Render Quality, 10Mb/s:
Color correction only: 24:07
Color correction w/ 110% resize, 100 sharpen, 20 Gaussian Blur: 26:11

DNxHR HQX 10-bit UHD:
Color correction only: 25:42
Color correction w/ 110% resize, 100 sharpen, 20 Gaussian Blur: 27:03

ProRes HQ:
Color correction only: 24:48
Color correction w/ 110% resize, 100 sharpen, 20 Gaussian Blur: 25:34

As you can see, the export times are pretty long. And let me tell you, once the sequence with the Gaussian Blur and resize kicked in, so did the fans. While it wasn’t like a jet taking off, the sound of the fans definitely made me and my wife glance at the system, and it was throwing some heat out the back. Because of the way Premiere works, it relies heavily on the CPU over the GPU. Not that it doesn’t embrace the GPU, but, as you will see later, Resolve takes more advantage of the GPUs. Either way, Premiere really taxed the Lenovo A940 when using 4K, 6K and 8K Red raw files. Real-time playback wasn’t possible except with the 4K files. I probably wouldn’t recommend this system for someone working with lots of higher-than-4K raw files; that seems to be simply too much for it to handle. But if you transcode the files down to ProRes, you will be in business.

Blackmagic DaVinci Resolve 16 Studio
Resolve seemed to take better advantage of the AMD Radeon RX 560 GPU in combination with the CPU, as well as the onboard Intel GPU. In this test I added Resolve’s amazing built-in spatial noise reduction, so other than sharing the Red R3D footage, this test and the Premiere test weren’t exactly comparing apples to apples. Overall, the export times will be significantly higher (or, in theory, they should be). I also added some BRAW footage to test for fun, and that footage was far easier to work and color with. Both sequences were UHD (3840×2160) 23.98. I will definitely be looking into working with more BRAW footage. Here are my results:

Playback: 4K realtime playback at half-res premium quality; no realtime playback at 6K or 8K

H.264, no audio, UHD, 23.98fps, force sizing and debayering to highest quality. Export 1 used Resolve’s native renderer, Export 2 the AMD renderer and Export 3 Intel QuickSync.

Color only:
Export 1: 3:46
Export 2: 4:35
Export 3: 4:01

Color, 110% resize, spatial NR (Enhanced, Medium, 25), sharpening, Gaussian Blur:
Export 1: 36:51
Export 2: 37:21
Export 3: 37:13

BRAW 4K (4608×2592) Playback and Export Tests

Playback: Full-res would play at about 22fps; half-res plays at realtime

H.264, no audio, UHD, 23.98fps, force sizing and debayering to highest quality.

Color only:
Export 1: 1:26
Export 2: 1:31
Export 3: 1:29

Color, 110% resize, spatial NR (Enhanced, Medium, 25), sharpening, Gaussian Blur:
Export 1: 36:30
Export 2: 36:24
Export 3: 36:22

DNxHR 10-bit:
Color Correction Only: 3:42
Color, 110% Resize, Spatial NR: Enhanced, Medium, 25; Sharpening, Gaussian Blur: 39:03

One takeaway from the Resolve exports is that the color-only passes were much more efficient than in Premiere, taking roughly four to four-and-a-half times realtime for the intensive Red R3D files and about one-and-a-half times realtime for BRAW.
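Those multiples are easy to sanity-check, since both test timelines were one minute long: divide the export time by the timeline duration. A quick helper of my own, using the timings above:

```python
# Convert "MM:SS" export times for a 60-second timeline into realtime multiples.
def times_realtime(export_time: str, timeline_seconds: int = 60) -> float:
    minutes, seconds = export_time.split(":")
    return (int(minutes) * 60 + int(seconds)) / timeline_seconds

for label, t in [("Resolve R3D color only (native)", "3:46"),
                 ("Resolve R3D color only (AMD)", "4:35"),
                 ("Resolve BRAW color only", "1:26"),
                 ("Premiere R3D color only", "24:07")]:
    print(f"{label}: {times_realtime(t):.1f}x realtime")
# -> roughly 3.8x, 4.6x, 1.4x and 24.1x realtime, respectively
```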

Summing Up
In the end, the Lenovo A940 is a sleek-looking, touchscreen- and pen-compatible all-in-one. While it isn’t jam-packed with the latest high-end AMD GPUs or Intel i9 processors, the A940 is a mid-level system with an incredibly good-looking IPS Dolby Vision monitor and Dolby Atmos speakers. It has some other features — like an IR camera, the Qi wireless charger and the USB dial — that you might not necessarily be looking for but will love to find.

The power adapter is like a large laptop power brick, so you will need somewhere to stash it, but the monitor has a really nice 25-degree tilt that is comfortable whether you are using just the touchscreen or pen or the wireless keyboard and mouse.

Because the Lenovo A940 starts at around $2,299, it really deserves a look when you are searching for a new system. If you are working primarily in HD video and/or graphics, this is the all-in-one system for you. Check out more at Lenovo’s website.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

The editors of Ad Astra: John Axelrad and Lee Haugen

By Amy Leland

The new Brad Pitt film Ad Astra follows astronaut Roy McBride (Pitt) as he journeys deep into space in search of his father, astronaut Clifford McBride (Tommy Lee Jones). The elder McBride disappeared years before, and his experiments in space might now be endangering all life on Earth. Much of the film features Pitt’s character alone in space with his thoughts, creating a happy challenge for the film’s editing team, who have a long history of collaboration with each other and the film’s director James Gray.

L-R: Lee Haugen and John Axelrad

Co-editors John Axelrad, ACE, and Lee Haugen share credits on three previous films — Haugen served as Axelrad’s apprentice editor on Two Lovers, and the two co-edited The Lost City of Z and Papillon. Ad Astra’s director, James Gray, was also at the helm of Two Lovers and The Lost City of Z. A lot can be said for long-time collaborations.

When I had the opportunity to speak with Axelrad and Haugen, I was eager to find out more about how this shared history influenced their editing process and the creation of this fascinating story.

What led you both to film editing?
John Axelrad: I went to film school at USC and graduated in 1990. Like everyone else, I wanted to be a director; everyone who goes to film school wants that. I then focused on studying cinematography, but several years into film school I realized I don’t like being on the set.

Not long ago, I spoke to Fred Raskin about editing Once Upon a Time… in Hollywood. He originally thought he was going to be a director, but then he figured out he could tell stories in an air-conditioned room.
Axelrad: That’s exactly it. Air conditioning plays a big role in my life; I can tell you that much. I get a lot of enjoyment out of putting a movie together and of being in my own head creatively and really working with the elements that make the magic. In some ways, there are a lot of parallels with the writer when you’re an editor; the difference is I’m not dealing with a blank page and words — I’m dealing with images, sound and music, and how it all comes together. A lot of people say the first draft is the script, the second draft is the shoot, and the third draft is the edit.

L-R: John and Lee at the Papillon premiere.

I started off as an assistant editor, working for some top editors for about 10 years in the ’90s, including Anne V. Coates. I was an assistant on Out of Sight when Anne Coates was nominated for the Oscar. Those 10 years of experience really prepped me for dealing with what it’s like to be the lead editor in charge of a department — dealing with the politics, the personalities and the creative content and learning how to solve problems. I started cutting on my own in the late ‘90s, and in the early 2000s, I started editing feature films.

When did you meet your frequent collaborator James Gray?
Axelrad: I had done a few horror features, and then I hooked up with James on We Own the Night, and that went very well. Then we did Two Lovers after that. That’s where Lee Haugen came in — and I’ll let him tell his side of the story — but suffice it to say that I’ve done five films with James Gray, and Lee Haugen rose up through the ranks and became my co-editor on The Lost City of Z. Then we edited the movie Papillon together, so it was just natural that we would do Ad Astra together as a team.

What about you, Lee? How did you wind your way to where we are now?
Lee Haugen: Growing up in Wisconsin, any time I had a school project, like writing a story or an article, I would turn it into a short video or short film instead. Back then I had to shoot on VHS and edit tape to tape by pushing play, hitting record and timing it. It took forever, but that was when I really found out that I loved editing.

So I went to school with a focus on becoming an editor. After graduating from Wisconsin, I moved to California and found my way into reality television. That was the mid-2000s, the boom of reality television; there were a lot of jobs that let me log the hours needed to become a member of the Editors Guild and gain more experience on Avid Media Composer.

After about a year of that, I realized working the night shift as an assistant editor on reality television shows was not my real passion. I really wanted to move toward features. I was listening to a podcast by Patrick Don Vito (editor of Green Book, among other things), and he mentioned John Axelrad. I met John on an interview for We Own the Night when I first moved out here, but I didn’t get the job. But a year or two later, I called him, and he said, “You know what? We’re starting another James Gray movie next week. Why don’t you come in for an interview?” I started working with John the day I came in. I could not have been more fortunate to find this group of people that gave me my first experience in feature films.

Then I had the opportunity to work on a lower-budget feature called Dope, and that was my first feature editing job by myself. The success of the film at Sundance really helped launch my career. Then things came back around. John was finishing up Krampus, and he needed somebody to go out to Northern Ireland to edit the assembly of The Lost City of Z with James Gray. So, it worked out perfectly, and from there, we’ve been collaborating.

Axelrad: Ad Astra is my third time co-editing with Lee, and I find our working as a team to be a naturally fluid and creative process. It’s a collaboration entailing many months of sharing perspectives, ideas and insights on how best to approach the material, and one that ultimately benefits the final edit. Lee wouldn’t be where he is if he weren’t a talent in his own right. He proved himself, and here we are together.

How has your collaborative process changed and grown from when you were first working together (John, Lee and James) to now, on Ad Astra?
Axelrad: This is my fifth film with James. He’s a marvelous filmmaker, and one of the reasons he’s so good is that he really understands the subtlety and power of editing. He’s very neoclassical in his approach, and he challenges the viewer, since we’re all accustomed to faster cutting and faster pacing. With James, it’s a much more methodical approach. James is very performance-driven. It’s all about the character, the narrative and the story, and we really understand his instincts. You also need to develop a shorthand with him and truly understand what the director wants.

Working with Lee, it was just a natural process to have the two of us cutting. I would work on a scene, and then I could say, “Hey Lee, why don’t you take a stab at it?” Or vice versa. When James was in the editing room working with us, he would often work intensely with one of us and then switch rooms and work with the other. I think we each really touched almost everything in the film.

Haugen: I agree with John. Our way of working is very collaborative — that includes John and me, but also our assistant editors and additional editors. It’s a process that we feel benefits the film as a whole; when we have different perspectives, it can help us explore options that raise the film to another level. And when James comes in, he’s extremely meticulous. As John said, he and I both touched every single scene, and I think we’ve even touched every frame of the film.

Axelrad: To add to what Lee said about involving our whole editing team: I love mentoring, and I love having my crew feel very involved, not just technically but creatively. We worked with a terrific guy, Scott Morris, who was our first assistant editor. Ultimately, he got bumped up during the course of the film and received an additional editor credit on Ad Astra.

We involve everyone, even down to the post assistant. We want to hear their ideas and make them feel like a welcome part of a collaborative environment. They obviously have to focus on their primary tasks, but I think it just makes for a much happier editing room when everyone feels part of a team.

How did you manage an edit that was so collaborative? Did you have screenings of dailies or screenings of cuts?
Axelrad: During dailies it was just James, and we would send edits for him to look at. But James doesn’t really start until he’s in the room. He really wants to explore every frame of film and try all the infinite combinations, especially when you’re dealing with drama, nuance, subtlety and subtext. Those are the scenes that take the longest. When I put together the lunar rover chase, it was in some ways easier than the intense drama scenes in the film.

Haugen: As the dailies came in, John and I would each take a scene and do a first cut. And then, once we had something to present, we would call everybody in to watch the scene. We would get everybody’s feedback and see what was working, what wasn’t working. If there were any problems that we could address before moving to the next scene, we would. We liked to get the outside point of view, because once you get further and deeper into the process of editing a film, you do start to lose perspective. To be able to bring somebody else in to watch a scene and to give you feedback is extremely helpful.

One thing that John established with me on Two Lovers — my first editing job on a feature — was allowing me to sit in the room during the editing. After my work was done, I was welcome to sit in the back of the room and just observe the interaction between John and James. We continued that process on this film, to give our assistants that same experience and the chance to observe how an edit room works. That helped me become an editor.

John, you talked about how the action scenes are often easier to cut than the dramatic scenes. It seems like that would be even more true with Ad Astra, because so much of this film is about isolation. How does that complicate the process of structuring a scene when it’s so much about a person alone with his own thoughts?
Axelrad: That was the biggest challenge, but one we were prepared for. To James’ credit, he’s not precious about his written words; he’s not precious about the script. Some directors might say, “Oh no, we need to mold it to fit the script,” but he allows the actors to work within a space. The script is a guide for them, and they bring so much to it that it changes the story. That’s why I always say that we serve the ego of the movie. The movie, in a way, informs us what it wants to be and what it needs to be. In this case, Brad gave us such amazingly nuanced performances. I believe you can sometimes shape the best performance around what is not said, through the more nuanced cues of facial expressions and gestures.

So, as an editor, when you can craft something that transcends what is written and what is photographed and achieve a compelling synergy of sound, music and performance — to create heightened emotions in a film — that’s what we’re aiming for. In the case of his isolation, we discovered early on that having voiceover and really getting more interior was important. That wasn’t initially part of the cut, but James had written voiceover, and we began to incorporate that, and it really helped make this film into more of an existential journey.

The further he goes out into space, the deeper we go into his soul, and it’s really a dive into the subconscious. That sequence where he dives underwater in the cooling liquid of the rocket, he emerges and climbs up the rocket, and it’s almost like a dream. Like how in our dreams we have superhuman strength as a way to conquer our demons and our fears. The intent really was to make the film very hypnotic. Some people get it and appreciate it.

As an editor, sound often determines the rhythm of the edit, but one of the things that was fascinating with this film is how deafeningly quiet space is. How do you work with the material when it’s mostly silent?
Haugen: Early on, James established that he wanted to make the film as realistic as possible. Sound, or lack of sound, is a huge part of space travel. So the hard part is when you have, for example, the lunar rover chase on the moon, and you play it completely silent; it’s disarming and different and eerie, which was very interesting at first.

But then we started to explore how we could make the sound more realistic, or find a way to amplify the action beats through sound. One way was through Roy’s suit: when things hit him or vibrated off of it, he could feel the impacts and hear the vibrations.

Axelrad: It was very much part of our rhythm, of how we cut it together, because we knew James wanted to be as realistic as possible. We did what we could with the soundscapes that were allowable for a big studio film like this. And, as Lee mentioned, playing it from Roy’s perspective — being in the space suit with him. It was really just to get into his head and hear things how he would hear things.

Thanks to Max Richter’s beautiful score, we were able to hone the rhythms to induce a transcendental state. We had Gary Rydstrom and Tom Johnson mix the movie for us at Skywalker, and they were the ultimate creators of the balance of the rhythms of the sounds.

Did you work with music in the cut?
Axelrad: James loves to temp with classical music. In previous films, we used a lot of Puccini. In this film, there was a lot of Wagner. But Max Richter came in fairly early in the process and developed such beautiful themes, and we began to incorporate his themes. That really set the mood.

When you’re working with your composer and sound designer, you feed off each other. So things that they would do would inspire us, and we would change the edits. I always tell the composers when I work with them, “Hey, if you come up with something, and you think musically it’s very powerful, let me know, and I am more than willing to pitch changing the edit to accommodate.” Max’s music editor, Katrina Schiller, worked in-house with us and was hugely helpful, since Max worked out of London.

We tend not to cut with music at first, because you don’t want music acting as a Band-Aid to cover up a problem in the edit. But once we feel the picture is working and the rhythm is going, sometimes the music will just fit perfectly, even as temp music. And if the rhythms match up to what we’re doing, then we know that we’ve done it right.

What is next for the two of you?
Axelrad: I’m working on a lower-budget movie right now, a Lionsgate feature film. The title is under wraps, but it stars Janelle Monáe, and it’s kind of a socio-political thriller.

What about you Lee?
Haugen: I jumped onto another film as well. It’s an independent film starring Zoe Saldana. It’s called Keyhole Garden, and it’s this very intimate drama that takes place on the border between Mexico and America. So it’s a very timely story to tell.


Amy Leland is a film director and editor. Her short film, Echoes, is now available on Amazon Video. She also has a feature documentary in post, a feature screenplay in development and a new doc in pre-production. She is an editor for CBS Sports Network and recently edited the feature “Sundown.” You can follow Amy on Twitter at @amy-leland and Instagram at @la_directora.

Updated Apple Final Cut Pro features new Metal engine

Apple has updated Final Cut Pro X with a new Metal engine designed to provide performance gains across a wide range of Mac systems. It takes advantage of the new Mac Pro and the high-resolution, high-dynamic-range viewing experience of the Apple Pro Display XDR. The company has also optimized Motion and Compressor with Metal.

The Metal-based engine improves playback and accelerates graphics tasks in FCP X, including rendering, realtime effects and exporting on compatible Mac computers. According to Apple, video editors with a 15-inch MacBook Pro will benefit from performance that’s up to 20 percent faster, while editors using an iMac Pro will see gains up to 35 percent.

Final Cut Pro also works with the new Sidecar feature of macOS Catalina, which allows users to extend their Mac workspace by using an iPad as a second display to show the browser or viewer. Video editors can use Sidecar with a cable or they can connect wirelessly.

Final Cut Pro will now support multiple GPUs and up to 28 CPU cores. This means that rendering is up to 2.9 times faster and transcoding is up to 3.2 times faster than on the previous-generation 12-core Mac Pro. And Final Cut Pro uses the new Afterburner card when working with ProRes and ProRes RAW. This allows editors to simultaneously play up to 16 streams of 4K ProRes 422 video or work in 8K resolution with support for up to three streams of 8K ProRes RAW video.
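
To put those stream counts in perspective, here is a back-of-the-envelope bandwidth estimate (my own arithmetic, using an approximate per-stream rate from Apple’s published ProRes data-rate tables, not a figure from this announcement):

```python
# Rough aggregate-bandwidth estimate for simultaneous ProRes playback.
# The per-stream rate is an approximation (Apple's published tables put
# ProRes 422 at roughly 590 Mb/s for 3840x2160 at 29.97fps); real rates
# vary with frame rate and content.
PRORES_422_UHD_MBPS = 590  # megabits per second, per stream
streams = 16

total_mbps = PRORES_422_UHD_MBPS * streams
print(f"{streams} streams = {total_mbps / 1000:.1f} Gb/s "
      f"({total_mbps / 8 / 1000:.2f} GB/s)")
# 16 streams = 9.4 Gb/s (1.18 GB/s), which is why fast storage and
# dedicated decode hardware matter at these stream counts.
```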

Pro Display XDR
The Pro Display XDR features a 32-inch Retina 6K display, P3 wide color and extreme dynamic range. Final Cut Pro users can view, edit, grade and deliver HDR video with 1,000 nits of full screen sustained brightness, 1,600 nits peak brightness and a 1,000,000:1 contrast ratio. Pro Display XDR connects to the Mac through a single Thunderbolt cable, and pros using Final Cut Pro on Mac Pro can simultaneously use up to three Pro Display XDR units — two for the Final Cut Pro interface and one as a dedicated professional reference monitor.

Final Cut Pro 10.4.7 is available now as a free update for existing users and for $299.99 for new users on the Mac App Store. Motion 5.4.4 and Compressor 4.4.5 are also available today as free updates for existing users and for $49.99 each for new users on the Mac App Store.

Uppercut ups Tyler Horton to editor

After spending two years as an assistant at New York-based editorial house Uppercut, Tyler Horton has been promoted to editor. This is the first internal talent promotion for Uppercut.

Horton first joined Uppercut in 2017 after a stint as an assistant editor at Whitehouse Post. Since stepping up to editor, he has cut notable projects, such as the recent Nike campaign “Letters to Heroes,” a series launched in conjunction with the US Open that highlights young athletes meeting their role models, including Serena Williams and Naomi Osaka. He has also cut campaigns for brands such as Asics, Hypebeast, Volvo and MoMA.

“From the beginning, Uppercut was always intentionally a boutique studio that embraced a collective of visions and styles — never just a one-person shop,” says Uppercut EP Julia Williams. “Tyler took initiative from day one to be as hands-on as possible with every project, and we’ve been proud to see him really grow and refine his own voice.”

Horton’s love of film was sparked by watching sports reels and highlight videos. He went on to study film editing, then hit the road to tour with his band for four years before returning to his passion for film.

Wildlife DP Steve Lumpkin on the road and looking for speed

For more than a decade, Steve Lumpkin has been traveling to the Republic of Botswana to capture and celebrate the country’s diverse and protected wildlife population. As a cinematographer and still photographer, Under Prairies Skies Photography‘s Lumpkin will spend a total of 65 days this year filming in the bush for his current project, Endless Treasures of Botswana.

Steve Lumpkin

It’s a labor of love that comes through in his stunning photographs, whether they depict a proud and healthy lioness washed with early-morning sunlight, an indolent leopard draped over a tree branch or a herd of elephants traversing a brilliant green meadow. The big cats hold a special place in Lumpkin’s heart, and documenting Botswana’s largest pride of lions is central to the project’s mission.

“Our team stands witness to the greatest conservation of the natural world on the planet. Botswana has the will and the courage to protect all things wild,” he explains. “I wanted to fund a not-for-profit effort to create both still images and films that would showcase The Republic of Botswana’s success in protecting these vulnerable species. In return, the government granted me a two-year filming permit to bring back emotional, true tales from the bush.”

Lumpkin recently graduated to shooting 4K video in the bush in Apple ProRes RAW, using a Sony FS5 camera and an Atomos Inferno recorder. He brings the raw footage back to his US studio for post, working in Apple Final Cut Pro on an iMac 5K and employing a variety of tools, including Color Grading Central and Neat Video.

Leopard

Until recently, Lumpkin was hitting a performance snag when transferring files from his QNAP TBS 882T NAS storage system to his iMac Pro. “I was only getting read speeds of about 100MB/sec over Thunderbolt, so editing 4K footage was painful,” he says. “At the time, I was transitioning to ProRes RAW, and I knew I needed a big performance kick.”

On the recommendation of Bob Zelin, video engineering consultant and owner of Rescue 1, Lumpkin installed Sonnet’s Solo10G Thunderbolt 3 adapter. The Solo10G uses the 10GbE standard to connect computers via Ethernet cables to high-speed infrastructure and storage systems. “Instantly, I jumped to a transfer rate of more than 880MB per second, a nearly tenfold throughput increase,” he says. “The system just screams now – the Solo10G has accelerated every piece of my workflow, from ingest to 4K editing to rendering and output.”

“So many colleagues I know are struggling with this exact problem — they need to work with huge files and they’ve got these big storage arrays, but their Thunderbolt 2 or 3 connections alone just aren’t cutting it.”
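
A quick sanity check on those numbers (my arithmetic, not Sonnet’s or Lumpkin’s): 10GbE has a raw line rate of 10Gb/s, or 1,250MB/s before protocol overhead, so a measured 880MB/s is roughly 70 percent of the theoretical maximum, a plausible real-world figure for NAS traffic:

```python
# Sanity check on the quoted transfer rates.
line_rate_gbps = 10                           # 10GbE raw line rate
theoretical_mb_s = line_rate_gbps * 1000 / 8  # 1250 MB/s before overhead
measured_mb_s = 880                           # post-upgrade figure quoted above
old_mb_s = 100                                # pre-upgrade figure quoted above

print(f"Theoretical max: {theoretical_mb_s:.0f} MB/s")
print(f"Link efficiency: {measured_mb_s / theoretical_mb_s:.0%}")  # ~70%
print(f"Speedup: {measured_mb_s / old_mb_s:.1f}x")                 # 8.8x
```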

With Lumpkin, everything comes down to the wildlife. He appreciates any tools that help streamline his ability to tell the story of the country and its tremendous success in protecting threatened species. “The work we’re doing on behalf of Botswana is really what it’s all about — in 10 or 15 years, that country might be the only place on the planet where some of these animals still exist.

“Botswana has the largest herd of elephants in Africa and the largest group of wild dogs, of which there are only about 6,000 left,” says Lumpkin. “Products like Sonnet’s Solo10G, Final Cut, the Sony FS5 camera and Atomos Inferno, among others, help our team celebrate Botswana’s recognition as the conservation leader of Africa.”

IBC 2019 in Amsterdam: Big heads in the cloud

By David Cox

IBC 2019 kicked off with an intriguing announcement from Avid. The company entered into a strategic alliance with Microsoft and Disney’s Studio Lab to enable remote editorial workflows in the cloud.

The interesting part for me is how this affects the perception of post producing in the cloud, rather than the actual technology of it. It has been technically possible to edit remotely in the cloud for some time — either by navigating the Wild West interfaces of the principal cloud providers to “spin up” a remote computer, connect some storage and content, and then run an edit app, or by using a product that takes care of all that, such as Blackbird. No doubt, the collaboration with Disney will produce products and services within an ecosystem that makes the technical use of the cloud invisible.
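
As a rough illustration of that “Wild West” route, the sketch below spins up a cloud GPU machine with AWS’s boto3 SDK. This is a generic sketch, not the workflow of Avid, Microsoft or Blackbird; the AMI ID is a placeholder:

```python
import boto3

# Minimal sketch: launch a GPU instance to serve as a remote edit machine.
# Assumes AWS credentials are already configured. The AMI ID below is a
# placeholder standing in for an image with an edit app preinstalled; a
# real setup would also attach storage and remote-display software.
ec2 = boto3.client("ec2", region_name="us-west-2")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI
    InstanceType="g4dn.xlarge",       # an entry-level GPU instance type
    MinCount=1,
    MaxCount=1,
)
print("Remote edit machine:", response["Instances"][0]["InstanceId"])
```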

Avid press conference

However, what interests me is that arguably, the perception of post producing in the cloud is instantly changed. The greatest fear of post providers relates to the security of their clients’ intellectual property. Should a leak ever occur, to retain the client (or indeed avoid a catastrophic lawsuit), the post facility would have to make a convincing argument that security protocols were appropriate. Prior to the Disney/Avid/Microsoft Azure announcement, the part of that argument where the post houses say “…then we sent your valuable intellectual property to the cloud” caused a sticky moment. However, following this announcement, there has been an inherent endorsement by the owner of one of the most valuable IP catalogs (Disney) that post producing in the cloud is safe — or at least will be.

Cloudy Horizons
At the press conference where Avid made its Disney announcement, I asked whether the proposed cloud service would be a closed, Avid-only environment or an open platform that includes other vendors. I pointed out that many post producers also use non-Avid products for various aspects of post, from color grading to visual effects. Despite my impertinence in mentioning competitors (even though Avid had kindly provided lunch), CEO Jeff Rosica provided a well-reasoned and practical response. To paraphrase: while he did not explicitly say the proposed ecosystem would be closed, he suggested that, from a commercial viewpoint, other vendors would more likely want to make their own cloud offerings.

Rosica’s comments suggest that post houses can expect many clouds on their horizons from various application developers. The issue will then be how these connect to make coherent and streamlined workflows. This is not a new puzzle for post people to solve — we have been trying to make local systems from different manufacturers talk to each other for years, with varying degrees of success. Making manufacturers’ various clouds work together would be an extension of that endeavor. Hopefully, manufacturers will use their own migrations to the cloud to further open their systems, rather than see it as an opportunity to play defense, locking down bespoke file systems and making cross-platform collaboration unnecessarily awkward. Too optimistic, perhaps!

Or One Big Cloud?
Separately, just prior to IBC, MovieLabs introduced a white paper discussing a direction of travel for movie production toward the year 2030. IBC hosted a MovieLabs panel on the Sunday of the show, moderated by postPerspective’s own Randi Altman and featuring tech chiefs from the major studios. It would be foolish not to pay the paper proper consideration, given that it’s backed by Disney, Sony, Paramount, Warner Bros. and Universal.

MovieLabs panel

To summarize, the proposition is that the digital assets that will be manipulated to make content stay in one centralized cloud. Apps that manipulate those assets, such as editorial and visual effects apps, delivery processes and so on, will operate in the same cloud space. The talent that drives those apps will do so via the cloud. Or to put it slightly differently, the content assets don’t move — rather, the production apps and talent move to the assets. Currently, we do the opposite: the assets are transferred to where the post services are provided.

There are many advantages to this idea. Multiple transfers of digital assets to many post facilities would end. Files would be secured on a policy basis, enabling only the relevant operators to have access for the appropriate duration. Centralized content libraries would be produced, helping to enable on-the-fly localization, instant distribution and multi-use derivatives, such as marketing materials and games.
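
That “policy basis” idea is easy to sketch in code. The snippet below is my own minimal illustration of time-bound, per-asset access grants, not anything specified in the MovieLabs paper:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class AccessGrant:
    operator: str         # who may touch the asset
    asset_id: str         # which asset the grant covers
    not_before: datetime  # start of the access window
    not_after: datetime   # end of the access window

def may_access(grants, operator, asset_id, now=None):
    """True if the operator holds a currently valid grant for the asset."""
    now = now or datetime.utcnow()
    return any(
        g.operator == operator
        and g.asset_id == asset_id
        and g.not_before <= now <= g.not_after
        for g in grants
    )

# A colorist is granted one asset for the duration of the grade only.
grants = [AccessGrant("colorist_a", "reel_3_ocf",
                      datetime(2019, 9, 1), datetime(2019, 10, 1))]
print(may_access(grants, "colorist_a", "reel_3_ocf", datetime(2019, 9, 15)))  # True
print(may_access(grants, "colorist_a", "reel_3_ocf", datetime(2019, 11, 1)))  # False
```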

Of course, there are many questions. How do the various post application manufacturers maintain their product values if they all work as in-cloud applications on someone else’s hardware? What happens to traditional post production facilities if they don’t need any equipment and their artists log in from wherever? How would a facility protect itself from payment disputes if it does not have control over the assets it produces?

Personally, I have moved on from the idea of brick-and-mortar facilities. Cloud post permits nearly unlimited resources and access to a global pool of talent, not just those who reside within a commutable distance from the office. I say, bring it on… within reason. Of course, this initiative relates only to the production of content for those key studios. There’s a whole world of content production beyond that scope.

Blackmagic

Knowing Your Customer
Another area of interest for me at IBC 2019 was how offerings to colorists have become quite polarized. On one hand there is the seemingly all-conquering Resolve from Blackmagic Design. Inexpensive, easy to access and ubiquitous. On the other hand there is Baselight from FilmLight — a premium brand with a price tag and associated entry barrier to match. The fact that these two products are both successful in the same market but with very different strategies is testament to a fundamental business rule: “Know your customer.” If you know who your customer is going to be, you can design and communicate the ideal product for them and sell it at the right price.

A chat with FilmLight’s joint founder, Wolfgang Lempp, and development director Martin Tlaskal was very informative. Lempp explained that the demand placed on FilmLight’s customers is similarly polarized. On one hand, clients — including major studios and Netflix — mandate fastidious adherence to advanced and ever-improving technical standards, as well as image pipelines that are certified at every step. On the other hand, different clients place deadline or budget as the prevalent concern. Tlaskal explained that FilmLight sets out to support the color specialists who aim for top-of-the-industry excellence. Having that template for the target customer defines and drives the features FilmLight will develop for its Baselight product.

FilmLight

At IBC 2019, FilmLight hosted guest speaker-led demonstrations (“Colour on Stage”) to inspire creative grading and to present its latest features and improvements, including better hue-angle keying, tracking and handling of lens distortions.

Blackmagic is no less focused on knowing its customer, which explains its success in recent years. DaVinci Resolve once shared the “premium” space occupied by FilmLight but went through a transition to aim itself squarely at a democratized post production landscape. This shift meant a recognition that there would be millions of content producers and thousands of small post houses rather than a handful of large post facilities. That transition required a great deal more than merely slashing the price. The software product would have to work on myriad hardware combinations, not just the turnkey approved setup, and would need to have features and documentation aimed at those who hadn’t spent the past three years training in a post facility. By knowing exactly who the customer would be, Blackmagic built Resolve into an extremely successful, cross-discipline, post production powerhouse. Blackmagic was demonstrating the latest Resolve at IBC 2019, although all new features had been previously announced because, as director of software engineering Rohit Gupta explained, Blackmagic does not time its feature releases to IBC.

SGO

Aiming between the extremes established by FilmLight and Blackmagic Design, SGO promoted a new positioning of its flagship product, Mistika, via the Boutique subproduct. This is essentially a software-only Mistika that runs on PC or Mac. Subscription prices range from 99 euros to 299 euros per month, depending on features, although there have been several discounted promotions. The more expensive options include SGO’s highly regarded stereo 3D tools and camera-stitching features for producing wrap-around movies.

Another IBC — done!


David Cox is a VFX compositor and colorist with more than 20 years of experience. He started his career with MPC and The Mill before forming his own London-based post facility. Cox specializes in unusual projects, such as those using very high resolutions and interactive immersive experiences featuring realtime render engines and augmented reality.

Behind the Title: Chapeau CD Lauren Mayer-Beug

This creative director loves the ideation process at the start of a project when anything is possible, and saving some of those ideas for future use.

COMPANY: LA’s Chapeau Studios

CAN YOU DESCRIBE YOUR COMPANY?
Chapeau fluidly provides visual effects, editorial, design, photography and story development, with experience in web development and software and app engineering.

WHAT’S YOUR JOB TITLE?
Creative Director

WHAT DOES THAT ENTAIL?
It often entails seeing a job through from start to finish. I look at it like making a painting or a sculpture.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
Perhaps just how hands-on the process actually is. And how analog I am, considering we work in such a tech-driven environment.

Beats

WHAT’S YOUR FAVORITE PART OF THE JOB?
Thinking. I’m always thinking, big picture to small details. I love the ideation process at the start of a project, when anything is possible, and saving some of those ideas for future use. You learn about what you want to do through that process; I always learn more about myself through every ideation session.

WHAT’S YOUR LEAST FAVORITE?
Letting go of the details that didn’t get addressed. Not everything is going to be perfect, and since it’s a learning process, there is inevitably something that will catch your eye.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
My mind goes to so many buckets. A published children’s book author with a kick-ass coffee shop. A coffee bean buyer so I could travel the world.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I always skewed in this direction. I’ve always had the mindset of an idea coaxer and gatherer. I was put in that position in my mid-20s and realized I liked it (with lots to learn, of course), and I’ve run with it ever since.

IS THERE A PROJECT YOU ARE MOST PROUD OF?
That’s hard to say. Every project is really so different. A lot of what I’m most proud of is behind the scenes… the process that will go into what I see as bigger things. With Chapeau, I will always love the Facebook projects, all the pieces that came together — both on the engineering side and the fun creative elements.

Facebook

What I’m most excited about is our future stuff. There’s a ton on the sticky board that we aim to accomplish in the very near future. Thinking about how much is actually being set in motion is mind-blowing, humbling and — dare I say — makes me outright giddy. That is why I’m here, to tell these new stories — stories that take part in forming the new landscape of narrative.

WHAT TOOLS DO YOU USE DAY TO DAY?
Anything Adobe. My most effective tool, though, is good old pen and paper. Nothing conveys ideas and works out the knots more clearly.

WHERE DO YOU FIND INSPIRATION?
I’m always looking for inspiration and find it everywhere, as many other creatives do. However, nature is where I’ve always found my greatest inspiration. I’m constantly taking photos of interesting moments to save for later. Oftentimes I will refer back to those moments in my work. When I need a reset I hike, run or bike. Movement helps.

I’m always going outside to look at how the light interacts with the environment. Something I’ve become known for at work is going out of my way to see a sunset (or sunrise). They know me as the first one on the roof for a particularly enchanting magic hour. I’m always staring at the clouds — the subtle color combinations, and how colors look the way they do only because of their context. All that said, I often have my nose in a graphic design book.

The overall mood realized from gathering and creating the ever-popular Pinterest board is so helpful. Seeing the mood, color-wise and texturally, never gets old. Suddenly, you have a fully formed example of where your mind is at: something you could never have talked your way through.

Then, of course, there are people. People/peers and what they are capable of will always amaze me.

Avid Media Composer now supports ProRes RAW and DNx codecs

Avid has added native support in Media Composer for Apple’s ProRes RAW camera codec and support for ProRes playback and encoding on Windows. In addition, Apple will provide 64-bit decoders for DNxHR and DNxHD codecs within the Pro Video Formats package that is available from Apple as a free download for all users. These integrations will allow content creators and post companies to natively create high-quality ProRes content regardless of their operating system and save time during the creative storytelling process.

ProRes is a high-performance editing codec that provides multistream, high-quality images and low complexity for premium realtime editing. The codec, which will be available to Media Composer users on Windows, supports frame sizes ranging from SD and HD to 2K, 4K and beyond at full resolution with image-quality preservation and reduced storage rates.

In addition, Media Composer for macOS and Windows, which was completely redesigned for 2019, will also add native support for ProRes RAW. ProRes RAW applies ProRes compression to the raw data from a camera sensor to provide the flexibility of raw video with the performance of ProRes for editing today’s highest-resolution outputs.

Finally, Avid says the continued availability of Avid’s DNxHD and DNxHR decoders for macOS is a benefit to content creators using Apple and Avid products and will ensure the longevity of content creators’ DNx material encoded in MXF and QuickTime files.


Adobe adds Sensei-powered Auto Reframe to Premiere

At IBC 2019, Adobe introduced a new reframing/reformatting feature for Premiere Pro called Auto Reframe. Powered by Adobe Sensei, the company’s AI/machine learning framework, Auto Reframe intelligently reframes and reformats video content for different aspect ratios, from square to vertical to cinematic 16:9 versions. Like the recently introduced Content-Aware Fill for After Effects, Auto Reframe uses AI and machine learning to accelerate manual production tasks without sacrificing creative control.

For anyone who needs to optimize content for different platforms, Auto Reframe will save valuable hours by automating the tedious task of manually reframing content every time a different video platform comes into play. It can be applied as an effect to individual clips or to whole sequences.
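
Conceptually, this kind of reframing is a tracked crop: find a point of interest in each frame, then slide a window of the target aspect ratio to keep it centered. The sketch below is a simplified illustration of that idea, not Adobe’s implementation; point_of_interest is a hypothetical stand-in for Sensei’s analysis:

```python
def point_of_interest(frame_width, frame_height):
    # Hypothetical stand-in for ML-based subject detection;
    # here it just returns the frame center.
    return frame_width / 2, frame_height / 2

def reframe(frame_width, frame_height, target_aspect):
    """Compute a crop window of target_aspect centered on the subject."""
    crop_w = min(frame_width, frame_height * target_aspect)
    crop_h = crop_w / target_aspect
    cx, cy = point_of_interest(frame_width, frame_height)
    # Clamp so the crop window stays inside the frame.
    x = min(max(cx - crop_w / 2, 0), frame_width - crop_w)
    y = min(max(cy - crop_h / 2, 0), frame_height - crop_h)
    return x, y, crop_w, crop_h

# A 16:9 master reframed to vertical 9:16:
print(reframe(1920, 1080, 9 / 16))  # -> (656.25, 0, 607.5, 1080.0)
```

A production version would also smooth the crop’s motion over time so the reframe doesn’t jitter from frame to frame.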

Auto Reframe will launch on Premiere Pro later this year. You can watch Adobe’s Victoria Nece talk about Auto Reframe and more from the IBC 2019 show floor.

An editor’s recap of EditFestLA

By Barry Goch

In late August, I attended my first American Cinema Editors’ EditFest on the Disney lot, and I didn’t know what to expect. However, I was very happy indeed to have spent the day learning from top-notch editors discussing our craft.

Joshua Miller from C&I Studios

The day started with a presentation by Joshua Miller from C&I Studios on DaVinci Resolve. Over the past few releases, Blackmagic has added many editor-focused (and editor-requested) features.

The first panel, “From the Cutting Room to the Red Carpet: ACE Award Nominees Discuss Their Esteemed Work,” was moderated by Margot Nack, senior manager at Adobe. The panel included Heather Capps (Portlandia); Nena Erb, ACE (Insecure); Robert Fisher, ACE (Spider-Man: Into the Spider-Verse); Eric Kissack (The Good Place) and Cindy Mollo, ACE (Ozark). As in film school, we would watch a scene, and then the editor of the scene would break it down and discuss their choices. For example, we watched a very dramatic scene from Ozark, and then Mollo described how she amplified a real baby’s crying with sound design to layer on more tension. She also had the music in the scene start at a precise moment to guide the viewer’s emotional state.

The second panel, “Reality vs. Scripted Editing: Demystifying the Difference,” was moderated by Avid’s Matt Feury and featured panelists Maura Corey, ACE (Good Girls, America’s Got Talent); Tom Costantino, ACE (The Orville, Intervention); Jamie Nelsen, ACE (Black-ish, Project Runway) and Molly Shock, ACE (Naked and Afraid, RuPaul’s Drag Race All Stars). The consensus of the panel was that an editor can create stories from reality or from script. The panel also noted that an editor can be quickly pigeonholed by their credits — it’s often hard to look past the credits and discover the person. However, it’s far more important to be able to “gel” with an editor as a person, since the creative is going to spend many hours with them. As with the previous panel, we were treated to short clips and behind-the-scenes discussions. For example, Shock described how she crafted a dramatic scene of an improvised shelter getting washed away during a flood in the middle of a jungle at night — all while the participants were completely naked.

Joe Walker, ACE, and Bobbie O’Steen

The next panel was “Inside the Cutting Room with Bobbie O’Steen: A Conversation with Joe Walker, ACE.” O’Steen, who authored “The Invisible Cut” and “Cut to the Chase,” led Walker, whose credits include Widows, Blade Runner 2049, Arrival, Sicario and 12 Years a Slave, in a wide-ranging conversation about his career, enlivened with clips from his films. In what could be called “evolution of a scene,” Walker broke down the casino lounge scene in Blade Runner 2049, from previs to dailies, and then talked about how the VFX evolved during the edit and how he shaped the scene to final.

The final panel, “The Lean Forward Moment: A Tribute to Norman Hollyn, ACE,” was moderated by Alan Heim, ACE, president of the Motion Picture Editors Guild, and featured assistant editor Ashley Alizor; Reine-Claire Dousarkissian, associate professor of the practice of cinematic arts at USC; editor Saira Haider (Creed II); and Thomas G. Miller, ACE, professor of the practice of cinema arts at USC.

I had the pleasure of interviewing Norm for postPerspective, and he was the kind of man you meet once and never forget — a kind and giving spirit who we lost too soon. The panelists each had a story about how wonderful Norm was, and they honored his teaching by sharing a favorite scene with the audience and explaining how it impacted them through Norm’s teaching. Norm’s colleague at USC, Dousarkissian, chose a scene from the 1952 noir film Sudden Fear, with Jack Palance and Joan Crawford. It’s amazing how much tension can be created by a simple wind-up toy.

I thoroughly enjoyed my experience at EditFest. So often we see VFX breakdowns, which are amazing things, but to see and hear how scenes and story beats are crafted by the best in the business was a treat. I’m already looking forward to attending next year.


Barry Goch is a finishing artist at LA’s The Foundation, as well as a UCLA Extension instructor in post production. You can follow him on Twitter at @Gochya.

Behind the Title: Bindery editor Matt Dunne

Name: Matt Dunne

Company: Bindery

Can you describe your company?
Bindery is an indie film and content studio based in NYC. We model ourselves after independent film studios, tackling every phase of a project from concept all the way through finishing. Our work varies from branded web content and national broadcast commercials to shorts and feature films.

What’s your job title?
Senior Editor

What does that entail?
I’m part of all things post at Bindery. I get involved early on in projects to help ensure we have a workflow set up, and if I’m the editor, I’ll often get a chance to work with the director on conceptualizing the piece. When I get to go on set, I’m able to become the hub of the production side. I’ll work with the director and DP to make sure the image is what they want, and I’ll start assembling the edit as they are shooting. Most of my time is spent in an edit suite with a director and clients, working through their concept and really bringing their story to life. An advantage of working with Bindery is that I’m able to sit and work with directors before they shoot and sometimes even before a concept is locked. There’s a level of trust that develops, and we get to work through ideas and plan for anything that may come up later during the post process. Even though post is the last stage of a film project, it needs to be involved in the beginning; I’m a big believer in that. From the early stages to the very end, I get to touch a lot of projects.

What would surprise people the most about what falls under that title?
I’m a huge tech nerd and gear head, so with two other colleagues I help maintain the post infrastructure of Bindery. When we expanded the office, we had to rewire everything, and I recently helped put a new server together. That’s something I never imagined myself doing.

Editors also become a sounding board for creatives. I think it’s partially because we are good listeners and partially because we have couches in our suites. People like to come in and riff an idea or work through something out loud, even if you aren’t the editor on that project. I think half of being a good editor is just being able to listen.

What’s your favorite part of the job?
Working in an open environment that nurtures ideas and creativity. I love working with people who want to push their product and encourage one another to do the same. It’s really special getting to play a role in it all.

What’s your least favorite?
I think anything that takes me away from the editing process. Any sort of hardware or software issue will completely kill your momentum and at times it can be difficult to get that back.

What’s your most productive time of the day?
Early in the morning. I’m usually walking around the post department checking the stations, double-checking processes that ran overnight or maintaining the server. On the opposite end, I’ve always felt very productive late at night. If I’m not actively editing in the office, then I’m usually replaying the footage I screened during the day in my head, trying to piece it together away from the computer.

If you didn’t have this job, what would you be doing instead?
I would be running a dog sanctuary for senior and abused dogs.

How early on did you know this would be your path?
I first fell in love with post production when I was a kid. It was when Jurassic Park was in theaters, and Fox would run these amazing behind-the-scenes specials. There was this incredible in-depth coverage of how things in the film industry are done. I was too young to see the movie, but I remember just devouring the content. That’s when I knew I wanted to be part of that scene.

Neurotica

Can you name some recent projects you have worked on?
I recently got to help finish a pilot for a series we released called Neurotica. We were lucky enough to premiere it at Tribeca this past season, and getting to see that on the big screen with the people who helped make it was a real thrill for me.

I also just finished cutting a JBL spot where we built soundscapes for Yankees player Aaron Judge and captured him as he listened and was taken on a journey through his career, past and present. The original concept was a bit different than the final deliverable, but because of the way it was shot we were able to re-conceptualize the piece in the edit. There was a lot of room to play and experiment with that one.

Do you put on a different hat when cutting for a specific genre? Can you elaborate?
Absolutely. With every job comes a different approach and different tools. If I’m cutting something more narrative-focused, I’ll make sure I have the script notes up, break my project out by scene and spend a lot of time auditioning different takes to make a scene work. Docu-style is a different approach entirely.

I’ll spend more time prepping that by location or subject and then break that down further. There’s even more back and forth when cutting doc. On a scripted project you have an idea of what the story flow is, but when you’re tasked with finding the edit you’re very much jumping around the story as it evolves. Whether it’s comedy, music or any type of genre, I’m always getting a chance to flex a different editing muscle.

1800 Tequila

What is the project you are most proud of?
There are a few, but one of my favorite collaborative experiences was when we worked with Billboard and 1800 Tequila to create a branded documentary series following Christian Scott aTunde Adjuah. It was five episodes shot in New York, Philadelphia and New Orleans, and the edit was happening simultaneously with production.

As the crew traveled and mapped out their days, I was able to screen footage, assemble and collaborate with the director on ideas that we thought could really enhance the piece. I was on the phone with him when they went back to NOLA for the last shoot and we were writing story beats that we needed to gather to make Episode 1 and 2 work more seamlessly now that the story had evolved. Being able to rework sections of earlier episodes before we were wrapped with production was an amazing opportunity.

What do you use to edit?
Software-wise I’m all in on the Adobe Creative Suite. I’ve been meaning to learn Resolve a bit more since I’ve been spending more and more time with it as a powerful tool in our workflow.

What is your favorite plugin?
Neat Video is a denoiser that’s really incredible. I’ve been able to work with low-light footage that would otherwise be unusable.

Are you often asked to do more than edit? If so, what else are you asked to do?
Since Bindery is involved in every stage of the process, I get this great opportunity to work with audio designers and colorists to see the project all the way through. I love learning by watching other people work.

Name three pieces of technology you can’t live without.
My phone. I think that’s a given at this point. A great pair of headphones, and a really comfortable chair that lets me recline as far back as possible for those really demanding edits.

What do you do to de-stress from it all?
I met my wife back in college, and we’ve been best friends ever since, so spending any amount of time with her helps wash away the stress. We also just bought our first house in February, so there are plenty of projects for me to focus all of my stress into.

Boris FX beefs up film VFX arsenal, buys SilhouetteFX, Digital Film Tools

Boris FX, a provider of integrated VFX and workflow solutions for video and film, has bought SilhouetteFX (SFX) and Digital Film Tools (DFT). The two companies have a long history of developing tools used on Hollywood blockbusters and experience collaborating with top VFX studios, including Weta Digital, Framestore, Technicolor and Deluxe.

This is the third acquisition by Boris FX in recent years — Imagineer Systems (2014) and GenArts (2016) — and builds upon the company’s editing, visual effects, and motion graphics solutions used by post pros working in film and television. Silhouette and Digital Film Tools join Boris FX’s tools Sapphire, Continuum and Mocha Pro.

Silhouette’s groundbreaking non-destructive paint and advanced rotoscoping technology was recognized earlier this year by the Academy of Motion Picture Arts and Sciences with a Technical Achievement Award. It first gained prominence after Weta Digital used the rotoscoping tools on King Kong (2005). Now the full-fledged GPU-accelerated node-based compositing app features over 100 VFX nodes and integrated Boris FX Mocha planar tracking. Over the last 15 years, feature film artists have used Silhouette on films including Avatar (2009), The Hobbit (2012), Wonder Woman (2017), Avengers: Endgame (2019) and Fast & Furious Presents: Hobbs & Shaw (2019).

Avengers: Endgame courtesy of Marvel

Digital Film Tools (DFT) emerged as an offshoot of an LA-based motion picture visual effects facility whose work included hundreds of feature films, commercials and television shows.

The Digital Film Tools portfolio includes standalone applications as well as professional plug-in collections for filmmakers, editors, colorists and photographers. The products offer hundreds of realistic filters for optical camera simulation, specialized lenses, film stocks and grain, lens flares, optical lab processes, color correction, keying and compositing, as well as natural light and photographic effects. DFT plug-ins support Adobe’s Photoshop, Lightroom, After Effects and Premiere Pro; Apple’s Final Cut Pro X and Motion; Avid’s Media Composer; and OFX hosts, including Foundry Nuke and Blackmagic DaVinci Resolve.

“This acquisition is a natural next step in our continued growth strategy and singular focus on delivering the most powerful VFX tools and plug-ins to the content creation market,” says Boris Yamnitsky, CEO/founder of Boris FX. “Silhouette fits perfectly into our product line with superior paint and advanced roto tools that highly complement Mocha’s core strength in planar tracking and object removal. Rotoscoping, paint, digital makeup and stereo conversion are some of the most time-consuming, labor-intensive aspects of feature film post. Sharing technology and tools across all our products will make Silhouette even stronger as the leader in these tasks. Furthermore, we are very excited to be working with such an accomplished team [at DFT] and look forward to collaborating on new product offerings for photography, film and video.”

Silhouette founders Marco Paolini, Paul Miller and Peter Moyer will continue in their current leadership roles and partner with the Mocha product development team to collaborate on delivering next-generation tools. “By joining forces with Boris FX, we are not only dramatically expanding our team’s capabilities, but we are also joining a group of like-minded film industry pros to provide the best solutions and support to our customers,” says Paolini, product designer. “The Mocha planar tracking option we currently license is extremely popular with Silhouette paint and roto artists, and more recently, through OFX, we’ve added support for Sapphire plug-ins. Working together under the Boris FX umbrella is our next logical step, and we are excited to add new features and continue advancing Silhouette for our user base.”

Both Silhouette and the Digital Film Tools plug-ins will continue to be developed and sold under the Boris FX brand. Silhouette will adopt the Boris FX commitment to agile development with annual releases, annual support and subscription options.

Main Image: Silhouette

Review: Dell’s Precision T5820 workstation

By Brady Betzel

Multimedia creators are looking for faster, more robust computer systems and seeing an increase in computing power among all brands and products. Whether it’s an iMac Pro with a built-in 5K screen or a Windows-based, Nvidia-powered PC workstation, there are many options to consider. Many of today’s content creation apps are operating-system-agnostic, but that’s not necessarily true of hardware — mainly GPUs. So for those looking at purchasing a new system, I am going to run through one of Dell’s Windows-based offerings: the Dell Precision T5820 workstation.

The most important distinction between a “standard” computer system and a workstation is the enterprise-level quality and durability of internal parts. While you might build or order a custom-built system for less money, you will most likely not get the same back-end assurances that “workstations” bring to the party. Workstations aren’t always the fastest, but they are built with zero downtime and hardware/software functionality in mind. So while non-workstations might use high-quality components, like an Nvidia RTX 2080 Ti (a phenomenal graphics card), they aren’t necessarily meant to run 24 hours a day, 365 days a year. On the other hand, the Nvidia Quadro series GPUs are enterprise-level graphics cards that are meant to run constantly with low failure rates. This is just one example, but I think you get the point: Workstations run constantly and are warrantied against breakdowns — typically.

Dell Precision T5820
Dell has a long track record of building everyday computer systems that work. Even more impressive are its next-level workstations, which not only stand up to constant use and abuse but are also certified by independent software vendors (ISVs). The ISV designation means Dell has not only tested but also supports the end user’s primary software choices. For instance, in the nonlinear editing space, I found that Dell had tested the Precision T5820 workstation with Adobe Premiere Pro 13.x in Windows 10 and certified the AMD Radeon Pro WX 2100 and 3100 GPUs with 18.Q3.1 drivers.

You can see for yourself here. Dell also has driver suggestions for some recent versions of Avid Media Composer, as well as other software packages. In other words, Dell not only tests but also supports hardware configurations in the approved software apps.

Beyond the ISV certifications and the included three-year hardware warranty with on-site/in-home service after remote diagnostics, how does the Dell Precision T5820 perform? Well, it’s fast and well-built.

The specs are as follows:
– Intel Xeon W-2155 3.3GHz, 4.5GHz Turbo, 10-core, 13.75MB cache with hyperthreading
– Windows 10 Pro for Workstations (four cores plus — an additional cost)
– Precision 5820 Tower with 950W chassis
– Nvidia Quadro P4000, 8GB, four DisplayPorts (5820T)
– 64GB (8x8GB) 2666MHz DDR4 RDIMM ECC memory
– Intel vPro technology enabled
– Dell Ultra-Speed Drive Duo PCIe SSD x8 Card, 1 M.2 512GB PCIe NVMe class 50 Solid State Drive (boot drive)
– 3.5-inch 2TB 7200rpm SATA hard drive (secondary drive)
– Wireless keyboard and mouse
– 1Gb network interface card
– USB 3.1 G2 PCIe card (two Type C ports, one DisplayPort)
– Three years hardware warranty with onsite/in-home service after remote diagnosis

All of this costs around $5,200 without tax or shipping and not including any sale prices.

The Dell Precision T5820 is the mid-level workstation offering from Dell, finding the balance between affordability, performance and reliability — kind of the “better, cheaper, faster” concept. It is one of the quietest Dell workstations I have tested. Besides the spinning hard drive included on the model I was sent, there aren’t many loud cards or fans to distract me when I turn on the system. Dell touts a new multichannel thermal design for advanced cooling and acoustics.

The actual 5820 case is about the size of a mid-sized tower system but feels much slimmer. I even cracked open the case to tinker around with the internal components. The inside fans and multichannel cooling are sturdy and even a little hard to remove without some force — not necessarily a bad thing. You can tell that Dell made it so that when something fails, it is a relatively simple replacement. The insides are very modular. The front of the 5820 has an optical drive, some USB ports (including two USB-C ports) and an audio port. If you get fancy, you can order the systems with what Dell calls “Flex Bays” in the front. You can potentially add up to six 2.5-inch or five 3.5-inch drives and front-accessible storage of up to four M.2 or U.2 PCIe NVMe SSDs. The best part about the front Flex Bays is that, if you choose to use M.2 or U.2 media, they are hot-swappable. This is great for editing projects that you want to archive to an M.2 or save to your Blackmagic DaVinci Resolve cache and remove later.

In the back of the workstation, you get audio in/out, one serial port, PS/2, Ethernet and six USB 3.1 Gen 1 Type A ports. This particular system was outfitted with an optional USB 3.1 Gen 2 10GB/s Type C card with one DisplayPort passthrough. This is used for the Dell UltraSharp 32-inch 4K (UHD) USB-C monitor that I received along with the T5820.

The large Dell UltraSharp 32-inch monitor (U3219Q) offers a slim footprint and a USB-C connection that is very intriguing, but they aren’t giving them away. They cost $879.99 if ordered through Dell.com. With the ultra-minimal Infinity Edge bezel, 400 nits of brightness for HDR content, up to UHD (3840×2160) resolution, 60Hz refresh rate and multiple input/output connections, you can see all of your work in one large IPS panel. For those of you who want to run two computers off one monitor, this Dell UltraSharp has a built-in KVM switch function. Anyone with a MacBook Pro featuring USB-C/Thunderbolt 3 ports can in theory use one USB-C cable to connect and charge. I say “in theory” only because I don’t have a new MacBook Pro to test it on. But for PCs, you can still use the USB-C as a hub.

The monitor comes equipped with a DisplayPort 1.4, HDMI, four USB 3.0 Type A ports and a USB-C port. Because I use my workstation mainly for video and photo editing, I am always concerned with proper calibration. The U3219Q is purported by Dell to be 99% sRGB-, 95% DCI-P3- and 99% Rec. 709-accurate, so if you are using Resolve and outputting through a DeckLink, you will be able to get decent accuracy and even use it for HDR. Over the years, I have really fallen in love with Dell monitors. They don’t break the bank, and they deliver crisp and accurate images, so there is a lot to love. Check out more of this monitor here.

Performance
Working in media creation, I jump around between a bunch of apps and plugins, from Media Composer to Blackmagic's DaVinci Resolve and even from Adobe After Effects to Maxon's Cinema 4D. So I need a system that can handle not only CPU-focused apps like After Effects but also GPU-weighted apps like Resolve. With the Intel Xeon and Nvidia Quadro components, this system should work just fine. I ran some tests in Premiere Pro, After Effects and Resolve. In fact, I used Puget Systems' benchmarking tool with Premiere and After Effects projects. You can find one for Premiere here. In addition, I used the classic 3D benchmark Cinebench R20 from Maxon, and even did some of my own benchmarks.

In Premiere, I was able to play 4K H.264 (50Mb/s and 100Mb/s 10-bit) and ProRes files (HQ and 4444) in realtime at full resolution. Red raw 4K played back at full-quality debayer. But as the Puget Systems Premiere benchmark shows, 8K (as well as effects-heavy clips) started to bog the system down. With 4K, the addition of Lumetri color correction slowed down playback and export a little bit — just a few frames under realtime. It was close though. At half quality, I was essentially playing in realtime. According to the Puget Systems benchmark, the overall CPU score was much higher than the GPU score. Adobe uses a lot of single-core processing. While certain effects, like resizes and blurs, will open up the GPU pipes, I saw the CPU (single-core) kicking in here.

In the Premiere Pro tests, the T5820 really shined bright when working with mezzanine codec-based media like ProRes (HQ and 4444) and even in Red 4K raw media. The T5820 seemed to slow down when multiple layers of effects, such as color correction and blurs, were added on top of each other.

In After Effects, I again used Puget Systems' benchmark — this time the After Effects-specific version. Overall, the After Effects scoring was a B or B-, which isn't terrible considering it was up against the prosumer powerhouse Nvidia RTX 2080. (Puget Systems used the 2080 as the 100% score.) Tracking on the Dell T5820 scored around 90%, while the render and preview scores were around 80%. While this is just what it says — a benchmark — it's a great way to compare machines against the benchmark's reference system: an Intel i9, an RTX 2080 GPU, 64GB of memory and so on.

In Resolve 16 Beta 7, I ran multiple tests on the same 4K (UHD), 29.97fps Red Raw media that Puget Systems used in its benchmarks. I created four 10-minute sequences:
Sequence 1: no effects or LUTs
Sequence 2: three layers of Resolve OpenFX Gaussian blurs on adjustment layers in the Edit tab
Sequence 3: five serial nodes of Blur Radius (at 1.0) created in the Color tab
Sequence 4: in the Color tab, spatial noise reduction was set at 25 radius to medium, blur set to 1.0 and sharpening in the Blur tab set to zero (it starts at 0.5).

Sequence 1, without any effects, would play at full debayer quality in realtime and export at a few frames above realtime, averaging about 33fps. Sequence 2, with Resolve's OpenFX Gaussian blur applied three times to the entire frame via adjustment layers in the Edit tab, would play back in realtime and export at between 21.5fps and 22.5fps. Sequence 3, with five serial nodes of Blur Radius set at 1.0 in the Color tab, would play in realtime and export at about 23fps. Once I added a sixth serial blur node, the system would no longer lock onto realtime playback. Sequence 4 — with spatial noise reduction set at 25 radius to medium, blur set to 1.0 and sharpening in the Blur tab set to zero in the Color tab — would play back at 1fps to 2fps and export at 6.5fps.

All of these exports were QuickTime-based H.264s exported using the Nvidia encoder (the native encoder would slow it down by 10 frames or so). The settings were UHD resolution; "automatic — best" quality; disabled frame reordering; force sizing to highest quality; force debayer to highest quality; and no audio. Once I stacked two layers of raw Red 4K media, I started to drop below realtime playback, even without color correction or effects. I even tried to play back some 8K media: I would get about 14fps at full-res premium debayer, 14 to 16fps at half-res premium, 25fps at half-res good, and 29.97fps (realtime) at quarter-res good.
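To put those export rates in context, here's a quick back-of-the-envelope script in Python. It uses only the figures reported above and converts each export speed into a realtime multiple, plus how long a 10-minute sequence would take to export at that rate:

```python
# Back-of-the-envelope: express the measured export rates as multiples of
# realtime for the 29.97fps Red source used in these tests.
SOURCE_FPS = 29.97

exports = {
    "Sequence 1 (no effects)": 33.0,
    "Sequence 2 (3x Gaussian blur)": 22.0,   # midpoint of the 21.5-22.5fps range
    "Sequence 3 (5 serial blur nodes)": 23.0,
    "Sequence 4 (spatial noise reduction)": 6.5,
}

for name, fps in exports.items():
    ratio = fps / SOURCE_FPS
    # A 10-minute sequence takes (10 / ratio) minutes to export.
    print(f"{name}: {fps}fps = {ratio:.2f}x realtime, "
          f"~{10 / ratio:.1f} min per 10 min of footage")
```

Run that and the noise reduction pass stands out immediately: at 6.5fps, 10 minutes of footage needs roughly 46 minutes to export.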

Using the recently upgraded Maxon Cinebench R20 benchmark, I found the workstation performing adequately, around the fourth-place spot. Keep in mind, there are thousands of possible result combinations depending on CPU, GPU, memory and more; these are only sample results for 3D artists to verify against their own. The Cinebench R20 results were CPU: 4682, CPU (single-core): 436, and MP ratio: 10.73x. If you Google some threads of Cinebench R20 result comparisons, you will eventually find results to compare mine against. My results are a B to B+. A much higher-end Intel Xeon or i9 or an AMD Threadripper processor would really punch this system up a weight class.
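The MP ratio isn't magic: it's just the multi-core score divided by the single-core score. A one-line check against the numbers above:

```python
# Cinebench's MP ratio is the multi-core score divided by the single-core score.
cpu_multi = 4682
cpu_single = 436
print(f"MP ratio: {cpu_multi / cpu_single:.2f}x")
# prints 10.74x; the reported 10.73x is the same figure within rounding
```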

Summing Up
The Dell Precision T5820 workstation comes with a lot of enterprise-level benefits that simply don't come with your average consumer system. The components are meant to run constantly, and Dell has tested its systems against current industry applications, working with the ISVs (independent software vendors) to identify the best optimizations and driver packages. Should anything fail, Dell's three-year warranty (which can be upgraded) will get you up and running fast. Before taxes and shipping, the Dell T5820 I was sent for review would retail for just under $5,200 (maybe even a little more with the DVD drive, recovery USB drive, keyboard and mouse). This is definitely not the system to look at if you are a DIYer or an everyday user who does not need to be running 24 hours a day, seven days a week.

But in a corporate environment, where time is money and no one wants to be searching for answers, the Dell T5820 workstation with the accompanying three-year ProSupport with next-day onsite service will be worth the $5,200. Furthermore, the built-in optimization for applications such as Adobe Creative Cloud is invaluable, and Dell's ProSupport team has direct experience working in those professional apps.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

 

Fred Raskin talks editing and Once Upon a Time… in Hollywood

By Amy Leland

Once Upon a Time… in Hollywood is marketed in a style similar to its predecessors — “the ninth film from Quentin Tarantino.” It is also the third film with Fred Raskin, ACE, as Tarantino’s editor. Having previously edited Django Unchained and The Hateful Eight, as well as working as assistant editor on the Kill Bill films, Raskin has had the opportunity to collaborate with a filmmaker who has always made it clear how much he values collaboration.

On top of this remarkable director/editor relationship, Raskin has also lent his editing hand to a slew of other incredibly popular films, including three entries in the Fast & Furious saga and both Guardians of the Galaxy films. I had the chance to talk with him about his start, his transition to editor and his work on Once Upon a Time… in Hollywood. A tribute to Hollywood’s golden age, the film stars Brad Pitt as the stunt double for a faded actor, played by Leonardo DiCaprio, as they try to find work in a changing industry.

Fred Raskin

How did you get your start as an editor?
I went to film school at NYU to become a director, but I had this realization about midway through that I might not get a directing gig immediately upon graduation, so perhaps I should focus on a craft. Editing was always my favorite part of the process, and I think that of all the crafts, it's the closest to directing. You're crafting performances, you're figuring out how you're going to tell the story visually… and you can do all of this from the comfort of an air-conditioned room.

I told all of my friends in school, if you need an editor for your projects, please consider me. While continuing to make my own stuff, I also cut my friends’ projects. Maybe a month after I graduated, a friend of mine got a job as an assistant location manager on a low-budget movie shooting in New York. He said, “Hey, they need an apprentice editor on this movie. There’s no pay, but it’s probably good experience. Are you interested?” I said, “Sure.” The editor and I got along really well. He asked me if I was going to move out to LA, because that’s really where the work is. He then said, “When you get out to LA, one of my closest friends in the world is Rob Reiner’s editor, Bob Leighton. I’ll introduce the two of you.”

So that’s what I did, and this kind of ties into Once Upon a Time… in Hollywood, because when I made the move to LA, I called Bob Leighton, who invited me to lunch with his two assistants, Alan Bell and Danny Miller. We met at Musso & Frank. So the first meeting that I had was at this classic, old Hollywood restaurant. Cut to 23 years later, and I’m on the set of a movie that’s shooting at Musso & Frank. It’s a scene between Al Pacino and Leonardo DiCaprio, arguably the two greatest actors of their generations, and I’m editing it. I thought back to that meeting, and actually got kind of emotional.

So Bob’s assistants introduced me to people. That led to an internship, which led to a paying apprentice gig, which led to me getting into the union. I then spent nine years as an assistant editor before working my way up to editor.

When you were starting out, were there any particular filmmakers or editors who influenced the types of stories you wanted to tell?
Growing up, I was a big genre guy. I read Fangoria magazine and gravitated to horror, action and sci-fi. Those were the kinds of movies I made when I was in film school. So when I got out to LA, Bob Leighton got a pretty good sense as to what my tastes were, and he gave me the numbers of a couple of friends of his, Mark Goldblatt and Mark Helfrich, who are huge action/sci-fi editors. I spoke with them, and that was just a real thrill because I was so familiar with their work. Now we are all colleagues, and I pinch myself regularly.

 You have edited many action and VFX films. Has that presented particular challenges to your way of working as an editor?
The challenges, honestly, are more ones of time management because when you’re on a big visual effects movie, at a certain point in the schedule you’re spending two to four hours a day watching visual effects. Then you have to make adjustments to the edit to accommodate for how things look when the finished visual effects come in. It’s extremely time-consuming, and when you’re not only dealing with visual effects, but also making changes to the movie, you have to figure out a way to find time for all of this.

Every project has its own specific set of challenges. Yes, the big Marvel movies have a ton of visual effects, and you want to make sure that they look good. The upside is that Marvel has a lot of money, so when you want to experiment with a new visual effect or something, they’re usually able to support your ideas. You can come up with a concept while you’re sitting behind the Avid and actually get to see it become a reality. It’s very exciting.

Let’s talk about the world of Tarantino. A big part of his legacy was his longtime collaboration with editor Sally Menke, who tragically passed away. How were you then brought in? I’m assuming it has something to do with your assistant editor credit on Kill Bill?
Yes. I assisted Sally for seven years. There were a couple of movies that we worked on together, and then she brought me in for the Kill Bill movies. And that’s when I met Quentin. She taught me how an editing room is supposed to work. When she finished a scene, she would bring me and the other assistants into the room and get our thoughts. It was a welcoming, family-like environment, which I think Quentin really leaned into as well.

While he’s shooting, Quentin doesn’t come into the editing room. He comes in during post, but during production, he’s really focused on shooting the movie. On Kill Bill, I didn’t meet him until a few weeks after the shoot ended. He started coming in, and whenever he and Sally worked on a scene together, they would bring us in and get our thoughts. I learned pretty quickly that the more feedback you’re able to give, the more appreciated it will be. Quentin has said that at least part of the reason why he went with me on Django Unchained was because I was so open with my comments. Also, as the whole world knows, Quentin is a huge movie lover. We frequently would find ourselves talking about movies. He’d be walking through the hall, and we’d just strike up a conversation, and so I think he saw in me a kindred spirit. He really kept me in the family after Kill Bill.

I got my first big editing break right after Kill Bill ended. I cut a movie called Annapolis, which Justin Lin directed. I was no longer on Quentin’s crew, but we still crossed paths a lot. Over the years we’d just bump into each other at the New Beverly Cinema, the revival house that he now owns. We’d talk about whatever we’d seen lately. So he always kept me in mind. When he and Sally finished the rough cuts on Death Proof and Inglourious Basterds, he invited me to come to their small friends-and-family screenings, which was a tremendous honor.

On Django, you were working with a director who had the same collaborator in Sally Menke for such a long time. What was it like in those early days working on Django?
It was without question the most daunting experience that I have gone through in Hollywood. We’re talking about an incredibly talented editor, Sally, whose shoes I had to attempt to fill, and a filmmaker for whom I had the utmost respect.

Some of the western town stuff was shot at movie ranches just outside of LA, and we would do dailies screenings in a trailer there. I made sure that I sat near him with a list of screening notes. I really just took note of where he laughed. That was the most important thing. Whatever he laughed at, it meant that this was something that he liked. There was a PA on set when they went to New Orleans. I stayed in LA, but I asked her to write down where he laughed.

I’m a fan of his. When I went to see Reservoir Dogs, I remember walking out of the theater and thinking, “Well, that’s like the most exciting filmmaker that I’ve seen in quite some time.” Now I’m getting the chance to work with him. And I’ll say because of my fandom, I have a pretty good sense as to his style and his sense of humor. I think that that all helped me when I was in the process of putting the scenes together on Django. I was very confident in my work when I started showing him stuff on that movie.

Now, seven years later, you are on your third film with him. Have you found a different kind of rhythm working with him than you had on that first film?
I would say that a couple of little things have changed. I personally have gained some confidence in how I approach stuff with him. If there was something that I wasn’t sure was working, or that maybe I felt was extraneous, in Django, I might have had some hesitation about expressing it because I wouldn’t want to offend him. But now both of us are coming from the perspective of just wanting to make the best movie that we possibly can. I’m definitely more open than I might have been back then.

Once Upon a Time… in Hollywood has an interesting blend of styles and genres. The thing that stands out is that it is a period piece. Beyond that, you have the movies and TV shows within the movie that give you additional styles. And there is a “horror movie” scene.
Right, the Spahn Ranch sequence.

That was so creepy! I really had that feeling the whole time of, "They can't possibly kill off Brad Pitt's character this early, can they?"
That’s the idea. That’s what you’re supposed to be feeling.

When you are working with all of those overlapping styles, do you have to approach the work a different way?
The style of the films within the film was influenced by the movies of the era to some degree. There wasn't anything stylistically that had us trying to make the movie itself feel like a movie from 1969. For example, Leonardo DiCaprio's character, Rick Dalton, is playing the heavy on a western TV show called Lancer in the movie. Quentin described the Lancer material by saying, "Lancer is my third western, after Django and The Hateful Eight." He didn't direct that show as though it was a TV western from the late '60s. He directed it like it was a Quentin Tarantino western from 2019. Quentin's style is really all his own.

There are no rules when you’re working on a Quentin Tarantino movie because he knows everything that’s come before, and he is all about pushing the boundaries of what you can do — which is both tremendously exciting and a little scary, like is this going to work for everyone? The idea that we have a narrator who appears once in the first 10 minutes of the movie and then doesn’t appear again until the last 40 minutes, is that something that’s going to throw people off? His feeling is like, yeah, there are going to be some people out there who are going to feel that it’s weird, but they’re also going to understand it. That’s the most important thing. He’s a firm believer in doing whatever we need to do to tell the story as clearly and as concisely as possible. That voiceover narration serves that purpose. Weird or not.

You said before that he doesn’t come into the edit during production. What is your work process during production? Are you beginning the rough cut? And if so, are you sending him things, or are you really not collaborating with him on that process at all until post begins?
This movie was shot in LA, so for the first half of the shoot, we would do regular dailies screenings. I'd sit next to him and write down whatever he laughed at. That process that began on Django has continued. I'll take those notes, and I assemble the material as we're shooting, but I don't show him any of it. I'm not sending him cuts. He doesn't want to see cuts. I don't think he wants the distraction of needing to focus on editing.

On this movie, there were only two occasions when he did come into the editing room during production. The movie takes place over the course of three days, and at the end of the second day, the characters are watching Rick on the TV show The F.B.I., which was a real show; the episode was called "All the Streets Are Silent." The character of Michael Murtaugh was played in the original episode by a young Burt Reynolds. They found a location that matched pretty perfectly and reshot only the shots that had Burt Reynolds in them. They reshot with Leonardo DiCaprio, as Rick Dalton, playing that character. He had to come into the editing room to see how it played and how it matched, and it matched remarkably well. I think that people watching the movie probably assume that Quentin shot the whole thing, or that we used some CG technology to get Leo into the shots. But no, they just figured out exactly the shots that they needed to shoot, and that was all the new material. The rest was from the original episode.

The other time he came into the edit during production was the sequence in which Bruce Lee and Cliff have their fight. The whole dialogue scene that opens that sequence plays out in one long take, so he was very excited to see how that shot played out. But one of the things that we had spoken about over the course of working together is that when you do a long take, the most important thing is the cut at the end of it. How can we make that cut the most impactful? In this case, the cut is to Cliff throwing Bruce Lee into the car. He wanted to watch the whole scene play out and then see how that cut worked. When I showed it to him, I kept my finger on the stop button so that right after that cut I could stop playback, so he wouldn't see anything more and wouldn't get tempted into giving notes. I reached to stop, but he was like, "No, no, no, let it play out." He watched the fight scene, and he was like, "That's fantastic." He was very happy.

Once you were in post, what were some of the particular challenges of this film?
One of the really important things is how integral sound was to the process of making this movie. First there were the movies and shows within the movie. When we’re watching the scenes from Bounty Law, the ‘50s Western that Rick starred in, it wasn’t just about the 4×3, black and white photography, but also how we treated the sound. Our sound editorial team and our sound mixing team did an amazing job of getting that stuff to sound like a 16-millimeter print. Like, they put just the right amount of warble into the dialogue, and it makes it feel very authentic. Also, all the Bounty Law stuff is mono, not this wide stereo thing that would not be appropriate for the material from that era.

And I mentioned the Spahn Ranch sequence, when for 20 minutes the movie turns into an all-out horror movie. One of Quentin’s rules for me when I’m putting my assembly together is that he generally does not want me cutting with music. He frequently has specific ideas in his head about what the music is going to be, and he doesn’t want to see something that’s not the way he imagined it. That’s going to take him out of it, and he won’t be able to enjoy the sequence.

When I was putting the Spahn Ranch sequence together, I knew that I had to make it suspenseful without having music to help me. So, I turned to our sound editors, Wylie Stateman and Leo Marcil, and said, “I want this to sound like The Texas Chain Saw Massacre, like I want to have low tones and creaking wood and metal wronks. Let’s just feel the sense of dread through this sequence.” They really came through.

And what ended up happening is, I don’t know if Quentin’s intention originally was to play it without music, but ultimately all the music in the scene comes from what Dakota Fanning’s character, Squeaky, is watching on the TV. Everything else is just sound effects, which were then mixed into the movie so beautifully by Mike and Chris Minkler. There’s just a terrific sense of dread to that sequence, and I credit the sound effects as much as I do the photography.

This film was cut on Avid. Have you always cut on Avid? Do you ever cut on anything else?
When I was in film school, I cut on film. In fact, I took the very first Avid class that NYU offered. That was my junior year, which was long before there were such things as film options or anything. It was really just kind of the basics on a basic Avid Media Composer.

I’ve worked on Final Cut Pro a few times. That’s really the only other nonlinear digital editing system that I’ve used. I’ve never actually used Premiere.

At this point my whole sound effects and music library is Avid-based, and I’m just used to using the Avid. I have a keyboard where all of my keys are mapped, and I find, at this point, that it’s very intuitive for me. I like working with it.

This movie was shot on film, and we printed dailies from the negative. But the negative was also scanned at 4K, and then those 4K scans were down-converted to DNx115, an HD-resolution codec on the Avid. So we were editing in HD, and we could do screenings from that material when we needed to. But we would also do screenings on film.

Wow, so even with your rough cuts, you were turning them around to film cuts again?
Yeah. Once production ended, and Quentin came into the editing room, when we refined a scene to his liking, I would immediately turn that over to my Avid assistant, Chris Tonick. He would generate lists from that cut and would turn it over to our film assistants, Bill Fletcher and Andrew Blustain. They would conform the film print to match the edit that we had in the Avid so that we were capable of screening the movie on film whenever we wanted to. There was always going to be a one- or two-day lag time, depending on when we finished cutting on the Avid. But we were able to get it up there pretty quickly.

Sometimes if you have something like opticals or titles, you wouldn’t be able to generate those for film quickly enough. So if we wanted to screen something immediately, we would have to do it digitally. But as long as we had a couple of days, we would be able to put it up on film, and we did end up doing one of our test screenings on 35 millimeter, which was really great. It added one more layer of authenticity to the movie, getting to see it projected on film.

For a project of this scope, how many assistants do you work with, and how do you like to work with those assistants?
Our team consists of post production supervisor Tina Anderson, who really oversees everything. She runs the editing room. She figures out what we’re going to need. She’s got this long list of items that she goes down every day, and makes sure that we are prepared for whatever is going to come our way. She’s really remarkable.

My first assistant Chris Tonick is the Avid assistant. He cut a handful of scenes during production, and I would occasionally ask him to do some sound work. But primarily during production, he was getting the dailies prepped — getting them into the Avid for me and laying out my bins the way I like them.

In post, we added an Avid second named Brit DeLillo, who would help Chris when we needed to do turnovers for sound or visual effects, music, all of those people.

Then we had our film crew, Bill Fletcher and Andrew Blustain. They were syncing dailies during production, and then they were conforming the film print during post.

Last, but certainly not least, we had Alana Feldman, our post PA, who made sure we had everything we needed.

And honestly, for everybody on the crew, their most important role beyond the work that they were hired to do, was to be an audience member for us whenever we finished a scene. That tradition I experienced as an assistant working under Sally is the tradition that we’ve continued. Whenever we finish a sequence, we bring the whole crew up and show them the scene. We want people to react. We want to hear how they’re responding. We want to know what’s working and what isn’t working. Being good audience members is actually a key part of the job.

L-R: Quentin Tarantino, post supervisor Tina Anderson, first assistant editor (Film) Bill Fletcher, Fred Raskin, 2nd assistant editor (Film) Andrew Blustain, 2nd assistant editor (Avid) Brit DeLillo, post assistant Alana Feldman, producer Shannon McIntosh, 1st assistant editor (Avid) Chris Tonick, assistant to producer Ryan Jaeger and producer David Heyman

When you’re looking for somebody to join your team as an assistant, what are you looking for?
There are a few things. One obvious thing, right off the bat, is someone who is personable. Is this someone I’m going to want to have lunch with every day for months on end? Generally, especially working on a Quentin Tarantino movie, somebody with a good knowledge of film history who has a love of movies is going to be appreciated in that environment.

The other thing that I would say honestly — and this might sound funny — is having the ability to see the future. And I don't mean that I need psychic film assistants. I mean they need to be able to figure out what we're going to need later on down the line and be prepared for it.

If I turn over a sequence, they should be looking at it and realizing, oh, there are some visual effects in here that we’re going to have to address, so we have to alert the visual effects companies about this stuff, or at least ask me if it’s something that I want.

If there were somebody who thought to themselves, “I want a career like Fred Raskin’s. I want to edit these kinds of cool films,” what advice would you give them as they’re starting out?
I have three standard pieces of advice that I give to everyone. My experience, I think, is fairly unique. I’ve been incredibly fortunate to get to work with some of my favorite filmmakers. The way my story unfolded … not everybody is going to have the opportunities I’ve had.

But my standard pieces of advice are, number one — and I mentioned this earlier — be personable. You’re working with people you’re going to share space with for many months on end. You want to be the kind of person with whom they’re going to want to spend time. You want to be able to get along with everyone around you. And you know, sometimes you’ve got some big personalities to deal with, so you have to be the type who can navigate that.

Then I would say, watch everything you possibly can. Quentin is obviously an extreme example, but most filmmakers got into this business because they love movies. And so the more you know about movies, and the more you’re able to talk about movies, the more those filmmakers are going to respect you and want to work with you. This kind of goes hand in hand with being personable.

The other piece of advice — and I know this sounds like a no-brainer — if you’re going for an interview with a filmmaker, make sure you’ve familiarized yourself with that person’s work. Be able to talk with them about their movies. They’re going to appreciate that you took the time to explore their work. Everybody wants to talk about the work they’ve done, so if you’re able to engage them on that level, I think it’s going to reflect well on you.

Absolutely. That’s great advice.


Amy Leland is a film director and editor. Her short film, “Echoes”, is now available on Amazon Video. She also has a feature documentary in post, a feature screenplay in development, and a new doc in pre-production. She is an editor for CBS Sports Network and recently edited the feature “Sundown.” You can follow Amy on social media on Twitter at @amy-leland and Instagram at @la_directora.

Building a massive editing storage setup on a budget

By Mike McCarthy

This year, I oversaw the editing process for a large international film production. This involved setting up a collaborative editing facility in the US, at Vasquez Saloon, with a large amount of high-speed storage for the source footage. While there was “only” 6.5TB of offline DNxHR files, they shot around 150TB of Red footage that we needed to have available for onsite VFX, conform, etc. Once we finished the edit, we were actually using 40TB of that footage in the cut, which we needed at another location for further remote collaboration. So I was in the market for some large storage solutions.

Our last few projects have been small enough to fit on eight-bay desktop eSAS arrays, which are quiet and relatively cheap. (Act of Valor was on a 24TB array of 3TB drives in 2010, while 6 Below was on 64TB arrays of 8TB drives.) Now that 12TB drives are available, those arrays can reach 96TB, but we needed more capacity than that. With that much data on a single spindle, you lose more capacity to maintain redundancy, with RAID-6 dropping the raw space to 72TB.

Large numbers of smaller drives offer better performance and more efficient redundancy, as well as being cheaper per TB, at least for the drives. But once you get into large rack-mounted arrays, they are much louder and need to be located farther from the creative space, requiring different interconnects than direct-attached SAS. My initial quotes were for a 24x 8TB solution offering 192TB of storage, before RAID-6 and formatting left us with 160 usable terabytes of space, for around $15K.

I was in the process of ordering one of those from ProAvio when they folded last Thanksgiving, resetting my acquisition process. I looked into building one myself, with a SAS storage chassis and bare drives, when I stumbled across refurbished servers on eBay. There are numerous companies selling used servers that include storage chassis, backplanes and RAID cards, for less than just this case costs new.

The added benefit is that these include a fully functioning Xeon-level computer system as well. At the very least, this allows you to share the storage over a 10GbE network, and in our case we were also able to use it as a render node and eventually a user workstation. That solution worked well enough that we will be using similar items for future artist stations, even without that type of storage requirement. I have set up two separate systems so far, for different needs, and learned a lot in the process. I thought I would share some of those details here.

Why use refurbished systems for top-end work? Most of the CPU advances in the last few years have come in the form of increased core counts and energy efficiency. This means that in lightly threaded applications, CPUs from a few years ago will perform nearly as well as brand-new ones. And previous-generation DDR3 RAM is much cheaper than DDR4. PCIe 3.0 has been around for many generations, but older systems won't have Thunderbolt 3 and may not even have USB 3. USB 3 can be added with an expansion card, but Thunderbolt will require a current-generation system. The other primary limitation is finding systems that have drivers for running Windows 10, since those systems are usually designed for Linux and Windows Server. Make sure you verify the motherboard will support Windows 10 before you make a selection. (Unfortunately, Windows 7 is finally dying, with no support from Microsoft or current application releases.)

Workstations and servers are closely related at the hardware level, but have a few design differences. They use the same chipsets and Xeon processors, but servers are designed for remote administration in racks while workstations are designed to be quieter towers with more graphics capability. But servers can be used for workstation tasks with a few modifications, and used servers can be acquired very cheaply. Also, servers frequently have the infrastructure for large drive arrays, while workstations are usually designed to connect to separate storage for larger datasets.

Recognizing these facts, I set out to build a large repository for my 150TB of Red footage on a system that could also run my Adobe applications and process the data. While 8TB drives are currently the optimal size for storing the most data at the lowest total price, that will change over time. And 150TB of data requires more than 16 drives, so I focused on 4U systems with 24 drive bays. Starting with 192TB of raw storage, subtracting two drives for RAID-6 (16TB) and 10% for Windows overhead leaves me with 160TB of storage space reported in Windows.
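If you want to run the same math on other drive counts and sizes, the arithmetic is easy to script. Here's a minimal Python sketch using the rule of thumb from this article (two parity drives for RAID-6, then roughly 10% lost to formatting overhead); the 10% is an approximation, not an exact figure:

```python
# Rough usable-capacity math: raw capacity, minus two drives' worth of
# RAID-6 parity, minus ~10% overhead once formatted.
def raid6_capacity(num_drives, drive_tb, overhead=0.10):
    raw = num_drives * drive_tb
    after_parity = raw - 2 * drive_tb        # RAID-6 gives up two drives to parity
    usable = after_parity * (1 - overhead)   # ~10% lost to formatting/overhead
    return raw, after_parity, usable

for n, size in [(8, 12), (24, 8)]:
    raw, parity, usable = raid6_capacity(n, size)
    print(f"{n}x {size}TB: {raw}TB raw, {parity}TB after RAID-6, ~{usable:.0f}TB usable")
# 8x 12TB:  96TB raw, 72TB after RAID-6 (the desktop-array case earlier,
#           which quoted capacity before formatting overhead)
# 24x 8TB: 192TB raw, 176TB after RAID-6, ~158TB usable (roughly the
#           160TB reported in Windows)
```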

4U chassis also allow for full-height PCIe cards, which is important for modern GPUs. Finding support for full-height PCIe slots is probably the biggest challenge in selecting a chassis, as most server cards are low profile. A 1U chassis can fit a dual-slot GPU if it’s designed to accept one horizontally, but cooling may be an issue for workstation cards. A 2U chassis has the same issue, so you must have a 3U or 4U chassis to install full-height PCIe cards vertically, and the extra space will help with cooling and acoustics as well.

Dell and HP offer options as well, but I went with Supermicro since their design fit my needs the best. I got a 4U chassis with a 24-port pass-through SAS backplane for maximum storage performance and an X9DRi-LNF4+ motherboard that was supposed to support Windows 7 and Windows 10. The pass-through backplane gave full-speed access to 24 drives over six quad-channel SFF-8643 ports, but required a 24-port RAID card and more cables. The other option is a port-multiplying backplane, which has a single or dual SFF-8643 connection to the RAID card. This allows for further expansion at the expense of potential complexity and latency. And 12G SAS is 1.5GB/s per lane, so in theory a single SFF-8643 cable can pass up to 6GB/s, which should be as much as most RAID controllers can handle anyway.
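That per-cable ceiling is quick to verify. A small sketch, using the theoretical per-lane rate quoted above:

```python
# 12G SAS works out to roughly 1.5GB/s per lane, and an SFF-8643
# connector carries four lanes.
LANE_GBPS = 1.5    # theoretical GB/s per 12G SAS lane
LANES_PER_CABLE = 4
CABLES = 6         # the pass-through backplane uses six SFF-8643 ports for 24 drives

per_cable = LANE_GBPS * LANES_PER_CABLE
print(f"Per cable: {per_cable:.0f} GB/s; backplane aggregate: {per_cable * CABLES:.0f} GB/s")
# -> 6 GB/s per cable, 36 GB/s aggregate (far beyond what the RAID card
#    or the spinning drives can actually sustain)
```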

The system cost about $2K, plus $5K for the 24 drives, which is less than half of what I was looking at paying for a standalone external SAS array and it included a full computer with 20 CPU cores and 128GB RAM. I considered it a bit of a risk, as I had never done something at that scale and there was no warranty, but we decided that the cost savings was worth a try. It wasn’t without its challenges, but it is definitely a viable solution for a certain type of customer. (One with more skills than money.)

Putting it to Use
The machine ran loud, as was to be expected with 24 drives and five fans, but it was installed in a machine room with our rackmount UPS and network switches, so the noise wasn't a problem. I ran 30-foot USB and HDMI cables to the user station in the next room and frequently controlled it via VNC. I added an Nvidia Pascal-based Quadro card, a 10GbE card and a USB 3 card, as well as a SATA SSD for the OS in an optional 2.5-inch drive tray. Once I got the array set up and initialized, it benchmarked at over 3000MB/s transfer rate. This was far more than I needed for Red files, but I won't turn down excess speed for future use with uncompressed 8K frames or 40GbE network connections.
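How much headroom is that, really? Here's a rough estimate of an uncompressed 8K stream, assuming 10-bit RGB packed at 4 bytes per pixel. That packing is an assumption for illustration (DPX-style), not the spec of any particular format:

```python
# Is ~3000MB/s enough for uncompressed 8K? A rough estimate.
width, height = 7680, 4320
bytes_per_pixel = 4   # assumption: three 10-bit channels packed into 32 bits
fps = 24

frame_mb = width * height * bytes_per_pixel / 1e6
print(f"~{frame_mb:.0f}MB per frame, ~{frame_mb * fps:.0f}MB/s at {fps}fps")
# -> ~133MB per frame, ~3185MB/s: right at the edge of this array's
#    measured ~3000MB/s, so "excess" speed disappears fast at 8K
```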

Initially, I had trouble with Windows 10. I was getting bluescreen ACPI BIOS errors on boot, but Windows 7 worked flawlessly. I used Win7 for a month, but I knew I would need to move to Win10 within the year and was looking at building more systems, so I needed to confirm that Win10 could work successfully. I eventually determined that it was Windows Update — which has always been the bane of my existence when using Win10 — that was causing the problem. It was automatically updating one of the chipset drivers to a version that prevented the system from booting. The only solution was to prevent Win10 from accessing the internet until after the current driver was successfully installed, and the only way to disable Windows Update during install is to totally disconnect the system from the network. Once I did that, everything worked great, and I ordered another system.

The second time I didn’t need as much data, so I went with a 16-bay 3U chassis… which was a mistake. It ran hotter and louder with less case space, and it doesn’t fit GPUs with top-mounted power plugs or full-sized CPU coolers. So regardless of how many drive bays you need, I recommend buying a 24-bay 4U system for the space it gives you. (The SuperMicro 36 bay systems look the same from the front, but have less space available since the extra 12 bays in the rear constrain the motherboard similar to a 2U case.) The extra space also gives you more options for cards and cooling solutions.

I also tried an NVMe drive in a PCIe slot, and while it works, booting from it is not an option without modding the BIOS, which I was not about to experiment with. So I installed the OS on a SATA SSD again and was able to adapt it to one of the 16 standard drive bays, as I only needed eight of them for my 64TB array. This system had a pass-through backplane with 16 single-port SATA connectors, which is much messier than the SFF-8643 connectors. But it works, and it's simpler to mix the drives between the RAID card and the motherboard, which is a plus.

When I received the unit, it was FAR louder than the previously ordered 4U one, for a number of reasons. It had 800W power supplies — instead of the 920W-SQ (Super-quiet) ones in my first one — and the smaller case had different airflow limitations. I needed this one to be quieter than the first system, as I was going to be running it next to my desk instead of in a machine room. So I set about redesigning the cooling system, which was the source of 90% of the noise. I got the power supplies replaced with 920SQ ones, although the 700W ones are supposed to be quiet as well, and much cheaper.

I replaced the five 80mm 5,000RPM jet-engine system fans with Noctua 1,800RPM fans, which made the system quiet but didn't provide enough airflow for the passively cooled CPUs. I then ordered two large CPU coolers with horizontally mounted 92mm fans to cool the Xeon chips, replacing the default passive heatsinks that use case airflow for cooling. I also installed a 40mm x 20mm fan on the RAID card, which had been overheating even with the default jet-engine fans. Once I had those eight Noctua fans installed, the system was whisper quiet and could render at 100% CPU usage without throttling or overheating. So I was able to build a system with 16 cores and 128GB RAM for about $1,500, not counting the 64TB of storage, which doubles that price, and the GPU, which I already had. (Although a GTX 1660 can be had for $300 and would be a good fit in that budget range.) The first one I built had 20 cores at 3GHz and 128GB RAM for about $2,000, plus $5,000 for the 192TB of storage. I was originally looking at getting just the 192TB external arrays for twice that price, so by comparison this was half the cost with a high-end computer tossed in as a bonus.

Looking Ahead
The things I plan to do differently in the future include:
Always getting the 4U chassis for maximum flexibility,
making sure to get quiet power supplies ($50 to $150) and
budgeting to replace all the fans and CPU coolers if noise is going to be an issue ($200).

But at the end of the day, you should be able to get a powerful dual-socket system ready to support massive storage volume for around $2,000. This solution makes the most sense when you need large capacity storage, as well as the editing system. Otherwise some of what you are paying for is going to waste.


Mike McCarthy is an online editor/workflow consultant with over 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

Behind the Title: Element EP Kristen Kearns

NAME: Kristen Kearns

COMPANY: Boston’s Element Productions

CAN YOU DESCRIBE YOUR COMPANY?
Element has been in business for 20 years. We handle production and post production for video content on all platforms.

WHAT’S YOUR JOB TITLE?
Executive Producer / COO

WHAT DOES THAT ENTAIL?
I oversee the office operations and company culture, and I work with clients on their production and post projects. I handle sales and bidding and work with our post and production talent to keep growing and expanding their creative goals.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I wear a lot of hats. I think people are always surprised by how much I have to juggle, from hiring employees and approving bills to bidding projects and collaborating with directors on treatments.

WHAT TOOLS DO YOU USE?
We love Slack, Box and Google Apps. Collaboration is such a big part of what we do, and we could not function as seamlessly as we do without these awesome tools.

WHAT’S YOUR FAVORITE PART OF THE JOB?
The people. I love who I work with.

WHAT’S YOUR LEAST FAVORITE?
When we work really hard on bidding a project and we don’t win. I understand this is a competitive business, but it is still really hard to lose after you put so much time and energy into a bid.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
I love the mornings. I like the quiet before everyone comes in. I get into the office early and take that time to think through my day and my priorities. Or, sometimes I use the time to brainstorm and think through business challenges or business goals for the overall growth of the company.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I am a bit obsessed with The Home Edit. If you don’t follow them on Instagram, you should. Their stories are hilarious. Anyway, I would want to work for them. Crazy lives all wrapped up in tidy cabinets.

Alzheimer’s Association

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
We recently launched a project for a local bank that featured a Yeti, a unicorn and a Sasquatch. Projects like this are what keep my job interesting and challenging. I had to do a bunch of research on costumes and prosthetics.

We also just wrapped on a short film for the Alzheimer’s Association. Giving back is a really important part of our company culture. We were so moved by the story of this couple and their struggles with this debilitating disease. I was really proud to be a part of this production.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
I am proud of a lot of the work that we do, but I would say most recently we worked on a multi-platform project with Dunkin’ that really stretched our producing skills. The idea was very innovative, with the goal being to power a home entirely on coffee grounds.

We connected all the dots of the projects, from finding a biofuel manufacturer to the builder in Nashville, and documented the entire process. The project manifested itself into a live event in New York City before traveling to the coast of Massachusetts to be listed as an Airbnb.

Dunkin

NAME PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
I recently went to Washington, DC, with my family, and the National Museum of American History had an exhibit "Within These Walls." It highlighted the evolution of one home, and with it the changing technology. I remember being really taken aback by the laundry exhibit. I think we all take for granted the time and convenience it saves us. Can you imagine if we had to spend hours dunking and wringing out clothes? It has actually given us more freedom and convenience to pursue passions and interests. I could live without my phone or a television, but trap me with a bucket and a clothesline and I would lose my mind.

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
I grew up in a dance studio, so I actually find that I work better with some sort of music in the background. The office has a Sonos system, so we all take turns playing music.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Immersing myself in art and culture, whether it is going to a museum to view artwork, seeing a band or heading to a movie to truly appreciate other people's creativity. It is the best way for me to unwind as I enjoy the talent and art of others.

Harbor expands to LA and London, grows in NY

New York-based Harbor has expanded into Los Angeles and London and has added staff and locations in New York. Industry veteran Russ Robertson joins Harbor’s new Los Angeles operation as EVP of sales, features and episodic after a 20-year career with Deluxe and Panavision. Commercial director James Corless and operations director Thom Berryman will spearhead Harbor’s new UK presence following careers with Pinewood Studios, where they supported clients such as Disney, Netflix, Paramount, Sony, Marvel and Lucasfilm.

Harbor’s LA-based talent pool includes color grading from Yvan Lucas, Elodie Ichter, Katie Jordan and Billy Hobson. Some of the team’s projects include Once Upon a Time … in Hollywood, The Irishman, The Hunger Games, The Maze Runner, Maleficent, The Wolf of Wall Street, Snow White and the Huntsman and Rise of the Planet of the Apes.

Paul O’Shea, formerly of MPC Los Angeles, heads the visual effects teams, tapping lead CG artist Yuichiro Yamashita for 3D out of Harbor’s Santa Monica facility and 2D creative director Q Choi out of Harbor’s New York office. The VFX artists have worked with brands such as Nike, McDonald’s, Coke, Adidas and Samsung.

Harbor’s Los Angeles studio supports five grading theaters for feature film, episodic and commercial productions, offering private connectivity to Harbor NY and Harbor UK, with realtime color-grading sessions, VFX reviews and options to conform and final-deliver in any location.

The new UK operation, based out of London and Windsor, will offer in-lab and near-set dailies services along with automated VFX pulls and delivery through Harbor’s Anchor system. The UK locations will draw from Harbor’s US talent pool.

Meanwhile, the New York operation has grown its talent roster and Soho footprint to six locations, with a recently expanded offering for creative advertising. Veteran artists on the commercial team include editors Bruce Ashley and Paul Kelly, VFX supervisor Andrew Granelli, colorist Adrian Seery, and sound mixers Mark Turrigiano and Steve Perski.

Harbor’s feature and episodic offering continues to expand, with NYC-based artists available in Los Angeles and London.

Digital Arts expands team, adds Nutmeg Creative talent

Digital Arts, an independently owned New York-based post house, has added several former Nutmeg Creative talent and production staff members to its roster — senior producer Lauren Boyle, sound designer/mixers Brian Beatrice and Frank Verderosa, colorist Gary Scarpulla, finishing editor/technical engineer Mark Spano and director of production Brian Donnelly.

“Growth of talent, technology, and services has always been part of the long-term strategy for Digital Arts, and we’re fortunate to welcome some extraordinary new talent to our staff,” says Digital Arts owner Axel Ericson. “Whether it’s long-form content for film and television, or working with today’s leading agencies and brands creating dynamic content, we have the talent and technology to make all of our clients’ work engaging, and our enhanced services bring their creative vision to fruition.”

Brian Donnelly, Lauren Boyle and Mark Spano.

As part of this expansion, Digital Arts will unveil additional infrastructure featuring an ADR stage/mix room. The current facility boasts several state-of-the-art audio suites, a 4K finishing theater/mixing dubstage, four color/finishing suites and expansive editorial and production space, which is spread over four floors.

The former Nutmeg team has hit the ground running, working with their long-time ad agency, network, animation and film studio clients. Gary Scarpulla worked on color for HBO's Veep and Los Espookys, while Frank Verderosa has been working with agency Ogilvy on several Ikea campaigns. Beatrice mixed spots for Tom Ford's cosmetics line.

In addition, Digital Arts’ in-house theater/mixing stage has proven to be a valuable resource for some of the most popular TV productions, including recording recent commentary sessions for the legendary HBO series, Game of Thrones and the final season of Veep.

Especially noteworthy is colorist Ericson’s and finishing editor Mark Spano’s collaboration with Oscar-winning directors Karim Amer and Jehane Noujaim to bring to fruition the Netflix documentary The Great Hack.

Digital Arts also recently expanded its offerings to include production services. The company has already delivered projects for agencies Area 23, FCB Health and TCA.

“Digital Arts’ existing infrastructure was ideally suited to leverage itself into end-to-end production,” Donnelly says. “Now we can deliver from shoot to post.”

Tools employed across post include Avid Pro Tools, D-Control ES and S3 for audio post, and Avid Media Composer, Adobe Premiere and Blackmagic DaVinci Resolve for editing. Color grading is via Resolve.

Main Image: (L-R) Frank Verderosa, Brian Beatrice and Gary Scarpulla

 

Cabin adds two editors, promotes another

LA-based editorial studio Cabin Editing Company has grown its editing staff with the addition of Greg Scruton and Debbie Berman. They have also promoted Scott Butzer to editor. The trio will work on commercials, music videos, branded content and other short-form projects.

Scruton, who joins Cabin from Arcade Edit, has worked on dozens of high-profile commercials and music videos throughout his career, including Pepsi's 2019 Grammys spot Okurrr, starring Cardi B; Palms Casino Resort's star-filled Unstatus Quo; and Kendrick Lamar's iconic Humble music video, for which he earned an AICE Award. Scruton has worked with high-profile ad agencies and directors, including Anomaly; Wieden + Kennedy; 72andSunny; Goodby, Silverstein & Partners; Dave Meyers; and Nadia Lee Cohen. He uses Avid Media Composer and Adobe Premiere.

Feature film editor Berman joins Cabin on the heels of her successful run with Marvel Studios, having recently served as an editor on Spider-Man: Homecoming, Black Panther and Captain Marvel. Her work extends across mediums, with experience editing everything from PSAs and documentaries to animated features. Now expanding her commercial portfolio with Cabin, Berman is currently at work on a Toyota campaign through Saatchi & Saatchi. She will continue to work in features as well. She mostly uses Media Composer but can also work on Premiere.

Cabin’s Butzer was recently promoted to editor after joining the company in 2017 and honing his talent across many platforms, including commercials, music videos and documentaries. His strengths include narrative and automotive work. Recent credits include Every Day Is Your Day for Gatorade celebrating the 2019 Women’s World Cup, The Professor for Mercedes Benz and Vince Staples’ Fun! music video. Butzer has worked with ad agencies and directors, including TBWA\Chiat\Day; Wieden + Kennedy; Goodby, Silverstein & Partners; Team One; Marcus Sonderland; Ryan Booth; and Rachel McDonald. Butzer previously held editorial positions at Final Cut and Whitehouse Post. He studied film at the University of Colorado at Boulder. He also uses Media Composer and Premiere.

Blackmagic: Resolve 16.1 in public beta, updates Pocket Cinema Camera

Blackmagic Design has announced DaVinci Resolve 16.1, an updated version of its edit, color, visual effects and audio post software that features updates to the new cut page, further speeding up the editing process.

With Resolve 16, introduced at NAB 2019, now in final release, the Resolve 16.1 public beta is available for download from the Blackmagic Design website. This new public beta will help Blackmagic continue to develop new ideas while collaborating with users to ensure those ideas are refined for real-world workflows.

The Resolve 16.1 public beta features changes to the bin that now make it possible to place media in various folders and isolate clips from being used when viewing them in the source tape, sync bin or sync window. Clips will appear in all folders below the current level, and as users navigate around the levels in the bin, the source tape will reconfigure in real time. There’s even a menu for directly selecting folders in a user’s project.

Also new in this public beta is the smart indicator. The new cut page in DaVinci Resolve 16 introduced multiple new smart features, which work by estimating where the editor wants to add an edit or transition and then applying it without the editor having to waste time placing exact in and out points. The software guesses what the editor wants to do and just does it — it adds the insert edit or transition to the edit point closest to where the editor has placed the CTI.

But a problem can arise in complex edits, where it is hard to know what the software would do and which edit it would place the effect or clip into. That’s the reason for the beta version’s new smart indicator. The smart indicator provides a small marker in the timeline so users get constant feedback and always know where DaVinci Resolve 16.1 will place edits and transitions. The new smart indicator constantly live-updates as the editor moves around the timeline.

One of the most common items requested by users was a faster way to cut clips in the timeline, so now DaVinci Resolve 16.1 includes a “cut clip” icon in the user interface. Clicking on it will slice the clips in the timeline at the CTI point.

Multiple changes have also been made to the new DaVinci Resolve Editor Keyboard, including a new adaptive scroll feature on the search dial, which will automatically slow down scrolling when editors are hunting for an in point. The live trimming buttons have been renamed to the same labels as the functions in the edit page, and they have been changed to trim in, trim out, transition duration, slip in and slip out. The function keys along the top of the keyboard are now being used for various editing functions.

There are additional edit modes on the function keys, allowing users to access more types of editing directly from dedicated keys on the keyboard. There's also a new transition window on the F4 key, and pressing and rotating the search dial allows instant selection from all the transition types in DaVinci Resolve. Users who need quick picture-in-picture effects can use F5 and apply them instantly.

Sometimes when editing projects with tight deadlines, there is little time to keep replaying the edit to see where it drags. DaVinci Resolve 16.1 features something called a Boring Detector that highlights the timeline where any shot is too long and might be boring for viewers. The Boring Detector can also show jump cuts, where shots are too short. This tool allows editors to reconsider their edits and make changes. The Boring Detector is helpful when using the source tape. In that case, editors can perform many edits without playing the timeline, so the Boring Detector serves as an alternative live source of feedback.
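Blackmagic hasn't published how the Boring Detector makes its calls, but the underlying idea, flagging shots whose duration falls outside a window, is easy to picture. A toy sketch in Python with made-up thresholds, purely to illustrate the concept rather than Resolve's actual heuristics:

```python
# Toy "boring detector": flag timeline clips whose duration falls outside
# a chosen window. Thresholds are arbitrary illustrations.
def flag_cuts(clip_durations_sec, too_short=0.5, too_long=10.0):
    """Return (index, duration, label) for clips worth a second look."""
    flags = []
    for i, dur in enumerate(clip_durations_sec):
        if dur < too_short:
            flags.append((i, dur, "possible jump cut"))
        elif dur > too_long:
            flags.append((i, dur, "possibly boring"))
    return flags

timeline = [3.2, 0.3, 14.8, 5.0, 22.1]   # hypothetical clip lengths in seconds
for idx, dur, label in flag_cuts(timeline):
    print(f"clip {idx}: {dur}s -> {label}")
```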

Another one of the most requested features of DaVinci Resolve 16.1 is the new sync bin. The sync bin is a digital assistant editor that constantly sorts through thousands of clips to find only what the editor needs and then displays them synced to the point in the timeline the editor is on. The sync bin will show the clips from all cameras on a shoot stacked by camera number. Also, the viewer transforms into a multi-viewer so users can see their options for clips that sync to the shot in the timeline. The sync bin uses date and timecode to find and sync clips, and by using metadata and locking cameras to time of day, users can save time in the edit.
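
Blackmagic doesn't detail the matching logic, but the described behavior, using time-of-day timecode to find, for each camera, the clip that covers the current timeline position, can be sketched as follows. The clip records and times below are invented for illustration:

    # Sketch: for a given playhead position (seconds of time-of-day),
    # find the clip from each camera whose record span covers it.
    clips = [
        {"camera": 1, "start": 3600.0, "end": 3720.0, "name": "A001"},
        {"camera": 2, "start": 3590.0, "end": 3715.0, "name": "B001"},
        {"camera": 2, "start": 3716.0, "end": 3800.0, "name": "B002"},
    ]

    def sync_bin(clips, playhead):
        matches = {}
        for clip in clips:
            if clip["start"] <= playhead <= clip["end"]:
                matches[clip["camera"]] = clip["name"]
        return dict(sorted(matches.items()))  # stacked by camera number

    print(sync_bin(clips, 3700.0))  # {1: 'A001', 2: 'B001'}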

According to Blackmagic, the sync bin changes how multi-camera editing can be completed. Editors can scroll off the end of the timeline and keep adding shots. When using the DaVinci Resolve Editor Keyboard, editors can hold the camera number and rotate the search dial to “live overwrite” the clip into the timeline, making editing faster.

The closeup edit feature has been enhanced in DaVinci Resolve 16.1. It now does face detection and analysis and will zoom the shot based on face positioning to ensure the person is nicely framed.

If pros are using shots from cameras without timecode, the new sync window lets them sort and sync clips from multiple cameras. The sync window supports sync by timecode and can also detect audio and sync clips by sound. These clips will display a sync icon in the media pool so editors can tell which clips are synced and ready for use. Manually syncing clips using the new sync window allows workflows such as multiple action cameras to use new features such as source overwrite editing and the new sync bin.
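
Blackmagic doesn't say which algorithm the sync window uses for sound sync, but the textbook approach is cross-correlation: slide one waveform against the other and take the offset where they line up best. A NumPy sketch, assuming both clips share a sample rate (this is an illustration, not Resolve's code):

    import numpy as np

    # Estimate the offset (in samples) between two audio tracks.
    # A positive result means the first track starts 'lag' samples later.
    def estimate_offset(track_a, track_b):
        corr = np.correlate(track_a, track_b, mode="full")
        return int(np.argmax(corr)) - (len(track_b) - 1)

    rng = np.random.default_rng(0)
    cam_a = rng.standard_normal(4800)               # 0.1s of audio at 48kHz
    cam_b = np.concatenate([np.zeros(120), cam_a])  # same audio, 2.5ms late
    print(estimate_offset(cam_b, cam_a))            # 120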

Blackmagic Pocket Cinema Camera
Besides releasing the DaVinci Resolve 16.1 public beta, Blackmagic also updated the Blackmagic Pocket Cinema Camera. Blackmagic not only upgraded the camera from 4K to 6K resolution, but it also changed the mount to the widely used Canon EF style. Previous iterations of the Pocket Cinema Camera used a Micro 4/3 mount, and many users chose to purchase a Micro 4/3-to-Canon EF adapter, which easily runs over $500 new. Because of the mount change in the Pocket Cinema Camera 6K, users can avoid buying the adapter and — if they shoot with Canon EF — can use the same lenses.

Speed controls now available in Premiere Rush V.1.2

Adobe has added a new panel in Premiere Rush called Speed, which allows users to manipulate the speed of their footage while maintaining control over the audio pitch, range, ramp speed and duration of the edited clip. Adobe’s Premiere Rush teams say speed control has been the most requested feature by users.

Basic speed adjustments: A clip’s speed is displayed as a percentage value, with 100% being realtime. Values below 100% result in slow motion, and values above 100% create fast motion. To adjust the speed, users simply open the speed panel, select “Range Speed” and drag the slider. Or they can tap on the speed percentage next to the slider and enter a specific value.

Speed ranges: Speed ranges allow users to adjust the speed within a specific section of a clip. To create a range, users drag the blue handles on the clip in the timeline or in the speed panel under “Range.” The speed outside the range is 100%, while speed inside the range is adjustable.

Ramping: Rush’s adjustable speed ramps make it possible to progressively speed up or slow down into or out of a range. Ramping helps smooth out speed changes that might otherwise seem jarring.

Duration adjustments: For precise control, users can manually set a clip’s duration. After setting the duration, Rush will do the math and adjust the clip speed to the appropriate value — a feature that is especially useful for time lapses.
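
Adobe doesn't spell out the formula, but the arithmetic Rush is presumably doing here is straightforward: the new duration is the source duration divided by the speed expressed as a fraction, and the speed needed to hit a target duration is source over target. A quick sketch with made-up example values:

    # Sketch of the speed/duration relationship (example values only).
    def new_duration(source_seconds, speed_percent):
        return source_seconds / (speed_percent / 100.0)

    def speed_for_duration(source_seconds, target_seconds):
        return 100.0 * source_seconds / target_seconds

    print(new_duration(10.0, 50.0))       # 20.0 seconds (slow motion)
    print(speed_for_duration(60.0, 6.0))  # 1000.0 percent (time lapse)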

Maintain Pitch: Typically, speeding up footage will raise the audio’s pitch (think mouse voice), while slowing down footage will lower it (think deep robot voice). Maintain Pitch in the speed panel takes care of the problem by preserving the original pitch of the audio at any speed.
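
The mouse-voice effect has a simple arithmetic basis: playing audio at a rate r scales every frequency by r, which in musical terms is a shift of 12 x log2(r) semitones, and Maintain Pitch compensates for exactly that shift. A small sketch of the math (not Adobe's code):

    import math

    # Semitone shift caused by a naive speed change: 12 * log2(rate).
    def pitch_shift_semitones(speed_percent):
        return 12.0 * math.log2(speed_percent / 100.0)

    print(pitch_shift_semitones(200.0))  # 12.0 (up an octave)
    print(pitch_shift_semitones(50.0))   # -12.0 (down an octave)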

As with everything in Rush, speed adjustments will transfer seamlessly when opening a Rush project in Premiere Pro.

Bluefish444 adds edit-while-record, REST API to IngeSTore

Bluefish444, makers of uncompressed UltraHD SDI, ASI, video over IP and HDMI I/O cards, and mini converters, has released IngeSTore version 1.1.2. This free update of IngeSTore adds support for new codecs, edit-while-record workflows and a REST API.

Bluefish444 developed IngeSTore software as a complementary multi-channel ingest tool enabling Bluefish444 hardware to capture multiple independent format SDI sources simultaneously.

In IngeSTore 1.1.2, Bluefish444 has expanded codec support to include the popular formats OP1A MPEG-2 and DNxHD within the BlueCodecPack license. Edit-while-record workflows are supported through both industry standard growing files and through Bluefish444’s BlueRT plug-in for Adobe Premiere Pro and Avid Media Composer. BlueRT allows Adobe and Avid NLEs to access media files as they are still being recorded by IngeSTore multi-channel capture software, increasing production efficiency via immediate access to recorded media during live workflows.
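
Bluefish444's announcement doesn't document the REST API's endpoints, so the host, routes and JSON fields below are purely hypothetical, sketched only to show what driving an ingest tool over REST typically looks like; consult the IngeSTore documentation for the real interface:

    import requests  # third-party HTTP library

    # Hypothetical REST calls to an ingest server. The base URL, routes
    # and fields are invented for illustration; they are NOT IngeSTore's
    # documented API.
    BASE = "http://ingest-host:8080/api"

    start = requests.post(f"{BASE}/channels/1/record",
                          json={"codec": "DNxHD", "container": "OP1a"})
    start.raise_for_status()

    status = requests.get(f"{BASE}/channels/1/status").json()
    print(status)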

Masv now integrates with Premiere for fast file delivery

Masv, which sends large video files via the cloud, is offering a new extension for Adobe Premiere. The extension simplifies the delivery of data-heavy video projects by embedding Masv’s accelerated cloud transfer technology directly within the NLE.

The new extension is available for free at www.massive.io/premiere or the Adobe Exchange.

The new Masv Panel for Premiere reliably renders, uploads and sends large (20GB and higher) files that are typically too big for conventional cloud transfer services. Masv sends files over a high-performance global network, exploiting users’ maximum Internet bandwidth.

“Today’s video professionals are increasingly independent and distributed globally. They need to deliver huge projects faster, often from home studios or remote locations, while collaborating with teams that can change from project to project,” says Dave Horne. “This new production paradigm has broken traditional transfer methods, namely the shipping of hard drives and use of expensive on-prem transfer tools.

“By bringing MASV directly inside Premiere Pro, now even the largest Premiere project can be delivered via Cloud, streamlining the export process and tightly integrating digital project delivery within editors’ workflows.”

Key Features:
• The new Masv extension installs in a dockable panel, integrating perfectly into Premiere Pro CC 2018 and higher (MacOS/Windows)
• Users can upload full projects, project sequences, files and folders from within Premiere Pro. The Masv Panel retries aggressively, sending files successfully even in poor networking conditions (see the sketch after this list).
• Users can render projects to any Adobe Media Encoder export preset and then send. Favorite export formats can be stored for quick use on future uploads.
• When exporting to Media Encoder, users can choose to automatically upload and send after rendering. Alternatively, they can opt to review the export before uploading.
• Users can monitor export and transfer progress, plus upload performance stats, in realtime.
• Users can distribute transfer notifications via email and Slack.
• Deliveries are secured by adding a password. Transfers are fully encrypted at rest and in flight and comply with GDPR standards.
• Users can set storage duration based on project requirements. Set a nearer delete date for maximum security or longer for convenience.
• Users can set download limits to protect sensitive content and manage transfer costs.
• Users can send files from Premiere and then use the Masv Web Application to review delivery status, delivery costs and manage active deliveries easily.
• Users can send terabytes of data at very fast speeds without having to manage storage or deal with file size limits.
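
Masv hasn't described its retry strategy beyond "aggressively," but resilient uploaders commonly wrap each transfer step in retry-with-backoff logic along these lines. This is a generic sketch; the function names and parameters are mine, not Masv's implementation:

    import random
    import time

    # Generic retry-with-backoff wrapper for a flaky network operation.
    # upload_chunk is a caller-supplied stand-in for one upload step.
    def upload_with_retries(upload_chunk, max_attempts=8):
        for attempt in range(1, max_attempts + 1):
            try:
                return upload_chunk()
            except ConnectionError:
                if attempt == max_attempts:
                    raise
                # Exponential backoff with jitter, capped at 30 seconds.
                time.sleep(min(30.0, 2 ** attempt + random.random()))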

Masv launched a new version of the service in February, followed by a chain of significant product updates.

Review: FXhome’s HitFilm Pro 12 for editing, compositing, VFX

By Brady Betzel

If you have ever worked in Adobe Premiere Pro, Apple FCP X or Avid Media Composer and wished you could just flip a tab and be inside After Effects, with access to 3D objects directly in your timeline, you are going to want to take a look at FXhome’s HitFilm Pro 12.

Similar to how Blackmagic brought Fusion inside of its most recent versions of DaVinci Resolve, HitFilm Pro offers a nonlinear editor, a composite/VFX suite and a finishing suite combined into one piece of software. Haven’t heard about HitFilm yet? Let me help fill in some blanks.

Editing and 3D model Import

What is HitFilm Pro 12?
Technically, HitFilm Pro 12 is a non-subscription-based nonlinear editor, compositor and VFX suite that costs $299. Not only does that price include 12 months of updates and tech support, but one license can be used on up to three computers simultaneously. In my eyes, HitFilm Pro is a great tool set for independent filmmakers, social media content generators and any editor who goes beyond editing and dives into topics like 3D modeling, tracking, keying, etc. without having to necessarily fork over money for a bunch of expensive third-party plugins. That doesn’t mean you won’t want to buy third-party plugins, but you are less likely to need them with HitFilm’s expansive list of native features and tools.

At my day job, I use Premiere, After Effects, Media Composer and Resolve. I often come home and want to work in something that has everything inside, and that is where HitFilm Pro 12 lives. Not only does it have the professional functionality that I am used to, such as trimming, color scopes and more, but it also has BorisFX’s Mocha planar tracking plugin built in for no extra cost. This is something I use constantly and love.

One of the most interesting and recent updates to HitFilm Pro 12 is the ability to use After Effects plugins. Not all plugins will work since there are so many, but in a video released after NAB 2019, HitFilm said plugins like Andrew Kramer’s Video CoPilot Element3D and ones from Red Giant are on the horizon. If you are within your support window, or you continue to purchase HitFilm, FXhome will work with you to get your favorite After Effects plugins working directly inside of HitFilm.

Timeline and 3D model editor

Additional updates to HitFilm Pro 12 include a completely redesigned user interface that resembles Premiere Pro… kind of. Threaded rendering has also been added, so Windows users with Intel and Nvidia hardware will see increased GPU speeds, along with the ability to add titles directly in the editor and more.

The Review
So how does HitFilm Pro 12 compare to today's modern software packages? That is an interesting question. I have become more and more of a Resolve convert over the past two years, so I am constantly comparing everything to that. In addition, being an Avid user for over 15 years, I am used to a rock-solid NLE with only a few hiccups here and there. In my opinion, HitFilm Pro 12 lands itself right where Premiere and FCP X live.

It feels prosumer-y, in a YouTuber or content-generator capacity. Would it stand up to 10 hours of abuse with content over 45 minutes? It probably would, but much like with Premiere, I would probably split my edits into scenes or acts to avoid slowdowns, especially when importing things like OBJ files or composites.

The nonlinear editor portion feels like Premiere and FCP X had a baby, but left out FCP X’s Magnetic Timeline feature. The trimming in the timeline feels smooth, and after about 20 minutes of getting comfortable with it I felt like it was what I am generally used to. Cutting in footage feels good using three-point edits or simply dragging and dropping. Using effects feels very similar to the Adobe world, where you can stack them on top of clips and they each affect each other from the top down.

Mocha within HitFilm Pro

Where HitFilm Pro 12 shines is in the inclusion of typically third-party plugins directly in the timeline. From the ability to create a scene with 3D cameras and particle generators to being able to track using BorisFX's Mocha, HitFilm Pro 12 has many features that will help take your project to the next level. With HitFilm Pro 12's true 3D cameras, you can take flat text and enhance it with raytraced lighting, shadows and even textures. You can even use the included BorisFX Continuum 3D Objects to make great titles relatively easily. To take it a step further, you can track and animate them.

Color Tools
By day, I am an online editor/colorist who deals with the finishing aspect of media creation. Throughout the process, from color correction to exporting files, I need tools that are not only efficient but accurate. When I started to dig into the color correction side of HitFilm Pro 12, things slowed down for me. The color correction tools are very close to what you’ll find in other NLEs, like Premiere and FCP X, but they don’t quite rise to the level of Resolve. HitFilm Pro 12 does operate inside of a 32-bit color pipeline, which really helps avoid banding and other errors when color correcting. However, I didn’t feel that the toolset was making me more efficient; in fact, it was the opposite. I felt like I had to learn FXhome’s way of doing it. It wasn’t that it totally slowed me down, but I felt it could be better.

Color

Summing Up
In the end, HitFilm Pro 12 will fill a lot of holes for individual content creators. If you love learning new things (like I do), then HitFilm Pro 12 will be a good investment of your time. In fact, FXhome posts tons of video tutorials on all sorts of good and topical stuff, like how to create a Stranger Things intro title.

If you are a little more inclined to work with a layer-based workflow, like in After Effects, then HitFilm Pro 12 is the app you'll want to learn. Check out HitFilm Pro 12 on FXhome's website and definitely watch some of the company's informative tutorials.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Biff Butler joins Work as editor, creative partner

Work Editorial has added Biff Butler to its roster as editor and creative partner. Currently based in Los Angeles, Butler will be available to work in all of the company’s locations, which also include New York and London.

Originally from the UK, Butler moved to Los Angeles in 1999 to pursue a career as a musician, releasing albums and touring the country. Inspired by the title sequence for the movie Se7en, cut by Angus Wall at Rock Paper Scissors (RPS), he found himself intrigued by the craft of editing. Following the breakup of his band in 2005, Butler got a job at RPS and, as he puts it, RPS was his film school. There he found his editorial voice and satiated another interest — advertising.

(Check out an interview we did with Butler recently while he was still at RPS.)

Within a couple of years, he was cutting spots for Nike, Microsoft, Lexus and Adidas, and in 2008 he made a breakthrough with the Emmy Award-winning will.i.am "Yes We Can" video for then-presidential candidate Barack Obama. By 2012 his clientele spanned both coasts, and after moving to New York, he went on to collaborate on some of the era-defining work coming out of the US at the time, with Wieden+Kennedy NY, Anomaly and BBDO, among others. Perhaps most notable was his involvement in the Derek Cianfrance/Dick's Sporting Goods campaign that defined a style in sports commercials.

“I’ve always had an interest in advertising and the process,” says Butler. “I love watching a campaign roll out, seeing it permeate the culture. I still get such a kick out of coming out of the subway and seeing a huge poster from something I’ve been involved with.”

Butler has been recognized for his work, winning numerous AICE, Clio and Cannes Lion awards, as well as receiving an Emmy for the six-part documentary Long Live Benjamin, which he edited and co-directed with creative director Jimm Lasser.

Work founding partner Jane Dilworth says, "I have always been aware of Biff and the great work he does. He is an editor with great feeling and instinct who works not only for the director or creative, but for what is right for the job."

Review: Western Digital’s Blue SN500 NVMe SSD

By Brady Betzel

Since we began the transfer of power from the old standard SATA 3.5-inch hard drives to SSDs, multimedia-based computer users have seen a dramatic uptick in read and write speeds. The only issue has been price. You can still find a 3.5-inch brick drive, ranging in size from 2TB to 4TB, for under $200 (maybe closer to $100), but if you upgraded to an SSD over the past five years, you were looking at a huge jump in price: hundreds, if not thousands, of dollars. These days you are looking at just a couple hundred dollars for 1TB and even less for a 256GB or 512GB SSD.

Western Digital hopes you’ll think of NVMe SSD drives as more of an automatic purchase than a luxury with the Western Digital Blue SN500 NVMe M.2 2280 line of SSD drives.

Before you get started, you will need a somewhat modern computer with an NVMe M.2-compatible motherboard (also referred to as a PCIe Gen 3 interface). This NVMe SSD is a "B+M key" configuration, so you will need to make sure your slot is compatible. Once you confirm that your motherboard is compatible, you can start shopping around. The Western Digital Blue series has always been the budget-friendly level of hard drives, and Western Digital also offers the next level up: the Black series. In terms of NVMe SSD M.2 drives, the Blue series drives are budget-friendly, but they also use two fewer PCIe lanes, which results in slower read/write speeds. The Black series uses up to four PCIe lanes and adds a heat sink to dissipate heat. But for this review, I am focusing on the Blue series and how it performs.

On paper, the Western Digital Blue SN500 NVMe SSD is available in either 250GB or 500GB sizes, measures approximately 80mm long and uses the M.2 2280 form factor for the PCIe Gen 3 interface in up to two lanes. Technically, the 500GB drive can achieve up to 1,700MB/s read and 1,450MB/s write speeds, and the 250GB can achieve up to 1,700MB/s read and 1,300MB/s write speeds.

As of this review, the 250GB version sells for $53.99, while the 500GB version sells for $75.99. You can find specs on the Western Digital website and learn more about the Black series as well.

One of the coolest things about these NVMe drives is that they come standard with a five-year limited warranty (or until the drive hits its max endurance limit). The max endurance (aka TBW — terabytes written) for the 250GB SSD is 150TB, while the max endurance for the 500GB version is 300TB. Both versions have an MTTF (mean time to failure) of 1.75 million hours.
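
To put those endurance ratings in perspective, spreading the rated writes evenly across the five-year warranty gives a generous daily budget. The back-of-the-envelope math:

    # Daily write budget implied by the TBW ratings over a 5-year warranty.
    for capacity, tbw in (("250GB", 150), ("500GB", 300)):
        per_day = tbw * 1000 / (5 * 365)  # TB to GB, spread over 5 years
        print(f"{capacity}: ~{per_day:.0f} GB/day")  # ~82 and ~164 GB/day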

In addition, the drive uses an in-house controller and 3D NAND logic. Those words might sound like nonsense, but the in-house controller is what tells the NVMe what to do and when to do it (it's essentially a dedicated processor), while 3D NAND is a way of cramming more memory into smaller spaces. Instead of adding more memory on the same plane along the x- or y-axis, manufacturers achieve more storage space by stacking layers vertically on top of one another — on the z-axis.

Testing Read and Write Speeds
Keep in mind that I ran these tests on a Windows-based PC. Doing a straight file transfer, I was getting about 1GB/s. When using Crystal Disk Mark, I would get a burst of speed at the top, slow down a little and then mellow out. Using a 4GB sample, my speeds were:
"Seq Q32T1" – Read: 1,749.5MB/s – Write: 1,456.6MB/s
"4KiB Q8T8" – Read: 1,020.4MB/s – Write: 1,039.9MB/s
"4KiB Q32T1" – Read: 732.5MB/s – Write: 676.5MB/s
"4KiB Q1T1" – Read: 35.77MB/s – Write: 185.5MB/s

If you would like to read exactly what these types of tests entail, check out the Crystal Disk Mark info page.

In the AJA System Test I had a little drop-off, but with a 4GB test file size, I got an initial read speed of 1,457MB/s and a write speed of 1,210MB/s, which falls more in line with what Western Digital is touting. The second time I ran the AJA System Test, I got a read speed of 1,458MB/s and a write speed of 883MB/s.

I wanted a third opinion, so I ran the Blackmagic Design Disk Speed Test (you'll have to install drivers for a Blackmagic card, like the UltraStudio 4K). On my first run, I got a read speed of 1,359.6MB/s and a write speed of 1,305.8MB/s. On my second run, I got a read speed of 1,340.5MB/s and a write speed of 968.3MB/s. My read numbers were generally above 1,300MB/s, and my write numbers varied between 800MB/s and 1,000MB/s. Not terrible for a sub-$100 drive.

Summing Up
In the end, the Western Digital Blue SN500 NVMe SSD is an amazing value at under $100, and hopefully we will get expanded sizes in the future. The drive is a B+M key configuration, so when you are looking at compatibility, make sure to check which key your PCIe card, external drive case or motherboard supports. It is typically M or B+M key, but I found a PCI card that supported both. If you need more space and speed than the WD Blue series can offer, check out Western Digital’s Black series of NVMe SSDs.

For the Black series, the sticker price starts to go up significantly when you hit the 1TB or 2TB marks — $279.99 and $529.99, respectively (with the heat sink attachment). If you stick with the 500GB Blue, you are looking at a much more modest price tag.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Quick Chat: Robert Ryang on editing Netflix’s Zion doc

Back in May, Cut+Run’s Robert Ryang took home a Sports Emmy in the Outstanding Editing category for the short film Zion. The documentary, which premiered at Sundance and was released on Netflix, tells the story of Zion Clark, a young man who was born without legs, grew up in foster care and found community and hope in wrestling.

Robert Ryang and his Emmy for his work on Zion.

Clark began wrestling in second grade against his able-bodied peers. The physical challenge became a therapeutic outlet and gave him a sense of family. As he moved from foster home to foster home, wrestling became the one constant in his childhood.

Editor Ryang and Zion’s director, Floyd Russ, had worked together previously — on the Ad Council’s Fans of Love and SK-II’s Marriage Market Takeover, among other projects — and developed a creative shorthand that helped tell this compelling, feel-good story.

We spoke with Ryang about the film, his process and working with the director.

How and when did you become involved in this project?
In the spring of 2017, my good friend director Floyd Russ asked me to edit his passion project. Initially, I was hesitant, since it was just after the birth of my second child. Two years later, both the film and my kid have turned out great.

You’ve worked with him before. What defines the way you work together?
I think Floyd and I work really well together because we’re such good friends; we don’t have to be polite. He’ll text me ideas any time of day, and I feel comfortable enough to tell him if I don’t like something. He wins most of the fights, but I think this dialectic probably makes the work better.

How did you approach the edit on the film? How did you hone the story structure?
At first, Floyd had a basic outline that I followed just to get something on the timeline. But from there, it was a pretty intense process of shuffling and reshaping. At one point, we tried to map the beats onto a whiteboard, and it looked like a Richter scale. Editor Adam Bazadona helped cut some of these iterations while I was on paternity leave.

How does working on a short film like this differ — hats worn, people involved, etc. — from advertising projects?
The editing process was a lot different from most commercial projects in that it was only Floyd and me in the room. Friends floated a few thoughts here and there, but we were only working toward a director’s cut.

What tools did you use?
Avid Media Composer for editing, some Adobe After Effects for rough comps.

What are the biggest creative and technical challenges you faced in the process?
With docs, there are usually infinite ways to put it together, so we did a lot of exploration. Floyd definitely pushed me out of my comfort zone in prescribing the more abstract scenes, but I think those touches ultimately made the film stand out.

From Sundance, to Netflix, to Sports Emmy awards. Did you ever imagine it would take this journey?
There wasn’t much precedent for a studio or network acquiring a 10-minute short, so our biggest hope was that it would get into Sundance and then live on Vimeo. It really exceeded everyone’s expectations. I never would have imagined receiving an Emmy, but I am really honored I did.

Review: The Loupedeck+ editing console for stills and video

By Brady Betzel

As an online editor I am often tasked with wearing multiple job hats, including VFX artist, compositor, offline editor, audio editor and colorist, which requires me to use special color correction panel hardware. I really love photography and cinematography but have never been able to use the color correction hardware I’m used to in Adobe’s Photoshop or Lightroom, so for the most part I’ve only done basic photo color correction.

You could call it a hobby, although this knowledge definitely helps many aspects of my job. I’ve known Photoshop for years and use it for things like building clean plates to use in apps like Boris FX Mocha Pro and After Effects, but I had never really mastered Lightroom. However, that changed when I saw the Loupedeck. I was really intrigued with its unique layout but soon dismissed it since it didn’t work on video… until now. I’m happy to say the new Loupedeck+ works with both photo and video apps.

Much like the Tangent Element and Wave or the Blackmagic Micro and Mini panels, the Loupedeck+ is made to adjust parameters like contrast, exposure, saturation, highlights, shadows and individual colors. But unlike the Tangent or Blackmagic products, the Loupedeck+ functions not only in Adobe Premiere and Apple Final Cut Pro X, but also in image editing apps like Lightroom 6, Photoshop CC and Skylum Aurora HDR; the audio editing app Adobe Audition; and the VFX app Adobe After Effects. There’s also beta integration with Capture One.

It works via a USB 2.0 connection on Windows 10 and MacOS 10.12 or later. In order to use the panel and adjust its keys, you must also download the Loupedeck software, which you can find here. The Loupedeck+ costs just $249, which is significantly less than many of the other color correction panels on the market that offer this many functions.

Digging In
In this review, I am going to focus on Loupedeck+’s functionality with Premiere, but keep in mind that half of what makes this panel interesting is that you can jump into Lightroom Classic or Photoshop and have the same, if not more, functionality. Once you install the Loupedeck software, you should restart your system. When I installed the software I had some weird issues until I restarted.

When inside of Premiere, you will need to tell the app that you are using this specific control panel by going to Edit > Preferences > Control Surface, clicking “Add” and selecting Loupedeck 2. This is on a PC, but MacOS works in a similar way. From there you are ready to use the Loupedeck+. If you have any customized keyboard shortcuts (like I do), I would suggest setting your keyboard shortcuts back to default for the time being, since they might cause the Loupedeck+ to send different keypresses than you originally intended.

Once I got inside of Premiere, I immediately opened up the Lumetri color panels and began adjusting contrast, exposure and saturation, which are all clearly labeled on the Loupedeck+. Easy enough, but what if you want to use the Loupedeck+ as an editing panel as well as a basic color correction console? That’s when you will want to print out pages six through nine of the Premiere Pro Loupedeck+ manual, which you can find here. (If you like to read on a tablet you could pull that up there, but I like paper for some reason… sorry trees.) In these pages, you will see that there are four layers of controls built into the Loupedeck+.

Shortcuts
Not only can you advance frames using the arrow keypad, jump to different edit points with the jog dial, change LUTs, add keyframes and extend edits, you also have three more layers of shortcuts. To get to the second layer of shortcuts, press the “Fn” button located toward the lower left, and the Fn layer will appear. Here you can do things like adjust the shadows and midtones on the X and Y axes, access the Type Tool or add edits to all tracks. To go even further, you can access the “Custom” mode, which has defaults but can be customized to whichever keypress and functions the Loupedeck+ app allows.

Finally, while in the Custom mode, you can press the Fn button again and enter “Custom Fn” mode — the fourth and final layer of shortcuts. Man, that is a lot of customizable buttons. Do I need all those buttons? Probably not, but still, they are there — and it’s better to have too much than not enough, right?

Beyond the hundreds of shortcuts on the Loupedeck+ console, you have eight color-specific scroll wheels. In Lightroom Classic, these tools are self-explanatory, as they adjust each color’s intensity.

In Premiere they work a little differently. To the left of the color scroll wheels are three buttons: hue, saturation and luminance (Hue, Sat and Lum, respectively). In the standard mode, they each equate to a different color wheel: Hue = highlights, Sat = midtones and Lum = shadows. The scroll wheel above red will adjust the selected color wheel’s up/down movement on the y-axis, orange will adjust the left/right movement on the x-axis, and yellow will adjust the intensity (or luminance) of the color wheel.

Controlling the Panel
In traditional color correction panels, color correction is controlled by roller balls surrounded by a literal wheel to control intensity. It’s another way to skin a cat. I personally love the feel of the Tangent Element Tk panel, which simply has three roller balls and rings to adjust the hue, but some people might like the ability to precisely control the color wheels on the x- and y-axes.

To solve my issue, I used both. In the preferences, I enabled both Tangent and Loupedeck options. It worked perfectly (once I restarted)! I just couldn’t get past the lack of hue balls and rings in the Loupedeck, but I really love the rest of the knobs and buttons. So in a weird hodge-podge, you can combine a couple of panels to get a more “affordable” set of correction panels. I say affordable in quotes because, as of this review, the Tangent Element Tk panels are over $1,100 for one panel, while the entire set is over $3,000.

So if you already have the Tangent Element Tk panel but want a more natural button and knob layout, the Loupedeck+ is a phenomenal addition, as long as you are staying within the Adobe or FCP X world. And while I clearly like the Tangent Elements panels, I think the layout and design of the Loupedeck+ is more efficient and more modern.

Summing Up
In the end, I really like the Loupedeck+. I love being able to jump back and forth between photo and video apps seamlessly with one panel. What I think I love the most is the “Export” button in the upper right corner of the Loupedeck+. I wish that button existed on all panels.

When using the Loupedeck+, you can really get your creative juices flowing by hitting the “Full Screen” button and color correcting away, even using multiple adjustments at once to achieve your desired look — similar to how a lot of people use other color correction panels. And at $249, the Loupedeck+ might be the overall best value for the functionality of any editing/color correction panel currently out there.

Can I see using it when editing? I can, but I am such a diehard keyboard and Wacom tablet user that I have a hard time using a panel for editing functions like trimming and three-point edits. I did try the trimming functionality, and it was great, not only on a higher-end Intel Xeon-based system but also on an older Windows laptop. The responsiveness was pretty impressive, and I am a sucker for adjustments using dials, sliders and roller balls.

If you want to color correct using panels, I think the Loupedeck+ is going to fit the bill for you if you work in Adobe Creative Suite or FCP X. If you are a seasoned colorist, you will probably start to freak out at the lack of rollerballs to adjust hues of shadows, midtones and highlights. But if you are a power user who stays inside the Adobe Creative Cloud ecosystem, there really isn’t a better panel for you. Just print up the shortcut pages of the manual and tape them to the wall by your monitor for constant reference.

As with anything, you will only get faster with repetition. Not only did I test out color correcting footage for this review, I also used the Loupedeck+ in Adobe Lightroom Classic to correct my images!


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.