Category Archives: Editing

Apple updates FCPX, Motion and Compressor

By Brady Betzel

While Apple was busy releasing the new Mac mini last month, it was also quietly prepping some new updates. Apple has released free updates to FCPX, Motion and Compressor.

CatDV inside of FCPX

The FCPX 10.4.4 update includes Workflow Extensions, batch sharing, a Comparison Viewer, built-in video noise reduction, a timecode window and more. The Workflow Extensions are sure to take the bulk of the update cake: At launch, Apple announced that Shutterstock, Frame.io and CatDV will have extensions usable directly inside of FCPX instead of through a web browser. Frame.io looks to be the most interesting extension, with realtime display of who is watching your video and what timecode they are at, a.k.a. “Presence.”

Because Frame.io has been rebuilt from the ground up using Swift, its venture inside of FCPX should be extremely streamlined and fast. Internet bandwidth notwithstanding, Frame.io inside of FCPX looks to be the go-to approval system for FCPX editors. I am not quite sure why Apple didn’t create its own approval and note-taking system, but it didn’t go wrong working with Frame.io. Since many editors already use Frame.io as their main approval system, FCPX users will surely love this implementation directly inside of the app.

When doing color correction, it is essential to compare your current work with other images or with the source image, and luckily for FCPX colorists you can now do this with the all-new Comparison Viewer. Essentially, the Comparison Viewer allows you to compare anything to the clip you are color grading.

One feature I really love is that you have access to scopes on both the reference image and your working image. If you understand how scopes work, color matching via the parade or waveform can often be quicker than matching by eye.

Frame.io inside of FCPX

Final Cut Pro 10.4.4 has a few other updates: Batch Share, which allows you to queue a bunch of exports or projects in one step; the Timecode Window (a “why wasn’t this there already” feature), which is essential when editing video footage; and video noise reduction, now built in with adjustable amount and sharpness. There are also additions like Tiny Planet, which allows you to quickly create that spherical 360-degree video look. It’s not really an important technical update, but it’s fun nonetheless.

Motion
With version 5.4.2, Apple has put the advanced color correction toolset from FCPX directly inside of Motion. In addition, you can now add custom LUTs to your work. Apple has also added the Tiny Planet effect as well as a Comic filter inside Motion. Those aren’t incredibly impressive, but the color correction toolset is an essential addition to Motion and will get a lot of use.

Compressor
Compressor 4.4.2 is, in my opinion, the sleeper update. Apple has finally moved Compressor to a 64-bit engine that can take advantage of all of your memory, and has improved overall performance with huge files. It will still work with legacy 32-bit formats. Closed captions, including the SRT format, can now be burned into a video. Compressor has also added automatic configuration, applying the correct frame rate, field order and color space to your MXF and QuickTime outputs.

The FCPX, Motion and Compressor updates are available now for free if you have previously purchased the apps. If not, FCPX retails for $299.99, while Motion and Compressor are $49.99 each.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

The Girl in the Spider’s Web: immersive audio and picture editing

By Mel Lambert

Key members of the post crew responsible for the fast-paced look and feel of director Fede Alvarez’s new film, The Girl in the Spider’s Web, came to the project via a series of right time/right place situations. First, co-supervising sound editor Julian Slater (who played a big role in Baby Driver’s audio post) met picture editor Tatiana Riegel at last year’s ACE Awards.

During early 2018, Slater was approached to work on the latest adaptation of the crime novels by the Swedish author Stieg Larsson. Alvarez was impressed with Slater’s contribution to both Baby Driver and the Oscar-winning Mad Max: Fury Road (2015). “Fede told me that he uses the soundtrack to Mad Max to show off his home Atmos playback system,” says Slater, who served as sound designer on that film. “I was happy to learn that Tatiana had also been tagged to work on The Girl in the Spider’s Web.”

Back row (L-R): Micah Loken, Sang Kim, Mandell Winter, Dan Boccoli, Tatiana Riegel, Kevin O’Connell, Fede Alvarez, Julian Slater, Hamilton Sterling, Kyle Arzt, Del Spiva and Maarten Hofmeijer. Front row (L-R): Pablo Prietto, Lola Gutierrez, Mathew McGivney and Ben Sherman.

Slater, who would also be working on the crime drama Bad Times at the El Royale for director Drew Goddard, wanted Mandell Winter as his co-supervising sound editor. “I very much liked his work on The Equalizer 2, Death Wish and The Magnificent Seven, and I knew that we could co-supervise well together. I came on full time after completing El Royale.”

Editor Riegel (Gringo, I Tonya, Million Dollar Arm, Bad Words) was a fan of the original Stieg Larsson Millennium Series films —The Girl With the Dragon Tattoo, The Girl Who Kicked the Hornet’s Nest and The Girl Who Played with Fire — as well as David Fincher’s 2011 remake of The Girl With the Dragon Tattoo. She was already a fan of Alvarez, admiring his previous suspense film, Don’t Breathe, and told him she enjoyed working on different types of films to avoid being typecast. “We hit it off immediately,” says Riegel, who then got together with Julian Slater and Mandell Winter to discuss specifics.

The latest outing in the Stieg Larsson franchise, The Girl in the Spider’s Web: A New Dragon Tattoo Story, stars English actress Claire Foy (The Crown) in the eponymous role of a young computer hacker Lisbeth Salander who, along with journalist Mikael Blomkvist, gets caught up in a web of spies, cybercriminals and corrupt government officials. The screenplay was co-written by Jay Basu and Alvarez from the novel by David Lagercrantz. The cast also includes Sylvia Hoeks, Stephen Merchant and Lakeith Stanfield.

Having worked previously with Niels Arden Oplev, the Swedish director of 2009’s The Girl with the Dragon Tattoo, Winter knew the franchise and was interested in working on the newest offering. He was also excited about working with director Fede Alvarez. “I loved the use of color and lighting choices that Fede selected for Don’t Breathe, so when Julian Slater called I jumped at the opportunity. None of us had worked together before, and it was Fede’s first large-budget film, having previously specialized in independent offerings. I was eager to help shepherd the film’s immersive soundtrack through the intricate process from location to the dub stage.”

From the very outset, Slater argued for a native Dolby Atmos soundtrack, with a 7.1-channel Avid Pro Tools bed that evolved through editorial, with appropriate objects being assigned during re-recording to surround and overhead locations. “We knew that the film would be very atmospheric,” Slater recalls, “so we decided to use spaces and ambiences to develop a moody, noir thriller.”

The film was dubbed on the William Holden Stage at Sony Pictures Studios, with Kevin O’Connell handling dialog and music, and Slater overseeing sound effects elements.

Cutting Picture on Location
Editor Riegel and two assistants joined the project at its Berlin location last January. “It was a 10-month journey until final print mastering in mid-October,” she says. “We knew CGI elements would be added later. Fede didn’t do any previz, instead focusing on VFX during post production. We set up Avid Media Composers and assemble-edited the dailies as we went,” working against early storyboards. “Fede wanted to play up the film’s rogue theme; he had a very, very clear focus of the film as spectacle. He wanted us to stay true to the Lisbeth Salander character from the original films, yet retain that dark, Scandinavian feel from the previous outings. The film is a fun ride!”

The team returned to Los Angeles in April and turned the VFX over to Pixomondo, which was brought on to handle the greenscreen CGI sequences. “We adjourned to Pivotal Post in Burbank for the Director’s Cut and then to the Sony lot in Culver City for the first temp mix,” explains Riegel. “My editing decisions were based on the innate DNA of the shot material, and honoring the script. I asked Fede a lot of questions to ensure that the story and the pacing were crystal clear. Our first assembly was around two hours and 15 minutes, which we trimmed to just under two hours during a series of refinements. We then removed 15 minutes to reach our final 1:45 running time, which worked for all of us. The cut was better without the dropped section.”

Daniel Boccoli served as first assistant picture editor, Patrick Clancey was post finishing editor, Matthew McGivney was VFX editor and Andrew McGivney was VFX assistant editor.

Because Riegel likes to cut against an evolving soundtrack, she developed a temporary dialog track in her Avid workstation, adding sound effects taken from commercial libraries. “But there is a complex fight and chase sequence in the middle of the film that I turned over to Mandell and Julian early on so I could secure realistic effects elements to help inform the cut,” she explains. “Those early tracks were wonderful and gave me a better idea of what the final film would sound like. That way I can get to know the film better — I can also open up the cut to make space for a sound if it works within the film’s creative arcs.”

“Our overall direction from Fede Alvarez was to make the soundtrack feel cold when we were outside and to grab the audience with the action… while focusing on the story,” Winter explains. “We were also working against a very tight schedule and had little time for distractions. After the first temp, Julian and I got notes from Fede and Tatiana and set off using that feedback, which continued through three more temp mixes.”

Having completed supervising The Equalizer 2, Mandell came aboard full time in mid-June, with temp mixes running through the beginning of September. “We were finaling by the last week of September, ahead of the film’s World Premiere on October 19 at the International Rome Film Festival.”

Since there was no spotting session, the team was on a tight post schedule from day one, according to Slater. “There were a number of high-action scenes that needed intricate sound design, including the eight-minute sequence that begins with explosions in Lisbeth Salander’s apartment and the subsequent high-speed motorbike chase.”

Sound designer Hamilton Sterling crafted major sections of the film’s key fight and chase sequences.

Intricate Sound Design
“We liked Hamilton’s outstanding work on Independence Day: Resurgence and Logan and relied upon him to develop truly unique sounds for the industrial heating towers, motorbikes and fights,” says Winter. “Sound effects editor Ryan Collins cut the gas mask fight sequence, as well as a couple of reels, while Karen Vassar Triest handled another couple of reels, and David Esparza worked on several of the early sequences.”

Other sound effects editors included Ando Johnson and Robert Stambler, together with dialog editor Micah Loken and supervising Foley editor Sang Jun Kim.

Sterling is particularly proud of several sequences he designed for the film. “During a scene in which the lead character Lisbeth Salander is drugged, I used the Whoosh plug-in [from the German company, Tonsturm] inside Native Instruments’ Reaktor [modular music software] to create a variable, live-performable heartbeat. I used muffled explosion samples that were Doppler-shifted at different speeds against the picture to mimic the pulse-changing effects of various drugs. I also used Whoosh to create different turbo sounds for the Ducati motorcycle driven by Lisbeth, together with air-release sounds. They were subtle effects, because we didn’t want the result to sound like a ‘sci-fi bike’ — just a souped-up twin-cylinder Ducati.”
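For the curious, the pitch change Sterling describes follows the standard Doppler relation (this is general acoustics, not a detail from the production): a source moving toward the listener at speed v is heard at

    f_observed = f_source × c / (c − v), where c is the speed of sound (roughly 343 m/s in air).

A sample “moving” at 34 m/s, for example, is pitched up by a factor of 343/309, or about 1.11 (nearly two semitones); reversing the sign pitches it down again, which is what lets a looped, shifted explosion read as a pulse speeding up and slowing down.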

For the car chases, Sterling used whale-spout blasts to mimic the sound of a car driving through deep puddles with water striking the inside of the wheel wells. For frightening laughs in another sequence, the sound designer turned to Tonsturm’s Doppler program, which he used in an unorthodox way. “The program can be set to break up a sound sample using, for example, a 5.1-channel star pattern with small Doppler shifts to produce very disturbing laughter,” he says. “For the heating towers I used several sound components, including slowed-down toaster noises to add depth and resonance — a hum from the heating elements, plus ticks and clangs as they warmed up. Julian suggested that we use ‘chittery’ effects for the computer user interfaces, so I used The Cargo Cult’s Envy plug-in to create unusual sounds, and to avoid the conventional ‘bips’ and ‘boops’ noises. Envy is a spectral-shift, pitch- and amplitude-change application that is very pitch manipulatable. I also turned to the Sound Particles app to generate complex wind sounds that I delivered as immersive 7.1.2 Pro Tools tracks.”

“We also had a lot of Foley, which was recorded on Stage B at Sony Studios by Nerses Gezalyan with Foley artists Sara Monat and Robin Harlen,” Winter adds. “Unfortunately, the production dialog had a number of compromised tracks from the Berlin locations. As a result, we had a lot of ADR to shoot. Scheduling the ADR was complicated by the time difference, as most of our actors were in London, Berlin, Oslo or Stockholm. We used Foley to support the cleaned-up dialog tracks and backfilled tracks. Our dialog editor, Micah Loken, was very knowledgeable with iZotope RX 7 Advanced software; he really understood how to use it, and how not to use it. He can dig deep into a track without affecting the quality of the voice, and without overdoing the processing.”

The music from composer Roque Baños — who also worked with Alvarez on Don’t Breathe and Evil Dead — arrived very late in the project, “and remained something of a mystery,” Riegel recalls. “Being a musician himself, Fede knew what he wanted and how to achieve that result. He would disappear into an edit suite close to the stage with the music editors Maarten Hofmeijer and Del Spiva, where they cut together the score against the locked picture — or as locked as it ever was! After that we could balance the music against the dialog and sound effects.”

Regarding sound effects elements, Winter acknowledges that his small editorial team needed to work against a tight schedule. “We had a 7.1.2 template that allowed Tony [Lamberti] and later Julian to use the automated panning data. For the final mix in Atmos, we used objects minimally for the music and dialog. However, we used overhead objects strategically for effects and design. In an early sequence we put the sound of the rope — used to suspend an abusive husband — above the audience.” Re-recording mixer Tony Lamberti handled some of the early temp mixes in Slater’s absence.

Collaborative Re-Recording Process
When the project reached the William Holden Stage, “we could see the overall shape of the film with the VFX elements and decide what sounds would now be needed to match the visuals, since we had a lot of new technology to cover, including computer screens,” Riegel says.

Mandell agrees: “Yes, we could now see where Fede Alvarez wanted to take the film and make suggestions about new material. We started asking: ‘What do you think about this and that option?’ Or, ‘What’s missing?’ It was an ongoing series of conversations through the temp mixes, re-mixes and then the final.”

Having handled the first temp mix at Sony Studios, Slater returned full-time for the final Atmos mixes. “After so many temp mixes using the same templates, I knew that we would not be re-inventing the wheel on the William Holden Stage. We simply focused on changing the spatiality of what we had. Having worked with Kevin O’Connell on both Jumanji: Welcome to the Jungle and The Public, I knew that I had to do my homework and deliver what he needed from my side of the console. Kevin is very involved. He’ll make suggestions, but always based on what is best for the film. I learned a lot by seeing how he works; he is very experienced. It’s easy to find what works with Kevin, since he has experience with a wide range of technologies and keeps up with new advances.”

Describing the re-recording process as being highly collaborative, Mandell remained objective about creative options. “You can get too close to the soundtrack. With a number of German and English actors, we constantly had to ask ourselves: ‘Do we have clarity?’ If not, can we fix it in the track or turn to ADR? We maintained a continuing conversation with Tatiana and Fede, with ideas that we would circulate backwards and forwards. Since we had a lot of new people working on the crew, trust became a major factor. Everybody was incredibly professional.”

“It was a very rewarding experience working with so many talented new people,” Slater concludes. “I quickly tuned into Fede Alvarez’s specific needs and sensibilities. It was a successful liaison.”

Riegel says that her biggest challenge was “trying to figure out what the film is supposed to be — from the script and pre-production through the shoot and first assembly. It’s a gradual process and one that involves regular conversations with my assistant editors and the director as we develop characters and clarify the information being shown. But I didn’t want to hit the audience over the head with too much information. We needed to decide: ‘What is important?’ and retain as much realism as possible. It’s a complex, creative process … and one that I totally love being a part of!”


Mel Lambert has been involved with production industries on both sides of the Atlantic for more years than he cares to remember. He is principal of Content Creators, a Los Angeles-based copywriting and editorial service, and can be reached at mel.lambert@content-creators.com. He is also a long-time member of the UK’s National Union of Journalists.


Post studio Nomad adds Tokyo location

Creative editorial/VFX/sound design company Nomad has expanded its global footprint with a space in Tokyo, adding to a network that also includes offices in New York, Los Angeles and London. It will be led by managing director Yoshinori Fujisawa and executive producer Masato Midorikawa.

The Tokyo office has three client suites, an assistant support suite, a production office and a machine room. The tools for post workflow include Adobe Creative Cloud (Premiere, After Effects, Photoshop), Flame, Flame Assist, Avid Pro Tools and various other support tools.

Nomad partner/editor Glenn Martin says the studio often works with creatives who regularly travel between LA and Tokyo. He says Nomad will support the new Tokyo-based group with editors and VFX artists from its other offices whenever larger teams are needed.

“The role of a post production house is quite different between the US and Japan,” says Fujisawa and Midorikawa, jointly. “Although people in Japan are starting to see the value of the Western-style post production model, it has not been properly established here yet. We are able to give our Japanese directors and creatives the ability to collaborate with Nomad’s talented editors and VFX artists, who have great skills in storytelling and satisfying the needs of brands. Nomad has a comprehensive post-production workflow that enables the company to execute global projects. It’s now time for Japan to experience this process and be a part of the future of global advertising.”

Main Image: (L-R) Yoshinori Fujisawa and Masato Midorikawa


Tom Cross talks about editing First Man

By Barry Goch

As a child, First Man editor Tom Cross was fascinated with special effects and visual effects in films. So much so that he would take out library books that went behind the scenes on movies and focused on special effects. He had a particular interest in the artists who made miniature spacecraft, which made working on Damien Chazelle’s First Man feel like it was meant to be.

“When I learned that Damien wanted to use miniatures and do in-camera effects on this film, my childhood and adulthood kind of joined hands,” shares Cross, who is now a frequent collaborator of Chazelle’s, having cut Whiplash, La La Land and now First Man.

We recently spoke with Cross about his work on this Universal Pictures film, which stars another Chazelle favorite, Ryan Gosling, and follows the story of Neil Armstrong and the decade leading up to our country’s first mission to the moon.

Which sci-fi films influenced the style of First Man?
I remember seeing the original Star Wars movies as a kid, and they were life changing… seeing those in the theater really transported me. They opened my eyes to other movies and other movie experiences, like 2001: A Space Odyssey. Along the way, I saw and loved The Right Stuff and Apollo 13.

Tom Cross

Damien is a big fan of all those movies as well, but he really wanted to try a different stylistic approach. He knew that 2001 owns that particular look and style, where you’re super high resolution, antiseptic and sleek in a futuristic way.

For First Man, Damien decided to go with something more personal and intimate. He watched hours of 16mm NASA archival footage, which was often shot by astronauts. He loved the idea of First Man feeling like we put a documentary cameraman in the space capsules. He also saw that these spacecraft appeared more machine-age than space-age. All the gauges and rivets looked like they belonged in a tank from World War II. So I think all of that lo-fi, analog feel informed the cinema vérité style he chose.

As a creative editor, when you have animatics, previz or temp comps in the Avid, how do you determine the pacing? Could you talk about the creative process of working on a big visual effects film?
Damien preplans everything down to the letter. He did that on Whiplash and La La Land, and he did that on First Man, especially all of the big action set pieces — the X-15, the Gemini 8 and Apollo 11 scenes. He had storyboards done, and animatics that he cut with some rough sound effects. So I always used those as a starting point.

I rely heavily on sound. I really try to use it to help illustrate what we’re looking at, especially if we’re using placeholder shots. In general, I’m most reliant on the performances to help me time things out. What the actors bring is really the heartbeat of any action scene. If you don’t identify with the character or get into a point of view, then the action scene becomes something else. It might work on some formal level, but it’s less subjective, which is the opposite of what Damien was going for.

Can you talk about him capturing things in-camera?
Damien made the choice with production designer Nathan Crowley, VFX supervisor Paul Lambert and cinematographer Linus Sandgren to try to shoot as many things in-camera as possible. The backgrounds that you see out all the spacecraft windows were projected on LED screens and then captured in-camera. Later, our VFX artists would improve, or sometimes replace, those windows. But the beautiful thing that in-camera gave us were these amazing reflections on the visors, faces and eyes. That sort of organic play of light is very difficult to replicate later. Having the in-camera VFX was invaluable to me when I was editing and great for rough cut screenings.

A big part of the film played with only the point of view of the astronaut and feeling like it’s a VR experience. Could you talk about that?
It came down to what Damien and Ryan Gosling would refer to as “the moon and the kitchen sink.” That meant that the movie would hinge on the balance between the NASA space missions and the personal domestic storylines. For the earthbound scenes with Neil and his family, Damien wanted the audience to feel like a fly on the wall in their home. He wanted it to feel intimate, and that called for a cinema verité documentary approach to the camera and the cutting.

He wanted to continue that documentary style inside the space capsules but then take it even further. He wanted to make those scenes as subjective as possible. He shot these beautiful POV shots of everything that Neil sees — the Gemini 8 seat before he climbs in, the gauges inside, the view out the window — and we intercut those with Ryan’s face and eyes. Damien really encouraged me to lean into a simple but effective cutting pattern that went back and forth between those elements. It all had to feel immersive.

What about the sound in those POV shots?
It was brilliantly created by our sound designer Ai-Ling Lee and then mixed by Ai-Ling, Frank Montano and Jon Taylor. Damien and I sketched out where all those sounds would be in our Avid rough cuts. Then Ai-Ling would use our template and take it to the next level. We played around with sound in a way that we hadn’t done on Whiplash or La La Land. We made room for sound. We would linger on POV shots of the walls of the space capsule so that we’d have room to put creaks and metal groans from Ai-Ling. We really played those moments up and then tried to answer those sounds with a look from Neil or one of the other astronauts. The goal was to make the audience feel like they were experiencing what the astronauts were experiencing. I never knew how close they were to not even making it to the lunar surface.

There was that pressure of the world watching as alarms were going off in the capsule and fuel was running out. It was very dramatic. Damien always wanted to honor how heroic these astronauts were by showing how difficult their missions were. They risked everything. We tried to illustrate this by creating sequences that were experiential. We tried to do that through subjective cutting patterns, through sound and by using the big screen in certain ways.

Can you talk about working in IMAX?
Damien is a big-canvas director. He always thinks about the big screen. On La La Land, he and Linus shot in Fox’s original CinemaScope aspect ratio, which is 2.55:1.

On First Man, he again wanted to tell the story on a wide canvas but then, somehow, take it up a notch at the appropriate moment. He wanted to find a cinematic device that would adequately transport the audience to another world. He came up with this kind of Wizard of Oz transition where the camera passes through the hatch door and out onto the moon. The image opens up from 2.40 to full 1.43 IMAX.
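As a rough sense of scale (simple arithmetic, not a production figure): at a constant image width, going from a 2.40:1 frame to a 1.43:1 frame grows the image height by a factor of 2.40/1.43, or about 1.68, so the picture gains roughly two-thirds more vertical real estate the moment the hatch opens.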

The style and the pace changes after that point. It slows down so that the audience can take in the breathtaking detail that IMAX renders. The scene becomes all about the shadows and the texture of the lunar surface. All the while, we linger even longer on the POV shots so that the viewer feels like they are climbing down that ladder.

What editing system did you use?
We edited on Avid Media Composer using DNxHD 115. I found that level of resolution really helpful for assessing the focus and detail of the image, especially because we shot a lot of 16mm and 35mm 2-perf.
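For readers who want to experiment with the same offline codec outside of Avid, below is a minimal Python sketch that shells out to ffmpeg to write DNxHD 115 into an MXF wrapper (1920x1080 at 23.976fps, 8-bit 4:2:2). This is an illustration of the format, not the production’s actual pipeline; the file names are placeholders, and ffmpeg is assumed to be installed. Note that ffmpeg only accepts specific size/frame-rate/bitrate combinations for DNxHD.

    import subprocess

    def to_dnxhd_115(src: str, dst: str) -> None:
        """Transcode src to DNxHD 115 in an MXF wrapper via ffmpeg."""
        subprocess.run([
            "ffmpeg", "-i", src,
            "-c:v", "dnxhd",         # Avid DNxHD encoder
            "-s", "1920x1080",       # DNxHD 115 is a 1080p profile
            "-r", "24000/1001",      # 23.976fps, typical for film-originated material
            "-b:v", "115M",          # the "115" in DNxHD 115
            "-pix_fmt", "yuv422p",   # 8-bit 4:2:2
            "-c:a", "pcm_s16le",     # uncompressed audio, standard in MXF
            dst,
        ], check=True)

    to_dnxhd_115("source_clip.mov", "source_clip_dnxhd115.mxf")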

Tom Cross

I would love to give a shout out to your team: your assistants and apprentices and anybody else who helped.
I was pretty blessed with a very strong editorial crew. If it weren’t for those guys, we’d still be editing the movie, since Damien shot 1.75 million feet of film. I need to give credit to my editing team’s great organizational prowess. I also had two great additional editors who worked closely with me and Damien — Harry Yoon and John To. They’re great storytellers, and they inspired me every day with their work.

Ryan Chavez, our VFX editor, also did a lot of great cutting. At the same time, he kept me on target with everything VFX-related. Because of our tight schedule, he was joined by a second VFX editor, Jody Rogers, whom I had previously worked with on David O. Russell’s movie Joy. She was fantastic.

Then I had two amazing first assistants: Jennifer Stellema and Derek Drouin. Both of them were often sent on missions to find needles in haystacks. They had to wade through hundreds of hours of NASA radio comms, stock footage and a plethora of insert shots of gauges and switches. Somehow they always knew where to find everything. The Avid script was also an indispensable resource, and it was set up and maintained by assistant editors Eric Kench and Phillip Trujillo.

On the VFX end, we were very lucky to have our VFX producer Kevin Elam down the hall. We also had two incredible postviz artists — John Weckworth and Joe DiValerio — who fed us shots constantly. It was a very challenging schedule, which got more difficult once we got into film festivals.

Fortunately, our great post supervisors from La La Land — Jeff Harlacker and Jason Miller — were onboard. They’re the ones who really kept us all on track and had the big picture in mind. Together with our trusted post PA Ryan Cunningham, we were covered.

The truly unsung heroes of this project had to be the families and loved ones of our crew. As we worked the long hours to make this movie, they supported us in every way imaginable. Without them, none of this would be possible.


Barry Goch is a finishing artist at The Foundation, a boutique post facility in the heart of Burbank’s Media District. He is also an instructor for post production at UCLA Extension. You can follow him on Twitter @gochya.


Adobe Max 2018: Creative Cloud updates and more

By Mike McCarthy

I attended my first Adobe Max last week in Los Angeles. This huge conference takes over the LA Convention Center and overflows into the surrounding venues. It began on Monday morning with a two-and-a-half-hour keynote outlining the developments and features being released in the newest updates to Adobe’s Creative Cloud. This was followed by all sorts of smaller sessions and training labs for attendees to dig deeper into the new capabilities of the various tools and applications.

The South Hall was filled with booths from various hardware and software partners, with more available than any one person could possibly take in. Tuesday started off with some early morning hands-on labs, followed by a second keynote presentation about creative and career development. I got a front-row seat to hear five people who are successful in their creative fields — including director Ron Howard — discuss their approach to work and life. The rest of the day was so packed with various briefings, meetings and interviews that I didn’t get to actually attend any of the classroom sessions.

By Wednesday, the event was beginning to wind down, but there was still a plethora of sessions and other options for attendees to split their time between. I presented the workflow for my most recent project, Grounds of Freedom, at Nvidia’s booth in the Community Pavilion, and spent the rest of the time connecting with other hardware and software partners who had a presence there.

Adobe released updates for most of its creative applications concurrent with the event. Many of the most relevant updates to the video tools were previously announced at IBC in Amsterdam last month, so I won’t repeat those, but there are still a few new video ones, as well as many that are broader in scope regarding media as a whole.

Adobe Premiere Rush
The biggest video-centric announcement is Adobe Premiere Rush, which offers simplified video editing workflows for mobile devices and PCs. Currently releasing on iOS and Windows, with Android to follow in the future, it is a cloud-enabled application, with the option to offload much of the processing from the user device. Rush projects can be moved into Premiere Pro for finishing once you are back on the desktop. It will also integrate with Team Projects for greater collaboration in larger organizations. It is free to start using, but most functionality will be limited to subscription users.

Let’s keep in mind that I am a finishing editor for feature films, so my first question (as a Razr-M user) was, “Who wants to edit video on their phone?” But what if the user shot the video on their phone? I don’t do that, but many people do, so I know this will be a valuable tool. This has me thinking about my own mentality toward video. I think if I were a sculptor, I would be sculpting stone, while many people are sculpting with clay or Silly Putty. Because of that, I would have trouble sculpting in clay and would see little value in tools that can only sculpt clay. But there is probably benefit to being well versed in both.

I would have no trouble showing my son’s first-year video compilation to a prospective employer because it is just that good — I don’t make anything less than that. But there was no second-year video, even though I have the footage, because that level of work takes way too much time. So I need to break free from that mentality and get better at producing content that is “sufficient to tell a story” without being “technically and artistically flawless.” Learning to use Adobe Rush might be a good way for me to take a step in that direction. As a result, we may eventually see more videos in my articles as well. The current ones took me way too long to produce, but Adobe Rush should allow me to create content in a much shorter timeframe, if I am willing to compromise a bit on the precision and control offered by Premiere Pro and After Effects.

Rush allows up to four layers of video, with various effects and 32-bit Lumetri color controls, as well as AI-based audio filtering for noise reduction and de-reverb, and lots of preset motion graphics templates for titling and such. It should allow simple videos to be edited relatively easily, with good-looking results, then shared directly to YouTube, Facebook and other platforms. While it doesn’t fit into my current workflow, I may need to create an entirely new “flow” for my personal videos. This seems like an interesting place to start, once they release an Android version and I get a new phone.

Photoshop Updates
There is a new version of Photoshop released nearly every year, and most of the time I can’t tell the difference between the new and the old. This year’s differences will probably be a lot more apparent to most users after a few minutes of use. The Undo command now works like it does in other apps instead of being limited to toggling the last action. Transform operates very differently, in that proportional transform is now the default behavior instead of requiring users to hold Shift every time they scale. The anchor point can be hidden to prevent people from moving the anchor instead of the image, and the “commit changes” step at the end has been removed. All positive improvements, in my opinion, that might take a bit of getting used to for seasoned pros.

There is also a new Framing Tool, which allows you to scale or crop any layer to a defined resolution. Maybe I am the only one, but I frequently find myself creating new documents in PS just so I can drag a new layer, preset to the resolution I need, back into my current document. For example, I need a 200x300px box in the middle of my HD frame — how else do you do that currently? This Framing Tool should fill that hole, offering more precise control over layer and object sizes and positions, as well as providing easily adjustable, non-destructive masking.
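As an aside, the centering arithmetic behind that example is simple enough to script. Here is a minimal Python sketch using the Pillow imaging library (my choice of tool, not anything Adobe ships) that places a 200x300-pixel box dead center in an HD frame, the job the new Framing Tool now handles interactively:

    from PIL import Image, ImageDraw

    FRAME_W, FRAME_H = 1920, 1080   # HD frame
    BOX_W, BOX_H = 200, 300         # desired box size

    # Center the box by offsetting each axis by half the leftover space.
    left = (FRAME_W - BOX_W) // 2   # (1920 - 200) / 2 = 860
    top = (FRAME_H - BOX_H) // 2    # (1080 - 300) / 2 = 390

    frame = Image.new("RGB", (FRAME_W, FRAME_H), "black")
    draw = ImageDraw.Draw(frame)
    # Pillow rectangles include both corners, hence the -1 for an exact fit.
    draw.rectangle([left, top, left + BOX_W - 1, top + BOX_H - 1], fill="white")
    frame.save("framed_box.png")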

They also showed off a very impressive AI-based auto selection of the subject or background. It creates a standard selection that can be manually modified anywhere the initial attempt didn’t give you what you were looking for. Being someone who gives software demos, I don’t trust prepared demonstrations, so I wanted to try it for myself with a real-world asset. I opened up one of my source photos for my animation project and clicked the “Select Subject” button with no further input. The result needed some cleanup at the bottom and refinement in the newly revamped “Select & Mask” tool, but it was a huge improvement over what I had to do on hundreds of layers earlier this year. They also demonstrated a similar feature they are working on for video footage in Tuesday night’s Sneak previews. Named “Project Fast Mask,” it automatically propagates masks of moving objects through video frames and, while not released yet, it looks promising. Combined with the content-aware background fill for video that Jason Levine demonstrated in After Effects during the opening keynote, basic VFX work is going to get a lot easier.

There are also some smaller changes to the UI, allowing math expressions in the numerical value fields and making it easier to differentiate similarly named layers by showing the beginning and end of the name if it gets abbreviated. They also added a function to distribute layers spatially based on the space between them, which accounts for their varying sizes, compared to the current solution, which just distributes evenly based on their reference anchor points.
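To make that difference concrete, here is a generic Python sketch of gap-based distribution (an illustration of the idea, not Adobe’s code): given layers of varying widths inside a span, it places them so the empty space between neighbors is equal, rather than spacing their anchor points evenly.

    def distribute_by_gaps(widths: list[float], span: float) -> list[float]:
        """Return left-edge positions so gaps between layers are equal."""
        if len(widths) < 2:
            return [0.0] * len(widths)
        # Split the leftover space evenly among the gaps between layers.
        gap = (span - sum(widths)) / (len(widths) - 1)
        positions, x = [], 0.0
        for w in widths:
            positions.append(x)
            x += w + gap
        return positions

    # Three layers of different widths across a 1920px-wide canvas:
    print(distribute_by_gaps([400, 100, 250], 1920))  # [0.0, 985.0, 1670.0]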

In other news, Photoshop is coming to iPad, and while that doesn’t affect me personally, I can see how this could be a big deal for some people. They have offered various trimmed down Photoshop editing applications for iOS in the past, but this new release is supposed to be based on the same underlying code as the desktop version and will eventually replicate all functionality, once they finish adapting the UI for touchscreens.

New Apps
Adobe also showed off Project Gemini, which is a sketch and painting tool for iPad that sits somewhere between Photoshop and Illustrator (hence the name, I assume). This doesn’t have much direct application to video workflows besides being able to record time-lapses of a sketch, which should make it easier to create those “whiteboard illustration” videos that are becoming more popular.

Project Aero is a tool for creating AR experiences, and I can envision Premiere and After Effects being critical pieces in the puzzle for creating the visual assets that Aero will be placing into the augmented reality space. This one is the hardest for me to fully conceptualize. I know Adobe is creating a lot of supporting infrastructure behind the scenes to enable the delivery of AR content in the future, but I haven’t yet been able to wrap my mind around a vision of what that future will be like. VR I get, but AR is more complicated because of its interface with the real world and due to the variety of forms in which it can be experienced by users. Similar to how web design is complicated by the need to support people on various browsers and cell phones, AR needs to support a variety of use cases and delivery platforms. But Adobe is working on the tools to make that a reality, and Project Aero is the first public step in that larger process.

Community Pavilion
Adobe’s partner companies in the Community Pavilion were showing off a number of new products. Dell has a new 49-inch IPS monitor, the U4919DW, which offers basically the resolution and desktop space of two 27-inch QHD displays without the seam (5120×1440 to be exact). HP was displaying its recently released ZBook Studio x360 convertible laptop workstation (which I will be posting a review of soon), as well as its ZBook x2 tablet and the rest of its Z workstations. Nvidia was exhibiting its new Turing-based cards with 8K Red decoding acceleration, ray tracing in Adobe Dimension and other GPU-accelerated tasks. AMD was demoing 4K Red playback on a MacBook Pro with an eGPU solution, and CPU-based ray tracing on its Ryzen systems. The other booths spanned the gamut from GoPro cameras and server storage devices to paper stock products for designers. I even won a Thunderbolt 3 docking station at Intel’s booth. (Although in the next drawing they gave away a brand-new Dell Precision 5530 2-in-1 convertible laptop workstation.) Microsoft also garnered quite a bit of attention when it gave away 30 MS Surface tablets near the end of the show. There was lots to see and learn everywhere I looked.

The Significance of MAX
Adobe MAX is quite a significant event, especially now that I have been in the industry long enough to start to see the evolution of certain trends — things are not as static as we may expect. I have attended NAB for the last 12 years, and the focus of that show has shifted significantly away from my primary professional focus (no Red, Nvidia or Apple booths, among many other changes). This was the first year that I had the thought “I should have gone to Sundance,” and a number of other people I know had the same impression. Adobe Max is similar, although I have been a little slower to catch on to that change. It has been happening for over ten years, but it has grown dramatically in size and significance recently. If I still lived in LA, I probably would have started attending sooner, but it was hardly on my radar until three weeks ago. Now that I have seen it in person, I probably won’t miss it in the future.


Mike McCarthy is an online editor/workflow consultant with 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.


Crazy Rich Asians editor Myron Kerstein

By Amy Leland

When the buzz started in anticipation of the premiere of Crazy Rich Asians, there was a lot of speculation about whether audiences would fill the theaters for the first all-Asian cast in an American film since 1993’s The Joy Luck Club, and about whether audiences even wanted to see a romantic comedy, a format that seemed to be falling out of favor.

The answer to both questions was a resounding, “Yes!” The film grossed $35 million during its opening weekend, against a $30 million budget. It continued going strong in its second weekend, making another $28 million, the highest Labor Day weekend box office in more than a decade. It was the biggest opening weekend for a rom-com in three years, and it is the most successful studio rom-com in nine years. All of this great success can be explained pretty simply — it’s a fun movie with a well-told story.

Not long ago, I had the great fun of sitting down with one of its storytellers, editor Myron Kerstein, to discuss this Jon M. Chu-directed film as well as Kerstein’s career as an editor.

How did you get started as an editor?
I was a fine arts major in college and stumbled upon photography, filmmaking, painting and printmaking. I really just wanted to make art of any kind. Once I started doing more short films in college, I found a knack for editing.

When I first moved to New York, I needed to make a living, so I became a PA, and I worked on a series called TV Nation, one of Michael Moore’s first shows. It was political satire. There was a production period, and then slowly the editors needed help in the post department. I gravitated toward these alchemists, these amazing people who were making things out of nothing. I really started to move toward post through that experience.

I also hustled quite a bit with all of those editors, and they started to hire me after that job. Slowly but surely I had a network of people who wanted to hire me again. That’s how I really started, and I really began to love it. I thought, what an amazing process to read these stories and look at how much power and influence an editor has in the filmmaking process.

I was not an assistant for too long, because I got to cut a film called Black & White. Then I quickly began doing edits for other indies, one being a film called Raising Victor Vargas, and another film called Garden State. That was my big hit in the indie world, and slowly that led to more studio films, and then to Crazy Rich Asians.

Myron Kerstein and Crazy Rich Asians actor Henry Golding.

Your first break was on a television show that was nothing like feature films. How did you ultimately move toward cutting feature films?
I had a real attraction to documentary filmmaking, but my heart wanted to make narrative features. I think once you put that out in the universe, then those jobs start coming to you. I then stumbled upon my mentor, Jim Lyons, who cut all of Todd Haynes’s movies for years. When I worked on Velvet Goldmine as an assistant editor, I knew this was where I really needed to be. This was a film with music that was trying to say something, and was also very subversive. Jim and Todd were these amazing filmmakers that were just shining examples of the things I wanted to make in the future.

Any other filmmakers or editors whose work influenced you as you were starting out?
In addition to Todd Haynes, directors like Gus Van Sant and John Hughes. When I was first watching films, I didn’t really understand what editors did, so at the time I was influenced by Spielberg, or somebody like George Romero. It was only later that I realized there were editors who made these things. Ang Lee and his editor Tim Squyres were like gods to me. I really wanted to work on one of Ang’s crews very badly, but everyone wanted to work with him. I was working at the same facilities where Ang was cutting, and I was literally sneaking into his edit rooms. I would be working on another film, and I would just kind of peek my head in and see what they were doing and that kind of thing.

How did Crazy Rich Asians come about for you?
Brad Simpson, who was a post supervisor on Velvet Goldmine back in the ‘90s when I was the assistant editor, is a producer on this film. Flash forward 20 years and I stumbled upon this script through agents. I read it and I was like, “I really want to be a part of this, and Brad’s the producer on this thing? Let me reach out to him.” He said, “I think you might be the right fit for this.” It was pretty nerve-wracking because I’d never worked with Jon before. Jon was a pretty experienced filmmaker, and he’d worked with a lot of editors. I just knew that if I could be part of the process, we could make something special.

My first interview with Jon was a Skype interview. He was in Malaysia already prepping for the film. In those interviews it’s very difficult not to look or sound weird. I just spoke from the heart and said this is what I think makes me special. These are the ways I can try to influence a film and be part of the process. Luckily, between that interview and Brad’s recommendation, I got the job.

Myron Kerstein and director Jon Chu.

When did you begin your work on the film?
I basically started the first week of filming and joined them in Malaysia and Singapore for the whole shoot. It was a pretty amazing experience being out there in two Muslim countries — two Westernized Muslim countries that were filled with some of the friendliest people I’ve ever met. It was an almost entirely local crew, a couple of assistant editors, and me. Sometimes I feel like it might not be the best thing for an editor to be around set too much, but in this case it was good for me to see the setting they were trying to portray… and feel the humidity, the steaminess, the romance of Singapore, which is both alien and beautiful at the same time.

What was your collaboration like with Jon Chu?
It was just an organic process, where my DNA started to become infused with Jon’s. The good thing about my going to Malaysia and Singapore was we got to work together early. One thing that doesn’t happen often anymore is a director who actually screens dailies in a theater. Jon would do that every weekend. We would watch dailies, and he would say what he liked and didn’t like, or more just general impressions of his footage. That allowed me to get into his head a bit.

At the same time I was also cutting scenes. At the end of every day’s screening, we would sit down together. He gave me a lot of freedom, but at the same time was there to give me his first impressions of what I was doing. I think we were able to build some trust really early.

Because of the film’s overwhelming success, this has opened doors for other Asian-led projects.
Isn’t that the most satisfying thing in the world? You hope to define your career by moments like this, but rarely get that chance. I watched this film right when it was released, which was on my birthday. I ended up sitting next to this young Asian boy and his mom. This kid was just giggling and weeping throughout the movie. To have an interaction with a kid like that, who may have never seen someone like himself represented on the screen, was pretty outstanding.

Music was such an important part of this film. The soundtrack is so crucial to moments in the film that it almost felt like a musical. Were you editing scenes with specific songs in mind, or did you edit and then come back and add music?
Jon gave me a playlist very early on of music he was interested in. A lot of the songs sounded like they were from the 1920s — almost big band tunes. Right then I knew the film could have more of a classy Asian-Gatsby quality to it. Then as we were working on the film together, we started trying out these more modern tunes. I think the producers might have thought we were crazy at one point. You’re asking the audience to go down these different roads with you, and that can sometimes work really well, or sometimes can be a train wreck.

But as much as I love working with music, when I assemble I don’t cut with any music in mind. I try not to use it as a crutch. Oftentimes you cut something with music, either with a song in your head, or often editors will cut with a song as a music bed. But, if you can’t tell a story visually without a song to help drive it, then I think you’re fooling yourself.

I really find that my joy of putting in music happens after I assemble, and then I enjoy experimenting. That Coldplay song at the end of the film, for example… We were really struggling with how to end our movie. We had a bunch of different dialogue scenes that were strung together, but we didn’t feel like it was building up to some kind of climax. I figured out the structure and then cut it like any other scene without any music. Then Jon pitched a couple songs. Ironically enough I had an experience with Coldplay from the opening of Garden State. I liked the idea of this full circle in my own career with Coldplay at the end of a romantic comedy that starred an all-Asian cast. And it really felt like it was the right fit.

The graphic design was fascinating, especially in the early scene with Rachel and Nick on their date that kicks off all of the text messages. Is that something that was storyboarded early, or was that something you all figured out in the edit and in post?
Jon did have a very loose six-page storyboard of how we would get from the beginning of this to the end. The storyboard was nothing compared to what we ended up doing. When I first assembled my footage, I stitched together a two-minute sequence of just split screens of people reacting to other people. Some of that footage is in the movie, but it was just a loose sketch. Jon liked it, but it didn’t represent what he imagined this sequence to be. To some extent he had wondered whether we even needed the sequence.

Jon and I discussed it and said, “Let’s give this a shot. Let’s find the best graphics company out there.” We ended up landing with this company called Aspect, led by John Berkowitz. He and his team of artists worked with us to slowly craft this sequence over months. Beginning with, “How do we get the first text bubble to the second person? What do those text bubbles look like? How do they travel?” Then they gave us 20 different options to see how those two elements would work together. Then we asked, “How do we start expanding outward? What information are we conveying? What is the text bubble saying?” It was like this slowly choreographed dance that we ended up putting together over the course of months.

They would make these little Disney-esque pops. We really loved that. That kind of made it feel like we were back in old Hollywood for a second. At the same time we had these modern devices with text bubbles. As far as the tone was concerned, we tried percussion, just drumming, and other old scores. Then we landed on a John Williams score from the movie 1941, and that gave us the idea that maybe some old-school big band jazz might go really well in this. Our composer Brian Tyler saw it and said, “I think I can make this even zanier and crazier.”

How do you work with your assistants?
Assistants are crucial as far as getting through the whole process. I actually had two sets of assistants; John To and David Zimmerman were on the first half in Malaysia and Singapore. I found John through my buddy Tom Cross, who edits for Damien Chazelle. I wanted somebody who could help me with the challenges of getting through places like Malaysia and Singapore, because if you’re looking for help for your Avid, or trying to get dailies from Malaysia to America, you’re kind of on your own. Warner Bros. was great and supportive, and they gave us all the technical help. But it’s not like they can fly somebody out if something goes wrong in an hour.

On the post side I ended up using Melissa Remenarich-Aperlo, and she was outstanding. In the post process I needed somebody to hold down the fort and keep me organized, and also somebody for me to bounce ideas off of. I’m a big proponent of using my assistants creatively. Melissa ended up cutting the big fashion montage. I really struggled with that sequence because I felt strongly that it might be a trope this film didn’t need. That was the debate with a lot of them: Which romantic comedy tropes should we have in this movie? Jon was like, “It’s wish fulfillment. We really need this. I know we’ve seen it a thousand times, but we need this scene.”

I said let’s try something different. Let’s try inter-cutting the wedding arrival with the montage, and let’s try to make it one big story to get us from us not knowing what she’s going to show up in to her arrival. Both of those sequences were fine on their own, but it didn’t feel like either one of them was doing anything interesting. It just felt like we were eating up time, and we needed to get to the wedding, and we had a lot of story to tell. Once we inter-cut them we knew this was the right choice. As Jon said, you need these moments in the film where you can just sit back and take a breath, smile for a minute and get ready for the drama that starts. Melissa did a great job on that sequence.

Do you have any advice for somebody who’s just starting out and really wants to edit feature films?
I would tell them to start cutting. Cut anything they can. If they don’t have the software, they can cut on iMovie on their iPhone. Then they should reach out to people like me and create a network. And keep doing that until people say yes. Don’t be afraid to reach out to people.

Also don’t be afraid to be an assistant editor. As much as they want to cut, as they should, they also need to learn the process of editing from others. Be willing to stick with it, even if that means years of doing it. I think you’d be surprised how much you learn over the course of time with good editors. I feel like it’s a long bridge. I’ve been doing this for 20 years, and it took a long time to get here, but perseverance goes a long way in this field. You just have to really know you want to do it and keep doing it.


Amy Leland is a film director and editor. Her short film, “Echoes”, is now available on Amazon Video. She also has a feature documentary in post, a feature screenplay in development, and a new doc in pre-production. She is an editor for CBS Sports Network and recently edited the feature “Sundown.” You can follow Amy on social media on Twitter at @amy-leland and Instagram at @la_directora.


Behind the Title: Pace Pictures owner Heath Ryan

NAME: Heath Ryan

COMPANY: Pace Pictures (@PacePictures)

CAN YOU DESCRIBE YOUR COMPANY?
We are a dailies-to-delivery post house, including audio mixing.

Pace’s Dolby Atmos stage.

WHAT’S YOUR JOB TITLE?
Owner and editor.

WHAT DOES THAT ENTAIL?
As owner, I need to make sure everyone is happy.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
Psychology. I deal with a lot of producers, directors and artists who all have their own wants and needs. Sometimes what the job entails is not strictly post production but managing personalities.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Editing. My company grew out of my love for editing. It’s the final draft of any film. In the over 30 years I have been editing, the power of what an editor can do has only grown.

WHAT’S YOUR LEAST FAVORITE?
Chasing unpaid invoices. It’s part of the job, but it’s not fun.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
Late, late in the evening when there are no other people around and you can get some real work done.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Not by design but through sheer single-mindedness, I have no other skill set but film production. My sense of direction is so bad that, even armed with the GPS supercomputer in my phone, driving for Uber is not an option.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I started making films when I was still in single digits. I won a few awards for my first short film in my teens and never looked back. I’m lucky to have found this passion early.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
This year I edited the reboot of Daddy Day Care, called Grand-Daddy Day Care (2019), for Universal. I got to work with director Ron Oliver and actor Danny Trejo, and it meant a lot to me. It deals with what we do with our elders as time creeps up on us all. Sadly, we lost Ron’s mom while we were editing the film, so it took on extra special meaning for us both.

Lawless Range

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
Lawless Range and The Producer. I produced and edited both projects with my dear friend and collaborator Sean McGinly. A modern-day Western and a behind-the-scenes of a Hollywood pilot. They were very satisfying projects because there was no one to blame but ourselves.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My Meridian Sound system, the Internet and TV.

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
Yes, I love it. I have always set the tone in the edit bay with music, especially during dailies — I like to put music on, sometimes film scores, to set the mood of what we are making.


Review: Blackmagic’s eGPU and Intel i9 MacBook Pro 2018

By Brady Betzel

Blackmagic’s eGPU is worth the $699 price tag. You can buy it from Apple’s website, where it is being sold exclusively for the time being. Wait, what? You wanted some actual evidence as to why you should buy the BMD eGPU?

Ok, here you go…

MacBook Pro With Intel i9
First, I want to go over the latest Apple MacBook Pro, which was released (or really just updated) this past July. With some controversial fanfare, the 2018 MacBook Pro can now be purchased with the blazingly fast Intel i9, 2.6GHz (Turbo Boost up to 4.3GHz) six-core processor. In addition, you can add up to 32GB of 2400MHz DDR4 onboard memory, a Radeon Pro 560x GPU with 4GB of GDDR5 memory and even a 4TB SSD storage drive. It has four Thunderbolt 3 ports and, for some reason, a headphone jack. Apple is also touting its improved butterfly keyboard switches as well as its True Tone display technology. If you want to read more about that glossy info, head over to Apple’s site.

The 2018 MacBook Pro is a beast. I am a big advocate for the ability to upgrade and repair computers, so Apple’s venture to create what is essentially a leased computer ecosystem that needs to be replaced every year or two usually leaves a bad taste in my mouth.

However, the latest MacBook Pros are really amazing… and really expensive. The top-of-the-line MacBook Pro I was provided for this review would cost $6,699! Yikes! If I were buying one myself, I would purchase everything but the $2,000 upgrade from the 2TB SSD to the 4TB, and it would still cost $4,699. But I suppose that’s not a terrible price for such an intense processor (albeit not technically workstation-class).

Overall, the MacBook Pro is a workhorse that I put through its video editing and color correcting paces using three of the top four professional nonlinear editors: Adobe Premiere, Apple FCP X and Blackmagic’s Resolve 15 (the official release). More on those results in a bit, but for now, I’ll just say a few things: I love how light and thin it is. I don’t like how hot it can get. I love how fast it charges. I don’t like how fast it loses charge when doing things like transcoding or exporting clips. A 15-minute export can drain the battery more than 40%, while playing Spotify for eight hours will hardly drain it at all (maybe 20%).

Blackmagic’s eGPU with Radeon Pro 580 GPU
One of the more surprising releases from Blackmagic has been this eGPU offering. I never would have guessed they would go into this area, and I certainly wouldn’t have guessed they would go with a Radeon card, but here we are.

Once you step back from the initial “Why in the hell wouldn’t they let it be user-replaceable and also not brand-dependent?” shock, it actually makes sense. If you are a macOS user, you can probably already do a lot in terms of external GPU power. But when you buy a new iMac, iMac Pro or MacBook Pro, you are expecting it to work, full stop.

However, if you are a DIT or colorist that is more mobile than that sweet million-dollar color bay you dream of, you need more. This is where the BMD eGPU falls nicely into place. You plug it in and instantly see it populate in the menu bar. In addition, the eGPU acts as a dock with four USB 3 ports, two Thunderbolt 3 ports and an HDMI port. The MacBook Pro will charge off of the eGPU as well, which eliminates the need for your charger at your docking point.

On the go, the most decked-out MacBook Pro can hold its own. So it’s no surprise that FCP X runs remarkably fast — faster than everything else. However, you have to be invested in the FCP X workflow and paradigm — and while I’m not there yet, maybe the future will prove me wrong. Recently, I saw someone on Twitter who had developed an online collaboration workflow around it, so people are excited about it.

Anyway, many of the nonlinear editors I work with can also play 4K Red, ARRI and, especially, ProRes footage on the MacBook Pro. Keep in mind, though, that with 2K, 4K or whatever-K footage, you will need to set the debayer to around “half good” if you want a fluid timeline. Even with the 4GB Radeon 560x, I couldn’t quite play 4K footage in realtime without some sort of compromise in quality.

But with the Blackmagic eGPU, I significantly improved my playback capabilities — and not just in Resolve 15. I did try plugging the eGPU into a Windows 10 PC I was reviewing at the same time; it was recognized, but I couldn’t get all the drivers sorted out. So it’s possible it will work in Windows, but I couldn’t get it there.

Before I get to the Resolve testing, I did some benchmarking.

Cinebench R15 without the eGPU: OpenGL — 99.21fps, reference match — 99.5%, CPU — 947cb, CPU (single core) — 190cb, MP ratio — 5.00x
Cinebench R15 with the eGPU: OpenGL — 60.26fps, reference match — 99.5%, CPU — 1057cb, CPU (single core) — 186cb, MP ratio — 5.69x
Unigine Valley Benchmark 1.0 without the eGPU: 21.3fps, score of 890 (minimum 12.4fps/maximum 36.2fps)
Unigine Valley Benchmark 1.0 with the eGPU: 25.6fps, score of 1073 (minimum 19.2fps/maximum 37.1fps)

(Interestingly, the Cinebench OpenGL score drops with the eGPU attached — most likely because rendered frames have to travel back over Thunderbolt 3 to the internal display.)

Resolve 15 Test
I based all of my tests on a similar (although not identical across the different editing applications) 10-minute timeline: 23.98fps, 3840×2160, with 4K and 8K Red RAW footage (R3D files) and Alexa (.ari and ProRes 4444 XQ) UHD footage, all with edit-page resizes, simple color correction and intermittent sharpening and temporal noise reduction (three frames, better, medium, 10, 10 and 5).

Playback: Without the eGPU, I couldn’t play the 23.98fps 4K Red R3Ds unless they were set to half-res. With the eGPU, I could play back at full-res in realtime (this is what I was talking about in sentence one of this review). The ARRI footage would play at full res, but only at between 1fps and 7fps. The 8K Red footage would play in realtime when set to quarter-res.

One of the most reassuring things I noticed when watching my Activity Monitor’s GPU history readout was that Resolve uses both GPUs at once. Not all of the apps did.

Resolve 15 Export Tests
In the following tests, I disabled all cache or optimized media options, including Performance Mode.

Test 1: H.264 (23.98fps/UHD, auto quality, no frame reordering, force highest-quality debayer/resizes, encoding profile Main)
a. Without eGPU (Radeon Pro 560x): 22 minutes, 16 seconds
b. With BMD eGPU (Radeon Pro 580): 16 minutes and 21 seconds

Test 2: H.265 (10-bit, 23.98fps/UHD, auto quality, no frame reordering, force highest-quality debayer/resizes)
a. Without eGPU: stopped rendering after 10 frames
b. With BMD eGPU: same result

Test 3: ProRes 4444 (23.98fps/UHD)
a. Without eGPU: 27 minutes and 29 seconds
b. With BMD eGPU: 22 minutes and 57 seconds

Test 4: Edit page cache (Smart User Cache enabled at ProResHQ)
a. Without eGPU: 17 minutes and 28 seconds
b. With BMD eGPU: 12 minutes and 22 seconds

Adobe Premiere Pro v.12.1.2
I performed similar testing in Adobe Premiere Pro using a 10-minute timeline at 23.98fps, 3840×2160, with 4K and 8K Red RAW footage (R3D files) and Alexa (DNxHR SQ 8-bit) UHD footage, all with Effect Controls panel resizes and simple Lumetri color correction, including sharpening and intermittent denoise (16) under the HSL Secondary tab in Lumetri, applied to shadows only.

In order to ensure your eGPU will be used inside of Adobe Premiere, you must use Metal as your renderer. To enable it, go to File > Project Settings > General and change the renderer to Mercury Playback Engine GPU Acceleration (Metal) — OpenCL will only use the internal GPU for processing.

Premiere did not handle the high-resolution media as aptly as Resolve had, but it did help a little. However, I really wanted to test the export power with the added eGPU horsepower. I almost always send my Premiere sequences to Adobe Media Encoder to do the processing, so that is where my exports were processed.

Adobe Media Encoder
Test 1: H.264 (no render files used during export; 23.98fps/UHD, 80Mb/s — software encoding doesn’t allow for profile setup)
a. OpenCL with no eGPU: about 140 minutes (sorry, I had to chase the kids around and couldn’t watch this snail crawl)
b. Metal with no eGPU: about 137 minutes (chased the kids around again and couldn’t watch this snail crawl, either)
c. OpenCL with eGPU: won’t work — Metal only
d. Metal with eGPU: one hour

Test 2: H.265
a. Without eGPU: failed (interesting result)
b. With eGPU: 40 minutes

Test 3: ProRes 4444
a. Without eGPU: three hours
b. With eGPU: one hour and 14 minutes

FCP X
FCP X is an interesting editing app, and it is blazing fast at handling ProRes media. As I mentioned earlier, it hasn’t been in my world much, but that isn’t because I don’t like it; it’s because professionally I haven’t run into it. I love the idea of roles and would really love to see that play out in other NLEs. However, my results speak for themselves.

One caveat to using the eGPU in FCP X is that you must force it to work inside of the NLE. At first, I couldn’t get it to work — the Activity Monitor would show no activity on the eGPU. However, thanks to a Twitter post, James Wells (@9voltDC) sent me to this, which allows you to force FCP X to use the eGPU. It took a few tries, but I did get it to work, and funny enough, I saw times when all three GPUs were being used inside of FCP X, which was pretty great to see. This is one of those use-at-your-own-risk things, but it worked for me and is pretty slick… if you are OK with using Terminal commands. This also allows you to force the eGPU onto other apps like Cinebench.
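To give you a flavor of what’s involved, here is a minimal sketch of the general Terminal approach — not the exact script I was pointed to, so treat the preference key and bundle ID here as my assumptions rather than gospel. macOS exposes a GPUSelectionPolicy preference; setting it to preferRemovable asks an app to favor a removable (external) GPU:

# Hypothetical sketch: ask macOS to prefer the eGPU for FCP X.
# com.apple.FinalCut is FCP X's bundle ID; "preferRemovable" hints that
# the app should do its GPU work on a removable (external) GPU.
defaults write com.apple.FinalCut GPUSelectionPolicy -string "preferRemovable"

# Relaunch FCP X, then watch Activity Monitor's GPU history to confirm
# the eGPU is actually doing the work. To undo the change:
defaults delete com.apple.FinalCut GPUSelectionPolicy

(For what it’s worth, macOS Mojave also adds a “Prefer External GPU” checkbox in an app’s Get Info window that accomplishes much the same thing without the Terminal.)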

Anyway, here are my results with the BMD eGPU exporting from FCP X:

Test 1: H.264
a. Without eGPU: eight minutes
b. With eGPU: eight minutes and 30 seconds

Test 2: H.265 — not an option

Test 3: ProRes 4444
a. Without eGPU: nine minutes
b. With eGPU: six minutes and 30 seconds

Summing Up
In the end, the Blackmagic eGPU with Radeon Pro 580 is a must-buy if you use your MacBook Pro with Resolve 15. There are other options out there, though, like the Razer Core V2 or the Akitio Node Pro.

From this review, I can tell you that the Blackmagic eGPU is silent even when processing 8K Red RAW footage (and even when the MacBook Pro fans are going at full speed), and it just works. Plug it in and you are running — no settings, no drivers, no cards to install… it just runs. And sometimes, when I have three little boys running around my house, I just want that peace of mind; I want things to just work, like the Blackmagic eGPU does.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.


A Conversation: 3P Studio founder Haley Stibbard

Australia’s 3P Studio is a post house founded and led by artisan Haley Stibbard. The company’s portfolio includes commercials for brands such as Subway, Allianz and Isuzu Motor Company, as well as work on iconic shows like Sesame Street. Stibbard’s path to opening her own post house was born of necessity.

After going on maternity leave to have her first child in 2013, she returned to her job at a content studio to find that her role had been made redundant, and she was subsequently let go. Needing and wanting to work, she began freelancing as an editor — working seven days a week and never turning down a job. Eventually, she realized she couldn’t keep up that kind of schedule and took her fate into her own hands, launching 3P Studio, one of Brisbane’s few women-led post facilities.

We reached out to Stibbard to ask about her love of post and her path to 3P Studio.

What made you want to get into post production? School?
I had a strong love of film, which I got from my late dad, Ray. He was a big film buff and would always come home from work when I was a kid with a shopping bag full of $2 movies from the video store and he would watch them. He particularly liked the crime stories and thrillers! So I definitely got my love of film and television from him.

We didn’t have any film courses at high school in the ’90s, so the closest I could get was photography. Without a show reel it was hard to get a place at university in the college of art; a portfolio was a requirement and I didn’t have one. I remember I had to talk my way into the film program, and in the end I think they just got sick of me and let me into the course through the back door without a show reel — I can be very persistent when I want to be. I had always enjoyed editing and I was good at it, so in group tasks I was always chosen as the editor, and my love of post grew from there.

What was your first job?
My very first job was quite funny, actually. I was working in both a shoe store and a supermarket at the time, when two post positions became available one day: an in-house editor role at a big furniture chain and a production assistant job at a large VFX company at Movie World on the Gold Coast. Anyone who knows me knows that I would be the worst PA in the world. So, luckily for that company’s director, I didn’t get the PA job and became the in-house editor for the furniture chain.

I’m glad that I took that job, as it taught me so much: how to work under pressure, how to use an Avid, how to work with deadlines, what a key number was, how to dispatch TVCs to the stations, how to be quick and accurate, and how to take constructive feedback.

I made every mistake known to man, including one weekend when I forgot to remove the 4×3 safe bars from a TVC and my boss saw it on TV. I ended up having to drive to the office, climb the locked fence to get in and pull the spot off air. I’ve learned a lot of things the hard way, but my boss was a very patient and forgiving man, and 18 years later he is now a client of mine!

What job did you hold when you went out on maternity leave?
Before I left on maternity leave to have my son Dashiell, I was an editor for a small content company. I have always been a jack-of-all-trades and I took care of everything from offline to online, grading in Resolve, motion graphics in After Effects and general design. I loved my job and I loved the variety that it brought. Doing something different every day was very enjoyable.

After leaving that job, you started freelancing as an editor. What systems did you edit on at the time and what types of projects? How difficult a time was that for you? New baby, working all the time, etc.
I started freelancing when my son was just past seven months old. I had a mortgage and had just come off six months of unpaid maternity leave, so I needed to make a living and I needed to make it quickly. I also had the added pressure of looking after a young child under the age of one who still needed his mother.

So I started contacting advertising agencies and production companies that I thought might be interested in my skill set. I took every job I could get my hands on, as I was always worried that each job could potentially be my last for a while. I was lucky that I had an incredibly well-behaved baby! I never said “no” to a job.

As my client base started to grow, my clients would always book me because they knew I would never say “no” (they know I still don’t!). It got to the point where I was working seven days a week — all day when my son was in childcare and all night after he went to bed. I would take the baby monitor downstairs, where I worked out of my husband’s “man den.”

As my freelance business grew, I was so lucky to have the most supportive husband in the world, who was doing everything for me — the washing, the cleaning, the cooking, bath time — as well as holding down his own full-time job as an engineer. I wouldn’t have been able to do what I did for that period of time without his support and encouragement. This time really proved to be a huge stepping stone for 3P Studio.

Do you remember the moment you decided you would start your own business?
There wasn’t really a specific moment when I decided to start my own business. It was something that seemed to just naturally come together: the busier I became, the more opportunities came about, like having enough work coming through the door to build a space and hire staff. I have always been very strategic about the people I have brought on at 3P and the timing of when they come on board.

Can you walk us through that bear of a process?
At the start of 2016, I made the decision to get out of the house; my work life was starting to blend into my home life, and I needed that separation. I worked out of a small office for 12 months, and about six months in, I was able to purchase the office space that would become our studio today.

I then spent the next six months planning the fit-out. The studio was an investment in the business, and I needed a place where my clients could bring their own clients for approvals, screenings and collaboration on jobs, as well as just generally enjoying the space.

The office space was an empty white shell, but the beauty of coming into a blank canvas was that I was able to create a studio built specifically for post production. I was lucky in that I had worked in some of the best post houses in the country as an editor, and because this was a custom build, I was able to take the best bits of all the places I had previously worked and put them into my studio without the restriction of existing walls.

I built up the walls, ripped down the ceilings and designed the edit suites and infrastructure, right down to planning and laying the cable runs myself so they would still work for us down the line. Then we saved money and added more equipment to the studio bit by bit. It wasn’t 0 to 100 overnight; I had to work at the business development side of the company a lot, and I spent many long days sitting by myself in those edit suites doing everything. Soon, word of mouth started to circulate, and the business began to grow on the back of some nice jobs from my existing loyal clients.

What type of work do you do, and what gear do you call on?
3P Studio is a boutique studio that specializes in full-service post production; we also shoot content when required.

Our clients range anywhere from small content videos for the web all the way up to large commercial campaigns and everything in between.

There are currently six of us working full time in the studio, and we handle everything in-house, from offline editing to VFX to videography and sound design. We work primarily in the Adobe Creative Suite, doing offline editing in Premiere, mixed with Maxon Cinema 4D and Autodesk Maya for 3D work, Autodesk Flame and SideFX Houdini for online compositing and VFX, Blackmagic Resolve for color grading and Pro Tools HD for sound mixing. We use EditShare EFS shared storage nodes for collaborative working and sharing content between that mix of creative platforms.

This year we invested in a Red Digital Cinema camera as well as an EditShare XStream 200 EFS scale-out single-node server so we can be that one-stop shop for our clients. We have been able to create an amazing creative space for our clients to come and work with us, whether that’s the bespoke design of our editorial suites or the high level of client service we offer.

How did you build 3P Studios to be different from other studios you’ve worked at?
From a personal perspective, the culture we have built in the studio is unlike anywhere else I have worked, in that we genuinely work as a team and support each other. On the business side, we cater to clients of all sizes and budgets while offering uncompromising service and experience, whether they are large or small. Making sure they walk away feeling they got great value and exemplary service for their budget means they will end up being customers of ours for life. That is the mantra I have grown the business on.

What is your hiring process like, and how do you protect employees who need to go out on maternity or family leave?
When I interview people to join 3P, attitude and willingness to learn are everything to me — hands down. You can be the most amazing operator on the planet, but if your attitude stinks, then I’m really not interested. I’ve been incredibly lucky with the team that I have; I met them along the journey at exactly the right times. We have an amazing team culture, and as the company grows, our success is shared.

I always make it clear that it’s swings and roundabouts and that family is always number one. I am there to support my team if they need me to be, not just inside of work but outside as well, and I receive the same support in return. We have flexible working hours, and I have team members with young families who, at times, work both in the studio and from home so they can be there for their kids when they need to be. This flexibility works fine for us. Happy team members make for a happy, productive workplace, and I like to think 3P is forward-thinking in that respect.

Any tips for young women either breaking into the industry or in it that want to start a family but are scared it could cost them their job?
Well, for starters, we have laws in Australia that make it illegal for any woman in this country to be discriminated against for starting a family. 3P also supports the 18 weeks of paid maternity leave available to women heading out to start a family. I would love to see more women in post production, especially in operator roles. We aren’t just going to be the coffee and tea girls; we are directors, VFX artists, sound designers, editors and cinematographers — the future is female!

Any tips for anyone starting a new business?
Work hard, be nice to people and stay humble because you’re only as good as your last job.

Main Image: Haley Stibbard (second from left) with her team.

Veteran editor Antonio Gómez-Pan joins Therapy Studios

Los Angeles-based post house Therapy Studios has added editor Antonio Gómez-Pan to its team. Born in Madrid, and currently splitting his time between his hometown of Barcelona and LA, Gómez-Pan earned a Bachelor of Arts in film editing at cinema school ESCAC.

He says his journey to editing was “sort of a Darwinian process” after he burnt his hands on some Fresnel lights and “discovered the beauty of film editing.” While still in school, he edited Mi Amigo Invisible (2010), which premiered at the Sundance Film Festival, and Elefante (2012), which won the Best Short Film Award at the LA Film Festival and the Sitges Film Festival, along with many others.

Gómez-Pan’s feature work includes Puzzled Love, Hooked Up and Othello, which won Best European Independent Film at ÉCU 2013. On the advertising side, he has worked with global brands like Adidas, Coca-Cola, Chanel, Unicef, Volkswagen, Nike, Ikea, Toyota and many more. Recently, he was appointed an Academic by the Spanish Motion Picture Arts & Sciences Academy, on top of winning the Gold Medal for Best Editing in Berlin.

When asked what his favorite format is, Gómez-Pan couldn’t choose, saying, “I love commercials because of their immediacy and the need to be able to synthesize, but feature films can be more personal and narratively engaging. Music videos are where you are freer to experiment and the editor’s hand is more visible. Documentaries are so rewarding because they’re created in the editing room more than any other genre. I really cannot choose among them.” His enthusiasm for working across the scale is part of why he was drawn to Therapy, where he says, “They do everything, from broadcast campaigns to long-format shows like HBO’s Sonic Highways.”

Gómez-Pan joins Therapy’s existing roster of editors, which includes Doobie White, Kristin McCasey, Lenny Mesina, Meg Ramsay, Steve Prestemon and Jake Shaver. Gómez-Pan says, “Editorial houses don’t exist in Spain, so we are also the ones dealing with the salary, the schedule and all other non-creative parts of the process. That puts you in a tricky position even before you sit down in the editing suite. The role is incredibly rewarding and the editor is held in high esteem, but already I’ve found that we’re much more protected and respected here in the States.”