
Category Archives: Editing

Posting Michael Jordan’s The Last Dance — before and during lockdown

By Craig Ellenport

One thing viewers learned from watching The Last Dance — ESPN’s 10-part documentary series about Michael Jordan and the Chicago Bulls — is that Jordan might be the most competitive person on the planet. Even the slightest challenge led him to raise his game to new heights.

Photo by Andrew D. Bernstein/NBAE via Getty Images

Jordan’s competitive nature may have rubbed off on Sim NY, the post facility that worked on the docuseries. Since they were only able to post the first three of the 10 episodes at Sim before the COVID-19 shutdown, the post house had to manage a work-from-home plan in addition to dealing with an accelerated timeline that pushed up the deadline a full two months.

The Last Dance, which chronicles Jordan’s rise to superstardom and the Bulls’ six NBA title runs in the 1990s, was originally set to air on ESPN after this year’s NBA Finals ended in June. With the sports world starved for content during the pandemic, ESPN made the decision to begin the show on April 19 — airing two episodes a night on five consecutive Sunday nights.

Sim’s New York facility offers edit rooms, edit systems and finishing services. Projects that rent these rooms will then rely on Sim’s artists for color correction and sound editing, ADR and mixing. Sim was involved with The Last Dance for two years, with ESPN’s editors working on Avid Media Composer systems at Sim.

When it became known that the 1997-98 season was going to be Jordan’s last, the NBA gave a film crew unprecedented access to the team. They compiled 500 hours of 16mm film from the ‘97-’98 season, which was scanned at 2K for mastering. The Last Dance used a combination of the rescanned 16mm footage, other archival footage and interviews shot with Red and Sony cameras.
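To give a sense of the data volume behind an archive like that, here is a rough back-of-the-envelope sketch. The frame size and frame rate are assumptions for illustration (10-bit 2K DPX at 24fps); the article does not state the actual scan format or file sizes.

```python
# Back-of-the-envelope estimate of the data volume behind a 2K film scan.
# All figures below are assumptions, not production specs: a 10-bit 2K DPX
# frame is roughly 12.7 MB, and the film is assumed to run at 24 fps.

BYTES_PER_FRAME = 2048 * 1556 * 4   # 10-bit RGB packed into 32 bits/pixel
FPS = 24
HOURS = 500                          # the article's 500 hours of 16mm footage

total_bytes = HOURS * 3600 * FPS * BYTES_PER_FRAME
print(f"~{total_bytes / 1e12:.0f} TB for {HOURS} hours of 2K scans")
```

Even with conservative assumptions, the scanned archive lands in the hundreds of terabytes, which helps explain why the conform and color pipeline stayed anchored to machines at the facility.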

Photo by Andrew D. Bernstein/NBAE via Getty Images

“The primary challenge posed in working with different video formats is conforming the older standard definition picture to the high definition 16:9 frame,” says editor Chad Beck. “The mixing of formats required us to resize and reposition the older footage so that it fit the frame in the ideal composition.”

One of the issues with positioning the archival game footage was making sure that viewers could focus when shifting their attention between the ball and the score graphics.

“While cutting the scenes, we would carefully play through each piece of standard definition game action to find the ideal frame composition. We would find the best position to crop broadcast game graphics, recreate our own game graphics in creative ways, and occasionally create motion effects within the frame to make sure the audience was catching all the details and flow of the play,” says Beck. “We discovered that tracking the position of the backboard and keeping it as consistent as possible became important to ensuring the audience was able to quickly orient themselves with all the fast-moving game footage.”

From a color standpoint, the trick was taking all that footage, which was shot over a span of decades, and creating a cohesive look.

Rob Sciarratta

“One of the main goals was to create a filmic, dramatic natural look that would blend well with all the various sources,” says Sim colorist Rob Sciarratta, who worked with Blackmagic DaVinci Resolve 15. “We went with a rich, slightly warm feeling. One of the more challenging events in color correction was blending the archival work into the interview and film scans. The older video footage tended to have various quality resolutions and would often have very little black detail existing from all the transcoding throughout the years. We would add a filmic texture and soften the blacks so it would blend into the 16mm film scans and interviews seamlessly. … We wanted everything to feel cohesive and flow so the viewer could immerse themselves in the story and characters.”

On the sound side, senior re-recording mixer/supervising sound editor Keith Hodne used Avid Pro Tools. “The challenge was to create a seamless woven sonic landscape from 100-plus interviews and locations, 500 hours of unseen raw behind-the-scenes footage, classic hip-hop tracks, beautifully scored instrumentation and crowd effects, along with the prerecorded live broadcasts,” he says. “Director Jason Hehir and I wanted to create a cinematic blanket of a basketball game wrapped around those broadcasts. What it sounds like to be at the basketball game, feel the game, feel the crowd — the suspense. To feel the weight of the action — not just what it sounds like to watch the game on TV. We tried to capture nostalgia.”

When ESPN made the call to air the first two episodes on April 19, Sim’s crew still had the final seven episodes to finish while dealing with a work-from-home environment. Expectations were only heightened after the first two episodes of The Last Dance averaged more than 6 million viewers. Sim was now charged with finishing what would become the most watched sports documentary in ESPN’s history — and they had to do this during a pandemic.

Stacy Chaet

When the shutdown began in mid-March, Sim’s staff needed to figure out the best way to finish the project remotely.

“I feel like we started the discussions of possible work from home before we knew it was pushed up,” says Stacy Chaet, Sim’s supervising workflow producer. “That’s when our engineering team and I started testing different hardware and software and figuring out what we thought would be the best for the colorist, what’s the best for the online team, what’s the best for the audio team.”

Sim ended up using Teradici to get Sciarratta connected to a machine at the facility. “Teradici has become a widely used solution for remote at home work,” says Chaet. “We were easily able to acquire and install it.”

A Sony X300 monitor was hand-delivered to Sciarratta’s apartment in lower Manhattan and connected to his machine at Sim through an Evercast stream. Sim also shipped him other computer monitors, a Mac mini and Resolve panels. Sciarratta’s living room became a makeshift color bay.

“It was during work on the promos that Jason and Rob started working together, and they locked in pretty quickly,” says David Feldman, Sim’s senior VP, film and television, East Coast. “Jason knows what he wants, and Rob was able to quickly show him a few color looks to give him options.

David Feldman

“So when Sim transitioned to a remote workflow, Sciarratta was already in sync with what the director, Jason Hehir, was looking for. Rob graded each of the remaining seven episodes from his apartment on his X300 unsupervised. Sim then created watermarked QTs with final color and audio. Rob reviewed each QT to make sure his grade translated perfectly when reviewed on Jason’s Retina display MacBook. At that point, Sim provided the director and editorial team access for final review.”

The biggest remote challenge, according to producer Matt Maxson, was that the rest of the team couldn’t see Sciarratta’s work on the X300 monitor.

“You moved from a facility with incredible 4K grading monitors and scopes to the more casual consumer-style monitors we all worked with at home,” says Maxson. “In a way, it provided a benefit because you were watching it the way millions of people were going to experience it. The challenge was matching everyone’s experience — Jason’s, Rob’s and our editors’ — to make sure they were all seeing the same thing.”

Keith Hodne

For his part, Hodne had enough gear in his house in Bay Ridge, Brooklyn. At Sim he mixes in Pro Tools on Mac Pro computers; at home he had to work with a pared-down version of that setup in his home studio. It was a challenge, but he got the job done.

Hodne says he actually had more back-and-forth with Hehir on the final episode than any of the previous nine. They wanted to capture Jordan’s moments of reflection.

“This episode contains wildly loud, intense crowd and music moments, but we counterbalance those with haunting quiet,” says Hodne. “We were trying to achieve what it feels like to be a global superstar with all eyes on Jordan, all expectations on Jordan. Just moments on the clock to write history. The buildup of that final play. What does that feel and sound like? Throughout the episode, we stress that one of his main strengths is the ability to be present. Jason and I made a conscious decision to strip all sound out to create the feeling of being present and in the moment. As someone whose main job it is to add sound, sometimes there is more power in having the restraint to pull back on sound.”

ESPN Films/Netflix/Mandalay Sports Media/NBA Entertainment

Even when they were working remotely, the creatives were able to communicate in real time via phone, text or Zoom sessions. Still, as Chaet points out, “you’re not getting the body language from that kind of remote feedback.”

From a remote post production technology standpoint, Chaet and Feldman both say one of the biggest challenges the industry faces is sufficient and consistent Internet bandwidth. Residential ISPs often do not guarantee speeds needed for flawless functionality. “We were able to get ahead of the situation and put systems in place that made things just as smooth as they could be,” says Chaet. “Some things may have taken a bit longer due to the remote situation, but it all got done.”

One thing they didn’t have to worry about was their team’s dedication to the project.

“Whatever challenges we faced after the shutdown, we benefitted from having lived together at the facility for so long,” says Feldman. “There was this trust that, somehow, we were going to figure out a way to get it done.”


Craig Ellenport is a veteran sports writer who also covers the world of post production. 

Adobe CC updates include ProRes Raw support for Premiere and AE

Adobe updated the tools within its Creative Cloud offerings, including for Premiere Pro, After Effects, Audition, Character Animator, Media Encoder and Premiere Rush. The updates are available now and offer support for Apple ProRes Raw, new creative tools in After Effects, workflow refinements in Character Animator and performance improvements, such as faster Auto Reframe in Premiere Pro.

Here are some details of the updates:

– ProRes Raw support in Premiere Pro and After Effects provides a cross-platform solution for Apple ProRes workflows from camera media through to delivery.
– More streamlined graphics workflows in Premiere Pro include an improved pen tool with better support for Bezier curves, enabling greater precision for creating lines and shapes. Effects filtering shows only attributes that have keyframes or adjusted parameters so you can focus on the currently active effects.
– Auto Reframe in Premiere is now faster. Powered by Adobe Sensei, the company’s artificial intelligence (AI) and machine learning technology, Auto Reframe automatically reformats and repositions video within different aspect ratios, such as square and vertical video, accelerating workflows for social media and content platforms, such as Quibi.
– Hardware encoding on Windows for H.264 and H.265 (HEVC) is available for Nvidia and AMD GPUs, providing consistently faster exports for these widely used formats with Premiere and Media Encoder.
– Support for audio files in Creative Cloud Libraries enables Premiere users to save, organize and share frequently used audio assets for easy access right from the CC Libraries panel.
– Tapered Shape Strokes in After Effects give motion graphics artists new creative options for animation and design. They can make tapered, wavy, pointed or rounded strokes on shape layers and animate the strokes for stylized looks and motion designs.
– Concentric Shape Repeater in After Effects offers new parameters in the Offset Paths shape effect to create copies of a path that radiate outward or inward for funky designs with a cool retro vibe.
– Mask and Shape Cursor Indicators in After Effects show which tool someone is using and help avoid unnecessary un-dos when drawing shapes and masks.
– Improvements to Audio Triggers and Timeline filtering in Character Animator increase efficiency in animation workflows. A new collection of background puppets let users trigger animated elements within a scene behind their character.
– Automatic audio hardware switching is now available on macOS for After Effects, Media Encoder, Audition, Character Animator, Prelude, Premiere and Premiere Rush. When changing audio devices or simply plugging in headphones, the OS recognizes the hardware, and the Adobe application automatically switches to the current hardware.
– Premiere Rush users can now automatically resize projects to the 4:5 aspect ratio to match formats for Facebook and Instagram videos. Also, back camera switching on an iOS device (requires iOS 13 and a current iPhone) enables capture within Premiere Rush from the selected back camera (ultra-wide, wide or telephoto).
– Lastly, users can now import media from the Files app directly from the Premiere Rush media browser on iOS devices, simplifying access to files stored on device or different cloud services.


AMD’s new Radeon Pro VII graphics card for 8K workflows

AMD has introduced the AMD Radeon Pro VII workstation graphics card designed for those working in broadcast and media in addition to CAE and HPC applications. According to AMD, the Radeon Pro VII graphics card offers 16GB of extreme speed HBM2 (high bandwidth memory) and support for six synchronized displays and high-bandwidth PCIe 4.0 interconnect technology.

AMD says the new card considerably speeds up 8K image processing performance in Blackmagic’s DaVinci Resolve in addition to performance speed updates in Adobe’s After Effects and Photoshop and Foundry’s Nuke.

The AMD Radeon Pro VII introduces AMD Infinity Fabric Link technology to the workstation market, which speeds application data throughput by enabling high-speed GPU-to-GPU communications in multi-GPU system configurations. The new workstation graphics card provides the high performance and advanced features that enable post teams and broadcasters to visualize, review and interact with 8K content.

The AMD Radeon Pro VII graphics card is expected to be available beginning mid-June for $1,899. AMD Radeon Pro VII-equipped workstations are expected to be available in the second half of 2020 from OEM partners.

Key features include:
– 16GB of HBM2 with 1TB/s memory bandwidth and full ECC capability to handle large and complex models and datasets smoothly with low latency.
– A high-bandwidth, low-latency connection that allows memory sharing between two AMD Radeon Pro VII GPUs, enabling users to increase project workload size and scale, develop more complex designs and run larger simulations to drive scientific discovery. AMD Infinity Fabric Link delivers up to 5.25x PCIe 3.0 x16 bandwidth with a communication speed of up to 168GB/s peer-to-peer between GPUs.
– Users can access their physical workstation from virtually anywhere with the remote workstation IP built into AMD Radeon Pro Software for Enterprise driver.
– PCIe 4.0 delivers double the bandwidth of PCIe 3.0 to enable smooth performance for 8K, multichannel image interaction.
– Enables precise synchronized output for display walls, digital signage and other visual displays (AMD FirePro S400 synchronization module required).
– Supports up to six synchronized display panels, full HDR and 8K screen resolution (single display) combined with ultra-fast encode and decode support for enhanced multi-stream workflows.
– Optimized and certified with pro applications for stability and reliability. The list of Radeon Pro Software-certified ISV applications can be found here.
– ROCm open ecosystem, an open software platform for accelerated compute, provides an easy GPU programming model with support for OpenMP, HIP and OpenCL and for ML and HPC frameworks.
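The “up to 5.25x PCIe 3.0 x16 bandwidth” claim above can be sanity-checked with quick arithmetic. This sketch assumes AMD’s baseline is the commonly quoted ~32GB/s bidirectional figure for a PCIe 3.0 x16 link; AMD does not spell out its baseline in this announcement.

```python
# Sanity-check of AMD's "5.25x PCIe 3.0 x16" claim for Infinity Fabric Link.
# Assumption: the comparison baseline is PCIe 3.0 x16 counted bidirectionally
# and rounded to 32 GB/s.

GT_PER_SEC = 8            # PCIe 3.0 line rate: 8 GT/s per lane
ENCODING = 128 / 130      # 128b/130b line encoding overhead
LANES = 16

per_direction_gbs = GT_PER_SEC * ENCODING * LANES / 8   # GB/s, one direction
bidirectional_gbs = 2 * per_direction_gbs               # ~31.5 GB/s total

print(f"PCIe 3.0 x16: ~{bidirectional_gbs:.1f} GB/s bidirectional")
print(f"32 GB/s x 5.25 = {32 * 5.25:.0f} GB/s")  # matches the 168 GB/s figure
```

The same arithmetic explains the PCIe 4.0 bullet: doubling the line rate to 16 GT/s doubles every figure above.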

AMD Radeon Pro workstation graphics cards are supported by the Radeon Pro Software for Enterprise driver, offering enterprise-grade stability, performance, security, image quality and other features, including high-resolution screen capture, recording and video streaming. The company says the latest release offers up to a 14 percent year-over-year performance improvement for current-generation AMD Radeon Pro graphics cards. The new software driver is now available for download from AMD.com.

AMD also released updates for AMD Radeon ProRender, a physically-based rendering engine built on industry standards that enables accelerated rendering on any GPU, any CPU and any OS. The updates include new plugins for Side Effects Houdini and Unreal Engine and updated plugins for Autodesk Maya and Blender.

For developers, an updated AMD Radeon ProRender SDK is now available on the redesigned GPUopen.com site and is easier to implement under an Apache 2.0 license. AMD also released a beta SDK of the next-generation Radeon ProRender 2.0 rendering engine with enhanced CPU and GPU rendering support with open-source versions of the plugins.


Production begins again on New Zealand’s Shortland Street series

By Katie Hinsen

The current global pandemic has shut down production all over the world. Those who can have moved to working from home, and there’s speculation about how and when we’ll get back to work again.

New Zealand, a country with a significant production economy, has announced that it will soon reopen for shoots. The most popular local television show, Shortland Street, was the first to resume production after an almost six-week break. It’s produced by Auckland’s South Pacific Pictures.

Dylan Reeve

I am a native New Zealander who has worked in post there on and off over the years. Currently I live in Los Angeles, where I am an EP for dailies and DI at Nice Shoes, so taking a look at how New Zealand is rolling things out interests me. With that in mind, I reached out to Dylan Reeve, head of post production at Shortland Street, to find out how it looked the week they went back to work under Level 3 social distancing restrictions.

Shortland Street is a half-hour soap that runs five nights a week on prime-time television. It has been on air for around 28 years and has been consistently among the highest-rated shows in the nation. It’s a cultural phenomenon. While the cast and crew take a single three-week annual break from production during the Christmas holiday season, the show has never really stopped production … until the pandemic hit.

Shortland Street’s production crew is typically made up of about 100 people; the post department consists of two editors, two assistants, a composer and Reeve, who is also the online editor. Sound mixes and complex VFX are done elsewhere, but everything else for the production is done at the studio.

New Zealand responded to COVID-19 early, instituting one of the harshest lockdowns in the world. Reeve told me that they went from alert Level 1 — basic social distancing, more frequent handwashing — to Level 3 as soon as the first signs of community transmission were detected. They stayed at this level for just two days before going to Level 4: complete lockdown. New Zealanders had 48 hours to get home to their families, shop for supplies and make sure they were ready.

“On a Monday afternoon at about 1:30pm, the studio emptied out,” explains Reeve. “We were shut down, but we were still on air, and we had about five or six weeks’ worth of episodes in various stages of production and post. I then had two days to figure out and prepare for how we were going to finish all of those and make sure they got delivered so that the show could continue to be on air.”

Shortland Street’s main production building dressed as the exterior of the hospital where the show is set, with COVID workplace safety materials on the doors.

The nature of the show’s existing workflow meant that Reeve had to copy all the media to drives and send Avids and drives home with the editors. The assistant editors logged in remotely for any work they needed to do, and Reeve took what he needed home as well to finish onlining, prepping and delivering those already-shot episodes to the broadcaster. They used Frame.io for review and approval with the audio team and with the directors, producers and network.

“Once we knew we were coming back into Level 3, and the government put out more refined guidelines about what that required, we had a number of HoD meetings — figuring out how we could produce the show while maintaining the restrictions necessary.”

I asked Reeve whether he and his crew felt safe going back to work. He reminded me that New Zealand only went back down to Level 3 once there had been a period with no remaining evidence of community transmission. Infection rates in New Zealand had spent two weeks in single digits, including two days when no new cases had been reported.

Starting Up With Restrictions
My conversation with Reeve took place on May 4, right after his first few days back at work. I asked him to explain some of the conditions under which the production was working while the rest of the country was still in isolation. Level 3 in New Zealand is almost identical to the lockdown restrictions put in place in US cities like New York and Los Angeles.

“One of the key things that has changed in terms of how we’re producing the show is that we physically have way less crew in the building. We’re working slower, and everyone’s having to do a bit more, maybe, than they would normally.

Shortland Street director Ian Hughes and camera operator Connagh Heath discussing blocking with a one-metre guide.

“When crew are in a controlled workspace where we know who everyone is,” he continues, “that allows us to keep track of them properly — they’re allowed to work within a meter of one another physically (three feet). Our policy is that we want staff to stay two meters (six feet) apart from one another as much as possible. But when we’re shooting, when it’s necessary, they can be a meter from one another.”

Reeve says the virus has certainly changed the nature of what can be shot. There are no love scenes, no kissing and no hugs. “We’re shooting to compensate for that; staging people to make them seem closer than they are.

“Additionally, everything stays within the production environment. Parts of our office have been dressed; parts of our building have been dressed. We’ll do a very low-profile exterior shoot for scenes that take place outside, but we’re not leaving the lot.”

Under Level 3, everyone is still under isolation at home. This is why, explains Reeve, social distancing has to continue at work. That way any infection that comes into the team can be easily traced and contained and affect as few others as possible. Every department maintains what they call a “bubble,” and very few individuals are allowed to cross between them.

Actors are doing their own hair and makeup, and there are no kitchen or craft services available. The production is using and reusing a small number of regular extras, with crew stepping in occasionally as well. Reeve noted that Australia was also resuming production on Neighbours, with crew members acting as extras.

“Right now in our studio, our full technical complement consists of three camera operators at the moment, just one boom operator and one multi-skilled person who can be the camera assist, the lighting assist and the second boom op if necessary. I don’t know how a US production would get away with that. There’s no chance that someone who touches lights on a union production can also touch a boom.”

Post Production
Shortland Street’s post department is still working from home. Now that they are back in production, they are starting to look at more efficient ways to work remotely. While there are a lot of great tools out there for remote post workflows, Reeve notes that for them it’s not that easy, especially when hardware and support are halfway across the world, borders are closed and supply chains are disrupted.

There are collaboration tools that exist, but they haven’t been used “simply because the pace and volume of our production means it’s often hard to adapt for those kinds of products,” he says. “Every time we roll camera, we’re rolling four streams of DNxHD 185, so nearly 800Mb/s each time we roll. We record that media directly into the server to be edited within hours, so putting that in the cloud or doing anything like that was never the best workflow solution. When we wanted feedback, we just grabbed people from the building and dragged them into the edit suite when we wanted them to look at something.”
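Reeve’s “nearly 800Mb/s” figure follows directly from the codec math. A quick sketch; the storage-per-hour line is an extra illustration, not a figure from the article:

```python
# Bandwidth math behind "four streams of DNxHD 185, so nearly 800Mb/s".
STREAMS = 4
DNXHD_MBPS = 185          # DNxHD 185: ~185 megabits/s per stream

total_mbps = STREAMS * DNXHD_MBPS            # 740 Mb/s, "nearly 800" with overhead
gb_per_hour = total_mbps / 8 * 3600 / 1000   # ~333 GB of media per hour of rolling

print(f"{total_mbps} Mb/s total; ~{gb_per_hour:.0f} GB per hour of record time")
```

At roughly a third of a terabyte per recorded hour, it is easy to see why Reeve records straight to the server rather than pushing camera media to the cloud.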

Ideally, he says, they would have tested and invested in these tools six months ago. “We are in what I call a duct tape stage. We’re taking things that exist, that look useful, and we’re trying to tape them together to make a solution that works for us. Coming out of this, I’m going to have to look at the things we’ve learned and the opportunities that exist and decide whether or not there might be some ways we can change our future production. But at the moment, we’re just trying to make it through.”

Because Shortland Street has only just resumed shooting, they haven’t reached the point yet where they need to do what Reeve calls “the first collaborative director/editor thing” from start to finish. “But there are two plans that we’re working toward. The easy, we-know-it-works plan is that we do an output, we stick it on Frame.io, the director watches it, puts notes on it, sends it back to us. We know that works, and we do that sometimes with directors anyway.

“The more exciting idea is that we have the directors join us on a remote link and watch the episodes as they would if they were in the room. We’ve experimented with a few things and haven’t found a solution that makes us super-happy. It’s tricky because we don’t have an existing hardware solution in place that’s designed specifically for streaming a broadcast output signal over an internet connection. We can do a screen-share, and we’ve experimented with Zoom and AnyDesk, but in both those cases, I’ve found that sometimes the picture will break up unacceptably, or sync will drift — especially using desktop-sharing software that’s not really designed to share full-screen video.”

Reeve and crew are just about to experiment with a tool used for gaming called Parsec. It’s designed to share low-latency, in-sync, high-frame-rate video. “This would allow us to share an entire desktop at, theoretically, 60fps with half-second latency or less. Very brief tests looked good. Plan A is to get the directors to join us on Parsec and screen-share a full-screen output off Avid. They can watch it down and discuss with the editor in real time or just make their own notes and work through it interactively. If that experience isn’t great, or if the directors aren’t enjoying it, or if it’s just not working for some reason, we’ll fall back to outputting a video, uploading it to Frame.io and waiting for notes.”

What’s Next?
What are the next steps for other productions returning to work? Shortland Street is the only production that chose to resume under Level 3. The New Zealand Film Commission has said that filming will resume eventually under Level 2, which is being rolled out in several stages beginning this week. Shortland Street’s production company has several other shows, but none have plans to resume yet.

“I think it’s a lot harder for them to stay contained because they can’t shoot everything in the studio,” explains Reeve. “Our production has an added advantage because it is constantly shooting and the core cast and crew are mostly the same every day. I think these types of productions will find it easiest to come back.”

Reeve says that anyone coming into their building has to sign in and deliver a health declaration — recent travel, contact with any sick person, other work they’ve been engaged in. “I think if you can do some of that reasonable contact tracing with the people in your production, it will be easier to start again. The more contained you can keep it, the better. It’s going to be hard for productions that are on location, have high turnover or a large number of extras — anything where they can’t keep within a bubble.

“From a post point of view, I think we’re going to get a lot more comfortable working remotely,” he continues. “And there are lots of editors who already do that, especially in New Zealand. If that can become the norm, and if there are tools and workflows that are well established to support that, it could be really good for post production. It offers a lot of great opportunities for people to broaden their client base, or the geographic regions in which they can work.”

Productions are going to have to make their own sort of health and safety liability decisions, according to Reeve. “All of the things we are doing are effectively responding to New Zealand government regulation, but that won’t be the case for everyone else.”

He sees some types of production finding an equilibrium. “Love Island might be the sort of reality show you can make. You can quarantine everyone going into that show for 14 days, make sure they’re all healthy, and then shoot the show because you’re basically isolated from the world. Survivor as well, things like that. But a reality show where people are running around the streets isn’t happening anymore. There’s no Amazing Race, that’s for sure.”


After a 20-year career talent-side, Katie Hinsen turned her attention to building, developing and running post facilities with a focus on talent, unique business structures and innovative use of technology. She has worked on over 90 major feature and episodic productions, founded the Blue Collar Post Collective, and currently leads the dailies & DI department at Nice Shoes.


Posting John Krasinski’s Some Good News

By Randi Altman

Need an escape from a world filled with coronavirus and murder hornets? You should try John Krasinski’s weekly YouTube show, Some Good News. It focuses on the good things that are happening during the COVID-19 crisis, giving people a reason to smile with things such as a virtual prom, Krasinski’s chat with astronauts on the ISS and bringing the original Broadway cast of Hamilton together for a Zoom singalong.

L-R: Remy, Olivier, Josh and Lila Senior

Josh Senior, owner of Leroi and Senior Post in Dumbo, New York, is providing editing and post to SGN. His involvement began when he got a call from a mutual friend of Krasinski’s, asking if he could help put something together. They sent him clips via Dropbox, and a workflow was born.

While the show is shot at Krasinski’s house in New York at different times during the week, Senior’s Fridays, Saturdays and Sundays are spent editing and posting SGN.

In addition to his post duties, Senior is an EP on the show, along with his producing partner Evan Wolf Buxbaum at their production company, Leroi. The two work in concert with Allyson Seeger and Alexa Ginsburg, who executive produced for Krasinski’s company, Sunday Night Productions. Production meetings are held on Tuesday, and then shooting begins. After footage is captured, it’s still shared via Dropbox or good old iMessage.

Let’s find out more…

What does John use for the shoot?
John films on two iPhones. A good portion of the show is screen-recorded on Zoom, and then there’s the found footage user-generated content component.

What’s your process once you get the footage? And, I’m assuming, it’s probably a little challenging getting footage from different kinds of cameras?
Yes. In the alternate reality where there’s no coronavirus, we run a pretty big post house in Dumbo, Brooklyn. And none of the tools of the trade that we have there are really at play here, outside of our server, which exists as the ever-present backend for all of our remote work.

The assets are pulled down from wherever they originate. The masters are then housed behind an encrypted firewall, like we do for all of our TV shows at the post house. Our online editor is the gatekeeper. All the editors, assistant editors, producers, animators, sound folks — they all get a mirrored drive that they download, locally, and we all get to work.
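Senior doesn't spell out the tooling behind those mirrored drives, but the idea (every remote artist works from a local copy verified against the firewalled masters) can be sketched with a simple checksum manifest. This is an illustrative sketch only, not Senior Post's actual pipeline; the function names and folder layout are hypothetical.

```python
import hashlib
from pathlib import Path


def build_manifest(root: Path) -> dict:
    """Map each file's path (relative to root) to its SHA-256 digest."""
    manifest = {}
    for path in sorted(root.rglob("*")):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            manifest[str(path.relative_to(root))] = digest
    return manifest


def verify_mirror(master: Path, mirror: Path) -> list:
    """Return the relative paths that are missing or altered on the mirror."""
    want = build_manifest(master)
    have = build_manifest(mirror)
    return [rel for rel, digest in want.items() if have.get(rel) != digest]
```

Run after each sync: an empty list means the editor's local drive matches the masters; anything returned gets re-downloaded.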

Do you have a style guide?
We have a bible, a living document that we've built week over week. It contains music cues, editing style, technique, structure, recurring themes, and a living archive of all the notes we've received and how we've addressed them. Any style specific to segments, post processing, or the phasing and audio adjustments we make also lives in that document, which we give to whoever we onboard to the show.

Evan Wolf Buxbaum

Our post producers made this really elegant workflow that’s a combination of Vimeo and Slack where we post project files and review links and share notes. There’s nothing formal about this show, and that’s really cool. I mean, at the same time, as we’re doing this, we’re rapidly finishing and delivering the second season of Ramy on Hulu. It comes out on May 29.

I bet that workflow is a bit different than SGN’s.
It’s like bouncing between two poles. That show has a hierarchy, it’s formalized, there’s a production company, there’s a network, there’s a lot of infrastructure. This show is created in a group text with a bunch of friends.

What are you using to edit and color Some Good News?
We edit in Adobe Premiere, and that helps mitigate some of the challenges of the mixed media that comes in. We typically color inside of Adobe, and we use Pro Tools for our sound mix. We online and deliver out of Resolve, which is pretty much how we work on most of our things. Some of our shows edit in Avid Media Composer, but on our own productions we almost always post in Premiere — so when we can control the full pipeline, we tend to prefer Adobe software.

Are review and approvals with John and the producers done through iMessage and Dropbox too?
Yes, and we post links on Vimeo. Thankfully we actually produce Some Good News as well as post it, so that intersection is really fluid. With Ramy it’s a bit more formalized. We do notes together and, usually internally, we get a cut that we like. Then it goes to John, and he gives us his thoughts and we retool the edit; it’s like a rapid prototyping rather than a gated milestone. There are no network cuts or anything like that.

Joanna Naugle

For me, what’s super-interesting is that everyone’s ideas are merited and validated. I feel like there’s nothing that you shouldn’t say because this show has no agenda outside of making people happy, and everybody’s uniquely qualified to speak to that. With other projects, there are people who have an experience advantage, a technical advantage or some established thought leadership. Everybody knows what makes people happy. So you can make the show, I can make the show, my mom can make the show, and because of that, everything’s almost implicitly right or wrong.

Let’s talk about specific episodes, like the ones featuring the prom and Hamilton. What were some of the challenges of working with all of that footage? Maybe start with Hamilton?
That one was a really fun puzzle. My partner at Senior Post, Joanna Naugle, edited that. She drew on a lot of her experience editing music videos, performance content, comedy specials, multicam live tapings. It was a lot like a multicam live pre-taped event being put together.

We all love Hamilton, so that helps. This was a combination of performers pre-taping the entire song and a live performance. The editing technique really dissolves into the background, but it’s clear that there’s an abundance of skill that’s been brought to that. For me, that piece is a great showcase of the aesthetic of the show, which is that it should feel homemade and lo-fi, but there’s this undercurrent of a feat to the way that it’s put together.

We had to get all of those people into the Zoom, get everyone to sound right, and have the ability to emphasize or de-emphasize different faces, restructuring the Zoom grid if we needed to, so that when there's more than one screen's worth of people, everybody is visible and audible. It took a few days, but the whole show is made from Thursday to Sunday, so that's a limiting factor, and it's also this great challenge. It's like a 48-hour film festival at a really high level.

What about the prom episode?
The prom episode was fantastic. We made the music performances the day before and preloaded them into the live player so that we could cut to them during the prom. Then we got to watch the prom. To be able to participate as an audience member in the content that you’re still creating is such a unique feeling and experience. The only agenda is happiness, and people need a prom, so there’s a service aspect of it, which feels really good.

John Krasinski setting up his shot.

Any challenges?
It’s hard to put things together that are flat, and I think one of the challenges that we found at the onset was that we weren’t getting multiple takes of anything, so we weren’t getting a lot of angles to play with. Things are coming in pretty baked from a production standpoint, so we’ve had to find unique and novel ways to be nonlinear when we want to emphasize and de-emphasize certain things. We want to present things in an expositional way, which is not that common. I couldn’t even tell you another thing that we’ve worked on that didn’t have any subjectivity to it.

Let’s talk sound. Is he just picking up audio from the iPhones or is he wearing a mic?
Nope. No mic. It's audio from the iPhones that we run through a few filters in Pro Tools. Nobody mics themselves. We do spend a lot of time balancing out the sound, but there's not a lot of effects work.

Other than SGN and Ramy, what are some other shows you guys have worked on?
John Mulaney & the Sack Lunch Bunch, 2 Dope Queens, Random Acts of Flyness, Julio Torres: My Favorite Shapes and others.

Anything that I haven’t asked that you think is important?
It’s really important for me to acknowledge that this is something that is enabling a New York-based production company and post house to work fully remotely. In doing this week over week, we’re really honing what we think are tangible practices that we can then turn around and evangelize out to the people that we want to work with in the future.

I don’t know when we’re going to get back to the post house, so being able to work on a show like this is providing this wonderful learning opportunity for my whole team to figure out what we can modulate from our workflow in the office to be a viable partner from home.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 


Video Chat: Posting Late Night With Seth Meyers from home

By Randi Altman

For many, late-night shows have been offering up laughs during a really tough time, with hosts continuing to shoot from dens, living rooms, backyards and country houses, often with spouses and kids pitching in as crew.

NBC’s Late Night With Seth Meyers is one of those shows. They had their last in-studio taping on March 13, followed by a scheduled hiatus week, followed by the news that they wouldn't be able to return to the studio. That's when his team started preproduction and workflow testing to figure out questions like "How are we going to transfer files?" and "How are we going to get it on the air?"

I recently interviewed associate director and lead editor Dan Dome about their process and how that workflow has been allowing Meyers to check in daily from his wasp-ridden and probably haunted attic.

(Watch our original Video Interview here or below.)

How are you making this remote production work?
We’re doing a combination of things. We are using our network laptops to edit footage that’s coming in for interviews or comedy pieces. That’s all being done locally, meaning on our home systems and without involving our SAN or anything like that. So we’re cutting interviews and comedy pieces and then sending links out for approval via Dropbox. Why Dropbox? The syncing features are really great when uploading and downloading footage to all the various places we need to send it.

Once a piece is approved and ready to go into the show — we know the timings are right, we know the graphics are right, we know the spelling is correct, audio levels look good, video levels look good — then we upload that back to Dropbox and back to our computers at 30 Rock where our offices are located. We’re virtually logging into our machines there to compile the show. So, yeah, there are a few bits and pieces to building stuff remotely. And then there are a few bits and pieces to actually compiling the show on our systems back at home base.

What do you use for editing?
We’re still on Adobe Premiere. We launched on Premiere when the show started in February of 2014, and we’re still using that version — it’s solid and stable, and doing a daily show, we don’t necessarily get a ton of time to test new versions. So we have a stable version that we like for doing the show composite aspect of things.

When we’re back at 30 Rock editing remote pieces, we're on Adobe Premiere Pro CC 2015.2 (9.2.0, Build 41). At home we are using Premiere Pro CC 2020 (14.0.4, Build 18).

Let's talk about how Seth's been shooting. What's his main camera?
Some of the home studio recording has been on iPads and iPhones. Then we’re using Zoom to do interviews, and there are multiple records of that happening. The files are then uploaded and downloaded between the edit team, and our director is in on the interviews, setting up cameras and trying to get it to look the best it can.

Once those interviews are done, the different records get uploaded to Dropbox. On my home computer, I use a 6TB CalDigit drive for Dropbox syncing and media storage. (Devon Schwab and Tony Dolezal, who are also editing pieces, use 4TB G-RAID drives with Thunderbolt 3.) So as soon as they tell me the file is up, I sync locally on the folder I know it's going to, the media automatically downloads, and we simultaneously download it to our systems at 30 Rock, so it syncs there as well. We have multiple copies of it, and if we need to hand off a project between me, Devon or Tony, we can do that pretty easily.
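Dome's routine ("they tell me the file is up, the media automatically downloads") hides one practical wrinkle: a Dropbox file that is still syncing looks like a real file on disk. A common safeguard, sketched here in Python as an assumption about how one might handle it rather than the show's actual tooling, is to treat a file as ready only once its size stops changing:

```python
import time
from pathlib import Path


def wait_until_stable(path: Path, interval: float = 2.0, checks: int = 3) -> bool:
    """Treat a syncing file as fully downloaded once its size holds steady.

    A partially synced file keeps growing on disk; once the size is
    unchanged for `checks` consecutive polls, it is safe to ingest.
    Returns False if the file disappears (e.g., a sync conflict).
    """
    stable = 0
    last = -1
    while stable < checks:
        if not path.exists():
            return False
        size = path.stat().st_size
        stable = stable + 1 if size == last else 1
        last = size
        time.sleep(interval)
    return True
```

An ingest script can call this on each new arrival before copying the media into the project drive.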

Have you discovered any challenges or happy surprises working this way?
It has been a nice happy surprise that it's like, "Oh wow, this is working pretty well." We did have a situation where we thought we might lose power on the East Coast because of rain and wind and things like that. So we had safeguards in place, including an evergreen show that was ready to go for that night in case we did end up losing power. It would have been terrible, but everything held up, and it worked pretty well.

So there are certainly some challenges to working this way, but it’s amazing that we are working and we can keep our mind on other things and just try to help entertain people while this craziness is going on.

You can watch our original Video Interview with Dome here:


Chimney Group: Adapting workflows in a time of crisis

By Dana Bonomo

In early March, Chimney delivered a piece for TED, created to honor women on International Women’s Day featuring Reshma Saujani, founder of Girls Who Code. This was in the early days of coronavirus taking hold in the United States. We had little comprehension at that point of the true extent to which we would be impacted as a country and as an industry. As the situation grew and awareness around the severity of the COVID-19 health crisis sunk in, we started to realize that it would be animated projects like this one that we would come to rely upon.

TED & Ultimate Software: International Women’s Day

This film showcases the use of other creative solutions when live-action projects can’t be shot. But the real function of work like this is that, on an emotional level, it feels good to make something with a socially actionable message.

In just the last few weeks, platforms have been saturated with COVID-19-related content: salutes to healthcare workers, PSAs from federal, state and local authorities, and brands sharing messages of unity. Finding opportunities that include some form of social purpose helps provide hope to our communities while also raising the spirits of those creating the work. We are currently in production on two of these projects, and they help us feel like we're contributing in some small way with the resources we have.

As a global company, Chimney is always highlighting our worldwide service capabilities, with 12 offices on four continents, and our ability to work together. We've routinely used portals such as Zoho and Slack in the past, yet now I'm enjoying the shift toward communicating with each other in a more connected and familiar way. Just a short time ago we might have used a typical workflow; today we're sharing and exchanging ideas and information at an exponential rate.

As a whole, we prefer to video chat, have more follow-ups and create more opportunities to work on internal company goals in addition to just project pipelines and calendars. There's efficiency in brainstorming and solving creative challenges in real time, either as a virtual brainstorm or an idea exchange in PM software and project communication channels. So at the end of a meeting, internal review or project kickoff, we have action items in place, ready to execute on a global scale.

Our company’s headquarters is in Stockholm, Sweden. You may have heard that Sweden's health officials have taken a different approach to handling COVID-19 than most countries, resulting in less drastic social distancing and isolation measures while still being quite mindful of safety. Small shoots are still possible with crews of 10 or fewer — so we can shoot in Sweden with a fully protected crew executing safe and sanitary protocols — and we can livestream to clients worldwide from set.

This is Chimney editor Sam O’Hare’s work-from-home setup.

Our CEO North America Marcelo Gandola is encouraging us individually to schedule personal development time, whether it’s for health and wellness, master classes on subjects that interest us, certifications for our field of expertise, or purely creative and expressive outlets. Since many of us used our commute time for that before the pandemic, we can still use that time for emotional recharging in different ways. By setting aside time for this, we regain some control of our situation. It lifts our morale and it can be very self-affirming, personally and professionally.

While most everyone has remote work capabilities these days, there’s a level of creative energy in the air, driven by the need to employ different tactics — either by working with what you have (optimizing existing creative assets, produced content, captured content from the confines of home) or replacing what was intended to be live-action with some form of animation or graphics. For example, Chimney’s Creative Asset Optimization has been around for some time now. Using Edisen, our customization platform, we can scale brands’ creative content on any platform, in any market at any time, without spending more. From title changes to language versioning and adding incremental design elements, clients get bigger volumes of content with high-quality creative for all channels and platforms. So a campaign that might have had a more limited shelf life on one platform can now stretch to an umbrella campaign with a variety of applications depending on its distribution.

Dana Bonomo

They say that necessity is the mother of invention, and it’s exciting to see how brands and makers are creatively solving current challenges. Our visual effects team recently worked on a campaign (sorry we can’t name this yet) that took existing archival footage and — with the help of VFX — generated content that resonated with audiences today. We’re also helping clients figure out remote content capture solutions in lieu of their live events getting canceled.

I was recently on a Zoom call with students at my alma mater, SUNY Oneonta, in conversation with songwriter and producer John Mayer. He said he really feels for students and younger people during this time, because there’s no point of reference for them to approach this situation. The way the younger generation is adapting — reacting by living so fully despite so many limitations — they are the ones building that point of reference for the future. I think that holds true for all generations… there will always be something to be learned. We don’t fully know what the extent of our learning will be, but we’re working creatively to make the most of it.

Main Image: Editor Zach Moore’s cat is helping him edit


Dana Bonomo is managing director at Chimney Group in NYC.


COVID-19: How our industry is stepping up

We’ve been using this space to talk about how companies are discounting products, raising money and introducing technology to help with remote workflows, as well as highlighting how pros are personally pitching in.

Here are the latest updates, followed by what we’ve gathered to date:

Adobe
Adobe has made a $4.5 million commitment to trusted organizations that are providing vital assistance to those most in need.

• Adobe is joining forces with other tech leaders in the Bay Area to support the COVID-19 Coronavirus Regional Response Fund of the Silicon Valley Community Foundation, a trusted foundation that serves a network of local nonprofits. Adobe's $1 million donation will help provide low-income people in Santa Clara County with immediate financial assistance to help pay rent or meet other basic needs, through the Santa Clara County Homelessness Prevention System Financial Assistance Program. Additionally, Adobe is donating $250,000 to the Valley Medical Center Foundation to purchase life-saving ventilators for Bay Area hospitals.
• Adobe has donated $1 million to the COVID-19 Fund of the International Federation of Red Cross and Red Crescent Societies, the recognized global leader in providing rapid disaster relief and basic human and medical services. Adobe’s support will help aid vulnerable communities impacted by COVID-19 around the world. This is in addition to the $250,000 the company is donating to Direct Relief as a part of Adobe’s #HonorHeroes campaign.
• To support the community in India, Adobe is donating $1 million towards the American India Foundation (AIF) and the Akshaya Patra Foundation. The donation will help AIF source much-needed ventilators for hospitals, while the grant for Akshaya Patra will provide approximately 5 million meals to impacted families.

Harbor
Harbor is releasing Inspiration in Isolation, a new talk series that features filmmakers in candid conversation about their creative process during this unprecedented time and beyond. The web series aims to reveal the ideas and rituals that contribute to their creative process. The premiere episode features celebrated cinematographer Bradford Young and senior colorist Joe Gawler. The two, who are collaborators and friends, talk community, family, adapting to change and much more.

The full-length episodes will be released on Harbor’s new platform, HarborPresents, with additional content on Harbor’s social media (@HarborPictureCo).

HPA
The HPA has formed the HPA Industry Recovery Task Force, which will focus on sustainably resuming production and post services, with the aim of understanding how to enable content creation in an evolving world impacted by the pandemic.

The task force’s key objectives are:
• To serve as a forum for collaboration, communication and thought leadership regarding how to resume global production and post production in a sustainable fashion.
• To understand and influence evolving technical requirements, such as the impact of remote collaboration, work from home and other workflows that have been highlighted by the current crisis.
• To provide up-to-date information and access to emerging health and safety guidelines that will be issued by various governments, municipalities, unions, guilds, industry organizations and content creators.
• To provide collaborative support and guidance to those impacted by the crisis.

Genelec
Genelec is donating a percentage of every sale of its new Raw loudspeaker range to the Audio Engineering Society (AES) for the remainder of this year. Additionally, Genelec will fund 10 one-year AES memberships for those whose lives have been impacted by the COVID-19 crisis. A longtime sustaining member of AES, Genelec is making the donation to help sustain the society’s cash flow, which has been significantly affected by the coronavirus situation.

OWC
OWC has expanded its safety protocols, as they continue to operate as an essential business in Illinois. They have expanded their already strong standard operating practice in terms of cleanliness with additional surface disinfection actions, as well as both gloves and masks being used by their warehouse and build teams. Even before recent events, manufacturing teams used gloves to prevent fingerprinting units during build, but those gloves have new importance now. In addition, OWC has both MERV air filters in place and a UV air purifier, which combined are considered to be 99.999% effective in killing/capturing all airborne bacteria and viruses.

Red

For a limited time, existing DSMC2 and Red Ranger Helium and Gemini customers can purchase a Red Extended Warranty at a discounted price. Existing customers who are in their second year of warranty can pay the standard pricing they would have received within their first year instead of the markup price. For example, instead of paying $1,740 (the 20% markup), a DSMC2 Gemini owner who is within the second year of warranty can purchase an Extended Warranty for $1,450.

This promotion has been extended to June 30. Adding the Red Extended Warranty not only increases the warranty coverage period but also provides benefits such as priority repair, expedited shipping, and premium technical support directly from Red. Customers also have access to the Red Rapid Replacement Program. Extended Warranty is also transferable to new owners if completing a Transfer of Ownership with Red.

DejaSoft
DejaSoft has extended its offering of giving editors 50% off all their DejaEdit licenses — it now goes through the end of June. In addition, the company will help users implement DejaEdit in the best way possible to suit their workflow. DejaEdit allows editors to share media files and timelines automatically and securely with remote co-workers around the world, without having to be online continuously. It helps editors working on Avid Nexis, Media Composer and EditShare workflows across studios, production companies and post facilities ensure that media files, bins and timelines are kept up to date across multiple remote edit stations.

Assimilate
Assimilate is offering all of its products — including Scratch 9.2, Scratch VR 9.2, PlayPro 9.2, Scratch Web and the recently released Live Looks and Live Assist — for free through October 31. Users can register for free licenses. Online tutorials are here and free access to Lowepost online Scratch training is here.

B&H
B&H is partnering with suppliers to donate gear to the teams at Mount Sinai and other NYC hospitals to help health care professionals and first responders stay in touch with their loved ones. Some much-needed items are chargers, power sources, battery packs and mobile accessories. B&H is supporting the Mayor’s Fund to Advance New York City and Direct Relief.

FXhome
FXhome last month turned the attention of its “Pay What You Want” initiative to directing proceeds toward the fight against COVID-19. This month, in an effort to teach the community new skills and inspire them with ideas to help them reinvent themselves, FXhome has launched a new, entirely free Master Class series designed to teach everything from basic editing to creating flashy title sequences to editing audio and, of course, basic VFX and compositing.

Nugen Audio 
Nugen Audio has a new “Staying Home, Staying Creative” initiative aimed at promoting collaboration and creativity in a time of social distancing. Included are a variety of videos, interviews and articles that will inspire new artistic approaches for post production workflows. The company is also providing temporary replacement licenses for any users who do not have access to their in-office workstations.

Already available on the Staying Creative web page is a special interview with audio post production specialist Keith Alexander. Building from his specialty in remote recording and sound design for broadcast, film and gaming, Alexander shares some helpful tips on how to work efficiently in a home-based setting and how to manage audio cleanup and broadcast-audio editing projects from home. There’s also an article focused on three ways to improve lo-fi drum recording in a less-than-ideal space.

Nugen is also offering temporary two-month licenses for current iLok customers, along with one additional Challenge Response license code authorization. The company has also reduced the prices of all products in its web store.

Tovusound 
Tovusound has extended its 20% discount until the end of the month and has added some new special offers.

The Spot Edward Ultimate Suite expansion, regularly $149, is now $79 with coupon. It adds the Spot creature footstep and movement instrument to the Edward footstep, cloth and props designer. Customers also get free WAV files with the purchase of all Edward instruments and expansions and with all Tovusound bundles. Anyone who purchased one of the applicable products after April 1 also has free access to the WAV files.

Tovusound will continue to donate an additional 10% of the sales price to the CleanOceanProject.org. Customers may claim their discounts by entering STAYHOME in the “apply coupon” field at checkout. All offers end on April 30.

 

Previous Updates

Object Matrix and Cinesys-Oceana
Object Matrix and Cinesys-Oceana are hosting a series of informal online Beer Roundtable events in the coming months. The series will discuss the various challenges of implementing hybrid technology for continuity, remote working and self-serve access to archive content. You can register for the next Beer Roundtable here. The sessions will be open, fun and relaxed. Participants are asked to grab themselves a drink and simply raise their glass when they wish to ask a question.

During the first session, Cinesys-Oceana CTO Brent Angle and Object Matrix CEO Jonathan Morgan will introduce what they believe to be the mandatory elements of the ultimate hybrid technology stack. This will be followed by a roundtable discussion hosted by Harry Skopas, director M&E solutions architecture and technical sales at Cinesys-Oceana, with guest appearances from the media and sports technology communities.

MZed
MZed, an online platform for master classes in filmmaking, photography and visual storytelling, is donating 20% of all sales to the Los Angeles Food Bank throughout April. For every new MZed Pro membership, $60 is donated, equating to 240 meals to feed hungry children, seniors and families. MZed serves the creative community, a large portion of which lives in the LA area and is being hit hard by the lockdown due to the coronavirus. MZed hopes to help play a role in keeping high-risk members of the community fed during a time of extreme uncertainty.

MZed has also launched a “Get One, Gift One” initiative. When someone purchases an MZed Pro membership, that person will not only be supporting the LA Food Bank but will instantly receive a Pro membership to give to someone else. MZed will email details upon purchase.

MZed offers hundreds of hours of training courses covering everything from photography and filmmaking to audio and lighting in courses like “The Art of Storytelling” with Alex Buono and Philip Bloom’s Cinematic Masterclass.

NAB Show
NAB Show’s new digital experience, NAB Show Express, will take place May 13-14. The platform is free and offers 24-hour access to three educational channels, on-demand content and a Solutions Marketplace featuring exhibitor product information, announcements and demos. Registration for the event will open on April 20 at NABShowExpress.com. Each channel will feature eight hours of content streamed daily and available on-demand to accommodate the global NAB Show audience. NAB Show Express will also offer NAB Show’s signature podcast, exploring relevant themes and featuring prominent speakers.

Additionally, NAB Show Express will feature three stand-alone training and executive leadership events for which separate registrations will be available soon. These include:
• Executive Leadership Summit (May 11), produced in partnership with Variety
• Cybersecurity & Content Protection Summit (May 12), produced in partnership with Content Delivery & Security Association (CDSA) and Media & Entertainment Services Alliance (MESA) – registration fees apply
• Post | Production World Online (May 17-19), produced in partnership with Future Media Conferences (FMC) – registration fees apply.

Atto 
Atto Technology is supporting content producers who face new workflow and performance challenges by making Atto Disk Benchmark for macOS more widely available and by updating Atto 360 tuning, monitoring and analytics software. Atto 360 for macOS and Linux have been updated for enhanced stability and include an additional tuning profile. The current Windows release already includes these updates. The software is free and can be downloaded directly from Atto.

Sigma
Sigma has launched a charitable giving initiative in partnership with authorized Sigma lens dealers nationwide. From now until June 30, 2020, 5% of all Sigma lens sales made through participating dealers will be donated to a charitable organization of the dealers’ choice. Donations will be made to organizations working on COVID-19 relief efforts to help ease the devastation many communities are feeling as a result of the global crisis. A full list of participating Sigma dealers and benefiting charities can be found here.

FXhome 
To support those who are putting their lives on the line to provide care and healing to those impacted by the global pandemic, FXhome is adding Partners In Health, Doctors Without Borders and the Center for Disaster Philanthropy as new beneficiaries of the FXhome “Pay What You Want” initiative.

Pay What You Want is a goodwill program inspired by the HitFilm Express community’s desire to contribute to the future development of HitFilm Express, the company’s free video editing and VFX software. Through the initiative, users can contribute financially, and those funds will be allocated for future development and improvements to HitFilm. Additionally, FXhome is contributing a percentage of the proceeds to organizations dedicated to global causes important to the company and its community. The larger the contribution from customers, the more FXhome will donate.

Besides adding the three new health-related beneficiaries, FXhome has extended its campaign to support each new cause from one month to three months, beginning in April and running through the end of June. A percentage of all proceeds of revenues generated during this time period will be donated to each cause.

Covid-19 Film and TV Emergency Relief Fund
Created by The Film and TV Charity in close partnership with the BFI, the new COVID-19 Film and TV Emergency Relief Fund provides support to the many thousands of active workers and freelancers who have been hit hardest by the closure of productions across the UK. The fund has received initial donations totaling £2.5 million from Netflix, the BFI, BBC Studios, BBC Content, WarnerMedia and several generous individuals.

It is being administered by The Film and TV Charity, with support from BFI staff. The Film and TV Charity and the BFI are covering all overheads, enabling donations to go directly to eligible workers and freelancers across film, TV and cinema. One-off grants of between £500 and £2,500 will be awarded based on need. Applications for the one-off grants can be made via The Film and TV Charity’s website. The application process will remain open for two weeks.

The Film and TV Charity also has a new COVID-19 Film and TV Repayable Grants Scheme offering support for industry freelancers waiting for payments under the Government’s Self-employment Income Support Scheme. Interest-free grants of up to £2,000 will be offered to those eligible for Self-employment Income Support but who are struggling with the wait for payments in June. The COVID-19 Film and TV Repayable Grants Scheme opens April 15. Applicants will have one week to make a claim via The Film and TV Charity’s website.

Lenovo
Lenovo is offering a free 120-day license of Mechdyne’s TGX Remote Desktop software, which uses Nvidia Quadro GPUs and a built-in video encoder to compress and send information from the host workstation to the endpoint device for decoding. This minimizes lag on complex and detailed application files.

Teams can share powerful, high-end workstation resources across the business, easily dialing up performance and powerful GPUs from their standard workstation to collaborate remotely with coworkers around the world.

Users keep data and company IP secure on-site, reducing the risk of data breaches, and can remotely administer computer hardware assets from anywhere, at any time.
Users install the trial on their host workstations and install the receiver software on their local devices to access their applications and projects as if they were in the office.

Ambidio 
To help sound editors, mixers and other post pros who suddenly find themselves working from home, Ambidio is making its immersive sound technology, Ambidio Looking Glass, available for free. Sound professionals can apply for a free license through Ambidio’s website. Ambidio is also waiving its per-title releasing fee for home entertainment titles during the current cinema shutdown. It applies to new titles that haven’t previously been released through Blu-ray, DVD, digital download or streaming. The free offer is available through May 31.

Ambidio Looking Glass can be used as a monitoring tool for theatrical and television projects requiring immersive sound. Ambidio Looking Glass produces immersive sound that approximates what can be achieved on a studio mix stage, except it is playable through standard stereo speaker systems. Editors and mixers working from home studios can use it to check their work and share it with clients, who can also hear the results without immersive sound playback systems.

“The COVID-19 pandemic is forcing sound editors and mixers to work remotely,” says Ambidio founder Iris Wu. “Many need to finish projects that require immersive sound from home studios that lack complex speaker arrays. Ambidio Looking Glass provides a way for them to continue working with dimensional sound and meet deadlines, even if they can’t get to a mix stage.”

Qumulo
Through July 2020, Qumulo is offering its cloud-native file software for free to public and private-sector medical and health care research organizations that are working to minimize the spread and impact of COVID-19.

“Research and health care organizations across the world are working tirelessly to find answers and collaborate faster in their COVID-19 vaccine mission,” said Matt McIlwain, chairman of the board of trustees of the Fred Hutchinson Cancer Research Center and managing partner at Madrona Venture Group. “It will be through the work of these professionals, globally sharing and analyzing all available data in the cloud, that a cure for COVID-19 will be discovered.”

Qumulo’s cloud-native file and data services allow organizations to use the cloud to capture, process, analyze and share data with researchers distributed across geographies. Qumulo’s software works seamlessly with the applications medical and health care researchers have been using for decades, as well as with artificial intelligence and analytics services more recently developed in the cloud.

Medical organizations can register to use Qumulo’s file software in the cloud, which will be deployable through the Amazon Web Services and Google Cloud marketplaces.

Goldcrest Post
Goldcrest Post has established the capability to conduct most picture and sound post production work remotely. Colorists, conform editors and other staff are now able to work from home or a remote site and connect to the facility’s central storage and technical resources via remote collaboration software. Clients can monitor work through similar secure, fast and reliable desktop connections.

The service allows Goldcrest to ensure theatrical and television projects remain on track while allowing clients to oversee work in as normal a manner as possible under current circumstances.

Goldcrest has set up a temporary color grading facility at a remote site convenient for its staff colorists. The site includes a color grading control panel, two color-calibrated monitors and a high-speed connection to the main Goldcrest facility. The company has also installed desktop workstations and monitors in the homes of editors and other staff involved in picture conforming and deliverables. Sound mixing is still being conducted on-site, but sound editorial and ancillary sound work is being done from home. In taking these measures, the facility has reduced its on-site staff to a bare minimum while minimizing workflow disruption.

Ziva Dynamics
Ziva Dynamics is making Ziva VFX character simulation software free for students and educators. The same tools used on Game of Thrones, Hellboy and John Wick: Chapter 3 are now available for noncommercial projects, offering students the chance to learn physics-based character creation before they graduate. Ziva VFX Academic licenses are fully featured and receive the same access and support as other Ziva products.

In addition to the software, Ziva Academic users will now receive free access to Ziva Dynamics’ simulation-ready assets Zeke the Lion (previously $10,000) and Lila the Cheetah. Thanks to Ziva VFX’s Anatomy Transfer feature, the Zeke rig has helped make squirrels, cougars, dogs and more for films like John Wick 3, A Dog’s Way Home and Primal.

Ziva Dynamics will also be providing a free Ziva Academic floating lab license to universities so students can access the software in labs across campuses whenever they want. Ziva VFX Academic licenses are free and open to any fully accredited institution, student, professor or researcher (a $1,800 value). New licenses can be found in the Ziva store and are provided following a few eligibility questions. Academic users on the original paid plan can now increase their license count for free.

OpenDrives 
OpenDrives’ OpenDrives Anywhere is an in-place private cloud model that enables OpenDrives customers to work on the same project from multiple locations without compromising performance. With existing office infrastructure, teams already have an in-place private cloud and can extend its power to each of their remote professionals. No reinvestment in storage is needed.

Nothing changes from a workflow perspective except physical proximity. With simple adjustments, remote control of existing enterprise workstations can be extended via a secure connection. HP’s ZCentral Remote Boost (formerly RGS) software facilitates remote access to workstations over a secure connection, or Teradici can provide both dedicated external hardware and software solutions for this purpose, giving teams the ability to support collaborative workflows at low cost. OpenDrives can also get teams set up quickly: in under two hours on a corporate VPN and in under 24 hours without one.

Prime Focus Technologies 
Prime Focus Technologies (PFT), the technology arm of Prime Focus, has added new features and advanced security enhancements to Clear to help customers embrace the virtual work environment. In terms of security, Clear now has a new-generation HTML 5 player enabled with Hollywood-grade DRM encryption. There’s also support for just-in-time visual watermarking embedded within the stream for streaming through Clear as a secure alternative to generating watermarking on the client side.

Clear also has new features that make it easier to use, including direct and faster downloads from S3 and Azure storage, easier partner onboarding and an admin module enhancement with condensed permissions for easily handling custom user roles. A host of new functionalities simplifies content acquisition processes and reduces dependencies as much as possible. Likewise, for easier content servicing, there is now automation in content localization, making it easier to perform and review tasks on Clear. For content distribution, PFT has enabled on-demand cloud distribution on Clear through the most commonly used cloud technologies.

Brady and Stephenie Betzel
Many of you know postPerspective contributor and online video editor Brady Betzel from his great reviews and tips pieces. During this crisis, he is helping his wife, Stephenie, make masks for her sister (a nurse) and colleagues working at St. John’s Regional Medical Center in Oxnard, California, in addition to anyone else who works on the “front lines.” She’s sewn over 300 masks so far and is not stopping. Creativity and sewing are nothing new to her; creating is also her day job. You can check out her work on Facebook and Instagram.

Object Matrix 
Object Matrix co-founder Nick Pearce has another LinkedIn dispatch, this time launching Good News Friday, where folks from around the globe check in with good news! You can also watch it on YouTube. Pearce and crew are also offering video tips for surviving working from home. The videos are hosted by Pearce and updated weekly. Check them out here.

Conductor
Conductor is waiving charges for orchestrating renders in the cloud. Updated pricing is reflected in the cost calculator on Conductor’s Pricing page. These changes will last at least through May 2020. To help expedite any transition needs, the Conductor team will be on call for virtual render wrangling of cloud submissions, from debugging scenes and scripts to optimizing settings for cost, turnaround time, etc. If you need this option, then email support@conductortech.com.

Conductor is working with partners to set up online training sessions to help studios quickly adopt cloud strategies and workflows. The company will send out further notifications as the sessions are formalized. Conductor staff is also available for one-on-one studio sessions as needed for those with specific pipeline considerations.

Conductor’s president and CEO, Mac Moore, said: “The sudden onset of this pandemic has put a tremendous strain on our industry, completely changing the way studios need to operate virtually overnight. Given Conductor was built on the ‘work from anywhere’ premise, I felt it our responsibility to help studios to the greatest extent possible during this critical time.”

Symply
Symply is providing as many remote workers in the industry as possible with a free 90-day license to SymplyConveyor, its secure, high-speed transfer and sync software. Symply techs will be available to install SymplyConveyor remotely on any PC, Mac or Linux workstation pair or server and workstation.

The no-obligation offer is available at gosymply.com. Users sign up, and as long as they are in the industry and have a need, Symply techs will install the software. The number of free 90-day licenses is limited only by Symply’s ability to install them given its limited resources.

Foundry
Foundry has reset its trial database so that users can access a new 30-day trial for all products regardless of the date of their last trial. The company continues to offer unlimited non-commercial use of Nuke and Mari. On the educational side, students who are unable to access school facilities can get a year of free access to Nuke, Modo, Mari and Katana.

They have also announced virtual events, including:

• Foundry LiveStream – a series of talks around projects, pipelines and tools.
• Foundry Webinars – a 30- to 40-minute technical deep dive into Foundry products, workflows and third-party tools.
• Foundry Skill-Ups – a 30-minute guide to improving your skills as a compositor, lighter or texture artist to get to the next level in your career.
• Foundry Sessions – special conversations with customers sharing insights, tips and tricks.
• Foundry Workflow Wednesdays – 10-minute weekly videos posted on social media showing Nuke tips and tricks from Foundry experts.

Alibi Music Library
Alibi Music Library is offering free whitelisted licensing of its Alibi Music and Sound FX catalogs to freelancers, agencies and production companies needing to create or update their demo reels during this challenging time.

Those who would like to take advantage of this opportunity can choose Demo Reel 2020 Gratis from the shopping cart feature on Alibi’s website next to any desired track(s). For more info, click here.

2C Creative
Caleb & Calder Sloan’s Awesome Foundation, the charity of 2C Creative founders Chris Sloan and Carla Kaufman Sloan, is running a campaign that will match individual donations (up to $250 each) to charities supporting first responders, organizations and those affected by COVID-19. 2C is a creative agency and production company serving the TV/streaming business with promos, brand integrations, trailers, upfront presentations and other campaigns. So far, the organization’s “COVID-19 Has Met Its Match” campaign has raised more than $50,000. While the initial deadline for people to participate was April 6, it has now been extended to April 13. To participate, please visit ccawesomefoundation.org for a list of charities already vetted by the foundation or choose your own. Then simply email a copy of your donation receipt to cncawesomefoundation@gmail.com and they will match it!

Red Giant 
For the filmmaking education community, Red Giant is offering Red Giant Complete — the full set of tools including Trapcode Suite, Magic Bullet Suite, Universe, VFX Suite and Shooter Suite — free for students and faculty members of a university, college or high school. Instead of buying separate suites or choosing which tool best suits one’s educational needs or budget, students and teachers can get every tool Red Giant makes completely free of charge. All that’s required is a simple verification.

How to get a free Red Giant Complete license if you are a student, teacher or faculty member:
1. Use school or organization ID or any proof of current employment or enrollment for verification. More information on academic verification is available here.
2. Send your academic verification to academic@redgiant.com.
3. Wait for approval via email before purchasing.
4. Once you get approval, go to the Red Giant Complete Product Page and “buy” your free version. You will only be able to buy the free version if you have been pre-approved.

The free education subscription will last 180 days. When that time period ends, users will need to reverify their academic status to renew their free subscription.

Flanders Scientific
Remote collaboration and review benefits greatly from having the same type of display calibrated the same way in both locations. To help facilitate such workflow consistency, FSI is launching a limited time buy one, get one for $1,000 off special on its most popular monitor, the DM240.

Nvidia
For those pros needing to power graphics workloads without local hardware, cloud providers, such as Amazon Web Services and Google Cloud, offer Nvidia Quadro Virtual Workstation instances to support remote, graphics-intensive work quickly without the need for any on-prem infrastructure. End-users only need a connected laptop or thin client, as the virtual workstations support the same Nvidia Quadro drivers and features as the physical Quadro GPUs used by pro artists and designers in local workstations.

Additionally, Nvidia last week expanded its free virtual GPU software evaluation to 500 licenses for 90 days to help companies support their remote workers with their existing GPU infrastructure. Nvidia vGPU software licenses — including Quadro Virtual Workstation — enable GPU-accelerated virtualization so that content creators, designers, engineers and others can continue their work. More details are available here. Nvidia has also posted a separate blog on virtual GPUs to help admins who are working to support remote employees.

Harman
Harman is offering a free e-learning program called Learning Sessions in conjunction with Harman Pro University.

The Learning Sessions and the Live Workshop Series provide a range of free on-demand and instructor-led webinars hosted by experts from around the world. The Industry Expert workshops feature tips and tricks from front of house engineers, lighting designers, technicians and other industry experts, while the Harman Expert workshops feature in-depth product and solution webinars by Harman product specialists.

• April 7—Lighting for Churches: Live and Video with Lucas Jameson and Chris Pyron
• April 9—Audio Challenges in Esports with Cameron O’Neill
• April 15—Special Martin Lighting Product Launch with Markus Klüesener
• April 16—Lighting Programming Workshop with Susan Rose
• April 23—Performance Manager: Beginner to Expert with Nowell Helms

Apple
Apple is offering free 90-day trials of Final Cut Pro X and Logic Pro X apps for all in order to help those working from home and looking for something new to master, as well as for students who are already using the tools in school but don’t have the apps on their home computers.

Avid
For its part, Avid is offering free temp licenses for remote users of the company’s creative tools. Commercial customers can get a free 90-day license for each registered user of Media Composer | Ultimate, Pro Tools, Pro Tools | Ultimate and Sibelius | Ultimate. For students whose school campuses are closed, any student of an Avid-based learning institution that uses Media Composer, Pro Tools or Sibelius can receive a free 90-day license for the same products.

Aris
Aris, a full-service production and post house based in Los Angeles, is partnering with ThinkLA to offer free online editing classes for those who want to sharpen their skills while staying close to home during this worldwide crisis. The series will be taught by Aris EP/founder Greg Bassenian, who is also an award-winning writer and director. He has also edited numerous projects for clients including Coca-Cola, Chevy and Zappos.

mLogic
mLogic is offering a 15% discount on its mTape Thunderbolt 3 LTO-7 and LTO-8 solutions. The discount applies to orders placed on the mTape website through April 20th. Use discount code mLogicpostPerspective15%.

Xytech
Xytech has launched “Xytech After Dark,” a podcast focusing on trends in the media and broadcasting industries. The first two episodes are now available on iTunes, Spotify and all podcasting platforms.

Xytech’s Greg Dolan says the podcast “is not a forum to sell, but instead to talk about why we created the functionality in MediaPulse and the types of things happening in our industry.”

Hosted by Xytech’s Gregg Sandheinrich, the podcast will feature Xytech staff, along with special guests. The first two episodes cover topics including the recent HPA Tech Retreat (featuring HPA president Seth Hallen), as well as the cancellation of the NAB Show, the value of trade shows and the effects of COVID-19 on the industry.

Adobe
Adobe shared a guide to best practices for working from home. It’s meant to support creators and filmmakers who might be shifting to remote work and need to stay connected with their teams and continue to complete projects. You can find the guide here.

Adobe’s principal Creative Cloud evangelist, Jason Levine, hosted a live stream — Video Workflows With Team Projects — that focuses on remote workflows.

Additionally, Karl Soule, senior technical business development manager, hosted a stream focusing on remote video workflows and collaboration in the enterprise. If you sign up on this page, you can see his presentation.

Streambox
Streambox has introduced a pay-as-you-go software plan for video professionals who use its Chroma 4K, Chroma UHD, Chroma HD and Chroma X streaming encoder/decoder hardware. Since the software has been “decoupled” from the hardware platform, those who own the hardware can rent the software on a monthly basis, pause the subscription between projects and reinstate it as needed. By renting software for a fixed period, creatives can take on jobs without having to pay outright for technology that might otherwise have been impractical to buy.

Frame.io 
Through the end of March, Frame.io is offering 2TB of free extra storage capacity for 90 days. Those who could use the additional storage to accommodate work-from-home workflows should email rapid-response@frame.io to get it set up.

Frame.io is also offering free Frame.io Enterprise plans for the next 90 days to support educational institutions, nonprofits and health care organizations that have been impacted. Please email rapid-response@frame.io to set up this account.

To help guide companies through this new reality of remote working, Frame.io is launching a new “Workflow From Home” series on YouTube, hosted by Michael Cioni, with the first episode launching Monday, March 23rd. Cioni will walk through everything artists need to keep post production humming as smoothly as possible. Subscribe to the Frame.io YouTube channel to get notified when it’s released.

EditShare
EditShare has made its web-based remote production and collaboration tool, Flow Media Management, free through July 1st. Flow enables individuals as well as large creative workgroups to collaborate on story development, with capabilities to perform extensive review and approval from anywhere in the world. Those interested can complete this form, and one of EditShare’s Flow experts will follow up.

Veritone 
Veritone will extend free access to its core applications — Veritone Essentials, Attribute and Digital Media Hub — for 60 days. Targeted to media and entertainment clients in radio, TV, film, sports and podcasting, Veritone Essentials, Attribute, and Digital Media Hub are designed to make data and content sharing easy, efficient and universal. The solutions give any workforce (whether in the office or remote) tools that accelerate workflows and facilitate collaboration. The solutions are fully cloud-based, which means that staff can access them from any home office in the world as long as there is internet access.

More information about the free access is here. Certain limitations apply. Offer is subject to change without notice.

SNS
In an effort to quickly help EVO users who are suddenly required to work on editing projects from home, SNS has released Nomad for on-the-go, work-from-anywhere, remote workflows. It is a simple utility that runs on any Mac or Windows system that’s connected to EVO.

Nomad helps users repurpose their existing ShareBrowser preview files into proxy files for offline editing. These proxy files are much smaller versions of the source media files, and therefore easier to use for remote work. They take up less space on the computer, take less time to copy and are easier to manage. Users can edit with these proxy files, and after they’re finished putting the final touches on the production, their NLE can export a master file using the full-quality, high-resolution source files.
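The proxy round trip described above is a common offline/online pattern, and the path-swapping step at its core can be sketched in a few lines. This is a generic illustration, not SNS’s actual implementation; the volume paths and the `relink` helper below are hypothetical.

```python
from pathlib import Path

# Hypothetical locations: a small proxy set on the editor's laptop and
# the full-quality source media back on shared storage.
PROXY_DIR = Path("/Volumes/laptop/proxies")
SOURCE_DIR = Path("/Volumes/EVO/source_media")

def relink(clip_paths):
    """Map proxy file paths back to their full-resolution source
    equivalents, leaving any non-proxy paths untouched."""
    relinked = []
    for clip in clip_paths:
        clip = Path(clip)
        if PROXY_DIR in clip.parents:
            # Keep the same relative layout under the source volume.
            clip = SOURCE_DIR / clip.relative_to(PROXY_DIR)
        relinked.append(clip)
    return relinked
```

The edit happens against the small files; at conform time the NLE performs this substitution itself and exports the master from the high-resolution sources.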

Nomad is available immediately and free to all EVO customers.

Ftrack
Remote creative collaboration tool ftrack Review is free for all until May 31. This date might extend as the global situation continues to unfold. ftrack Review is an out-of-the-box remote review and approval tool that enables creative teams to collaborate on, review and approve media via their desktop or mobile browser. Contextual comments and annotations eliminate confusion and reduce reliance on email threads. ftrack Review accepts many media formats as well as PDFs. Every ftrack Review workspace receives 250 GB of storage.

Cinedeck 
Cinedeck’s cineXtools allows users to edit and correct file deliveries from home. From now until April 3rd, pros can get a one-month license of cineXtools free of charge.


Posting Everest VR: Journey to the Top of the World

While preparing to climb both Mount Everest and Mount Lhotse without the use of bottled oxygen, renowned climber Ueli Steck fell to his death in late April of 2017. VR director and alpine photographer Jonathan Griffith and mountain guide Tenji Sherpa, both friends of Steck, picked up the climber’s torch, and the result was the 8K 3D documentary Everest VR: Journey to the Top of the World, produced by Facebook’s Oculus.

Over the course of three years, Griffith shot footage following Tenji and some of the world’s most accomplished climbers in some of the world’s most extreme locations. The series also includes footage that lets viewers witness what it is like to be engulfed in a Himalayan avalanche, cross a crevasse and stare deep into its depths, take a huge rock-climbing fall, camp under the stars and soak in the view from the top of the world.

For the post part of the doc, Griffith called on veteran VR post pro Matthew DeJohn for editing and color correction, on VR stitching expert Keith Kolod, and on Brendan Hogan for sound design.

“It really was amazing how a small crew was able to get all of this done,” says Griffith. “The collaboration between myself as the cameraman and Matt and Keith was a huge part of being able to get this series done — and done at such a high quality.

“Matt and Keith would give suggestions on how to capture for VR, how camera wobbling impacted stitching, how to be aware of the nadir and zenith in each frame and to think about proximity issues. The efficient post process helped in letting us focus on what was needed, and I am incredibly happy with the end result.”

DeJohn was tasked with bringing together a huge amount of footage from a number of different high-end camera systems, including the Yi Halo and Z Cam V1 Pro.

DeJohn called on Blackmagic DaVinci Resolve for this part of the project, saying that using one tool for everything helped speed up the process. “A VR project usually has different teams of multiple people for editing, grading and stitching, but with Resolve, Keith and I handled everything,” he explains.

Within Resolve, DeJohn cut the series at 2Kx2K, relinked to the 8Kx8K source and then changed the timeline resolution to 8Kx8K for final color and rendering. He used the Fairlight audio editing tab to make fine adjustments, manage different narration takes with audio layers and manage varied source files such as mono narration, stereo music and four-channel ambisonic spatial audio.

In terms of color grading, DeJohn says, “I colored the project from the very first edit so when it came to finalize the color it was just a process of touching things up.”

Fusion Studio was used for stereoscopic alignment fixes, motion graphics, rig removal, nadir patches, stabilization, stereo correction of the initial stitch, re-orienting 360 imagery, viewing the 360 scenes in a VR headset and controlling focal areas. More intense stitching work was done by Kolod using Fusion Studio.

Footage of such an extreme environment, as well as the closeness of climbers to the cameras, provided unique challenges for Kolod, who had to rebuild portions of images from individual cameras. He also had to manually ramp down the stereo at the images’ north and south poles to ensure easy viewing, fix stereo misalignment and distance issues between the foreground and background, and calm excessive movement in images.

“A regular fix I had to make was adjusting incorrect vertical alignments, which create huge problems for viewing. Even if a camera is a little bit off, the viewer can tell,” says Kolod. “The project used a lot of locked-off tripod cameras, and you would think that the images coming from them would be completely steady. But a little bit of wind or slight movement in what is usually a calm frame makes a scene unwatchable in VR. So I used Fusion for stabilization on a lot of shots.”

“High-quality VR work should always be done with manual stitching with an artist making sure there are no rough areas. The reason why this series looks so amazing is that there was an artist involved in every part of the process — shooting, editing, grading and stitching,” concludes Kolod.

Vegas Post upgrades for VFX, compositing and stills

Vegas Creative Software, in partnership with FXhome, has added new versions of Vegas Effects and Vegas Image to the Vegas Post suite of editing, VFX, compositing and imaging tools for video professionals, editors and VFX artists.

The Vegas Post workflow centers on Vegas Pro for editing and adds Vegas Effects and Vegas Image for VFX, compositing and still-image editing.

Vegas Effects is a full-featured visual effects and compositing tool that provides a variety of high-quality effects, presets and correction tools. With over 800 effects and filters to tweak, combine, pull apart and put back together, Vegas Effects provides users with a powerful library of effects including:
• Particle generators
• Text and titling
• Behavior effects
• 3D model rendering
• A unified 3D space
• Fire and lightning generators
• Greenscreen removal
• Muzzle flash generators
• Picture in picture
• Vertical video integration

Vegas Image is a non-destructive raw image compositor that enables video editors to work with still-image and graphical content and incorporate it directly into their final productions — all directly integrated with Vegas Post. This new version of Vegas Image contains feature updates including:
• Brush masks: A new mask type that allows the user to brush in/out effects or layers and includes basic brush settings like radius, opacity, softness, spacing and smoothing
• Multiple layer transform: Gives the ability to move, rotate and scale a selection of layers
• Multi-point gradient effect: An effect that enables users to create colored gradients using an unlimited amount of colored points
• Light rays effect: An effect that uses bright spots to cast light rays in scenes, e.g., light rays streaming through trees
• Raw denoise: Bespoke denoise step for raw images, which can remove defective pixels and large noise patterns
• Lens distortion effect: Can be used to perform lens-based adjustments, such as barrel/pincushion distortion or chromatic aberration
• Halftone effect: Produces a halftone look, like a newspaper print or pop art
• Configurable mask overlay color: Users can now pick what color is overlaid when the mask overlay render option is enabled

Vegas Post is available now for $999 or as a subscription starting at $21 per month.

Hecho Studios: Mobilizing talent and pipeline to keep working

By Ryan Curtis

When Hecho first learned of the possibility of a shutdown due to COVID-19, we started putting together a game plan to maintain the level of production quality and collaboration that we are all used to, but this time remotely. Working closely with our chief content officer Tom Dunlap, our post production workflow manager Nathan Fleming and senior editor Stevo Chang, we first identified the editors, animators, colorists, Flame artists, footage researchers and other post-related talent who work with us regularly. We then built a standing army of remote talent who were ready to embrace the new normal and get to work.

Ryan Curtis

It was a formidable challenge to get the remote editorial stations up and running. We had relatively short notice that we were going to have to finalize and enact a WFH game plan in LA. In order to keep productions running smoothly, we teamed with our equipment vendor, VFX Technologies, to give our IT team the ability to remote in and fully outfit each workstation with software. They also scheduled a driver to make contact-free drop-offs at the homes of our artists. We’ve deployed over 15 iMacs for editorial, animation and finishing needs. We can scale as needed and only need two to three days’ notice to get a new artist fully set up at home with the appropriate tools. Our remote edit bay workstations are mainly iMac Pros running the Adobe suite of tools, Maxon Cinema 4D, Blackmagic DaVinci Resolve and Autodesk Flame.

We have outfitted each member of our team with Signiant, which allows for rapid transfers of larger files. If an artist’s home internet is not up to snuff for their project, we have been boosting their internet speeds. To maintain file integrity, we are rolling out the same file structure you would find on our server, allowing us to archive projects back to the server remotely once delivered. We’ve also designated key people who can access the in-office stations and server virtually, retrieve assets and migrate them to remote teams to refresh existing campaigns.
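The idea of mirroring the office server's file structure at home so projects archive back cleanly can be sketched in a few lines. The folder names below are placeholders invented for illustration; Hecho's actual server layout isn't public.

```python
from pathlib import Path

# Hypothetical project folder template mirroring an in-office server layout.
# These subfolder names are illustrative placeholders, not Hecho's real tree.
PROJECT_TEMPLATE = [
    "01_Footage/Dailies",
    "01_Footage/Stock",
    "02_Audio/Music",
    "02_Audio/VO",
    "03_Graphics",
    "04_Project_Files",
    "05_Exports/Review",
    "05_Exports/Deliverables",
]

def scaffold_project(root: str) -> Path:
    """Create the standard folder tree so a remote edit archives back 1:1."""
    base = Path(root)
    for sub in PROJECT_TEMPLATE:
        (base / sub).mkdir(parents=True, exist_ok=True)
    return base
```

Because every remote artist starts from the same tree, a finished project can be synced back to the server (via Signiant or similar) without any renaming or reorganizing.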

The need to review during each phase of production has never been stronger. We tested a wide variety of review solutions, and have currently settled on the following:

• For Animation/Design-Based Projects:
Frankie – Export-based interactive reviews
• For Editorial Projects:
Evercast – Live plug and play sessions
Wiredrive (often paired with Google Hangouts or Zoom)
• For Finishing:
Vimeo Review – Export-based color reviews
Streambox – Live color collaboration (paired with Google Hangouts or Zoom)
Frankie – Export-based interactive reviews
Wiredrive for deliverables (often paired with Google Hangouts or Zoom)

Our collective of talent remains our contracted veteran Hecho crew, well over 50 people who know our shorthand and in-office workflows and can easily be onboarded to our new remote workflow. If needed to satisfy a specific creative challenge, we bring in new talent and quickly onboard them into the Hecho family.

In terms of how we deal with approvals, it depends on the team and the project. If a team is dedicated to a single project, it can be even more efficient than working in the office. Overcommunication is key, and transparency with feedback and workflows is paramount to a successful project. In many cases, however, efficiencies are lost, and projects currently move about 20 percent slower than if we were in the office. To combat this, some teams have structured themselves a little differently, since it can be hard to wrangle busy individuals with fast deadlines remotely. Having approved backup approvers on board has been immensely helpful in keeping projects moving along on time. And without clients in the bay, we lean even more on our post producers to funnel all questions and feedback from clients, ensuring clear back-and-forth with artists.

NFL #stayhomestaystrong

Challenges Solved
Aside from the lack of in-person interaction and the efficiencies of quick catch-ups in the hall or in the bay, the biggest challenge has been home internet speeds. This affects everything else that’s involved with a WFH setup. In some cases we actually had to upgrade existing ISP contracts in order to reach an acceptable baseline for getting work done: streaming reviews, file sharing, etc.

The other challenge was quickly testing and evaluating new tools and then getting everybody up to speed on how to use them. Evercast was probably the trickiest new product because it involves live streaming from an editor’s machine (using Adobe Premiere) while multiple “reviewers” watch them work in real time. As you can imagine, there are many factors that can affect live streaming: the CPU of the streaming computer, the bitrate you’re streaming, etc. Luckily, once we had gone through a couple of setups and reviews (trial and error), things got much easier. Also, the team at Evercast (thanks Brad, Tyrel and Robert!) was great in helping us figure out some of the issues we ran into early on.

Our First WFH Projects
For our first COVID-19 response project, we worked with agency 72andSunny and the NFL to share the uplifting message #Stayhomestaystrong. Behind the scenes, our post team produced a complete offline-to-online workflow in record time and went from brief to live in six days while everyone transitioned to working entirely remotely. #Stayhomestaystrong also helped bring in $35 million in donations toward COVID relief groups. Credits include editors Amanda Tuttle and Andrew Leggett; assistant editors Max Pankow and Stephen Shirk; animator Lawrence Wyatt; Flame artists Rachel Moorer, Gurvand Tanneau and Paul Song; and post producer Song Cho.

Stay INspired

Another project we worked on with 72andSunny was the COVID-19 response ad Pinterest Stay INspired, which involved heavy motion graphics and a large number of assets, ranging from stock photos and raw video files from remote shoots to licensed UGC assets. The designers, motion graphics artists, writers and clients used a Google Slides deck to link thumbnail images directly to the stock photo or UGC asset. Notes were sent directly to their emails via tags in the comments section of the slides.

Our team shared storyboards, frequently jumped on video conference calls and even sent recorded hand gestures to indicate the kind of motion graphic movement they were looking for. Credits for this one include editor/motion designer Stevo Chang, motion designer Sierra Hunkins, associate editor Josh Copeland and, once again, post producer Cho.

What We Learned
WFH reinforced the need for the utmost transparency in team structures and the need for super-clear communication. Each and every member of our team has needed to embrace the change and take on new challenges and responsibilities. What worked before in the office doesn’t necessarily work in a remote situation.

The shutdown also forced us to discover new technologies like Evercast, and we likely wouldn’t have signed up for Signiant for a while otherwise. Moving forward, both tools have been great additions to what we can offer our clients. These new technologies also open up future opportunities for us to work with clients we didn’t have access to before (out of state and overseas). We can do live remote sessions without the client having to physically be in a bay, which is a game changer.


Ryan Curtis is head of post production at two-time Emmy-nominated Hecho Studios, part of MDC’s Constellation collective of companies.

Behind the Title: Kaboom’s Doug Werby

After starting his career as an editor, Doug Werby transitioned to director as well. He loves when a project combines both of his talents.

Name: Doug Werby

Company: Kaboom

Can you describe what Kaboom does?
Kaboom is a full-service creative company providing production and post.

Doug Werby on set

What’s your job title?
Director and editor

What does that entail?
Whatever it takes to pull off and execute a project to the highest degree of my ability. It means having a concrete vision from beginning to end and always collaborating with the team. From directing a voiceover session in LA from a remote island off the coast of Croatia to editing at 2am for an East Coast 6am delivery. Whatever it takes. I’m an all-in person by nature.

What would surprise people the most about what falls under that title?
That everything I’ve learned in editing I apply to each moment of directing. I started my career as an editor, and it’s about seeing collaborations from different angles and using that to produce creative work efficiently. I believe my strength is in making other people’s ideas better. Shaping the narrative with all the tools and talent available.

What’s your favorite part of the job?
Editing: Cracking open a fresh bin of uncut dailies in my editing studio when everything is quiet.
Directing: First shot of the first day of shooting.

What’s your least favorite?
Editing: Editing screen captures for app interfaces.
Directing: Late, late-night shoots.

What is your most productive time of day?
After my first cup of coffee, from 7:30am until around 11:30am, and then from 8pm to 11pm after a dessert espresso.

If you didn’t have this job, what would you be doing instead?
Origami or pottery, but basically the same thing I already do – shaping things with my hands – but paid commissions are more rarified.

Why did you choose this profession?
It was really the only possible option that made my heart beat faster.

How early did you know this would be your path?
When I was 22 years old. After four years at a liberal arts college, not knowing what the heck to study but always loving film and radio, I made that my focus and, ultimately, my career.

Can you name some recent projects you have worked on?
On the editing front for Kaboom: Campaigns for American Express promoting Wimbledon. This was a “social” project we cut in NYC. It was great fun bringing together celebrity, humor, music, stylized art direction and motion graphics for the small screen.

Wimbledon

The Oakland Airport TV edit for Kaboom. This was a throwback to the days of cutting deadpan mockumentary humor. I love this format and working closely with the creatives, we got the most out of the footage. Plus, I love Oakland Airport.

My two personal short films: For the past few years I’ve been parlaying all my skills from the commercial world and applying them to the scripted drama genre. I’ve come up with a series of real-life stories adapted for film that I’m packaging up to present as a whole. The idea would be to create a series of 10 half-hour programs all dealing with kindness. Individually the films have been honored at multiple film festivals.

The first is called No Tricks, based on a gritty, real-life experience of Julio Diaz that unveils a mugging gone good. Two men from different worlds bring change and some unexpected wisdom.

Motorbike Thief recounts a real-life incident that happened to Michael Coffin when he discovered a stranger with his stolen bike. Enraged at the sight, he confronts the assailant in no uncertain terms, and just when the situation is about to get out of hand, the anger turns empathetic and an unlikely friendship develops.

Do you put on a different hat when cutting a specific genre?
Completely. When editing spots and promotions, I’m trying to tell the most entertaining story in the shortest amount of time while staying focused on a clear message. When editing scripted material, I’m focused on story beats, character development and performance. Performance trumps all.

Oakland Airport

What is the project you are most proud of?
The work I did as a director with Kaboom for Bank of America via Hill Holliday a few years back for the Special Olympics. Making stars out of unsung heroes and shining a light on how brave these individuals are was a great honor. The films really put things into perspective and make you think about what we take for granted.

What do you edit on?
Adobe Premiere Pro is my current weapon of choice, but I would edit on an iPhone, Amiga 500 or a Moviola if need be.

Favorite plugin?
That would be Dope Transitions for Premiere.

Name three pieces of technology you can’t live without?
iPhone, iMac, airplane.

What do you do to destress from it all?
I bike the hills and valleys around the San Francisco Bay Area. I work out as much as possible, and I help my wonderful partner cook and entertain our friends and family. And travel!

Adobe’s Productions for Premiere is now available

If you remember, back at Sundance, Adobe announced a new tool for Premiere called Productions. Productions provides a flexible and scalable framework for organizing projects, sharing assets between them and keeping everything streamlined. It is now available within the latest version of Premiere.

Productions for Premiere Pro was designed from the ground up with input from top filmmakers and Hollywood editorial teams. Early versions of the underlying technology were tested on recent films such as Terminator: Dark Fate and Dolemite Is My Name. Special builds of Premiere with Productions are being used now in editorial on films like David Fincher’s Mank.

Productions makes it possible to divide large or complex workflows into smaller pieces based on the existing Premiere project format. Productions connects the projects, making them into components of the larger workflow and enabling a variety of use cases. For example, an editorial team working on a film can use Productions to organize its workflow around reels and scenes.

Episodic shows can be grouped by season, so it’s easy to access other seasons to retrieve things like title sequences or audio elements. Agencies can allocate a Production to each client, so they can quickly reference and retrieve assets from existing projects.

Media referencing across projects means editors can reuse assets within a production without creating duplicates. And the new Production panel in Premiere Pro is a command center for managing multi-project workflows. Any projects that get added to the Productions folder become part of the production. Productions keeps everything in sync, meaning that any changes made on a Mac or Windows disk are reflected in Premiere Pro and vice versa.

Using shared local storage, multiple editors can work on different projects in the same production. Project Locking ensures that no one overwrites another’s work; colleagues can still access each other’s project and copy content from it, but they can’t make changes until a given edit is complete.
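The lock-file pattern behind this kind of protection can be sketched in a few lines. To be clear, this is an illustration of the general technique, not Adobe's actual implementation; the `ProjectLock` class and its naming are invented for the example.

```python
import os
from pathlib import Path

class ProjectLock:
    """Minimal sketch of lock-file-based project locking.

    The first editor to create the .lock file owns the project; everyone
    else can read it (to see who holds the lock) but not acquire it.
    """

    def __init__(self, project_path: str, user: str):
        self.lock_path = Path(project_path).with_suffix(".lock")
        self.user = user

    def acquire(self) -> bool:
        # O_CREAT | O_EXCL makes creation atomic: it fails if the file
        # already exists, so two editors can never both "win" the lock.
        try:
            fd = os.open(self.lock_path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
            os.write(fd, self.user.encode())
            os.close(fd)
            return True
        except FileExistsError:
            return False

    def holder(self) -> str:
        return self.lock_path.read_text()

    def release(self) -> None:
        self.lock_path.unlink()
```

On shared storage this gives exactly the behavior described above: a second editor's `acquire()` fails, but `holder()` still tells them who is working in the project.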

All projects in a Production share the same settings, including scratch disks. This means that preview files rendered by one editor can be available for all editors who use that project, ensuring smooth playback and time savings for the whole team.

The Production panel allows users to see all the projects and shows who is working on what so the team can track the progress.

How is Productions different from Team Projects?
Productions is designed for collaborators working on shared local storage. Team Projects is built for remote collaboration — assets can be stored locally with individual users; project files are securely stored in Creative Cloud.

The two tool sets are distinct and currently cannot be combined. Productions is part of Premiere Pro and is included with all licenses. Team Projects is part of Team and Enterprise licenses for Premiere Pro and After Effects.

In order to support users working from home due to COVID-19, Adobe is making Team Projects available to all users through August 17.

Telestream Vantage option automates IMF packages from Adobe Premiere

Telestream has introduced Vantage IMF Producer, a Vantage option that automates the creation of IMF (Interoperable Master Format) packages from Adobe Premiere. Using a Vantage panel within Premiere, editors can access Vantage’s IMF processing and packaging.

IMF packages are the preferred method to deliver show masters to companies like Netflix, Fox, Disney and many others. “While IMF has become the delivery format of choice for many media-services providers and production companies, Adobe Premiere Pro users need to primarily focus on the storytelling process,” says Sue Skidmore, head of partner relations for video at Adobe. “Having a plugin panel to Adobe Premiere Pro for Vantage’s automated IMF packaging workflows gives editors confidence they can deliver compliant IMF masters right from their timeline.”

IMF is a SMPTE standard for providing a single, interchangeable master file format and structure for the distribution of content between businesses around the world. IMF provides a framework for creating a true file-based final master. Part of the Vantage Media Processing Platform, IMF Producer automates the creation of all files required in an IMF package from a single output render of a Premiere timeline. In addition to generating the primary package, editors can create additional sequences, which become supplemental IMF packages that contain different versions of audio, subtitles, edit points, Dolby Vision HDR metadata and more.

IMF Producer can process up to four jobs simultaneously from Premiere. Using Vantage Timed Text Flip, IMF Producer also provides full support for IMSC1 subtitles. The subtitle feature is required for IMF packages and is frequently missing from other solutions.
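Because so many of these deliverable problems come down to missing pieces, it helps to know what a complete IMF package looks like on disk: an ASSETMAP.xml at the root, a Packing List (PKL), one or more Composition Playlists (CPL) and the MXF essence files. The sketch below is a crude sanity check along those lines; the `PKL_*`/`CPL_*` filename patterns are common vendor conventions assumed for illustration, and this is no substitute for a real IMF validator.

```python
from pathlib import Path

def check_imf_package(folder: str) -> list[str]:
    """Report obviously missing core pieces of an IMF package folder.

    Illustrative only: real validation means parsing the XML, checking
    UUIDs, hashes and SMPTE ST 2067 constraints, not just file names.
    """
    pkg = Path(folder)
    problems = []
    if not (pkg / "ASSETMAP.xml").exists():
        problems.append("missing ASSETMAP.xml (maps asset UUIDs to files)")
    if not list(pkg.glob("PKL_*.xml")):
        problems.append("missing Packing List (PKL_*.xml)")
    if not list(pkg.glob("CPL_*.xml")):
        problems.append("missing Composition Playlist (CPL_*.xml)")
    if not list(pkg.glob("*.mxf")):
        problems.append("missing essence files (*.mxf)")
    return problems
```

A supplemental package, as described above, would carry its own PKL and CPL but reference essence already delivered in the primary package.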

Editor Brandy Troxler joins SF’s 1606 Studio

San Francisco editorial boutique 1606 Studio has hired editor Brandy Troxler. Having worked in the Bay Area for more than a decade, Troxler has edited spots for Mini USA, Yelp, Toyota, Texas.gov and others. Most recently, she was an in-house editor at San Francisco agency Heat.

1606 Studio executive producer Jon Ettinger has known Troxler for years; she worked at Beast Editorial when he was executive producer there. “She’s a great collaborator and a good team player,” he observes. “She fits the vibe established here by our partners, Doug Walker, Connor McDonald and Brian Lagerhausen, which is to work hard and form long-term partnerships with our clients.”

Troxler joined Heat in 2019 after six years at Beast Editorial. A graduate of Elon University in North Carolina, she also worked at Footpath Pictures in Raleigh-Durham, and Barbary Post in San Francisco.

Her first spot for 1606 Studio was a PSA produced for International Women’s Day by UN Women, a United Nations organization working for global equality. It’s from the agency Erich & Kallman and was directed by Caruso Company’s Doug Walker. The spot begins with what appears to be a news broadcast from the 1950s as a male newscaster recites a litany of workplace issues that negatively affect women. As he speaks, the scene around him becomes more modern and it becomes apparent that the issues he is referring to apply today.

“It’s simple, but powerful,” Troxler says. “As a woman of color, it was awesome to have the opportunity to tell that story. First and foremost, I am a storyteller and I like to tell diverse stories. While docu-style is a focus of mine, I quite enjoy cutting comedy as well.”

Aris EP offering free editing courses during COVID crisis

Aris, a full-service production and post house based in Los Angeles, is partnering with ThinkLA to offer free online editing classes for those who want to sharpen their skills while staying close to home during this worldwide crisis. The series will be taught by Aris EP/founder Greg Bassenian, who is also an award-winning writer and director. He has also edited numerous projects for clients including Coca-Cola, Chevy and Zappos.

“As the production industry has come to a rapid halt, these are challenging times for everyone. Thankfully, our post artists are able to continue their work remotely,” says Bassenian. “But we began to think about a way we could give back during these extraordinarily difficult times, and how we could offer something of assistance to both industry and non-industry professionals. We hope that this can provide people with some assistance and benefit, and hopefully some inspiration as we all try to get through this situation together.”

ThinkLA is a nonprofit organization with a mission to connect, inspire and educate the Los Angeles marketing community. The course is the first in a three-course series the two companies are offering through the month of April. The Fundamentals of Editing covers how to set up a project, the basics of working with footage, and finishing and exporting your completed piece. The virtual course is a primer for those interested in learning the principles of video editing, unfolding over three one-hour lessons held on consecutive Fridays – April 3, 10 and 17 – at 11am PDT.

The only prerequisite is that students have an Adobe Premiere Pro subscription before the classes begin. You can download a free 30-day trial here. The course is limited to 50 members per session — on a first-come, first-served basis — and no prior editing experience is necessary. You can sign up for the classes here.

Apple and Avid offer free temp licenses during COVID-19 crisis

Apple is offering free 90-day trials of Final Cut Pro X and Logic Pro X apps for all in order to help those working from home and looking for something new to master, as well as for students who are already using the tools in school but don’t have the apps on their home computers.

Apple Final Cut X

Apple is extending what is normally a 30-day trial for Final Cut Pro X, while the free trial is entirely new for Logic Pro X. The extension to 90 days is for a limited time and will revert to 30 days across both apps in the future.

Trials for both Final Cut Pro X and Logic Pro X are now available. Customers can download the free trials on the web pages for Final Cut Pro X and Logic Pro X. The 90-day extension is also available to customers who have already downloaded the free 30-day trial of Final Cut Pro X.

For its part, Avid is offering free temp licenses for remote users of the company’s creative tools. Commercial customers can get a free 90-day license for each registered user of Media Composer | Ultimate, Pro Tools, Pro Tools | Ultimate and Sibelius | Ultimate. For students whose school campuses are closed, any student of an Avid-based learning institution that uses Media Composer, Pro Tools or Sibelius can receive a free 90-day license for the same products.

The offer is open through April 17.

Main Image: Courtesy of Avid

Words of wisdom from editor Jesse Averna, ACE

We are all living in a world we’ve never had to navigate before. People’s jobs are in flux, others are working from home, and anxiety is a regular part of our lives. Through all the chaos, Jesse Averna has been a calming voice on social media, so postPerspective reached out to ask him to address our readership directly.

Jesse, who was co-founder of the popular Twitter chat and Facebook group @PostChat, works at Disney Animation Studio and is a member of the American Cinema Editors.


Hey,

How are you doing? This isn’t an ad. I’m not going to sell you anything or try to convince you of anything. I just want to take the opportunity to check in. Like many of you, I’m a post professional (an editor) currently working from home. If we don’t look out for each other, who will? Please know that it’s okay not to be okay right now. I have to be honest, I’m exhausted. I’m just endlessly reading news and searching for new news and reading posts about news I’ve already read and searching again for news I might have missed …

I want to remind you of a couple things that I think might bring some peace, if you let me. I fear it’s about to get much darker and much scarier, so we need to anchor ourselves to some hope.

You are valuable. The world is literally different because you are here. You have intrinsic value, and that will never change. No matter what. You are thought about and loved, despite whatever the voice in your head says. I’m sure your first reaction to reading that is to blow it off, but try to own it. Even for just a moment. It’s true.

You don’t deserve what’s going on, but let it bring some peace that the whole world is going through it together. You might be isolated, but you’re not alone. We are forced to look out for one another by looking out for ourselves. It’s interesting; I feel so separate and vulnerable, but the truth is that the whole planet is feeling and reacting to this as one. We are in sync, whether we know it or not — and that’s encouraging to me. We ALL want to be well and be safe, and we want our neighbors to be well also. We have a rare moment of feeling like a people, like a planet.

If you are feeling anxious, do me a favor tonight. Go outside and look at the stars. Set a timer for five minutes. No entertainment or phone or anything else. Just five minutes. Reset. Feel yourself on a cosmic scale. Small. A blink of an eye. But so, so valuable.

And please give yourself a break. A sanity check. If you need help, please reach out. If you need to nest, do it. You need to tune out, do it. Take care of yourself. This is an unprecedented moment. It’s okay not to be okay. Once you can, though, see who you can help. This complete shift of reality has made me think about legacy. This is a unique legacy-building moment. That student who reached out to you on LinkedIn asking for advice? You now have time to reply. That nonprofit you thought about volunteering your talents to? Now’s your chance. Even just to make the connection. Who can you help? Check in on? You don’t need any excuse in our current state to reach out.

I know I’m just some rando you’re reading on the internet, but I believe you are going to make it through this. You are wonderful. Do everything you can to be safe. The world needs you. It’s a better place because you are here. You know things, have ideas to share and will make things that none of the rest of us do or have.

Hang in there, my friends, and let me know if you have any thoughts, encouragements or tips for staying sane during this time. I’ll try to compile them into another article to share.

Jesse
@dr0id


Jesse Averna  — pictured on his way to donate masks — is a five-time Emmy-winning ACE editor living in LA and working in the animation feature world. 

Finishing artist Tim Nagle discusses work on indie film Miss Juneteenth

Lucky Post Flame artist Tim Nagle has a long list of projects under his belt, including collaborations with David Lowery — providing Flame work on the short film Pioneer as well as finishing and VFX work on Lowery’s motion picture A Ghost Story. He is equally at home working on spots, such as campaigns for AT&T, Hershey’s, The Home Depot, Jeep, McDonald’s and Ram.

Nagle began his formal career on the audio side of the business, working as an engineer for Solid State Logic, where he collaborated with clients including Fox, Warner Bros., Skywalker, EA Games and ABC.

Tim Nagle

We reached out to Nagle about his and Lucky Post’s work on the feature film Miss Juneteenth, which premiered at Sundance and was recently honored by SXSW 2020 as the winner of the Louis Black Lone Star award.

Miss Juneteenth was written and directed by Channing Godfrey Peoples — her first feature-length film. It focuses on a woman from the South — a bona fide beauty queen once crowned Miss Juneteenth, a title commemorating the day slavery was abolished in Texas. The film follows her journey as she tries to hold onto her elegance while striving to survive. She looks for ways to thrive despite her own shortcomings as she marches, step by step, toward self-realization.

How did the film come to you?
We have an ongoing relationship with Sailor Bear, the film’s producing team of David Lowery, Toby Halbrooks and James Johnston. We’ve collaborated with them on multiple projects, including The Old Man & The Gun, directed by Lowery.

What were you tasked to do?
We were asked to provide dailies transcoding, additional editorial, VFX, color and finishing and ultimately delivery to distribution.

How often did you talk to director Channing Godfrey Peoples?
Channing was in the studio, working side by side with our creatives, including colorist Neil Anderson and me, to get the project completed for the Sundance deadline. It was a massive team effort, and we felt privileged to help Channing with her debut feature.

Without spoilers, what most inspires you about the film?
There’s so much to appreciate in the film — it’s a love letter to Texas, for one. It’s directed by a woman, has a single mother at its center and is a celebration of black culture. The LA Times called it one of the best films to come out of Sundance 2020.

Once you knew the film was premiering at Sundance, what was left to complete and in what amount of time?
This was by far the tightest turnaround we have ever experienced. Everything came down to the wire, sound being the last element. It’s one of the advantages of having a variety of talent and services under one roof — the creative collaboration was immediate, intense and really made possible by our shorthand and proximity.

How important do you think it is for post houses to be diversified in terms of the work they do?
I think diversification is important not only for business purposes but also to keep the artists creatively inspired. Lucky Post’s ongoing commitment to support independent film, both financially and creatively, is an integrated part of our business along with brand-supported work and advertising. Increasingly, as you see greater crossover of these worlds, it just seems like a natural evolution for the business to have fewer silos.

What does it mean to you as a company to have work at Sundance? What kinds of impact do you see — business, morale and otherwise?
Having a project that we put our hands on accepted into Sundance was such an honor. It is unclear what the direct business impacts might be, but the immediate value is clear in morale. The excitement and inspiration we all get from projects like this just naturally makes how we do business better.

What software and hardware did you use?
On this project we started with Assimilate Scratch for dailies creation. Editorial was done in Adobe Premiere. Color was Blackmagic DaVinci Resolve, and finishing was done in Autodesk Flame.

What is a piece of advice that you’d give to filmmakers when considering the post phase of their films?
We love being involved as early as possible — certainly not to get in anyone’s way, but to be in the background supporting the director’s creative vision. I’d say get with a post company that can assist in setting looks and establishing a workflow. With a little bit of foresight, this will create the efficiency you need to deliver with the utmost quality on what always ends up being a tight deadline.

Workstations: Offline Editing Workflows

By Karen Moltenbrey

When selecting a workstation, post facilities differ in their opinions about what’s most important, depending on the function the workstations will serve. It goes without saying that everyone wants value. For some, power is paramount; for others, speed is the top priority; for others still, reliability reigns supreme. Luckily for users, today’s workstations can check all those boxes.

As Eric Mittan, director of technology at New York’s Jigsaw Productions, is quick to point out, it’s hard to fathom the kinds of upgrades in power we’ve seen in workstations just in the time he has been working with them professionally. He recalls that in 2004, it took an overnight encoding session to author a standard-definition DVD with just one hour of video — and that task was performed on one of the first dual-processor desktops available to the regular consumer. “Nowadays, that kind of video transcode can take 15 minutes on a ‘light’ laptop, to say nothing of the fact that physical media like the DVD has gone the way of the dinosaur,” he says.

Eric Mittan

That is just the tip of the iceberg in terms of the revolution that workstations have undergone in a very short period. Here, we examine the types of workstations that a pair of studios are using for their editing tasks. Jigsaw, a production company, does a large portion of its own post through Apple iMacs that run Avid Media Composer; it is also a client of post houses for work such as color and final deliverables. Meanwhile, another company, Final Cut, is also a Mac-based operation, running Avid Media Composer and Adobe Premiere Pro, although the company’s Flames run on HP workstations.

[Editor’s Note: These interviews were done prior to the coronavirus lockdown.]

Jigsaw Productions
Jigsaw Productions is a documentary television and film company that was founded in 1978 by documentary filmmaker Alex Gibney. It has since transitioned from a company that made one movie at a time to one that is simultaneously producing multiple features and series for distribution by a number of networks and distribution partners.

Today, Jigsaw does production and offline editorial for all its own films and series. “Our commitment is to filmmakers bringing real stories to their audience,” Mittan says. Jigsaw’s film and episodic projects include the political (Client 9: The Rise and Fall of Eliot Spitzer), the musical (History of the Eagles) and the athletic (The Armstrong Lie).

On the technical front, Jigsaw does all the creative editorial in house using Avid’s Media Composer. After Jigsaw’s producers and directors are satisfied with the storytelling, the lion’s share of the more technical work is left to the company’s partners at various post houses, such as Harbor, Technicolor, Light Iron and Final Frame, among others. Those facilities do the color timing and DCP generation in the case of the feature titles. Most of the conform and online work for Jigsaw’s TV series is now done in house and then sent out for color.

“I wouldn’t say for sure that we have mastered the Avid-to-Resolve online workflow, but we have become better at it with each project,” says Mittan. It’s Mittan’s job to support post and offline operations along with the needs of the others in the office. The backbone of the post fleet comprises 26 27-inch i7 iMacs (2018 models) with 32GB of RAM. During 2018 and 2019, Jigsaw experienced a period of rapid growth, adding 19 new edit suites. (That was in addition to the original 13 built out before Mittan came aboard in 2017.) There are also some earlier iMac models used for lighter tasks, such as screening, occasional transcoding and data transfers, as well as eight Mac mini screening stations and five Mac Pro cylinders for heavy transcoding and conform/online tasks. Roughly 10 of the machines are 2019 models, though they were purchased with i5 processors, not i7s.

“Jigsaw’s rapid expansion pushed us to buy new machines in addition to replacing a significant portion of our 2012/2013 model Mac Pro and iMac units that had comprised most of our workstations prior to my arrival,” Mittan notes. Each project group at the company is responsible for its own data management and transcoding its own dailies.

Furthermore, Jigsaw has an Avid Nexis shared storage system. “Our editors need to be able to run the latest version of Avid and must maintain and play back multiple streams of DNxHR SQ via a 1Gb connection to our Nexis shared storage. While documentary work tends to be lower resolution and/or lower bandwidth than narrative scripted work, every one of our editors deserves to be able to craft a story with as few technical hiccups as possible,” says Mittan. “Those same workstations frequently need to handle heavy transcodes from interview shoots and research archive gathered each day by production teams.”
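Mittan’s gigabit budget is easy to sanity-check with back-of-the-envelope arithmetic. The figures below are assumptions, not from the article: roughly 145 Mb/s per 1080p DNxHR SQ stream (rates scale with resolution and frame rate) and about 80% of the nominal line rate usable after protocol overhead:

```python
# Back-of-the-envelope check: DNxHR SQ streams over a 1Gb Nexis connection.
# Assumed figures (not from the article): ~145 Mb/s per 1080p SQ stream,
# and ~80% of the nominal gigabit line rate usable in practice.
LINK_MBPS = 1000          # nominal 1Gb Ethernet
USABLE_FRACTION = 0.8     # assumed real-world throughput
STREAM_MBPS = 145         # assumed per-stream DNxHR SQ rate at 1080p

usable_mbps = LINK_MBPS * USABLE_FRACTION   # ~800 Mb/s in practice
streams = int(usable_mbps // STREAM_MBPS)   # whole streams that fit
print(streams)  # → 5
```

Five or so simultaneous streams comfortably covers multicam documentary cutting, which is consistent with Mittan’s point that a gigabit connection to the Nexis is enough for this class of media.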

When buying new equipment, Mittan looks to strike a balance between economy and sustainability. While the work at Jigsaw does not always require the latest and greatest of all possible end-user technology, he says, each purchase needs to be made with an eye toward how useful it will remain three to five years into the future.

Salt, Fat, Acid, Heat

While expansion in the past few years resulted in the need for additional purchases, Mittan is hoping to get Jigsaw on a regular schedule of cycling through each of the units over a period of five to six years. Optimally, the edit suite units are used for three years or more before being downgraded for lighter tasks and eventually used as screening stations for Jigsaw’s producers. Even beyond that, the post machines could see life in years six to eight as office workstations for some of the non-post staff and interns. Although Mittan has yet to get hands-on with one of the new Mac Pro towers, he is impressed by what he has read and hopes for an acquisition in 2021 to replace the Mac Pro cylinders for online and conform work.

Post at Jigsaw runs Avid Media Composer on the Apple machines. They also use the Adobe Creative Cloud suite for motion graphics within Adobe After Effects and Photoshop. Mittan has also implemented a number of open-source software tools to supplement Jigsaw’s tool kit for assistant editors. That includes command-line tools (like FFmpeg) for video and audio transcodes and Rsync for managed file transfers and verification.

“I’ve even begun to write a handful of custom software scripts that have made short work of tasks common to documentary filmmaking — mostly the kind of common video transcoding jobs that would usually require a paid title but that can be taken care of just as well with free software,” he says.
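A minimal sketch of the kind of transcode helper Mittan describes, assuming hypothetical file names and illustrative FFmpeg settings (this is not Jigsaw’s actual script; it only assembles the command list, which could then be handed to subprocess.run or printed for inspection):

```python
from pathlib import Path

def dnxhd_proxy_cmd(src: Path, dst_dir: Path) -> list[str]:
    """Build an FFmpeg command that transcodes a camera file to a
    DNxHD 1080p editorial proxy in an MXF wrapper (illustrative settings)."""
    dst = dst_dir / (src.stem + ".mxf")
    return [
        "ffmpeg", "-i", str(src),
        "-c:v", "dnxhd", "-b:v", "115M",      # DNxHD 115-class bitrate
        "-s", "1920x1080", "-pix_fmt", "yuv422p",
        "-c:a", "pcm_s16le",                  # uncompressed audio for Avid
        str(dst),
    ]

# Hypothetical clip and destination folder for illustration
cmd = dnxhd_proxy_cmd(Path("A001_C003.mov"), Path("proxies"))
print(" ".join(cmd))
```

A real wrapper would loop over a dailies folder, run each command and finish with an rsync checksum pass (rsync -c) to verify the copies, the same free-tool approach Mittan describes.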

Additionally, Jigsaw makes frequent use of servers, either functioning as a device for a specific task or for automation.

Jigsaw has done projects for HBO (Robin Williams: Come Inside My Mind), Showtime (Enemies: The President, Justice & the FBI), Discovery Channel (Why We Hate), A&E (The Clinton Affair) and more, as well as for Netflix (Salt Fat Acid Heat, The Family) — work Mittan describes as an exercise in managing more and more pixels.

The Family

Indeed, documentaries can present big challenges when it comes to dealing with a plethora of media formats. “Documentary work can frequently deal with subjects that have already had a significant media footprint in legacy resolutions. This means that if you’re trying to build a documentary in 4K, you’re going to be dealing with archival footage that is usually HD or SD. You may shoot a handful of new interviews in your new, so-called ‘native’ footage but be overwhelmed by hours upon hours of footage from a VHS collection, or stories that have been downloaded from the website of a TV station in the Midwest,” he adds.

“Working with mixed resolutions means you have to have the capability of running and gunning with your new 4K footage, but the lower resolutions can’t leave your creative editors feeling as though they’ve been left with remnants from another time in history. Blending all of those elements together in a way that tells a cohesive story requires technology that can bring together all of those pieces (and newly generated elements like graphics and reenactments) into a unified piece of media without letting your viewing audience feel the whiplash of frequent resolution changes.”

Miky Wolf

Final Cut
Celebrating its 25th anniversary this year, Final Cut was founded in London by editor Rick Russell. It expanded to New York 20 years ago and to Los Angeles 15 years ago. Across all three offices and several subsidiaries — Significant Others VFX, Machine Sound and The Lofts — Final Cut has more than 100 staff and artists worldwide, offering offline editing, online editing, VFX, graphics, finishing, sound design, mixing and original composition, as well as “dry-hire” facilities for long-form content such as original Netflix series like Sex Education.

Primarily, Final Cut does offline creative editorial. Through Significant Others, it offers online editing and finishing. There are, however, smaller jobs — such as music videos and digital work — for which, as editor Miky Wolf notes, the facility “does it all.”

Ryan Johnson

The same can be said of technical supervisor Ryan Johnson, whose job it is to design, implement and maintain the technical infrastructure for Final Cut’s New York and Los Angeles offices. This includes the workstations, software, data storage, backups, networking and security. “The best workstations should be like the best edited films. Something you don’t notice. If you are aware of the workstation while you’re working, it’s typically not a good thing,” Wolf says.

Johnson agrees. “Really, the workstation is just there to facilitate the work. It should be invisible. In fact, ours are mostly hidden under desks and are rarely seen. Mostly, it’s a purpose-built machine, designed less for aesthetics and portability than for reliability and practicality.”

Final Cut’s typical edit room runs off a Mac Pro with 32GB of RAM; there are two editing monitors, a preview monitor on the desk and a client monitor. The majority of the company’s edit workstations are six-core 2013 Mac Pro “trash cans” with AMD FirePro D500 GPUs and 32GB of RAM; approximately 16 of these workstations are spread between the NY and LA offices. The workstations use little to no local storage, since the work resides on Avid Nexis servers. Each workstation is connected to a pair of 24-inch LCD displays, while video and audio from the edit software are delivered via Blackmagic Design hardware to an LCD preview monitor on the editor’s desk and to an OLED TV for clients.

The assistant editors all work on 27-inch iMacs of various vintages, mainly 2017 i7 models with 32GB of RAM. For on-set/off-site work, Final Cut keeps a fleet of MacBook Pros, mostly the 2015 Thunderbolt 2 pre-Touch Bar models. These travel with USB 3 SSDs for media storage. Final Cut’s Flames, however, all run on dual 12-core HP Z8s with 128GB of RAM. These machines use local SSD arrays for media storage.

According to Johnson, the workstations (running macOS 10.14.6) mostly are equipped with Avid Media Composer or Adobe Premiere Pro, and the editors sometimes “dabble” in Blackmagic’s DaVinci Resolve (for transcoding or when someone wants to try their hand at editing on it). “We primarily work with compressed proxy footage — typically DNxHD 115 or ProRes LT — at 1080p, so bandwidth requirements aren’t too high. Even lower-spec machines handle a few streams well,” he says. “Sequences that involve many layers or complicated effects will often require rendering, but the machines are fast enough that wait times aren’t too long.”

The editors also use Soundminer’s products for their sound effects library. The assistants perform basic compositing in Adobe After Effects, which the machines handle well, Johnson adds. “However, occasionally they will need to transcode raw/camera original footage to our preferred codec for editing. This is probably the most computationally intensive task for any of the machines, and we’ll try to use newer, faster models for this purpose.”

Stray Dolls feature film

Wherever possible, Final Cut deploys the same types of workstations across all its locations, as maintenance becomes easier when parts are interchangeable, and software compatibility is easier to manage when dealing with a homogeneous collection of machines. Not to mention the political benefit: Everybody gets the same machine, so there’s no workstation envy, so to speak.

Reliability and expandability are the most important factors Johnson considers in a workstation. He acknowledges that the 2013 Mac Pros were a disappointment on both counts: “They had thermal issues from the start — Apple admitted as much — that resulted in unpredictable behavior, and you were stuck with whichever 2013-era GPU you chose when purchasing the machine,” he says. “We expect to get many trouble-free years out of the workstations we buy. They should be easy to fix, maintain and upgrade.”

When selecting workstations for Final Cut, a Macintosh shop, there is not a great deal of choice. “Our choices are quickly narrowed down to whatever Apple happens to be selling,” explains Johnson. “Given the performance tiers of the models available, it is a matter of analyzing our performance needs versus our budget. In an ideal world, the entire staff would be working on the fastest possible machine with the most RAM and so forth, but alas, that is not always in the budget. Therefore, compromise must be found in selecting machines that can capably handle the typical workload and are fast enough not to keep editors and assistants waiting too long for renders.”

The most recent purchases were the new iMacs for the assistants in LA. “For the money, they are great machines, and I’ve found them to be reliable even when pushing them through all-night renders, transcodes, etc. They’re at least as fast as the Mac Pros and, in most applications, even faster,” Johnson points out. He expects to replace the 2013 Mac Pros this year.

Florence and the Machine “Big God” music video

Wolf notes that he must be able to work as efficiently at home as he does at the office, “and that’s one nice thing about the evolution of offline editing. A combination of robust laptops and portable SSDs has allowed us to take the work anywhere.”

Using the above-described setup, Final Cut recently finished a campaign for an advertising client in which the edit started on set in LA, continued in the hotel room and then finished back in NY. “We needed to be able to work remotely, even on the plane home, just to get the first cuts done in time,” Wolf explains. “Agencies expect you to be fast. They schedule presentations assuming we can work around the clock to get stuff together — we need systems that can support us.”

Johnson highlighted another recent project with a tight schedule that involved cutting a multi-camera sequence in UHD from portable SSD storage on a standard iMac. “This would have been impossible just a few years ago,” he adds.

Main Image: Netflix’s Sex Education


Karen Moltenbrey is a veteran writer, covering visual effects and post production.

My Top Five Ergonomic Workstation Accessories

By Brady Betzel

Instead of writing up my normal “Top Five Workstation Accessories” column this year, I wanted to take a slightly different route and focus on products that might lessen pain and maybe even improve your creative workflow — whether you are working at a studio or, more likely these days, working from home.

As an editor, I sit in a chair for most of my day, and that is on top of my three- to four-hour round-trip commute to work. As aches and pains build up (I’m 36, and I’m sure it doesn’t just get better), I had to start looking for solutions to alleviate the pain I can see coming in the future. In the past I have mentioned products like the Wacom Intuos Pro Pen tablet, which is great and helped me lessen wrist pain. Or color correction panels such as the Loupedeck, which help creative workflows but also keep you from relying solely on the mouse, further lessening wrist pain.

This year I wanted to look at how the actual setup of a workstation environment might prevent or alleviate pain. So get out of your seat and move around a little, take a walk around the block, and when you get back, maybe rethink how your workstation environment could become more conducive to a creativity-inspiring flow.

Autonomous SmartDesk 2 
One of the most useful things in my search for flexibility in the edit bay is the standup desk. Originally, I went to Ikea and found a clearance tabletop in the “dents” section and then found a kitchen island stand that was standing height. It has worked great for over 10 years; the only issue is that it isn’t easily adjustable, and sometimes I need to sit to really get my editing “flow” going.

Many companies offer standing desk solutions, including manual options like the classic VariDesk desk riser. If you have been in the offline editing game over the past five to 10 years, then you have definitely seen these come and go. But at almost $400, you might as well look for a robotic standing desk. This is where the Autonomous SmartDesk 2 comes into play. Depending on whether you want the Home Office version, which stands between 29.5 inches and 48 inches, or the Business Office version, which stands between 26 inches and 52 inches, you are looking to spend $379 or $479, respectively (with free shipping included).

The SmartDesk 2 desktop itself is made of MDF (medium-density fibreboard), which helps to lower the overall cost but is still sturdy and will hold up to 300 pounds. From black to white oak, there are multiple color options, so the desk can even be a conversation piece in the edit bay. I have the Business version in black along with a matching black chair, and I love that it looks clean and modern. The SmartDesk 2 is operated using a front-facing switch plate complete with up, down and four height-level presets. It operates smoothly and, to be honest, impressively. It gives a touch of class to any environment. Setup took about half an hour, and it came with easy-to-follow instructions, screws/washers and tools.

Keep an eye out for my full review of the Autonomous SmartDesk 2 and ErgoChair 2, but for now think about how a standup desk will at least alleviate some of the sitting you do all day while adding some class and conversation to the edit bay.

Autonomous ErgoChair 2 
Along with a standup desk — and, more important in my opinion — is a good chair. Most offline editors and assistant editors work at a company that either values their posture and buys Herman Miller Aeron chairs, or cheaps out and buys the $49 special at Office Depot. I never quite understood the benefit of saving a few bucks on a chair, especially if a company pays for health insurance — because in the end, they will be paying for it. Not everyone likes or can afford the $1,395 Aeron chairs, but there are options that don’t involve ruining your posture.

Along with the Autonomous SmartDesk 2, you should consider buying the ErgoChair 2, which costs $349 — a similar price to other chairs, like the Secretlab Omega series gaming chair that retails for $359. But the ErgoChair 2 has the best of both worlds: an Aeron chair-feeling mesh back and neck support plus a super-comfortable seat cushion with all the adjustments you could want. Even though I have only had the Autonomous products for a few weeks now, I can already feel the difference when working at home. It seems like a small issue in the grand scheme of things, but being comfortable allows my creativity to flow. The chair took under 30 minutes to build and came with easy-to-follow instructions and good tools, just like the SmartDesk 2.

A Footrest
When I first started in the industry, as soon as I began a freelance job, I would look for an old Sony IMX tape packing box. (Yes, the green tapes. Yes, I worked with tape. And yes, I can operate an MSW-2000 tape deck.) Typically, the boxes would be full of tapes because companies bought hundreds and never used them, and they made great footrests! I would line up a couple boxes under my feet, and it made a huge difference for me. Having a footrest relieves lower back pressure that I find hard to relieve any other way.

As I continue my career into my senior years, I finally discovered that there are actual footstools! Not just old boxes. One of my favorites is on Amazon. It is technically an adjustable nursing footstool but works great for use under a desk. And if you have a baby on the way, it’s a two-for-one deal. Either way, check out the “My Brest Friend” on Amazon. It goes for about $25 with free one-day Amazon Prime shipping. Or if you are a woodworker, you might be able to make your own.

GoFit Muscle Hook 
After sitting in an edit bay for multiple hours, multiple days in a row, I really like to stretch and use a massager to un-stuff my back. One of the best massagers I have seen in multiple edit bays is called the GoFit Muscle Hook.

Luckily for us it’s available at almost any Target or on the Target website for about $25. It’s an alien-looking device that can dig deep into your shoulder blades, neck and back. You can use it a few different ways — large hook for middle-of-the-back issues, smaller hook that I like to use on the neck and upper back, and the neck massage on the bar (that one feels a little weird to me).

There are other massage devices similar to the Muscle Hook, but in my opinion the GoFit Muscle Hook is the best. The plastic composite seems indestructible and almost feels like it could double as a self-defense tool. But it can work out almost any knots you have worked up after a long day. If you don’t buy anything else for self-care, buy the Muscle Hook. You will be glad you did. Anyone who gets one has that look of pain and relief when they use it for the first time.

Foam Roller
Another item that I just started using was a foam roller. You can find them anywhere for the most part, but I found one on Amazon for $13.95 plus free Amazon Prime one-day shipping. It’s also available on the manufacturer’s website for about $23. Simply, it’s a high-density foam cylinder that you roll on top of. It sounds a little silly, but once you get one, you will really wonder how you lived without one. I purchased an 18-inch version, but they range from 12 inches to 36 inches. And if you have three young sons at home, they can double as fat lightsabers (but they hurt, so keep an eye out).

Summing Up
In the end, there are so many ways to try keeping a flexible editing lifestyle, from kettlebells to stand-up desks. I’ve found that just getting over the mental hurdle of not wanting to move is the biggest catalyst. There are so many great tech accessories for workstations, but we hardly mention ones that can keep our bodies moving and our creativity flowing. Hopefully, some of these ergonomic accessories for your workstation will spark an idea to move around and get your blood flowing.

For some workout inspiration, Onnit has some great free workouts featuring weird stuff like maces, steel clubs and sandbags, but also kettlebells. The site also has nutritional advice. For foam roller stretches, I would check out the same Onnit Academy site.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Digital Anarchy’s Transcriptive 2.0

By Barry Goch

Not long ago, I had the opportunity to go behind the scenes at Warner Bros. to cover the UHD HDR remastering of The Wizard of Oz. I had recorded audio of the entire experience so I could get accurate quotes from all involved — about an hour of audio. I then uploaded the audio file to Rev.com and waited. And waited. And waited. A few days later they came back and said they couldn’t do it. I was perplexed! I checked the audio file, and I could clearly hear the voices of the different speakers, but they couldn’t make it work.

That’s when my editor, Randi Altman, suggested Digital Anarchy’s Transcriptive, and it saved the day. What is Transcriptive? It’s an automated, intelligent transcription plugin for Adobe Premiere editors, designed to automatically and accurately transcribe video using multiple speech and natural language processing engines.

Well, not only did Transcriptive work, it worked super-fast, and it’s affordable and simple to use … once everything is set up. I spent a lot of time watching Transcriptive’s YouTube videos and then had to create two accounts for the two different AI transcription portals that they use. After a couple of hours of figuring and setup, I was finally good to go.

Digital Anarchy has lots of videos on YouTube about setting up the program. Here is a link to the overview video and a link to 2.0 new features. After getting everything set up, it took less than five minutes from start to finish to transcribe a one-minute video. That includes the coolest part: automatically linking the transcript to the video clip with word-for-word accuracy.

Transcriptive extension

Step by Step
Import video clip into Premiere, select the clips, and open the Transcriptive Extension.

Tell Transcriptive if you want to use an existing transcript or create a new transcription.

Then choose the AI that you want to transcribe your clip. You see the cost upfront, so no surprises.

Launch app

I picked the Speechmatics AI:

Choosing AI

Once you press continue, Media Encoder launches.

Media Encoder making FLAC file automatically.

And Media Encoder automatically makes a FLAC file and uploads it to the transcription engine you picked.

One minute later, no joke, I had a finished transcription linked word-accurately to my source video clip.

Final Thoughts
The only downside to this is that the transcription isn’t 100% accurate. For example, it heard Lake Tahoe as “Lake Thomas” and my son’s name, Oliver, as “over.”

Final transcription

This lack of accuracy is not a deal breaker for me, especially since I would have been totally out of luck without it on The Wizard of Oz article, which you can read here. For me, the speed and ease of use more than compensate for the lack of accuracy. And as the AI engines get better, the accuracy will only improve.

And on February 27, Digital Anarchy released Transcriptive V.2.0.3, which is compatible with Adobe Premiere v14.0.2. The update also includes a new prepaid option that can lower the cost of transcription to $2.40 per hour of footage. Transcriptive’s tight integration with Premiere makes it a must-have for working with transcripts when cutting long- and short-form projects.


Barry Goch is a finishing artist at LA’s The Foundation as well as a UCLA Extension Instructor, Post Production. You can follow him on Twitter at @Gochya.

Goldcrest Post’s Jay Tilin has passed away

Jay Tilin, head of production at New York’s Goldcrest Post, passed away last month after a long illness. For 40 years, Tilin worked in the industry as an editor, visual effects artist and executive. His many notable credits include the Netflix series Marco Polo and the HBO series Treme and True Detective.

“Jay was an integral part of New York’s post production community and one of the top conform artists in the world,” said Goldcrest Post managing director Domenic Rom. “He was beloved by our staff and clients as an admired colleague and valued friend. We offer our heartfelt condolences to his family and all who knew him.”

Tilin began his career in 1980 as an editor with Devlin Productions. He also spent many years at The Tape House, Technicolor, Riot and Deluxe, all in New York. He was an early adopter of many now standard post technologies, from the advent of HD video in the 1990s through more recent implementations of 4K and HDR finishing.

His credits also include the HBO series Boardwalk Empire, the Sundance Channel series Hap and Leonard, the PBS documentary The National Parks and the Merchant Ivory feature City of Your Final Destination. He also contributed to numerous commercials and broadcast promos. A native New Yorker, Tilin earned a degree in broadcasting from SUNY Oswego.

Tilin is survived by his wife Betsy, his children Kelsey and Sam, his mother Sonya and his sister Felice (Trudy).

Editor Anthony Marinelli joins Northern Lights

Editor Anthony Marinelli has joined post studio Northern Lights. Marinelli’s experience spans commercial, brand content, film and social projects. He comes to Northern Lights from a four-year stint at TwoPointO, where he was also a partner. He has previously worked at Kind Editorial, Alkemy X, Red Car, Cut+Run and Crew Cuts.

Marinelli’s work includes projects for Mercedes, FedEx, BMW, Visa, Pepsi, Scotts, Mount Sinai and Verizon. He also edited the Webby Award-winning documentary “Alicia in Africa,” featuring Alicia Keys for Keep a Child Alive.

Marinelli is also active in independent theater and film. He has written and directed many plays and short films, including Acoustic Space, which won Best Short at the 2018 Ridgewood Guild Film Festival and Best Short Screenplay at the Richmond International Film Festival.

Marinelli’s most recent campaigns are for Mount Sinai and Bernie & Phyl’s for DeVito Verdi.

He works on Avid Media Composer and Adobe Premiere. You can watch his reel here.

Blackmagic releases Resolve 16.2, beefs up audio post tools

Blackmagic has updated its color, edit, VFX and audio post tool to Resolve 16.2. This new version features major Fairlight updates for audio post as well as many improvements for color correction, editing and more.

This new version has major new updates for editing in the Fairlight audio timeline when using a mouse and keyboard. This is because the new edit selection mode unlocks functionality previously only available via the audio editor on the full Fairlight console, so editing is much faster than before. In addition, the edit selection mode makes adding fades and cuts and even moving clips only a mouse click away. New scalable waveforms let users zoom in without adjusting the volume. Bouncing lets customers render a clip with custom sound effects directly from the Fairlight timeline.

Adding multiple clips is also easier, as users can now add them to the timeline vertically, not just horizontally, making it simpler to add multiple tracks of audio at once. Multichannel tracks can now be converted into linked groups directly in the timeline so users no longer have to change clips manually and reimport. There’s added support for frame boundary editing, which improves file export compatibility for film and broadcast deliveries. Frame boundary editing now adds precision so users can easily trim to frame boundaries without having to zoom all the way in the timeline. The new version supports modifier keys so that clips can be duplicated directly in the timeline using the keyboard and mouse. Users can also copy clips across multiple timelines with ease.

Resolve 16.2 also includes support for the Blackmagic Fairlight Sound Library with new support for metadata-based searches, so customers don’t need to know the filename to find a sound effect. Search results also display both the file name and description, so finding the perfect sound effect is faster and easier than before.

MPEG-H 3D immersive surround sound audio bussing and monitoring workflows are now supported. Additionally, improved pan and balance behavior includes the ability to constrain panning.

Fairlight audio editing also has index improvements. The edit index is now available in the Fairlight page and works as it does in the other pages, displaying a list of all media used; users simply click on a clip to navigate directly to its location in the timeline. The track index now supports drag selections for mute, solo, record enable and lock as well as visibility controls so editors can quickly swipe through a stack of tracks without having to click on each one individually. Audio tracks can also be rearranged by clicking and dragging a single track or a group of tracks in the track index.

This new release also includes improvements in AAF import and export. AAF support has been refined so that AAF sequences can be imported directly to the timeline in use. Additionally, if the project features a different time scale, the AAF data can also be imported with an offset value to match. AAF files that contain multiple channels will also be recognized as linked groups automatically. The AAF export has been updated and now supports industry-standard broadcast wave files. Audio cross-fades and fade handles are now added to the AAF files exported from Fairlight and will be recognized in other applications.

For traditional Fairlight users, this new update makes major improvements in importing old legacy Fairlight projects —including improved speed when opening projects with over 1,000 media files, so projects are imported more quickly.

Audio mixing is also improved. A new EQ curve preset for clip EQ in the inspector allows removal of troublesome frequencies. New FairlightFX filters include a new meter plug-in that adds a floating meter for any track or bus, so users can keep an eye on levels even if the monitoring panel or mixer are closed. There’s also a new LFE filter designed to smoothly roll off the higher frequencies when mixing low-frequency effects in surround.

Working with immersive sound workflows using the Fairlight audio editor has been updated and now includes dedicated controls for panning up and down. Additionally, clip EQ can now be altered in the inspector on the editor panel. Copy and paste functions have been updated, and now all attributes — including EQ, automation and clip gain — are copied. Sound engineers can set up their preferred workflow, including creating and applying their own presets for clip EQ. Plug-in parameters can also be customized or added so that users have fast access to their preferred tool set.

Clip levels can now be changed relatively, allowing users to adjust the overall gain while respecting existing adjustments. Clip levels can also be reset to unity, easily removing any level adjustments that might have previously been made. Fades can also be deleted directly from the Fairlight Editor, making it faster to do than before. Sound engineers can also now save their preferred track view so that they get the view they want without having to create it each time. More functions previously only available via the keyboard are now accessible using the panel, including layered editing. This also means that automation curves can now be selected via the keyboard or audio panel.

Continuing on with the extensive improvements to the Fairlight audio, there has also been major updates to the audio editor transport control. Track navigation is now improved and even works when nothing is selected. Users can navigate directly to the timecode entry window above the timeline from the audio editor panel, and there is added support for high-frame-rate timecodes. Timecode entry now supports values relative to the current CTI location, so the playhead can move along the timeline relative to the position rather than a set timecode.

Support has also been added so the colon key can be used in place of the user typing 00. Master spill on console faders now lets users spill out all the tracks to a bus fader for quick adjustments in the mix. There’s also more precision with rotary controls on the panel and when using a mouse with a modifier key. Users can also change the layout and select either icon or text-only labels on the Fairlight editor. Legacy Fairlight users can now use the traditional — and perhaps more familiar — Fairlight layout. Moving around the timeline is even quicker with added support for “media left” and “media right” selection keys to jump the playhead forward and back.

This update also improves editing in Resolve. Loading and switching timelines on the edit page is now faster, with improved performance when working with a large number of audio tracks. Compound clips can now be made from in and out points so that editors can be more selective about which media they want to see directly in the edit page. There is also support for previewing timeline audio when performing live overwrites of video-only edits. Now when trimming, the duration will reflect the clip duration as users actively trim, so they can set a specific clip length. There is also a new change transition duration dialog.

The media pool now includes metadata support for audio files with up to 24 embedded channels. Users can also duplicate clips and timelines into the same bin using copy and paste commands. There is also support for running the primary DaVinci Resolve screen as a window when dual-screen mode is enabled. Smart filters now let users sort media based on metadata fields, including keywords and people tags, so users can find the clips they need faster.

Quick Chat: Editing Leap Day short for Stella Artois

By Randi Altman

To celebrate February 29, otherwise known as Leap Day, beer-maker Stella Artois released a short film featuring real people who discover their time together is valuable in ways they didn’t expect. The short was conceived by VaynerMedia, directed by Division7’s Kris Belman and cut by Union partner/editor Sloane Klevin. Union also supplied Flame work on the piece.

The film begins with the words, “There is a crisis sweeping the nation,” set on a black screen. Then we see different women standing on the street talking about how easy it is to cancel plans. “You’re just one text away,” says one. “When it’s really cold outside and I don’t want to go out, I use my dog excuse,” says another. That’s when the viewer is told, through text on the screen, that Stella Artois has set out to right this wrong “by showing them the value of their time together.”

The scene changes from the street to a restaurant where friends are reunited for a meal and a goblet of Stella after not seeing each other for a while. When the check comes, the confused diners ask about it, and an employee explains that the menu lists prices in minutes, that Leap Day is a gift of 24 hours and that people should take advantage of it by “uncancelling plans.”

Prior to February 29, Stella encouraged people to #UnCancel plans and catch up with friends over a beer… paid for by the brand. Using the Stella Leap Day Fund — a $366,000 bank of beer reserved exclusively for those who spend time together (there are 366 days in a Leap Year) — people were able to claim as much as a 24-pack when sharing the film using #UnCancelPromo and tagging someone they would like to catch up with.

Editor Sloane Klevin

For the film short, the diners were captured with hidden cameras. Union editor Klevin, who used an Avid Media Composer 2018.12.03 with EditShare storage, was tasked with finding a story in their candid conversations. We reached out to her to find out more about the project and her process.

How early did you get involved in this project, and what kind of input did you have?
I knew I was probably getting the job about a week before they shot. I had no creative input into the shoot; that really only happens when I’m editing a feature.

What was your process like?
This was an incredibly fast turnaround. They shot on a Wednesday night, and it was finished and online the following Wednesday morning at 12am.

I thought about truncating my usual process in order to make the schedule, but when I saw their shooting breakdown for how they planned to shoot it all in one evening, I knew there wouldn’t be a ton of footage. Knowing this, I could treat the project the way I approach most unscripted longform branded content.

My assistant, Ryan Stacom, transcoded and loaded the footage into the Avid overnight, then grouped the four hidden cameras with the sound from the hidden microphones — and, brilliantly, production had time-of-day timecode on everything. The only thing that was tricky was when two tables were being filmed at once. Those takes had to be separated.

The Simon Says transcription software was used to transcribe the short pre and post interviews we had, and Ryan put markers from the transcripts on those clips so I could jump straight to a keyword or line I was searching for during the edit process. I watched all the verité footage myself and put markers on anything I thought was usable in the spot, typing into the markers what was said.

How did you choose the footage you needed?
Sometimes the people had conversations that were neither here nor there, because they had no idea they were being filmed, so I skipped that stuff. Also, I didn’t know if the transcription software would be accurate with so much background noise from the restaurant on the hidden table microphones, so marking the footage myself seemed the best option. I used yellow markers for lines I really liked, and red for stuff I thought we might want to be able to find and audition, but those weren’t necessarily my selects. That way I could open the markers tool and read through my yellow selects at a glance.

Once I’d seen everything, I did a music search of Asche & Spencer’s incredibly intuitive, searchable music library website, downloaded my favorite tracks and started editing. Because of the fast turnaround, the agency was nice enough to send an outline for how they hoped the material might be edited. I explored their road map, which was super helpful, but went with my gut on how to deviate. They gave me two days to edit, which meant I could post for the director first and get his thoughts.

Then I spent the weekend playing with the agency and trying other options. The client saw the cut and gave notes on both days I was with the agency, then we spent Monday and Tuesday color correcting (thanks to Mike Howell at Color Collective), reworking the music track, mixing (with Chris Afzal at Wave Studios), conforming and subtitling.

That was a crazy fast turnaround.
Considering how fast the turnaround was, it went incredibly smoothly. I attribute that to the manageable amount of footage, fantastic casting that got us really great reactions from all the people they filmed, and the amount of communication my producer at Union and the agency producer had in advance.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

 

Review: Loupedeck+ for Adobe’s Creative Cloud — a year later

By Mike McCarthy

It has been a little over a year since Loupedeck first announced support for Adobe’s Premiere Pro and After Effects thanks to its Loupedeck+ hardware interface panel. As you might know, Loupedeck was originally designed for Adobe Lightroom users. When Loupedeck was first introduced, I found myself wishing there was something similar for Premiere, so I was clearly pleased when that became a reality.

I was eager to test it and got one before starting editorial on a large feature film back in January. While I was knee-deep in the film, postPerspective’s Brady Betzel wrote a thorough review of the panel and how to use it in Premiere and Lightroom. My focus has been a bit different, working to find a way to make it a bit more user-friendly and looking for ways to take advantage of the tool’s immense array of possible functions.

Loupedeck+

I was looking forward to using the panel on a daily basis while editing the film (which I can’t name yet, sorry) because I would have three months of consistent time in Premiere to become familiar with it. The assistant editor on the film ordered a Loupedeck+ when he heard I had one. To our surprise, both of the panels sat idle for most of the duration of that project, even though we were using Premiere for 12 to 16 hours a day. There are a few reasons for that, from my own experience and perspective:

1) Using Premiere Pro 12 involved a delay — which made the controls, especially the dials, much less interactive — but that has been solved in Premiere 13. Unfortunately, we were stuck in version 12 on the film for larger reasons.

2) That said, even in Premiere 13, every time you rotate a dial, it sends a series of individual commands to Premiere, so just one or two adjustments can fill up your entire undo history. Pressing a dial resets its value to the default, which alleviates the need to undo that adjustment, but what about the other edit I just made before that? Long gone by that point. If you are just color correcting, this limitation isn’t an issue, but if you are alternating between making color adjustments and other fixes as you work through a sequence, this is a potential problem.

Loupedeck+

A solution? Limit each adjustment so that it’s seen as a single action until another value is adjusted or until a second or two go by — similar in principle to linear keyframe thinning, when you use sliders to make audio level adjustments.
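The coalescing idea described above can be sketched in code. This is a hypothetical illustration, not Loupedeck's or Adobe's actual API: dial "ticks" that arrive within a short window for the same control are merged into one logical adjustment, and a pause or a switch to a different control closes out the action, so the undo history holds one entry per adjustment instead of one per tick.

```python
import time

COALESCE_WINDOW = 1.0  # seconds of inactivity before an action is finalized


class UndoCoalescer:
    """Merge rapid rotary-dial events into single undoable actions (sketch)."""

    def __init__(self, window=COALESCE_WINDOW):
        self.window = window
        self.undo_stack = []   # one entry per *logical* adjustment
        self._pending = None   # (control, accumulated_delta, last_tick_time)

    def dial_tick(self, control, delta, now=None):
        """Record one dial increment; merge it with a recent tick if possible."""
        now = time.monotonic() if now is None else now
        if self._pending is not None:
            ctrl, acc, last = self._pending
            # Same control and within the window: fold into the open action.
            if ctrl == control and (now - last) <= self.window:
                self._pending = (ctrl, acc + delta, now)
                return
            # Different control or too much time elapsed: close the action.
            self._flush()
        self._pending = (control, delta, now)

    def idle(self, now=None):
        """Call periodically; finalizes the pending action after the window."""
        now = time.monotonic() if now is None else now
        if self._pending and (now - self._pending[2]) > self.window:
            self._flush()

    def _flush(self):
        if self._pending is not None:
            ctrl, acc, _ = self._pending
            self.undo_stack.append((ctrl, acc))
            self._pending = None
```

Under this scheme, twisting an exposure dial ten times in a second would land on the undo stack as a single exposure adjustment, which is exactly the behavior audio-level keyframe thinning produces for fader moves.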

3) Lastly, there was the issue of knowing what each button and dial would do, since there are a lot of them (40 buttons and 14 dials), and they are only marked for their functionality in Lightroom. I also couldn’t figure out how to map it to the functions I wanted to use the most (the intrinsic motion effect values).

The first issue will solve itself as I phase out Premiere 12 once this project is complete. The second could be resolved by some programming work by Loupedeck or Adobe, depending on where the limitations lie. Also, adding direct access to more functions in the Loupedeck utility would make it more useful to my workflows — specifically, access to the motion effect values. But all of that hinges on me being able to remember the functions associated with each control, and those functions being more efficient than doing it with my mouse and keyboard.

What solution works for you?

Dedicated Interface v. Mouse/Keyboard
The Loupedeck has led to a number of interesting debates about the utility of a dedicated interface for editing compared to a mouse and/or keyboard. I think this is a very interesting topic, as the interface between the system and the user is the heart of what we do. The monitor(s) and speakers are the flip side of that interface, completing the feedback loop. While I have little opinion on speakers because most of my work is visual, I have always been very into having the “best” monitor solutions and figuring out exactly what “best” means.

It used to mean two 24-inch WUXGA panels, and then it meant a 30-inch LCD. Then I discovered that two 30-inch LCDs were too much for me to use effectively. Similarly, 4K had too many pixels for a 27-inch screen in Windows 7. An ultrawide 34-inch 3440×1440 is my current favorite, although my 32-inch 8K display is starting to grow on me now that Windows 10 can usually scale content on it smoothly.

Our monitor is how our computer communicates with us, and the mouse and keyboard are how we communicate with it. The QWERTY keyboard is a relic from the typewriter era, designed to be inefficient, to prevent jamming the keys. Other arrangements have been introduced but have not gained widespread popularity. The mouse is a much more flexible analog input device for giving more nuanced feedback. (Keys are only on or off, no in-between.) But it is not as efficient at discrete tasks as a keyboard shortcut, provided that you can remember it.

Keyboard shortcuts

This conundrum has led to debates about the best or most efficient way of controlling applications on the system, and editors have some pretty strong opinions on the matter. I am not going to settle it once and for all, but I am going to attempt to step back and look at the bigger picture. Many full-time operators who have become accustomed to their applications are very fast using their keyboard shortcuts, and Avid didn’t even support mouse editing on the timeline until a few years ago. This leads many of those operators to think that keyboard shortcuts are the most efficient possible method of operating, dismissing the possibility that there might be better solutions. But I am confident that for people starting from scratch, they could be at least as efficient, if not more so, using an interface that was actually designed for what they are doing.

Loupedeck is by no means the first or only option in that regard. I have had a Contour Shuttle Pro 2 for many years and have used it on rare occasions for certain highly repetitive tasks. Blackmagic sells a number of physical interface options for Resolve, including its new editing keyboard, and there have been many others for color correction, which is the focus of the Loupedeck’s design as well.

Shuttle Pro 2

Many people also use tablets or trackballs as a replacement for the mouse, but that usually is more about ergonomics and doesn’t compete with keyboard functionality. These other dedicated interfaces are designed to replace some of the keyboard and mouse functionality, but none of them totally replaces the QWERTY keyboard, as we will still have to be able to type to name files, insert titles, etc. That is what a keyboard is designed to do, as opposed to pressing the spacebar for playback or Ctrl+K to add a layer slice. Those functions are tasks that have been assigned to the keyboard for convenience, but they are not intrinsically connected with it.

There is no denying the keyboard is a fairly flexible digital input tool, consistently available on nearly all systems and designed to give your fingers lots of easily accessible options. Editors are hardly the only people repurposing it or attempting to use it to maximize efficiency in ways it wasn’t originally designed for. Gamers wear out their WASD keys because their functionality has nothing to do with their letter values and is entirely based on their position on the board. And while other interfaces have been marketed, most gamers are still using a QWERTY keyboard and mouse as their primary physical interface. People are taught the QWERTY keyboard from an early age to develop unconscious muscle memory and, ideally, to allow them to type as they think.

QWERTY keyboard

Once those unconscious links are developed, it is relatively easy to repurpose them for other uses. You think “T” and you press it without thinking about where it is. This is why the keyboard is so efficient as an input device, even outside of the tasks it was originally designed for. But what is preventing people from becoming as efficient with their other physical interfaces? Time with the device and good design are required. Controls have to be able to be identified by touch, without looking, to make that unconscious link possible, which is the reason for the bumps on your F and J keys. But those mental finger mappings may compete with your QWERTY muscle memory, which you are still going to need to be an effective operator, so certain people might be better off sticking with that.

If you are super-efficient with your keyboard shortcuts, and they do practically everything you need, then you are probably not in the target market for the Loupedeck or other dedicated interfaces. If you aren’t that efficient on your keyboard, or you do more analog tasks (color correction) that don’t take place with the discrete steps provided by a keyboard, then a dedicated interface might be more attractive to you. Ironically, my primary temp color tool on my recent film was Lumetri curves, which aren’t necessarily controlled by the Loupedeck.

 

Mike’s solution

That was more about contrast because “color” isn’t really my thing, but for someone who uses those tools that the Loupedeck is mapped to, I have no doubt the Loupedeck would be much faster than using mouse and keyboard for those functions. Mapping the dials to the position, scale and opacity values would improve my workflow, and that currently works great in After Effects, especially in 3D, but not in Premiere Pro (yet). Other functions like slipping and sliding clips are mapped to the Loupedeck dials, but they are not marked, making them very hard to learn. My solution to that is to label them.

Labeling the Loupedeck
I like the Loupedeck, but I have trouble keeping track of the huge variety of functions available, with four possible tasks assigned to each dial per application. Obviously, it would help if the functions were fairly consistent across applications, but currently, by default, they are not. There are some simple improvements that can be made, but not all of the same functions are available, even between Premiere and After Effects. Labeling the controls would be helpful, even just in the process of learning them, but they change between apps, so I don’t want to take a sharpie to the console itself.

Loupedeck CT

The solution I devised was to make cutouts, which can be dropped over the controls, with the various functions labeled with color-coded text. There are 14 dials, 40 buttons and four lights that I had to account for in the cutout. I did separate label patterns for Premiere, After Effects and Photoshop. They were initially based on the Loupedeck’s default settings for those applications, but I have created custom cutouts that have more consistent functionality when switching between the various apps.

Loupedeck recently introduced the new Loupedeck CT (Creative Tool), which is selling for $550. At more than twice the price, it is half the size and labels the buttons and dials with LCD screens that change to reflect the functions available for whatever application and workspace you might be in. The cheaper Loupedeck+ offers a similar capability across a much larger set of controls, but with static labels.


Mike McCarthy is an online editor/workflow consultant with over 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

Matt Shaw on cutting Conan Without Borders: Ghana and Greenland

By Randi Altman

While Conan O’Brien was airing his traditional one-hour late night talk show on TBS, he and his crew would often go on the road to places like Cuba, South Korea and Armenia for Conan Without Borders — a series of one-hour specials. He would focus on regular folks, not celebrities, and would embed himself into the local culture… and there was often some very mediocre dancing, courtesy of Conan. The shows were funny, entertaining and educational, and he enjoyed doing them.

Conan and Matt on the road.

In 2019, Conan and his crew, Team Coco, switched the nightly show from one hour to a new 30-minute format. The format change allowed them to produce three to four hour-long Conan Without Borders specials per year. Two of the places the show visited last year were Ghana and Greenland. As you might imagine, they shoot a lot of footage, which all must be logged and edited, often while on the road.

Matt Shaw is one of the editors on Conan, and he went on the road with the show when it traveled to Greenland. Shaw’s past credits include Deon Cole’s Black Box and The Pete Holmes Show (both from Conan O’Brien’s Conaco production company) and The Late Late Show with James Corden (including Carpool Karaoke). One of his first gigs for Team Coco was editing Conan Without Borders: Made in Mexico. That led to a full-time editing gig on Conan on TBS and many fun adventures.

We reached out to Shaw to find out more about editing these specials and what challenges he faced along the way.

You recently edited Conan Without Borders — the Greenland and Ghana specials. Can you talk about preparing for a job like that? What kind of turnaround did you have?
Our Ghana special was shot back in June 2019, with the original plan to air in August, but it was pushed back to November 7 because of how fast the Greenland show came up.

In terms of prep for a show like Ghana, we mainly just know the shooting specs and will handle the rest once the crew actually returns. For the most part, that’s the norm. Ideally, we’ll have a working dark week (no nightly Conan show), and the three editors — me, Rob Ashe and Chris Heller — will take the time to offload, sync and begin our first cuts of everything. We’ll have been in contact with the writers on the shoot to get an idea of what pieces were shot and their general notes from the day.

With Greenland, we had to mobilize and adjust everything to accommodate a drastically different shoot/delivery schedule. The Friday before leaving, while we were prepping the Ghana show to screen for an audience, we heard there might be something coming up that would push Ghana back. On Monday, we heard the plan was to go to Greenland on Wednesday evening, after the nightly show, and turn around Greenland in place of Ghana’s audience screening. We had to adjust the nightly show schedule to still have a new episode ready for Thursday while we were in Greenland.

How did you end up on the Greenland trip?
Knowing we’d only have six days from returning from Greenland to having to finish the show broadcast, our lead editor, Rob Ashe, suggested we send an editor to work on location. We were originally looking into sending footage via Aspera from a local TV studio in Nuuk, Greenland, but we just wouldn’t have been able to turn it around fast enough. We decided about two days before the trip began that I’d go and do what I could to offload, backup, sync and do first cuts on everything.

How much footage did you have per episode, and what did they shoot on?
Ghana had close to 17 hours of material shot over five days on Sony Z450s at 4K XAVC, 29.97. Greenland was closer to 12 hours shot over three days on Panasonic HPX 250s, P2 media recording at 1080 60i.

We also used iPhone/iPad/GoPro footage picked up by the rest of the crew as needed for both shows. I also had a DJI Osmo pocket camera to play with when I had a chance, and we used some of that footage during the montage of icebergs.

So you were editing segments while they were still shooting?
In Greenland, I was cutting daily in the hotel. Midday, I’d get a drop of cards, then offload, sync/group and do first cuts on everything. We had a simple offline edit workflow set up where I’d upload my cuts to Frame.io and email my project files to the team — Rob and Chris — in Burbank. They would then download and sync the Frame.io file to a top video layer in the timeline and continue cutting down, with any additional notes from the writers.

Generally, I’d have everything from Day One uploaded by the start of Day Two, etc. It seemed to work out pretty well to set us up for success when we returned. I was also getting requests to cut a few highlights from our remotes to put on Team Coco’s Instagram account.

On our return day, we flew to Ilulissat for an iceberg expedition. We had about two hours on the ground before having to return to the airport and fly to Kangerlussuaq, where our chartered plane was waiting to take us back to California. On the flight back, I worked for another four hours or so to sort through the remaining segments and prep everything so we could hit the ground running the following morning. During the flight home, we screened some drone footage from the iceberg trip for Conan, and it really got everyone excited.

What are the challenges of working on the road and with such tight turnarounds?
The night we left for Greenland was preceded by a nightly show in Burbank. After the show ended, we hopped on a plane to fly eight hours to Kangerlussuaq for customs, then another to Nuuk. The minute we landed, we were filming for about three hours before checking into the hotel. I grabbed the morning’s camera cards, went to my room and began cutting. By the time I went to bed, I had cuts done of almost everything from the first day. I’m a terrible sleeper on planes, so the marathon start was pretty insane.

Outside of the lack of sleep, our offload speeds were slower because we were using different cameras than usual — for the sake of traveling lighter — because the plane we flew in had specific weight restrictions. We actually had to hire local crew for audio and B and C cameras because there wasn’t enough room for everyone in the plane to begin with.

In general, I think the overall trip went as smoothly as it could have. It would be interesting to see how it would play out for a longer shoot schedule.

What editing system did you use? What was your setup like? What kind of storage were you using?
On the road I had my MacBook Pro (2018 model), and we rented an identical backup machine in case mine died. For storage, we had four 1TB G-Tech USB-C drives and a 4TB G-RAID to back everything up. I had a USB-3.0 P2 card reader as well and multiple backup readers. A Bluetooth mouse and keyboard rounded out the kit, so I could travel with everything in a backpack.

We had to charter a plane in order to fly directly to Greenland. With such a tight turnaround between filming and delivering the actual show, this was the only way to actually make the special happen. Commercial flights fly only a few days per week out of neighboring countries, and once you’re in Greenland, you either have to fly or take a boat from city to city.

Matt Shaw editing on plane.

On the plane, there was a conference table in the back, so I set up there with one laptop and the G-RAID to continue working. The biggest trouble on the plane was making sure everything stayed secure on the table while taking off and making turns. There were a few close calls when everything started to slide away, and I had to reach to make sure nothing was disconnected.

How involved in the editing is Conan? What kind of feedback did you get?
In general, if Conan has specific notes, the writers will hear them during or right after a shoot is finished. Or we’ll test-screen something after a nightly show taping and indirectly get notes from the writers then.

There will be special circumstances, like our cold opens for Comic-Con, when Conan will come to edit and screen a close-to-final cut. And there just might be a run of jokes that isn’t as strong, but he lets us work with the writers to make what we all think is the best version by committee.

Can you point to some of the more challenging segments from Greenland or Ghana?
The entire show is difficult with the delivery-time constraints while handling the nightly show. We’ll be editing the versions for screening sometimes up to 10 minutes before they have to screen for an audience as well as doing all the finishing (audio mix, color as needed, subtitling and deliverables).

For any given special, we’re each cutting our respective remotes during the day while working on any new comedy pieces for that day’s show, then one or two of us will split the work on the nightly show, while the other keeps working with the travel show writers. In the middle of it all, we’ll cut together a mini tease or an unfinished piece to play into that night’s show to promote the specials, so the main challenge is juggling 30 things at a time.

For me, I got to edit this 1980s-style action movie trailer based on an awesome poster Conan had painted by a Ghanaian artist. We had puppets built, a lot of greenscreen and a body double to composite Conan’s head onto for fight scenes. Story-wise, we didn’t have much of a structure to start, but we had to piece something together in the edit and hope it did the ridiculous poster justice.

The Thursday before our show screened for an audience was the first time Mike Sweeney (head writer for the travel shows) had a chance to look at any greenscreen footage and knew we were test-screening it the following Monday or Tuesday. It started to take shape when one of our graphics/VFX artists, Angus Lyne, sent back some composites. In the end, it came together great and killed with the audience and our staff, who had already seen anything and everything.

Our other pieces seem to have a linear story, and we try to build the best highlights from any given remote. With something like this trailer, we have to switch our thought process to really build something from scratch. In the case of Greenland and Ghana, I think we put together two really great shows.

How challenging is editing comedy versus drama? Or editing these segments versus other parts of Conan’s world?
In a lot of the comedy we cut, the joke is king. There are always instances when we have blatant continuity errors, jump cuts, etc., but we don’t have to kill ourselves trying to make it work in the moment if it means hurting the joke. Our “man on the street” segments are great examples of this. Obviously, we want something to be as polished and coherent as possible, but there are cases when it just isn’t best, in our opinion, and that’s okay.

That being said, when we do our spoofs of whatever ad or try to recreate a specific style, we’re going to do everything to make that happen. We recently shot a bit with Nicholas Braun from Succession where he’s trying to get a job from Conan during his hiatus from the show. This was a mix of improv and scripted, and we had to match the look of that show. It turned out well: funny and very much in the vein of Succession.

What about for the Ghana show?
For Ghana, we had a few segments that were extremely serious and emotional. For example, Conan and Sam Richardson visited Osu Castle, a major slave trade port. This segment demands care and needs to breathe so the weight of it can really be expressed, versus earlier in the show, when Conan was buying a Ghana shirt from a street vendor, and we hard-cut to him wearing a shirt 10 sizes too small.

And Greenland?
Greenland is a place really affected by climate change. My personal favorite segment I’ve cut on these travel specials is about the impact the melting ice caps could have on the world. Then there is a montage of the icebergs we saw, followed by Conan attempting to stake a “Sold” sign on an iceberg, signifying he had bought property in Greenland for the US. Originally, the montage had a few jokes within the segment, but we quickly realized it’s so beautiful we shouldn’t cheapen it. We just let it be beautiful.

Comedy or drama, it’s really about being aware of what you have in front of you and what the end goal is.

What haven’t I asked that’s important?
For me, it’s important to acknowledge how talented our post team is to be able to work simultaneously on a giant special while delivering four shows a week. Being on location for Greenland also gave me a taste of the chaos the whole production team and Team Coco goes through, and I think everyone should be proud of what we’re capable of producing.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

The Den editorial boutique launches in Los Angeles

Christjan Jordan, editor of award-winning work for clients including Amazon, GEICO and Hulu, has partnered with industry veteran Mary Ellen Duggan to launch The Den, an independent boutique editorial house in Los Angeles.

Over the course of his career, Jordan has worked with Arcade Edit, Cosmo Street and Rock Paper Scissors, among others. He has edited such spots as Alexa Loses Her Voice for Amazon, Longest Goal Celebration Ever for GEICO, #notspecialneeds for World Down Syndrome Day out of Publicis NY and Super Bowl 2020 ads Tom Brady’s Big Announcement for Hulu and Famous Visitors for Walmart. Jordan’s work has been recognized by the Cannes Lions, AICE, AICP, Clio, D&AD, One Show and Sports Emmy awards.

“With Mary Ellen, agency producers are guided by an industry veteran who knows exactly what agencies and clients are looking for,” says Jordan. “And for me, I love fostering young editors. It’s an interesting time in our industry and there is a lot of fresh creative talent.”

In her career, Duggan has headed production departments at both KPB and Cliff Freeman on the East Coast and, most recently, Big Family Table in Los Angeles. In addition, she has freelanced all over the country.

“The stars aligned for Christjan and me to work together,” says Duggan. “We had known each other for years and had recently worked on a Hulu campaign together. We had a similar vision for what we thought the editorial experience should be: a high-end editorial boutique that is nimble, has a roster of diverse talent and a real family vibe.”

Veteran producer Rachel Seitel has joined as partner and head of business development. The Den will be represented by Diane Patrone at The Family on the East Coast and by Ezra Burke and Shane Harris on the West Coast.

The Den’s founding roster also features editor Andrew Ratzlaff and junior editor Hannelore Gomes. The staff works on Avid Media Composer and Adobe Premiere.

LVLY adds veteran editor Bill Cramer

Bill Cramer, an editor known for his comedy and dialogue work, among other genres, has joined the editorial roster at LVLY, a content creation and creative studio based in New York City.

Cramer joins from Northern Lights. Prior to that he had spent many years at Crew Cuts, where he launched his career and built a strong reputation for his work on many ads and campaigns. Clients included ESPN, GMC, LG, Nickelodeon, Hasbro, MLB, Wendy’s and American Express. Check out his reel.

Cramer reports that he wasn’t looking to make a move but that LVLY’s VP/MD, Wendy Brovetto, inspired him. “Wendy and I knew of each other for years, and I’d been following LVLY since they did their top-to-bottom rebranding. I knew that they’re doing everything from live-action production to podcasting, VFX, design, VR and experiential, and I recognized that joining them would give me more opportunities to flex as an editor. Being at LVLY gives me the chance to take on any project, whether that’s a 30-second commercial, music video or long-form branded content piece; they’re set up to tackle any post production needs, no matter the scale.”

“Bill’s a great comedy/dialogue editor, and that’s something our clients have been looking for,” says Brovetto. “Once I saw the range of his work, it was an easy decision to invite him to join the LVLY team. In addition to being a great editor, he’s a funny guy, and who doesn’t need more humor in their day?”

Cramer, who works on both Avid Media Composer and Adobe Premiere, joins an editorial roster that includes Olivier Wicki, J.P. Damboragian, Geordie Anderson, Noelle Webb, Joe Siegel and Aaron & Bryan.

Behind the Title: Dell Blue lead editor Jason Uson

This veteran editor started his career at LA’s Rock Paper Scissors, where he spent four years learning the craft from editors such as Bee Ottinger and Angus Wall. After freelancing at Lost Planet, Spot Welders and Nomad, he held staff positions at Cosmo Street, Harpo Films and Beast Editorial before opening Foundation Editorial, his own post boutique in Austin.

NAME: Jason Uson

COMPANY: Austin, Texas-based Dell Blue

Can you describe what Dell Blue does?
Dell Blue is the in-house agency for Dell Technologies.

What’s your job title?
Senior Lead Creative Editor

What does that entail?
Outside of the projects that I am editing personally, there are multiple campaigns happening simultaneously at all times. I oversee all of them and have my eyes on every edit, fostering and mentoring our junior editors and producers to help them grow in their careers.

I’ve helped establish and maintain the process regarding our workflow and post pipeline. I also work closely with our entire team of creatives, producers, project managers and vendors from the beginning of each project and follow it through from production to post. This enables us to execute the best possible workflow and outcome for every project.

To add another layer to my role, I am also directing spots for Dell when the project is right.

Alienware

That’s a lot! What else would surprise people about what falls under that title?
The number of hours that go into making sure the job gets done and is the best it can be. Editing is a process that takes time. Creating something of value that means something is an art no matter how big or small the job might be. You have to have pride in every aspect of the process. It shows when you don’t.

What’s your favorite part of the job?
I have two favorites. The first is the people. I know that sounds cliché, but it’s true. The team here at Dell is truly something special. We are family. We work together. Play together. Happy Hour together. Respect, support and genuinely care for one another. But, ultimately, we care about the work. We are all aligned to create the best work possible. I am grateful to be surrounded by such a talented and amazing group of humans.

The second, which is equally important to me, is the process of organizing my project, watching all the footage and pulling selects. I make sure I have what I need and check it off my list. Music, sound effects, VO track, graphics and anything else I need to get started. Then I create my first timeline. A blank, empty timeline. Then I take a deep breath and say to myself, “Here we go.” That’s my favorite.

Do you have a least favorite?
My least favorite part is wrapping a project. I spend so much time with my clients and creatives and we really bond while working on a project together. We end on such a high note of excitement and pride in what we’ve done and then, just like that, it’s over. I realize that sounds a bit dramatic. Not to worry, though, because lucky for me, we all come back together in a few months to work on something new and the excitement starts all over again.

What is your most productive time of day?
This also requires a two-part answer. The first is early morning. This is my time to get things done, uninterrupted. I go upstairs and make a fresh cup of coffee. I open my deck doors. I check and send emails, and get my personal stuff done. This clears out all of my distractions for the day before I jump into my edit bay.

The second part is late at night. I get to replay all of the creative decisions from the day and explore other options. Sometimes, I get lucky and find something I didn’t see before.

If you didn’t have this job, what would you be doing instead?
That’s easy. I’d be a chef. I love to cook and experiment with ingredients. And I love to explore and create an amazing dining experience.

I see similarities between editors and chefs. Both aim to create something impactful that elicits an emotional response from the “elements” they are given. For chefs, the ingredients, spices and techniques are creatively brought together to bring a dish to life.

For editors, the “elements” I am given, in combination with my style, techniques, sound design, graphics and music, all give life to a spot.

How early did you know this would be your path?
I had originally moved to Los Angeles with dreams of becoming an actor. Yes, it’s groundbreaking, I know. During that time, I met editor Dana Glauberman (The Mandalorian, Juno, Up in the Air, Thank You for Smoking, Creed II, Ghostbusters: Afterlife). I had lunch with her at the studios one day in Burbank and went on a tour of the backlot. I got to see all the edit bays, film stages, soundstages and machine rooms. To me, this was magic. A total game-changer in an instant.

While I was waiting on that one big role, I got my foot in the door as a PA at editing house Rock Paper Scissors. One night after work, we all went for drinks at a local bar, and every commercial on TV was one that (editors) Angus Wall and Adam Pertofsky had worked on within the last month. I was blown away. Something clicked.

This entire creative world behind the scenes was captivating to me. I made the decision at that moment to lean in and go for it. I asked the assistant editor the following morning if he would teach me — and I haven’t looked back. So, Dana, Angus and Adam… thank you!

Can you name some of your recent projects?
I edited the latest global campaign for Alienware called Everything Counts, which was directed by Tony Kaye. More recently, I worked on the campaign for Dell’s latest and greatest business PC laptop that launches in March 2020, which was directed by Mac Premo.

Dell business PC

Side note: I highly recommend Googling Mac Premo. His work is amazing.

What project are you most proud of?
There are two projects that stand out for me. The first one is the very first spot I ever cut — a Budweiser ad for director Sam Ketay and the Art Institute of Pasadena. During the edit, I thought, “Wow, I think I can do this.” It went on to win a Clio.

The second is the latest global campaign for Alienware, which I mentioned above. Director Tony Kaye is a genius. Tony and I sat in my edit bay for a week exploring and experimenting. His process is unlike any other director I have worked with. This project was extremely challenging on many levels. I honestly started looking at footage in a very different way. I evolved. I learned. And I strive to continue to grow every day.

Name three pieces of technology you can’t live without.
Wow, good question. I guess I’ll be that guy and say my phone. It really is a necessity.

Spotify, for sure. I am always listening to music in my car and trying to match artists with projects that are not even in existence yet.

My Bose noise cancelling headphones.

What social media channels do you follow?
I use Facebook and LinkedIn — mainly to stay up to date on what others are doing and to post my own updates every now and then.

I’m on Instagram quite a bit. Outside of the obvious industry-related accounts I follow, here are a few of my random favorites:

@nuts_about_birds
If you love birds as much as I do, this is a good one to follow.

@sergiosanchezart
This guy is incredible. I have been following his work for a long time. If you are looking for a new tattoo, look no further.

@andrewhagarofficial
I was lucky enough to meet Andrew through my friend @chrisprofera and immediately dove into his music. Amazing. Not to mention his dad is Sammy Hagar. Enough said.

@kaleynelson
She’s a talented photographer based in LA. Her concert stills are impressive.

@zuzubee
I love graffiti art and Zuzu is one of the best. Based in Austin, she has created several murals for me. You can see her work all over the city, as well as installations during SXSW and Austin City Limits, on Bud Light cans, and across the US.

Do you listen to music at work? What types?
I do listen to music when I work but only when I’m going through footage and pulling selects. Classical piano is my go-to. It opens my mind and helps me focus and dive into my footage.

Don’t get me wrong, I love music. But if I am jamming to my favorite, Sammy Hagar, I can’t drive…I mean dive… into my footage. So classical piano for me.

How do you de-stress from it all?
There are a few things that help me out. Sometimes during the day, I will take a walk around the block. Get a little vitamin D and fresh air. I look around at things other than my screen. This is something (editors) Tom Muldoon and John Murray at Nomad used to do every day. I always wondered why. Now I know. I come back refreshed, with my mind clear and ready for the next challenge.

I also “like” to hit the gym immediately after I leave my edit bay. Headphones on (Sammy Hagar, obviously), stretch it out and jump on the treadmill for 30 minutes.

All that is good and necessary for obvious reasons, but getting back to cooking… I love being in the kitchen. It’s therapy for me. Whether I am chopping and creating in the kitchen or out on the grill, I love it. And my wife appreciates my cooking. Well, I think she does at least.

Photo Credits: Dell PC and Jason Uson images – Chris Profera

An online editor’s first time at Sundance

By Brady Betzel

I’ve always wanted to attend the Sundance Film Festival, and my trip last month did not disappoint. Not only is it an iconic industry (and pop-culture) event, but the energy surrounding it is palpable.

Once I got to Park City and walked Main Street — with the sponsored stores (Canon and Lyft among others) and movie theaters, like the Egyptian — I started to feel an excitement and energy that I haven’t felt since I was making videos in high school and college… when there were no thoughts of limits and what I should or shouldn’t do.

A certain indescribable nervousness and love started to bubble up. Sitting in the luxurious Park City Burger King with Steve Hullfish (Art of the Cut) and Joe Herman (Cinemontage) before my second screening of Sundance 2020: Dinner in America, I was thinking how I was so lucky to be in a place that is packed with creatives. It sounds cliché and trite, but it really is reinvigorating to surround yourself with positive energy — especially if you can get caught up in cynicism like me.

It brought me back to my college classes, taught by Daniel Restuccio (another postPerspective writer), at California Lutheran University, where we would cut out pictures from magazines, draw pictures, blow up balloons, eat doughnuts and do whatever we could to get our ideas out in the open.

While Sundance occasionally felt like an amalgamation of the thirsty-hipster Coachella crowd mixed with a high school video production class (but with million-dollar budgets), it still had me excited to create. Sundance 2020 in Park City was a beautiful resurgence of ideas and discussions about how we as an artistic community can offer accessibility to everyone and anyone who wants to tell their own story on screen.

Inclusiveness Panel
After arriving in Park City, my first stop was a panel hosted by Adobe called “Empowering Every Voice in Film and the World.” Maybe it was a combination of the excitement of Sundance and the discussion about accessibility, but it really got me thinking. The panel was expertly hosted by Adobe’s Meagan Keane and included producer/director Yance Ford (Disclosure: Trans Lives on Screen, Oscar-nominated for Strong Island); editor Eileen Meyer (Crip Camp); editor Stacy Goldate (Disclosure: Trans Lives on Screen); and director Crystal Kayiza (See You Next Time).

I walked away feeling inspired and driven to increase my efforts in accessibility. Eileen said one of her biggest opportunities came from the Karen Schmeer Film Editing Fellowship, a year-long fellowship for emerging documentary editors.

Yance drove home the idea of inclusivity and re-emphasized the importance of access to equipment. But it’s not simply about access — you also have to make a great story and figure out things like distribution. I was really struck by all the speakers on-stage, but Yance really spoke to me. He feels like the voice we need representing marginalized groups, and we need to see more content from these creatives. The more content we see, the better.

Crystal spoke about the community needing to tell stories that don’t necessarily have standard plot points and stakes. The idea is to encourage people to create their stories, and for those in power to help and support these stories and trust the filmmakers, regardless of whether they identify with the ideas and themes.

Rebuilding Paradise

Screenings
One screening I attended was Rebuilding Paradise, directed by Ron Howard. He was at the premiere, along with some of the people who lost everything in the Paradise, California fires. In the first half of November 2018, there were several fires that raged out of control in California. One surrounded the city of Simi Valley and worked its way toward the Pacific Coast. (It was way too close for my comfort in Simi Valley. We eventually evacuated but were fine.)

Another fire was in the town of Paradise, which burnt almost the entire city to the ground. Watching Rebuilding Paradise filled me with great sadness for those who lost family members and their homes. Some of the “found footage” was absolutely breathtaking. One in particular was of a father racing out of what appears to be hell, surrounded by flames, in his car with his child asking if they were going to die. Absolutely incredible and heart wrenching.

Dinner in America

Another film I saw was Dinner in America, as referenced earlier in this piece. I love a good dark comedy/drama, so when I got a ticket to Adam Carter Rehmeier’s Dinner in America I was all geared up. Little did I know it would start off with a disgruntled 20-something throwing a chair through a window and lighting the front sidewalk on fire. Kudos to composer John Swihart, who took a pretty awesome opening credit montage and dropped the heat with his soundtrack.

Dinner in America is a mid-‘90s Napoleon Dynamite cross-pollinated with the song “F*** Authority” by Pennywise. Coincidentally, Swihart composed the soundtrack for Napoleon Dynamite. Seriously, the soundtrack to Dinner in America is worth the ticket price alone, in my opinion. It adds so much to the attitude of one of the main characters. The parallel editing mixed with the fierce anti-authoritarian love story, lived by Kyle Gallner and Emily Skeggs, makes for a movie you probably won’t forget.

Adam Rehmeier

During the Q&A at the end, writer, director and editor Rehmeier described how he essentially combined two ideas that led to Dinner in America. As I watched the first 20 minutes, it felt like two separate movies, but once it came together it really paid off. Much like the cult phenomenon Napoleon Dynamite, Dinner in America will resonate with a wide audience. It’s worth watching when it comes to a theater (or streaming platform) near you. In the meantime, check out my video interview with him.

Adobe Productions
During Sundance, Adobe announced an upcoming feature for Premiere called “Productions.” While in Park City, I got a small demo of the new Productions at Adobe’s Sundance Production House. It took about 15 minutes before I realized that Adobe has added the one feature that has set Avid Media Composer apart for over 20 years — bin locking. Heads up, Avid: Adobe is about to release a multi-user workflow that is much easier to understand and use than in previous iterations of Premiere.

The only thing that caught me off guard was the nomenclature — Productions and Projects. Productions is the title, but really a “Production” is a project, and what they call a “project” is a bin. If you’re familiar with Media Composer, you can create a project and inside have folders and bins. Bins are what house media links, sequences, graphics and everything else. In the new Productions update, a “Production” will house all of your “Projects” (i.e. a Project with bins).

Additionally, you will be able to lock “Projects.” This means that in a multi-user environment (which can be something like a SAN or even an Avid Nexis), a project and media can live on the shared server and be accessed by multiple users. These users can be named and identified inside of the Premiere Preferences. And much like Blackmagic’s DaVinci Resolve, you can update the “projects” when you want to — individually or all projects at once. On its face, Productions looks like the answer to what every editor has said is one of the only reasons Avid is still such a powerhouse in “Hollywood” — the ability to work relatively flawlessly among tons of editors simultaneously. If what I saw works the way it should, Adobe is looking to take a piece of the multi-user environment pie Avid has controlled for so long.

Summing Up
In the end, the Sundance Film Festival 2020 in Park City was likely a once-in-a-lifetime experience for me. From seeing celebrities, meeting other journalists, getting some free beanies and hand warmers (it was definitely not 70 degrees like California), to attending parties hosted by Canon and Light Iron — Sundance can really reinvigorate your filmmaking energy.

It’s hard to keep going when you get burnt out by just how hard it is to succeed and break through the barriers in film and multimedia creation. But seeing indie films and meeting like-minded creatives, you can get excited to create your own story. And you realize that there are good people out there, and sometimes you just have to fly to Utah to find them.

Walking down Main Street, I found a coffee shop named Atticus Coffee and Tea House. My oldest son’s name is Atticus, so I naturally had to stop in and get him something; I ended up getting him a hat and me a coffee. It was good. But what I really did was sit out front pretending to shoot b-roll and eavesdropping on some conversations. It really is true that being around thoughtful energy is contagious. And while some parts of Sundance feel like a hipster-popularity contest, there are others who are there to improve and absorb culture from all around.

The 2020 Sundance Film Festival’s theme in my eyes was to uplift other people’s stories. As Harper Lee wrote in “To Kill a Mockingbird” when Atticus Finch is talking with Scout: “First of all, if you learn a simple trick, Scout, you’ll get along a lot better with all kinds of folks. You never really understand a person until you consider things from his point of view . . . until you climb into his skin and walk around in it.”


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

DejaEdit collaborative editing platform available worldwide

DejaSoft has expanded the availability of its DejaEdit collaborative editing solution for Avid Media Composer, Avid Nexis and EditShare workflows. Already well-established in Scandinavia and parts of Europe, the software-defined network solution is now accessible across the UK, Europe, Latin America, Middle East, Asia, Africa, China and North America.

DejaEdit allows editors to transfer media files and timelines automatically and securely around the world without having to be online continuously. It effectively acts as a media file synchronizer for multiple remote Avid systems.

DejaEdit allows multi-site post facilities to work as one, enabling multiple remote editors to work together, allowing media exchanges with VFX houses and letting editors easily migrate between office and home or mobile-based editing installations throughout the lifecycle of a project.

DejaEdit is available in two applications: Client and Nexus. The Client version works directly with Media Composer, whereas the Nexus variant further enables synchronization with projects stored on Nexis or EditShare storage systems.

DejaSoft and the DejaEdit platform are a collaboration between CEO Clas Hakeröd and CTO Nikolai Waldman, both editors and post pros and founders of boutique post facility Can Film based in Sweden.

The tool is already being used by Oscar-nominated editor Yorgos Mavropsaridis, ACE, of The Favourite, The Lobster and recently Suicide Tourist; Scandinavian producer Daniel Lägersten, who has produced TV series such as Riverside and The Spiral; editor Rickard Krantz, who used it on The Perfect Patient (aka Quick), which has been nominated for Sweden’s Guldbagge Award (similar to a BAFTA) for editing; and post producer Anna Knochenhauer, known for her work on Euphoria featuring Alicia Vikander, The 100-Year-Old Man Who Climbed Out the Window and Disappeared, Lilya 4-Ever and Together.

Review: Neat Video 5 noise reduction plugin

By Brady Betzel

One of the best (and most underrated) tricks in an online editor’s tool kit is to have good image restoration techniques. Removing digital video imperfections — from flicker to digital video noise — is not easy, and not easy to do well. That is, unless you have good noise reduction software like Neat Video.

While Neat Video might not be that well-known, once you see how simply (or intricately) Neat Video 5 works inside of apps like Blackmagic’s DaVinci Resolve, it will be hard to forget the company’s name.

(While the software was recently updated to 5.1.5 — with expanded GPU support as well as support for new versions of Resolve, Adobe and Nuke — nothing really changes for this review. You can check out a detailed list of the updates here.)

Neat Video 5 is a noise reduction plugin. In a Windows OS environment, Neat Video is compatible with apps like Adobe After Effects, Adobe Premiere Pro, DaVinci Resolve, Avid Media Composer, Vegas, Magix, Edius, Virtual Dub, and the OFX-compatible apps Nuke, Fusion, Scratch, HitFilm, Mamba, Natron, Flame, Baselight and DustBuster. In a macOS environment, Neat Video 5 is compatible with After Effects, Premiere, Final Cut Pro X, Motion 5, OFX, Resolve and Media Composer. In Linux, the software is compatible with OFX-compatible apps and Resolve.

Neat Video 5 comes in three flavors: Demo, Home and Pro. The Demo version works in up to 1280×720 resolution with a watermark. Home is literally made for the home user: It will process video up to 1920×1080 resolutions, it will use up to one GPU, and it is for non-commercial use. The cost is just $74.90 for most apps (Resolve is $89.90). The Pro version has no resolution restrictions, will work on two or more GPUs simultaneously, and can be used commercially. The Pro version starts at $129.90 per app ($159.90 for Resolve). Because Neat Video 5 for OFX works with so many apps, it only comes in Pro ($249.90) and Studio ($349.90) versions. The Studio version adds the ability for a floating license. You can see all of the pricing details here.

If there is one line you should take away from this review, it is this: Neat Video 5 is by far the easiest and best noise reduction software I have used in any application to date. And while this review is focusing on the Resolve version of Neat Video 5, all other apps work in much the same way. You can find Neat Video’s software-specific Quick Start Guides to help. Once you install and register your Neat Video 5 license, removing digital video noise is as easy as applying Neat Video 5 to a node in the color tab, clicking on “Prepare Noise Profile,” clicking on “Auto Profile,” and clicking “Apply.” Then, unless you want to fine-tune your noise reduction, you are done. Obviously, I have somewhat simplified how Neat Video 5 works, but essentially it can be done in as little as three steps per clip, and the results are typically amazing. If they aren’t amazing, you can jump back into Neat Video 5 and manually adjust specifics until the noise reduction looks correct. But I will say that in about 90% of cases, the Auto Profiling will do all of the noise reduction work necessary.

For tinkerers, or for those who need to go far beyond an Auto Profile, you can manually adjust your settings. But taking a step back, Neat Video needs an area of your image that has a uniform color and noise profile to process how it removes noise. The automatic profiling will do its best to find an area, but it doesn’t always work. What you need to keep in mind when building a good noise profile inside of Neat Video is that the area being processed needs to be as uniform as possible (think dark night sky or a wall painted in one color) — meaning no prominent features, a high noise level (something in the high four area is better), the largest possible sample area and no warnings from Neat Video.

So, if your automatic profile doesn’t do the job, you can find an area of your image that meets the above requirements and then build a profile. From there you can use one of the Neat Video 5 features, like “Profile Check.” Profile Check will highlight details that aren’t being affected by Neat Video, giving you a clear representation of what noise is being reduced and whether you need to adjust your profile to better reduce video noise.

At this point you might be wondering where you tweak advanced settings. When you load Neat Video, you will be in Beginner mode. To get into Advanced mode, go to the Tools menu, where you will see a lot of advanced functions that can help you fine-tune your noise profile. And if you still can’t get a good noise reduction profile, you can try the “Generic Profile,” which can help you build a profile even if your video doesn’t have a large enough area of uniform noise. There are also presets — such as light flicker, moire flicker, repeat frame issues, dust and scratch filters (including scan lines), jitter of details and artifact removal — that can solve certain problems.

Neat Video 5 is faster than previous generations. As in previous versions, there is a tool inside of Neat Video’s preferences that will run your CPU and GPU through a benchmark to determine whether you should run on CPU only, GPU only or a combination of both. In Neat Video 5, if you have trouble with a clip, you can use up to four “Variants” of noise reduction in the new playback window to see how each profile works with your clip.

In terms of playback and rendering, noise reduction is never fast. However, inside of Neat Video the new playback window will typically play back your footage to preview the noise reduction before you jump back into Resolve. Inside of Resolve, even in just 1080p, my sequence would crawl to just a few frames of playback per second. It is one of the most processor- and GPU-intensive tasks you will run on your computer.

In my testing I applied Neat Video 5 to the first node in my color correction tree, followed by a basic color correction in a one-minute timeline. I took those same clips and compared my Neat Video results to Resolve’s Temporal and Spatial noise reduction tools. In terms of visual results, Neat Video 5 was superior. If that’s not the case for you, then jump into YCbCr viewer mode inside of Neat Video 5, isolate each channel and tweak each channel individually so you won’t affect your overall noise reduction if it isn’t necessary. Not only did Neat Video 5 handle normal noise in the shadows well but on clips with very tight lines, it was able to keep a lot of the details while removing the noise. Resolve’s noise reduction tools had a harder time removing noise but keeping detail. Temporal noise reduction really didn’t do much, and while Spatial noise reduction did work it would heavily blur and distort the image — essentially not acceptable.

To get a good example of how Neat Video 5 slams a computer system, I exported 1080p MP4s. Resolve’s built-in Temporal noise reduction took 1:03, while the Spatial noise reduction took 1:05. The Neat Video 5 render of the same one-minute timeline took 3:51 — almost four times as long. I was curious how much longer a 4K render would take. Using 4K (UHD) media, I applied a simple color correction with Neat Video 5 on a preceding serial node. I exported a 4K (UHD) MP4, which took 52 seconds without Neat Video 5 applied and 16:27 with Neat Video applied — nearly 19 times the render time! So while Neat Video 5 is an amazing tool, the trade-off is high render times.
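For anyone who wants to sanity-check those slowdown factors, the arithmetic is simple; this is just a quick back-of-the-envelope sketch (the helper function is mine, not part of Neat Video or Resolve):

```python
def to_seconds(mmss: str) -> int:
    """Convert a 'M:SS' render time to total seconds."""
    minutes, seconds = mmss.split(":")
    return int(minutes) * 60 + int(seconds)

# 1080p one-minute timeline: Resolve Temporal NR vs. Neat Video 5
temporal = to_seconds("1:03")    # 63 seconds
neat_1080 = to_seconds("3:51")   # 231 seconds
print(round(neat_1080 / temporal, 1))   # roughly 3.7x, i.e. "almost four times as long"

# 4K (UHD) export: plain color correction vs. with Neat Video 5
plain_4k = 52                    # seconds without Neat Video
neat_4k = to_seconds("16:27")    # 987 seconds with Neat Video
print(round(neat_4k / plain_4k, 1))     # roughly 19x the render time
```

The exact multiples will vary with hardware and footage, but the ratios above are what the quoted times work out to.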

To find additional training on more advanced noise reduction techniques in Neat Video, check out the video tutorials. I find myself watching these just because of how much you can learn about noise reduction in general. They aren’t as exciting as watching Game of Thrones or The Handmaid’s Tale, but they will push your knowledge in noise reduction to the next level.

Summing Up
I’ve used Neat Video for a while, so when I was approached to review Version 5 I immediately said yes. Noise reduction is a post skill that not many possess.

If you are an online editor or colorist looking to separate yourself from the pack, learn all the noise reduction techniques you can and definitely check out Neat Video 5. Not only can Neat Video 5 work automatically, but you can fine-tune your noise reduction as much as you want.

And when demoing your color correction services, think about using Neat Video 5 to remove camera noise, flickering and chroma issues; color correcting your footage; and, finally, adding some grain back into your shot. Not only will your footage look better, but you’ll have a technical workflow that will definitely impress clients. Just don’t forget to account for the extra render time.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Oscar-nominated Jojo Rabbit editor Tom Eagles: blending comedy and drama

By Daniel Restuccio

As an editor, Tom Eagles has done it all. He started his career in New Zealand cutting promos before graduating to assistant editor then editor on television series such as Secret Agent Men and Spartacus. Eventually he connected with up-and-coming director Taika Waititi and has worked with him on the series What We Do in the Shadows and the critically acclaimed feature Hunt for the Wilderpeople. Their most recent feature collaboration, 20th Century Fox’s Jojo Rabbit, earned Eagles BAFTA and Oscar nominations as well as an ACE Eddie Award win.

Tom Eagles

We recently caught up with him to talk about the unique storytelling style of Taika films, specifically Jojo Rabbit.

(Warning: If you haven’t seen the film yet, there might be some spoilers ahead.)

How did your first conversation with Taika go?
Fairly early on, unprompted, he gave me a list of his top five favorite films. The kind of scope and variety of it was startling, but they were also my top five favorite films. We talked about Stalker, from filmmaker Andrei Tarkovsky, and I was a massive Tarkovsky fan at the time. He also talked about Annie Hall and Badlands.

At that point in time, there weren’t a lot of people doing the type of work that Taika does: that mix of comedy and drama. That was the moment I thought, “I’ve got to work with this guy. I don’t know if I’m going to find anyone else like this in New Zealand.”

How is Jojo different than your previous collaboration on Hunt for the Wilderpeople?
We had a lot more to work with on Jojo. There’s a lot more coverage in a typical scene, while the Wilderpeople was three shots: a master and two singles. With Jojo, we just threw everything at it. Taika’s learned over the years that it’s never a bad thing to have another shot. Same goes for improv. It’s never a bad thing to have a different line. Jojo was a much bigger beast to work on.

Jojo is rooted in a moment in history, which people know well, and they’re used to a certain kind of storytelling around that moment. I think in the Czech Republic, where we shot, they make five World War II movies a year. They had a certain idea of how things should look, and we weren’t doing that. We were doing Taika’s take, so we weren’t doing desaturated, handheld, grim, kitchen sink realism. We were creating this whole other world. I think the challenge was to try and bring people along on that journey.

I saw an early version of the script, and the Hitler character wasn’t in the opening scene. How did that come about?
One of the great things about working with Taika is he always does pick-ups. Normally, it’s something that we figure out that we need during the process of the edit. He rewrote a bunch of different options for the ending of the movie, a few scenes dotted throughout and the opening of the film.

He shot three versions. In one, it was just Jojo on his own, trying to psych himself up. Then there were variations on how much Adolf we would have in the film. What we found when we screened the film up to that point was that people were on board with the film, but it sometimes took them a while to get there … to understand the tone of the film. The moment we put imaginary Adolf in that scene, it was like planting a flag and saying, “This is what this film is. It’s going to be a comedy about Hitler and Nazis, and you’re either with us or you’re walking out, but if you’re with us, you will find out it’s about a lot more than that.”

Some directors sit right at the editor’s elbow, overlooking every cut, and some go away and leave the editor to make a first cut. What was this experience like?
While I’ve experienced both, Taika’s definitely in the latter category. He’s interested in what you have to say and what you might bring to the edit. He also wants to know what people think, so we screen the film a lot. Across the board — it’s not just isolated to me, but anyone he works with — he just wants more ideas.

After the shooting finished, he gave me two weeks. He went and had a break and encouraged me to do what I wanted with the assemble, to cut scenes and to not be too precious about including everything. I did that, but I was still relatively cautious; there were some things I wanted him to see.

We experimented with various structures. We tried an archiving thing for the end of the film. There was a fantasy sequence in which Elsa is talking about the story of the Jews, and we see flights of fancy of what she thinks … a way for her to escape into fantasy. That was an idea of Taika’s. He just left me to it for a couple of weeks, and we looked at it and decided against it in the end. It was a fun process because when he comes back, he’s super fresh. You offer up one idea and he throws five back.

How long was the first cut?
I asked my assistant the other day, and he said it was about two hours and forty minutes, so I guess I have to go with that, which sounds long to me. That might have been the first compile that had all of the scenes in it, and what I showed Taika was probably half an hour shorter. We definitely had a lot to play with.

Do you think there’s going to be a director’s cut?
I think what you see is the director’s cut. There’s not a version of the film that has more stuff in it than we wanted in it. I think it is pretty much the perfect direction. I might have cut a little bit more because I think I just work that way. There were definitely things that we missed, but I wouldn’t put them back in because of what we gained by taking them out.

We didn’t lean that heavily on comedy once we transitioned into drama. The longer you’re away from Jojo and Elsa, that’s when we found that the story would flounder a little bit. It’s interesting because when I initially read the script, I was worried that we would get bored of that room, and that it would feel too much like a stage play. So we added all of this color and widened the world out. We had these scenes where Jojo goes out into the world, but actually the relationship between the two of them — that’s the story. Each scene in that relationship, the kind of gradual progression toward each other, is what’s moving the story forward.

This movie messes with your expectations, in terms of where you think it’s going or even how it’s saying it. How did you go about creating your own rhythms for that style of storytelling?
I was fortunate in that I already had Taika’s other films to lean on, so partly it was just trying to wrestle this genre into his world … into his kind of subgenre of Taika. It’s really just a sensibility a lot of the time. I was aware that I wanted a breathlessness to the pace of things, especially for the first half of the movie in order to match Jojo’s slightly ADD, overexcited character. That slows down a little bit when it needs to and when he’s starting to understand the world around him a little bit more.

Can you talk about the music?
Music also was important. The needle drops. Taika had a bunch of them already. He definitely had The Beatles and Bowie, and it was fleshing out a few more of those. I think I found the Roy Orbison piece. Temp music was also really important. It was quite hard to find stuff. Taika’s brief was: I don’t want it to sound like all the other movies in the genre. As much as we respected Schindler’s List, he didn’t want it to sound like Schindler’s List.

You edited on Avid Media Composer?
We cut on Avid, and it was the first time I really used ScriptSync. I had been wary of it, to be honest. I watch all the dailies through from head to tail and see the performances in context and feel how they affect me. Once that’s done, ScriptSync is great for comparing takes or swapping out a read on a line. Because we had so much improv on this film, we had to go through and enter all of that in manually. Sometimes we’d use PhraseFind to search on a particular word that I’d remembered an actor saying in an ad-lib. It’s a much faster and more efficient way of finding that stuff.

That said, I still periodically go back and watch dailies. As the film starts to solidify, so does what I’m looking for in the dailies, so I’ll always go back and see if there’s anything that I view differently with a fresh perspective.

You mentioned the difference between Wilderpeople and Jojo in terms of coverage. How much more coverage did you have? Were there multiple cameras?
There were two and sometimes three cameras (ARRI Alexa). Some scenes were single camera, so there was a lot more material amassed. Some directors get a bit iffy about two cameras, but we just rolled it.

If we had the option, we would almost always lean on the A camera, and part of the trick was to try and make it look like his other movies. We wanted the coverage plan to feel simple; it should still feel like a master, couple of mediums and a couple of singles, all in that very flat framing approach of his. Often, the characters are interacting with each other perpendicular to the camera in these fixed static wides.

Again, one of the things Taika was concerned with was that it should feel like his other movies. Just because we have a dolly, we don’t have to use it every time. We had all of those shots, we had those options, and often it was about paring things back to try and stay in time.

Does he give you a lot of takes, and does he create different emotional variations within those takes?
We definitely had a lot of takes. And, yes, there would be a great deal of variety of performance, whether it’s him just trying to push an actor and get them to a specific place, or sometimes we just had options.

Was there an average — five takes, 10 takes?
It’s really hard to say. These days everyone just does rolling resets. You look at your bin and you think, “Ah, great, they did five takes, and there’s only three set-ups. How long is it going to take me?” But you open it up, and each take is like half an hour long, and they’re reframing on the fly.

With Scarlett Johansson, you do five takes max, probably. But with the kids it would be a lot of rolling resets and sometimes feeding them lines, and just picking up little lines here and there on the fly. Then with the comedians, it was a lot of improv, so it’s hard to quantify takes, but it was a ton of footage.

If you include the archive footage, I think we had 300 to 400 hours. I’m not sure how much of that was our material, but it would’ve been at least 100 hours.

I was impressed by the way you worked the “getting real” scenes: the killing of the rabbit and the hanging scene. How did you conceptualize and integrate those really important moments?
For the hanging scene, I was an advocate for having it as early in the movie as possible. It’s the moment in the film where we’ve had all this comedy and good times [regarding] Nazis, and then it drives home that this film is about Nazis, and this is what Nazis do.

I wanted to keep the rabbit scene fun to a degree because of where it sits in the movie. I know, obviously, it’s quite a freaky scene for a lot of people, but it’s kind of scary in a genre way for me.

Something about those woods always reminds me of Stand by Me. That was the movie that was in my mind, and just the idea of those older kids, the bullies, being dicks. Moments like that, and much more so the moment when Jojo finds Elsa: I thought of that sequence as a mini horror film within the film. That was really useful to let the scares drive it because we were so much in Jojo’s point of view. It’s taking those genres and interjecting a little bit of humor or a little bit of lightness into them to keep them in tone with Taika’s overall sensibility.

I read that you tried to steer clear of the sentimentality. How did you go about doing that?
It’s a question of taste with the performance(s) and things that other people might like. I will often feel I’m feeding the audience or demanding of the audience an emotional response. Take the scene where Jojo finds Rosie. We shot an option seeing Rosie hanging there. It just felt too much. It felt like it was really bludgeoning people over the head with the horror of the moment. It was enough to see the shoes. Every time we screened the movie and Jojo stands up, we see the shoes and everyone gasps. I think people have gotten the information that they need.


Dan Restuccio is a writer/director with Realwork Entertainment and part of the Visual Arts faculty at California Lutheran University. He is a former Disney Imagineer. You can reach him at dansweb451@gmail.com.

Editor David Cea joins Chicago’s Optimus  

Chicago-based production and post house Optimus has added editor David Cea to its roster. With 15 years of experience in New York and Chicago, Cea brings a varied portfolio of commercial editing experience to Optimus.

Cea has cut spots for brands such as Bank of America, Chevrolet, Exxon, Jeep, Hallmark, McDonald’s, Microsoft and Target. He has partnered with many agencies, including BBDO, Commonwealth, DDB, Digitas, Hill Holliday, Leo Burnett, Mother and Saatchi & Saatchi.

“I grew up watching movies with my dad and knew I wanted to be a part of that magical process in some way,” explains Cea. “The combination of Goodfellas and Monty Python gave me all the fuel I needed to start my film journey. It wasn’t until I took an editing class in college that I discovered the part of filmmaking I wanted to pursue. The editor is the one who gets to shape the final product and bring out the true soul of the footage.”

After studying film at Long Island’s Hofstra University, Cea met Optimus editor and partner Angelo Valencia while working as his assistant at Whitehouse New York in 2005. Cea then moved on to hone his craft further at Cosmo Street in New York. Chicago became home for him in 2013 as he spent three years at Whitehouse. After heading back east for a couple of years, he returned to Chicago to put down roots.

While Avid Media Composer is Cea’s go-to choice for editing, he is also proficient in Adobe Premiere.

FXhome’s HitFilm Express 14, ‘Pay What You Want’ option

FXhome has a new “Pay What You Want” goodwill program inspired by the HitFilm Express community’s requests to be able to help pay for development of the historically free video editing and VFX software. Pay What You Want gives users the option to contribute financially, ensuring that those funds will be allocated for future development and improvements to HitFilm.

Additionally, FXhome will contribute a percentage of the proceeds of Pay What You Want to organizations dedicated to global causes important to the company and its community. At its launch, the FXhome Pay What You Want initiative will donate a portion of its proceeds to the WWF and the Australia Emergency Bushfire Fund. The larger the contribution from customers, the more FXhome will donate.

HitFilm Express remains a free download; however, first-time customers will now have the option to “Pay What You Want” on the software. They’ll also receive some exclusive discounts on HitFilm add-on packs and effects.

Coinciding with the release of Pay What You Want, FXhome is releasing HitFilm Express 14, the first version of HitFilm Express to be eligible for the Pay What You Want initiative. HitFilm Express 14 features a new and simplified export process, new text controls, a streamlined UI and a host of new features.

For new customers who would like to download HitFilm Express 14 and also contribute to the Pay What You Want program, there are three options available:

• Starter Pack Level: With a contribution of as little as $9, new HitFilm Express 14 customers will also receive a free Starter Pack of software and effects that includes:
o Professional dark mode interface
o Edit tools including Text, Split Screen Masking, PiP, Vertical Video, Action Cam Crop
o Color tools including Exposure, Vibrance, Shadows and Highlights, Custom Gray, Color Phase, Channel Mixer and 16-bit color
o Additional VFX packs including Shatter, 3D Extrusion, Fire, Blood Spray and Animated Lasers
• Content Creator Level: With contributions of $19 or more, users will receive everything included in the Starter Pack, as well as:
o Edit: Repair Pack with Denoise, Grain Removal and Rolling Shutter
o Color: LUT Pack with LUTs and Grading Transfer
o Edit: Beautify Pack with Bilateral Blur and Pro Skin Retouch
• VFX Artist Level: Users who contribute from $39 to $99 get everything in the Starter Pack and Content Creator levels plus:
o Composite Toolkit Pack with Wire Removal, Projector, Clone and Channel Swapper
o Composite Pro-Keying Pack for Chroma Keying
o Motion Audio Visual Pack with Atomic Particles, Audio Spectrum and Audio Waveform
o VFX Neon Lights Pack with Lightsword Ultra (2-Point Auto), Lightsword Ultra (4-Point Manual), Lightsword Ultra (Glow Only) and Neon Path
o VFX Lighting Pack with Anamorphic Lens Flares, Gleam, Flicker and Auto Volumetrics

What’s new in HitFilm Express 14
HitFilm Express 14 adds a number of VFX workflow enhancements to enable even more sophisticated effects for content creators, including a simplified export workflow that allows users to export content directly from the timeline and comps, new text controls, a streamlined UI and a host of new features. Updates include:

• Video Textures for 3D Models: Creators who already have the 3D: Model Render Pack can now use a video layer as a texture on a 3D model to add animated bullet holes, cracked glass or changing textures.
• Improvements to the Export Process: In HitFilm Express 14, the Export Queue is now an Export Panel and is much easier to use. Exporting can also now be done from the timeline and from comps. These “in-context” exports will export the content between the In and Out points set or the entire timeline using the current default preset (which can be changed from the menu).
• Additional Text Controls: Customizing text in HitFilm Express 14 is now even simpler, with Text panel options for All Caps, Small Caps, Subscript and Superscript. Users can also change the character spacing, horizontal or vertical scale, as well as baseline shift (for that Stranger-Things-style titling).
• Usability and Workflow Enhancements: In addition to the new and improved export process, FXhome has also implemented new changes to the interface to further simplify the entire post production process, including a new “composite button” in the media panel, double-click and keyboard shortcuts. A new Masking feature adds new automation to the workflow; when users double-click the Rectangle or Ellipse tools, a centered mask is automatically placed to fill the center of the screen. Masks are also automatically assigned colors, which can be changed to more easily identify different masks.
• Effects: Users can now double-click the effects panel to apply to the selected layer and drop 2D effects directly onto layers in the viewer. Some effects — such as the Chroma Key and Light Flares — can be dropped on a specific point, or users can select a specific color to key by. Users can also now favorite “effects” for quick and easy access to their five most recently used effects from the ‘Effects’ menu in the toolbar.
• Additional Improvements: Users can now use Behavior effects from the editor timeline, click-drag across multiple layers to toggle “solo,” “locked” or “visibility” settings in one action, and access templates directly from the media panel with the new Templates button. Menus have also been added to the tab of each panel to make customization of the interface easier.
• Open Imerge Pro files in HitFilm: Imerge Pro files can now be opened directly from HitFilm as image assets. Any changes made in the Imerge Pro project will be automatically updated with any save, making it easier to change image assets in real time.
• Introducing Light Mode: The HitFilm Express interface is now available in Light Mode and will open in Light Mode the first time you open the software. Users with a pre-existing HitFilm Express license can easily change back to the dark theme if desired.

HitFilm Express 14 is available immediately and is a free download. Customers downloading HitFilm Express 14 for the first time are eligible to participate in the new Pay What You Want initiative. Free effects and software packs offered in conjunction with Pay What You Want are only available at initial download of HitFilm Express 14.

Nomad Editorial hires eclectic editor Dan Maloney

Nomad Editing Company has added editor Dan Maloney to its team. Maloney is best known for his work cutting wry, eclectic comedy spots in addition to more emotional content. While his main tool is Avid Media Composer, he is also well-versed in Adobe Premiere.

“I love that I get to work in so many different styles and genres. It keeps it all interesting,” he says.

Prior to joining Nomad, Maloney cut at studios such as Whitehouse Post, Cut+Run, Spot Welders and Deluxe’s Beast. Throughout his career, Maloney has used his eye for composition on a wide range of films, documentaries, branded content and commercials, including the Tide Interview spot that debuted at Super Bowl XLII.

“My editing style revolves mostly around performance and capturing that key moment,” he says. “Whether I’m doing a comedic or dramatic piece, I try to find that instance where an actor feels ‘locked in’ and expand the narrative out from there.”

According to Nomad editor/partner Jim Ulbrich, “Editing is all about timing and pace. It’s a craft and you can see Dan’s craftsmanship in every frame of his work. Each beat is carefully constructed to perfection across multiple mediums and genres. He’s not simply a comedy editor, visual storyteller, or doc specialist. He’s a skilled craftsman.”

Adobe Premiere Productions: film projects, collaborative workflows

By Mike McCarthy

Adobe announced a new set of features coming to its NLE Premiere Pro. They now support “Productions” within Premiere, which allows easier management of sets of projects being shared between different users. The announcement, which came during the Sundance Film Festival, is targeted at filmmakers working on large-scale projects with teams of people collaborating on site.

Productions extends and refines Premiere’s existing “Shared Project” model, making it easier to manage work spread across a large number of individual projects, which can become unwieldy with the current implementation.

Shared Projects should not be confused with Team Projects, an online project-sharing toolset for teams in different locations that each have their own local media, or with Adobe Anywhere, a cloud-based streaming editing platform with no local files. Shared Projects are used between users on a local network, usually with high-quality media, with simple mechanisms for passing work between different users. Shared Projects were introduced in Premiere Pro 2018 and included three components. Here, I’m going to tell you what the issues were and how the new Adobe Productions solves them:

1) The ability to add a shortcut to another project into the project panel, which was next to useless. The projects were in no other way connected with each other, and incrementing the target project to a new name (V02) broke the link. The only benefit was to see who might have the linked project open and locked, which brings us to:

2) The ability to lock projects that were open on one system, preventing other users from inadvertently editing them at the same time and overwriting each other’s work, which should have been added a long time ago. This was previously managed through a process called “shout down the hall” before opening projects.

3) And most significantly, the ability to open more than one project at the same time. The previous approach was to import other projects into your existing project, but this resulted in massive project files that took forever to load, among other issues. Opening more than one project at once allowed projects to be broken into smaller individual parts, and then different people could more easily work on different parts at the same time.

For the last two-plus years, large films have been able to break down their work into many smaller projects and distribute those projects between numerous users who are working on various parts. And those users can pass the pieces back and forth without concern for overwriting each other’s work. But there was no central way to control all of those projects, and the master project/Shared Project shortcut system required you either not to version your projects (bad file management) or to re-link every project version to the master project (tedious).

You also end up with lots of copies of your media, as every time an asset is used in a different project, a new copy is created in that project. If you update or edit an asset in one project, it won’t change the copies that are used in other projects (master clip effects, relinking, reinterpreting footage, proxies, etc.).

Problems Solved
Premiere’s new Production Panel and tool set are designed to solve those problems. First, it gives you a panel to navigate and explore all of the projects within your entire production, however you structure them within your master project folder. You can see who has what open and when.

When you copy an asset into a sequence from another project, it maintains a reference to the source project, so subsequent changes to that asset (color correction, attaching full-res media, etc.) can propagate to the instance in the sequence of the other project, provided both projects are open concurrently to sync.

If the source project can’t be found, the child instance is still a freestanding piece of media that fully functions; it just no longer receives synchronized updates from the master copy. (So you don’t have a huge web of interconnected projects that will all fail if one of them is corrupted or deleted.)

All projects in a Production have the same project settings (scratch disks, GPU renderer, etc.), keeping them in sync and allowing you to update those settings across the production and share render files between users. And all files are stored on your local network for maximum performance and security.

In practice, this allows all of the source media to be in dedicated “dailies” projects, possibly a separate project for every day of filming. Then each scene or reel can be its own project, with every instance in the sequences referencing back to a master file in the dailies. Different editors and assistants can be editing different scenes, and all of them can have any source project open concurrently in read-only mode without conflict. As soon as someone saves changes, an icon will alert users that they can update the copy they have open and unlock it to continue working.
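The dailies/reels structure described above is just a folder convention on shared storage. As a minimal illustration, here is one hypothetical way such a master production folder might be laid out (all names are mine, not Adobe's; any structure under the master folder works):

```python
import tempfile
from pathlib import Path

# Build a hypothetical Production master folder: one dailies project per
# shoot day, plus one project per reel that references the dailies media.
root = Path(tempfile.mkdtemp()) / "MyFilm_Production"
(root / "Dailies").mkdir(parents=True)
(root / "Reels").mkdir()

for day in range(1, 4):
    # One project per day of filming holds that day's source media
    (root / "Dailies" / f"Day_{day:02d}.prproj").touch()
for reel in range(1, 3):
    # Each reel/scene is its own small project an editor can lock and work in
    (root / "Reels" / f"Reel_{reel}.prproj").touch()

projects = sorted(p.relative_to(root).as_posix() for p in root.rglob("*.prproj"))
print(projects)
# ['Dailies/Day_01.prproj', 'Dailies/Day_02.prproj', 'Dailies/Day_03.prproj',
#  'Reels/Reel_1.prproj', 'Reels/Reel_2.prproj']
```

Keeping projects this small is the point: each editor locks only the reel they are cutting, while every dailies project stays open read-only for everyone else.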

Some Limitations
Moving a sequence from one project to another doesn’t retain a link to the original because that could become a mess quickly. But it would be nice to be able to make edits to “reels” and have those changes reflected in a long-play project that strings those reels together. And with so many projects open at once, it can become difficult to keep track of what sequences go with what project panels.

Ideally, a color-coded panel system would help with that, either with random colors for contrast or with user-assigned colors by type of project. In that case it would still be good to highlight what other panels are associated with the selected panel, since two projects might be assigned the same color.

Summing Up
Regardless of those potential changes, I have been using Shared Projects to its fullest potential on a feature film throughout 2019, and I look forward to the improvements that the new Production panel will bring to my future workflows.

Check out this video rundown:


Mike McCarthy is an online editor/workflow consultant with over 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

Quick Chat: Director Sorrel Brae on Rocket Mortgage campaign

By Randi Altman

Production company Native Content and director Sorrel Brae have collaborated once again with Rocket Mortgage’s in-house creative team on two new spots in the ongoing “More Than a House” campaign. Brae and Native had worked together on the campaign’s first four offerings.

The most recent spots are More Than a Tradition and More Than a Bear. More Than a Tradition shows a ‘50s family sitting down to dinner and having a fun time at home. Then the audience sees the same family in modern times, hammering home how traditions become traditions.

More Than a Bear combines fantasy and reality as it shows a human-sized teddy bear on an operating table. Then viewers see a worried boy looking on as his mother repairs his stuffed animal. Each spot opens with the notes of Bob Dylan’s “The Man In Me,” which is featured in all the “More Than a House” spots.

More Than a Bear was challenging, according to Brae, because there was some darker material in this piece as compared to the others  —  viewers aren’t sure at first if the bear will make it. Brae worked closely with DP Jeff Kim on the lighting and color palette to find a way to keep the tone lighthearted. By embracing primary colors, the two were able to channel a moodier tone and bring viewers inside a scared child’s imagination while still maintaining some playfulness.

We reached out to director Brae to find out more.

Sorrel Brae

What did you shoot these two spots on, and why?
I felt that in order for the comedy to land and the idea to shine, the visual separation between fantasy and reality had to be immediate, even shocking. Shooting on an Alexa Mini, we used different lenses for the two looks: Hawk V-Lite Vintage ’74 anamorphic for epic and cinematic fantasy, and spherical Zeiss and Cooke S4 primes for reality. The notable exception was in the hospital for the teddy bear spot, where our references were the great Spielberg and Zemeckis films from the ‘80s, which are primarily spherical and have a warmer, friendlier feeling.

How did you work with the DP and the colorist on the look? And how would you describe the look of each spot, and the looks within each spot? 
I was fortunate to bring on longtime collaborators DP Jeffrey Kim and colorist Mike Howell for both spots. Over the years, Jeff and I have developed a shorthand for working together. It all starts with defining our intention and deciding how to give the audience the feelings we want them to have.

In Tradition, for example, that feeling is a warm nostalgia for a bygone era that was probably a fantasy then, just as it is now. We looked to period print advertisements, photographs, color schemes, fonts — everything that spoke to that period. Crucial to pulling off both looks in one day was Heidi Adams’ production design. I wanted the architecture of the house to match when cutting between time periods. Her team had to put a contemporary skin on a 1950s interior for us to shoot “reality” and then quickly reset the entire house back to 1950s to shoot “fantasy.”

The intention for More Than a Bear was trickier. From the beginning I worried a cinematic treatment of a traumatic hospital scene wouldn’t match the tone of the campaign. My solution with Jeff was to lean into the look of ‘80s fantasy films like E.T. and Back to the Future with primary colors, gelled lights, a continuously moving camera and tons of atmosphere.

Mike at Color Collective even added a retro Ektachrome film emulation for the hospital and a discontinued Kodak 5287 emulation for the bedroom to complete the look. But the most fun was the custom bear that costume designer Bex Crofton-Atkins created for the scene. My only regret is that the spot isn’t 60 seconds because there’s so much great bear footage that we couldn’t fit into the cut.

What was this edited on? Did you work with the same team on both campaigns?
The first four spots of this campaign were cut by Jai Shukla out of Nomad Edit. Jai did great work establishing the rhythm between fantasy and reality and figuring out how to weave in Bob Dylan’s memorable track for the strongest impact. I’m pretty sure Jai cuts on Avid, which I like to tease him about.

These most recent two spots (Tradition and Teddy Bear) were cut by Zach DuFresne out of Hudson Edit, who did an excellent job navigating scripts with slightly different challenges. Teddy Bear has more character story than any of the others, and Tradition relies heavily on making the right match between time periods. Zach cuts on Premiere, which I’ve also migrated to (from FCP 7) for personal use.

Were any scenes more challenging than the others?
What could be difficult about kids, complex set design, elaborate wardrobe changes and detailed camera moves on a compressed schedule? In truth, it was all equally challenging and rewarding.

Ironically, the shots that gave us the most difficulty probably look the simplest. In Tradition there's a Steadicam move that introduces us to the contemporary world, has match cuts on either end and travels through most of the set and across most of the cast. Because everyone's movements had to align perfectly with a non-repeatable camera, that one took longer than expected.

And on Teddy Bear, the simple shot looking up from the patient’s POV as the doctor/mom looms overhead was surprisingly difficult. Because we were on an extremely wide lens (12mm or similar), our actress had to nail her marks down to the millimeter, otherwise it looked weird. We probably shot that one setup 20 times.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Talking with Franki Ashiruka of Nairobi’s Africa Post Office

By Randi Altman

After two decades of editing award-winning film and television projects for media companies throughout Kenya and around the world, Franki Ashiruka opened Africa Post Office, a standalone post house in Nairobi, Kenya. The studio provides color grading, animation, visual effects, motion graphics, compositing and more. In addition, they maintain a database of the Kenyan post production community that allows them to ramp up with the right artists when the need arises.

Here she talks about the company, its workflow and being a pioneer in Nairobi’s production industry.

When did you open Africa Post Office, and what was your background prior to starting this studio?
Africa Post Office (APO) opened its doors in February 2017. Prior to starting APO, I was a freelance editor with plenty of experience working with well-established media houses such as Channel 4 (UK), Fox International Channels (UK), 3D Global Leadership (Nigeria), PBS (USA), Touchdown (New Zealand), Greenstone Pictures (New Zealand) and Shadow Films (South Africa).

In terms of Kenya-based projects, I’ve worked with a number of production houses including Quite Bright Films, Fat Rain Films, Film Crew in Africa, Mojo Productions, Multichoice, Zuku, Content House and Ginger Ink Films.

I imagine female-run, independent studios in Africa are rare?
On the contrary, Kenya has reached a point where more and more women are emerging as leaders of their own companies. I actually think there are more women-led film production companies than male-led. The real challenge was that before APO, there was nothing quite like it in Nairobi. Historically, video production here was very vertical — if you shot something, you'd also need to manage post within whatever production house you were working in. There were no standalone post houses until us. That said, even though starting APO was hugely daunting, I never thought twice about it. It is what I have always wanted to do, and if being the first company of our kind didn't intimidate me, being female was never going to be a hindrance.

L-R: Franki Ashiruka, Kevin Kyalo, Carole Kinyua and Evans Wenani

What is the production and post industry like in Nairobi? 
When APO first opened, the workload was commercial-heavy, but in the last two years that has steadily declined. We’re seeing this gap filled by documentary films, corporate work and television series. Feature films are also slowly gaining traction and becoming the focus of many up-and-coming filmmakers.

What services do you provide, and what types of projects do you work on?
APO has a proven track record of successful delivery on hundreds of film and video projects for a diverse range of clients and collaborators, including major corporate entities, NGOs, advertising and PR agencies, and television stations. We also have plenty of experience mastering according to international delivery standards. We’re proud to house a complete end-to-end post ecosystem of offline and online editing suites.

Most importantly, we maintain a very thorough database of the post production community in Kenya.
This is of great benefit to our clients, who come to us for a range of services including color grading, animation, visual effects, motion graphics and compositing. We are always excited to collaborate with the right people and get additional perspectives on the job at hand. One of our most notable collaborators is Ikweta Arts (Avatar, Black Panther, Game of Thrones, Hacksaw Ridge), owned and run by Yvonne Muinde. They specialize in VFX services with a focus on quality matte painting/digital environments, art direction, concept and post visual development art. We also collaborate with Keyframe (L’Oréal, BMW and Mitsubishi Malaysia) for motion graphics and animation.

Can you name some recent projects and the work you provided?
We are incredibly fortunate to be able to select projects that align with our beliefs and passions.

Our work on the short film Poacher (directed by Tom Whitworth) won us three global Best Editing Awards from the Short to the Point Online Film Festival (Romania, 2018), Feel the Reel International Film Festival (Glasgow, 2018) and Five Continents International Film Festival (Venezuela, 2019).

Other notable work includes three feature documentaries for the Big Story segment on China Global Television Network, directed by Juan Reina (director of the Netflix Original film Diving Into the Unknown); Lion’s Den (Quite Bright Films), an adaptation of ABC’s Shark Tank; and The Great Kenyan Bake Off (Showstopper Media), adapted from the BBC series The Great British Bake Off. We also worked on Disconnect, a feature film produced by Kenya’s Tosh Gitonga (Nairobi Half Life), a director who is passionate about taking Africa’s budding film industry to the next level. We have also worked on a host of television commercials for clients across East Africa, including Kenya, Rwanda, South Sudan and Uganda.

What APO is most proud of though, is our clients’ ambitions and determination to contribute toward the growth of the African film industry. This truly resonates with APO’s mantra.

You recently added a MAM and some other gear. Can you talk about the need to upgrade?
Bringing on the EditShare EFS 200 nodes has significantly improved the collaborative possibilities of APO. We reached a point where we were quickly growing, and the old approach just wasn’t going to cut it.

Prior to centralizing our content, projects lived on individual hard disks. This meant that if I was editing and needed my assistant to find me a scene or a clip, or I needed VFX on something, I would have to export individual clips to different workstations. This created workflow redundancies and increased potential for versioning issues, which is something we couldn’t afford to be weighed down with.

The remote capabilities of the EditShare system were very appealing as well. Our color grading collaborator, Nic Apostoli of Comfort and Fame, is based in Cape Town, South Africa. From there, he can access the footage on the server and grade it while the client reviews with us in Nairobi. Flow media asset management also helps in this regard. We’re able to effectively organize and index clips, graphics, versions, etc. into clearly marked folders so there is no confusion about what media should be used. Collaboration among the team members is now seamless regardless of their physical location or tools used, which include the Adobe Creative Suite, Foundry Nuke, Autodesk Maya and Maxon Cinema 4D.

Any advice for others looking to break out on their own and start a post house?
Know what you want to do, and just do it! Thanks Nike …


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

ACE Eddie Awards: Parasite and Jojo Rabbit among winners

By Dayna McCallum

The 70th Annual ACE Eddie Awards concluded with wins for Parasite (edited by Jinmo Yang) for Best Edited Feature Film (Dramatic) and Jojo Rabbit (edited by Tom Eagles) for Best Edited Feature Film (Comedy). Yang’s win marks the first time in ACE Eddie Awards history that a foreign language film won the top prize.

The winner of the Best Edited Feature Film (Dramatic) category has gone on to win the Oscar for film editing in 11 of the last 15 years. In other feature categories, Toy Story 4 (edited by Axel Geddes, ACE) won Best Edited Animated Feature Film and Apollo 11 (edited by Todd Douglas Miller) won Best Edited Documentary.

For the second year in a row, Killing Eve won for Best Edited Drama Series (Commercial Television) for “Desperate Times” (edited by Dan Crinnion). Tim Porter, ACE, took home his second Eddie for Game of Thrones “The Long Night” in the Best Edited Drama Series (Non-Commercial Television) category, and Chernobyl “Vichnaya Pamyat” (edited by Jinx Godfrey and Simon Smith) won Best Edited Miniseries or Motion Picture for Television.

Other television winners included Better Things “Easter” (edited by Janet Weinberg) for Best Edited Comedy Series (Commercial Television), and last year’s Eddie winner for Killing Eve, Gary Dollner, ACE, for Fleabag “Episode 2.1” in the Best Edited Comedy Series (Non-Commercial Television) category.

Lauren Shuler Donner received the ACE’s Golden Eddie honor, presented to her by Marvel’s Kevin Feige. In her heartfelt acceptance speech, she noted to an appreciative crowd, “I’ve witnessed many times an editor make chicken salad out of chicken shit.”

Alan Heim and Tina Hirsch received Career Achievement awards presented by filmmakers Nick Cassavetes and Ron Underwood respectively. Cathy Repola, national executive director of the Motion Picture Editors Guild, was presented with the ACE Heritage Award. American Cinema Editors president Stephen Rivkin, ACE, presided over the evening’s festivities for the final time, as his second term is ending. Actress D’Arcy Carden, star of NBC’s The Good Place, served as the evening’s host.

Here is the complete list of winners:

BEST EDITED FEATURE FILM (DRAMA):
Parasite 
Jinmo Yang

BEST EDITED FEATURE FILM (COMEDY):
Jojo Rabbit
Tom Eagles

BEST EDITED ANIMATED FEATURE FILM:
Toy Story 4
Axel Geddes, ACE

BEST EDITED DOCUMENTARY (FEATURE):
Apollo 11
Todd Douglas Miller

BEST EDITED DOCUMENTARY (NON-THEATRICAL):
What’s My Name: Muhammad Ali
Jake Pushinsky, ACE

BEST EDITED COMEDY SERIES FOR COMMERCIAL TELEVISION:
Better Things: “Easter”
Janet Weinberg, ACE

BEST EDITED COMEDY SERIES FOR NON-COMMERCIAL TELEVISION:
Fleabag: “Episode 2.1”
Gary Dollner, ACE

BEST EDITED DRAMA SERIES FOR COMMERCIAL TELEVISION: 
Killing Eve: “Desperate Times”
Dan Crinnion

BEST EDITED DRAMA SERIES FOR NON-COMMERCIAL TELEVISION:
Game of Thrones: “The Long Night”
Tim Porter, ACE

BEST EDITED MINISERIES OR MOTION PICTURE FOR TELEVISION:
Chernobyl: “Vichnaya Pamyat”
Jinx Godfrey & Simon Smith

BEST EDITED NON-SCRIPTED SERIES:
VICE Investigates: “Amazon on Fire”
Cameron Dennis, Kelly Kendrick, Joe Matoske, Ryo Ikegami

ANNE V. COATES AWARD FOR STUDENT EDITING
Chase Johnson – California State University, Fullerton


Main Image: Parasite editor Jinmo Yang

Directing bookend sequences for Portals, a horror anthology film

By Hasraf “HaZ” Dulull

Portals is a genre-bending feature film anthology focusing on a series of worldwide blackouts — after which millions of mysterious objects appear everywhere across the planet. While many flee from the sentient objects, some people are drawn toward and into them with horrifying consequences.

Portals

The film was in the final stages of post when writer/director Liam O’Donnell (Beyond Skyline and the upcoming Skylines film) called to see if I would like to get involved and direct some bookend sequences to add more scope and setup, which the producers felt was very much needed. I loved the premise and the world of the anthology, so I said yes. I pitched an idea for an ending that quickly evolved into an extra segment at the end of the film, which I directed. That’s why there are officially four directors on the show, with me getting executive producer and “end-segment created by” credits.

Two of the other sequences are around 20 to 25 minutes each, and O’Donnell’s sequence was around 35 minutes. The film is 85 minutes long. Eduardo Sanchez and Gregg Hale (The Blair Witch Project) co-directed their segments. So the anthology feature film is really three long segments with my bookend sequences. The only connections among all the stories are the objects that appear, the event itself and the actual “portal,” but everything else was unique to each segment’s story. In terms of production, the only consistencies throughout the anthology were the camera language — that slight hand-held feel — and, of course, the music/sound.

I had to watch the latest cut of the entire anthology film to get my head into that world, but I was given freedom to bring my own style to my sequences. That is exactly the point of an anthology — for each director to bring his or her own sensibilities to the individual segments. Besides Liam, the main producers I worked closely with on this project were Alyssa Devine and Griffin Devine from Pigrat Productions. They are fans of my first feature film, The Beyond, so they really encouraged the grounded tone I had demonstrated in that film.

The portal in Portals.

I’ve been a huge advocate of Blackmagic cameras and technology for a long time. Additionally, I knew I had a lot to shoot in a very short time span (two days!), so I needed a camera that was light and flexible yet able to shoot 4K. I brought on cinematographer Colin Emerson, who shoots in a very loose way but always makes his stuff look cinematic. We watched the cut of the film and noticed the consistently loose nature of the cinematography on all the segments. Colin uses the Fig Rig a lot, and I love the way that rig works; the BMD Pocket Cinema 4K fits nicely on it along with the DSLR lenses he likes to use. The other reason was being able to use Blackmagic’s new BRaw format.

We also shot the segment using a skeleton crew, which comprised myself as director/producer; VFX supervisor/1st AD John Sellings, who also did some focus pulling; James De Taranto (sound recording); DP/camera op Colin Emerson; FX makeup artists Kate Griffith and Jay James; and our two actors, Georgina Blackledge and Dare Emmanuel. I worked with both of them on my feature film The Beyond.

The Post
One thing I wanted to make sure of was that the post team at The Institution in LA could take my Resolve files and work directly from them for the picture post. One of the things I did during prep (before we even cast) was to shoot some tests to show what I had in mind in terms of look and feel. We also tested the BRaw and color workflow between my setup in London and the LA team; Colin and I did this during the location recce. This proved extremely useful in ensuring we set our camera to the exact specs the post house wanted. So we shot at 23.98fps, 4K (4096×1716, a 2.39:1 crop), in Blackmagic Design log color space.
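As a side note, the 2.39:1 crop can be sanity-checked from the pixel dimensions quoted above. This is a quick illustrative calculation, not part of any camera or Resolve tooling:

```python
# Verify that a 4096x1716 frame is approximately the 2.39:1 "scope"
# aspect ratio referred to above. Purely illustrative arithmetic.

def aspect_ratio(width: int, height: int) -> float:
    """Return the frame's aspect ratio as width over height."""
    return width / height

ratio = aspect_ratio(4096, 1716)
print(f"{ratio:.2f}:1")  # prints "2.39:1"
```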

HaZ’s segments were captured with the Blackmagic Pocket Cinema Camera.

During the test, I did some quick color tests to show the producers in LA the tone and mood I was going for and to make sure everyone was on board before I shot it. The look was very post-apocalyptic, as it’s set after the main events have happened. I wanted the locations to contrast with each other: one interior and one exterior with greens.

Colin is used to shooting most of his work on the Panasonic GH series, but he had the Pocket Cinema Camera and was looking for the right project to use it on. He found he could use all of his usual lenses because the Pocket Cinema Camera has the same mount. Lenses used were the Sigma 18-35mm f/1.8 with a Metabones Speed Booster, the Olympus 12mm f/2 and the Lumix 35-100mm f/2.8.

Colin used the onboard monitor screen on the Pocket Cinema Camera, while I used a tethered external monitor — the Ikan DH5e — for directing. We used a 1TB Samsung external SSD securely attached to the rig cage along with a 64GB CFast card. The resolution we shot in was determined by the tests we did. We set up the rushes for post after each of the two days of the shoot, so during the day we would swap out drives and back things up. At the end of the day, we would bring in all the picture and sound rushes and use the amazing autosync feature in Blackmagic DaVinci Resolve to set it all up. This way, when I headed back home I could start editing right away inside Resolve.
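A nightly swap-and-back-up routine like the one described above can be sketched in a few lines. This is a generic illustration with hypothetical folder names and an added checksum-verification step, not the production's actual pipeline:

```python
# Minimal sketch of a dailies backup: copy each camera card to the
# rushes drive and verify every copy with a checksum. Paths are
# hypothetical examples only.
import hashlib
import shutil
from pathlib import Path

def file_sha256(path: Path) -> str:
    """Checksum a file in chunks so large camera files don't fill RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def backup_card(card: Path, rushes: Path) -> None:
    """Copy every file from the card, failing loudly if a copy differs."""
    for src in card.rglob("*"):
        if not src.is_file():
            continue
        dst = rushes / src.relative_to(card)
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)
        if file_sha256(src) != file_sha256(dst):
            raise IOError(f"Checksum mismatch: {src}")

# Example (hypothetical volume names):
# backup_card(Path("/Volumes/CFAST_01"), Path("/Volumes/RUSHES/day01"))
```

Once the verified picture and sound rushes are on the drive, a tool like Resolve's autosync can take over, as described above.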

Resolve

I have to admit, we were hesitant at first because I was shooting and capturing log in QuickTime ProRes 4:4:4:4, and I had always avoided DNG raw because of the huge file sizes and data transfer. But the team at Blackmagic was supportive right up to the end of the shoot, and after testing BRaw I was impressed. We had so much control, as all that information is accessible within Resolve. I was able to set the temp look during editing, and the colorist worked from there. Skin tones were of utmost importance; because of the intimate nature of the drama, I wanted a natural look to the skin tones. I am really happy with the way they came out in the end.

The LA post team couldn’t believe how cinematic the footage was when we told them we shot it on the Pocket Cinema Camera, since the other segments were shot on cameras like Red. We delivered to the same 4K deliverables spec as the other segments in the film.

HaZ on set, second from right.

I used the AMD Radeon RX Vega 56 version of the Blackmagic eGPU, because I wanted to edit on my MacBook Pro (late 2017) and needed the power to run 4K in realtime. I was impressed with how much power it provided; it was like having a new MacBook Pro without having to buy one. The eGPU also had all the connectivity I needed (two Thunderbolt and four USB 3 ports), which addresses a limitation of the MacBook Pro.

The beauty of keeping everything native was that there wasn’t much work to do when porting, as it’s just plug and play, and Resolve detects the eGPU, which you can then set as the default. The BRaw format makes it all manageable to preview and play back in real time. Also, since it’s native, Resolve doesn’t need to do any transcoding in the background. I have always been a huge fan of the tracking in Resolve, and I was able to do eye effects very easily without them being budgeted or done as VFX shots. I was able to get the VFX render assets from the visual effects artist (Justin Martinez) in LA and do quick slap comps during editing. I love that I can set looks and store them as memories, which I can then recall very quickly to apply to a bunch of shots. This allowed me to have a slick-looking preview rough cut of the film.

Portals

I sent a hard drive containing all the organized rushes to the team in LA while I was doing the final tweaks to the edit. Once the edit was signed off, or if any last-minute notes came in, I would address them and email my Resolve file. It was super simple, and the colorist (Oliver Ojeil) and post team (Chad Van Horn and Danny Barone) in LA appreciated the simple workflow because there really wasn’t any conforming for them to do apart from a one-click relink of media locations; they would just take my Resolve file and start working with it.

We used practical effects to keep the horror as real and grounded as possible, and used VFX to augment them further. We were fortunate to get special effects makeup artist Kate Griffiths. Given the tight schedule, she was able to create a terrifying effect, which I won’t give away. You need to watch the film to see it! We had to shoot those makeup-FX-heavy shots at the end of the day, which meant we had to be smart about how we scheduled the shoot given the hours-long makeup process. Kate was also on hand to provide effects like the liquid coming out of the eyes, sweat and so on — every detail of which the camera picked up for us so we could bring it out in the grade.

The Skype-style shots at the start of the film (phone and computer monitor shots) had their VFX screen elements placed as a separate layer so the post team in LA could grade them separately and control the filters applied on them. For some of the wide shots showing our characters entering and leaving the portal, we keyframed some movement of the 4K shot along with motion blur to give the effect of in-camera movement. I also used the camera shake within Resolve, which comes with so many options to create bespoke movement on static frames.

Portals is now available on iTunes and other VOD platforms.


HaZ Dulull is known for his sci-fi feature films The Beyond and 2036 Origin Unknown, as well as for his pilot and episodes of Disney’s Fast Layne. He is currently busy on projects at various stages of development and production at his production company, hazfilm.com.

Review: HP’s ZBook G6 mobile workstation

By Brady Betzel

In a year that’s seen AMD reveal an affordable 64-core processor with its Threadripper 3, it appears as though we are picking up steam toward next-level computing.

Apple finally released its much-anticipated Mac Pro (which comes with a hefty price tag for the 1.5TB upgrade), and custom-build workstation companies — like Boxx and Puget Systems — can customize good-looking systems to fit any need you can imagine. Additionally, over the past few months, I have seen mobile workstations leveling the playing field with their desktop counterparts.

HP is well-known in the M&E community for its powerhouse workstations. Since I started my career, I have worked on either a Mac Pro or an HP, and both have their strong points. However, for workstation users who must be able to travel with their systems, there have always been technical capabilities you had to give up in exchange for a smaller footprint. That is, until now.

The newly released HP ZBook 15 G6 is the rising tide that will float all the boats in the mobile workstation market. I know I’ve said it before, but the classification of “workstation” is much more than a term companies throw around. Systems with workstation-level classification (at least from HP) are meant to be powered on and run at high levels 24 hours a day, seven days a week, 365 days a year.

They are built with high-quality, enterprise-level components, such as ECC (error correcting code) memory. ECC memory will self-correct errors that it sees, preventing things like blue screens of death and other screen freezes. ECC memory comes at a cost, and that is why these workstations are priced a little higher than a standard computer system. In addition, the warranties are a little more inclusive — the HP ZBook 15 G6 comes with a standard three-year/on-site service warranty.

Beyond the “workstation” classification, the ZBook 15 G6 is amazingly powerful, brutally strong and incredibly colorful and bright. But what really matters is under the hood. I was sent the HP ZBook 15 G6 that retails for $4,096 and contains the following specs:
– Intel Xeon E-2286M (eight cores/16 threads — 2.4GHz base/5GHz Turbo)
– Nvidia Quadro RTX 3000 (6GB VRAM)
– 15.6-inch UHD HP DreamColor display, anti-glare, WLED backlit, 600 nits, 100% DCI-P3
– 64GB DDR4 2667MHz
– 1TB PCIe Gen 3 x4 NVMe SSD TLC
– FHD webcam 1080p plus IR camera
– HP collaboration keyboard with dual point stick
– Fingerprint sensor
– Smart Card reader
– Intel Wi-Fi 6 AX 200, 802.11ac 2×2 +BT 4.2 combo adapter (vPro)
– HP long-life battery four-cell 90 Wh
– Three-year limited warranty

The ZBook 15 G6 is a high-end mobile workstation with a price that reflects it. However, as I said earlier, true workstations are built to withstand constant use and, in this case, abuse. The ZBook 15 G6 has been designed to pass up to 21 extensive MIL-STD 810G tests, which is essentially worst-case-scenario testing: drop testing from around four feet, sand and dust testing, radiation testing (the sun beating down on the laptop for an extended period) and much more.

The exterior of the G6 is made of aluminum and built to withstand abuse. The latest G6 is a little bulky/boxy, in my opinion, but I can see why it would hold up to some bumps and bruises, all while working at blazingly fast speeds, so bulk isn’t a huge issue for me. Because of that bulk, you can imagine that this isn’t the lightest laptop either. It weighs in at 5.79 pounds for the lowest end and measures 1 inch by 14.8 inches by 10.4 inches.

On the bottom of the workstation is an easy-to-access panel for performing repairs and upgrades yourself. I really like the bottom compartment. I opened it and noticed I could throw in an additional NVMe drive and an SSD if needed. You can also access memory here. I love this because not only can you perform easy repairs yourself, but you can perform upgrades or part replacements without voiding your warranty on the original equipment. I’m glad to see that HP kept this in mind.

The keyboard is smaller than a full-size version but has a number keypad, which I love using when typing in timecodes. It is such a time-saver for me. (I credit entering in repair order numbers when I fixed computers at Best Buy as a teenager.) On the top of the keyboard are some handy shortcuts if you do web conferences or calls on your computer, including answering and ending calls. The Bang & Olufsen speakers are some of the best laptop speakers I’ve heard. While they aren’t quite monitor-quality, they do have some nice sound on the low end that I was able to fine-tune in the Bang & Olufsen audio control app.

Software Tests
All right, enough of the technical specs. Let’s get on to what people really want to know — how the HP ZBook 15 G6 performs while using apps like Blackmagic’s DaVinci Resolve and Adobe Premiere Pro. I used sample Red and Blackmagic Raw footage that I use a lot in testing. You can grab the Red footage here and the BRaw footage here. (Keep in mind you will need to download the BRaw software to edit with BRaw inside of Adobe products, which you can also find here.)

Performance monitor while exporting in Resolve with VFX.

For testing in Resolve and Premiere, I strung out one minute each of 4K, 6K and 8K Red media in one sequence and the 4608×2592 4K and 6K BRaw media in another. In the middle of my testing, Resolve received a major Red API upgrade that allows for better realtime playback of Red Raw files if you have an Nvidia CUDA-based GPU.

First up is Resolve 16.1.1 and then Resolve 16.1.2. Both sequences are set to UHD (3840×2160) resolution. One sequence of each codec contains just color correction, while another of each codec contains effects and color correction. The Premiere sequence with color and effects contains basic Lumetri color correction, noise reduction (50) and a Gaussian blur with settings of 0.4. In Resolve, the only difference in the color and effects sequence is that the noise reduction is spatial and set to Enhanced, Medium and 25/25.

In Resolve, the 4K Red media would play in realtime while the 6K (RedCode 3:1) would jump down to about 14fps to 15fps, and the 8K (RedCode 7:1) would play at 10fps at full resolution with just color correction. With effects, the 4K media would play at 20fps, 6K at 3fps and 8K at 10fps. The Blackmagic Raw video would play at real time with just color correction and around 3fps to 4fps with effects.

This is where I talk about just how loud the fans in the ZBook 15 G6 can get. When running exports and benchmarks, the fans are noticeable and a little distracting. Obviously, we are running some high-end testing with processor- and GPU-intensive tests but still, the fans were noticeable. However, the bottom of the mobile workstation was not terribly hot, unlike the MacBook Pros I’ve tested before. So my lap was not on fire.

In my export testing, I used those same sequences as before, exporting from Adobe Premiere Pro 2020. I exported UHD files using Adobe Media Encoder in different containers and codecs: H.264 (MOV), H.265 (MOV), ProResHQ, DPX, DCP and MXF OP1a (XDCAM). The MXF OP1a was a 1920x1080p export.
Here are my results:

Red (4K, 6K, 8K)
– Color Only: H.264 – 5:27; H.265 – 4:45; ProResHQ – 4:29; DPX – 3:37; DCP – 10:38; MXF OP1a – 2:31

– Color, Noise Reduction (50), Gaussian Blur 0.4: H.264 – 4:56; H.265 – 4:56; ProResHQ – 4:36; DPX – 4:02; DCP – 8:20; MXF OP1a – 2:41

Blackmagic Raw
– Color Only: H.264 – 2:05; H.265 – 2:19; ProResHQ – 2:04; DPX – 3:33; DCP – 4:05; MXF OP1a – 1:38

– Color, Noise Reduction (50), Gaussian Blur 0.4: H.264 – 1:59; H.265 – 2:22; ProResHQ – 2:07; DPX – 3:49; DCP – 3:45; MXF OP1a – 1:51

What is surprising is that when adding effects like noise reduction and a Gaussian blur in Premiere, the export times stayed similar. While using the ZBook 15 G6, I noticed my export times improved when I upgraded driver versions, so I re-did my tests with the latest Nvidia drivers to make sure I was consistent. The drivers also solved an issue in which Resolve wasn’t reading BRaw properly, so remember to always research drivers.

The Nvidia Quadro RTX 3000 really pulled its weight when editing and exporting in both Premiere and Resolve. In fact, in previous versions of Premiere, I noticed that the GPU was not really being used as well as it should have been. With the Premiere Pro 2020 upgrade it seems like Adobe really upped its GPU usage game — at some points I saw 100% GPU usage.

In Resolve, I performed similar tests, but I exported a DNxHR QuickTime file instead of ProResHQ and an IMF package instead of a DCP. For the most part, these are stock exports from Resolve’s Deliver page, except that I forced video levels and set debayer and resizing to highest quality. Here are my results from Resolve versions 16.1.1 and 16.1.2 (16.1.2 results are in parentheses):

– Red (4K, 6K, 8K) Color Only: H.264 – 2:17 (2:31); H.265 – 2:23 (2:37); DNxHR – 2:59 (3:06); IMF – 6:37 (6:40); DPX – 2:48 (2:45); MXF OP1A – 2:45 (2:33)

– Color, Noise Reduction (Spatial, Enhanced, Medium, 25/25), Gaussian Blur 0.4: H.264 – 5:00 (5:15); H.265 – 5:18 (5:21); DNxHR – 5:25 (5:02); IMF – 5:28 (5:11); DPX – 5:23 (5:02); MXF OP1a – 5:20 (4:54)

– Blackmagic Raw Color Only: H.264 – 0:26 (0:25); H.265 – 0:31 (0:30); DNxHR – 0:50 (0:50); IMF – 3:51 (3:36); DPX – 0:46 (0:46); MXF OP1a – 0:23 (0:22)

– Color, Noise Reduction (Spatial, Enhanced, Medium, 25/25), Gaussian Blur 0.4: H.264 – 7:51 (7:53); H.265 – 7:45 (8:01); DNxHR – 7:53 (8:00); IMF – 8:13 (7:56); DPX – 7:54 (8:18); MXF OP1a – 7:58 (7:57)

Interesting to note: Exporting Red footage with color correction only was significantly faster from Resolve, but for Red footage with effects applied, export times were similar between Resolve and Premiere. With the CUDA Red SDK update to Resolve in 16.1.2, I thought I would see a large improvement, but I didn’t. I saw an approximate 10% increase in playback but no improvement in export times.
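Those mm:ss timings are easy to compare programmatically. The following throwaway script uses the color-only Red export figures quoted in this review (Premiere via Media Encoder vs. Resolve 16.1.1) for three codecs the two lists share:

```python
# Compare the color-only Red export times quoted above: Premiere Pro
# (via Media Encoder) vs. Resolve 16.1.1, for three shared codecs.

def to_seconds(t: str) -> int:
    """Convert an 'mm:ss' timing string to whole seconds."""
    minutes, seconds = t.split(":")
    return int(minutes) * 60 + int(seconds)

premiere = {"H.264": "5:27", "H.265": "4:45", "DPX": "3:37"}
resolve = {"H.264": "2:17", "H.265": "2:23", "DPX": "2:48"}

for codec in premiere:
    ratio = to_seconds(premiere[codec]) / to_seconds(resolve[codec])
    print(f"{codec}: Resolve was {ratio:.1f}x faster")
# H.264 comes out around 2.4x faster, H.265 around 2.0x, DPX around 1.3x.
```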

Puget

Puget Systems has some great benchmarking tools, so I reached out to Matt Bach, Puget Systems' senior labs technician, about my findings. He suggested that the mobile Xeon could still be the bottleneck for Resolve; in his testing, he saw larger speed increases with AMD Threadripper 3- and Intel i9-based systems. Regardless, here I am going deep on realtime playback of 8K Red Raw media on a mobile workstation — what a time we are in. Blackmagic Raw footage was insanely fast to export out of Resolve, while exports of Blackmagic Raw footage with effects took longer than I expected. Much like the new Premiere Pro 2020, Resolve made consistent use of both the GPU and CPU, which is a trend that's nice to see.

In addition to Premiere and Resolve testing, I ran some common benchmarks that provide a good 30,000-foot view of the HP ZBook 15 G6 when comparing it to other systems. I decided to use the Puget Systems benchmarking tools. Unfortunately, at the time of this review, the tools were only working properly with Premiere and After Effects 2019, so I ran the After Effects benchmark using the 2019 version. The ZBook 15 G6 received an overall score of 802, render score of 79, preview score of 75.2 and tracking score of 86.4. These are solid numbers that beat out some desktop systems I have tested.

Corona

To test some 3D applications, I ran Cinebench R20, which gave a CPU score of 3243, a CPU (single core) score of 470 and an M/P ratio of 6.90x. I recently began running the Gooseberry benchmark scene in Blender to get a better sense of 3D rendering performance, and it took 29:56 to render. Using the Corona benchmark, it took 2:33 to render 16 passes at 3,216,368 rays/s. In OctaneBench, the ZBook 15 G6 received a score of 139.79. In the V-Ray CPU benchmark, it received 9,833 Ksamples, and in the V-Ray GPU test, 228 mpaths. I'm not going to lie; I really don't know a lot about what these benchmarks are trying to tell me, but they might help you decide whether this is the mobile workstation for your work.

Cinebench

One benchmark that showed an interesting difference between driver updates for the Nvidia Quadro RTX 3000 was Neat Bench, from Neat Video, the noise reduction plugin for video. It measures whether your system should use the CPU, the GPU or a combination of the two to run Neat Video. Initially, the best combination result was to use the CPU only (seven cores), at 11.5fps.

After updating to the latest Nvidia drivers, the best combination result was to use the CPU (seven cores) and GPU (Quadro RTX 3000) together at 24.2fps. That's a pretty incredible jump just from a driver update. Moral of the story: always make sure you have the right drivers installed!
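To put that jump in perspective, here's a trivial illustrative calculation (mine, not Neat Video's) using the two Neat Bench results above:

```python
# Neat Bench best results before and after the Nvidia driver update
cpu_only_fps = 11.5       # old driver: CPU only (seven cores)
cpu_plus_gpu_fps = 24.2   # new driver: CPU (seven cores) + Quadro RTX 3000

speedup = cpu_plus_gpu_fps / cpu_only_fps
print(f"{speedup:.2f}x faster")  # → 2.10x faster
```

A better-than-2x improvement in Neat Video throughput, purely from updating the driver.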

Summing Up
Overall, the HP ZBook 15 G6 is a powerful mobile workstation that will work well across the board. From 3D to color correction apps, the Xeon processor in combination with the Quadro RTX 3000 will get you running 4K video without a problem. The anti-glare HP DreamColor display offers up to 600 nits of brightness and covers 100% of the DCI-P3 color space, and coupled with the HDR option, you can rely on the attached display for color accuracy if you don't have your output monitor with you. And with features like two USB Type-C ports (Thunderbolt 3, DisplayPort 1.4 and USB 3.1 Gen 2), you can connect external monitors for a larger view of your work.

The HP Fast Charge will get you out of a dead battery fiasco with the ability to go from 0% to 50% charge in 45 minutes. All of this for around $4,000 seems to be a pretty low price to pay, especially because it includes a three-year on-site warranty and because the device is certified to work seamlessly with many apps that pros use with HP’s independent software vendor verifications.

If you are looking for a mobile workstation upgrade, are moving from desktop to mobile or want an alternative to a MacBook Pro, you should price a system out online.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producer’s Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Behind the Title: Film Editor Edward Line

By Randi Altman

This British editor got his start at Final Cut in London, honing his craft and developing his voice before joining Cartel in Santa Monica.

NAME: Edward Line

COMPANY: Cartel

WHAT KIND OF COMPANY IS CARTEL?
Cartel is an editorial and post company based in Santa Monica. We predominantly service the advertising industry but also accommodate long-form projects and other creative content. I joined Cartel as one of the founding editors in 2015.

CAN YOU GIVE US SOME MORE DETAIL ABOUT YOUR JOB?
I assemble the raw material from a film shoot into a sequence that tells the story and communicates the idea of a script. Sometimes I am involved before the shoot and cut together storyboard frames to help the director decide what to shoot. Occasionally, I’ll edit on location if there is a technical element that requires immediate approval for the shoot to move forward.

Edward Line working on Media Composer

During the edit, I work closely with the directors and creative teams to realize their vision of the script or concept and bring their ideas to life. In addition to picture editing, I incorporate sound design, music, visual effects and graphics into the edit. It’s a collaboration between many departments and an opportunity to validate existing ideas and try new ones.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THE FILM EDITOR TITLE?
A big part of my job involves collaborating with others, working with notes and dealing with tricky situations in the cutting room. Part of being a good editor is having the ability to manage people and ideas while not compromising the integrity and craft of the edit. It’s a skill that I’m constantly refining.

WHAT’S YOUR FAVORITE PART OF THE JOB?
I love being instrumental in bringing creative visions together and seeing them realized on screen, while being able to express my individual style and craft.

WHAT’S YOUR LEAST FAVORITE?
Tight deadlines. Filming with digital formats has allowed productions to shoot more and specify more deliverables. However, providing the editor proportional time to process everything is not always a consideration and can add pressure to the process.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
I am a morning person so I tend to be most productive when I have fresh eyes. I’ve often executed a scene in the first few hours of a day and then spent the rest of the day (and night) fine-tuning it.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I have always had a profound appreciation for design and architecture, and in an alternate universe, I could see myself working in that world.

WHY DID YOU CHOOSE THIS PROFESSION?
I’ve always had ambitions to work in filmmaking and initially worked in TV production after I graduated college. After a few years, I became curious about working in post and found an entry-level job at the renowned editorial company Final Cut in London. I was inspired by the work Final Cut was doing, and although I’d never edited before, I was determined to give editing a chance.

CoverGirl

I spent my weekends and evenings at the office, teaching myself how to edit on Avid Media Composer and learning editing techniques with found footage and music. It was during this experimental process, that I fell in love with editing and I never looked back.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
In the past year I have edited commercials for CoverGirl, Sephora, Bulgari, Carl’s Jr. and Smartcar. I have also cut a short film called Dad Was, which will be submitted to festivals in 2020.

HOW HAVE YOU DEVELOPED NEW SKILLS WHEN CUTTING FOR A SPECIFIC GENRE OR FORMAT?
Cutting music videos allowed me to hone my skills to edit musical performance while telling visual stories efficiently. I learned how to create rhythm and pace through editing and how to engage an audience when there is no obvious narrative. The format provided me with a fertile place to develop my individual editing style and perfect my storytelling skills.

When I started editing commercials, I learned to be more disciplined in visual storytelling, as most commercials are rarely longer than 60 seconds. I learned how to identify nuances in performance and the importance of story beats, specifically when editing comedy. I’ve also worked on numerous films with VFX, animation and puppetry. These films have allowed me to learn about the potential for these visual elements while gaining an understanding of the workflow and process.

More recently, I have been enjoying cutting dialogue in short films. Unlike commercials, this format allows more time for story and character to develop. So when choosing performances, I am more conscious of the emotional signals they send to the audience and overarching narrative themes.

Sephora

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
It’s tough to narrow this down to one project…

Recently, I worked on a commercial for the beauty retailer Sephora that promoted its commitment to diversity and inclusivity. The film Identify As We is a celebration of the non-binary community and features a predominantly transgender cast. The film champions ideas of being different and self expression while challenging traditional perceptions of beauty. I worked tirelessly with the director and creative team to make sure we treated the cast and footage with respect while honoring the message of the campaign.

I’m also particularly proud of a short film that I edited called Wale. The film was selected for over 30 film festivals across the globe and won several awards. The culmination of the film’s success was receiving a BAFTA nomination and being shortlisted for the 91st Academy Awards for Best Live Action Short Film.

WHAT DO YOU USE TO EDIT?
I work on Avid Media Composer, but I have recently started to flirt with Adobe Premiere. I think it’s good to be adaptable, and I’d hate to restrict my ability to work on a project because of software.

Wale

ARE YOU OFTEN ASKED TO DO MORE THAN EDIT? IF SO, WHAT ELSE ARE YOU ASKED TO DO?
Yes, I usually incorporate other elements such as sound design, music and visual effects into my edits as they can be instrumental to the storytelling or communication of an idea. It’s often useful for the creative team and other film departments to see how these elements contribute to the final film, and they can sometimes inform decisions in the edit.

For example, sound can play a major part in accenting a moment or providing a transition to another scene, so I often spend time placing sound effects and sourcing music during the edit process. This helps me visualize the scene in a broader context and provides new perspective if I’ve become overfamiliar with the footage.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
No surprises, but my smartphone! Apart from the obvious functions, it’s a great place to review edits and source music when I’m on the move. I’ve also recently purchased a Bluetooth keyboard and Wacom tablet, which make for a tidy work area.

I’m also enjoying using my “smart thermostat” at home which learns my behavior and seems to know when I’m feeling too hot or cold.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Once I have left the edit bay, I decompress by listening to music on the way home. Once home, I take great pleasure from cooking for myself, friends and family.

Maryann Brandon's path, and editing Star Wars: The Rise of Skywalker

By Amy Leland

In the interest of full disclosure, I have been a fan of both the Star Wars world and the work of J.J. Abrams for a very long time. I saw Star Wars: Episode IV – A New Hope  in the theaters with my big brother when I was five years old, and we were hooked. I don’t remember a time in my life without Star Wars. And I have been a fan of all of Abrams’ work, starting with Felicity. Periodically, I go back and rewatch Felicity, Alias and Lost. I was, in fact, in the middle of Season 2 of Alias and had already purchased my ticket for The Rise of Skywalker when I was assigned this interview.

As a female editor, I have looked up to Maryann Brandon, ACE, and Mary Jo Markey, ACE — longtime Abrams collaborators — for years. A chance to speak with Brandon was more than a little exciting. After getting the fangirl out of my system at the start of the interview, we had a wonderful conversation about her incredible career and this latest Star Wars offering.

After NYU film school and years working in New York City's indie film world, Brandon has not only been an important part of J.J. Abrams' world — serving as a primary editor on Alias, and then on Mission: Impossible III, Super 8 and two films each in the Star Trek and Star Wars worlds — but has also edited The Jane Austen Book Club, How to Train Your Dragon and Venom, among others.

Maryann Brandon

Let’s dig a bit deeper with Brandon…

How did your path to editing begin?
I started in college, but I wasn’t really editing. I was just a member of the film society. I was recruited by the NYU Graduate Film program in 1981 because they wanted women in the program. And I thought, it’s that or working on Wall Street, and I wasn’t really that great with the money or numbers. I chose film school.

I had no idea what it was going to be like because I don’t come from a film background or a film family. I just grew up loving films. I ended up spending three years just running around Manhattan, making movies with everyone, and everyone did every job. Then, when I got out of school, I had to finish my thesis film, and there was no one to edit it for me. So I ended up editing it myself. I started to meet people in the business because New York was very close. I got offered a paid position in editing, and I stayed.

I met and worked for some really incredible people along the way. I worked as a second assistant on the Francis Ford Coppola film The Cotton Club. I went from that to working as a first assistant on Richard Attenborough’s version of A Chorus Line. I was sent to London and got swept up in the editing part of it. I like telling stories. It became the thing I did. And that’s how it happened.

Who inspired you in those early days?
I was highly influenced by Dede Allen. She was this matriarch of New York at that time, and I was so blown away by her and her personality. I mean, her work spoke for itself, but she was also this incredible person. I think it’s my nature anyway, but I learned from her early on an approach of kindness and caring. I think that’s part of why I stayed in the cutting room.

On set, things tend to become quite fraught sometimes when you’re trying to make something happen, but the cutting room is this calm place of reality, and you could figure stuff out. She was very influential to me, and she was such a kind, caring person. She cared about everyone in the cutting room, and she took time to talk to everyone.

There was also John Bloom, who was the editor on A Chorus Line. We became very close, and he always used to call me over to see what he was doing. I learned tons from him. In those days, we cut on film, so it was running through your fingers.

The truth is everyone I meet influences me a bit. I am fascinated by each person’s approach and why they see things the way they do.

While your resume is eclectic, you’ve worked on many sci-fi and action films. Was that something you were aiming for, or did it happen by chance?
I was lucky enough to meet J.J. Abrams, and I was lucky enough to get on Alias, which was not something I thought I’d want to do. Then I did it because it seemed to suit me at the time. It was a bit of faith and a bit of, “Oh, that makes sense for you, because you grew up loving Twilight Zone and Star Trek.”

Of course, I’d love to do more drama. I did The Jane Austen Book Club and other films like that. One does tend to get sort of suddenly identified as, now I’m the expert on sci-fi and visual effects. Also, I think because there aren’t a lot of women who do that, it’s probably something people notice. But I’d love to do a good comedy. I’d love to do something like Jumanji, which I think is hilarious.

How did this long and wonderful collaboration with J.J. Abrams get started?
Well, my kids were getting older. It was getting harder and harder for me to go on location with the nanny, the dog, the nanny’s kids, my kids, set up a third grade class and figure out how to do it all. A friend of mine who was a producer on Felicity had originally tried to get me to work on that show. She said, “You’ll love J.J. You’ll love (series creator) Matt Reeves. Come and just meet us.” I just thought television is such hard work.

Then he was starting this new show, Alias. My friend said, “You’re going to love it. Just meet him.” And I did. Honestly, I went to an interview with him, and I spent an hour basically laughing at every joke he told me. I thought, “This guy’s never going to hire me.” But he said, “Okay, I’ll see you tomorrow.” That’s how it started.

What was that like?
Alias was so much fun. I didn’t work on Felicity, which was more of a straightforward drama about a college girl growing up. Alias was this crazy, complicated, action-filled show, but also a girl trying to grow up. It was all of those things. It was classic J.J. It was a challenge, and it was really fun because we all discovered it together. There were three other female editors who are amazing — Mary Jo Markey, Kristin Windell, and Virginia Katz — and there was J.J. and Ken Olin, who was a producer in residence there and director. We just found the show together, and that was really fun.

How has your collaboration with J.J. changed over time?
It’s changed in terms of the scope of a project and what we have to do. And, obviously, the level of conflict and communication is pretty easy because we’ve known each other for so long. There’s not a lot of barriers like, “Hey, I’m trying to get to know you. What do I…?” We just jump right in. Over the years, it’s changed a bit.

On The Rise of Skywalker, I cut this film with a different co-editor. Mary Jo [Markey, Brandon’s longtime co-editor] was doing something else at the time, so I ended up working with Stefan Grube. The way I had worked with Mary Jo was we would divide up the film. She’d do her thing and I’d do mine. But because these films are so massive, I prefer not to divide it up, but instead have both of us work on whatever needs working on at the time to get it done. I proposed this to J.J., and it worked out great. Everything got cut immediately and we got together periodically to ask him what he thought.

Another thing that changed was, because we needed to turn over our visual effects really quickly, I proposed that I cut on the set, on location, when they were shooting. At first J.J. was like, “We’ve never done this before.” I said, “It’s the only way I’m going to get your eyes on sequences,” because by the time the 12-hour day is over, everyone’s exhausted.

It was great and worked out well. I had this little mobile unit, and the joke was it was always within 10 feet of wherever J.J. was. It was also great because I felt like I was part of the crew, and they felt like they could talk to me. I had the DP asking me questions. I had full access to the visual effects supervisor. We worked out shots on the set. Given the fact that you could see what we already had, it really was a game-changer.

What are some of the challenges of working on films that are heavy on action, especially with the Star Wars and Star Trek films and all the effects and CGI?
There’s a scene where they arrive on Exogal, and they’re fighting with each other and more ships are arriving. All of that was in my imagination. It was me going, “Okay, that’ll be on the screen for this amount of time.” I was making up so much of it and using the performances and the story as a guide. I worked really closely with the visual effects people describing what I thought was going to happen. They would then explain that what I thought was going to happen was way too much money to do.

Luckily I was on the set, so I could work it out with J.J. as we went. Sometimes it’s better for me just to build something that I imagine and work off of that, but it’s hard. It’s like having a blank page and then knowing there’s this one element, and then figuring out what the next one will be.

There are people who are incredibly devoted to the worlds of Star Trek and Star Wars and have very strong feelings about those worlds. Does that add more pressure to the process?
I’m a big fan of Star Trek and Star Wars, as is J.J. I grew up with Star Trek, and it’s very different because Star Trek was essentially a week-to-week serial that featured an adventure, and Star Wars is this world where they’re in one major war the whole time.

Sometimes I would go off on a tangent, and J.J. and my co-editor Stefan would be like, “That’s not in the lore,” and I’d have to pull it back and remember that we do serve a fan base that is loyal to it. When I edit anything, I really try to abandon any kind of preconceived thing I have so I can discover things.

I think there’s a lot of pressure to answer to the first two movies, because this is the third, and you can’t just ignore a story that’s been set up, right? We needed to stay within the boundaries of that world. So yeah, there’s a lot of pressure to do that, for sure. One of the things that Chris Terrio and J.J., as the writers, felt very strongly about was having it be Leia’s final story. That was a labor of love for sure. All of that was like a love letter to her.

I don’t know how much of that had been decided before Carrie Fisher (Leia) died. It was my understanding that you had to reconstruct based on things she shot for the other films.
She died before this film was even written, so all of the footage you see is from Episode 7. It’s all been repurposed, and scenes were written around it. Not just for the sake of writing around the footage, but they created scenes that actually work in the context of the film. A lot of what works is due to Daisy Ridley and the other actors who were in the scenes with her. I mean, they really brought her to life and really sold it. I have to say they were incredible.

With two editors co-editing on set during production, you must have needed an extensive staff of assistant editors. How do you work with assistant editors on something of this scale?
I’ve worked with an assistant editor named Jane Tones on the last couple of films. She is amazing. She was the one who figured out how to make the mobile unit work on set. She’s incredibly gifted, both technologically and story-wise. She was instrumental in organizing everything to do with the edit and getting us around. Stefan’s assistant was Warren Paeff, and he is very experienced. We also had a sound person we carried with us and a couple of other assistants. I had another assistant, Ben Cox, who was such a Star Wars fan. When I said, “I’m happy to hire you, but I only have a second assistant position.” He was like, “I’ll take it!”

What advice do you have for someone starting out or who would like to build the kind of career you’ve made?
I would say, try to get a PA job or a job in the cutting room where you really enjoy the people, and pay attention. If you have ideas, don’t be shy but figure out how to express your ideas. I think people in the cutting room are always looking for anyone with an opinion or reaction because you need to step back from it. It’s a love of film, a love of storytelling and a lot of luck. I work really hard, but I also had a lot of good fortune meeting the people I did.


Amy Leland is a film director and editor. Her short film, Echoes, is now available on Amazon Video. She also has a feature documentary in post, a feature screenplay in development, and a new doc in pre-production. She is an editor for CBS Sports Network and recently edited the feature “Sundown.” You can follow Amy on social media on Twitter at @amy-leland and Instagram at @la_directora.

CVLT adds Joe Simons as lead editor

Bi-coastal production studio CVLT, which offers full-service production and post, has added Joe Simons as lead editor. He will be tasked with growing CVLT's editorial department. He edits on Adobe Premiere and will be based in the New York studio.

Simons joins CVLT after three years at The Mill, where he edited the "It's What Connects Us" campaign for HBO, the "Top Artist of the Year" campaign for Spotify and several major campaigns for Ralph Lauren, among many others. Prior to The Mill, he launched his career at PS260 before spending four years at editing house Cut+Run.

Simons’ addition comes at a time when CVLT is growing into a full concept-to-completion creative studio, launching campaigns for top luxury and fashion brands, including Lexus, Peloton and Louis Vuitton.

“Having soaked up everything I could at The Mill and Cut+Run, it was time for me to take that learning and carve my own path,” says Simons.

Maxon and Red Giant to merge

Maxon, developers of pro 3D software solutions, and Red Giant, makers of tools for editors, VFX artists, and motion designers, have agreed to merge under the media and entertainment division of Nemetschek Group. The transaction is expected to close in January 2020, subject to regulatory approval and customary closing conditions.

Maxon, best known for its 3D product Cinema 4D, was formed in 1986 to provide high-end yet accessible 3D software solutions. Artists across the globe rely on Maxon products to create high-end visuals. In April of this year, Maxon acquired Redshift, developer of the GPU-accelerated Redshift render engine.

Since 2002, Red Giant has built its brand through products such as Trapcode, Magic Bullet, Universe, PluralEyes and its line of visual effects software. Its tools are used in the fields of film, broadcast and advertising.

The two companies provide tools for companies including ABC, CBS, NBC, HBO, BBC, Sky, Fox Networks, Turner Broadcasting, NFL Network, WWE, Viacom, Netflix, ITV Creative, Discovery Channel, MPC, Digital Domain, VDO, Sony, Universal, The Walt Disney Company, Blizzard Entertainment, BMW, Facebook, Apple, Google, Vitra, Nike and many more.

Main Photo: L-R: Maxon CEO Dave McGavran and Red Giant CEO Chad Bechert

Behind the title: Cutters editor Steve Bell

“I’ve always done a fair amount of animation design, music rearranging and other things that aren’t strictly editing, but most editors are expected to play a role in aspects of the post process that aren’t strictly editing.”

Name: Steve Bell

What’s your job title?
Editor

Company: Cutters Editorial

Can you describe your company?
Cutters is part of a global group of companies offering offline editing, audio engineering, VFX and picture finishing, production and design – all of which fall under Cutters Studios. Here in New York, we do traditional broadcast TV advertising and online content, as well as longer format work and social media content for brands, directors and various organizations that hire us to develop a concept, shoot and direct.

Cutters New York

What’s your job title?
Editor

What’s your favorite part of the job?
There’s a stage to pretty much every project where I feel I’ve gotten a good enough grasp of the material that I can connect the storytelling dots and see it come to life. I like problem solving and love the feeling you get when you know you’ve “figured it out.”

Depending on the scale of the project, it can start a few hours in, a few days in or a few weeks in, but once it hits you can’t stop until you see the piece finished. It’s like reading a good page-turner; you can’t put it down. That’s the part of the creative process I love and what I like most about my job.

What’s your least favorite?
It’s those times when it becomes clear that I’ve/we’ve probably looked at something too many times to actually make it better. That certainly doesn’t happen on many jobs, but when it does, it’s probably because too many voices have had a say; too many cooks in the kitchen, as they say.

What is your most productive time of the day?
Early in the morning. I’m most clearheaded at the very beginning of the day, and then sometimes toward the very end of a long day. But those times also happen to be when I’m most likely to be alone with what I’m working on and free from other distractions.

If you didn’t have this job, what would you be doing instead? 
Baseball player? Astronaut? Joking. But let’s face it, we all fantasize about fulfilling the childhood dreams that are completely different from what we do. To be truthful I’m sure I’d be doing some kind of writing, because it was my desire to be a writer, particularly of film, that indirectly led me to be an editor.

Why did you choose this profession? How early on did you know this would be your path?
Well the simple answer is probably that I had opportunities to edit professionally at a relatively young age, which forced me to get better at editing way before I had a chance to get better at writing. If I keep editing I may never know if I can write!

Stella Artois

Can you name some recent projects you have worked on?
The Dwyane Wade Budweiser retirement film, Stella Artois holiday spots, a few films for the Schott/Hamilton watch collaboration. We did some fun work for Rihanna’s Savage X Fenty release. Early in the year I did a bunch of lovely spots for Hallmark Hall of Fame programming.

Do you put on a different hat when cutting for a specific genre?
For sure. There are overlapping tasks, but I do believe it takes a different set of skills to do good dramatic storytelling than it takes to do straight comedy, or doc or beauty. Good “Storytelling” (with a capital ‘S’) is helpful in all of it — I’d probably say crucial. But it comes down to the important element that’s used to create the story: emotion, humor, rhythm, etc. And then you need to know when it needs to be raw versus formal, broad versus subtle and so forth. Different hats are needed to get that exactly right.

What is the project that you are most proud of and why?
I’m still proud of the NHL’s No Words spot I worked on with Cliff Skeete and Bruce Jacobson. We’ve become close friends as we’ve collaborated on a lot of work since then for the NHL and others. I love how effective that spot is, and I’m proud that it continues to be referenced in certain circles.

NHL No Words

In a very different vein, I think I’m equally proud of the work I’ve done for the UN General Assembly meetings, especially the film that accompanied Kathy Jetnil-Kijiner’s spoken word performance of her poem “Dear Matafele Peinem” during the opening ceremonies of the UN’s first Climate Change conference. That’s an issue that’s very important to me and I’m grateful for the chance to do something that had an impact on those who saw it.

What do you use to edit?
I’m a Media Composer editor, and it probably goes back to the days when I did freelance work for Avid and had to learn it inside out. The interface at least is second nature to me. Also, the media sharing and networking capabilities of Avid make it indispensable. That said, I appreciate that Premiere has some clear advantages in other ways. If I had to start over I’m not sure I wouldn’t start with Premiere.

What is your favorite plugin?
I use a lot of Boris FX plugins for stabilization, color correction and so forth. I used to use After Effects often, and Boris FX offers a way of achieving some of what I once did exclusively in After Effects.

Are you often asked to do more than edit? If so, what else are you asked to do?
I’ve always done a fair amount of animation design, music rearranging and other things that aren’t strictly editing, but most editors are expected to play a role in aspects of the post process that aren’t strictly “film editing.”

Many of my clients know that I have strong opinions about those things, so I do get asked to participate in music and animation quite often. I’m also sometimes asked to help with the write-ups of what we’ve done in the edit because I like talking about the process and clarifying what I’ve done. If you can explain what you’ve done, you’re probably that much more confident about the reasons you did it. It can be a good way to call “bullshit” on yourself.

This is a high stress job with deadlines and client expectations. What do you do to de-stress from it all?
Yeah, right?! It can be stressful, especially when you’re occasionally lucky enough to be busy with multiple projects all at once. I take decompressing very seriously. When I can, I spend a lot of time outdoors — hiking, biking, you name it — not just for the cardio and exercise, which is important enough, but also because it’s important to give your eyes a chance to look off into the distance. There are tremendous physical and psychological benefits to looking to the horizon.

Review: The Sensel Morph hardware interface

By Brady Betzel

As an online editor and colorist, I have tried a lot of hardware interfaces designed for apps like Adobe Premiere, Avid Media Composer, Blackmagic DaVinci Resolve and others. With the exception of professional color correction surfaces like the FilmLight Baselight, the Resolve Advanced Panel and Tangent’s Element color correction panels, it’s hard to get exactly what I need.

While they typically work well, there is always a drawback for my workflow; usually they are missing one key shortcut or feature. Enter Sensel Morph, a self-proclaimed morphable hardware interface. In reality, it is a pressure-sensitive trackpad that uses individually purchasable magnetic rubber overlays and keys for a variety of creative applications. It can also be used as a pressure-sensitive trackpad without any overlays.

For example, inside of the Sensel app you can identify the Morph as a trackpad and click “Send Map to Morph,” and it will turn itself into a large trackpad. If you are a digital painter, you can turn the Morph into “Paintbrush Area” and use a brush and/or your fingers to paint! Once you understand how to enable the different mappings you can quickly and easily Morph between settings.

For this review, I am going to focus on how you can use the Sensel Morph with Adobe Premiere Pro. For the record, you can actually use it with any NLE by creating your own map inside of the Sensel app. The Morph essentially works with keyboard shortcuts for NLEs. With that in mind, if you customize your keyboard shortcuts you are going to want to enable the default mapping inside of Premiere or adjust your settings to match the Sensel Morph’s settings.
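Since the Morph is ultimately sending keyboard shortcuts, the sync problem the paragraph above describes — a rebound key in Premiere no longer matching the Morph’s map — can be pictured as a simple lookup with user overrides. This is only an illustrative sketch; the pad names and bindings below are hypothetical and are not Sensel’s actual map format (though q, w, spacebar and Shift+3 are Premiere defaults for ripple trims, play/stop and the Timeline panel).

```python
# Hypothetical sketch of how an overlay-style controller resolves a
# pad press into an NLE keystroke. Pad names and map structure are
# invented for illustration; they are not the Sensel app's format.

DEFAULT_PREMIERE_MAP = {
    "play_stop": "space",       # Premiere default: play/stop toggle
    "ripple_left": "q",         # Ripple Trim Previous Edit to Playhead
    "ripple_right": "w",        # Ripple Trim Next Edit to Playhead
    "timeline": "shift+3",      # focus the Timeline panel
}

def resolve_shortcut(pad_name, custom_map=None):
    """Return the keystroke a pad press should send.

    A user-customized map overrides the defaults, which mirrors why a
    rebound Premiere shortcut must also be updated on the Morph side.
    """
    mapping = dict(DEFAULT_PREMIERE_MAP)
    if custom_map:
        mapping.update(custom_map)
    return mapping.get(pad_name)

# A user who rebinds Ripple Trim in Premiere must mirror it here,
# or the pad keeps sending the stale default.
print(resolve_shortcut("ripple_left", {"ripple_left": "alt+q"}))
print(resolve_shortcut("play_stop"))
```

The point of the sketch is the one-way dependency: the controller only emits keystrokes, so it has no way to know what Premiere expects unless the two maps are kept in sync by hand.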

Before you plug in your Morph, you will need to click over to https://sensel.com/pages/support, where you can get a quick-start guide in addition to the Sensel app you will need to install before you get working. After it’s downloaded and installed, you will want to plug in the Morph via USB and let it charge before using the Bluetooth connection. It took a while for the Morph to fully charge, about two hours, but once I installed the Sensel app, added the Video Editing overlay and opened Adobe Premiere, I was up and working.

To be honest, I was a little dubious about the Sensel Morph. A lot of these hardware interfaces have come across my desk, and they usually have poor software implementation, or the hardware just doesn’t hold up. But the Sensel Morph broke through my preconceived ideas of hardware controllers for NLEs like Premiere, and for the first time in a long time, I was inspired to use Premiere more often.

It’s no secret that I learned professional editing in Avid Media Composer and Symphony, and most NLEs can’t quite match the professional experience I’ve had in Symphony. One example is how well and fluidly the keyboard and a Wacom tablet work together. The first time I plugged in the Sensel Morph, placed the Video Editing overlay on top of it and opened Premiere, I began to have that same feeling, but inside of Premiere!

While there are still things Premiere has issues with, the Sensel Morph really got me feeling good about how well this Adobe NLE worked. To be honest, some of those issues come down to me not having learned Premiere’s keyboard shortcuts the way I did Avid’s. The Sensel Morph felt like a natural addition to my Premiere editing workflow. It was the first time I started to feel that “flow state” inside of Premiere that I previously got into when using Media Composer or Symphony, and I started trimming and editing like a madman. It was kind of shocking to me.

You may be thinking that I am blowing this out of proportion, and maybe I am, a little, but the Morph immediately improved my lazy Premiere editing. In fact, I told someone that Adobe should bundle these for first-time Premiere users.

I really like the way the timeline navigation works (much like the touch bar). I also like the quick Ripple Left/Right commands, and I like how you can quickly switch timelines by pressing the “Timeline” button multiple times to cycle through them. I did feel like I needed a mouse some of the time and a keyboard at other times, but for about 60% of the time I could edit without them. Much like when I had to force myself to use a Wacom tablet for editing, if you try not to use a mouse, I think you will get by just fine. I did try to use a Wacom stylus with the Sensel Morph and, unfortunately, it did not work.

What improvements could the Sensel Morph make? Specifically in Premiere, I wish they had a full-screen shortcut (“`”) labeled on the Morph. It’s one of those shortcuts I use all the time, whether I want to see my timeline full screen, the effects controls full screen or the Program feed full screen. And while I know I could program it using the Sensel app, the OCD in me wants to see that reflected on the keys. While we are on the subject of keys and overlays, I do find the Morph a little hard to use when I customize the key presses. Maybe ordering a custom printed overlay could address this concern.

One thing I found odd was the GPU usage that the Sensel app needed. My laptop’s fans were kicking on, so I opened up Task Manager and saw that the Sensel app was taking 30% of my Nvidia RTX 2080. Luckily, you really only need it open when changing overlays or turning it into a trackpad, but I found myself leaving it open by accident, which could really hurt performance.

Summing Up
In the end, is the Sensel Morph really worth the $249? The purchase price includes one free overlay of your choice and a one-year warranty, but additional overlays will set you back $35 to $59 each, depending on the overlay.

The Video Editing overlay is $35, while the new Buchla Thunder overlay is $59. From traditional Keyboard, Piano Key and Music Production overlays to a Drum Pad overlay, there are a few different options to choose from. If you are a one-person band who goes between Premiere and apps like Ableton, then it’s 100 percent worth it. If you use Premiere a lot, I still think it is worth it. Its iPad Mini-like size and weight are really nice, and when using it over Bluetooth, you feel untethered. Its sleek, thin design lets you bring this morphable hardware interface anywhere you take your laptop or tablet.

The Sensel Morph is not like any of the other hardware interfaces I have used. Not only is it extremely mobile, but it works well and is compatible with a lot of content creation apps that pros use daily. They really delivered on this one.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.