Review: Avid Media Composer Symphony 2018 v.12

By Brady Betzel

In February of 2018, we saw a seismic shift in the leadership at Avid. Chief executive officer Louis Hernandez Jr. was removed and subsequently replaced by Jeff Rosica. Once Rosica was installed, I think everyone who was worried Avid was about to be liquidated to the highest bidder breathed a sigh of temporary relief. Still unsure whether new leadership was going to right a tilting ship, I immediately wanted to see a new action plan from Avid, specifically on where Media Composer and Symphony were going.

Media Composer with Symphony

Not long afterward, I was happily reading how Avid was learning from its past transgressions and listening to its clients. I heard Avid was touring the industry, asking customers and artists what they needed. Personally, I was asking myself if Media Composer with Symphony would ever be the finishing tool that Avid DS was. I’m happy to say, it’s starting to look that way.

It appears from the outside that Rosica is indeed the breath of fresh air Avid needed. At NAB 2019, Avid teased the next iteration of Media Composer, version 2019, with an overhauled interface and improvements such as a 32-bit float color pipeline complete with ACES color management and a way to deliver IMF packages; a rebuilt engine with distributed processing; and a whole new product called Media Composer|Enterprise, all of which will really help sell this new Media Composer. But the 2019 update isn’t here quite yet, so in the meantime I took a deep dive into Media Composer 2018 v.12, which has many features editors, assistants and even colorists have been asking for: a new Avid Titler, shape-based color correction (with the Symphony option), new multicam features and more.

Titling
For an online editor like me, who uses Avid Media Composer with the Symphony option about 60% of the time, titling is always a tricky subject. Avid has gone through some rough seas trying to plug the leaky hole known as the Avid Title Tool. The classic Avid Title Tool was basic but worked. However, if you aligned something to the title-safe zones in the Title Tool interface, it might jump around once you closed that interface. Fonts wouldn’t always stay the same when working across PC and macOS platforms. The list goes on, and it is excruciatingly annoying.

Titler

Let’s take a look at some Avid history: In 2002, Avid tried to appease creators and introduced what was, at the time, a Windows-only titler: Avid Marquee. While Marquee was well-intentioned, it was extremely difficult to understand if you weren’t interested in 3D lighting, alignment and all sorts of motion graphics work that not all editors want to spend time learning. So most people didn’t use it, and if they did, it took a little while for anyone taking over the project to figure out what was done.

In December of 2014, Avid leaned on the NewBlue Titler, which would work in projects at resolutions higher than 1920×1080. Unfortunately, many editors ran into very long renders at the end, and a lot bailed on it. Most decided to go out of house and create titles in Adobe Photoshop and Adobe After Effects. While this all relates to my own experience, I assume others feel the same.

In Avid Media Composer 2018, the company has introduced the Avid Titler, which in the Tools menu is labeled Avid Titler+. It works like an effect rather than a rendered piece of media with separate fill and alpha layers, as in the traditional Avid Title Tool; in that way it is similar to how NewBlue or Marquee functioned. Avid Titler lets you type directly on the record monitor, and adding a title is as easy as marking an in and out point and clicking the T+ button on the timeline.

You can specify things like kerning, shadows, outlines, underlines, boxes, backgrounds and more. One thing I found peculiar was that under Face, the rotation settings rotate individual letters rather than the entire word by default. I reached out to Avid, and they are looking into making whole-word rotation the default in the Avid Titler’s mini toolbar. So stay tuned.

Also, you can map your fast forward and rewind buttons to “Go To Next/Previous Event.” This lets you jump to the next edit in the timeline, and also to the next or previous keyframe when in the Effect Editor. Typically, you click on the scrub line in the record window and then use those shortcuts to jump between keyframes; in the Avid Titler, though, pressing those keys just starts typing in the text box. Furthermore, when I wanted to jump out of Effect Editor mode and back into edit mode, I usually hit “y,” but that did not get me out of effects mode (Avid did mention they are working on updates to the Avid Titler that would solve this issue). The new Avid Titler definitely has some bugs and needs some improvements, and they are being addressed, but it’s a decent start toward a modern title editor.

Shape-based color correction

Color
If you want advanced color correction built into Media Composer, then you are going to want the Symphony option. Media Composer with the Symphony option allows for more detailed color correction using secondaries, as well as newer updates like shape-based color correction. Before Resolve and Baselight became more affordable, Symphony was the gold standard for color correction on a budget (and even not on a budget, since it works so well in the same timeline the editors use). But what we are really here for is the 2018 v.12 addition: shapes.

With the Symphony option, you can now draw specific regions on the footage for your color correction to affect. It works much like a layer-based system such as Adobe Photoshop. You can draw shapes with the same familiar tools you know from Paint or AniMatte, then apply your brightness, saturation or hue swings in those areas only. On the color correction page, you can access all of these tools on the right-hand side, including softening, alpha view, serial mode and more.

When using the new shape-based tools, you must point the drop-down menu to “CC Effect.” From there you can add a bunch of shapes on top of each other, and they will play in realtime. If you want to lay down a base correction, you can specify it in the shape-based sidebar, then click a shape and dial in the specific areas to your or your client’s taste. You can check the “Serial Mode” box to have all corrections interact with one another, or uncheck it to keep each correction a little more isolated, a really great option to keep in mind when correcting. Unfortunately, tracking a shape can only be done in the Effect Editor, so you need to jump out of color correction mode, track, and then go back. It’s not the end of the world, but it would be infinitely better if you could track right inside the color correction window. Avid could even take it further by allowing planar tracking via an app like Mocha Pro.

Shape-based color correction

The new shape-based corrector also has an alpha view mode, identified by the infinity symbol. I love this! I often find myself making mattes in the Paint tool, but it can now be done right in the color correction tool. The Symphony option is an amazing addition to Media Composer if you need to go further than simple color correction but don’t want to dive into a full color grading app like Baselight or Resolve. In fact, for many projects you won’t need much more than what Symphony can do. Maybe a +10 on the contrast, +5 on the brightness and +120 on the saturation and BAM, a finished masterpiece. Kind of kidding, but wait until you see it work.

Multicam
The final update I want to cover is multicam editing and improvements to editing group clips. I cannot emphasize enough how much time this would have saved me as an assistant editor back in the prehistoric Media Composer days… I mean, we had dongles, and I even dabbled in the Meridian box. Literally days of grouping and regrouping could have been avoided with the Edit Group feature. But I did make a living fixing groups that were created incorrectly, so I guess this update is a Catch-22. Anyway, you can now edit groups in Media Composer by creating a group, right-clicking on it and selecting Edit Group. The group opens in the record monitor as a sequence, where you can move, nudge and even add cameras to a previously created group. Once you are finished, you can update the group and, if you wish, refresh any sequences that use it. One caveat: with mixed-frame-rate groups, Avid says committing that sequence might produce undesirable results.

Editing workspace

Cost of Entry
How much does Media Composer cost these days? While you can still buy it outright, it seems more practical to go monthly, since you will automatically get updates, but pricing can still be a little tricky. Do you need PhraseFind and/or ScriptSync? Do you need the Symphony option? Do you need to access shared storage? There are multiple options depending on your needs. If you want everything, then Media Composer Ultimate for $49 per month is what you want. If you want Media Composer and just one add-on, like Symphony, it will cost $19 per month plus $199 per year for the Symphony option. At those rates, a year of Ultimate runs $588, while Media Composer plus Symphony totals $427 ($228 plus $199), so it pays to do the math for your situation. If you want to test the water before jumping in, you can always try the free Media Composer First.

For a good breakdown of the Media Composer pricing structure, check out KeyCode Media’s page (the company is a certified reseller). Another great link, with tons of information organized into easily digestible bites, is this one. Additionally, www.freddylinks.com is a great resource chock full of everything else Avid, written by Avid technical support specialist Fredrik Liljeblad out of Sweden.

Group editing

Summing Up
In the end, I have used Media Composer with Symphony for over 15 years, and it is the most reliable nonlinear editor I have used for supporting multiple editors in a shared network environment. While Adobe Premiere Pro, Apple Final Cut Pro X and Blackmagic Resolve are offering fancy new features and collaboration modes, Avid always seems to hold stable when I need it the most. The new improvements and UI overhaul (set to debut in May), new leadership from Rosica, and the confidence of his faithful employees all seem to be paying off, getting Avid back on the track it should have always been on.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Avid offers rebuilt engine and embraces cloud, ACES, AI, more

By Daniel Restuccio

During its Avid Connect conference just prior to NAB, Avid announced a Media Composer upgrade, support for the ACES color standard and additional upgrades to a number of its toolsets, apps and services, including Avid Nexis.

The chief news from Avid is that Media Composer, its flagship video editing system, has been significantly retooled, sporting a new user interface, a rebuilt engine and additional built-in audio, visual effects, color grading and delivery features.

In a pre-interview with postPerspective, Avid president/CEO Jeff Rosica said, “We’re really trying to leapfrog and jump ahead to where the creative tools need to go.”

Avid asked itself what it needed to do “to help production and post production really innovate.” Rosica pointed to TV shows and films, and how complex they’re getting. “That means they’re dealing with more media, more elements, and with so many more decisions just in the program itself. Let alone the fact that the (TV or film) project may have to have 20 different variants just to go out the door.”

Jeff Rosica

The new paneled user interface simplifies the workspace and adds redesigned bins for finding media faster, as well as task-based workspaces that show only what the user wants and needs to see.

Dave Colantuoni, VP of product management at Avid, said they spent the most time studying the way editors manage and organize bins and content within Media Composer. “Some of our editors use 20, 30, 40 bins at a time. We’ve really spent a lot of time so that we can provide an advantage to you in how you approach organizing your media.”

Avid is also offering more efficient workflow solutions. Without leaving Media Composer, users can work in 8K, 16K or HDR thanks to the newly built-in 32-bit full-float color pipeline. Additionally, Avid continues to work with OTT content providers to help establish future industry standards.

“We’re trying to give as much creative power to the creative people as we can, and bring them new ways to deal with things,” said Rosica. “We’re also trying to help the workflow side. We’re trying to help make sure production doesn’t have to do more with less, or sometimes more with the same budget. Cloud (computing) allows us to bring a lot of new capabilities to the products, and we’re going to be cloud powering a lot of our products… more than you’ve seen before.”

The new Media Composer engine is now native OP1A, can handle more video and audio streams, offers Live Timeline and background rendering, and a distributed processing add-on option to shorten turnaround times and speed up post production.

“This is something our competitors do pretty well,” explained Colantuoni. “And we have different instances of OP1A working among the different Avid workflows. Until now, we’ve never had it working natively inside of Media Composer. That’s super-important because a lot of capabilities started in OP1A, and we can now keep it pristine through the pipeline.”

Said Rosica, “We are also bringing the ability to do distributed rendering. An editor no longer has to render or transcode on their machine. They can perform those tasks in a distributed or centralized render farm environment. That allows this work to get done behind the scenes. This is an Avid-supplied solution, so it will be very powerful and reliable. Users will be able to do background rendering as well as distributed rendering, and move things off their machine to other centralized machines. That’s going to be very helpful for a lot of post workflows.”

Avid had previously offered three main flavors of Media Composer: Media Composer First, the free version; Media Composer; and Media Composer Ultimate. Now they are also offering a new Enterprise version.

For the first time, large production teams can customize the interface for any role in the organization, whether the user is a craft editor, assistant, logger or journalist. It also offers unparalleled security to lock down content, reducing the chances of unauthorized leaks of sensitive media. Enterprise also integrates with Editorial Management 2019.

“The new fourth tier at the top is what we are calling the Enterprise Edition, or Enterprise. That word doesn’t necessarily mean broadcast,” said Rosica. “It means for business deployment. This is for post houses and production companies, broadcast and even studios. This lets the business, or the enterprise, or the production or post house literally customize interfaces and workspaces to the job role or to the user.”

Nexis Cloudspaces
Avid also announced Avid Nexis|Cloudspaces. Instead of resorting to NAS or external drives for media storage, Avid Nexis|Cloudspaces allows editorial to offload projects and assets not currently in production. Cloudspaces extends Avid Nexis storage directly to Microsoft Azure.

“Avid Nexis|Cloudspaces brings the power of the cloud to Avid Nexis, giving organizations a cost-effective and more efficient way to extend Avid Nexis storage to the cloud for reliable backup and media parking,” said Dana Ruzicka, chief product officer/senior VP at Avid. “Working with Microsoft, we are offering all Avid Nexis users a limited-time free offer of 2TB of Microsoft Azure storage that is auto-provisioned for easy setup and as much capacity as you need, when you need it.”

ACES
The Academy Color Encoding System (ACES) team also announced that Avid is now part of the ACES Logo Program, as the first Product Partner in the new Editorial Finishing product category. ACES is a free, open, device-independent color management and image interchange system that serves as the global standard for color management, digital image interchange and archiving. Avid will be implementing ACES in conformance with the logo program’s specifications, ensuring a consistent, high-quality ACES color-managed video creation workflow.

“We’re pleased to welcome Avid to the ACES logo program,” said Andy Maltz, managing director of the ACES Council. “Avid’s participation not only benefits editors that need their editing systems to accurately manage color, but also the broader ACES end-user community through expanded adoption of ACES standards and best practices.”

What’s Next?
“We’ve already talked about how you can deploy Media Composer or other tools in a virtualized environment, or how you can use these kind of cloud environments to extend or advance production,” said Rosica. “We also see that these things are going to allow us to impact workloads. You’ll see us continue to power our MediaCentral platform, editorial management of MediaCentral, and even things like Media Composer with AI to help them get to the job faster. We can help automate functions, automate environments and use cloud technologies to allow people to collaborate better, to share better, to just power their workloads. You’re going to see a lot from us over time.”

Review: Boris FX’s Continuum and Mocha Pro 2019

By Brady Betzel

I realize I might sound like a broken record, but if you are looking for the best plugin to help with object removals or masking, you should seriously consider the Mocha Pro plugin. And if you work inside of Avid Media Composer, you should also seriously consider Boris Continuum and/or Sapphire, which can use the power of Mocha.

As an online editor, I consistently use Continuum along with Mocha for tight blur and mask tracking. If you use After Effects, there is even a whittled-down version of Mocha built in for free. For those pros who don’t want to deal with Mocha inside of an app, it also comes as a standalone software solution where you can copy and paste tracking data between apps or even export the masks, object removals or insertions as self-contained files.

The latest releases of Continuum and Mocha Pro 2019 continue the evolution of Boris FX’s role in post production image restoration, keying and general VFX plugins, at least inside of NLEs like Media Composer and Adobe Premiere.

Mocha Pro

As an online editor, I am always calling on Continuum for its great Chroma Key Studio, Flicker Fixer and blurring. Because Mocha is built into Continuum, I am able to quickly track (backwards and forwards) difficult shapes and even erase objects in ways the built-in Media Composer tools simply can’t. And if you are lucky enough to own Mocha Pro, you also get access to some amazing tools that go beyond planar tracking, such as automated object removal, object insertion, stabilizing and much more.

Boris FX’s latest updates to Continuum and Mocha Pro go even further than what I’ve already mentioned and have resulted in a new version-naming scheme; this round we are at 2019 (think of it as version 12). They have also created the new Application Manager, which makes it a little easier to find the latest downloads. You can find them here. This really helps when you’re jumping between machines and need to quickly activate and deactivate licenses.

Boris Continuum 2019
I often get offline edits with effects from a variety of plugins (lens flares, random edits, light flashes, whip transitions and many more), so I need Continuum to be compatible with what offline clients send me. I also need it for image repair and compositing.

In this latest version of Continuum, Boris FX has not only kept plugins like Primatte Studio, it has also brought back Particle Illusion and updated Mocha and Title Studio. Overall, Continuum and Mocha Pro 2019 feel a lot snappier when applying and rendering effects, probably because of the overall GPU-acceleration improvements.

Particle Illusion has been brought back from the brink of death in Continuum 2019 as a 64-bit, keyframe-able particle emitter system that can even be tracked and masked with Mocha. The revamp includes an updated interface, realtime GPU-based particle generation, an expanded and improved emitter library (complete with motion-blur-enabled particle systems) and even a standalone app that can design systems to be used in the host app (you cannot render systems inside the standalone app itself).

While Particle Illusion is part of the entire Continuum toolset that works with hosts like Blackmagic’s DaVinci Resolve, Media Composer, After Effects and Premiere, it seems to work best in applications like After Effects, which can handle composites simply and naturally. Inside the Particle Illusion interface you can find all of the prebuilt emitters. If you only have a handful, make sure you download additional emitters, which you can find in the Boris FX App Manager.

Particle Illusion: Before and After

I had a hard time seeing my footage from a Media Composer timeline inside of Particle Illusion, but I could still pick my emitter, change specs like life and opacity, exit out and apply the effect to my footage. I used Mocha to track some fire from Particle Illusion to a dumpster I had filmed. Once I dialed in the emitter, I launched Mocha and tracked the dumpster.

The first time I went into Mocha, I didn’t see the preset tracks for the emitter or the world in which the emitter lives; the second time I launched it, I saw track points. From there you can track the area where you want your emitter placed. Once you are done and happy with your track, jump back to your timeline, where it should be reflected. In Media Composer, I noticed I had to go into the Mocha options and change the setting from Mocha Shape to no shape; essentially, the Mocha shape acts like a matte and cuts off anything outside of it.

If you are inside of After Effects, most parameters can now be keyframed and parented (aka pick-whipped) natively in the timeline. The Particle Illusion plugin is a quick, easy and good-looking tool to add sparks, Milky Way-like star trails or even fireworks to any scene. Check out @SurfacedStudio’s tutorial on Particle Illusion to get a good sense of how it works in Adobe Premiere Pro.

Continuum Title Studio
When inside of Media Composer (prior to the latest release, 2018.12), there were very few ways to create titles at resolutions higher than HD (1920×1080); the NewBlue Titler was the only option if you wanted to stay within Media Composer.

Title Studio within Media Composer

At first, the Continuum Title Studio interface appeared to be a mildly updated Boris Red interface, and I am allergic to the Boris Red interface. Some of the icons for keyframing, and the way properties are adjusted, look similar and threw me off. I tried really hard to jump into Title Studio and love it, but I never got comfortable with it.

On the flip side, there are hundreds of presets that can help build quick titles, and they render a lot faster than the NewBlue Titler did. In some of the presets, I noticed the text was placed outside of 16×9 title safety, which is odd since that is a long-standing rule in television. In the authors’ defense, the titles are within action safety, but still.

If you need a quick way to make 4K titles, Title Studio might be what you want. The updated Title Studio includes realtime playback using the GPU instead of the CPU, new materials, new shaders and external monitoring support using Blackmagic hardware (AJA support will be coming at some point). There are some great presets, including prebuilt slates, lower thirds, kinetic text and even progress bars.

If you don’t have Mocha Pro, Continuum can still access and use Mocha to track shapes and masks. Almost every plugin can access Mocha and can track objects quickly and easily.

That brings me to the newly updated Mocha, which has some extremely helpful new features, including a Magnetic Spline tool, prebuilt geometric shapes and more.

Mocha Pro 2019
If you loved the previous version of Mocha, you are really going to love Mocha Pro 2019. Not only do you get the Magnetic Spline tool, prebuilt geometric shapes, the Essentials interface and high-resolution display support, but Boris FX has also rewritten the Remove Module code to use GPU video hardware, which increases render speeds about four to five times. In addition, there is no longer a separate Mocha VR software suite; all of the VR tools are included inside Mocha Pro 2019.

If you are unfamiliar with what Mocha is, then I have a treat for you. Mocha is a standalone planar tracking app as well as a native plugin that works with Media Composer, Premiere and After Effects, or through OFX in Blackmagic’s Fusion, Foundry’s Nuke, Vegas Pro and Hitfilm.

Mocha tracking

In addition (and unofficially) it will work with Blackmagic DaVinci Resolve by way of importing the Mocha masks through Fusion. While I prefer to use After Effects for my work, importing Mocha masks is relatively painless. You can watch colorist Dan Harvey run through the process of importing Mocha masks to Resolve through Fusion, here.

But really, Mocha is a planar tracker, which means it tracks an entire user-defined region (ideally a flat plane) rather than individual points, so it works best on flat or at least segmented surfaces; think of tracking the side of a face, an ear, a nose, a mouth and a forehead separately instead of all at once. From blurs to mattes, Mocha sticks to objects like glue and can be a great asset for an online editor or colorist.
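
To make that concrete, below is a minimal sketch of the general idea behind planar tracking, assuming OpenCV and NumPy are available: follow features inside a user-drawn region with optical flow, fit one homography to their collective motion and carry the drawn shape along with it. This is a generic illustration of the technique, not Mocha’s actual algorithm, and prev_frame, next_frame and shape_pts are hypothetical inputs (grayscale frames and an Nx2 array of spline points).

```python
# A rough sketch of the core idea (not Mocha's actual algorithm):
# follow corners inside a drawn region, then fit one homography that
# explains how the whole (roughly flat) surface moved between frames.
import cv2
import numpy as np

def track_plane(prev_frame, next_frame, shape_pts):
    """prev_frame/next_frame: grayscale frames; shape_pts: Nx2 spline points."""
    # Limit feature detection to the user-drawn region.
    mask = np.zeros(prev_frame.shape, dtype=np.uint8)
    cv2.fillPoly(mask, [shape_pts.astype(np.int32)], 255)
    corners = cv2.goodFeaturesToTrack(prev_frame, maxCorners=200,
                                      qualityLevel=0.01, minDistance=7,
                                      mask=mask)

    # Follow those corners into the next frame with optical flow.
    moved, status, _ = cv2.calcOpticalFlowPyrLK(prev_frame, next_frame,
                                                corners, None)
    good_old = corners[status.flatten() == 1]
    good_new = moved[status.flatten() == 1]

    # One homography describes the motion of a flat surface; RANSAC
    # discards stray points that don't belong to the plane.
    H, _ = cv2.findHomography(good_old, good_new, cv2.RANSAC, 3.0)

    # Warp the drawn shape's control points with that same homography.
    pts = shape_pts.reshape(-1, 1, 2).astype(np.float32)
    return H, cv2.perspectiveTransform(pts, H).reshape(-1, 2)
```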

If you have read any of my plugin reviews you probably are sick of me spouting off about Mocha, saying how it is probably the best plugin ever made. But really, it is amazing — especially when incorporated with plugins like Continuum and Sapphire. Also, thanks to the latest Media Composer with Symphony option you can incorporate the new Color Correction shapes with Mocha Pro to increase the effectiveness of your secondary color corrections.

Mocha Pro Remove module

So how fast is Mocha Pro 2019’s Remove Module these days? It used to be a very slow process, taking lots of time to calculate an object’s removal. With the latest release, including improved GPU support, the render time has been cut down tremendously; in my estimation, I would say three to four times the speed (on the safe side). Removal jobs that take under 30 seconds in Mocha Pro 2019 would have taken four to five minutes in previous versions. It’s quite a big improvement in render times.

There are a few changes in the new Mocha Pro, including interface changes and some amazing tool additions. A new drop-down tab offers different workflow views once you are inside of Mocha: Essentials, Classic, Big Picture and Roto. I really wish the Essentials view had existed when I first started using Mocha, because it gives you the basic tools you need to get a roto job done and nothing more.

For instance, just having access to the track-motion parameters (Translation, Scale, Rotate, Skew and Perspective) as big shiny buttons eliminates my need to watch YouTube videos on how to navigate the Mocha interface. However, if, like me, you are more than just a beginner, the Classic interface is still available, and it’s the one I reach for most often; it’s literally the old interface. Big Picture hides the tools and gives you the most screen real estate for your roto work. My favorite after Classic is Roto, which shows just the project window and the classic top toolbar. It’s the best of both worlds.

Mocha Pro 2019 Essentials Interface

Beyond the interface changes are some additional tools that will speed up any roto work. This has been one of the longest-running user requests; I imagine the feature Boris FX gets asked about most for Mocha is the addition of basic shapes, such as rectangles and circles. In my work, I am often drawing rectangles around license plates or circles around faces with X-splines, so why not eliminate a few clicks and have that done already? Answering my need, Mocha now has elliptical and rectangular shapes ready to go, in both X-splines and B-splines, with one click.

I use Continuum and Mocha hand in hand. Inside of Media Composer I will use tools like Gaussian Blur or Remover, which typically need tracking and roto shapes created. Once I apply the Continuum effect, I launch Mocha from the Effect Editor and bam, I am inside Mocha. From here I track the objects I want to affect, as well as any objects I don’t want to affect (think of it like an erase track).

Summing Up
I can save tons of time and also improve the effectiveness of my work exponentially when working in Continuum 2019 and Mocha Pro 2019. It’s amazing how much more intuitive Mocha is to track with instead of the built-in Media Composer and Symphony trackers.

In the end, I can’t say enough great things about Continuum and especially Mocha Pro. Mocha saves me tons of time in my VFX and image restoration work. From removing camera people behind the main cast in the wilderness to blurring faces and license plates, using Mocha in tandem with Continuum is a match made in post production heaven.

Rendering in Continuum and Mocha Pro 2019 is a lot faster than in previous versions, really giving me a leg up on efficiency. Time is money, right? On top of that, using Mocha Pro’s almost-magic Remove Module takes my image restoration work to the next level, separating me from other online editors who use standard paint and tracking tools.

In Continuum, Primatte Studio gives me a leg up on greenscreen keys with its exceptional ability to auto-analyze a scene and perform 80% of the keying work before I dial in the details. Whenever anyone asks me what tools I couldn’t live without, I always, without a doubt, say Mocha.

If you want a real Mocha Pro education, you need to watch all of Mary Poplin’s tutorials. You can find them on YouTube. Check out this one on how to track and replace a logo using Mocha Pro 2019 in Adobe After Effects. You can also find great videos at Borisfx.com.

Mocha point parameter tracking

I always feel like there are tons of tools inside of the Mocha Pro toolset that go unused simply because I don’t know about them. One I recently learned about in a Surfaced Studio tutorial is the Quick Stabilize function. It essentially stabilizes the video around the object you are tracking, allowing you to rotoscope your object more easily while it sits still instead of moving all over the screen. It’s an amazing feature that I just didn’t know about.

As I was finishing up this review I saw that Boris FX came out with a training series, which I will be checking out. One thing I always wanted was a top-down set of tutorials like the ones on Mocha’s YouTube page but organized and sent along with practical footage to practice with.

You can check out Curious Turtle’s “More Than The Essentials: Mocha in After Effects” on their website, where I found more Mocha training. There is even a great Getting Started search category on BorisFX.com. Definitely check them out. You can never learn enough Mocha!


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Editor Wyatt Smith talks Mary Poppins Returns, Marvel Universe

By Amy Leland

Wyatt Smith’s career as an editor is the kind that makes for a great story. His unintended path began with an unusual opportunity to work with Mariah Carey and a chance meeting with director Rob Marshall. He has since collaborated on big musicals and action films with Marshall, which opened the door to superhero movies. His latest project — in which he was reunited with Marshall — saw him editing a big musical with a title character who is, in her own Disney way, also a superhero.

Smith’s resume is impressive: Doctor Strange, Into the Woods, 300: Rise of an Empire, Thor: The Dark World, Pirates of the Caribbean: On Stranger Tides. When I had a chance to talk with him about Mary Poppins Returns, I first had to ask him how his fascinating journey began.

Wyatt Smith at the Mary Poppins Returns premiere.

Can you talk about what led you to editing?
Some things just happen unexpectedly. Opportunities arise and you just have to hear the knock and not be afraid to open the door. When they were building the now-closed Sony Music Studios in New York City, I knew a lot about computers. Avid was first coming in, and there were all these video engineers who weren’t as savvy with Macs and things like that because they were used to linear, old-school tape editing. I worked in the maintenance department at the studio, servicing online editing suites, as well as setting up their first Avid Media Composer and giving people some tutorials on how to use that.

Then a very odd circumstance came up — they were working on a Mariah Carey concert video and needed an additional editor to work at her house at night (she was working during the day with another editor). My father is in the music business and had ties to Mariah — we had met before — so they thought it would be a comfortable situation. It came out of nowhere, and while I certainly knew, technically, how to edit, creatively I had no idea.

That was my first opportunity to edit, and I never went back to anything else. That was the day. That was it. I started to edit music videos and concerts and little music documentaries. Years and years later that led me to work with Rob Marshall on a music project.

The Tony Bennett American Classic special?
Exactly. I had known the Bennett family and worked with them since Tony Bennett’s “Unplugged.” When Rob was brought on to direct an NBC special celebrating Tony’s career, he wanted to bring his whole film team with him, but the TV network and the Bennett family wanted somebody who knew the music world, and that style of deadline, which is quite different from film.

I was brought in to interview with Rob, and we had a wonderful experience making that show. When it was done, he said, “Next time I make a film, I want you to come along.” To be completely honest, I didn’t believe him. I thought it was very kind of him, and he is a very nice man, but I was like, yeah, sure. In 2008, I think it was the Friday before they started shooting Nine, he called and said, “You gotta get to London.” I immediately quit my job and got on a plane.

I’m guessing the music world was a heavy influence on you, but were you drawn toward movies as well?
I have always been a movie junkie. At an early age, I saw a lot of the big epics, including David Lean’s films — Lawrence of Arabia, A Passage to India — which just transported me to another place and another culture. I loved that.

That was back in the early VHS days, and I had just about every Bond film that had been released. I watched them obsessively. In high school, my closest friend worked in a video rental store, so we constantly had movies. It was always a huge thing for me, but never in my life did I dream of pursuing it. The language of film was never anything I studied or thought about until I was kind of thrust into it.

What was it like coming into this film with Rob Marshall, after so many years of working with him? Do your collaborations now feel different from when you first started working together?
The most important part is trust. When I first met Rob, aside from just not having any confidence, I didn’t remotely know what I was doing. We all know that when you have your actors and your sets, if something’s not quite right, that’s the time to bring it up. But 12 years ago, the thought of me going to Rob and saying, “I don’t know if that really works, maybe you should grab a shot like…” I’d never, ever. But over the years we’ve developed that trust. I’m still very cautious with things like that, but I now know I can talk to him. And if he has a question, he’ll call me to set and say, “Quickly put this together,” or, “Stay here and watch this with me,” and he’ll explain to me exactly what he’s going for.

Then, once we reach post, unquestionably that relationship changes. We used to cut everything from scratch and start re-watching all the material and rebuilding the film again. Now we can work through existing cuts because I kind of know his intentions. It’s easier for me to see in the scene work what he’s going for, and that only comes from collaborating. Now I’m able to get the movie that’s in his head on screen a lot faster.

Mary Poppins Returns

You were working with complex animations and effects, and also combining those with elaborate choreography and live action. Was there more preplanning for this than you might normally have done?
I wasn’t really involved in the preplanning. I came in about a month before shooting, mostly to catch up with the schedules of the second unit, because I’m always going to work closely with them. I also went through all the storyboards, worked with visual effects and caught up on their look development. We did have a previz team, but we only really needed to previz two of the sequences in the film: the underwater bath time and the balloon sequence.

While previz gives you methodology, shot count, rough lenses and things, it’s missing the real emotion of the story, because it looks like a video game and is often cut like a music video. This is no disrespect to previz editors (they’re very good), but I always want to come in and do a pass before we start shooting because I find the timings are very different.

Doctor Strange

Take a film like Marvel’s Doctor Strange. So much of it had been prevized to figure out how to do it. When I came into the Doctor Strange previz cuts early on, they were exciting, psychedelic, wild and really imaginative, but I was losing the actors. I found that something running at four minutes wasn’t representing any of the dialogue or the emotional content of the actors. So I asked them to give me stills of close-ups to cut in. After putting in the dialogue, that four-minute sequence becomes seven minutes, and you realize it’s too long. Before we go shoot it, how do we make it something more manageable for the ultimate film?

Were you on set during most of the filming?
There were days when Rob would pull me onto set, and then days or weeks when I wouldn’t even see him. I did the traditional assembly process. Even on the film I’m cutting right now, which has a very short schedule, I had a cut of the film four days after they were done shooting. It’s the only way for me to know that it’s working. It’s not a great cut, but I know that the movie’s all there. And, most importantly, I need to know, barring the last day of shooting, that I’ve seen every single frame of every take before they wrap. I need the confidence of knowing where it’s all going. I don’t want to discover any of that with a director in post.

On a project this complex, I imagine you must work with multiple assistants?
When I worked on the second Thor movie, The Dark World, I had a friend who was my first assistant, Meagan Costello. She has worked on many Marvel films. When Doctor Strange came up (I think it was almost a year before shooting that I got the call from the director saying I was in), within five seconds I called Meagan because of her experience, her personality and her incredible skill set. Toward the end of Doctor Strange, when the schedule for Poppins was starting to lock in, she said, “I’ve always wanted to live in New York, and I’ve always wanted to work in a music hall.” I said, “We can make that happen.”

Thor: The Dark World

She is great at running the cutting room and taking care of all of my little, and many, prima donna bugaboos: how things are set up and working technically, cutting in surround, having the right types of monitors, etc. What’s also important is having someone spiritually and emotionally connected to the film… someone I can talk to and trust.

We had second assistant editors on Mary Poppins once we were in post, both in the US and in London. It’s always interesting when you have two different teams. I try to keep as much consistency as I can, so we had Meagan all the way through London and New York. For second assistants in London, we had Gemma Bourne, Ben Renton and Tom Lane. Here in the States, we had Alexander Johnson and Christa Haley. Christa is my first assistant on the film I’m currently doing for Focus Features, called Harriet.

On huge films like these, so much of the assistant editors’ time is spent dealing with the vast deliveries for the studio, the needs of a huge sound and music team, as well as a lot of visual effects. In the end, we had about 1,300 visual effects shots. That means a lot of turnovers, screenings and quality control, so that nothing is ever coming in or going out without being meticulously watched and listened to.

The first assistant runs the cutting room and the stuff I shouldn’t be thinking about. It’s not stuff I would do well either. I want to be solely focusing on the edit, and when I’m lost in the movie, that’s the greatest thing. Having a strong editorial team allows me to be in a place where I’m not thinking about anything but the cut.

Mary Poppins Returns

That’s always good to hear. Most editors I talk to also care about making sure their assistants are getting opportunities.
When I started out, I had assistants in the room with me. It was very much film-style — the assistant was in the room helping me out with the director and the producers every day. If I had to run out of the room, the assistant could step in.

Unfortunately, the way the world has evolved, with digital post, the assistant editor and editor positions have diverged massively. The skill sets are very different. I don’t think I could do a first assistant editor’s job, but I know they could do my job. Also, the extra level of material keeps them very busy, so they’re not with me in the room. That makes for a much harder path, and that bothers me. I don’t quite know how to fix that yet, but I want to.

This industry started with apprentices, and it was very guild-like. Assistants were very hands on with the editor, so it was very natural to become an editor. Right now, that jump is a little tricky, and I wish I knew how to fix it.

Even if the assistants cut something together for you, it doesn’t necessarily evolve into them getting to work with a director or producer. With Poppins, there’s certainly a scene or two in the film that I asked Meagan to put together for that purpose. Rob works very closely in the cutting room each day, along with John DeLuca, our producer and choreographer. I was wondering if there would be that moment when maybe they’d split off, like, “Oh, go with Meagan and work on this, while I work on this with Rob.” But those opportunities never really arose. It’s hard to figure out how to get that door open.

Do you have any advice for editors who are just starting out?
I love the material I’m working on, and that’s the most important part. Even if something’s not for you, your job is not to make it what you want it to be. The job is to figure out who the audience is and how you make it great for them. There’s an audience for everything, you just have to tap into who that audience is.


Amy Leland is a film director and editor. Her short film, “Echoes”, is now available on Amazon Video. She also has a feature documentary in post, a feature screenplay in development, and a new doc in pre-production. She is an editor for CBS Sports Network and recently edited the feature “Sundown.” You can follow Amy on social media on Twitter at @amy-leland and Instagram at @la_directora.

Review: Picture Instruments’ plugin and app, Color Cone 2

By Brady Betzel

There are a lot of different ways to color correct an image. Typically, colorists will start by adjusting contrast and saturation, followed by the lift, gamma and gain (a.k.a. shadows, midtones and highlights). For video, waveforms and vectorscopes are great ways of measuring color values, and they are about the only way to get accurate, scientific facts about the colors you are manipulating.
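
For anyone curious what those controls actually do to a pixel, here is a minimal sketch in the spirit of the ASC CDL transform, where slope, offset and power roughly correspond to gain, lift and gamma, with a simple saturation term added. It is a generic illustration, not the internal math of any particular NLE or of Color Cone 2.

```python
import numpy as np

def grade(rgb, slope=1.0, offset=0.0, power=1.0, saturation=1.0):
    """ASC CDL-style correction for RGB values in the 0..1 range."""
    rgb = np.asarray(rgb, dtype=float)
    # Slope scales the signal (gain), offset shifts it (lift) and
    # power applies a gamma-like curve.
    out = np.clip(rgb * slope + offset, 0.0, 1.0) ** power
    # Saturation pushes channels away from (or toward) the pixel's luma.
    weights = np.array([0.2126, 0.7152, 0.0722])  # Rec. 709 luma weights
    luma = np.sum(out * weights, axis=-1, keepdims=True)
    return np.clip(luma + saturation * (out - luma), 0.0, 1.0)

# A warm mix, lifted slightly, with a gentle gamma and saturation boost:
print(grade([0.4, 0.3, 0.2], slope=1.1, offset=0.02, power=0.9, saturation=1.2))
```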

Whether you are in Blackmagic Resolve, Avid Media Composer, Adobe Premiere Pro, Apple FCP X or any other nonlinear editor or color correction app, you usually have similar color correction tools across apps — whether you color based on curves, wheels, sliders or even interactively on screen. So when I heard about the way that Picture Instruments Color Cone 2 color corrects — via a Cone (or really a bicone) — I was immediately intrigued.

Color Cone 2 is a standalone app but also, more importantly, a plugin for Adobe After Effects, Adobe Premiere Pro and FCP X. In this review I am focusing on the Premiere Pro plugin, but keep in mind that the standalone version works on still images and allows you to export 3dl or cube LUTs, a great way for a client to quickly see what type of result you can get from just a still image.

Color Cone 2 is literally a color corrector when used as a plugin for Adobe Premiere. There are no contrast and saturation adjustments, just the ability to select a color and transform it. For instance, you can select a blue sky and adjust the hue, chrominance (saturation) and/or luminance of the resulting color inside of the Color Cone plugin.

To get started, you apply the Color Cone 2 plugin to your clip (the plugin is located under Picture Instruments in the Effects tab), then click the little square icon in the effect editor panel to open the Color Cone 2 interface. The interface contains the bicone representation of the color correction, presets to set up a split-tone color map or a three-point color correction, and a radius slider to adjust how much your correction affects surrounding colors.

Once you are set on a look you can jump out of the Color Cone interface and back into the effect editor inside of Premiere. There you can keyframe all of the parameters you adjusted in the Color Cone interface. This allows for a nice and easy way to transition from no color correction to color correction.

The Cone
The Cone itself is the most interesting part of this plugin. Think of the bicone as a 3D side view of a vectorscope: if a traditional vectorscope shows the top view, the bicone in Color Cone shows the side view. Moving your target color from the top cone toward the bottom cone adjusts it from light to dark (luminance). The intersection of the cones represents saturation (chrominance), and moving from the center outward increases saturation. When a color is selected using the eyedropper, you will see a square representing the source color selection, a circle representing the target color and an “x” with a line for reference on the middle section.

Additionally, a black circle on the saturation section in the middle shows the boundaries of how far you can push your chrominance, and a lighter circle represents the radius within which surrounding colors are affected. Each video clip can have effects layered on it, and one instance of the plugin can handle five colors; if you need more than five, you can add another instance of the plugin to the same clip.
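
If it helps to see the geometry in code, below is a minimal sketch of mapping an RGB value onto bicone coordinates. It assumes the bicone is the standard HSL-style double cone, with lightness on the vertical axis, chrominance as the radius and hue as the angle; that assumption is mine, not Picture Instruments’ published math.

```python
import colorsys

def bicone_coords(r, g, b):
    """Map r, g, b (0..1) to (hue_degrees, radius, height) on a unit bicone."""
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    height = 2.0 * l - 1.0             # -1 = black tip, +1 = white tip
    radius = s * (1.0 - abs(height))   # chrominance shrinks toward either tip
    return h * 360.0, radius, height

print(bicone_coords(0.9, 0.2, 0.2))  # saturated red: near the widest section
print(bicone_coords(0.5, 0.5, 0.5))  # mid-gray: on the central axis, radius 0
```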

If you are looking to export 3dl and cube LUTs of your work, you will need the standalone Color Cone 2 app. The one caveat to the standalone app is that you can only apply color to still images. Once you do, you can export the LUT to be used in any modern NLE or color correction app.
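
For context, a cube LUT is just a text file of sampled output colors, so exporting one amounts to evaluating a correction on a grid. Here is a minimal sketch following the common .cube convention (a LUT_3D_SIZE header, then red-fastest ordering); grade() is the hypothetical correction function from the earlier sketch, standing in for whatever transform the app actually applies.

```python
def write_cube(path, grade_fn, size=17):
    """Sample grade_fn on a size^3 grid and write a .cube 3D LUT."""
    with open(path, "w") as f:
        f.write('TITLE "example grade"\n')
        f.write(f"LUT_3D_SIZE {size}\n")
        # .cube convention: red varies fastest, then green, then blue.
        for b in range(size):
            for g in range(size):
                for r in range(size):
                    rgb = [c / (size - 1) for c in (r, g, b)]
                    f.write("{:.6f} {:.6f} {:.6f}\n".format(*grade_fn(rgb)))

# Bake the earlier grade() sketch into a LUT most grading apps can load:
write_cube("my_look.cube", lambda rgb: grade(rgb, slope=1.1, power=0.9))
```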

Summing Up
To be honest, working in Color Cone 2 was a little weird for me. It’s not your usual color correction workflow, so I would need to sit with the plugin for a while to get used to its setup. That being said, it has some interesting components that I wish other color correction apps would use, such as the Cone view. The bicone is a phenomenal way to visualize color correction in realtime.

In my opinion, if Picture Instruments sold just the Cone as a color measurement tool to work in conjunction with Lumetri, it would have another solid product. Color Cone 2 offers a unique and interesting way to color correct in Premiere, acting as an advanced secondary color correction tool alongside the Lumetri color tools.

The Color Cone 2 standalone app and plugin cost $139 when purchased together, or $88 individually. In my opinion, video people should probably just stick with the plugin version. Check out Picture Instruments’ website for more info on Color Cone 2 as well as their other products. And check them out on Twitter @Pic_instruments.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Crazy Rich Asians editor Myron Kerstein

By Amy Leland

When the buzz started in anticipation of the premiere of Crazy Rich Asians, there was a lot of speculation about whether audiences would fill the theaters for the first American film with an all-Asian cast since 1993’s The Joy Luck Club, and whether audiences even wanted to see a romantic comedy, a format that seemed to be falling out of favor.

The answer to both questions was a resounding, “Yes!” The film grossed $35 million during its opening weekend, against a $30 million budget. It continued going strong its second weekend, making another $28 million, the highest Labor Day weekend box office in more than a decade. It was the biggest opening weekend for a rom-com in three years, and it is the most successful studio rom-com in nine. All of this great success can be explained pretty simply: it’s a fun movie with a well-told story.

Not long ago, I had the great fun of sitting down with one of its storytellers, editor Myron Kerstein, to discuss this Jon M. Chu-directed film as well as Kerstein’s career as an editor.

How did you get started as an editor?
I was a fine arts major in college and stumbled upon photography, filmmaking, painting and printmaking. I really just wanted to make art of any kind. Once I started doing more short films in college, I found a knack for editing.

When I first moved to New York, I needed to make a living, so I became a PA, and I worked on a series called TV Nation, one of Michael Moore’s first shows. It was political satire. There was a production period, and then slowly the editors needed help in the post department. I gravitated toward these alchemists, these amazing people who were making things out of nothing. I really started to move toward post through that experience.

I also hustled quite a bit with all of those editors, and they started to hire me after that job. Slowly but surely I had a network of people who wanted to hire me again. That’s how I really started, and I really began to love it. I thought, what an amazing process to read these stories and look at how much power and influence an editor has in the filmmaking process.

I was not an assistant for too long, because I got to cut a film called Black & White. Then I quickly began doing edits for other indies, one being a film called Raising Victor Vargas, and another called Garden State. That was my big hit in the indie world, and slowly that led to more studio films, and then to Crazy Rich Asians.

Myron Kerstein and Crazy Rich Asians actor Henry Golding.

Your first break was on a television show that was nothing like feature films. How did you ultimately move toward cutting feature films?
I had a real attraction to documentary filmmaking, but my heart wanted to make narrative features. I think once you put that out in the universe, then those jobs start coming to you. I then stumbled upon my mentor, Jim Lyons, who cut all of Todd Haynes’s movies for years. When I worked on Velvet Goldmine as an assistant editor, I knew this was where I really needed to be. This was a film with music that was trying to say something, and was also very subversive. Jim and Todd were these amazing filmmakers that were just shining examples of the things I wanted to make in the future.

Any other filmmakers or editors whose work influenced you as you were starting out?
In addition to Todd Haynes, directors like Gus Van Sant and John Hughes. When I was first watching films, I didn’t really understand what editors did, so at the same time I was influenced by Spielberg, or somebody like George Romero. Then I later realized there were editors who made these things. Ang Lee and his editor Tim Squyres were like gods to me. I really wanted to work on one of Ang’s crews very badly, but everyone wanted to work with him. I was working at the same facilities where Ang was cutting, and I was literally sneaking into his edit rooms. I would be working on another film, and I would just kind of peek my head in and see what they were doing and that kind of thing.

How did Crazy Rich Asians come about for you?
Brad Simpson, who was a post supervisor on Velvet Goldmine back in the ‘90s when I was the assistant editor, is a producer on this film. Flash forward 20 years and I stumbled upon this script through agents. I read it and I was like, “I really want to be a part of this, and Brad’s the producer on this thing? Let me reach out to him.” He said, “I think you might be the right fit for this.” It was pretty nerve-wracking because I’d never worked with Jon before. Jon was a pretty experienced filmmaker, and he’d worked with a lot of editors. I just knew that if I could be part of the process, we could make something special.

My first interview with Jon was a Skype interview. He was in Malaysia already prepping for the film. In those interviews it’s very difficult not to look or sound weird. I just spoke from the heart and said, this is what I think makes me special; these are the ways I can try to influence a film and be part of the process. Luckily, between that interview and Brad’s recommendation, I got the job.

Myron Kerstein and director Jon Chu.

When did you begin your work on the film?
I basically started the first week of filming and joined them in Malaysia and Singapore for the whole shoot. It was a pretty amazing experience being out there in two Westernized Muslim countries that were filled with some of the friendliest people I’ve ever met. It was an almost entirely local crew, a couple of assistant editors, and me. Sometimes I feel like it might not be the best thing for an editor to be around set too much, but in this case it was good for me to see the setting they were trying to portray… and to feel the humidity, the steaminess and the romance of Singapore, which is both alien and beautiful at the same time.

What was your collaboration like with Jon Chu?
It was just an organic process, where my DNA started to become infused with Jon’s. The good thing about my going to Malaysia and Singapore was we got to work together early. One thing that doesn’t happen often anymore is a director who actually screens dailies in a theater. Jon would do that every weekend. We would watch dailies, and he would say what he liked and didn’t like, or more just general impressions of his footage. That allowed me to get into his head a bit.

At the same time I was also cutting scenes. At the end of every day’s screening, we would sit down together. He gave me a lot of freedom, but at the same time was there to give me his first impressions of what I was doing. I think we were able to build some trust really early.

The film’s overwhelming success has opened doors for other Asian-led projects.
Isn’t that the most satisfying thing in the world? You hope to define your career by moments like this, but rarely get that chance. I watched this film right when it was released, which was on my birthday. I ended up sitting next to this young Asian boy and his mom. This kid was just giggling and weeping throughout the movie. To have an interaction with a kid like that, who may have never seen someone like himself represented on the screen, was pretty outstanding.

Music was such an important part of this film. The soundtrack is so crucial to moments in the film that it almost felt like a musical. Were you editing scenes with specific songs in mind, or did you edit and then come back and add music?
Jon gave me a playlist very early on of music he was interested in. A lot of the songs sounded like they were from the 1920s — almost big band tunes. Right then I knew the film could have more of a classy Asian-Gatsby quality to it. Then as we were working on the film together, we started trying out these more modern tunes. I think the producers might have thought we were crazy at one point. You’re asking the audience to go down these different roads with you, and that can sometimes work really well, or sometimes can be a train wreck.

But as much as I love working with music, when I assemble I don’t cut with any music in mind. I try not to use it as a crutch. Oftentimes editors cut with a song in their head, or with a song laid in as a music bed. But if you can’t tell a story visually without a song to help drive it, then I think you’re fooling yourself.

I really find that my joy of putting in music happens after I assemble, and then I enjoy experimenting. That Coldplay song at the end of the film, for example… We were really struggling with how to end our movie. We had a bunch of different dialogue scenes strung together, but it didn’t feel like it was building up to any kind of climax. I figured out the structure and then cut it like any other scene, without any music. Then Jon pitched a couple of songs. Ironically enough, I had an experience with Coldplay from the opening of Garden State. I liked the idea of coming full circle in my own career with Coldplay at the end of a romantic comedy that starred an all-Asian cast. And it really felt like it was the right fit.

The graphic design was fascinating, especially in the early scene with Rachel and Nick on their date that kicks off all of the text messages. Is that something that was storyboarded early, or was that something you all figured out in the edit and in post?
Jon did have a very loose six-page storyboard of how we would get from the beginning of this to the end. The storyboard was nothing compared to what we ended up doing. When I first assembled my footage, I stitched together a two-minute sequence of just split screens of people reacting to other people. Some of that footage is in the movie, but it was just a loose sketch. Jon liked it, but it didn’t represent what he imagined this sequence to be. To some extent he had wondered whether we even needed the sequence.

Jon and I discussed it and said, “Let’s give this a shot. Let’s find the best graphics company out there.” We ended up landing with this company called Aspect, led by John Berkowitz. He and his team of artists worked with us to slowly craft this sequence, beginning with: “How do we get the first text bubble to the second person? What do those text bubbles look like? How do they travel?” Then they gave us 20 different options to see how those two elements would work together. Then we asked, “How do we start expanding outward? What information are we conveying? What is the text bubble saying?” It was like a slowly choreographed dance that came together over the course of months.

They would make these little Disney-esque pops. We really loved that. That kind of made it feel like we were back in old Hollywood for a second. At the same time we had these modern devices with text bubbles. So far as the tone was concerned, we tried percussion, just drumming, and other old scores. Then we landed on a score from John Williams from 1941, and that gave us the idea that maybe some old-school big band jazz might go really well in this. Our composer Brian Tyler saw it, and said, “I think I can make this even zanier and crazier.”

How do you work with your assistants?
Assistants are crucial to getting through the whole process. I actually had two sets of assistants; John To and David Zimmerman were on the first half in Malaysia and Singapore. I found John through my buddy Tom Cross, who edits for Damien Chazelle. I wanted somebody who could help me with the challenges of working in places like Malaysia and Singapore, because if you’re looking for help for your Avid, or trying to get dailies from Malaysia to America, you’re kind of on your own. Warner Bros. was great and supportive, and they gave us all the technical help. But it’s not like they can fly somebody out in an hour if something goes wrong.

On the post side I ended up using Melissa Remenarich-Aperlo, and she was outstanding. In the post process I needed somebody to hold down the fort and keep me organized, and also somebody to bounce ideas off of. I’m a big proponent of using my assistants creatively. Melissa ended up cutting the big fashion montage. I really struggled with that sequence because I felt strongly that this might be a trope the film didn’t need. That was the debate with a lot of these moments: Which romantic comedy tropes should we keep in this movie? Jon was like, “It’s wish fulfillment. We really need this. I know we’ve seen it a thousand times, but we need this scene.”

I said let’s try something different. Let’s try inter-cutting the wedding arrival with the montage, and make it one big story that takes us from not knowing what she’s going to show up in all the way to her arrival. Both of those sequences were fine on their own, but it didn’t feel like either one was doing anything interesting. It just felt like we were eating up time; we needed to get to the wedding, and we had a lot of story to tell. Once we inter-cut them, we knew it was the right choice. As Jon said, you need these moments in the film where you can just sit back, take a breath, smile for a minute and get ready for the drama that follows. Melissa did a great job on that sequence.

Do you have any advice for somebody who’s just starting out and really wants to edit feature films?
I would tell them to start cutting. Cut anything they can. If they don’t have the software, they can cut on iMovie on their iPhone. Then they should reach out to people like me and create a network, and keep doing that until people say yes. Don’t be afraid to reach out to people.

Also don’t be afraid to be an assistant editor. As much as they want to cut, as they should, they also need to learn the process of editing from others. Be willing to stick with it, even if that means years of doing it. I think you’d be surprised how much you learn over the course of time with good editors. I feel like it’s a long bridge. I’ve been doing this for 20 years, and it took a long time to get here, but perseverance goes a long way in this field. You just have to really know you want to do it and keep doing it.


Amy Leland is a film director and editor. Her short film, “Echoes,” is now available on Amazon Video. She also has a feature documentary in post, a feature screenplay in development, and a new doc in pre-production. She is an editor for CBS Sports Network and recently edited the feature “Sundown.” You can follow Amy on Twitter at @amy-leland and on Instagram at @la_directora.

Presenting at IBC vs. NAB

By Mike Nuget

I have been lucky enough to attend NAB a few times over the years, both as an onlooker and as a presenter. In 2004, I went to NAB for the first time as an assistant online editor, mainly just tagging along with my boss. It was awesome! It was very overwhelming and, for the most part, completely over my head. I loved seeing things demonstrated live by industry leaders. I felt I was finally a part of this crazy industry that I was new to. It was sort of a rite of passage.

Twelve years later, Avid asked me to present on the main stage. Knowing that I would be one of the demo artists that other people would sit down and watch — as I had done just 12 years earlier — was beyond anything I thought I would do back when I first started. The demo showed the Avid and FilmLight collaboration between the Media Composer and the Baselight color system, two of my favorite systems to work on. (Watch Mike’s presentation here.)

Thanks to my friend and now former co-worker Matt Schneider, who presented alongside me, I had developed a very good relationship with the Avid developers and some of the people who run the Avid booth at NAB. At the same time, the FilmLight team was quickly making its way onto my speed dial, and that relationship strengthened as well.

This past NAB, Avid once again asked me to come back and present on the main stage about Avid Symphony Color and FilmLight’s Baselight Editions plug-in for Avid, but this time I would get to represent myself and my new freelance career change — I had just left my job at Technicolor-Postworks in New York a few weeks prior. I thought that since I was now a full-time freelancer this might be the last time I would ever do this kind of thing. That was until this past July, when I got an email from the FilmLight team asking me to present at IBC in Amsterdam. I was ecstatic.

Preparing for IBC was similar enough as far as my demo went, but I was definitely more nervous than I had been at NAB, for two reasons. First, I was presenting in front of many different people in an international setting. Even though I am from the melting pot of NYC, it is a different and interesting feeling being surrounded by so many nationalities all day long and pretty much being the minority. On a personal note, I loved it. My wife and I love traveling, and to us this was an exciting chance to be around people from other cultures. On a business level, I was a little afraid that my fast-talking New Yorker side would lose some people, and I didn’t want that to happen.

The second thing was that this was the first time that I was presenting strictly for FilmLight and not Avid. I have been an Avid guy for over 15 years. It’s my home, it’s my most comfortable system, and I feel like I know it inside and out. I discovered Baselight in 2012, so to be presenting in front of FilmLight people, who might have been using their systems for much longer, was a little intimidating.

When I walked into the room, they had set up a full-on production, with spotlights, three cameras, a projector… the nerves rushed in once again. The demo was standing room only. When you are doing presentations, time seems to fly by, so I am not sure I remember every minute of the 50-minute presentation, but I do remember that at one point within the first few minutes my voice actually trembled, which internally I thought was funny, because I do not tend to get nervous. So instead of fighting it, I just said out loud, “Sorry guys, I’m a little nervous here,” then took a deep breath, gathered myself and fell right into my routine.

I spent the rest of the day watching the other FilmLight demos and running around the convention again saying hello to some new vendors and goodbye to those I had already seen, as Sunday was my last day at the show.

That night I got to hang out with the entire FilmLight staff for dinner and some drinks. These guys are hilarious; what a great tight-knit family vibe they have. At one point they even started to label each other: the uncle, the crazy brother, the funny cousin. I can’t thank them enough for being so kind and welcoming. I felt like part of the family for a few days, and it was tremendously enjoyable and appreciated.

Overall, IBC felt similar enough to NAB, but with a nice international twist. I definitely got lost more since the layout is much more confusing than NAB’s. There are 14 halls!

I will say that the “relaxing areas” at IBC are much better than NAB’s! There is a sandy beach to sit on, a beautiful canal to sit by while having a Heineken (of course) and the food trucks were much, much better.

I do hope I get to come back one day!


Mike Nuget (known to most as just “Nuget”) is a NYC-based colorist and finishing editor. He recently decided to branch out on his own and become a freelancer after 13 years with Technicolor-Postworks. He has honed a skill set across multiple platforms, including FilmLight’s Baselight, Blackmagic’s Resolve, Avid and more. 

Editor Paul Zucker on cutting Hotel Artemis

By Zack Wolder

The Drew Pearce-directed Hotel Artemis is a dark action-thriller set in a riot-torn Los Angeles in the not-too-distant future. What is the Hotel Artemis? It’s a secret members-only hospital for criminals run by Jodie Foster with the help of Dave Bautista. The film boasts an impressive cast that also includes Sterling K. Brown, Jeff Goldblum, Charlie Day, Sofia Boutella and Jenny Slate.

Hotel Artemis editor Paul Zucker, ACE, has varied credits that toggle between TV and film, including Trainwreck, This Is 40, Eternal Sunshine of the Spotless Mind, Girls, Silicon Valley and many others.

We recently reached out to Zucker, who worked alongside picture editor Gardner Gould, to talk about his process on the film.

Paul Zucker and an adorable baby.

How did you get involved in this film?
This was kind of a blind date set-up. I wasn’t really familiar with Drew, and it was a project that came to me pretty late. I think I joined about a week, maybe two, before production began. I was told that they were in a hurry to find an editor. I read the script, I interviewed with Drew, and that was it.

How long did it take to complete the editing?
About seven months.

How involved were you throughout the whole phase of production? Were you on set at all?
I wasn’t involved in pre-production, so I wasn’t able to participate in development of the script or anything like that, but as soon as the camera started rolling I was cutting. Most of the film was shot on stages in downtown LA, so I would go to set a few times, but most of the time there was enough work to do that I was sequestered in the edit room and trying to keep up with camera.

I’m an editor who doesn’t love to go to set. I prefer to be uninfluenced by whatever tensions, or lack of tensions, are happening on set. If a director has something he needs me for, if it’s some contribution he feels I can make, I’m happy, able and willing to participate in shot listing, blocking and things like that, but on this movie I was more valuable putting together the edit.

Did you have any specific deadlines you had to meet?
On this particular movie there was a higher-than-average number of requests from director Drew Pearce. Since it was mostly shot on stages, he was able to reshoot things more easily than he could have on location. So it became important for him to see the movie sooner rather than later.

A bunch of movies ago, I adopted a workflow of sending the director whatever I had each Friday. I think it’s healthy for them to see what they’re working on. There’s always the chance that it will influence the work they’re doing, whether it’s performance of the actors or the story or the script or really anything.

As I understand it from the directors I’ve worked for, seeing the editor’s cut can be the worst day of the process for them. Not because of the quality of the editing, but because it’s hard in that first viewing to look past all the things they didn’t get on set. It’s tough not to just see the mistakes, which is totally understandable. So I started this strategy of easing them into it. I just send scenes; I don’t send them in sequence. By the time they get to the editor’s cut, they’ve seen most of the scenes, so the shock is lessened and hopefully that screening is more productive.

Do you ever get that sense that you may be distracting them or overwhelming them with something?
Yes, sometimes. A couple of pictures ago, I did my normal thing — sending what I had on a Friday — and the director told me he didn’t want to watch them. For him, issues of post were a distraction while he was in production. So to each his own.

Drew Pearce certainly benefited. Drew was the type of director who, if I sent a cut at 9pm, would be watching it at 9:05pm and giving me notes at 10:05pm.

Are you doing temp color and things like that?
Absolutely. I do as much as the footage I’m given requires. On this particular movie, the cinematographer, the DIT and the lab were so dialed in that these were the most perfect-looking dailies I think I’ve ever gotten, so I had to do next to nothing. I credit DP Chung-Hoon Chung for that. Generally, if I’m getting dailies that are mismatched in color tone, I’m going to do whatever it takes to smooth them out. Nothing goes in front of the director until it’s had a hardcore sound and color pass. I am always trying to leave as little to the imagination as possible and to present something as close as possible to the experience the audience will have when they watch the movie. That means great color, great sound, music, all of that.

Do you ever provide VFX work?
Editorial is almost always doing simple VFX work like split-screens, muzzle-flashes for guns, etc. Those are all things that we’re really comfortable doing.

On this movie, there’s a large VFX component, so the temp work was more intense. We had close to 500 VFX shots, and there are some very involved ones. For example, a helicopter crashes into a building after getting blasted out of the sky with a rocket launcher. There are multiple scenes where characters get operated on by robotic arms. There’s a 3D printer that prints organs and guns. So we had to come up with a large number of temp shots in editorial.

Editor Gardner Gould and assistant editors Michael Costello and Lillian Dawson Bain were instrumental in coming up with these shots.

What about editing before the VFX shots are delivered?
From the very beginning, we are game-planning — what are the priorities for the movie vis-a-vis VFX? Which shots do we need early for story reasons? Which shots are the most time consuming for the VFX department? All of these things are considered as the entire post production department collaborates to come up with a priorities list.

If I need temp versions of shots to help me edit the scene, the assistants help me make them. If we can do them, we’ll do them. These aid in determining final VFX shot length, tempo, action, anything. As the process goes on, they get replaced by shots we get from the VFX department.

One thing I’m always keeping in mind is that shots can be created out of thin air oftentimes. If I have a story problem, sometimes a shot can be created that will help solve it. Sometimes the entire meaning of a scene can change.

What do you expect from your assistant editors?
The first assistant had to have experience with visual effects. The management of workflow for 500 shots is a lot, and on this job we did not have a dedicated VFX editor. That fell upon my co-editor, Gardner Gould.

I generally kick a lot of sound to the assistant, as I’m kind of rapidly moving through cutting picture. But I’m also looking for someone who’s got that storytelling bone that great editors have. Not everybody has it, not every great assistant has it.

There is so much minutiae on the technical side of being an assistant editor that you run the risk of forgetting that you’re working on a movie for an audience. And, indeed, some assistants just do the assistant work. They never cut scenes, they never do creative work, they’re not interested or they just don’t. So I’m always encouraging them to think like an editor at every point.

I ask them for their opinions. I invite them into the process; I don’t want them to be afraid to tell me what they think. You have to express yourself artistically in every decision you make. I encourage them to think critically and analytically about the movie that we’re working on.

I came up as an assistant and I had a few people who really believed in me. They invited me into the room with the director and they gave me that early exposure that really helped me learn my trade. I’m kind of looking to pay back that favor to my assistants.

Why did you choose to edit this film on Avid? Are you proficient in any other NLEs?
Oh, I’d say strictly Avid. To me, a tool, a technology, should be as transparent as possible. I want the minimum of time between thought and expression, which means that if I think of an edit, I want to be able, almost without thinking, to do a keystroke and have that decision appear on the monitor. I’m so comfortable with Avid that I’m at that point.

How is your creative process different when editing a film versus a TV show?
Well, first, a TV show has a pre-determined length; a movie does not. So in television you’re always wrangling with the runtime. The second difference is that in television, schedules and turnaround times are tighter. You’re constantly in pre-production, production and post at the same time.

Also, television is for a small screen. Film, generally speaking, is for the big screen. The venue matters for a lot of reasons, but it matters for pacing. You’re sitting in a movie theater and maybe you can hold shots a little bit longer because the canvas is so wide and there’s so much to look at. Whereas with the small screen, you’re sitting closer to the television, the screen itself is smaller, maybe the shots are typically not as wide or you cut a little quicker.

You’re a very experienced comedic editor. Was it difficult to be considered for a different type of film?
I guess the answer is yes. The more famous work I’ve done in the last couple of years has been for people like Lena Dunham and Judd Apatow. So people say, “Well, he’s a comedy editor.” But if you look at my resume, the very first thing I did, back in 2001, was a pretty radical film for Gus Van Sant called Gerry, and it was not a comedy. Eternal Sunshine was not a comedy. Before Girls, I couldn’t get hired on comedies.

Then Judd pulled me on to work on some of his movies, and he’s such a brand name that people see that on your resume and say, “Well, you must be a comedy editor.” So, yes, it does become harder to break out of that box, but that’s a box other people put you in; I don’t put myself in it. My favorite filmmakers work across all types of genres.

Where do you find inspiration? Music? Other editors? Directors?
Good question. I mean… inspiration is everywhere. I’m a movie fan, I always have been, that’s the only thing I’ve ever wanted to do. I’m always going to the movies. I watch lots of trailers. I like to keep up with what people are doing. I go back and re-watch the things that I love. Listening to other editors or reading other editors speak about their process is inspiring to me. Listening and speaking with people who love what they do is inspiring.

For Hotel Artemis, I went back and watched some movies that were an influence on this one to get in the tone-zone, and I would listen to a lot of the soundtracks to those movies. I watched Assault on Precinct 13, for instance. That’s a siege movie, and Hotel Artemis is kind of a siege movie. Some editors say they don’t watch movies while they’re making a movie; they don’t want to be influenced. It doesn’t bother me. It’s all in the soup.


Zack Wolder is a video editor based in NYC. He is currently the senior video editor at Billboard Magazine. Follow him on Instagram at @thezackwolder.

Avid adds to Nexis product line with Nexis|E5 NL

The Nexis|E5 NL nearline storage solution from Avid is now available. The addition of this high-density on-premises solution to the Avid Nexis family allows Avid users to manage media across all their online, nearline and archive storage resources.

Avid Nexis|E5 NL includes a new web-based Nexis management console for managing, controlling and monitoring Nexis installations. Nexis|E5 NL can be easily accessed through MediaCentral|Cloud UX or Media Composer, and it also integrates with MediaCentral|Production Management, MediaCentral|Asset Management and MediaCentral|Editorial Management to enable collaboration, with advanced features such as project and bin sharing. Extending the Nexis|FS (file system) to a secondary storage tier makes it easy to search for, find and import media, enabling users to locate content distributed throughout their operations more quickly.

Built for project parking, staging workflows and proxy archive, Nexis|E5 NL streamlines the workflow between active and non-active assets, Avid reports, allowing media organizations to park assets as well as completed projects on high-density nearline storage and keep them within easy reach for rediscovery and reuse.

Up to eight Nexis|E5 NL engines can be integrated as one virtualizable pool of storage, making content and associated projects and bins more accessible. In addition, other Avid Nexis Enterprise engines can be integrated into a single storage system that is partitioned for better archival organization.

Additional Nexis|E5 NL features include:
• It scales from 480TB of storage to more than 7PB by connecting multiple Nexis|E5 NL engines together as a single nearline system, creating a lower-cost secondary tier of storage.
• It offers flexible storage infrastructure that can be provisioned with required capacity and fault-tolerance characteristics.
• Users can configure, control and monitor Nexis using the updated management console that looks and feels like a MediaCentral|Cloud UX application. Its dashboard provides an overview of the system’s performance, bandwidth and status, as well as access to quickly configure and manage workspaces, storage groups, user access, notifications and other functions. It offers the flexibility and security of HTML5 along with an interface design that enables mobile device support.

Pacific Post adds third LA location servicing editorial

Full-service editorial equipment rental and services provider Pacific Post has expanded its footprint with the opening of a new 10,000-square-foot facility in Sherman Oaks, California. This brings the company’s total in the LA area to three locations; the others are in North Hollywood and Hollywood.

The new location offers 25 Avid suites with 24/7 technical support, alongside a writers’ room and several production offices. Pacific Post has retrofitted the entire site, which is supported by Avid Nexis shared storage and dedicated 1Gb fiber internet connectivity.

“We recently provided equipment and services to the editorial team on Game Over, Man! for Netflix in Sherman Oaks, and continued to receive inquiries from other productions in the area,” says Pacific Post VP Kristin Kumamoto. “The explosion we’ve seen in scripted production, especially for streaming platforms, prompted our decision to add this building to our offerings.”

Kumamoto says a screening room is also close to completion. It features a 150-inch screen and JVC 4K projector for VFX reviews and an enhanced, in-house viewing experience. Additional amenities at Pacific Post Sherman Oaks include MPAA-rated security, reserved parking, a full kitchen and lounge, VoIP phone systems and a substantial electrical infrastructure.

We reached out to Kumamoto to find out more.

Why the investment in Avid over some of the other NLE choices out there currently?
It really stems from the editorial community — from scripted and non-scripted shows that really want to work in shared project environments. They trust the media management with Avid’s shared storage, making it a clear choice when working on projects with the tightest deadlines.

How do you typically work with companies coming in looking for editing space? What is your process?
It usually starts with producers looking for a location that meets the needs of the editors in terms of commute or the proximity to studios for executives. After that, it really comes down to having a secure and flexible layout, along with a host of other requirements.

With cutting rooms in North Hollywood/Universal City and in Hollywood, we feel Sherman Oaks is the perfect location to complement the other facilities and really give more choices to producers looking to set up cutting rooms in the San Fernando Valley area of LA.